WorldWideScience

Sample records for agent based classification

  1. A strategy learning model for autonomous agents based on classification

    Directory of Open Access Journals (Sweden)

    Śnieżyński Bartłomiej

    2015-09-01

Full Text Available In this paper we propose a strategy learning model for autonomous agents based on classification. In the literature, the most commonly used learning method in agent-based systems is reinforcement learning. In our opinion, classification can be considered a good alternative. This type of supervised learning can be used to generate a classifier that allows the agent to choose an appropriate action for execution. Experimental results show that this model can be successfully applied for strategy generation even if rewards are delayed. We compare the efficiency of the proposed model and reinforcement learning using the farmer-pest domain and configurations of various complexity. In complex environments, supervised learning can improve the performance of agents much faster than reinforcement learning. If an appropriate knowledge representation is used, the learned knowledge may be analyzed by humans, which allows tracking of the learning process.
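The classification-as-strategy idea above can be sketched in a few lines: train a classifier on (state, action) examples collected from successful episodes, then let the agent act by classifying its current state. The nearest-neighbour classifier, the state features and the actions below are illustrative assumptions; the paper's farmer-pest domain and learning algorithm are not reproduced here.

```python
# Minimal sketch: a 1-nearest-neighbour classifier used as an agent's strategy.
# State features (pest_level, crop_health) and actions are invented.

def train_classifier(examples):
    """Store (state, action) pairs collected from successful episodes."""
    return list(examples)

def choose_action(model, state):
    """Classify the current state: return the action of the nearest stored state."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, action = min(model, key=lambda ex: sqdist(ex[0], state))
    return action

# Toy training data: (pest_level, crop_health) -> action
training = [
    ((0.9, 0.2), "spray"),
    ((0.8, 0.3), "spray"),
    ((0.1, 0.9), "harvest"),
    ((0.2, 0.8), "harvest"),
]
policy = train_classifier(training)
action = choose_action(policy, (0.85, 0.25))
```

Unlike reinforcement learning, no reward signal is needed at decision time; the stored examples themselves also remain inspectable by humans, matching the abstract's point about analyzable knowledge.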

  2. Odor Classification using Agent Technology

    Directory of Open Access Journals (Sweden)

    Sigeru OMATU

    2014-03-01

Full Text Available In order to measure and classify odors, a Quartz Crystal Microbalance (QCM) can be used. In the present study, seven QCM sensors and three different odors are used. The system has been developed as a virtual organization of agents using an agent platform called PANGEA (Platform for Automatic coNstruction of orGanizations of intElligent Agents). This is a platform for developing open multi-agent systems, specifically those including organizational aspects. The main reason for the use of agents is the scalability of the platform, i.e. the way in which it models the services. The system models functionalities as services inside the agents, or as Service Oriented Approach (SOA) architecture compliant services using Web Services. In this way the odor classification system can be adapted with new algorithms, tools and classification techniques.

  3. Bargaining agents based system for automatic classification of potential allergens in recipes

    Directory of Open Access Journals (Sweden)

    José ALEMANY

    2016-11-01

Full Text Available The automatic recommendation of recipes that takes into account the dietary restrictions of users (such as allergies or intolerances) is a complex and open problem. Among the limitations of the problem are the lack of food databases correctly labeled with their potential allergens and the non-unification of this information by companies in the food sector. In the absence of an appropriate solution, people affected by food restrictions cannot use recommender systems, because these recommend inappropriate recipes to them. In order to resolve this situation, in this article we propose a solution based on a collaborative multi-agent system that, using negotiation and machine learning techniques, is able to detect and label potential allergens in recipes. The proposed system is being employed in receteame.com, a recipe recommendation system which includes persuasive technologies, which are interactive technologies aimed at changing users’ attitudes or behaviors through persuasion and social influence, and social information to improve the recommendations.

  4. Multi-Agent Information Classification Using Dynamic Acquaintance Lists.

    Science.gov (United States)

    Mukhopadhyay, Snehasis; Peng, Shengquan; Raje, Rajeev; Palakal, Mathew; Mostafa, Javed

    2003-01-01

    Discussion of automated information services focuses on information classification and collaborative agents, i.e. intelligent computer programs. Highlights include multi-agent systems; distributed artificial intelligence; thesauri; document representation and classification; agent modeling; acquaintances, or remote agents discovered through…

  5. Agent Collaborative Target Localization and Classification in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sheng Wang

    2007-07-01

Full Text Available Wireless sensor networks (WSNs) are autonomous networks that have been frequently deployed to collaboratively perform target localization and classification tasks. Their autonomous and collaborative features resemble the characteristics of agents. Such similarities inspire the development of a heterogeneous agent architecture for WSN in this paper. The proposed agent architecture views WSN as multi-agent systems and mobile agents are employed to reduce in-network communication. According to the architecture, an energy based acoustic localization algorithm is proposed. In localization, an estimate of target location is obtained by steepest descent search. The search algorithm adapts to measurement environments by dynamically adjusting its termination condition. With the agent architecture, target classification is accomplished by distributed support vector machine (SVM). Mobile agents are employed for feature extraction and distributed SVM learning to reduce communication load. Desirable learning performance is guaranteed by combining support vectors and convex hull vectors. Fusion algorithms are designed to merge SVM classification decisions made from various modalities. Real world experiments with MICAz sensor nodes are conducted for vehicle localization and classification. Experimental results show the proposed agent architecture remarkably facilitates WSN designs and algorithm implementation. The localization and classification algorithms also prove to be accurate and energy efficient.

  6. Pitch Based Sound Classification

    DEFF Research Database (Denmark)

    Nielsen, Andreas Brinch; Hansen, Lars Kai; Kjems, U

    2006-01-01

A sound classification model is presented that can classify signals into music, noise and speech. The model extracts the pitch of the signal using the harmonic product spectrum. Based on the pitch estimate and a pitch error measure, features are created and used in a probabilistic model with soft-max output function. Both linear and quadratic inputs are used. The model is trained on 2 hours of sound and tested on publicly available data. A test classification error below 0.05 with 1 s classification windows is achieved. Furthermore, it is shown that linear input performs as well as quadratic, and that even though classification gets marginally better, not much is achieved by increasing the window size beyond 1 s.
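The harmonic product spectrum named in the abstract can be sketched directly: downsampled copies of the magnitude spectrum are multiplied together, so the product peaks at the fundamental frequency's bin. The synthetic spectrum below is an invented example, not the paper's data.

```python
# Sketch of harmonic product spectrum (HPS) pitch estimation.
# 'spectrum' is a synthetic magnitude spectrum, not real audio.

def harmonic_product_spectrum(mag, harmonics=3):
    """Multiply the spectrum with its downsampled copies; the product
    peaks at the bin of the fundamental frequency."""
    n = len(mag) // harmonics
    hps = [1.0] * n
    for h in range(1, harmonics + 1):
        for k in range(n):
            hps[k] *= mag[k * h]
    return hps

def pitch_bin(mag, harmonics=3):
    """Index of the strongest HPS bin (DC bin excluded)."""
    hps = harmonic_product_spectrum(mag, harmonics)
    return max(range(1, len(hps)), key=hps.__getitem__)

# Fundamental at bin 10, with harmonics at bins 20 and 30.
spectrum = [0.1] * 64
for b in (10, 20, 30):
    spectrum[b] = 5.0
```

A pitch error measure, as used in the paper, could then compare the HPS peak against the raw spectrum's peaks to decide how "pitched" the signal is.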

  7. Granular loess classification based

    International Nuclear Information System (INIS)

    Browzin, B.S.

    1985-01-01

This paper discusses how loess might be identified by two index properties: the granulometric composition and the dry unit weight. These two indices are necessary but not always sufficient for identification of loess. On the basis of analyses of samples from three continents, it was concluded that the 0.01-0.5-mm fraction deserves the name loessial fraction. Based on the loessial fraction concept, a granulometric classification of loess is proposed. A triangular chart is used to classify loess.

  8. Agent-Based Optimization

    CERN Document Server

    Jędrzejowicz, Piotr; Kacprzyk, Janusz

    2013-01-01

This volume presents a collection of original research works by leading specialists focusing on novel and promising approaches in which the multi-agent system paradigm is used to support, enhance or replace traditional approaches to solving difficult optimization problems. The editors have invited several well-known specialists to present their solutions, tools, and models falling under the common denominator of agent-based optimization. The book consists of eight chapters covering examples of application of the multi-agent paradigm and respective customized tools to solve difficult optimization problems arising in different areas such as machine learning, scheduling, transportation and, more generally, distributed and cooperative problem solving.

  9. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases the accuracy at the same time. The test example is classified using a simpler and smaller model. The training examples in a particular cluster share a common vocabulary. At the time of clustering, we do not take into account the labels of the training examples. After the clusters have been created, the classifier is trained on each cluster, having reduced dimensionality and fewer examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups...
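A minimal sketch of the cluster-then-classify idea follows, assuming an invented vocabulary-overlap (Jaccard) similarity and toy documents: labels are ignored while clustering, and at test time only the nearest cluster's smaller model is consulted.

```python
# Sketch: cluster training texts by shared vocabulary (labels ignored),
# then classify a test document using only its nearest cluster.
# Documents, clusters and the similarity measure are invented.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

clusters = [
    {"rep": ["transfer", "account", "urgent", "money"],   # "financial" cluster
     "examples": [(["urgent", "money", "transfer"], "suspicious"),
                  (["account", "transfer", "bank"], "normal")]},
    {"rep": ["meeting", "agenda", "minutes", "room"],     # "office" cluster
     "examples": [(["meeting", "room", "agenda"], "normal"),
                  (["minutes", "agenda"], "normal")]},
]

def classify(doc):
    # pick the cluster whose representative vocabulary is most similar
    cluster = max(clusters, key=lambda cl: jaccard(doc, cl["rep"]))
    # inside the cluster, use a simple nearest-example classifier
    _, label = max(cluster["examples"], key=lambda ex: jaccard(doc, ex[0]))
    return label
```

Each per-cluster model sees fewer examples and a smaller vocabulary, which is the source of the speed and accuracy gains the abstract describes.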

  10. Multi-agent Negotiation Mechanisms for Statistical Target Classification in Wireless Multimedia Sensor Networks

    Science.gov (United States)

    Wang, Xue; Bi, Dao-wei; Ding, Liang; Wang, Sheng

    2007-01-01

    The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability. PMID:28903223
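The two-phase negotiation described above can be sketched as an auction followed by a committee vote. The energy-based bidding rule, the agent names and the class labels are illustrative assumptions, not the paper's exact mechanism.

```python
# Hypothetical sketch of the two-phase negotiation: an auction allocates the
# classification task to the best-resourced agents, then a committee
# (majority) decision combines their individual labels.

def auction(agents, n_winners=3):
    """Each agent bids with its residual energy; the top bidders win the task."""
    ranked = sorted(agents, key=lambda a: a["energy"], reverse=True)
    return ranked[:n_winners]

def committee(decisions):
    """Combine individual decisions by majority vote."""
    return max(set(decisions), key=decisions.count)

agents = [
    {"name": "a1", "energy": 0.9, "decision": "car"},
    {"name": "a2", "energy": 0.7, "decision": "truck"},
    {"name": "a3", "energy": 0.8, "decision": "car"},
    {"name": "a4", "energy": 0.2, "decision": "truck"},  # low energy: loses auction
]
winners = auction(agents)
label = committee([a["decision"] for a in winners])
```

The auction conserves the scarce resources the abstract emphasizes, while the committee step trades a little extra communication for higher reliability.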

  11. A New Classification Approach Based on Multiple Classification Rules

    OpenAIRE

    Zhongmei Zhou

    2014-01-01

A good classifier can correctly predict new data for which the class label is unknown, so it is important to construct a high accuracy classifier. Hence, classification techniques are very useful in ubiquitous computing. Associative classification achieves higher classification accuracy than some traditional rule-based classification approaches. However, the approach also has two major deficiencies. First, it generates a very large number of association classification rules, especially when t...

  12. Agent Programming Languages and Logics in Agent-Based Simulation

    DEFF Research Database (Denmark)

    Larsen, John

    2018-01-01

    and social behavior, and work on verification. Agent-based simulation is an approach for simulation that also uses the notion of agents. Although agent programming languages and logics are much less used in agent-based simulation, there are successful examples with agents designed according to the BDI...

  13. Multi-agent Negotiation Mechanisms for Statistical Target Classification in Wireless Multimedia Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sheng Wang

    2007-10-01

Full Text Available The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability.

  14. The efficiency of the RULES-4 classification learning algorithm in predicting the density of agents

    Directory of Open Access Journals (Sweden)

    Ziad Salem

    2014-12-01

Full Text Available Learning is the act of obtaining new or modifying existing knowledge, behaviours, skills or preferences. The ability to learn is found in humans, other organisms and some machines. Learning is always based on some sort of observations or data, such as examples, direct experience or instruction. This paper presents a classification algorithm to learn the density of agents in an arena based on the measurements of six proximity sensors of combined actuator-sensor units (CASUs). Rules are presented that were induced by the learning algorithm, which was trained with data-sets based on the CASU’s sensor data streams collected during a number of experiments with “Bristlebots” (agents) in the arena (environment). It was found that a set of rules generated by the learning algorithm is able to predict the number of bristlebots in the arena based on the CASU’s sensor readings with satisfactory accuracy.
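Applying induced if-then rules to sensor readings can be sketched as follows; the two rules and their thresholds are invented for illustration and are not the rules reported in the paper.

```python
# Sketch of applying rules induced by a RULES-family learner to six
# CASU proximity-sensor readings. Rules and thresholds are invented.

def predict_density(sensors, rules, default="low"):
    """Return the class of the first rule whose condition matches."""
    for condition, label in rules:
        if condition(sensors):
            return label
    return default

# Hypothetical induced rules over the six readings s[0]..s[5] (range 0..1)
rules = [
    (lambda s: sum(s) / len(s) > 0.6, "high"),    # many close neighbours overall
    (lambda s: max(s) > 0.8, "medium"),           # at least one very close neighbour
]

crowded = predict_density([0.9, 0.7, 0.8, 0.6, 0.7, 0.9], rules)
sparse = predict_density([0.1, 0.2, 0.3, 0.1, 0.0, 0.1], rules)
```

The ordered rule list is what makes the learned model human-readable: each prediction can be traced to the single condition that fired.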

  15. From fault classification to fault tolerance for multi-agent systems

    CERN Document Server

    Potiron, Katia; Taillibert, Patrick

    2013-01-01

    Faults are a concern for Multi-Agent Systems (MAS) designers, especially if the MAS are built for industrial or military use because there must be some guarantee of dependability. Some fault classification exists for classical systems, and is used to define faults. When dependability is at stake, such fault classification may be used from the beginning of the system's conception to define fault classes and specify which types of faults are expected. Thus, one may want to use fault classification for MAS; however, From Fault Classification to Fault Tolerance for Multi-Agent Systems argues that

  16. Agent-based Big Data Classification

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... relevant and interesting knowledge from data, while data mining is a particular ... partitioning the original feature space instead of using the whole input ... classifying sentiment of online reviews using ontology. The proposed ...

  17. CT-based injury classification

    International Nuclear Information System (INIS)

    Mirvis, S.E.; Whitley, N.O.; Vainright, J.; Gens, D.

    1988-01-01

Review of preoperative abdominal CT scans obtained in adults after blunt trauma during a 2.5-year period demonstrated isolated or predominant liver injury in 35 patients and splenic injury in 33 patients. CT-based injury scores, consisting of five levels of hepatic injury and four levels of splenic injury, were correlated with clinical outcome and surgical findings. Hepatic injury grades I-III, present in 33 of 35 patients, were associated with successful nonsurgical management in 27 (82%) or with findings at celiotomy not requiring surgical intervention in four (12%). Higher grades of splenic injury generally required early operative intervention, but eight (36%) of 22 patients with initial grade III or IV injury were managed without surgery, while four (36%) of 11 patients with grade I or II injury required delayed celiotomy and splenectomy (three patients) or emergent rehospitalization (one patient). CT-based injury classification is useful in guiding the nonoperative management of blunt hepatic injury in hemodynamically stable adults but appears to be less reliable in predicting the outcome of blunt splenic injury.

  18. Review of therapeutic agents for burns pruritus and protocols for management in adult and paediatric patients using the GRADE classification

    Directory of Open Access Journals (Sweden)

    Goutos Ioannis

    2010-10-01

Full Text Available To review the current evidence on therapeutic agents for burns pruritus and use the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) classification to propose therapeutic protocols for adult and paediatric patients. All published interventions for burns pruritus were analysed by a multidisciplinary panel of burns specialists following the GRADE classification to rate individual agents. Following the collation of results and panel discussion, consensus protocols are presented. Twenty-three studies appraising therapeutic agents in the burns literature were identified. The majority of these studies (16 out of 23) are of an observational nature, making an evidence-based approach to defining optimal therapy not feasible. Our multidisciplinary approach employing the GRADE classification recommends the use of antihistamines (cetirizine and cimetidine) and gabapentin as the first-line pharmacological agents for both adult and paediatric patients. Ondansetron and loratadine are the second-line medications in our protocols. We additionally recommend a variety of non-pharmacological adjuncts for the perusal of clinicians in order to maximise symptomatic relief in patients troubled with postburn itch. Most studies in the subject area lack sufficient statistical power to dictate a 'gold standard' treatment agent for burns itch. We encourage clinicians to employ the GRADE system in order to delineate the most appropriate therapeutic approach for burns pruritus until further research elucidates the most efficacious interventions. This widely adopted classification empowers burns clinicians to tailor therapeutic regimens according to current evidence, patient values, risks and resource considerations in different medical environments.

  19. Agent Based Individual Traffic guidance

    DEFF Research Database (Denmark)

    Wanscher, Jørgen Bundgaard

    2004-01-01

When working with traffic planning or guidance it is common practice to view the vehicles as a combined mass. From this, models are employed to specify the vehicle supply and demand for each region. As the models are complex and the calculations are equally demanding, the regions and the detail of the road network are aggregated. As a result the calculations reveal only what the mass of vehicles is doing and not what a single vehicle is doing. This is the crucial difference to ABIT (Agent Based Individual Traffic guidance). ABIT is based on the fact that information on the destination of each vehicle...

  20. Cloud field classification based on textural features

    Science.gov (United States)

    Sengupta, Sailes Kumar

    1989-01-01

An essential component in global climate research is accurate cloud cover and type determination. Of the two approaches to texture-based classification (statistical and textural), only the former is effective in the classification of natural scenes such as land, ocean, and atmosphere. In the statistical approach that was adopted, parameters characterizing the stochastic properties of the spatial distribution of grey levels in an image are estimated and then used as features for cloud classification. Two types of textural measures were used. One is based on the distribution of the grey level difference vector (GLDV), and the other on a set of textural features derived from the MaxMin cooccurrence matrix (MMCM). The GLDV method looks at the difference D of grey levels at pixels separated by a horizontal distance d and computes several statistics based on this distribution. These are then used as features in subsequent classification. The MaxMin textural features, on the other hand, are based on the MMCM, a matrix whose (I,J)th entry gives the relative frequency of occurrences of the grey level pair (I,J) that are consecutive and thresholded local extremes separated by a given pixel distance d. Textural measures are then computed based on this matrix in much the same manner as is done in texture computation using the grey level cooccurrence matrix. The database consists of 37 cloud field scenes from LANDSAT imagery using a near IR visible channel. The classification algorithm used is the well known Stepwise Discriminant Analysis. The overall accuracy was estimated by the percentage of correct classifications in each case. It turns out that both types of classifiers, at their best combination of features, and at any given spatial resolution, give approximately the same classification accuracy. A neural network based classifier with a feed forward architecture and a back propagation training algorithm is used to increase the classification accuracy, using these two classes of features.
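The GLDV computation described above can be sketched concretely: tabulate the distribution of absolute grey-level differences at a horizontal pixel distance d, then derive statistics from that distribution. The tiny striped image below is an illustrative assumption.

```python
# Sketch of grey level difference vector (GLDV) features.
# The image and the chosen statistics (mean, contrast) are illustrative.

def gldv(image, d=1):
    """Distribution of |grey(x) - grey(x+d)| over all horizontal pixel pairs."""
    counts = {}
    for row in image:
        for x in range(len(row) - d):
            diff = abs(row[x] - row[x + d])
            counts[diff] = counts.get(diff, 0) + 1
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def gldv_features(p):
    """Statistics of the GLDV used as classification features."""
    mean = sum(k * v for k, v in p.items())
    contrast = sum(k * k * v for k, v in p.items())
    return {"mean": mean, "contrast": contrast}

# A striped image: every horizontal neighbour differs by exactly 1 grey level.
image = [[0, 1, 0, 1],
         [0, 1, 0, 1]]
feats = gldv_features(gldv(image, d=1))
```

Varying d probes texture at different spatial scales, which is how such features separate, for example, smooth stratiform from cellular cumulus cloud fields.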

  1. Validation of Agent Based Distillation Movement Algorithms

    National Research Council Canada - National Science Library

    Gill, Andrew

    2003-01-01

Agent based distillations (ABD) are low-resolution abstract models, which can be used to explore questions associated with land combat operations in a short period of time. Movement of agents within the EINSTein and MANA ABDs...

  2. Agent-based enterprise integration

    Energy Technology Data Exchange (ETDEWEB)

    N. M. Berry; C. M. Pancerella

    1998-12-01

The authors are developing and deploying software agents in an enterprise information architecture such that the agents manage enterprise resources and facilitate user interaction with these resources. The enterprise agents are built on top of a robust software architecture for data exchange and tool integration across heterogeneous hardware and software. The resulting distributed multi-agent system serves as a method of enhancing enterprises in the following ways: providing users with knowledge about enterprise resources and applications; accessing the dynamically changing enterprise; locating enterprise applications and services; and improving search capabilities for applications and data. Furthermore, agents can access non-agents (i.e., databases and tools) through the enterprise framework. The ultimate target of the effort is the user; the authors are attempting to increase user productivity in the enterprise. This paper describes their design and early implementation and discusses the planned future work.

  3. An Authentication Technique Based on Classification

    Institute of Scientific and Technical Information of China (English)

    李钢; 杨杰

    2004-01-01

We present a novel watermarking approach based on classification for authentication, in which a watermark is embedded into the host image. When the marked image is modified, the extracted watermark is also different from the original watermark, and different kinds of modification lead to different extracted watermarks. In this paper, different kinds of modification are considered as classes, and we use a classification algorithm to recognize the modifications with high probability. Simulation results show that the proposed method is promising and effective.

  4. EMG finger movement classification based on ANFIS

    Science.gov (United States)

    Caesarendra, W.; Tjahjowidodo, T.; Nico, Y.; Wahyudati, S.; Nurhasanah, L.

    2018-04-01

An increasing number of people suffering from stroke has driven the rapid development of finger hand exoskeletons to enable automatic physical therapy. Prior to the development of a finger exoskeleton, a preliminary yet important research topic, i.e. machine learning for finger gesture classification, is addressed. This paper presents a study on EMG signal classification of 5 finger gestures as a preliminary study toward finger exoskeleton design and development in Indonesia. The EMG signals of the 5 finger gestures were acquired using a Myo EMG sensor. The EMG signal features were extracted and reduced using PCA. ANFIS based learning is used to classify the reduced features of the 5 finger gestures. The result shows that the classification accuracy for 5 finger gestures is lower than that for 7 hand gestures.

  5. Inventory classification based on decoupling points

    Directory of Open Access Journals (Sweden)

    Joakim Wikner

    2015-01-01

    Full Text Available The ideal state of continuous one-piece flow may never be achieved. Still the logistics manager can improve the flow by carefully positioning inventory to buffer against variations. Strategies such as lean, postponement, mass customization, and outsourcing all rely on strategic positioning of decoupling points to separate forecast-driven from customer-order-driven flows. Planning and scheduling of the flow are also based on classification of decoupling points as master scheduled or not. A comprehensive classification scheme for these types of decoupling points is introduced. The approach rests on identification of flows as being either demand based or supply based. The demand or supply is then combined with exogenous factors, classified as independent, or endogenous factors, classified as dependent. As a result, eight types of strategic as well as tactical decoupling points are identified resulting in a process-based framework for inventory classification that can be used for flow design.
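The eight-type scheme summarized above follows from three binary dimensions. A minimal sketch, with labels paraphrased from the abstract (the paper's exact terminology may differ):

```python
# Sketch of the decoupling-point typology: flow (demand/supply based),
# factor (independent/exogenous vs dependent/endogenous) and planning role
# (strategic vs tactical) combine into 2 x 2 x 2 = 8 types.
from itertools import product

def decoupling_point_type(flow, factor, role):
    assert flow in ("demand", "supply")
    assert factor in ("independent", "dependent")
    assert role in ("strategic", "tactical")
    return f"{role} {flow}-based {factor} decoupling point"

all_types = [decoupling_point_type(flow, factor, role)
             for flow, factor, role in product(("demand", "supply"),
                                               ("independent", "dependent"),
                                               ("strategic", "tactical"))]
```

Enumerating the combinations makes the claim of "eight types" directly checkable and gives each inventory position an unambiguous label for flow design.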

  6. Agent Based Reasoning in Multilevel Flow Modeling

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2012-01-01

to launch the MFM Workbench into an agent based environment, which can complement disadvantages of the original software. The agent-based MFM Workbench is centered on a concept called “Blackboard System” and uses an event based mechanism to arrange the reasoning tasks. This design will support the new...

  7. Three essays in agent-based macroeconomics

    OpenAIRE

    Canzian, Giulia

    2009-01-01

The dissertation is aimed at offering an insight into the agent-based methodology and its possible application to macroeconomic analysis. Relying on this methodology, I deal with three different issues concerning heterogeneity of economic agents, bounded rationality and interaction. Specifically, the first chapter is devoted to describing the distinctive characteristics of agent-based economics and its advantages and disadvantages. In the second chapter I propose a credit market framework c...

  8. Voice based gender classification using machine learning

    Science.gov (United States)

    Raahul, A.; Sapthagiri, R.; Pankaj, K.; Vijayarajan, V.

    2017-11-01

Gender identification is one of the major problems in speech analysis today. Gender is traced from acoustic data, i.e. pitch, median, frequency etc. Machine learning gives promising results for classification problems in all research domains. There are several performance metrics to evaluate algorithms of an area. We build a comparative model for evaluating 5 different machine learning algorithms based on eight different metrics in gender classification from acoustic data. The agenda is to identify gender with five different algorithms: Linear Discriminant Analysis (LDA), K-Nearest Neighbour (KNN), Classification and Regression Trees (CART), Random Forest (RF), and Support Vector Machine (SVM) on the basis of eight different metrics. The main parameter in evaluating any algorithm is its performance. The misclassification rate must be low in classification problems, which means that the accuracy rate must be high. Location and gender of the person have become very crucial in economic markets in the form of AdSense. With this comparative model, we try to assess the different ML algorithms and find the best fit for gender classification of acoustic data.
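The comparative-evaluation loop described above reduces to scoring several candidate classifiers on the same data with the same metric and keeping the best. The toy pitch-threshold "algorithms" and data below are assumptions standing in for the study's five ML algorithms and eight metrics.

```python
# Sketch of a comparative evaluation harness for gender classification.
# The classifiers, data and single accuracy metric are invented.

def threshold_clf(pitch_cut):
    """Classify 'female' when mean pitch exceeds the cut-off (in Hz)."""
    return lambda x: "female" if x["pitch"] > pitch_cut else "male"

def accuracy(clf, data):
    """Fraction of examples classified correctly."""
    return sum(clf(x) == y for x, y in data) / len(data)

data = [({"pitch": 210}, "female"), ({"pitch": 120}, "male"),
        ({"pitch": 195}, "female"), ({"pitch": 110}, "male")]

candidates = {"cut_165": threshold_clf(165), "cut_250": threshold_clf(250)}
scores = {name: accuracy(clf, data) for name, clf in candidates.items()}
best = max(scores, key=scores.get)
```

In the study's setting, `candidates` would hold LDA, KNN, CART, RF and SVM models, and `scores` would be a table of eight metrics rather than accuracy alone.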

  9. Story telling engine based on agent interaction

    OpenAIRE

    Porcel, Juan Carlos

    2008-01-01

Comics have been used as a programming tool for agents, giving them instructions on how to act. In this thesis I do this in reverse: I use comics to describe the actions of agents already interacting with each other, to create a storytelling engine that dynamically generates stories based on the interaction of said agents. The model for the agent behaviours is based on the improvisational puppets model of Barbara Hayes-Roth. This model is chosen due to the nature of comics themselves. Comics ...

  10. Rough set classification based on quantum logic

    Science.gov (United States)

    Hassan, Yasser F.

    2017-11-01

By combining the advantages of quantum computing and soft computing, the paper shows that rough sets can be used with quantum logic for classification and recognition systems. We suggest a new definition of rough set theory as quantum logic theory. Rough approximations are essential elements in rough set theory; the quantum rough set model for set-valued data directly constructs set approximations based on a kind of quantum similarity relation, which is presented here. Theoretical analyses demonstrate that the new model for quantum rough sets has a new type of decision rule with less redundancy, which can be used to give accurate classification using principles of quantum superposition and non-linear quantum relations. To our knowledge, this is the first attempt aiming to define rough sets in a quantum representation rather than in logic or sets. The experiments on data-sets have demonstrated that the proposed model is more accurate than traditional rough sets in terms of finding optimal classifications.
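The classical rough-set construction that the abstract generalizes can be sketched with an ordinary (non-quantum) similarity relation: objects are similar when their attribute values lie within a tolerance, and the lower/upper approximations follow from the similarity classes. The universe, attribute values and tolerance below are invented.

```python
# Sketch of rough-set lower/upper approximations from a similarity
# (tolerance) relation. Universe, values and tolerance are invented.

def similarity_classes(universe, values, eps):
    """S(x): all objects whose attribute value lies within eps of x's."""
    return {x: {y for y in universe if abs(values[x] - values[y]) <= eps}
            for x in universe}

def approximations(universe, values, target, eps):
    s = similarity_classes(universe, values, eps)
    lower = {x for x in universe if s[x] <= target}   # S(x) entirely inside target
    upper = {x for x in universe if s[x] & target}    # S(x) overlaps target
    return lower, upper

U = {"a", "b", "c", "d"}
vals = {"a": 0.0, "b": 0.1, "c": 0.5, "d": 0.6}
X = {"a", "b", "c"}
lower, upper = approximations(U, vals, X, eps=0.2)
```

The gap between `lower` and `upper` is exactly the "roughness" of X under this relation; the paper's contribution is replacing the tolerance relation with a quantum similarity relation.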

  11. Agent based energy management systems

    Energy Technology Data Exchange (ETDEWEB)

    Wolter, Martin

    2012-07-01

In liberalized, regulated energy markets, the different participants - namely producers and consumers of energy, transmission and distribution system operators as well as regulatory authorities - have partly divergent and partly convergent interests. Loads, power plants and grid operators try to maximize their own benefit in this highly complex environment, accepting to act detrimentally to others. Although the relationship between the participants is mostly competitive, there are some fundamental shared interests, e.g. voltage stability, a constant system frequency or efficient energy production, transmission and distribution, which are endangered e.g. by increased injection of volatile sources in low and medium voltage grids, displacement of stabilizing bulk generation and the slowly progressing extension of the electric grid. There is a global consensus that the resulting challenges can efficiently be faced using information and communication technologies to coordinate grid utilization and operation. The basic idea is to benefit from unused reserves by participating in the deployment of system services, e.g. reactive power supply to keep the voltage within certain bounds. The coordination can best be done by the grid operator. All activities of that kind are summarized under the umbrella term "Smart Grid". To simultaneously model the behavior and interests of different types of market participants and their convergent and divergent interests, multi-agent systems are used. They offer a perfectly fitting framework for this sort of game theory and can easily be adapted to all kinds of new challenges of electricity markets. In this work, multi-agent systems are used to either cooperatively or competitively solve problems in distribution and transmission systems. Therefore, conventional algorithms have to be modified to converge into multiple local optima using only small pieces of the entire system information. It is clearly stated that personal

  12. Agent-based modeling of sustainable behaviors

    CERN Document Server

    Sánchez-Maroño, Noelia; Fontenla-Romero, Oscar; Polhill, J; Craig, Tony; Bajo, Javier; Corchado, Juan

    2017-01-01

    Using the O.D.D. (Overview, Design concepts, Detail) protocol, this title explores the role of agent-based modeling in predicting the feasibility of various approaches to sustainability. The chapters incorporated in this volume consist of real case studies to illustrate the utility of agent-based modeling and complexity theory in discovering a path to more efficient and sustainable lifestyles. The topics covered within include: households' attitudes toward recycling, designing decision trees for representing sustainable behaviors, negotiation-based parking allocation, auction-based traffic signal control, and others. This selection of papers will be of interest to social scientists who wish to learn more about agent-based modeling as well as experts in the field of agent-based modeling.

  13. Assurance in Agent-Based Systems

    Energy Technology Data Exchange (ETDEWEB)

    Gilliom, Laura R.; Goldsmith, Steven Y.

    1999-05-10

    Our vision of the future of information systems is one that includes engineered collectives of software agents which are situated in an environment over years and which increasingly improve the performance of the overall system of which they are a part. At a minimum, the movement of agent and multi-agent technology into National Security applications, including their use in information assurance, is apparent today. The use of deliberative, autonomous agents in high-consequence/high-security applications will require a commensurate level of protection and confidence in the predictability of system-level behavior. At Sandia National Laboratories, we have defined and are addressing a research agenda that integrates the surety (safety, security, and reliability) into agent-based systems at a deep level. Surety is addressed at multiple levels: The integrity of individual agents must be protected by addressing potential failure modes and vulnerabilities to malevolent threats. Providing for the surety of the collective requires attention to communications surety issues and mechanisms for identifying and working with trusted collaborators. At the highest level, using agent-based collectives within a large-scale distributed system requires the development of principled design methods to deliver the desired emergent performance or surety characteristics. This position paper will outline the research directions underway at Sandia, will discuss relevant work being performed elsewhere, and will report progress to date toward assurance in agent-based systems.

  14. Assurance in Agent-Based Systems

    International Nuclear Information System (INIS)

    Gilliom, Laura R.; Goldsmith, Steven Y.

    1999-01-01

    Our vision of the future of information systems is one that includes engineered collectives of software agents which are situated in an environment over years and which increasingly improve the performance of the overall system of which they are a part. At a minimum, the movement of agent and multi-agent technology into National Security applications, including their use in information assurance, is apparent today. The use of deliberative, autonomous agents in high-consequence/high-security applications will require a commensurate level of protection and confidence in the predictability of system-level behavior. At Sandia National Laboratories, we have defined and are addressing a research agenda that integrates the surety (safety, security, and reliability) into agent-based systems at a deep level. Surety is addressed at multiple levels: The integrity of individual agents must be protected by addressing potential failure modes and vulnerabilities to malevolent threats. Providing for the surety of the collective requires attention to communications surety issues and mechanisms for identifying and working with trusted collaborators. At the highest level, using agent-based collectives within a large-scale distributed system requires the development of principled design methods to deliver the desired emergent performance or surety characteristics. This position paper will outline the research directions underway at Sandia, will discuss relevant work being performed elsewhere, and will report progress to date toward assurance in agent-based systems

  15. Econophysics of agent-based models

    CERN Document Server

    Aoyama, Hideaki; Chakrabarti, Bikas; Chakraborti, Anirban; Ghosh, Asim

    2014-01-01

    The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations. Most standard economic models assume the existence of the representative agent, who is “perfectly rational” and applies the utility maximization principle when taking action. One reason for this is the desire to keep models mathematically tractable: no tools are available to economists for solving non-linear models of heterogeneous adaptive agents without explicit optimization. In contrast, multi-agent models, which originated from statistical physics considerations, allow us to go beyond the prototype theories of traditional economics involving the representative agent. This book is based on the Econophys-Kolkata VII Workshop, at which many such modelling efforts were presented. In the book, leading researchers in the...

  16. Analysis of composition-based metagenomic classification.

    Science.gov (United States)

    Higashi, Susan; Barreto, André da Motta Salles; Cantão, Maurício Egidio; de Vasconcelos, Ana Tereza Ribeiro

    2012-01-01

An essential step of a metagenomic study is the taxonomic classification, that is, the identification of the taxonomic lineage of the organisms in a given sample. The taxonomic classification process involves a series of decisions. Currently, in the context of metagenomics, such decisions are usually based on empirical studies that consider one specific type of classifier. In this study we propose a general framework for analyzing the impact that several decisions can have on the classification problem. Instead of focusing on any specific classifier, we define a generic score function that provides a measure of the difficulty of the classification task. Using this framework, we analyze the impact of the following parameters on the taxonomic classification problem: (i) the length of n-mers used to encode the metagenomic sequences, (ii) the similarity measure used to compare sequences, and (iii) the type of taxonomic classification, which can be conventional or hierarchical, depending on whether the classification process occurs in a single shot or in several steps according to the taxonomic tree. We defined a score function that measures the degree of separability of the taxonomic classes under a given configuration induced by the parameters above. We conducted an extensive computational experiment and found that reasonable values for the parameters of interest could be (i) intermediate values of n, the length of the n-mers; (ii) any similarity measure, because all of them resulted in similar scores; and (iii) the hierarchical strategy, which performed better in all of the cases. As expected, short n-mers generate lower configuration scores because they give rise to frequency vectors that represent distinct sequences in a similar way. On the other hand, large values for n result in sparse frequency vectors that give different representations to metagenomic fragments that are in fact similar, also leading to low configuration scores. Regarding the similarity measure, in
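The encoding step described above can be sketched in a few lines: each sequence is mapped to a normalized n-mer frequency vector and vectors are compared with a similarity measure. This is an illustrative sketch (cosine similarity over the DNA alphabet, with made-up toy sequences), not the paper's score function or experimental setup.

```python
from collections import Counter
from itertools import product
from math import sqrt

def nmer_vector(seq, n):
    """Normalized n-mer frequency vector over the DNA alphabet."""
    alphabet = [''.join(p) for p in product("ACGT", repeat=n)]
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    total = max(sum(counts.values()), 1)
    return [counts[kmer] / total for kmer in alphabet]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu, nv = sqrt(sum(a * a for a in u)), sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

a = nmer_vector("ACGTACGTACGT", 3)
b = nmer_vector("ACGTACGTACGA", 3)   # one substitution: still close to `a`
c = nmer_vector("GGGGGGCCCCCC", 3)   # very different composition
print(cosine(a, b) > cosine(a, c))
```

As the abstract notes, the choice of n matters: for large n the vectors become sparse and even similar fragments share few n-mers.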

  17. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling’s segregation model and Axelrod’s spatial game. The essence of the foundation part is the network-based agent-based models, in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices to using small-world networks, scale-free networks, etc. The book also shows that modern network science, mainly driven by game theorists and sociophysicists, has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...
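Schelling's segregation model, mentioned above as a classic of the genre, is small enough to sketch in full. The following is an illustrative toy implementation on a toroidal lattice (all parameter values are arbitrary choices), not code from the book.

```python
import random

def schelling(size=20, frac_empty=0.2, threshold=0.3, steps=50, seed=1):
    """Toy Schelling segregation model on a toroidal lattice.
    An agent is unhappy when fewer than `threshold` of its occupied
    neighbour cells hold agents of its own type; unhappy agents move to
    a random empty cell. Returns the mean like-neighbour share before
    and after the simulation."""
    rng = random.Random(seed)
    n_empty = int(size * size * frac_empty)
    n_agents = size * size - n_empty
    cells = [0] * n_empty + [1] * (n_agents // 2) + [2] * (n_agents - n_agents // 2)
    rng.shuffle(cells)
    grid = [cells[i * size:(i + 1) * size] for i in range(size)]

    def neighbours(r, c):
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr or dc:
                    yield grid[(r + dr) % size][(c + dc) % size]

    def like_share(r, c):
        nb = [x for x in neighbours(r, c) if x]
        return sum(x == grid[r][c] for x in nb) / len(nb) if nb else 1.0

    def mean_similarity():
        shares = [like_share(r, c) for r in range(size)
                  for c in range(size) if grid[r][c]]
        return sum(shares) / len(shares)

    before = mean_similarity()
    for _ in range(steps):
        movers = [(r, c) for r in range(size) for c in range(size)
                  if grid[r][c] and like_share(r, c) < threshold]
        empties = [(r, c) for r in range(size) for c in range(size) if not grid[r][c]]
        for r, c in movers:
            if not empties:
                break
            er, ec = empties.pop(rng.randrange(len(empties)))
            grid[er][ec], grid[r][c] = grid[r][c], 0
            empties.append((r, c))
    return before, mean_similarity()

before, after = schelling()
print(round(before, 2), "->", round(after, 2))  # mean like-neighbour share rises
```

Even a mild preference (30% like neighbours) produces emergent segregation well above the random-start baseline, which is the model's famous result.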

  18. Mechanism-based drug exposure classification in pharmacoepidemiological studies

    NARCIS (Netherlands)

    Verdel, B.M.

    2010-01-01

    Mechanism-based classification of drug exposure in pharmacoepidemiological studies In pharmacoepidemiology and pharmacovigilance, the relation between drug exposure and clinical outcomes is crucial. Exposure classification in pharmacoepidemiological studies is traditionally based on

  19. Agent-Based Models in Social Physics

    Science.gov (United States)

    Quang, Le Anh; Jung, Nam; Cho, Eun Sung; Choi, Jae Han; Lee, Jae Woo

    2018-06-01

    We review the agent-based models (ABM) on social physics including econophysics. The ABM consists of agent, system space, and external environment. The agent is autonomous and decides his/her behavior by interacting with the neighbors or the external environment with the rules of behavior. Agents are irrational because they have only limited information when they make decisions. They adapt using learning from past memories. Agents have various attributes and are heterogeneous. ABM is a non-equilibrium complex system that exhibits various emergence phenomena. The social complexity ABM describes human behavioral characteristics. In ABMs of econophysics, we introduce the Sugarscape model and the artificial market models. We review minority games and majority games in ABMs of game theory. Social flow ABM introduces crowding, evacuation, traffic congestion, and pedestrian dynamics. We also review ABM for opinion dynamics and voter model. We discuss features and advantages and disadvantages of Netlogo, Repast, Swarm, and Mason, which are representative platforms for implementing ABM.

  20. SQL based cardiovascular ultrasound image classification.

    Science.gov (United States)

    Nandagopalan, S; Suryanarayana, Adiga B; Sudarshan, T S B; Chandrashekar, Dhanalakshmi; Manjunath, C N

    2013-01-01

This paper proposes a novel method to analyze and classify cardiovascular ultrasound echocardiographic images using a Naïve-Bayesian model via database OLAP-SQL. Efficient data mining algorithms based on a tightly-coupled model are used to extract features. Three algorithms are proposed for classification, namely Naïve-Bayesian Classifier for Discrete variables (NBCD) with SQL, NBCD with OLAP-SQL, and Naïve-Bayesian Classifier for Continuous variables (NBCC) using OLAP-SQL. The proposed model is trained with 207 patient images containing normal and abnormal categories. Of the three proposed algorithms, the highest classification accuracy, 96.59%, was achieved by NBCC, which is better than earlier methods.
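As an illustration of the NBCC idea (naive Bayes over continuous features), a Gaussian naive Bayes classifier can be written in a few dozen lines. This sketch works on plain Python lists with hypothetical toy feature vectors; the paper's actual pipeline runs inside the database via OLAP-SQL.

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian naive Bayes for continuous features."""
    def fit(self, X, y):
        by_class = defaultdict(list)
        for row, label in zip(X, y):
            by_class[label].append(row)
        self.params, self.priors = {}, {}
        for label, rows in by_class.items():
            stats = []
            for col in zip(*rows):           # per-feature mean and variance
                mu = sum(col) / len(col)
                var = sum((v - mu) ** 2 for v in col) / len(col)
                stats.append((mu, max(var, 1e-9)))  # variance floor for stability
            self.params[label] = stats
            self.priors[label] = len(rows) / len(X)
        return self

    def predict(self, row):
        def log_gauss(v, mu, var):
            return -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
        scores = {
            label: math.log(self.priors[label])
                   + sum(log_gauss(v, mu, var) for v, (mu, var) in zip(row, stats))
            for label, stats in self.params.items()
        }
        return max(scores, key=scores.get)

# hypothetical 2-D feature vectors for "normal" vs. "abnormal" images
X = [[1.0, 2.0], [1.2, 1.9], [0.9, 2.1], [5.0, 8.0], [5.2, 7.8], [4.9, 8.1]]
y = ["normal"] * 3 + ["abnormal"] * 3
clf = GaussianNB().fit(X, y)
print(clf.predict([1.1, 2.0]), clf.predict([5.1, 8.0]))  # normal abnormal
```

Working in log space avoids the numerical underflow that multiplying many small likelihoods would cause.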

  1. N-grams Based Supervised Machine Learning Model for Mobile Agent Platform Protection against Unknown Malicious Mobile Agents

    Directory of Open Access Journals (Sweden)

    Pallavi Bagga

    2017-12-01

Full Text Available For many years, the detection of unknown malicious mobile agents before they invade the Mobile Agent Platform has been the subject of much challenging activity. The ever-growing threat of malicious agents calls for techniques for automated malicious agent detection. In this context, machine learning (ML) methods are acknowledged to be more effective than signature-based and behavior-based detection methods. Therefore, in this paper, the prime contribution is the detection of unknown malicious mobile agents based on n-gram features and a supervised ML approach, which has not been done so far in the sphere of Mobile Agent System (MAS) security. To carry out the study, n-grams ranging from 3 to 9 are extracted from a dataset containing 40 malicious and 40 non-malicious mobile agents. Subsequently, the classification is performed using different classifiers. A nested 5-fold cross-validation scheme is employed to avoid bias in the selection of the classifiers' optimal parameters. The observations from extensive experiments demonstrate that the work done in this paper is suitable for the task of unknown malicious mobile agent detection in a Mobile Agent Environment, and also adds ML to the list of interests of researchers dealing with MAS security.
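The feature-extraction and classification steps described above can be sketched as follows. The sample byte strings and the choice of a Jaccard-similarity k-NN classifier are illustrative assumptions; the paper evaluates several classifiers under nested cross-validation.

```python
from collections import Counter

def byte_ngrams(data, n=3):
    """Set of byte n-grams occurring in an agent's code/binary."""
    return {data[i:i + n] for i in range(len(data) - n + 1)}

def jaccard(a, b):
    return len(a & b) / len(a | b) if (a | b) else 0.0

def knn_classify(sample, train, k=3):
    """Majority vote among the k training agents most similar to `sample`."""
    ranked = sorted(train, key=lambda item: jaccard(sample, item[0]), reverse=True)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# hypothetical byte strings standing in for mobile-agent payloads
malicious = [b"del *.*; exfiltrate(); spawn()", b"exfiltrate(); del logs; spawn()"]
benign = [b"route(); report(status); sleep()", b"report(status); route(); sleep()"]
train = ([(byte_ngrams(m), "malicious") for m in malicious]
         + [(byte_ngrams(b), "benign") for b in benign])
print(knn_classify(byte_ngrams(b"spawn(); exfiltrate(); del tmp"), train))
```

A real pipeline would additionally tune n and the classifier's hyperparameters inside the inner folds of the nested cross-validation, exactly to avoid the selection bias the abstract mentions.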

  2. Agent-Based Computing: Promise and Perils

    OpenAIRE

    Jennings, N. R.

    1999-01-01

Agent-based computing represents an exciting new synthesis both for Artificial Intelligence (AI) and, more generally, Computer Science. It has the potential to significantly improve the theory and practice of modelling, designing and implementing complex systems. Yet, to date, there has been little systematic analysis of what makes an agent such an appealing and powerful conceptual model. Moreover, even less effort has been devoted to exploring the inherent disadvantages that stem from adoptin...

  3. Agent Based Modeling Applications for Geosciences

    Science.gov (United States)

    Stein, J. S.

    2004-12-01

Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently it has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Some of the challenges associated with this modeling method include significant computational requirements for keeping track of thousands to millions of agents, a lack of methods and strategies for model validation, and the absence of a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications. A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in

  4. Research on Classification of Chinese Text Data Based on SVM

    Science.gov (United States)

    Lin, Yuan; Yu, Hongzhi; Wan, Fucheng; Xu, Tao

    2017-09-01

Data mining has important application value in today’s industry and academia. Text classification is a very important technology in data mining. At present, there are many mature algorithms for text classification. KNN, NB, AB, SVM, decision trees and other classification methods all show good classification performance. The Support Vector Machine (SVM) classification method is a good classifier in machine learning research. This paper studies the classification effect of the SVM method on Chinese text data, using the support vector machine method to classify Chinese text and aiming to combine academic research with practical application.
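The core of linear SVM training can be illustrated with the Pegasos stochastic sub-gradient method. This is a from-scratch sketch on toy numeric vectors, not the paper's setup: classifying real Chinese text would additionally require encoding each document as a feature vector (e.g., term frequencies), which is outside this snippet.

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Linear SVM trained with the Pegasos stochastic sub-gradient method.
    Labels must be -1 or +1; a constant bias feature is appended."""
    rng = random.Random(seed)
    data = [(row + [1.0], label) for row, label in zip(X, y)]
    w = [0.0] * (len(X[0]) + 1)
    t = 0
    for _ in range(epochs):
        rng.shuffle(data)
        for xs, label in data:
            t += 1
            eta = 1.0 / (lam * t)                       # decaying step size
            margin = label * sum(wi * xi for wi, xi in zip(w, xs))
            w = [wi * (1 - eta * lam) for wi in w]      # L2 shrinkage
            if margin < 1:                              # hinge-loss sub-gradient
                w = [wi + eta * label * xi for wi, xi in zip(w, xs)]
    return w

def predict(w, row):
    return 1 if sum(wi * xi for wi, xi in zip(w, row + [1.0])) >= 0 else -1

# toy 2-D "document vectors" for two linearly separable classes
X = [[2.0, 1.0], [2.5, 0.5], [3.0, 1.5], [-1.0, -2.0], [-2.0, -1.5], [-1.5, -2.5]]
y = [1, 1, 1, -1, -1, -1]
w = train_linear_svm(X, y)
print([predict(w, row) for row in X])
```

In practice one would use an established SVM library rather than this sketch; the point is only that the objective being minimized is the hinge loss plus an L2 regularizer.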

  5. Contextual segment-based classification of airborne laser scanner data

    NARCIS (Netherlands)

    Vosselman, George; Coenen, Maximilian; Rottensteiner, Franz

    2017-01-01

    Classification of point clouds is needed as a first step in the extraction of various types of geo-information from point clouds. We present a new approach to contextual classification of segmented airborne laser scanning data. Potential advantages of segment-based classification are easily offset

  6. Agent-based simulation in entrepreneurship research

    NARCIS (Netherlands)

    Yang, S.-J.S.; Chandra, Y.

    2009-01-01

    Agent-based modeling (ABM) has wide applications in natural and social sciences yet it has not been widely applied in entrepreneurship research. We discuss the nature of ABM, its position among conventional methodologies and then offer a roadmap for developing, testing and extending theories of

  7. Agent-based modelling of cholera diffusion

    NARCIS (Netherlands)

    Augustijn-Beckers, Petronella; Doldersum, Tom; Useya, Juliana; Augustijn, Dionysius C.M.

    2016-01-01

This paper introduces a spatially explicit agent-based simulation model for micro-scale cholera diffusion. The model simulates both an environmental reservoir of naturally occurring V. cholerae bacteria and hyperinfectious V. cholerae. The objective of the research is to test if runoff from open refuse

  8. Agent-based simulation of animal behaviour

    NARCIS (Netherlands)

    C.M. Jonker (Catholijn); J. Treur

    1998-01-01

In this paper it is shown how animal behaviour can be simulated in an agent-based manner. Different models are shown for different types of behaviour, varying from purely reactive behaviour to pro-active, social and adaptive behaviour. The compositional development method for

  9. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  10. Patterns of Use of an Agent-Based Model and a System Dynamics Model: The Application of Patterns of Use and the Impacts on Learning Outcomes

    Science.gov (United States)

    Thompson, Kate; Reimann, Peter

    2010-01-01

    A classification system that was developed for the use of agent-based models was applied to strategies used by school-aged students to interrogate an agent-based model and a system dynamics model. These were compared, and relationships between learning outcomes and the strategies used were also analysed. It was found that the classification system…

  11. FIPA agent based network distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    D. Abbott; V. Gyurjyan; G. Heyes; E. Jastrzembski; C. Timmer; E. Wolin

    2003-03-01

A control system with the capabilities to combine heterogeneous control systems or processes into a uniform homogeneous environment is discussed. This dynamically extensible system is an example of the software system at the agent level of abstraction. This level of abstraction considers agents as atomic entities that communicate to implement the functionality of the control system. Agents' engineering aspects are addressed by adopting the domain-independent software standard formulated by FIPA. Jade core Java classes are used as a FIPA specification implementation. A special, lightweight, XML/RDFS-based, control-oriented ontology markup language is developed to standardize the description of an arbitrary control system data processor. Control processes, described in this language, are integrated into the global system at runtime, without actual programming. Fault tolerance and recovery issues are also addressed.

  12. FIPA agent based network distributed control system

    International Nuclear Information System (INIS)

    Abbott, D.; Gyurjyan, V.; Heyes, G.; Jastrzembski, E.; Timmer, C.; Wolin, E.

    2003-01-01

A control system with the capabilities to combine heterogeneous control systems or processes into a uniform homogeneous environment is discussed. This dynamically extensible system is an example of the software system at the agent level of abstraction. This level of abstraction considers agents as atomic entities that communicate to implement the functionality of the control system. Agents' engineering aspects are addressed by adopting the domain-independent software standard formulated by FIPA. Jade core Java classes are used as a FIPA specification implementation. A special, lightweight, XML/RDFS-based, control-oriented ontology markup language is developed to standardize the description of an arbitrary control system data processor. Control processes, described in this language, are integrated into the global system at runtime, without actual programming. Fault tolerance and recovery issues are also addressed

  13. Density Based Support Vector Machines for Classification

    OpenAIRE

    Zahra Nazari; Dongshik Kang

    2015-01-01

Support Vector Machines (SVM) is the most successful algorithm for classification problems. SVM learns the decision boundary from two classes (for binary classification) of training points. However, sometimes there are some less meaningful samples amongst the training points, which are corrupted by noise or misplaced on the wrong side; these are called outliers. These outliers affect the margin and classification performance, and the machine should discard them. SVM as a popular and widely used cl...
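The outlier-discarding idea can be illustrated with a simple density score: points whose distances to their k nearest neighbours are large are considered sparse and dropped before training. This is a generic sketch with made-up data, not the specific density weighting proposed in the paper.

```python
from math import dist

def density_scores(points, k=2):
    """Inverse mean distance to the k nearest neighbours (a density proxy)."""
    scores = []
    for i, p in enumerate(points):
        ds = sorted(dist(p, q) for j, q in enumerate(points) if j != i)
        scores.append(1.0 / (sum(ds[:k]) / k + 1e-9))
    return scores

def drop_outliers(points, keep_frac=0.8, k=2):
    """Keep only the densest `keep_frac` of the points (e.g. applied per
    class, before handing the points to an SVM trainer)."""
    scores = density_scores(points, k)
    ranked = sorted(range(len(points)), key=lambda i: scores[i], reverse=True)
    keep = set(ranked[: max(1, int(len(points) * keep_frac))])
    return [p for i, p in enumerate(points) if i in keep]

cluster = [(0.0, 0.0), (0.1, 0.1), (0.0, 0.2), (0.2, 0.0), (5.0, 5.0)]
print(drop_outliers(cluster))  # the far-away (5.0, 5.0) point is discarded
```

Removing such points before training keeps them from pulling the SVM margin toward mislabeled or noisy regions.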

  14. Fuzzy Constraint-Based Agent Negotiation

    Institute of Scientific and Technical Information of China (English)

    Menq-Wen Lin; K. Robert Lai; Ting-Jung Yu

    2005-01-01

Conflicts between two or more parties arise for various reasons and perspectives. Thus, resolution of conflicts frequently relies on some form of negotiation. This paper presents a general problem-solving framework for modeling multi-issue multilateral negotiation using fuzzy constraints. Agent negotiation is formulated as a distributed fuzzy constraint satisfaction problem (DFCSP). Fuzzy constraints are thus used to naturally represent each agent's desires involving imprecision and human conceptualization, particularly when lexical imprecision and subjective matters are concerned. On the other hand, based on fuzzy constraint-based problem-solving, our approach enables an agent not only to systematically relax fuzzy constraints to generate a proposal, but also to employ fuzzy similarity to select the alternative that is subject to its acceptability by the opponents. The task of this problem-solving is to reach an agreement that benefits all agents with a high satisfaction degree of fuzzy constraints, and to move toward the deal more quickly, since their search focuses only on the feasible solution space. An application to multilateral negotiation of travel planning is provided to demonstrate the usefulness and effectiveness of our framework.

  15. Classification of research reactors and discussion of thinking of safety regulation based on the classification

    International Nuclear Information System (INIS)

    Song Chenxiu; Zhu Lixin

    2013-01-01

Research reactors have different characteristics in terms of reactor type, use, power level, design principle, operation model and safety performance, and they also differ significantly with respect to nuclear safety regulation. This paper introduces a classification of research reactors and discusses approaches to safety regulation based on that classification. (authors)

  16. Sentiment classification technology based on Markov logic networks

    Science.gov (United States)

    He, Hui; Li, Zhigang; Yao, Chongchong; Zhang, Weizhe

    2016-07-01

With diverse online media emerging, there is a growing concern with the sentiment classification problem. At present, text sentiment classification mainly utilizes supervised machine learning methods, which feature a certain domain dependency. On the basis of Markov logic networks (MLNs), this study proposed a cross-domain multi-task text sentiment classification method rooted in transfer learning. Through many-to-one knowledge transfer, labeled text sentiment classification knowledge was successfully transferred into other domains, and the precision of the sentiment classification analysis in the text tendency domain was improved. The experimental results revealed the following: (1) the model based on an MLN demonstrated higher precision than the single individual learning plan model. (2) Multi-task transfer learning based on Markov logic networks could acquire more knowledge than self-domain learning. The cross-domain text sentiment classification model could significantly improve the precision and efficiency of text sentiment classification.

  17. Agent-Based Data Integration Framework

    Directory of Open Access Journals (Sweden)

    Łukasz Faber

    2014-01-01

Full Text Available Combining data from diverse, heterogeneous sources while facilitating unified access to it is an important (albeit difficult) task. There are various ways of performing it. In this publication, we propose and describe an agent-based framework dedicated to acquiring and processing distributed, heterogeneous data collected from diverse sources (e.g., the Internet, external software, relational, and document databases). Applying this multi-agent approach to the general architecture (the organization and management of the framework), we create a proof-of-concept implementation. The approach is presented using a sample scenario in which the system is used to search for personal and professional profiles of scientists.

  18. Dissimilarity-based classification of anatomical tree structures

    DEFF Research Database (Denmark)

    Sørensen, Lauge; Lo, Pechin Chien Pau; Dirksen, Asger

    2011-01-01

    A novel method for classification of abnormality in anatomical tree structures is presented. A tree is classified based on direct comparisons with other trees in a dissimilarity-based classification scheme. The pair-wise dissimilarity measure between two trees is based on a linear assignment betw...
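A pair-wise tree dissimilarity based on a linear assignment between node feature vectors can be sketched by brute force for small trees; a real implementation would use a polynomial-time assignment solver (e.g., the Hungarian algorithm). The node features and dummy cost below are hypothetical stand-ins, not the paper's measure.

```python
from itertools import permutations
from math import dist

def assignment_dissimilarity(tree_a, tree_b, dummy_cost=1.0):
    """Dissimilarity between two trees as the minimum-cost assignment
    between their node feature vectors. Brute force over permutations,
    so only suitable for small trees; each unmatched node of the larger
    tree contributes `dummy_cost`."""
    small, large = sorted((tree_a, tree_b), key=len)
    extra = (len(large) - len(small)) * dummy_cost
    best = float("inf")
    for perm in permutations(range(len(large)), len(small)):
        cost = sum(dist(small[i], large[j]) for i, j in enumerate(perm))
        best = min(best, cost + extra)
    return best

# hypothetical trees: one (branch length, radius) pair per node
healthy = [(1.0, 0.5), (0.8, 0.3), (0.6, 0.2)]
similar = [(1.1, 0.5), (0.8, 0.35), (0.5, 0.2)]
abnormal = [(2.5, 1.5), (2.0, 1.2)]
print(assignment_dissimilarity(healthy, similar) <
      assignment_dissimilarity(healthy, abnormal))
```

Given such a pair-wise dissimilarity matrix, a new tree can then be classified by, for example, nearest-neighbour voting among labeled trees, which is the spirit of dissimilarity-based classification.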

  19. Dissimilarity-based classification of anatomical tree structures

    DEFF Research Database (Denmark)

    Sørensen, Lauge Emil Borch Laurs; Lo, Pechin Chien Pau; Dirksen, Asger

    2011-01-01

    A novel method for classification of abnormality in anatomical tree structures is presented. A tree is classified based on direct comparisons with other trees in a dissimilarity-based classification scheme. The pair-wise dissimilarity measure between two trees is based on a linear assignment...

  20. Agent Based Modelling for Social Simulation

    OpenAIRE

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course of this project two workshops were organized. At these workshops, a wide range of experts, both ABM experts and domain experts, worked on several potential applications of ABM. The results and ins...

  1. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.

  2. Improving Agent Based Modeling of Critical Incidents

    Directory of Open Access Journals (Sweden)

    Robert Till

    2010-04-01

    Full Text Available Agent Based Modeling (ABM is a powerful method that has been used to simulate potential critical incidents in the infrastructure and built environments. This paper will discuss the modeling of some critical incidents currently simulated using ABM and how they may be expanded and improved by using better physiological modeling, psychological modeling, modeling the actions of interveners, introducing Geographic Information Systems (GIS and open source models.

  3. CATS-based Air Traffic Controller Agents

    Science.gov (United States)

    Callantine, Todd J.

    2002-01-01

This report describes intelligent agents that function as air traffic controllers. Each agent controls traffic in a single sector in real time; agents controlling traffic in adjoining sectors can coordinate to manage an arrival flow across a given meter fix. The purpose of this research is threefold. First, it seeks to study the design of agents for controlling complex systems. In particular, it investigates agent planning and reactive control functionality in a dynamic environment in which a variety of perceptual and decision-making skills play a central role. It examines how heuristic rules can be applied to model planning and decision-making skills, rather than attempting to apply optimization methods. Thus, the research attempts to develop intelligent agents that provide an approximation of human air traffic controller behavior that, while not based on an explicit cognitive model, does produce task performance consistent with the way human air traffic controllers operate. Second, this research sought to extend previous research on using the Crew Activity Tracking System (CATS) as the basis for intelligent agents. The agents use a high-level model of air traffic controller activities to structure the control task. To execute an activity in the CATS model, according to the current task context, the agents reference a 'skill library' and 'control rules' that in turn execute the pattern recognition, planning, and decision-making required to perform the activity. Applying the skills enables the agents to modify their representation of the current control situation (i.e., the 'flick' or 'picture'). The updated representation supports the next activity in a cycle of action that, taken as a whole, simulates air traffic controller behavior. A third, practical motivation for this research is to use intelligent agents to support the evaluation of new air traffic control (ATC) methods for new Air Traffic Management (ATM) concepts. Current approaches that use large, human

  4. Risk-based classification system of nanomaterials

    International Nuclear Information System (INIS)

    Tervonen, Tommi; Linkov, Igor; Figueira, Jose Rui; Steevens, Jeffery; Chappell, Mark; Merad, Myriam

    2009-01-01

    Various stakeholders are increasingly interested in the potential toxicity and other risks associated with nanomaterials throughout the different stages of a product's life cycle (e.g., development, production, use, disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material due to variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as to promote the safe handling and use of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different ecological risk categories based on our current knowledge of nanomaterial physico-chemical characteristics, variation in produced material, and best professional judgments. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.
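    The Monte Carlo robustness idea behind SMAA can be sketched in a few lines. The example below is an illustrative rank-acceptability computation with hypothetical criteria ranges and uniformly sampled weights; it is a simplified flavour of SMAA, not the actual SMAA-TRI sorting procedure, and all names are invented for illustration:

```python
import random

def rank_acceptability(alternatives, n_draws=2000, seed=1):
    # alternatives: dict name -> list of (lo, hi) criterion ranges.
    # Each draw samples criterion values and normalized random weights,
    # scores alternatives additively, and tallies the resulting ranks.
    rng = random.Random(seed)
    names = list(alternatives)
    n_criteria = len(next(iter(alternatives.values())))
    counts = {a: [0] * len(names) for a in names}
    for _ in range(n_draws):
        w = [rng.random() for _ in range(n_criteria)]
        total = sum(w)
        w = [wi / total for wi in w]
        scores = {a: sum(wi * rng.uniform(lo, hi)
                         for wi, (lo, hi) in zip(w, ranges))
                  for a, ranges in alternatives.items()}
        for rank, a in enumerate(sorted(names, key=scores.get, reverse=True)):
            counts[a][rank] += 1
    # Fraction of draws in which each alternative attains each rank.
    return {a: [c / n_draws for c in counts[a]] for a in counts}
```

    An alternative whose rank-acceptability profile is stable across the sampled weights and measurements can be grouped with confidence; a profile spread across many ranks signals that the grouping is sensitive to the uncertainty.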

  5. Structure-Based Algorithms for Microvessel Classification

    KAUST Repository

    Smith, Amy F.

    2015-02-01

    © 2014 The Authors. Microcirculation published by John Wiley & Sons Ltd. Objective: Recent developments in high-resolution imaging techniques have enabled digital reconstruction of three-dimensional sections of microvascular networks down to the capillary scale. To better interpret these large data sets, our goal is to distinguish branching trees of arterioles and venules from capillaries. Methods: Two novel algorithms are presented for classifying vessels in microvascular anatomical data sets without requiring flow information. The algorithms are compared with a classification based on observed flow directions (considered the gold standard), and with an existing resistance-based method that relies only on structural data. Results: The first algorithm, developed for networks with one arteriolar and one venular tree, performs well in identifying arterioles and venules and is robust to parameter changes, but incorrectly labels a significant number of capillaries as arterioles or venules. The second algorithm, developed for networks with multiple inlets and outlets, correctly identifies more arterioles and venules, but is more sensitive to parameter changes. Conclusions: The algorithms presented here can be used to classify microvessels in large microvascular data sets lacking flow information. This provides a basis for analyzing the distinct geometrical properties and modelling the functional behavior of arterioles, capillaries, and venules.

  6. Risk-based classification system of nanomaterials

    Energy Technology Data Exchange (ETDEWEB)

    Tervonen, Tommi, E-mail: t.p.tervonen@rug.n [University of Groningen, Faculty of Economics and Business (Netherlands); Linkov, Igor, E-mail: igor.linkov@usace.army.mi [US Army Research and Development Center (United States); Figueira, Jose Rui, E-mail: figueira@ist.utl.p [Technical University of Lisbon, CEG-IST, Centre for Management Studies, Instituto Superior Tecnico (Portugal); Steevens, Jeffery, E-mail: jeffery.a.steevens@usace.army.mil; Chappell, Mark, E-mail: mark.a.chappell@usace.army.mi [US Army Research and Development Center (United States); Merad, Myriam, E-mail: myriam.merad@ineris.f [INERIS BP 2, Societal Management of Risks Unit/Accidental Risks Division (France)

    2009-05-15

    Various stakeholders are increasingly interested in the potential toxicity and other risks associated with nanomaterials throughout the different stages of a product's life cycle (e.g., development, production, use, disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material due to variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as to promote the safe handling and use of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different ecological risk categories based on our current knowledge of nanomaterial physico-chemical characteristics, variation in produced material, and best professional judgments. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.

  7. KNN BASED CLASSIFICATION OF DIGITAL MODULATED SIGNALS

    Directory of Open Access Journals (Sweden)

    Sajjad Ahmed Ghauri

    2016-11-01

    Full Text Available Demodulation without knowledge of the modulation scheme requires Automatic Modulation Classification (AMC). When the receiver has limited information about the received signal, AMC becomes an essential process. AMC plays an important role in many civil and military fields, such as modern electronic warfare, interfering-source recognition, frequency management, and link adaptation. In this paper we explore the use of the K-nearest neighbor (KNN) classifier for modulation classification with different distance measures. Five modulation schemes are used for classification: Binary Phase Shift Keying (BPSK), Quadrature Phase Shift Keying (QPSK), Quadrature Amplitude Modulation (QAM), 16-QAM and 64-QAM. Higher-order cumulants (HOC) are used as the input feature set to the classifier. Simulation results show that the proposed classification method provides better results for the considered modulation formats.
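    As a toy illustration of this pipeline (not the authors' implementation), the sketch below estimates fourth-order cumulant features of a complex symbol stream and classifies with plain Euclidean k-NN. The cumulant formulas assume zero-mean, roughly unit-power constellations, and the constellations and noise model are simplified assumptions:

```python
import math
import random

def cumulant_features(samples):
    # C20 = E[x^2]; C40 = E[x^4] - 3*E[x^2]^2 (zero-mean case).
    # |C40| separates e.g. BPSK (|C40| ~ 2) from QPSK (|C40| ~ 1).
    n = len(samples)
    c20 = sum(s * s for s in samples) / n
    m40 = sum(s ** 4 for s in samples) / n
    c40 = m40 - 3 * c20 * c20
    return (abs(c20), abs(c40))

def knn_classify(feat, training, k=3):
    # training: list of (feature_tuple, label); majority vote of the
    # k nearest neighbours under Euclidean distance.
    nearest = sorted(training, key=lambda t: math.dist(feat, t[0]))[:k]
    votes = {}
    for _, label in nearest:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

def noisy_burst(constellation, n, sigma, rng):
    # n random symbols with additive complex Gaussian noise.
    return [rng.choice(constellation)
            + complex(rng.gauss(0, sigma), rng.gauss(0, sigma))
            for _ in range(n)]
```

    Training examples are obtained by computing `cumulant_features` over many noisy bursts of each known modulation, after which an unknown burst is reduced to the same features and labelled by `knn_classify`.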

  8. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    Science.gov (United States)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

    Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.

  9. Integrating Globality and Locality for Robust Representation Based Classification

    Directory of Open Access Journals (Sweden)

    Zheng Zhang

    2014-01-01

    Full Text Available The representation based classification method (RBCM) has shown great potential for face recognition since it first emerged. The linear regression classification (LRC) method and the collaborative representation classification (CRC) method are two well-known RBCMs. LRC and CRC represent the testing sample using the training samples of each class and all the training samples, respectively, and then classify on the basis of the representation residual. LRC can be viewed as a "locality representation" method because it uses only the training samples of each class to represent the testing sample, so it cannot exploit the effectiveness of "globality representation." Conversely, CRC does not enjoy the locality benefit of the general RBCM. We therefore propose to integrate CRC and LRC to perform more robust representation based classification. Experimental results on benchmark face databases demonstrate that the proposed method achieves high classification accuracy.
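    The residual-based decision rule common to these methods can be sketched in its simplest form. The example below assumes a single training vector per class, so the per-class least-squares representation reduces to projection onto one vector; full LRC solves a least-squares problem over all of a class's samples, and CRC over all samples jointly:

```python
import math

def residual(y, x):
    # Least-squares residual of y after projecting it onto the span
    # of a single class-specific training vector x (single-sample LRC).
    coef = sum(a * b for a, b in zip(y, x)) / sum(b * b for b in x)
    return math.sqrt(sum((a - coef * b) ** 2 for a, b in zip(y, x)))

def lrc_classify(y, class_vectors):
    # Assign y to the class whose representation residual is smallest.
    return min(class_vectors, key=lambda c: residual(y, class_vectors[c]))
```

    The same "smallest residual wins" rule carries over unchanged when the per-class representation is computed from many samples; only the way the residual is obtained changes.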

  10. Multiscale agent-based cancer modeling.

    Science.gov (United States)

    Zhang, Le; Wang, Zhihui; Sagotsky, Jonathan A; Deisboeck, Thomas S

    2009-04-01

    Agent-based modeling (ABM) is an in silico technique that is being used in a variety of research areas such as in social sciences, economics and increasingly in biomedicine as an interdisciplinary tool to study the dynamics of complex systems. Here, we describe its applicability to integrative tumor biology research by introducing a multi-scale tumor modeling platform that understands brain cancer as a complex dynamic biosystem. We summarize significant findings of this work, and discuss both challenges and future directions for ABM in the field of cancer research.

  11. Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.

    Science.gov (United States)

    Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel

    2017-06-01

    Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.

  12. Preliminary Research on Grassland Fine-classification Based on MODIS

    International Nuclear Information System (INIS)

    Hu, Z W; Zhang, S; Yu, X Y; Wang, X S

    2014-01-01

    Grassland ecosystems are important for climate regulation and for soil and water conservation. Research on grassland monitoring methods can provide an effective reference for grassland resource investigation. In this study, we used the vegetation index method for grassland classification. Because several climate types occur in China, we used China's Main Climate Zone Maps to divide the study region into four climate zones. Based on the grassland classification system of the first nation-wide grass resource survey in China, we established a new grassland classification system suited to this research. We used MODIS images as the basic data source and applied an expert classifier to perform the grassland classification. Based on the 1:1,000,000 Grassland Resource Map of China, we obtained the basic distribution of all grassland types, selected 20 samples evenly distributed in each type, and used the NDVI/EVI products to summarize the spectral features of the different grassland types. Finally, we introduced auxiliary classification data such as elevation, accumulated temperature (AT), humidity index (HI) and rainfall. China's nation-wide grassland classification map results from merging the grassland classifications of the different climate zones. The overall classification accuracy is 60.4%. The results indicate that an expert classifier is suitable for nation-wide grassland classification, but the classification accuracy needs to be improved

  13. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An agent-based simulation model programmed in Objective Borland Pascal. Program and source code are downloadable.

  14. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Full Text Available Agents can play a key role in bringing suitable cloud services to the customer based on their requirements. In agent-based cloud computing, an agent negotiates, coordinates, cooperates and collaborates on behalf of the customer to make decisions efficiently. However, agent-based cloud computing has some security issues: (a) addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service through flooding attacks on other involved agents; (c) misuse of exceptions in the agent interaction protocol, such as the Not-Understood and Cancel_Meta protocols, which may terminate the connections of all the other agents participating in the negotiated services. This paper proposes algorithms to solve these issues and to ensure that no malicious activity intervenes during agent interaction.

  15. Java-based mobile agent platforms for wireless sensor networks

    NARCIS (Netherlands)

    Aiello, F.; Carbone, A.; Fortino, G.; Galzarano, S.; Ganzha, M.; Paprzycki, M.

    2010-01-01

    This paper proposes an overview and comparison of mobile agent platforms for the development of wireless sensor network applications. In particular, the architecture, programming model and basic performance of two Java-based agent platforms, Mobile Agent Platform for Sun SPOT (MAPS) and Agent

  16. A proposed data base system for detection, classification and ...

    African Journals Online (AJOL)

    A proposed data base system for detection, classification and location of fault on electricity company of Ghana electrical distribution system. Isaac Owusu-Nyarko, Mensah-Ananoo Eugine. Abstract. No Abstract. Keywords: database, classification of fault, power, distribution system, SCADA, ECG. Full Text: EMAIL FULL TEXT ...

  17. AN OBJECT-BASED METHOD FOR CHINESE LANDFORM TYPES CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Ding

    2016-06-01

    Full Text Available Landform classification is a necessary task for various fields of landscape and regional planning, for example landscape evaluation, erosion studies, and hazard prediction. This study proposes an improved object-based classification for Chinese landform types using the factor-importance analysis of random forests and the gray-level co-occurrence matrix (GLCM). Based on a 1 km DEM of China, the combination of terrain factors extracted from the DEM is selected by correlation analysis and Sheffield's entropy method. A random forest classification tree is applied to evaluate the importance of the terrain factors, which are used as multi-scale segmentation thresholds. The GLCM is then computed for the classification knowledge base. The classification result was checked against the 1:4,000,000 Chinese Geomorphological Map as reference. The overall classification accuracy of the proposed method is 5.7% higher than ISODATA unsupervised classification, and 15.7% higher than the traditional object-based classification method.

  18. Agent-based modelling in synthetic biology.

    Science.gov (United States)

    Gorochowski, Thomas E

    2016-11-30

    Biological systems exhibit complex behaviours that emerge at many different levels of organization. These span the regulation of gene expression within single cells to the use of quorum sensing to co-ordinate the action of entire bacterial colonies. Synthetic biology aims to make the engineering of biology easier, offering an opportunity to control natural systems and develop new synthetic systems with useful prescribed behaviours. However, in many cases, it is not understood how individual cells should be programmed to ensure the emergence of a required collective behaviour. Agent-based modelling aims to tackle this problem, offering a framework in which to simulate such systems and explore cellular design rules. In this article, I review the use of agent-based models in synthetic biology, outline the available computational tools, and provide details on recently engineered biological systems that are amenable to this approach. I further highlight the challenges facing this methodology and some of the potential future directions. © 2016 The Author(s).

  19. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. (Figures: a simulation using California GIS data; a simulation of high school student lunch popularity using an aerial photograph on top of a terrain value map.)

  20. A Secure Protocol Based on a Sedentary Agent for Mobile Agent Environments

    OpenAIRE

    Abdelmorhit E. Rhazi; Samuel Pierre; Hanifa Boucheneb

    2007-01-01

    The main challenge when deploying mobile agent environments pertains to security issues concerning mobile agents and their executive platform. This paper proposes a secure protocol which protects mobile agents against attacks from malicious hosts in these environments. Protection is based on the perfect cooperation of a sedentary agent running inside a trusted third host. Results show that the protocol detects several attacks, such as denial of service, incorrect execution and re-execution of...

  1. Agent-based models of financial markets

    Energy Technology Data Exchange (ETDEWEB)

    Samanidou, E [Department of Economics, University of Kiel, Olshausenstrasse 40, D-24118 Kiel (Germany); Zschischang, E [HSH Nord Bank, Portfolio Mngmt. and Inv., Martensdamm 6, D-24103 Kiel (Germany); Stauffer, D [Institute for Theoretical Physics, Cologne University, D-50923 Koeln (Germany); Lux, T [Department of Economics, University of Kiel, Olshausenstrasse 40, D-24118 Kiel (Germany)

    2007-03-15

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we

  2. Agent-based models of financial markets

    Science.gov (United States)

    Samanidou, E.; Zschischang, E.; Stauffer, D.; Lux, T.

    2007-03-01

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont

  3. Agent-based models of financial markets

    International Nuclear Information System (INIS)

    Samanidou, E; Zschischang, E; Stauffer, D; Lux, T

    2007-01-01

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont

  4. Agent-based modeling in ecological economics.

    Science.gov (United States)

    Heckbert, Scott; Baynes, Tim; Reeson, Andrew

    2010-01-01

    Interconnected social and environmental systems are the domain of ecological economics, and models can be used to explore feedbacks and adaptations inherent in these systems. Agent-based modeling (ABM) represents autonomous entities, each with dynamic behavior and heterogeneous characteristics. Agents interact with each other and their environment, resulting in emergent outcomes at the macroscale that can be used to quantitatively analyze complex systems. ABM is contributing to research questions in ecological economics in the areas of natural resource management and land-use change, urban systems modeling, market dynamics, changes in consumer attitudes, innovation, and diffusion of technology and management practices, commons dilemmas and self-governance, and psychological aspects to human decision making and behavior change. Frontiers for ABM research in ecological economics involve advancing the empirical calibration and validation of models through mixed methods, including surveys, interviews, participatory modeling, and, notably, experimental economics to test specific decision-making hypotheses. Linking ABM with other modeling techniques at the level of emergent properties will further advance efforts to understand dynamics of social-environmental systems.

  5. Agent Based Model of Livestock Movements

    Science.gov (United States)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent based model for forecasting livestock movements is presented. It models livestock movements from farm to farm through a saleyard. A farmer's decision to sell or buy cattle is often complex and involves many factors, such as the climate forecast, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors: the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model, farms choose either to buy, sell or abstain from the market, thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard, where livestock are auctioned using a second-price sealed-bid auction. The price time series output by the model exhibits properties similar to those found in real livestock markets.
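    The saleyard mechanism described above, a second-price sealed-bid (Vickrey) auction, takes only a few lines; the bidder names in the usage example are illustrative:

```python
def second_price_auction(bids):
    # bids: dict mapping bidder -> sealed bid. The highest bidder
    # wins, but pays the second-highest bid (Vickrey auction).
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price
```

    For example, `second_price_auction({"farm_a": 120, "farm_b": 110, "farm_c": 90})` awards the lot to `farm_a` at a price of 110. A useful property of this design choice is that truthful bidding is a dominant strategy for the buyer agents.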

  6. A Classification-based Review Recommender

    Science.gov (United States)

    O'Mahony, Michael P.; Smyth, Barry

    Many online stores encourage their users to submit product/service reviews in order to guide future purchasing decisions. These reviews are often listed alongside product recommendations but, to date, limited attention has been paid to how best to present these reviews to the end-user. In this paper, we describe a supervised classification approach designed to identify and recommend the most helpful product reviews. Using the TripAdvisor service as a case study, we compare the performance of several classification techniques using a range of features derived from hotel reviews. We then describe how these classifiers can be used as the basis for a practical recommender that automatically suggests the most helpful contrasting reviews to end-users. We present an empirical evaluation which shows that our approach achieves a statistically significant improvement over alternative review ranking schemes.
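    One simple way to realize the "contrasting reviews" idea is sketched below: score each review with a trained helpfulness classifier and surface the top-scoring review on each side of the rating scale. This is an illustrative shape for such a recommender, not the paper's actual classifier, and the rating thresholds are assumptions:

```python
def contrasting_reviews(reviews, helpfulness, positive_min=4, negative_max=2):
    # reviews: list of (text, star_rating) pairs; helpfulness: a
    # callable returning a classifier score for a review. Returns the
    # highest-scoring positive review and the highest-scoring
    # negative review, or None where no review qualifies.
    ranked = sorted(reviews, key=helpfulness, reverse=True)
    pos = next((r for r in ranked if r[1] >= positive_min), None)
    neg = next((r for r in ranked if r[1] <= negative_max), None)
    return pos, neg
```

    In practice `helpfulness` would be the trained classifier's confidence; here any scoring function with the same signature can be plugged in.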

  7. Text document classification based on mixture models

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana; Malík, Antonín

    2004-01-01

    Vol. 40, No. 3 (2004), pp. 293-304, ISSN 0023-5954. R&D Projects: GA AV ČR IAA2075302; GA ČR GA102/03/0049; GA AV ČR KSK1019101. Institutional research plan: CEZ:AV0Z1075907. Keywords: text classification; text categorization; multinomial mixture model. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.224, year: 2004

  8. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  9. Knowledge Management in Role Based Agents

    Science.gov (United States)

    Kır, Hüseyin; Ekinci, Erdem Eser; Dikenelli, Oguz

    In the multi-agent system literature, the role concept is increasingly researched as an abstraction to scope the beliefs, norms and goals of agents and to shape the relationships of agents in an organization. In this research, we propose a knowledgebase architecture to increase the applicability of roles in the MAS domain, drawing inspiration from the self concept in the role theory of sociology. The proposed knowledgebase architecture has a granulated structure that is dynamically organized according to the agent's identification in a social environment. Thanks to this dynamic structure, agents are enabled to work on consistent knowledge in spite of inevitable conflicts between roles and the agent. The knowledgebase architecture is also implemented and incorporated into the SEAGENT multi-agent system development framework.

  10. Towards Agent-Based Model Specification in Smart Grid: A Cognitive Agent-based Computing Approach

    OpenAIRE

    Akram, Waseem; Niazi, Muaz A.; Iantovics, Laszlo Barna

    2017-01-01

    A smart grid can be considered as a complex network where each node represents a generation unit or a consumer, and links represent transmission lines. One way to study complex systems is by using the agent-based modeling (ABM) paradigm. An ABM is a way of representing a complex system of autonomous agents interacting with each other. Previously, a number of studies have been presented in the smart grid domain making use of the ABM paradigm. However, to the best of our know...

  11. Agent-based land markets: Heterogeneous agents, land prices and urban land use change

    NARCIS (Netherlands)

    Filatova, Tatiana; Parker, Dawn C.; van der Veen, A.; Amblard, F.

    2007-01-01

    We construct a spatially explicit agent-based model of a bilateral land market. Heterogeneous agents form their bid and ask prices for land based on the utility that they obtain from a certain location (house/land) and based on the state of the market (an excess of demand or supply). We underline the

  12. TENSOR MODELING FOR AIRBORNE LiDAR DATA CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    N. Li

    2016-06-01

    Full Text Available Feature selection and description is a key factor in classification of Earth observation data. In this paper a classification method based on tensor decomposition is proposed. First, multiple features are extracted from the raw LiDAR point cloud, and raster LiDAR images are derived by accumulating features or the “raw” data attributes. Then, the feature rasters of LiDAR data are stored as a tensor, and tensor decomposition is used to select component features. This tensor representation preserves the initial spatial structure and ensures that the neighborhood is taken into account. Based on a small number of component features, a k-nearest-neighbor classification is applied.
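The pipeline the abstract describes (stack feature rasters into a tensor, decompose to select component features, classify with k-NN) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: a plain SVD of the feature-mode unfolding stands in for the tensor decomposition, and all function names and parameters are hypothetical.

```python
import numpy as np

def select_component_features(feature_rasters, n_components=3):
    """Stack feature rasters (H x W each) into a tensor, unfold along the
    feature mode, and keep the leading singular components as features."""
    tensor = np.stack(feature_rasters, axis=-1)          # H x W x F
    h, w, f = tensor.shape
    unfolded = tensor.reshape(h * w, f)                  # feature-mode unfolding
    # Truncated SVD: project each pixel's feature vector onto the
    # leading right-singular vectors of the centred unfolding.
    _, _, vt = np.linalg.svd(unfolded - unfolded.mean(0), full_matrices=False)
    return unfolded @ vt[:n_components].T                # (H*W) x n_components

def knn_classify(train_x, train_y, query, k=3):
    """Plain k-nearest-neighbor vote in the component-feature space."""
    dists = np.linalg.norm(train_x - query, axis=1)
    votes = train_y[np.argsort(dists)[:k]]
    return np.bincount(votes).argmax()
```

A full implementation would use a proper tensor decomposition (e.g., Tucker) that keeps the spatial modes intact, which is the property the abstract emphasizes.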

  13. An Active Learning Exercise for Introducing Agent-Based Modeling

    Science.gov (United States)

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  14. An Immune Agent for Web-Based AI Course

    Science.gov (United States)

    Gong, Tao; Cai, Zixing

    2006-01-01

    To overcome weakness and faults of a web-based e-learning course such as Artificial Intelligence (AI), an immune agent was proposed, simulating a natural immune mechanism against a virus. The immune agent was built on the multi-dimension education agent model and immune algorithm. The web-based AI course was comprised of many files, such as HTML…

  15. Iris Image Classification Based on Hierarchical Visual Codebook.

    Science.gov (United States)

    Zhenan Sun; Hui Zhang; Tieniu Tan; Jianyu Wang

    2014-06-01

    Iris recognition as a reliable method for personal identification has been well-studied with the objective to assign the class label of each iris image to a unique subject. In contrast, iris image classification aims to classify an iris image to an application specific category, e.g., iris liveness detection (classification of genuine and fake iris images), race classification (e.g., classification of iris images of Asian and non-Asian subjects), coarse-to-fine iris identification (classification of all iris images in the central database into multiple categories). This paper proposes a general framework for iris image classification based on texture analysis. A novel texture pattern representation method called Hierarchical Visual Codebook (HVC) is proposed to encode the texture primitives of iris images. The proposed HVC method is an integration of two existing Bag-of-Words models, namely Vocabulary Tree (VT) and Locality-constrained Linear Coding (LLC). The HVC adopts a coarse-to-fine visual coding strategy and takes advantage of both VT and LLC for accurate and sparse representation of iris texture. Extensive experimental results demonstrate that the proposed iris image classification method achieves state-of-the-art performance for iris liveness detection, race classification, and coarse-to-fine iris identification. A comprehensive fake iris image database simulating four types of iris spoof attacks is developed as the benchmark for research of iris liveness detection.

  16. Classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    This article presents and discusses definitions of the term “classification” and the related concepts “Concept/conceptualization,”“categorization,” “ordering,” “taxonomy” and “typology.” It further presents and discusses theories of classification including the influences of Aristotle...... and Wittgenstein. It presents different views on forming classes, including logical division, numerical taxonomy, historical classification, hermeneutical and pragmatic/critical views. Finally, issues related to artificial versus natural classification and taxonomic monism versus taxonomic pluralism are briefly...

  17. Agent based modeling of energy networks

    International Nuclear Information System (INIS)

    Gonzalez de Durana, José María; Barambones, Oscar; Kremers, Enrique; Varga, Liz

    2014-01-01

    Highlights: • A new approach for energy network modeling is designed and tested. • The agent-based approach is general and not technology-dependent. • The models can be easily extended. • The range of applications encompasses from small to large energy infrastructures. - Abstract: Attempts to model any present or future power grid face a huge challenge because a power grid is a complex system, with feedback and multi-agent behaviors, integrated by generation, distribution, storage and consumption systems, using various control and automation computing systems to manage electricity flows. Our approach to modeling is to build upon an established model of the low voltage electricity network, which is tested and proven, by extending it to a generalized energy model. But, in order to address the crucial issues of energy efficiency, additional processes like energy conversion and storage, and further energy carriers, such as gas, heat, etc., besides the traditional electrical one, must be considered. Therefore a more powerful model, provided with enhanced nodes or conversion points and able to deal with multidimensional flows, is required. This article addresses the issue of modeling a local multi-carrier energy network. This problem can be considered as an extension of modeling a low voltage distribution network located in some urban or rural geographic area. But instead of using an external power flow analysis package to do the power flow calculations, as used in electric networks, in this work we integrate a multiagent algorithm to perform the task, concurrently with the other simulation tasks, and not only for the electric fluid but also for a number of additional energy carriers. As the model is mainly focused on system operation, generation and load models are not developed

  18. Agent-Based Negotiation in Uncertain Environments

    Science.gov (United States)

    Debenham, John; Sierra, Carles

    An agent aims to secure his projected needs by attempting to build a set of (business) relationships with other agents. A relationship is built by exchanging private information, and is characterised by its intimacy — degree of closeness — and balance — degree of fairness. Each argumentative interaction between two agents then has two goals: to satisfy some immediate need, and to do so in a way that develops the relationship in a desired direction. An agent's desire to develop each relationship in a particular way then places constraints on the argumentative utterances. The form of negotiation described is argumentative interaction constrained by a desire to develop such relationships.

  19. Ebolavirus Classification Based on Natural Vectors

    Science.gov (United States)

    Zheng, Hui; Yin, Changchuan; Hoang, Tung; He, Rong Lucy; Yang, Jie

    2015-01-01

    According to the WHO, ebolaviruses have resulted in 8818 human deaths in West Africa as of January 2015. To better understand the evolutionary relationship of the ebolaviruses and infer virulence from the relationship, we applied the alignment-free natural vector method to classify the newest ebolaviruses. The dataset includes three new Guinea viruses as well as 99 viruses from Sierra Leone. For the viruses of the family of Filoviridae, both genus label classification and species label classification achieve an accuracy rate of 100%. We represented the relationships among Filoviridae viruses by Unweighted Pair Group Method with Arithmetic Mean (UPGMA) phylogenetic trees and found that the filoviruses can be separated well by three genera. We performed the phylogenetic analysis on the relationship among different species of Ebolavirus by their coding-complete genomes and seven viral protein genes (glycoprotein [GP], nucleoprotein [NP], VP24, VP30, VP35, VP40, and RNA polymerase [L]). The topology of the phylogenetic tree by the viral protein VP24 shows consistency with the variations of virulence of ebolaviruses. The result suggests that VP24 may be a pharmaceutical target for treating or preventing ebolaviruses. PMID:25803489
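The alignment-free natural vector referred to above can be illustrated with a minimal sketch. One common formulation assigns each nucleotide its count, the mean of its positions, and a normalized second central moment of its positions, yielding a 12-dimensional vector per genome; the exact normalization used in the paper may differ.

```python
def natural_vector(seq):
    """Alignment-free 12-dimensional natural vector of a DNA sequence:
    for each nucleotide, its count, mean position, and normalized second
    central moment of positions (one common formulation)."""
    n = len(seq)
    vec = []
    for base in "ACGT":
        positions = [i + 1 for i, b in enumerate(seq) if b == base]
        count = len(positions)
        if count == 0:
            vec += [0, 0.0, 0.0]
            continue
        mean = sum(positions) / count
        d2 = sum((p - mean) ** 2 for p in positions) / (count * n)
        vec += [count, mean, d2]
    return vec

# Classification then reduces to nearest-centroid or k-NN in this
# 12-dimensional space, e.g. by Euclidean distance between vectors.
```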

  20. Recent advances in agent-based complex automated negotiation

    CERN Document Server

    Ito, Takayuki; Zhang, Minjie; Fujita, Katsuhide; Robu, Valentin

    2016-01-01

    This book covers recent advances in Complex Automated Negotiations as a widely studied emerging area in the field of Autonomous Agents and Multi-Agent Systems. The book includes selected revised and extended papers from the 7th International Workshop on Agent-Based Complex Automated Negotiation (ACAN2014), which was held in Paris, France, in May 2014. The book also includes brief introductions about Agent-based Complex Automated Negotiation which are based on tutorials provided in the workshop, and brief summaries and descriptions about the ANAC'14 (Automated Negotiating Agents Competition) competition, where authors of selected finalist agents explain the strategies and the ideas used by them. The book is targeted to academic and industrial researchers in various communities of autonomous agents and multi-agent systems, such as agreement technology, mechanism design, electronic commerce, related areas, as well as graduate, undergraduate, and PhD students working in those areas or having interest in them.

  1. Hot complaint intelligent classification based on text mining

    Directory of Open Access Journals (Sweden)

    XIA Haifeng

    2013-10-01

    Full Text Available The complaint recognizer system plays an important role in ensuring the correct classification of hot complaints and improving the service quality of the telecommunications industry. Customer complaints in the telecommunications industry have the particularity that they must be handled within a limited time, which causes errors in the classification of hot complaints. The paper presents a model of intelligent hot-complaint classification based on text mining, which can classify a hot complaint at the correct level of the complaint navigation. The examples show that the model can efficiently classify the text of the complaint.

  2. Radar Target Classification using Recursive Knowledge-Based Methods

    DEFF Research Database (Denmark)

    Jochumsen, Lars Wurtz

    The topic of this thesis is target classification of radar tracks from a 2D mechanically scanning coastal surveillance radar. The measurements provided by the radar are position data and therefore the classification is mainly based on kinematic data, which is deduced from the position. The target...... been terminated. Therefore, an update of the classification results must be made for each measurement of the target. The data for this work are collected throughout the PhD and are both collected from radars and other sensors such as GPS....

  3. Key-phrase based classification of public health web pages.

    Science.gov (United States)

    Dolamic, Ljiljana; Boyer, Célia

    2013-01-01

    This paper describes and evaluates a public health web page classification model based on key phrase extraction and matching. Easily extensible, both in terms of new classes and new languages, this method proves to be a good solution for text classification in the face of a total lack of training data. To evaluate the proposed solution we have used a small collection of public health related web pages created by a double blind manual classification. Our experiments have shown that by choosing an adequate threshold value the desired value for either precision or recall can be achieved.

  4. Modelling of robotic work cells using agent-based approach

    Science.gov (United States)

    Sękala, A.; Banaś, W.; Gwiazda, A.; Monica, Z.; Kost, G.; Hryniewicz, P.

    2016-08-01

    In the case of modern manufacturing systems, the requirements, regarding both the scope and the characteristics of technical procedures, are dynamically changing. This results in the production system organization's inability to keep up with changes in market demand. Accordingly, there is a need for new design methods characterized, on the one hand, by high efficiency and, on the other, by an adequate level of the generated organizational solutions. One of the tools that could be used for this purpose is the concept of agent systems. These systems are tools of artificial intelligence. They allow assigning to agents the proper domains of procedures and knowledge so that they represent, in a self-organizing system of an agent environment, components of a real system. The agent-based system for modelling a robotic work cell should be designed taking into consideration the many limitations associated with the characteristics of this production unit. It is possible to distinguish groups of structural components that constitute such a system. This confirms the structural complexity of a work cell as a specific production system, so it is necessary to develop agents depicting various aspects of the work cell structure. The main groups of agents used to model a robotic work cell should at least include the following: machine tool agents, auxiliary equipment agents, robot agents, transport equipment agents, organizational agents, as well as data and knowledge base agents. In this way it is possible to create the holarchy of the agent-based system.
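The agent groups enumerated in the abstract can be mirrored in a minimal registry sketch. All class and method names below are hypothetical, intended only to show how the structural components of a work cell map onto agent groups.

```python
# Illustrative only: the agent groups named in the abstract, mirrored
# as a minimal registry. Names are hypothetical.
class Agent:
    def __init__(self, name, group):
        self.name, self.group = name, group

AGENT_GROUPS = (
    "machine_tool", "auxiliary_equipment", "robot",
    "transport_equipment", "organizational", "knowledge_base",
)

class WorkCell:
    """Holds the agents of one robotic work cell, grouped by role."""
    def __init__(self):
        self.agents = {g: [] for g in AGENT_GROUPS}

    def register(self, agent):
        if agent.group not in self.agents:
            raise ValueError(f"unknown agent group: {agent.group}")
        self.agents[agent.group].append(agent)

    def holarchy(self):
        # One level of the holarchy: group -> member agent names.
        return {g: [a.name for a in members]
                for g, members in self.agents.items() if members}
```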

  5. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  6. Online Learning for Classification of Alzheimer Disease based on Cortical Thickness and Hippocampal Shape Analysis.

    Science.gov (United States)

    Lee, Ga-Young; Kim, Jeonghun; Kim, Ju Han; Kim, Kiwoong; Seong, Joon-Kyung

    2014-01-01

    Mobile healthcare applications are becoming a growing trend. Also, the prevalence of dementia in modern society is showing a steady growing trend. Among degenerative brain diseases that cause dementia, Alzheimer disease (AD) is the most common. The purpose of this study was to identify AD patients using magnetic resonance imaging in the mobile environment. We propose an incremental classification for mobile healthcare systems. Our classification method is based on incremental learning for AD diagnosis and AD prediction using the cortical thickness data and hippocampus shape. We constructed a classifier based on principal component analysis and linear discriminant analysis. We performed initial learning and mobile subject classification. Initial learning is the group learning part on our server. Our smartphone agent implements the mobile classification and shows various results. With the use of cortical thickness data analysis alone, the discrimination accuracy was 87.33% (sensitivity 96.49% and specificity 64.33%). When cortical thickness data and hippocampal shape were analyzed together, the achieved accuracy was 87.52% (sensitivity 96.79% and specificity 63.24%). In this paper, we presented a classification method based on online learning for AD diagnosis by employing both cortical thickness data and hippocampal shape analysis data. Our method was implemented on smartphone devices and discriminated AD patients from the normal group.
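A minimal batch version of the described pipeline (PCA for dimensionality reduction, Fisher LDA for two classes, nearest class mean along the discriminant) can be sketched as below. The incremental/online update and the cortical-thickness feature extraction are beyond this sketch, and all parameter choices are illustrative.

```python
import numpy as np

def fit_pca_lda(X, y, n_pc=5):
    """PCA for dimensionality reduction, then Fisher LDA (two classes),
    then nearest-class-mean along the discriminant axis."""
    mu = X.mean(0)
    # PCA via SVD of the centred data matrix.
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    pcs = vt[:n_pc].T
    Z = (X - mu) @ pcs
    z0, z1 = Z[y == 0], Z[y == 1]
    # Fisher discriminant: w = Sw^{-1} (m1 - m0), small ridge for stability.
    sw = np.cov(z0.T) * (len(z0) - 1) + np.cov(z1.T) * (len(z1) - 1)
    w = np.linalg.solve(sw + 1e-6 * np.eye(n_pc), z1.mean(0) - z0.mean(0))
    c0, c1 = z0.mean(0) @ w, z1.mean(0) @ w
    def predict(x):
        s = (x - mu) @ pcs @ w
        return int(abs(s - c1) < abs(s - c0))   # nearest class mean
    return predict
```

An incremental variant would update `mu`, the principal subspace, and the class means as each new subject arrives instead of refitting in batch.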

  7. Empirical Studies On Machine Learning Based Text Classification Algorithms

    OpenAIRE

    Shweta C. Dharmadhikari; Maya Ingle; Parag Kulkarni

    2011-01-01

    Automatic classification of text documents has become an important research issue nowadays. Proper classification of text documents requires information retrieval, machine learning and natural language processing (NLP) techniques. Our aim is to focus on important approaches to automatic text classification based on machine learning techniques, viz. supervised, unsupervised and semi-supervised. In this paper we present a review of various text classification approaches under the machine learning paradig...

  8. Space Situational Awareness using Market Based Agents

    Science.gov (United States)

    Sullivan, C.; Pier, E.; Gregory, S.; Bush, M.

    2012-09-01

    Space surveillance for the DoD is not limited to the Space Surveillance Network (SSN). Other DoD-owned assets have some existing capabilities for tasking but have no systematic way to work collaboratively with the SSN. These are run by diverse organizations including the Services, other defense and intelligence agencies and national laboratories. Beyond these organizations, academic and commercial entities have systems that possess SSA capability. Almost all of these assets have some level of connectivity, security, and potential autonomy. Exploiting them in a mutually beneficial structure could provide a more comprehensive, efficient and cost-effective solution for SSA. The collection of all potential assets, providers and consumers of SSA data comprises a market which is functionally illiquid. The development of a dynamic marketplace for SSA data could enable would-be providers the opportunity to sell data to SSA consumers for monetary or incentive based compensation. A well-conceived market architecture could drive down SSA data costs through increased supply and improve efficiency through increased competition. Oceanit will investigate market and market agent architectures, protocols, standards, and incentives toward producing high-volume/low-cost SSA.

  9. Ontology-based multi-agent systems

    Energy Technology Data Exchange (ETDEWEB)

    Hadzic, Maja; Wongthongtham, Pornpit; Dillon, Tharam; Chang, Elizabeth [Digital Ecosystems and Business Intelligence Institute, Perth, WA (Australia)

    2009-07-01

    The Semantic web has given a great deal of impetus to the development of ontologies and multi-agent systems. Several books have appeared which discuss the development of ontologies or of multi-agent systems separately on their own. The growing interaction between agents and ontologies has highlighted the need for integrated development of these. This book is unique in being the first to provide an integrated treatment of the modeling, design and implementation of such combined ontology/multi-agent systems. It provides clear exposition of this integrated modeling and design methodology. It further illustrates this with two detailed case studies in (a) the biomedical area and (b) the software engineering area. The book is, therefore, of interest to researchers, graduate students and practitioners in the semantic web and web science area. (orig.)

  10. Polarimetric SAR image classification based on discriminative dictionary learning model

    Science.gov (United States)

    Sang, Cheng Wei; Sun, Hong

    2018-03-01

    Polarimetric SAR (PolSAR) image classification is one of the important applications of PolSAR remote sensing. It is a difficult high-dimensional nonlinear mapping problem, and sparse representations based on learning an overcomplete dictionary have shown great potential to solve such problems. The overcomplete dictionary plays an important role in PolSAR image classification; however, for complex PolSAR scenes, features shared by different classes will weaken the discrimination of the learned dictionary and so degrade classification performance. In this paper, we propose a novel overcomplete dictionary learning model to enhance the discrimination of the dictionary. The overcomplete dictionary learned by the proposed model is more discriminative and very suitable for PolSAR classification.

  11. Semantic Document Image Classification Based on Valuable Text Pattern

    Directory of Open Access Journals (Sweden)

    Hossein Pourghassem

    2011-01-01

    Full Text Available Knowledge extraction from detected document images is a complex problem in the field of information technology. This problem becomes more intricate when we know that only a negligible percentage of the detected document images are valuable. In this paper, a segmentation-based classification algorithm is used to analyze the document image. In this algorithm, using a two-stage segmentation approach, regions of the image are detected and then classified into document and non-document (pure region) regions in a hierarchical classification. Also, a novel definition of value is proposed to classify document images into valuable or invaluable categories. The proposed algorithm is evaluated on a database consisting of document and non-document images obtained from the Internet. Experimental results show the efficiency of the proposed algorithm in semantic document image classification. The proposed algorithm provides an accuracy rate of 98.8% for the valuable and invaluable document image classification problem.

  12. Video based object representation and classification using multiple covariance matrices.

    Science.gov (United States)

    Zhang, Yurong; Liu, Quan

    2017-01-01

    Video based object recognition and classification has been widely studied in the computer vision and image processing area. One main issue of this task is to develop an effective representation for video. This problem can generally be formulated as image set representation. In this paper, we present a new method called Multiple Covariance Discriminative Learning (MCDL) for the image set representation and classification problem. The core idea of MCDL is to represent an image set using multiple covariance matrices, with each covariance matrix representing one cluster of images. Firstly, we use the Nonnegative Matrix Factorization (NMF) method to do image clustering within each image set, and then adopt Covariance Discriminative Learning on each cluster (subset) of images. At last, we adopt KLDA and a nearest neighbor classification method for image set classification. Promising experimental results on several datasets show the effectiveness of our MCDL method.
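The representation step of the described approach can be sketched as follows: cluster the frames of an image set and compute one covariance matrix per cluster. Plain k-means with deterministic initialization stands in here for the NMF clustering used in the paper, and the discriminative-learning and KLDA stages are omitted; all names are illustrative.

```python
import numpy as np

def set_to_covariances(frames, n_clusters=2, iters=20):
    """Represent an image set (frames: N x D feature vectors) as one
    covariance matrix per cluster of frames."""
    step = max(1, len(frames) // n_clusters)
    centers = frames[::step][:n_clusters].astype(float)  # deterministic init
    for _ in range(iters):
        dists = ((frames[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        for k in range(n_clusters):
            if (labels == k).any():
                centers[k] = frames[labels == k].mean(0)
    d = frames.shape[1]
    # A small ridge keeps each covariance matrix well-conditioned.
    return [np.cov(frames[labels == k].T) + 1e-6 * np.eye(d)
            for k in range(n_clusters)]
```

Downstream, set-to-set comparison would operate on these covariance matrices (e.g., via a log-Euclidean distance) rather than on raw frames.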

  13. Pathological Bases for a Robust Application of Cancer Molecular Classification

    Directory of Open Access Journals (Sweden)

    Salvador J. Diaz-Cano

    2015-04-01

    Full Text Available Any robust classification system depends on its purpose and must refer to accepted standards, its strength relying on predictive values and a careful consideration of known factors that can affect its reliability. In this context, a molecular classification of human cancer must refer to the current gold standard (histological classification) and try to improve it with key prognosticators for metastatic potential, staging and grading. Although organ-specific examples have been published based on proteomics, transcriptomics and genomics evaluations, the most popular approach uses gene expression analysis as a direct correlate of cellular differentiation, which represents the key feature of the histological classification. RNA is a labile molecule that varies significantly according to the preservation protocol, its transcription reflects the adaptation of the tumor cells to the microenvironment, it can be passed through mechanisms of intercellular transference of genetic information (exosomes, and it is exposed to epigenetic modifications. More robust classifications should be based on stable molecules, at the genetic level represented by DNA to improve reliability, and its analysis must deal with the concept of intratumoral heterogeneity, which is at the origin of tumor progression and is the byproduct of the selection process during the clonal expansion and progression of neoplasms. The simultaneous analysis of multiple DNA targets and next generation sequencing offer the best practical approach for an analytical genomic classification of tumors.

  14. Classification of BCI Users Based on Cognition

    Directory of Open Access Journals (Sweden)

    N. Firat Ozkan

    2018-01-01

    Full Text Available Brain-Computer Interfaces (BCI) are systems originally developed to assist paralyzed patients by allowing them to issue commands to a computer with brain activity. This study aims to examine cognitive state with an objective, easy-to-use, and easy-to-interpret method utilizing Brain-Computer Interface systems. Seventy healthy participants completed six tasks using a Brain-Computer Interface system and participants' pupil dilation, blink rate, and Galvanic Skin Response (GSR) data were collected simultaneously. Participants filled in Nasa-TLX forms following each task and task performances of participants were also measured. Cognitive state clusters were created from the data collected using the K-means method. Taking these clusters and task performances into account, the general cognitive state of each participant was classified as low risk or high risk. Logistic Regression, Decision Tree, and Neural Networks were also used to classify the same data in order to measure the consistency of this classification with other techniques, and the method provided a consistency between 87.1% and 100% with other techniques.

  15. An Interactive Tool for Creating Multi-Agent Systems and Interactive Agent-based Games

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Pagliarini, Luigi

    2011-01-01

    Utilizing principles from parallel and distributed processing combined with inspiration from modular robotics, we developed the modular interactive tiles. As an educational tool, the modular interactive tiles facilitate the learning of multi-agent systems and interactive agent-based games...

  16. The Study of Land Use Classification Based on SPOT6 High Resolution Data

    OpenAIRE

    Wu Song; Jiang Qigang

    2016-01-01

    A method is presented for quick classification and extraction of land-use types in agricultural areas, based on SPOT6 high resolution remote sensing data and the good nonlinear classification ability of support vector machines. The results show that the SPOT6 high resolution remote sensing data can realize land classification efficiently; the overall classification accuracy reached 88.79% and the Kappa factor is 0.8632, which means that the classif...

  17. Dendrimer-based Macromolecular MRI Contrast Agents: Characteristics and Application

    Directory of Open Access Journals (Sweden)

    Hisataka Kobayashi

    2003-01-01

    Full Text Available Numerous macromolecular MRI contrast agents, prepared employing relatively simple chemistry, may be readily available and can provide sufficient enhancement for multiple applications. These agents operate using a ~100-fold lower concentration of gadolinium ions in comparison to the necessary concentration of iodine employed in CT imaging. Herein, we describe some of the general potential directions of macromolecular MRI contrast agents using our recently reported families of dendrimer-based agents as examples. Changes in molecular size altered the route of excretion. Smaller-sized contrast agents less than 60 kDa molecular weight were excreted through the kidney, making these agents potentially suitable as functional renal contrast agents. Hydrophilic and larger-sized contrast agents were found better suited for use as blood pool contrast agents. Hydrophobic variants formed with polypropylenimine diaminobutane dendrimer cores created liver contrast agents. Larger hydrophilic agents are useful for lymphatic imaging. Finally, contrast agents conjugated with either monoclonal antibodies or with avidin are able to function as tumor-specific contrast agents, which also might be employed as therapeutic drugs for either gadolinium neutron capture therapy or in conjunction with radioimmunotherapy.

  18. Modeling collective emotions: a stochastic approach based on Brownian agents

    International Nuclear Information System (INIS)

    Schweitzer, F.

    2010-01-01

    We develop an agent-based framework to model the emergence of collective emotions, which is applied to online communities. Agents' individual emotions are described by their valence and arousal. Using the concept of Brownian agents, these variables change according to a stochastic dynamics, which also considers the feedback from online communication. Agents generate emotional information, which is stored and distributed in a field modeling the online medium. This field affects the emotional states of agents in a non-linear manner. We derive conditions for the emergence of collective emotions, observable in a bimodal valence distribution. Depending on a saturated or a superlinear feedback between the information field and the agent's arousal, we further identify scenarios where collective emotions only appear once or in a repeated manner. The analytical results are illustrated by agent-based computer simulations. Our framework provides testable hypotheses about the emergence of collective emotions, which can be verified by data from online communities. (author)
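The valence part of such dynamics can be sketched with a toy simulation: each agent's valence relaxes toward zero, is driven by a shared information field that stores the agents' communicated emotions, and is perturbed by noise. This is a loose illustration of the framework, not the model of the paper; all parameter values and the saturation term are hypothetical.

```python
import numpy as np

def simulate_valence(n_agents=200, steps=2000, dt=0.01,
                     gamma=0.5, b=2.0, decay=0.9, noise=0.5, seed=0):
    """Brownian-agent sketch: valence v relaxes at rate gamma, is pushed
    by a saturated feedback from a shared information field h, and is
    perturbed by Gaussian noise (all parameters illustrative)."""
    rng = np.random.default_rng(seed)
    v = rng.normal(0, 0.1, n_agents)
    h = 0.0                      # shared field (stored mean valence)
    for _ in range(steps):
        drift = -gamma * v + b * h * (1 - v ** 2)   # saturated feedback
        v = v + drift * dt + noise * np.sqrt(dt) * rng.normal(size=n_agents)
        v = np.clip(v, -1.5, 1.5)
        h = decay * h + (1 - decay) * v.mean()      # field stores emotions
    return v
```

Inspecting the histogram of the returned valences for different feedback strengths `b` is one way to probe when a collective (polarized) state emerges.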

  19. Novel insights in agent-based complex automated negotiation

    CERN Document Server

    Lopez-Carmona, Miguel; Ito, Takayuki; Zhang, Minjie; Bai, Quan; Fujita, Katsuhide

    2014-01-01

    This book focuses on all aspects of complex automated negotiations, which are studied in the field of autonomous agents and multi-agent systems. It consists of two parts: I, Agent-Based Complex Automated Negotiations; and II, Automated Negotiation Agents Competition. The chapters in Part I are extended versions of papers presented at the 2012 international workshop on Agent-Based Complex Automated Negotiation (ACAN), after peer review by three Program Committee members. Part II examines in detail ANAC 2012 (The Third Automated Negotiating Agents Competition), in which automated agents with different negotiation strategies, implemented by different developers, negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of universities and research institutes across the world, are evaluated in tournament style. The purpose of the competition is to steer the research in the area of bilate...

  20. A new circulation type classification based upon Lagrangian air trajectories

    Directory of Open Access Journals (Sweden)

    Alexandre M. Ramos

    2014-10-01

    Full Text Available A new classification method of the large-scale circulation characteristic for a specific target area (NW Iberian Peninsula) is presented, based on the analysis of 90-h backward trajectories arriving in this area calculated with the 3-D Lagrangian particle dispersion model FLEXPART. A cluster analysis is applied to separate the backward trajectories into up to five representative air streams for each day. Specific measures are then used to characterise the distinct air streams (e.g., curvature of the trajectories, cyclonic or anticyclonic flow, moisture evolution, origin and length of the trajectories). The robustness of the presented method is demonstrated in comparison with the Eulerian Lamb weather type classification. A case study of the 2003 heatwave is discussed in terms of the new Lagrangian circulation and the Lamb weather type classifications. It is shown that the new classification method adds valuable information about the pertinent meteorological conditions, which is missing in an Eulerian approach. The new method is climatologically evaluated for the five-year period from December 1999 to November 2004. The ability of the method to capture the inter-seasonal circulation variability in the target region is shown. Furthermore, the multi-dimensional character of the classification is briefly discussed, in particular with respect to inter-seasonal differences. Finally, the relationship between the new Lagrangian classification and the precipitation in the target area is studied.
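
    The clustering step above can be sketched with a toy k-means over trajectory coordinate vectors. The distance metric, the deterministic initialisation, and the flat (lon, lat, lon, lat, ...) encoding are illustrative assumptions standing in for the paper's actual cluster analysis of FLEXPART output.

```python
def cluster_air_streams(trajs, k=5, iters=20):
    """Toy k-means grouping backward trajectories into at most k air streams.

    Each trajectory is a flat list of coordinates (lon1, lat1, lon2, lat2, ...).
    Centres are initialised by taking evenly spaced trajectories.
    """
    step = max(1, len(trajs) // k)
    centers = [list(t) for t in trajs[::step][:k]]
    labels = [0] * len(trajs)
    for _ in range(iters):
        # assignment: nearest centre by squared Euclidean distance
        labels = [min(range(len(centers)),
                      key=lambda c: sum((a - b) ** 2
                                        for a, b in zip(t, centers[c])))
                  for t in trajs]
        # update: each centre becomes the mean of its member trajectories
        for c in range(len(centers)):
            members = [t for t, l in zip(trajs, labels) if l == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# two synthetic air streams arriving in NW Iberia: westerly vs. northerly
westerly = [[-20.0, 42.0, -10.0, 42.5], [-21.0, 41.5, -10.5, 42.4]]
northerly = [[-8.0, 55.0, -8.5, 47.0], [-7.5, 56.0, -8.4, 46.8]]
labels = cluster_air_streams(westerly + northerly, k=2)
```

On this toy input, the two westerly trajectories share one label and the two northerly trajectories share the other.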

  1. Internet-enabled collaborative agent-based supply chains

    Science.gov (United States)

    Shen, Weiming; Kremer, Rob; Norrie, Douglas H.

    2000-12-01

    This paper presents some results of our recent research work related to the development of a new Collaborative Agent System Architecture (CASA) and an Infrastructure for Collaborative Agent Systems (ICAS). Initially proposed as a general architecture for Internet-based collaborative agent systems (particularly complex industrial collaborative agent systems), the architecture is well suited to managing the Internet-enabled complex supply chain of a large manufacturing enterprise. The general collaborative agent system architecture, with its basic communication and cooperation services, domain-independent components, prototypes and mechanisms, is described. Benefits of implementing Internet-enabled supply chains with the proposed infrastructure are discussed. A case study on Internet-enabled supply chain management is presented.

  2. Atmospheric circulation classification comparison based on wildfires in Portugal

    Science.gov (United States)

    Pereira, M. G.; Trigo, R. M.

    2009-04-01

    Atmospheric circulation classifications are not simply descriptions of atmospheric states but tools to understand and interpret atmospheric processes and to model the relation between atmospheric circulation and surface climate and other related variables (Huth et al., 2008). Classifications were initially developed for weather forecasting purposes; however, with the progress in computer processing capability, new and more robust objective methods were developed and applied to large datasets, turning atmospheric circulation classification into one of the most important fields in synoptic and statistical climatology. Classification studies have been used extensively in climate change studies (e.g. reconstructed past climates, recent observed changes and future climates), in bioclimatological research (e.g. relating human mortality to climatic factors) and in a wide variety of synoptic climatological applications (e.g. comparison between datasets, air pollution, snow avalanches, wine quality, fish captures and forest fires). Likewise, atmospheric circulation classifications are important for the study of the role of weather in wildfire occurrence in Portugal, because daily synoptic variability is the most important driver of local weather conditions (Pereira et al., 2005). In particular, the objective classification scheme developed by Trigo and DaCamara (2000) to classify the atmospheric circulation affecting Portugal has proved quite useful in discriminating the occurrence and development of wildfires, as well as the distribution over Portugal of surface climatic variables with impact on wildfire activity, such as maximum and minimum temperature and precipitation. This work aims to present: (i) an overview of the existing circulation classifications for the Iberian Peninsula, and (ii) the results of a comparison study between these atmospheric circulation classifications based on their relation with wildfires and relevant meteorological

  3. Failure diagnosis using deep belief learning based health state classification

    International Nuclear Information System (INIS)

    Tamilselvan, Prasanna; Wang, Pingfeng

    2013-01-01

    Effective health diagnosis provides multifarious benefits, such as improved safety, improved reliability and reduced costs for the operation and maintenance of complex engineered systems. This paper presents a novel multi-sensor health diagnosis method using a deep belief network (DBN). The DBN has recently become a popular approach in machine learning for its promising advantages, such as fast inference and the ability to encode richer and higher-order network structures. The DBN employs a hierarchical structure with multiple stacked restricted Boltzmann machines and works through a layer-by-layer successive learning process. The proposed multi-sensor health diagnosis methodology using DBN-based state classification is structured in three consecutive stages: first, defining health states and preprocessing sensory data for DBN training and testing; second, developing DBN-based classification models for the diagnosis of predefined health states; third, validating the DBN classification models with a testing sensory dataset. Health diagnosis using DBN-based health state classification is compared with four existing diagnosis techniques. Benchmark classification problems and two engineering health diagnosis applications, aircraft engine health diagnosis and electric power transformer health diagnosis, are employed to demonstrate the efficacy of the proposed approach.

  4. Competency Based Curriculum for Real Estate Agent.

    Science.gov (United States)

    McCloy, Robert J.

    This publication is a curriculum and teaching guide for preparing real estate agents in the state of West Virginia. The guide contains 30 units, or lessons. Each lesson is designed to cover three to five hours of instruction time. Competencies provided for each lesson are stated in terms of what the student should be able to do as a result of the…

  5. Agent Persuasion Mechanism of Acquaintance

    Science.gov (United States)

    Jinghua, Wu; Wenguang, Lu; Hailiang, Meng

    Agent persuasion can improve negotiation efficiency in dynamic environments thanks to the agent's initiative, autonomy and related properties, and it is strongly affected by acquaintance. A classification of acquaintance in agent persuasion is illustrated, together with an acquaintance-based agent persuasion model. The concept of the agent persuasion degree of acquaintance is then given. Finally, the related interaction mechanism is elaborated.

  6. Russian and Foreign Experience of Integration of Agent-Based Models and Geographic Information Systems

    Directory of Open Access Journals (Sweden)

    Konstantin Anatol’evich Gulin

    2016-11-01

    Full Text Available The article provides an overview of the mechanisms for integrating agent-based models and GIS technology developed by Russian and foreign researchers. The basic framework of the article rests on a critical analysis of domestic and foreign literature (monographs, scientific articles). The study applies universal scientific research methods: the system approach, analysis and synthesis, classification, systematization and grouping, and generalization and comparison. The article presents the theoretical and methodological bases for the integration of agent-based models and geographic information systems. The concept and essence of agent-based models are explained, and their main advantages (compared to other modeling methods) are identified. The paper characterizes the operating environment of agents as a key concept in the theory of agent-based modeling. It is shown that geographic information systems offer a wide range of information resources for calculation, search and modeling of the real world in various aspects, acting as an effective tool for displaying the agents' operating environment and allowing the model to be brought as close as possible to real conditions. The authors also point to a wide range of possibilities for various studies in different spatial and temporal contexts. A comparative analysis of platforms supporting the integration of agent-based models and geographic information systems has been carried out. The authors give examples of complex socio-economic models: a creative city model and a humanitarian assistance model. In the absence of standards for describing research results, the authors focus on model elements such as the characteristics of the agents and their operating environment, agent behavior, and the rules of interaction between the agents and the external environment. The paper describes the possibilities and prospects of implementing these models.

  7. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is highly significant for achieving intelligent information processing and harmonious communication between human beings and computers. A new emotional agent model is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it handles simple emotions efficiently.

  8. A hybrid agent-based approach for modeling microbiological systems.

    Science.gov (United States)

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt Differential Equations or Agent-Based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the Multi-Agent approach often use directly translated, and quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2x10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
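
    The hybrid idea above (cells as discrete agents, molecules as continuous quantities) can be sketched in one dimension. The grid size, diffusion rate, and movement rule below are illustrative assumptions, not the paper's model: the chemoattractant obeys a finite-difference diffusion equation with a source at one edge, and each cell follows a simple behavioural rule.

```python
import random

def chemotaxis_sim(n_cells=50, n_bins=60, steps=200, seed=1):
    """Hybrid sketch: agents (cells) plus a per-bin molecular quantity.

    The chemoattractant concentration diffuses from a constant source at
    the right boundary (the equation-based part); each cell steps toward
    the neighbouring bin with the higher concentration (the rule-based
    part).
    """
    rng = random.Random(seed)
    conc = [0.0] * n_bins
    cells = [rng.randrange(n_bins // 2) for _ in range(n_cells)]  # start left
    diff, dt = 0.4, 1.0
    for _ in range(steps):
        conc[-1] = 1.0  # constant source at the right boundary
        # explicit finite-difference diffusion update
        conc = [c + diff * dt * ((left - c) + (right - c))
                for c, left, right in zip(conc,
                                          [conc[0]] + conc[:-1],
                                          conc[1:] + [conc[-1]])]
        # agent rule: move toward the higher-concentration neighbour
        cells = [min(n_bins - 1, x + 1)
                 if conc[min(n_bins - 1, x + 1)] >= conc[max(0, x - 1)]
                 else max(0, x - 1)
                 for x in cells]
    return cells, conc

cells, conc = chemotaxis_sim()
mean_pos = sum(cells) / len(cells)
```

Because the concentration profile is monotone toward the source, the cell population migrates rightward, the 1-D analogue of the migration patterns the paper compares to laboratory assays.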

  9. Point Based Emotion Classification Using SVM

    OpenAIRE

    Swinkels, Wout

    2016-01-01

    The detection of emotions is a hot topic in the area of computer vision. Emotions are based on subtle changes in the face that are intuitively detected and interpreted by humans. Detecting these subtle changes, based on mathematical models, is a great challenge in the area of computer vision. In this thesis a new method is proposed to achieve state-of-the-art emotion detection performance. This method is based on facial feature points to monitor subtle changes in the face. Therefore the c...

  10. ICF-based classification and measurement of functioning.

    Science.gov (United States)

    Stucki, G; Kostanjsek, N; Ustün, B; Cieza, A

    2008-09-01

    If we aim towards a comprehensive understanding of human functioning and the development of comprehensive programs to optimize the functioning of individuals and populations, we need to develop suitable measures. The approval of the International Classification of Functioning, Disability and Health (ICF) in 2001 by the 54th World Health Assembly, as the first universally shared model and classification of functioning, disability and health, therefore marks an important step in the development of measurement instruments and ultimately in our understanding of functioning, disability and health. The acceptance and use of the ICF as a reference framework and classification have been facilitated by its development in a worldwide, comprehensive consensus process and by the increasing evidence regarding its validity. However, the broad acceptance and use of the ICF as a reference framework and classification will also depend on the resolution of conceptual and methodological challenges relevant to the classification and measurement of functioning. This paper therefore first describes how the ICF categories can serve as building blocks for the measurement of functioning, and then the current state of the development of ICF-based practical tools and international standards such as the ICF Core Sets. Finally, it illustrates how to map the world of measures to the ICF and vice versa, as well as the methodological principles relevant to transforming information obtained with a clinical test or a patient-oriented instrument to the ICF and to developing ICF-based clinical and self-reported measurement instruments.

  11. Chinese Sentence Classification Based on Convolutional Neural Network

    Science.gov (United States)

    Gu, Chengwei; Wu, Ming; Zhang, Chuang

    2017-10-01

    Sentence classification is one of the significant issues in Natural Language Processing (NLP). Feature extraction is often regarded as the key point for natural language processing. Traditional approaches based on machine learning, such as the Naive Bayesian model, cannot take high-level features into consideration. A neural network for sentence classification can make use of contextual information to achieve better results in sentence classification tasks. In this paper, we focus on classifying Chinese sentences, and we propose a novel Convolutional Neural Network (CNN) architecture for Chinese sentence classification. In particular, while most previous methods use a softmax classifier for prediction, we embed a linear support vector machine in place of softmax in the deep neural network model, minimizing a margin-based loss to get a better result, and we use tanh as the activation function instead of ReLU. The CNN model improves the results of Chinese sentence classification tasks. Experimental results on a Chinese news title database validate the effectiveness of our model.
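
    The margin-based loss that replaces softmax can be illustrated directly. This is the standard Weston-Watkins multiclass hinge loss, shown here as a plain function; the CNN that would produce the score vector is outside the scope of this sketch.

```python
def multiclass_hinge_loss(scores, true_idx, margin=1.0):
    """Margin-based loss used in place of softmax cross-entropy.

    Every wrong class should score at least `margin` below the true class;
    any violation contributes linearly to the loss.
    """
    true_score = scores[true_idx]
    return sum(max(0.0, margin - true_score + s)
               for i, s in enumerate(scores) if i != true_idx)

# the true class (index 0) beats every other score by the full margin
loss_zero = multiclass_hinge_loss([3.0, 0.5, -1.0], 0)  # → 0.0
# a rival class within the margin incurs a penalty
loss_pos = multiclass_hinge_loss([1.0, 1.5, 0.0], 0)    # → 1.5
```

Unlike cross-entropy, this loss is exactly zero once all margins are satisfied, which is the property the SVM output layer exploits.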

  12. Collective Machine Learning: Team Learning and Classification in Multi-Agent Systems

    Science.gov (United States)

    Gifford, Christopher M.

    2009-01-01

    This dissertation focuses on the collaboration of multiple heterogeneous, intelligent agents (hardware or software) which collaborate to learn a task and are capable of sharing knowledge. The concept of collaborative learning in multi-agent and multi-robot systems is largely understudied, and represents an area where further research is needed to…

  13. NIM: A Node Influence Based Method for Cancer Classification

    Directory of Open Access Journals (Sweden)

    Yiwen Wang

    2014-01-01

    Full Text Available The classification of different cancer types is of great significance in the medical field. However, the great majority of existing cancer classification methods are clinical-based and have relatively weak diagnostic ability. With the rapid development of gene expression technology, it has become possible to classify different kinds of cancers using DNA microarrays. Our main idea is to approach the problem of cancer classification using gene expression data from a graph-based view. Based on a new node influence model we propose, this paper presents a novel high-accuracy method for cancer classification, which is composed of four parts: the first is to calculate the similarity matrix of all samples, the second is to compute the node influence of the training samples, the third is to obtain the similarity between every test sample and each class using a weighted sum of node influence and the similarity matrix, and the last is to classify each test sample based on its similarity to every class. The data sets used in our experiments are breast cancer, central nervous system, colon tumor, prostate cancer, acute lymphoblastic leukemia, and lung cancer. Experimental results showed that our node influence based method (NIM) is more efficient and robust than the support vector machine, K-nearest neighbor, C4.5, naive Bayes, and CART.
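
    The four stages above can be sketched as follows. Cosine similarity stands in for the paper's similarity measure, and node influence defaults to uniform weights; both are simplifying assumptions, as is the toy two-gene data.

```python
def nim_classify(train, labels, test, influence=None):
    """Sketch of the NIM pipeline: similarity, influence weighting,
    per-class aggregation, and assignment to the most similar class.
    """
    def cos(a, b):
        num = sum(x * y for x, y in zip(a, b))
        den = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return num / den if den else 0.0

    if influence is None:
        # stage 2 placeholder: every training node equally influential
        influence = [1.0] * len(train)
    results = []
    for t in test:
        # stages 1 and 3: influence-weighted similarity to each class
        class_scores = {}
        for sample, label, w in zip(train, labels, influence):
            class_scores[label] = class_scores.get(label, 0.0) + w * cos(t, sample)
        # stage 4: assign the class with the highest aggregate similarity
        results.append(max(class_scores, key=class_scores.get))
    return results

train = [[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.8]]
labels = ["tumor", "tumor", "normal", "normal"]
predictions = nim_classify(train, labels, [[1.0, 0.0], [0.0, 1.0]])
# → ['tumor', 'normal']
```

Supplying a non-uniform `influence` vector, computed from the sample graph, is where the method departs from a plain weighted nearest-class scheme.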

  14. Improvement of Bioactive Compound Classification through Integration of Orthogonal Cell-Based Biosensing Methods

    Directory of Open Access Journals (Sweden)

    Goran N. Jovanovic

    2007-01-01

    Full Text Available Lack of specificity for different classes of chemical and biological agents, and false positives and negatives, can limit the range of applications for cell-based biosensors. This study suggests that the integration of results from algal cells (Mesotaenium caldariorum and fish chromatophores (Betta splendens improves classification efficiency and detection reliability. Cells were challenged with paraquat, mercuric chloride, sodium arsenite and clonidine. The two detection systems were independently investigated for classification of the toxin set by performing discriminant analysis. The algal system correctly classified 72% of the bioactive compounds, whereas the fish chromatophore system correctly classified 68%. The combined classification efficiency was 95%. The algal sensor readout is based on fluorescence measurements of changes in the energy producing pathways of photosynthetic cells, whereas the response from fish chromatophores was quantified using optical density. Change in optical density reflects interference with the functioning of cellular signal transduction networks. Thus, algal cells and fish chromatophores respond to the challenge agents through sufficiently different mechanisms of action to be considered orthogonal.

  15. Some improved classification-based ridge parameter of Hoerl and ...

    African Journals Online (AJOL)

    Some improved classification-based ridge parameter of Hoerl and Kennard estimation techniques. ... This assumption is often violated, and the Ridge Regression estimator introduced by [2] has been identified to be more efficient than ordinary least squares (OLS) in handling it. However, it requires a ridge parameter, K, of which ...

  16. Classification and Target Group Selection Based Upon Frequent Patterns

    NARCIS (Netherlands)

    W.H.L.M. Pijls (Wim); R. Potharst (Rob)

    2000-01-01

    In this technical report, two new algorithms based upon frequent patterns are proposed. One algorithm is a classification method. The other one is an algorithm for target group selection. In both algorithms, first of all, the collection of frequent patterns in the training set is

  17. Multi Agent System Based Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Liu, Leo

    2012-01-01

    In this paper, a multi-agent system based wide area protection scheme is proposed in order to prevent long-term voltage instability induced cascading events. The distributed relays and controllers work as a device agent which not only executes the normal function automatically but also can...... the effectiveness of the proposed protection strategy. The simulation results indicate that the proposed multi-agent control system can effectively coordinate the distributed relays and controllers to prevent the long-term voltage instability induced cascading events.

  18. Torrent classification - Base of rational management of erosive regions

    International Nuclear Information System (INIS)

    Gavrilovic, Zoran; Stefanovic, Milutin; Milovanovic, Irina; Cotric, Jelena; Milojevic, Mileta

    2008-01-01

    A complex methodology for torrents and erosion and the associated calculations was developed during the second half of the twentieth century in Serbia: the 'Erosion Potential Method'. One of the modules of that complex method is focused on torrent classification. The module enables the identification of hydrographic, climate and erosion characteristics. The method makes it possible for each torrent, regardless of its magnitude, to be simply and recognizably described by the 'Formula of torrentiality'. The above torrent classification is the base on which a set of optimisation calculations is developed for the required scope of erosion-control works and measures, the application of which enables the management of significantly larger erosion and torrential regions compared to the previous period. This paper will present the procedure and the method of torrent classification.

  19. Classification of scintigrams on the base of an automatic analysis

    International Nuclear Information System (INIS)

    Vidyukov, V.I.; Kasatkin, Yu.N.; Kal'nitskaya, E.F.; Mironov, S.P.; Rotenberg, E.M.

    1980-01-01

    The stages of building a discriminative system based on self-education for the automatic analysis of scintigrams are considered. The results of classifying 240 liver scintigrams into ''normal'', ''diffuse lesions'' and ''focal lesions'' were evaluated by medical experts and by computer. The accuracy of the computerized classification was 91.7%, versus 85% for the experts. The automatic analysis methods for liver scintigrams were implemented on the specialized MDS data processing system. The quality of the discriminative system was assessed on 125 scintigrams, with a classification accuracy of 89.6%. The use of self-education methods made it possible to single out two subclasses depending on the severity of diffuse lesions

  20. Hyperspectral image classification based on local binary patterns and PCANet

    Science.gov (United States)

    Yang, Huizhen; Gao, Feng; Dong, Junyu; Yang, Yang

    2018-04-01

    Hyperspectral image classification has been well acknowledged as one of the challenging tasks of hyperspectral data processing. In this paper, we propose a novel hyperspectral image classification framework based on local binary pattern (LBP) features and PCANet. In the proposed method, linear prediction error (LPE) is first employed to select a subset of informative bands, and LBP is utilized to extract texture features. Then, spectral and texture features are stacked into high-dimensional vectors. Next, the extracted features of a specified position are transformed into a 2-D image. The resulting images of all pixels are fed into PCANet for classification. Experimental results on a real hyperspectral dataset demonstrate the effectiveness of the proposed method.
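
    The LBP texture descriptor used above is simple to compute. The sketch below shows the basic 8-neighbour variant on a single 3x3 patch; the bit ordering is a convention choice, and practical systems usually histogram these codes over a window (rotation-invariant and uniform LBP are common refinements).

```python
def lbp_code(patch):
    """8-neighbour local binary pattern of a 3x3 patch.

    Each neighbour contributes a bit (1 if >= centre), read clockwise
    from the top-left corner.
    """
    c = patch[1][1]
    neighbours = [patch[0][0], patch[0][1], patch[0][2],   # top row
                  patch[1][2],                             # right
                  patch[2][2], patch[2][1], patch[2][0],   # bottom row
                  patch[1][0]]                             # left
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= c:
            code |= 1 << bit
    return code

# centre 5; neighbours >= 5 are 6 (bit 1), 7 (bit 3), 9 (bit 4), 8 (bit 5),
# so the code is 2 + 8 + 16 + 32
patch = [[4, 6, 2],
         [3, 5, 7],
         [1, 8, 9]]
code = lbp_code(patch)  # → 58
```

In the paper's pipeline these per-pixel codes would be aggregated into texture features and stacked with the spectral values before being handed to PCANet.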

  1. Remote Sensing Image Classification Based on Stacked Denoising Autoencoder

    Directory of Open Access Journals (Sweden)

    Peng Liang

    2017-12-01

    Full Text Available Focused on the issue that conventional remote sensing image classification methods have run into a bottleneck in accuracy, a new remote sensing image classification method inspired by deep learning is proposed, based on the Stacked Denoising Autoencoder. First, the deep network model is built from stacked layers of Denoising Autoencoders. Then, with noised input, the unsupervised greedy layer-wise training algorithm is used to train each layer in turn for more robust expression; characteristics are obtained in supervised learning by a Back Propagation (BP) neural network, and the whole network is optimized by error back propagation. Finally, Gaofen-1 (GF-1) satellite remote sensing data are used for evaluation; the total accuracy and kappa accuracy reach 95.7% and 0.955, respectively, which are higher than those of the Support Vector Machine and the Back Propagation neural network. The experimental results show that the proposed method can effectively improve the accuracy of remote sensing image classification.

  2. Torrent classification - Base of rational management of erosive regions

    Energy Technology Data Exchange (ETDEWEB)

    Gavrilovic, Zoran; Stefanovic, Milutin; Milovanovic, Irina; Cotric, Jelena; Milojevic, Mileta [Institute for the Development of Water Resources 'Jaroslav Cerni', 11226 Beograd (Pinosava), Jaroslava Cernog 80 (Serbia)], E-mail: gavrilovicz@sbb.rs

    2008-11-01

    A complex methodology for torrents and erosion and the associated calculations was developed during the second half of the twentieth century in Serbia: the 'Erosion Potential Method'. One of the modules of that complex method is focused on torrent classification. The module enables the identification of hydrographic, climate and erosion characteristics. The method makes it possible for each torrent, regardless of its magnitude, to be simply and recognizably described by the 'Formula of torrentiality'. The above torrent classification is the base on which a set of optimisation calculations is developed for the required scope of erosion-control works and measures, the application of which enables the management of significantly larger erosion and torrential regions compared to the previous period. This paper will present the procedure and the method of torrent classification.

  3. Deep learning for EEG-Based preference classification

    Science.gov (United States)

    Teo, Jason; Hou, Chew Lin; Mountstephens, James

    2017-10-01

    Electroencephalogram (EEG)-based emotion classification is rapidly becoming one of the most intensely studied areas of brain-computer interfacing (BCI). The ability to passively yet accurately correlate brainwaves with our immediate emotions opens up truly meaningful and previously unattainable human-computer interactions, such as in forensic neuroscience, rehabilitative medicine, affective entertainment and neuro-marketing. One particularly useful yet rarely explored area of EEG-based emotion classification is preference recognition [1], which is simply the detection of like versus dislike. Within the limited investigations into preference classification, all reported studies were based on musically-induced stimuli, except for a single study which used 2D images. The main objective of this study is to apply deep learning, which has been shown to produce state-of-the-art results in diverse hard problems such as computer vision, natural language processing and audio recognition, to 3D object preference classification over a larger group of test subjects. A cohort of 16 users was shown 60 bracelet-like objects as rotating visual stimuli on a computer display while their preferences and EEGs were recorded. After training a variety of machine learning approaches, including deep neural networks, we then attempted to classify the users' preferences for the 3D visual stimuli based on their EEGs. Here, we show that deep learning outperforms a variety of other machine learning classifiers for this EEG-based preference classification task, particularly on a highly challenging dataset with large inter- and intra-subject variability.

  4. VigilAgent for the development of agent-based multi-robot surveillance systems

    OpenAIRE

    Gascueña Noheda, José Manuel; Navarro Martínez, Elena María; Fernández Caballero, Antonio

    2011-01-01

    Usually, surveillance applications are developed following an ad-hoc approach instead of using a methodology that guides stakeholders in achieving the quality standards expected from commercial software. To close this gap, our conjecture is that surveillance applications can be fully developed from their initial design stages by means of agent-based methodologies. Specifically, this paper describes the experience and the results of using a multi-agent systems approach according to the process provid...

  5. Hardware Accelerators Targeting a Novel Group Based Packet Classification Algorithm

    Directory of Open Access Journals (Sweden)

    O. Ahmed

    2013-01-01

    Full Text Available Packet classification is a ubiquitous and key building block for many critical network devices. However, it remains one of the main bottlenecks faced when designing fast network devices. In this paper, we propose a novel Group Based Search packet classification Algorithm (GBSA) that is scalable, fast, and efficient. GBSA consumes an average of 0.4 megabytes of memory for a 10 k rule set. The worst-case classification time per packet is 2 microseconds, and the preprocessing speed is 3 M rules/second on a Xeon processor operating at 3.4 GHz. When compared with other state-of-the-art classification techniques, the results showed that GBSA outperforms the competition with respect to speed, memory usage, and processing time. Moreover, GBSA is amenable to implementation in hardware. Three different hardware implementations are also presented in this paper, including an Application Specific Instruction Set Processor (ASIP) implementation and two pure Register-Transfer Level (RTL) implementations based on Impulse-C and Handel-C flows, respectively. Speedups achieved with these hardware accelerators ranged from 9x to 18x compared with a pure software implementation running on a Xeon processor.

  6. Sparse Representation Based Binary Hypothesis Model for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Yidong Tang

    2016-01-01

    Full Text Available The sparse representation based classifier (SRC) and its kernel version (KSRC) have been employed for hyperspectral image (HSI) classification. However, the state-of-the-art SRC often aims at extended surface objects with linear mixtures in smooth scenes and assumes that the number of classes is given. Considering small targets with complex backgrounds, a sparse representation based binary hypothesis (SRBBH) model is established in this paper. In this model, a query pixel is represented in two ways: by a background dictionary and by a union dictionary. The background dictionary is composed of samples selected from the local dual concentric window centered at the query pixel. Thus, for each pixel the classification issue becomes an adaptive multiclass classification problem, where only the number of desired classes is required. Furthermore, the kernel method is employed to improve the interclass separability. In kernel space, the coding vector is obtained by using the kernel-based orthogonal matching pursuit (KOMP) algorithm. Then the query pixel can be labeled by the characteristics of the coding vectors. Instead of directly using the reconstruction residuals, the different impacts that the background dictionary and the union dictionary have on reconstruction are used for validation and classification. This enhances the discrimination and hence improves the performance.
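
    The binary-hypothesis idea can be sketched with a drastically simplified stand-in for KOMP: a single matching-pursuit iteration per dictionary, comparing how much the union dictionary (background plus target atoms) shrinks the reconstruction residual. The unit-norm atoms, one-atom coding, and threshold are all illustrative assumptions.

```python
def residual_after_best_atom(x, dictionary):
    """Norm of the residual after projecting x onto its best-matching atom.

    One matching-pursuit iteration stands in for the paper's kernel OMP;
    atoms are assumed to have unit norm.
    """
    def dot(a, b):
        return sum(u * v for u, v in zip(a, b))
    best = max(dictionary, key=lambda atom: abs(dot(x, atom)))
    coef = dot(x, best)
    resid = [u - coef * v for u, v in zip(x, best)]
    return sum(r * r for r in resid) ** 0.5

def srbbh_detect(x, background, target, threshold=0.5):
    """Binary hypothesis: if adding target atoms to the background
    dictionary shrinks the residual enough, label the pixel as target.
    """
    rb = residual_after_best_atom(x, background)
    ru = residual_after_best_atom(x, background + target)
    return "target" if rb - ru > threshold else "background"

background = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]  # local background atoms
target = [[0.0, 0.0, 1.0]]                       # target spectral atom
hit = srbbh_detect([0.1, 0.0, 2.0], background, target)   # → 'target'
miss = srbbh_detect([1.0, 0.2, 0.0], background, target)  # → 'background'
```

The key point the sketch preserves is that the decision comes from comparing the two reconstructions rather than from per-class residuals, so only the background/target split, not the full class count, is needed.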

  7. Energy-efficiency based classification of the manufacturing workstation

    Science.gov (United States)

    Frumuşanu, G.; Afteni, C.; Badea, N.; Epureanu, A.

    2017-08-01

    EU Directive 92/75/EC established for the first time an energy consumption labelling scheme, further implemented by several other directives. As a consequence, many products (e.g. home appliances, tyres, light bulbs, houses) nowadays carry an EU Energy Label when offered for sale or rent. Several energy consumption models of manufacturing equipment have also been developed. This paper proposes an energy efficiency based classification of the manufacturing workstation, aiming to characterize its energetic behaviour. The concept of energy efficiency of the manufacturing workstation is defined. On this basis, a classification methodology has been developed. It covers specific criteria and their evaluation modalities, together with the definition and delimitation of energy efficiency classes. The position of the energy class is defined by the amount of energy needed by the workstation in the middle point of its operating domain, while its extension is determined by the value of the first coefficient from the Taylor series that approximates the dependence between the energy consumption and the chosen parameter of the working regime. The main domain of interest for this classification appears to be the optimization of manufacturing activity planning and programming. A case study on classifying an actual lathe from the energy efficiency point of view, based on two different approaches (analytical and numerical), is also included.

  8. Knowledge-based approach to video content classification

    Science.gov (United States)

    Chen, Yu; Wong, Edward K.

    2001-01-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball, and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.
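    MYCIN's combining function for certainty factors, which the abstract mentions, can be stated compactly; the video-feature scenario in the example is invented for illustration.

    ```python
    # MYCIN-style combination of certainty factors (CFs) for the same hypothesis.
    # CFs lie in [-1, 1]: positive values support the hypothesis, negative oppose it.

    def combine_cf(cf1, cf2):
        """Combine two certainty factors using the standard MYCIN rules."""
        if cf1 >= 0 and cf2 >= 0:
            return cf1 + cf2 * (1 - cf1)
        if cf1 < 0 and cf2 < 0:
            return cf1 + cf2 * (1 + cf1)
        return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

    # Hypothetical: motion evidence (0.6) and color evidence (0.5) both suggest
    # the clip is "basketball"; the combined belief is stronger than either alone.
    print(round(combine_cf(0.6, 0.5), 2))  # 0.8
    ```

    The scheme is commutative and keeps combined values inside [-1, 1], which is why it suits incremental accumulation of evidence across many fired rules.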

  9. Bayesian outcome-based strategy classification.

    Science.gov (United States)

    Lee, Michael D

    2016-03-01

    Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014) recently developed a method for making inferences about the decision processes people use in multi-attribute forced choice tasks. Their paper makes a number of worthwhile theoretical and methodological contributions. Theoretically, they provide an insightful psychological motivation for a probabilistic extension of the widely-used "weighted additive" (WADD) model, and show how this model, as well as other important models like "take-the-best" (TTB), can and should be expressed in terms of meaningful priors. Methodologically, they develop an inference approach based on the Minimum Description Length (MDL) principle that balances both the goodness-of-fit and complexity of the decision models they consider. This paper aims to preserve these useful contributions, but provide a complementary Bayesian approach with some theoretical and methodological advantages. We develop a simple graphical model, implemented in JAGS, that allows for fully Bayesian inferences about which models people use to make decisions. To demonstrate the Bayesian approach, we apply it to the models and data considered by Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014), showing how a prior predictive analysis of the models, and posterior inferences about which models people use and the parameter settings at which they use them, can contribute to our understanding of human decision making.
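    The outcome-based idea can be illustrated with a deliberately simplified Bayesian model comparison (equal priors, one fixed per-trial error rate) rather than the paper's full JAGS graphical model; the strategies' predicted choices below are invented.

    ```python
    # Toy Bayesian strategy classification: which strategy more probably
    # generated a participant's observed choices? Epsilon is an assumed fixed
    # error rate, not an inferred parameter as in the full hierarchical model.

    def likelihood(choices, predictions, epsilon=0.1):
        """P(choices | strategy), with independent error rate epsilon per trial."""
        p = 1.0
        for c, pred in zip(choices, predictions):
            p *= (1 - epsilon) if c == pred else epsilon
        return p

    choices   = [1, 1, 0, 1, 1]   # observed decisions
    ttb_pred  = [1, 1, 0, 1, 1]   # what take-the-best (TTB) would choose
    wadd_pred = [1, 0, 0, 1, 0]   # what weighted-additive (WADD) would choose

    l_ttb, l_wadd = likelihood(choices, ttb_pred), likelihood(choices, wadd_pred)
    posterior_ttb = l_ttb / (l_ttb + l_wadd)  # equal prior on both strategies
    print(round(posterior_ttb, 2))  # 0.99
    ```

    With equal priors the posterior reduces to normalized likelihoods; the full Bayesian treatment additionally places priors over error rates and attribute weights and integrates them out.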

  10. Towards an agent-oriented programming language based on Scala

    Science.gov (United States)

    Mitrović, Dejan; Ivanović, Mirjana; Budimac, Zoran

    2012-09-01

    Scala and its multi-threaded model based on actors represent an excellent framework for developing purely reactive agents. This paper presents early research on extending Scala with declarative programming constructs, which would result in a new agent-oriented programming language suitable for developing more advanced, BDI agent architectures. The main advantage of the new language over many other existing solutions for programming BDI agents is a natural and straightforward integration of imperative and declarative programming constructs, fitted under a single development framework.

  11. Migration control for mobile agents based on passport and visa

    OpenAIRE

    Guan, SU; Wang, T; Ong, SH

    2003-01-01

    Research on mobile agents has attracted much attention as this paradigm has demonstrated great potential for the next-generation e-commerce. Proper solutions to security-related problems become key factors in the successful deployment of mobile agents in e-commerce systems. We propose the use of passport and visa (P/V) for securing mobile agent migration across communities based on the SAFER e-commerce framework. P/V not only serves as up-to-date digital credentials for agent-host authentica...

  12. Aryl sulfonate based anticancer alkylating agents.

    Science.gov (United States)

    Sheikh, Hamdullah Khadim; Arshad, Tanzila; Kanwal, Ghazala

    2018-05-01

    This research work revolves around the synthesis of antineoplastic alkylating sulfonate esters with dual alkylating sites for crosslinking of DNA strands. These molecules were evaluated as potential antineoplastic crosslinking alkylating agents by reaction with the nucleoside of the guanine DNA nucleobase at both ends of the synthesized molecule. Synthesis of the alkylating molecules and the crosslinking with the guanosine nucleoside was monitored by MALDI-TOF mass spectrometry. The synthesized molecules' crosslinking or adduct-forming rate with the nucleoside was compared with that of 1,4-butane disulfonate (busulfan), in the form of the time taken for the appearance of [M+H]+. It was found that the aryl sulfonate leaving group caused a higher rate of nucleophilic attack by the Lewis basic site of the nucleobase. Furthermore, the rate was also found to be a function of the electron-withdrawing or -donating nature of the substituent on the aryl ring. The compound with a strong electron-withdrawing substituent at the para position of the ring reacted fastest. Hence, new alkylating agents were synthesized with optimized or desired reactivity.

  13. Complexity in Simplicity: Flexible Agent-based State Space Exploration

    DEFF Research Database (Denmark)

    Rasmussen, Jacob Illum; Larsen, Kim Guldstrand

    2007-01-01

    In this paper, we describe a new flexible framework for state space exploration based on cooperating agents. The idea is to let various agents with different search patterns explore the state space individually and communicate information about fruitful subpaths of the search tree to each other...

  14. Agent-Based Modeling: A Powerful Tool for Tourism Researchers

    NARCIS (Netherlands)

    Nicholls, Sarah; Amelung, B.; Student, Jillian

    2017-01-01

    Agent-based modeling (ABM) is a way of representing complex systems of autonomous agents or actors, and of simulating the multiple potential outcomes of these agents’ behaviors and interactions in the form of a range of alternatives or futures. Despite the complexity of the tourism system, and the

  15. A Framework For Agent-Based Educational Guidance And ...

    African Journals Online (AJOL)

    This work applies principles of artificial intelligence and agent development to educational guidance and counselling. An agent-based expert system is developed. The system supports the storage and intelligent interactive processing of the knowledge acquired by study and experience of the human expert in the domain ...

  16. Agent-based transportation planning compared with scheduling heuristics

    NARCIS (Netherlands)

    Mes, Martijn R.K.; van der Heijden, Matthijs C.; van Harten, Aart

    2004-01-01

    Here we consider the problem of dynamically assigning vehicles to transportation orders that have different time windows and should be handled in real time. We introduce a new agent-based system for the planning and scheduling of these transportation networks. Intelligent vehicle agents schedule

  17. Optical beam classification using deep learning: a comparison with rule- and feature-based classification

    Science.gov (United States)

    Alom, Md. Zahangir; Awwal, Abdul A. S.; Lowe-Webb, Roger; Taha, Tarek M.

    2017-08-01

    Vector Machine (SVM). The experimental results show around 96% classification accuracy using CNN; the CNN approach also provides recognition results comparable to the present feature-based off-normal detection. The feature-based solution was developed to capture the expertise of a human expert in classifying the images. The misclassified results are further studied to explain the differences and discover any discrepancies or inconsistencies in the current classification.

  18. A Chinese text classification system based on Naive Bayes algorithm

    Directory of Open Access Journals (Sweden)

    Cui Wei

    2016-01-01

    Full Text Available In this paper, aiming at the characteristics of Chinese text classification, we use ICTCLAS (the Chinese lexical analysis system of the Chinese Academy of Sciences) for document segmentation, perform data cleaning and stop-word filtering, and apply the information gain and document frequency feature selection algorithms for document feature selection. On this basis, we implement a text classifier based on the Naive Bayes algorithm and carry out experiments and analysis on the system using the Chinese corpus of Fudan University.
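    The classification step described above can be sketched as a minimal multinomial Naive Bayes classifier with Laplace smoothing; segmentation and feature selection are assumed to have already produced token lists, and the toy corpus is invented.

    ```python
    from collections import Counter, defaultdict
    import math

    def train(docs):
        """docs: list of (tokens, label). Returns class priors, word counts, vocab."""
        priors, counts, vocab = Counter(), defaultdict(Counter), set()
        for tokens, label in docs:
            priors[label] += 1
            counts[label].update(tokens)
            vocab.update(tokens)
        return priors, counts, vocab

    def predict(tokens, priors, counts, vocab):
        """Pick the label maximizing log P(label) + sum log P(token | label)."""
        total = sum(priors.values())
        best, best_lp = None, float("-inf")
        for label in priors:
            n = sum(counts[label].values())
            lp = math.log(priors[label] / total)
            for t in tokens:
                # Laplace (add-one) smoothing over the vocabulary.
                lp += math.log((counts[label][t] + 1) / (n + len(vocab)))
            if lp > best_lp:
                best, best_lp = label, lp
        return best

    docs = [("stock market shares".split(), "finance"),
            ("match goal team".split(), "sports"),
            ("market trading profit".split(), "finance")]
    model = train(docs)
    print(predict("market profit".split(), *model))  # finance
    ```

    For Chinese text, the whitespace `split()` here would be replaced by the output of a segmenter such as ICTCLAS, since Chinese has no word delimiters.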

  19. Agent-based Simulation of the Maritime Domain

    Directory of Open Access Journals (Sweden)

    O. Vaněk

    2010-01-01

    Full Text Available In this paper, a multi-agent based simulation platform is introduced that focuses on legitimate and illegitimate aspects of maritime traffic, mainly on intercontinental transport through piracy afflicted areas. The extensible architecture presented here comprises several modules controlling the simulation and the life-cycle of the agents, analyzing the simulation output and visualizing the entire simulated domain. The simulation control module is initialized by various configuration scenarios to simulate various real-world situations, such as a pirate ambush, coordinated transit through a transport corridor, or coastal fishing and local traffic. The environmental model provides a rich set of inputs for agents that use the geo-spatial data and the vessel operational characteristics for their reasoning. The agent behavior model based on finite state machines together with planning algorithms allows complex expression of agent behavior, so the resulting simulation output can serve as a substitution for real world data from the maritime domain.

  20. Group-Based Active Learning of Classification Models.

    Science.gov (United States)

    Luo, Zhipeng; Hauskrecht, Milos

    2017-05-01

    Learning of classification models from real-world data often requires additional human expert effort to annotate the data. However, this process can be rather costly and finding ways of reducing the human annotation effort is critical for this task. The objective of this paper is to develop and study new ways of providing human feedback for efficient learning of classification models by labeling groups of examples. Briefly, unlike traditional active learning methods that seek feedback on individual examples, we develop a new group-based active learning framework that solicits label information on groups of multiple examples. In order to describe groups in a user-friendly way, conjunctive patterns are used to compactly represent groups. Our empirical study on 12 UCI data sets demonstrates the advantages and superiority of our approach over both classic instance-based active learning work, as well as existing group-based active-learning methods.
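    A toy sketch of soliciting labels for groups described by conjunctive patterns; the group-scoring heuristic used here (coverage of unlabeled examples) is our simplification for illustration, not the paper's actual selection criterion.

    ```python
    # Group-based label solicitation sketch: one oracle answer labels every
    # example matching the chosen conjunctive pattern, amortizing human effort.

    def matches(example, pattern):
        """A conjunctive pattern is a dict of feature -> required value."""
        return all(example[f] == v for f, v in pattern.items())

    def pick_group(unlabeled, patterns):
        """Query the pattern covering the most unlabeled examples (toy heuristic)."""
        return max(patterns, key=lambda p: sum(matches(x, p) for x in unlabeled))

    unlabeled = [{"fever": 1, "cough": 1},
                 {"fever": 1, "cough": 0},
                 {"fever": 0, "cough": 1}]
    patterns = [{"fever": 1}, {"cough": 1, "fever": 0}]

    group = pick_group(unlabeled, patterns)
    labeled = [(x, "flu") for x in unlabeled if matches(x, group)]  # one query, many labels
    print(len(labeled))  # 2
    ```

    The appeal over instance-based querying is visible even in this toy: one human response yields two labeled examples instead of one.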

  1. Exploring complex dynamics in multi agent-based intelligent systems: Theoretical and experimental approaches using the Multi Agent-based Behavioral Economic Landscape (MABEL) model

    Science.gov (United States)

    Alexandridis, Konstantinos T.

    This dissertation adopts a holistic and detailed approach to modeling spatially explicit agent-based artificial intelligent systems, using the Multi Agent-based Behavioral Economic Landscape (MABEL) model. The research questions it addresses stem from the need to understand and analyze the real-world patterns and dynamics of land use change from a coupled human-environmental systems perspective. It describes the systemic, mathematical, statistical, socio-economic and spatial dynamics of the MABEL modeling framework, and provides a wide array of cross-disciplinary modeling applications within the research, decision-making and policy domains. It establishes the symbolic properties of the MABEL model as a Markov decision process, analyzes the decision-theoretic utility and optimization attributes of agents towards comprising statistically and spatially optimal policies and actions, and explores the probabilistic character of the agents' decision-making and inference mechanisms via the use of Bayesian belief and decision networks. It develops and describes a Monte Carlo methodology for experimental replications of agents' decisions regarding complex spatial parcel acquisition and learning. It recognizes the gap in spatially-explicit accuracy assessment techniques for complex spatial models, and proposes an ensemble of statistical tools designed to address this problem. Advanced information assessment techniques such as the Receiver-Operator Characteristic curve, the impurity entropy and Gini functions, and the Bayesian classification functions are proposed. The theoretical foundation for modular Bayesian inference in spatially-explicit multi-agent artificial intelligent systems, and the ensembles of cognitive and scenario assessment modular tools built for the MABEL model, are provided.
    It emphasizes modularity and robustness as valuable qualitative modeling attributes, and examines the role of robust intelligent modeling as a tool for improving policy decisions related to land

  2. Agent-based method for distributed clustering of textual information

    Science.gov (United States)

    Potok, Thomas E [Oak Ridge, TN]; Reed, Joel W [Knoxville, TN]; Elmore, Mark T [Oak Ridge, TN]; Treadwell, Jim N [Louisville, TN]

    2010-09-28

    A computer method and system for storing, retrieving and displaying information has a multiplexing agent (20) that calculates a new document vector (25) for a new document (21) to be added to the system and transmits the new document vector (25) to master cluster agents (22) and cluster agents (23) for evaluation. These agents (22, 23) perform the evaluation and return values upstream to the multiplexing agent (20) based on the similarity of the document to documents stored under their control. The multiplexing agent (20) then sends the document (21) and the document vector (25) to the master cluster agent (22), which then forwards it to a cluster agent (23) or creates a new cluster agent (23) to manage the document (21). The system also searches for stored documents according to a search query having at least one term and identifying the documents found in the search, and displays the documents in a clustering display (80) of similarity so as to indicate similarity of the documents to each other.
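    The routing step performed by the multiplexing agent can be sketched as nearest-centroid assignment on document vectors, with a similarity threshold (our assumption, not specified in the abstract) deciding when a new cluster agent would be created.

    ```python
    import math

    def cosine(a, b):
        """Cosine similarity between two dense vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    def route(doc_vec, clusters, threshold=0.5):
        """Return the index of the most similar cluster, or create a new one."""
        if clusters:
            best = max(range(len(clusters)), key=lambda i: cosine(doc_vec, clusters[i]))
            if cosine(doc_vec, clusters[best]) >= threshold:
                return best
        clusters.append(doc_vec)  # stands in for spawning a new cluster agent
        return len(clusters) - 1

    clusters = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
    print(route([0.9, 0.1, 0.0], clusters))  # 0 (similar to first centroid)
    print(route([0.0, 0.0, 1.0], clusters))  # 2 (new cluster created)
    ```

    In the patented system this comparison is distributed: each cluster agent scores the document against its own holdings and returns the value upstream, rather than one process holding all centroids.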

  3. Classification of Hearing Loss Disorders Using Teoae-Based Descriptors

    Science.gov (United States)

    Hatzopoulos, Stavros Dimitris

    Transiently Evoked Otoacoustic Emissions (TEOAE) are signals produced by the cochlea upon stimulation by an acoustic click. Within the context of this dissertation, it was hypothesized that the relationship between the TEOAEs and the functional status of the OHCs provided an opportunity for designing a TEOAE-based clinical procedure that could be used to assess cochlear function. To understand the nature of the TEOAE signals in the time and the frequency domain, several different analyses were performed. Using normative Input-Output (IO) curves, short-time FFT analyses and cochlear computer simulations, it was found that for optimization of the hearing loss classification it is necessary to use a complete 20 ms TEOAE segment. It was also determined that the various 2-D filtering methods (median and averaging filtering masks, LP-FFT) used to enhance the TEOAE S/N offered minimal improvement (less than 6 dB per stimulus level). Higher S/N improvements resulted in TEOAE sequences that were over-smoothed. The final classification algorithm was based on a statistical analysis of raw FFT data and, when applied to a sample set of clinically obtained TEOAE recordings (from 56 normal and 66 hearing-loss subjects), correctly identified 94.3% of the normal and 90% of the hearing loss subjects at the 80 dB SPL stimulus level. To enhance the discrimination between the conductive and the sensorineural populations, data from the 68 dB SPL stimulus level were used, which yielded a normal classification of 90.2%, a hearing loss classification of 87.5% and a conductive-sensorineural classification of 87%. Among the hearing-loss populations the best discrimination was obtained in the group of otosclerosis and the worst in the group of acute acoustic trauma.

  4. Vehicle Maneuver Detection with Accelerometer-Based Classification

    Directory of Open Access Journals (Sweden)

    Javier Cervantes-Villanueva

    2016-09-01

    Full Text Available In the mobile computing era, smartphones have become instrumental tools to develop innovative mobile context-aware systems. In that sense, their usage in the vehicular domain eases the development of novel and personal transportation solutions. In this frame, the present work introduces an innovative mechanism to perceive the current kinematic state of a vehicle on the basis of the accelerometer data from a smartphone mounted in the vehicle. Unlike previous proposals, the introduced architecture targets the computational limitations of such devices to carry out the detection process following an incremental approach. For its realization, we have evaluated different classification algorithms to act as agents within the architecture. Finally, our approach has been tested with a real-world dataset collected by means of the ad hoc mobile application developed.

  5. Comparison Of Power Quality Disturbances Classification Based On Neural Network

    Directory of Open Access Journals (Sweden)

    Nway Nway Kyaw Win

    2015-07-01

    Full Text Available Abstract Power quality disturbances (PQDs) cause serious problems for the reliability, safety and economy of a power system network. In order to improve electric power quality, PQD events must be detected and classified by type of transient fault. A methodology for the automatic classification of eight types of PQ signals (flicker, harmonics, sag, swell, impulse, fluctuation, notch and oscillatory transient), based on software analysis of the wavelet transform with the multiresolution analysis (MRA) algorithm and two feed-forward neural networks (probabilistic and multilayer feed-forward), is presented. The wavelet family Db4 is chosen in this system to calculate the values of the detailed energy distributions as input features for classification, because it performs well in detecting and localizing various types of PQ disturbances. The classifiers identify the disturbance type according to the energy distribution. The results show that the PNN can analyze different power disturbance types efficiently. Therefore it can be seen that the PNN has better classification accuracy than the MLFF network.
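    The energy-distribution features can be illustrated with a one-level Haar decomposition standing in for the Db4 multiresolution analysis used in the paper: the energy of the detail coefficients rises sharply for a signal containing a transient disturbance.

    ```python
    # One-level Haar wavelet analysis (a simpler stand-in for Db4 MRA) and the
    # detail-coefficient energy used as a disturbance feature.

    def haar_step(signal):
        """One Haar analysis level: (approximation, detail) coefficient lists."""
        approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
        detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
        return approx, detail

    def detail_energy(signal):
        """Sum of squared detail coefficients: high for abrupt local changes."""
        _, detail = haar_step(signal)
        return sum(d * d for d in detail)

    smooth = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
    impulse = [1.0, 1.0, 5.0, 1.0, 1.0, 1.0]   # transient disturbance
    print(detail_energy(smooth) < detail_energy(impulse))  # True
    ```

    In the full method, energies from several decomposition levels form the feature vector fed to the PNN or MLFF classifier, so both the size and the frequency localization of a disturbance are captured.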

  6. Classification of neuromuscular blocking agents in a new neuromuscular preparation of the chick in vitro

    NARCIS (Netherlands)

    Riezen, H. van

    1968-01-01

    A neuromuscular preparation of the chick is described: 1. The sciatic nerve-tibialis anterior muscle preparation of the 2–10 days old chick fulfils all criteria of an assay preparation and differentiates between curare-like and decamethonium-like agents. 2. The preparation responds to

  7. Agent-Based Collaborative Traffic Flow Management, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose agent-based game-theoretic approaches for simulation of strategies involved in multi-objective collaborative traffic flow management (CTFM). Intelligent...

  8. Agent-based models in economics a toolkit

    CERN Document Server

    Fagiolo, Giorgio; Gallegati, Mauro; Richiardi, Matteo; Russo, Alberto

    2018-01-01

    In contrast to mainstream economics, complexity theory conceives the economy as a complex system of heterogeneous interacting agents characterised by limited information and bounded rationality. Agent Based Models (ABMs) are the analytical and computational tools developed by the proponents of this emerging methodology. Aimed at students and scholars of contemporary economics, this book includes a comprehensive toolkit for agent-based computational economics, now quickly becoming the new way to study evolving economic systems. Leading scholars in the field explain how ABMs can be applied fruitfully to many real-world economic examples and represent a great advancement over mainstream approaches. The essays discuss the methodological bases of agent-based approaches and demonstrate step-by-step how to build, simulate and analyse ABMs and how to validate their outputs empirically using the data. They also present a wide set of applications of these models to key economic topics, including the business cycle, lab...

  9. Teamcore Project Control of Agent-Based Systems (COABS) Program

    National Research Council Canada - National Science Library

    Tambe, Milind

    2002-01-01

    An increasing number of agent-based systems now operate in complex dynamic environments, such as disaster rescue missions, monitoring/surveillance tasks, enterprise integration, and education/training environments...

  10. Use of agent based simulation for traffic safety assessment

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2008-07-01

    Full Text Available This paper describes the development of an agent based Computational Building Simulation (CBS) tool, termed KRONOS that is being used to work on advanced research questions such as traffic safety assessment and user behaviour in buildings...

  11. Structure-based classification and ontology in chemistry

    Directory of Open Access Journals (Sweden)

    Hastings Janna

    2012-04-01

    Full Text Available Abstract Background Recent years have seen an explosion in the availability of data in the chemistry domain. With this information explosion, however, retrieving relevant results from the available information, and organising those results, become even harder problems. Computational processing is essential to filter and organise the available resources so as to better facilitate the work of scientists. Ontologies encode expert domain knowledge in a hierarchically organised machine-processable format. One such ontology for the chemical domain is ChEBI. ChEBI provides a classification of chemicals based on their structural features and a role or activity-based classification. An example of a structure-based class is 'pentacyclic compound' (compounds containing five-ring structures), while an example of a role-based class is 'analgesic', since many different chemicals can act as analgesics without sharing structural features. Structure-based classification in chemistry exploits elegant regularities and symmetries in the underlying chemical domain. As yet, there has been neither a systematic analysis of the types of structural classification in use in chemistry nor a comparison to the capabilities of available technologies. Results We analyze the different categories of structural classes in chemistry, presenting a list of patterns for features found in class definitions. We compare these patterns of class definition to tools which allow for automation of hierarchy construction within cheminformatics and within logic-based ontology technology, going into detail in the latter case with respect to the expressive capabilities of the Web Ontology Language and recent extensions for modelling structured objects. Finally we discuss the relationships and interactions between cheminformatics approaches and logic-based approaches. Conclusion Systems that perform intelligent reasoning tasks on chemistry data require a diverse set of underlying computational

  12. Gadolinium-based contrast agents in pediatric magnetic resonance imaging

    Energy Technology Data Exchange (ETDEWEB)

    Gale, Eric M.; Caravan, Peter [Massachusetts General Hospital, Harvard Medical School, Department of Radiology, The Martinos Center for Biomedical Imaging, Boston, MA (United States); Rao, Anil G. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); McDonald, Robert J. [College of Medicine, Mayo Clinic, Department of Radiology, Rochester, MN (United States); Winfeld, Matthew [University of Pennsylvania Perelman School of Medicine, Philadelphia, PA (United States); Fleck, Robert J. [Cincinnati Children's Hospital Medical Center, Department of Pediatric Radiology, Cincinnati, OH (United States); Gee, Michael S. [MassGeneral Hospital for Children, Harvard Medical School, Division of Pediatric Imaging, Department of Radiology, Boston, MA (United States)

    2017-05-15

    Gadolinium-based contrast agents can increase the accuracy and expediency of an MRI examination. However the benefits of a contrast-enhanced scan must be carefully weighed against the well-documented risks associated with administration of exogenous contrast media. The purpose of this review is to discuss commercially available gadolinium-based contrast agents (GBCAs) in the context of pediatric radiology. We discuss the chemistry, regulatory status, safety and clinical applications, with particular emphasis on imaging of the blood vessels, heart, hepatobiliary tree and central nervous system. We also discuss non-GBCA MRI contrast agents that are less frequently used or not commercially available. (orig.)

  13. The fractional volatility model: An agent-based interpretation

    Science.gov (United States)

    Vilela Mendes, R.

    2008-06-01

    Based on the criteria of mathematical simplicity and consistency with empirical market data, a model with volatility driven by fractional noise has been constructed which provides a fairly accurate mathematical parametrization of the data. Here, some features of the model are reviewed and extended to account for leverage effects. Using agent-based models, one tries to find which agent strategies and (or) properties of the financial institutions might be responsible for the features of the fractional volatility model.

  14. An Intelligent Agent based Architecture for Visual Data Mining

    OpenAIRE

    Hamdi Ellouzi; Hela Ltifi; Mounir Ben Ayed

    2016-01-01

    The aim of this paper is to present an intelligent architecture for Decision Support Systems (DSS) based on visual data mining. This architecture applies multi-agent technology to facilitate the design and development of DSS in complex and dynamic environments. Multi-Agent Systems add a high level of abstraction. To validate the proposed architecture, it is implemented to develop a distributed visual data mining based DSS to predict the occurrence of nosocomial infections in intensive care units. Th...

  15. Gradient Evolution-based Support Vector Machine Algorithm for Classification

    Science.gov (United States)

    Zulvia, Ferani E.; Kuo, R. J.

    2018-03-01

    This paper proposes a classification algorithm based on a support vector machine (SVM) and gradient evolution (GE) algorithms. The SVM algorithm has been widely used in classification. However, its result is significantly influenced by its parameters. Therefore, this paper aims to propose an improvement of the SVM algorithm which can find the best SVM parameters automatically. The proposed algorithm employs a GE algorithm to automatically determine the SVM parameters. The GE algorithm acts as a global optimizer in finding the best parameters, which are then used by the SVM algorithm. The proposed GE-SVM algorithm is verified using some benchmark datasets and compared with other metaheuristic-based SVM algorithms. The experimental results show that the proposed GE-SVM algorithm obtains better results than the other algorithms tested in this paper.
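    The "global optimizer wrapped around SVM training" pattern can be sketched with a generic evolutionary loop over two SVM-style hyperparameters; the fitness function below is a toy stand-in for cross-validation error, and the operators are ordinary Gaussian mutation with elitism, not the paper's gradient-evolution updates.

    ```python
    import random

    # Toy stand-in for cross-validation error of an SVM with parameters (C, gamma);
    # it is minimized at C = 10, gamma = 0.1. In practice this function would
    # train and validate an actual SVM.
    def fitness(c, gamma):
        return (c - 10.0) ** 2 + 100.0 * (gamma - 0.1) ** 2

    def evolve(generations=200, pop_size=20, seed=1):
        """Elitist evolutionary search over (C, gamma)."""
        rng = random.Random(seed)
        pop = [(rng.uniform(0.1, 100), rng.uniform(0.001, 1)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda p: fitness(*p))
            parents = pop[: pop_size // 2]        # keep the better half
            children = [(max(0.1, c + rng.gauss(0, 1)),
                         max(0.001, g + rng.gauss(0, 0.05)))
                        for c, g in parents]      # Gaussian mutation
            pop = parents + children
        return min(pop, key=lambda p: fitness(*p))

    c, gamma = evolve()
    print(round(c, 1), round(gamma, 2))
    ```

    With the toy fitness above, the loop reliably converges near C = 10, gamma = 0.1; swapping in real cross-validation error gives the wrapper structure the abstract describes.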

  16. Vessel-guided airway segmentation based on voxel classification

    DEFF Research Database (Denmark)

    Lo, Pechin Chien Pau; Sporring, Jon; Ashraf, Haseem

    2008-01-01

    This paper presents a method for improving airway tree segmentation using vessel orientation information. We use the fact that an airway branch is always accompanied by an artery, with both structures having similar orientations. This work is based on a voxel classification airway segmentation method proposed previously. The probability of a voxel belonging to the airway, from the voxel classification method, is augmented with an orientation similarity measure as a criterion for region growing. The orientation similarity measure of a voxel indicates how similar the orientation of the surroundings of a voxel, estimated based on a tube model, is to that of a neighboring vessel. The proposed method is tested on 20 CT images from different subjects selected randomly from a lung cancer screening study. Lengths of the airway branches from the results of the proposed method are significantly

  17. A Sieving ANN for Emotion-Based Movie Clip Classification

    Science.gov (United States)

    Watanapa, Saowaluk C.; Thipakorn, Bundit; Charoenkitkarn, Nipon

    Effective classification and analysis of semantic contents are very important for the content-based indexing and retrieval of video databases. Our research attempts to classify movie clips into three groups of commonly elicited emotions, namely excitement, joy and sadness, based on a set of abstract-level semantic features extracted from the film sequence. In particular, these features consist of six visual and audio measures grounded on artistic film theories. A unique sieving-structured neural network is proposed as the classifying model due to its robustness. The performance of the proposed model is tested with 101 movie clips excerpted from 24 award-winning and well-known Hollywood feature films. The experimental result of a 97.8% correct classification rate, measured against the collected human judgments, indicates the great potential of using abstract-level semantic features as an engineered tool for video-content retrieval/indexing applications.

  18. Land Cover and Land Use Classification with TWOPAC: towards Automated Processing for Pixel- and Object-Based Image Classification

    Directory of Open Access Journals (Sweden)

    Stefan Dech

    2012-09-01

    We present a novel and innovative automated processing environment for the derivation of land cover (LC) and land use (LU) information. This processing framework, named TWOPAC (TWinned Object and Pixel based Automated classification Chain), enables the standardized, independent, user-friendly, and comparable derivation of LC and LU information, with minimized manual classification labor. TWOPAC allows classification of multi-spectral and multi-temporal remote sensing imagery from different sensor types. TWOPAC enables not only pixel-based classification, but also classification based on object-based characteristics. Classification uses a Decision Tree (DT) approach, for which the well-known C5.0 code has been implemented; it builds decision trees based on the concept of information entropy. TWOPAC enables automatic generation of the decision tree classifier from a C5.0-retrieved ASCII file, as well as fully automatic validation of the classification output via sample-based accuracy assessment. Envisaging the automated generation of standardized land cover products, as well as area-wide classification of large amounts of data in preferably short processing times, standardized interfaces for process control, Web Processing Services (WPS) as introduced by the Open Geospatial Consortium (OGC), are utilized. TWOPAC's functionality to process geospatial raster or vector data via web resources (server, network) enables its use independent of any commercial client or desktop software and allows large-scale data processing on servers. Furthermore, the components of TWOPAC were built from open source code components and are implemented as a plug-in for the Quantum GIS software for easy handling of the classification process from the user's perspective.
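
    The entropy-based decision-tree step can be sketched as follows. This is a minimal illustration, not TWOPAC's code: scikit-learn's CART with the entropy criterion stands in for the proprietary C5.0 implementation, and the three-band "spectral" samples and class names are synthetic placeholders.

    ```python
    # Sketch: train an information-entropy decision tree on per-pixel spectra,
    # then run a sample-based accuracy assessment, as in TWOPAC's validation step.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    # Fake training samples: rows = pixels/objects, columns = spectral bands.
    n = 300
    water  = rng.normal([0.05, 0.04, 0.30], 0.02, (n, 3))   # low red/NIR
    forest = rng.normal([0.04, 0.45, 0.20], 0.03, (n, 3))   # high NIR
    urban  = rng.normal([0.25, 0.25, 0.25], 0.04, (n, 3))   # flat spectrum
    X = np.vstack([water, forest, urban])
    y = np.repeat(["water", "forest", "urban"], n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = DecisionTreeClassifier(criterion="entropy", max_depth=5).fit(X_tr, y_tr)

    # Held-out samples play the role of the validation sample set.
    acc = accuracy_score(y_te, clf.predict(X_te))
    print(f"overall accuracy: {acc:.2f}")
    ```

    On well-separated synthetic classes like these the tree is near-perfect; real imagery needs genuine training samples and the framework's full pre-processing.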

  19. Laser-based instrumentation for the detection of chemical agents

    International Nuclear Information System (INIS)

    Hartford, A. Jr.; Sander, R.K.; Quigley, G.P.; Radziemski, L.J.; Cremers, D.A.

    1982-01-01

    Several laser-based techniques are being evaluated for the remote, point, and surface detection of chemical agents. Among the methods under investigation are optoacoustic spectroscopy, laser-induced breakdown spectroscopy (LIBS), and synchronous detection of laser-induced fluorescence (SDLIF). Optoacoustic detection has already been shown to be capable of extremely sensitive point detection. Its application to remote sensing of chemical agents is currently being evaluated. Atomic emission from the region of a laser-generated plasma has been used to identify the characteristic elements contained in nerve (P and F) and blister (S and Cl) agents. Employing this LIBS approach, detection of chemical agent simulants dispersed in air and adsorbed on a variety of surfaces has been achieved. Synchronous detection of laser-induced fluorescence provides an attractive alternative to conventional LIF, in that an artificial narrowing of the fluorescence emission is obtained. The application of this technique to chemical agent simulants has been successfully demonstrated. 19 figures

  20. Evolutionary game theory using agent-based methods.

    Science.gov (United States)

    Adami, Christoph; Schossau, Jory; Hintze, Arend

    2016-12-01

    Evolutionary game theory is a successful mathematical framework geared towards understanding the selective pressures that affect the evolution of the strategies of agents engaged in interactions with potential conflicts. While a mathematical treatment of the costs and benefits of decisions can predict the optimal strategy in simple settings, more realistic settings such as finite populations, non-vanishing mutation rates, stochastic decisions, communication between agents, and spatial interactions require agent-based methods where each agent is modeled as an individual, carries its own genes that determine its decisions, and where the evolutionary outcome can only be ascertained by evolving the population of agents forward in time. While highlighting standard mathematical results, we compare them to agent-based methods that can go beyond the limitations of equations and simulate the complexity of heterogeneous populations and an ever-changing set of interactors. We conclude that agent-based methods can predict evolutionary outcomes where purely mathematical treatments cannot tread (for example in the weak selection-strong mutation limit), but that mathematics is crucial to validate the computational simulations. Copyright © 2016 Elsevier B.V. All rights reserved.
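
    A minimal sketch of the kind of simulation the abstract describes (not the authors' code): a finite population plays the Prisoner's Dilemma under fitness-proportional death-birth updating with a non-vanishing mutation rate, exactly the regime where equations become hard to apply. The payoff values and parameters are illustrative.

    ```python
    # Agent-based evolutionary game sketch: each agent carries one "gene"
    # (strategy C or D); the population is evolved forward in time.
    import random

    random.seed(1)
    PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}
    N, MU, STEPS = 100, 0.01, 500

    agents = [random.choice("CD") for _ in range(N)]

    def fitness(i):
        # Average payoff of agent i against the rest of the population.
        return sum(PAYOFF[(agents[i], agents[j])] for j in range(N) if j != i) / (N - 1)

    for _ in range(STEPS):
        # Death-birth update: a random agent dies, a fitness-proportional
        # parent reproduces, and the offspring may mutate.
        dead = random.randrange(N)
        weights = [fitness(i) for i in range(N)]
        parent = random.choices(range(N), weights=weights, k=1)[0]
        child = agents[parent]
        if random.random() < MU:               # non-vanishing mutation rate
            child = "C" if child == "D" else "D"
        agents[dead] = child

    frac_defect = agents.count("D") / N
    print(f"fraction of defectors after {STEPS} updates: {frac_defect:.2f}")
    ```

    Changing the payoff matrix, population size, or mutation rate changes the outcome; ascertaining it requires running the population forward, which is the abstract's point.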

  1. Multi-issue Agent Negotiation Based on Fairness

    Science.gov (United States)

    Zuo, Baohe; Zheng, Sue; Wu, Hong

    Agent-based e-commerce services have become a research hotspot. How to make the agent negotiation process fast and efficient is a main research direction in this area. In multi-issue models, MAUT (Multi-Attribute Utility Theory) and its derived theories usually give little consideration to the fairness of both negotiators. This work presents a general model of agent negotiation that considers the satisfaction of both negotiators via autonomous learning. The model can evaluate offers from the opponent agent based on the satisfaction degree, learn online about the opponent from historical interaction instances and the current negotiation, and make concessions dynamically based on a fairness objective. By building this negotiation model, bilateral negotiations achieve higher efficiency and fairer deals.
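
    The evaluate-and-concede loop can be sketched generically. This is a hypothetical illustration, not the authors' model: a weighted additive utility stands in for the satisfaction degree, and a fixed time-dependent concession curve stands in for the learned concession strategy.

    ```python
    # Generic bilateral multi-issue negotiation sketch (illustrative only).
    def utility(offer, weights):
        # Weighted additive utility ("satisfaction degree") of an offer,
        # where each issue value is already scaled to [0, 1].
        return sum(weights[k] * offer[k] for k in weights)

    def demand(t, deadline, beta=1.0):
        # Time-dependent concession: insist on 1.0 at t = 0 and approach the
        # fair split 0.5 by the deadline; beta tunes the concession speed.
        return 1.0 - 0.5 * (t / deadline) ** beta

    buyer_w = {"price": 0.6, "delivery": 0.4}
    deadline = 10
    for t in range(deadline + 1):
        target = demand(t, deadline)
        seller_offer = {"price": target, "delivery": target}   # seller's view
        # Buyer and seller have strictly opposed preferences on both issues.
        buyer_u = utility({k: 1.0 - v for k, v in seller_offer.items()}, buyer_w)
        if buyer_u >= demand(t, deadline):   # offer meets the buyer's own demand
            print(f"agreement in round {t}: utility {buyer_u:.2f} for each side")
            break
    ```

    With symmetric concession both sides meet at the fair 0.5/0.5 split at the deadline; an online-learning agent would instead shape its concession curve from the opponent's observed offers.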

  2. A new gammagraphic and functional-based classification for hyperthyroidism

    International Nuclear Information System (INIS)

    Sanchez, J.; Lamata, F.; Cerdan, R.; Agilella, V.; Gastaminza, R.; Abusada, R.; Gonzales, M.; Martinez, M.

    2000-01-01

    The absence of a universal classification for hyperthyroidism (HT) gives rise to inadequate interpretation of series and trials, and hinders decision making. We offer a tentative classification based on gammagraphic and functional findings. Clinical records from patients who underwent thyroidectomy in our department from 1967 to 1997 were reviewed. Those with functional measurements of hyperthyroidism were considered. All were managed according to the same pre-established guidelines. HT was the surgical indication in 694 (27.1%) of the 2559 thyroidectomies. Based on gammagraphic studies, we classified HT into: parenchymatous increased uptake, which could be diffuse, diffuse with cold nodules, or diffuse with at least one nodule; and nodular increased uptake (Autonomously Functioning Thyroid Nodules, AFTN), divided into solitary AFTN (toxic adenoma) and multiple AFTN (toxic multinodular goiter). This gammagraphic-based classification is useful and has high sensitivity to detect these nodules while assessing their activity, supporting therapeutic decision making and, in some cases, the choice of surgical technique. (authors)

  3. Changing Histopathological Diagnostics by Genome-Based Tumor Classification

    Directory of Open Access Journals (Sweden)

    Michael Kloth

    2014-05-01

    Traditionally, tumors are classified by histopathological criteria, i.e., based on their specific morphological appearance. Consequently, current therapeutic decisions in oncology are strongly influenced by histology rather than underlying molecular or genomic aberrations. However, the growth of information on molecular changes, enabled by the Human Genome Project and the International Cancer Genome Consortium as well as manifold advances in molecular biology and high-throughput sequencing techniques, has inaugurated the integration of genomic information into disease classification. Furthermore, in some cases it became evident that former classifications needed major revision and adaptation. Such adaptations are often driven by understanding the pathogenesis of a disease from a specific molecular alteration and using this molecular driver for targeted and highly effective therapies. Altogether, reclassification should lead to a higher information content of the underlying diagnoses, reflecting their molecular pathogenesis and resulting in optimized and individualized therapeutic decisions. The objective of this article is to summarize some particularly important examples of genome-based classification approaches and associated therapeutic concepts. In addition to reviewing disease-specific markers, we focus on potentially therapeutic or predictive markers and the relevance of molecular diagnostics in disease monitoring.

  4. G0-WISHART Distribution Based Classification from Polarimetric SAR Images

    Science.gov (United States)

    Hu, G. C.; Zhao, Q. H.

    2017-09-01

    Enormous scientific and technical developments have been carried out over recent decades to further improve remote sensing, particularly the Polarimetric Synthetic Aperture Radar (PolSAR) technique, so classification methods based on PolSAR images have received much attention from scholars and related departments around the world. The multilook polarimetric G0-Wishart model is a more flexible model which describes homogeneous, heterogeneous and extremely heterogeneous regions in the image. Moreover, the polarimetric G0-Wishart distribution does not include the modified Bessel function of the second kind; it is a simple statistical distribution model with fewer parameters. To prove its feasibility, a classification process has been tested on full-polarized Synthetic Aperture Radar (SAR) images with this method. First, multilook polarimetric SAR data processing and a speckle filter are applied to reduce the influence of speckle on the classification result. The image is initially classified into sixteen classes by H/A/α decomposition. The ICM algorithm is then used to classify features based on the G0-Wishart distance. Qualitative and quantitative results show that the proposed method can classify polarimetric SAR data effectively and efficiently.

  5. Agent-based services for B2B electronic commerce

    Science.gov (United States)

    Fong, Elizabeth; Ivezic, Nenad; Rhodes, Tom; Peng, Yun

    2000-12-01

    The potential of agent-based systems has not been realized yet, in part, because of the lack of understanding of how the agent technology supports industrial needs and emerging standards. The area of business-to-business electronic commerce (b2b e-commerce) is one of the most rapidly developing sectors of industry with huge impact on manufacturing practices. In this paper, we investigate the current state of agent technology and the feasibility of applying agent-based computing to b2b e-commerce in the circuit board manufacturing sector. We identify critical tasks and opportunities in the b2b e-commerce area where agent-based services can best be deployed. We describe an implemented agent-based prototype system to facilitate the bidding process for printed circuit board manufacturing and assembly. These activities are taking place within the Internet Commerce for Manufacturing (ICM) project, the NIST-sponsored project working with industry to create an environment where small manufacturers of mechanical and electronic components may participate competitively in virtual enterprises that manufacture printed circuit assemblies.

  6. Emergent Macroeconomics An Agent-Based Approach to Business Fluctuations

    CERN Document Server

    Delli Gatti, Domenico; Gallegati, Mauro; Giulioni, Gianfranco; Palestrini, Antonio

    2008-01-01

    This book contributes substantively to the current state-of-the-art of macroeconomics by providing a method for building models in which business cycles and economic growth emerge from the interactions of a large number of heterogeneous agents. Drawing from recent advances in agent-based computational modeling, the authors show how insights from dispersed fields like the microeconomics of capital market imperfections, industrial dynamics and the theory of stochastic processes can be fruitfully combined to improve our understanding of macroeconomic dynamics. This book should be a valuable resource for all researchers interested in analyzing macroeconomic issues without resorting to a fictitious representative agent.

  7. QoS Negotiation and Renegotiation Based on Mobile Agents

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shi-bing; ZHANG Deng-yin

    2006-01-01

    Quality of Service (QoS) has received more and more attention as it becomes increasingly important in the development of the Internet. Mobile software agents represent a valid alternative for implementing negotiation strategies. In this paper, a QoS negotiation and renegotiation system architecture based on mobile agents is proposed. The agents perform the tasks throughout the whole process. Therefore, such a system can reduce network load, overcome latency, and avoid frequent information exchange between clients and servers. The simulation results show that the proposed system can improve network resource utilization by about 10%.

  8. Histological image classification using biologically interpretable shape-based features

    International Nuclear Information System (INIS)

    Kothari, Sonal; Phan, John H; Young, Andrew N; Wang, May D

    2013-01-01

    Automatic cancer diagnostic systems based on histological image classification are important for improving therapeutic decisions. Previous studies propose textural and morphological features for such systems. These features capture patterns in histological images that are useful for both cancer grading and subtyping. However, because many of these features lack a clear biological interpretation, pathologists may be reluctant to adopt these features for clinical diagnosis. We examine the utility of biologically interpretable shape-based features for classification of histological renal tumor images. Using Fourier shape descriptors, we extract shape-based features that capture the distribution of stain-enhanced cellular and tissue structures in each image and evaluate these features using a multi-class prediction model. We compare the predictive performance of the shape-based diagnostic model to that of traditional models, i.e., using textural, morphological and topological features. The shape-based model, with an average accuracy of 77%, outperforms or complements traditional models. We identify the most informative shapes for each renal tumor subtype from the top-selected features. Results suggest that these shapes are not only accurate diagnostic features, but also correlate with known biological characteristics of renal tumors. Shape-based analysis of histological renal tumor images accurately classifies disease subtypes and reveals biologically insightful discriminatory features. This method for shape-based analysis can be extended to other histological datasets to aid pathologists in diagnostic and therapeutic decisions.
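
    The Fourier shape descriptor technique the model builds on can be sketched in a few lines (the stain enhancement, feature selection, and multi-class prediction steps are omitted). The toy contours below are assumptions for illustration, not histological data.

    ```python
    # Sketch of translation-, scale- and rotation-invariant Fourier shape
    # descriptors for a closed contour given as an (N, 2) array of points.
    import numpy as np

    def fourier_descriptors(contour_xy, n_coeffs=6):
        z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # points as complex numbers
        F = np.fft.fft(z)
        F[0] = 0.0                  # drop DC term  -> translation invariance
        mags = np.abs(F)            # drop phase    -> rotation/start-point invariance
        mags = mags / mags[1]       # normalize     -> scale invariance
        # Keep a few low-frequency coefficients from both ends of the spectrum.
        return np.concatenate([mags[2:2 + n_coeffs], mags[-n_coeffs:]])

    theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
    circle = np.c_[np.cos(theta), np.sin(theta)]
    big_shifted = 3.0 * circle + [10, -4]              # same shape, scaled and moved
    ellipse = np.c_[2 * np.cos(theta), np.sin(theta)]  # genuinely different shape

    d1 = fourier_descriptors(circle)
    d2 = fourier_descriptors(big_shifted)
    d3 = fourier_descriptors(ellipse)
    print("circle vs transformed circle match:", np.allclose(d1, d2))
    print("circle vs ellipse differ:", not np.allclose(d1, d3))
    ```

    Descriptor vectors like these can then feed any multi-class model; the paper's contribution is their biological interpretability, not the descriptor machinery itself.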

  9. A Multiagent-based Intrusion Detection System with the Support of Multi-Class Supervised Classification

    Science.gov (United States)

    Shyu, Mei-Ling; Sainani, Varsha

    The increasing number of network security related incidents has made it necessary for organizations to actively protect their sensitive data with network intrusion detection systems (IDSs). IDSs are expected to analyze a large volume of data while not placing a significantly added load on the monitoring systems and networks. This requires good data mining strategies which take less time and give accurate results. In this study, a novel data mining assisted multiagent-based intrusion detection system (DMAS-IDS) is proposed, particularly with the support of multiclass supervised classification. These agents can detect and take predefined actions against malicious activities, and data mining techniques can help detect them. Our proposed DMAS-IDS shows superior performance compared to central sniffing IDS techniques, and saves network resources compared to other distributed IDSs with mobile agents that activate too many sniffers, causing bottlenecks in the network. This is one of the major motivations to use a distributed model based on a multiagent platform along with a supervised classification technique.

  10. Next frontier in agent-based complex automated negotiation

    CERN Document Server

    Ito, Takayuki; Zhang, Minjie; Robu, Valentin

    2015-01-01

    This book focuses on automated negotiations based on multi-agent systems. It is intended for researchers and students in various fields involving autonomous agents and multi-agent systems, such as e-commerce tools, decision-making and negotiation support systems, and collaboration tools. The contents will help them to understand the concept of automated negotiations, negotiation protocols, negotiating agents’ strategies, and the applications of those strategies. In this book, some negotiation protocols focusing on the multiple interdependent issues in negotiations are presented, making it possible to find high-quality solutions for the complex agents’ utility functions. This book is a compilation of the extended versions of the very best papers selected from the many that were presented at the International Workshop on Agent-Based Complex Automated Negotiations.

  11. Towards a framework for agent-based image analysis of remote-sensing data.

    Science.gov (United States)

    Hofmann, Peter; Lettmayer, Paul; Blaschke, Thomas; Belgiu, Mariana; Wegenkittl, Stefan; Graf, Roland; Lampoltshammer, Thomas Josef; Andrejchenko, Vera

    2015-04-03

    Object-based image analysis (OBIA) as a paradigm for analysing remotely sensed image data has in many cases led to spatially and thematically improved classification results in comparison to pixel-based approaches. Nevertheless, robust and transferable object-based solutions for automated image analysis capable of analysing sets of images or even large image archives without any human interaction are still rare. A major reason for this lack of robustness and transferability is the high complexity of image contents: Especially in very high resolution (VHR) remote-sensing data with varying imaging conditions or sensor characteristics, the variability of the objects' properties in these varying images is hardly predictable. The work described in this article builds on so-called rule sets. While earlier work has demonstrated that OBIA rule sets bear a high potential of transferability, they need to be adapted manually, or classification results need to be adjusted manually in a post-processing step. In order to automate these adaptation and adjustment procedures, we investigate the coupling, extension and integration of OBIA with the agent-based paradigm, which is exhaustively investigated in software engineering. The aims of such integration are (a) autonomously adapting rule sets and (b) image objects that can adopt and adjust themselves according to different imaging conditions and sensor characteristics. This article focuses on self-adapting image objects and therefore introduces a framework for agent-based image analysis (ABIA).

  12. Design and implementation based on the classification protection vulnerability scanning system

    International Nuclear Information System (INIS)

    Wang Chao; Lu Zhigang; Liu Baoxu

    2010-01-01

    With the application and spread of classification protection, network security vulnerability scanning should consider both efficiency and function expansion. This paper proposes a vulnerability scanning system designed from the perspective of classification protection, and elaborates the design and implementation of the system based on vulnerability-classification plug-in technology and oriented to classification protection. Experiments show that the system adapts and scales well under classification protection, and also confirm the efficiency of scanning. (authors)

  13. Soil classification basing on the spectral characteristics of topsoil samples

    Science.gov (United States)

    Liu, Huanjun; Zhang, Xiaokang; Zhang, Xinle

    2016-04-01

    Soil taxonomy plays an important role in soil utilization and management, but China has only a coarse soil map created from 1980s data. New technology, e.g. spectroscopy, could simplify soil classification. This study tries to classify soils based on the spectral characteristics of topsoil samples. 148 topsoil samples of typical soils, including Black soil, Chernozem, Blown soil and Meadow soil, were collected from the Songnen plain, Northeast China, and their laboratory spectral reflectance in the visible and near-infrared region (400-2500 nm) was processed with weighted moving average, resampling, and continuum removal. Spectral indices were extracted from the soil spectral characteristics, including the second absorption position of the spectral curve, the area of the first absorption valley, and the slope of the spectral curve at 500-600 nm and 1340-1360 nm. Then K-means clustering and a decision tree were used respectively to build soil classification models. The results indicated that 1) the second absorption positions of Black soil and Chernozem were located at 610 nm and 650 nm respectively; 2) the spectral curve of Meadow soil is similar to that of its adjacent soils, which could be due to soil erosion; 3) the decision tree model showed higher classification accuracy, with accuracies for Black soil, Chernozem, Blown soil and Meadow soil of 100%, 88%, 97% and 50% respectively, and the accuracy for Blown soil could be increased to 100% by adding one more spectral index (the area of the first two valleys) to the model, which shows that the model could be used for soil classification and soil mapping in the near future.
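
    The clustering step can be illustrated with a tiny K-means sketch. The index values below are fabricated, using only the abstract's reported absorption positions (about 610 nm for Black soil, 650 nm for Chernozem); the second slope-like feature is a made-up placeholder.

    ```python
    # K-means on two spectral indices, recovering the two soil groups.
    import numpy as np

    rng = np.random.default_rng(42)
    # Columns: [second absorption position (nm), slope index (fabricated)].
    black = np.c_[rng.normal(610, 3, 40), rng.normal(0.10, 0.01, 40)]
    chern = np.c_[rng.normal(650, 3, 40), rng.normal(0.18, 0.01, 40)]
    X = np.vstack([black, chern])

    def kmeans(X, k, iters=50):
        # Initialize centers at random samples, then alternate assign/update.
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            labels = ((X[:, None, :] - centers) ** 2).sum(-1).argmin(1)
            centers = np.array([X[labels == j].mean(0) if (labels == j).any()
                                else centers[j] for j in range(k)])
        return labels, centers

    labels, centers = kmeans(X, k=2)
    print("recovered absorption positions:", np.sort(centers[:, 0]).round(1))
    ```

    The recovered cluster centers sit near 610 nm and 650 nm, mirroring the separability the abstract reports for Black soil and Chernozem.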

  14. Dissemination of Cultural Norms and Values: Agent-Based Modeling

    Directory of Open Access Journals (Sweden)

    Denis Andreevich Degterev

    2016-12-01

    This article shows how agent-based modeling allows us to explore the mechanisms of the dissemination of cultural norms and values, both within one country and in the whole world. In recent years, this type of simulation has become particularly prevalent in the analysis of international relations, growing more popular than system dynamics and discrete event simulation. The use of agent-based modeling in the analysis of international relations is connected with the agent-structure problem in international relations: structure and agents act as interdependent and dynamically changing entities in the process of interaction. Agent-structure interaction can be modeled by means of the theory of complex adaptive systems with the use of agent-based modeling techniques. One of the first examples of the use of agent-based modeling in political science is T. Schelling's model of racial segregation. On the basis of this model, the author shows how changes in behavioral patterns at the micro level impact the macro level. Patterns change due to the dynamics of cultural norms and values, shaped by mass media and other social institutions. The author describes the main areas of modern application of agent-based modeling in international studies, including the analysis of ethnic conflicts and the formation of international coalitions. Particular attention is paid to Robert Axelrod's approach, based on the use of genetic algorithms, to the spread of cultural norms and values. Agent-based modeling shows how to create conditions under which norms that are originally not shared by a significant part of the population eventually spread everywhere. The practical application of these algorithms is shown by the author on the example of the situation in Ukraine in 2015-2016. The article also reveals the mechanisms of the international spread of cultural norms and values. The main think-tanks using agent-based modeling in international studies are...
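
    A compact version of the Schelling segregation model mentioned above shows the micro-to-macro effect: even a mild preference for like neighbors produces strongly segregated neighborhoods. Grid size, vacancy rate, and the 30% tolerance threshold are illustrative choices, not values from the article.

    ```python
    # Schelling segregation sketch on a toroidal grid: unhappy agents move
    # to random empty cells until the like-neighbor share rises system-wide.
    import random

    random.seed(0)
    SIZE, EMPTY, THRESHOLD = 20, 0.1, 0.3   # 20x20 grid, 10% vacant, want >=30% alike

    cells = ["A", "B"] * int(SIZE * SIZE * (1 - EMPTY) / 2)
    cells += [None] * (SIZE * SIZE - len(cells))
    random.shuffle(cells)
    grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

    def neighbors(r, c):
        return [grid[(r + dr) % SIZE][(c + dc) % SIZE]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]

    def unhappy(r, c):
        occ = [n for n in neighbors(r, c) if n is not None]
        return bool(occ) and sum(n == grid[r][c] for n in occ) / len(occ) < THRESHOLD

    def mean_similarity():
        vals = []
        for r in range(SIZE):
            for c in range(SIZE):
                if grid[r][c] is not None:
                    occ = [n for n in neighbors(r, c) if n is not None]
                    if occ:
                        vals.append(sum(n == grid[r][c] for n in occ) / len(occ))
        return sum(vals) / len(vals)

    before = mean_similarity()
    for _ in range(20000):                  # move one unhappy agent at a time
        r, c = random.randrange(SIZE), random.randrange(SIZE)
        if grid[r][c] is not None and unhappy(r, c):
            er, ec = random.choice([(i, j) for i in range(SIZE)
                                    for j in range(SIZE) if grid[i][j] is None])
            grid[er][ec], grid[r][c] = grid[r][c], None
    after = mean_similarity()
    print(f"mean like-neighbor share: {before:.2f} -> {after:.2f}")
    ```

    Starting from a random mix (share near 0.5), the population self-organizes into clusters whose like-neighbor share far exceeds the 30% each agent individually demands.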

  15. Invariance and universality in social agent-based simulations

    Science.gov (United States)

    Cioffi-Revilla, Claudio

    2002-01-01

    Agent-based simulation models have a promising future in the social sciences, from political science to anthropology, economics, and sociology. To realize their full scientific potential, however, these models must address a set of key problems, such as the number of interacting agents and their geometry, network topology, time calibration, phenomenological calibration, structural stability, power laws, and other substantive and methodological issues. This paper discusses and highlights these problems and outlines some solutions. PMID:12011412

  16. An Agent Based Collaborative Simplification of 3D Mesh Model

    Science.gov (United States)

    Wang, Li-Rong; Yu, Bo; Hagiwara, Ichiro

    Large-volume mesh models face challenges in fast rendering and transmission over the Internet. The mesh models currently obtained using three-dimensional (3D) scanning technology are usually very large in data volume. This paper develops a mobile-agent-based collaborative environment on the Mobile-C development platform. Communication among distributed agents includes grabbing images of the visualized mesh model, annotating grabbed images, and instant messaging. Remote collaborative simplification can thus be conducted efficiently over the Internet.

  17. Contract Monitoring in Agent-Based Systems: Case Study

    Science.gov (United States)

    Hodík, Jiří; Vokřínek, Jiří; Jakob, Michal

    Monitoring of fulfilment of obligations defined by electronic contracts in distributed domains is presented in this paper. A two-level model of contract-based systems and the types of observations needed for contract monitoring are introduced. The observations (inter-agent communication and agents’ actions) are collected and processed by the contract observation and analysis pipeline. The presented approach has been utilized in a multi-agent system for electronic contracting in a modular certification testing domain.

  18. Agent-based modeling and simulation Part 3: desktop ABMS.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2007-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS 'is a third way of doing science,' in addition to traditional deductive and inductive reasoning (Axelrod 1997b). Computational advances have made possible a growing number of agent-based models across a variety of application domains. Applications range from modeling agent behavior in the stock market, supply chains, and consumer markets, to predicting the spread of epidemics, the threat of bio-warfare, and the factors responsible for the fall of ancient civilizations. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing agent models, and illustrates the development of a simple agent-based model of shopper behavior using spreadsheets.

  19. Agent-Based Simulations for Project Management

    Science.gov (United States)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique in use at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
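
    The first flaw is easy to make concrete with a toy CPM forward pass: every task duration is a fixed input, so nothing in the computation can react to productivity changes or management action. The task network below is invented for illustration.

    ```python
    # Toy Critical Path Method: earliest-finish forward pass over a task DAG.
    tasks = {                      # task: (fixed duration, predecessors)
        "design": (4, []),
        "code":   (6, ["design"]),
        "docs":   (2, ["design"]),
        "test":   (3, ["code", "docs"]),
    }

    finish = {}
    def earliest_finish(t):
        # Earliest finish = own duration + latest predecessor finish (memoized).
        if t not in finish:
            dur, preds = tasks[t]
            finish[t] = dur + max((earliest_finish(p) for p in preds), default=0)
        return finish[t]

    project_length = max(earliest_finish(t) for t in tasks)
    print("project length:", project_length)   # critical path: design -> code -> test
    ```

    A resource-based simulation would instead derive each duration from assigned resources and productivity, letting schedule estimates respond to management decisions.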

  20. Radiographic classification for fractures of the fifth metatarsal base

    International Nuclear Information System (INIS)

    Mehlhorn, Alexander T.; Zwingmann, Joern; Hirschmueller, Anja; Suedkamp, Norbert P.; Schmal, Hagen

    2014-01-01

    Avulsion fractures of the fifth metatarsal base (MTB5) are common forefoot injuries. Based on a radiomorphometric analysis reflecting the risk for a secondary displacement, a new classification was developed. A cohort of 95 healthy, sportive, and young patients (age ≤ 50 years) with avulsion fractures of the MTB5 was included in the study and divided into groups with non-displaced, primary-displaced, and secondary-displaced fractures. Radiomorphometric data obtained using standard oblique and dorso-plantar views were analyzed in association with secondary displacement. Based on this, a classification was developed and checked for reproducibility. Fractures with a longer distance between the lateral edge of the styloid process and the lateral fracture step-off, and fractures with a more medial joint entry of the fracture line at the MTB5, are at higher risk of secondary displacement. Based on these findings, all fractures were divided into three types: type I with a fracture entry in the lateral third; type II in the middle third; and type III in the medial third of the MTB5. Additionally, the three types were subdivided into an A-type with a fracture displacement <2 mm and a B-type with a fracture displacement ≥ 2 mm. A substantial level of interobserver agreement was found in the assignment of all 95 fractures to the six fracture types (κ = 0.72). The secondary displacement of fractures was confirmed by all examiners in 100 %. Radiomorphometric data may identify MTB5 fractures at risk for secondary displacement. Based on this, a reliable classification was developed. (orig.)

  2. Nanochemistry of protein-based delivery agents

    Science.gov (United States)

    Rajendran, Subin; Udenigwe, Chibuike; Yada, Rickey

    2016-07-01

    The past decade has seen an increased interest in the conversion of food proteins into functional biomaterials, including their use for loading and delivery of physiologically active compounds such as nutraceuticals and pharmaceuticals. Proteins possess a competitive advantage over other platforms for the development of nanodelivery systems since they are biocompatible, amphipathic, and widely available. Proteins also have unique molecular structures and diverse functional groups that can be selectively modified to alter encapsulation and release properties. A number of physical and chemical methods have been used for preparing protein nanoformulations, each based on different underlying protein chemistry. This review focuses on the chemistry of the reorganization and/or modification of proteins into functional nanostructures for delivery, from the perspective of their preparation, functionality, stability and physiological behavior.

  3. Nanochemistry of protein-based delivery agents

    Directory of Open Access Journals (Sweden)

    Subin R.C.K. Rajendran

    2016-07-01

    Full Text Available The past decade has seen an increased interest in the conversion of food proteins into functional biomaterials, including their use for loading and delivery of physiologically active compounds such as nutraceuticals and pharmaceuticals. Proteins possess a competitive advantage over other platforms for the development of nanodelivery systems since they are biocompatible, amphipathic, and widely available. Proteins also have unique molecular structures and diverse functional groups that can be selectively modified to alter encapsulation and release properties. A number of physical and chemical methods have been used for preparing protein nanoformulations, each based on different underlying protein chemistry. This review focuses on the chemistry of the reorganization and/or modification of proteins into functional nanostructures for delivery, from the perspective of their preparation, functionality, stability and physiological behavior.

  4. Risk Classification and Risk-based Safety and Mission Assurance

    Science.gov (United States)

    Leitner, Jesse A.

    2014-01-01

    Recent activities to revamp and emphasize the need to streamline processes and activities for Class D missions across the agency have led to various interpretations of Class D, including the lumping of a variety of low-cost projects into Class D; sometimes terms such as "Class D minus" are used. In this presentation, mission risk classifications will be traced to official requirements and definitions as a measure to ensure that projects and programs align with the guidance and requirements that are commensurate with their defined risk posture. As part of this, the full suite of risk classifications, formal and informal, will be defined, followed by an introduction to the new GPR 8705.4 that is currently under review. GPR 8705.4 lays out guidance for the mission success activities performed at Classes A-D for NPR 7120.5 projects as well as for projects not under NPR 7120.5. Furthermore, the trends in stepping from Class A into higher risk posture classifications will be discussed. The talk will conclude with a discussion about risk-based safety and mission assurance at GSFC.

  5. Overfitting Reduction of Text Classification Based on AdaBELM

    Directory of Open Access Journals (Sweden)

    Xiaoyue Feng

    2017-07-01

    Full Text Available Overfitting is an important problem in machine learning. Several algorithms, such as the extreme learning machine (ELM), suffer from this issue when facing high-dimensional sparse data, e.g., in text classification. One common issue is that the extent of overfitting is not well quantified. In this paper, we propose a quantitative measure of overfitting referred to as the rate of overfitting (RO) and a novel model, named AdaBELM, to reduce overfitting. With RO, the overfitting problem can be quantitatively measured and identified. The newly proposed model can achieve high performance on multi-class text classification. To evaluate the generalizability of the new model, we designed experiments based on three datasets, i.e., the 20 Newsgroups, Reuters-21578, and BioMed corpora, which represent balanced, unbalanced, and real application data, respectively. Experimental results demonstrate that AdaBELM can reduce overfitting and outperform classical ELM, decision tree, random forests, and AdaBoost on all three text-classification datasets; for example, it can achieve 62.2% higher accuracy than ELM. Therefore, the proposed model has good generalizability.
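The abstract does not give the paper's exact formula for the rate of overfitting (RO). A minimal sketch of one common way to quantify overfitting, as a normalized gap between training and test accuracy, might look like this (the function name and formula are illustrative assumptions, not the paper's definition):

```python
def overfitting_rate(train_acc, test_acc):
    """Hypothetical overfitting measure: relative train/test accuracy gap.

    0.0 means no gap (no measurable overfitting); values near 1.0 mean
    almost none of the training performance transfers to test data.
    """
    return (train_acc - test_acc) / train_acc

# e.g. 95% training accuracy but only 76% test accuracy -> RO = 0.2
ro = overfitting_rate(0.95, 0.76)
```

Any measure of this shape lets different models be compared on how much they overfit, independently of their absolute accuracy.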

  6. Image Classification Based on Convolutional Denoising Sparse Autoencoder

    Directory of Open Access Journals (Sweden)

    Shuangshuang Chen

    2017-01-01

    Full Text Available Image classification aims to group images into corresponding semantic categories. Due to the difficulties of interclass similarity and intraclass variability, it is a challenging issue in computer vision. In this paper, an unsupervised feature learning approach called convolutional denoising sparse autoencoder (CDSAE) is proposed based on the theory of the visual attention mechanism and deep learning methods. First, a saliency detection method is utilized to get training samples for unsupervised feature learning. Next, these samples are sent to the denoising sparse autoencoder (DSAE), followed by a convolutional layer and a local contrast normalization layer. Generally, prior knowledge about a specific task is helpful for solving it. Therefore, a new pooling strategy, spatial pyramid pooling (SPP) fused with a center-bias prior, is introduced into our approach. Experimental results on two common image datasets (STL-10 and CIFAR-10) demonstrate that our approach is effective in image classification. They also demonstrate that none of the three components (local contrast normalization, SPP fused with center-bias prior, and l2 vector normalization) can be excluded from our proposed approach; they jointly improve image representation and classification performance.

  7. Tongue Images Classification Based on Constrained High Dispersal Network

    Directory of Open Access Journals (Sweden)

    Dan Meng

    2017-01-01

    Full Text Available Computer-aided tongue diagnosis has great potential to play an important role in traditional Chinese medicine (TCM). However, the majority of existing tongue image analysis and classification methods are based on low-level features, which may not provide a holistic view of the tongue. Inspired by deep convolutional neural networks (CNNs), we propose a novel feature extraction framework called constrained high dispersal neural networks (CHDNet) to extract unbiased features and reduce human labor for tongue diagnosis in TCM. Previous CNN models have mostly focused on learning convolutional filters and adapting weights between them, but these models have two major issues: redundancy and insufficient capability in handling unbalanced sample distributions. We introduce high dispersal and local response normalization to address the issue of redundancy. We also add multiscale feature analysis to avoid sensitivity to deformation. Our proposed CHDNet learns high-level features and provides more classification information during training, which may result in higher accuracy when predicting testing samples. We tested the proposed method on a set of 267 gastritis patients and a control group of 48 healthy volunteers. Test results show that CHDNet is a promising method for tongue image classification in the TCM study.

  8. Evaluating Water Demand Using Agent-Based Modeling

    Science.gov (United States)

    Lowry, T. S.

    2004-12-01

    The supply and demand of water resources are functions of complex, inter-related systems including hydrology, climate, demographics, economics, and policy. To assess the safety and sustainability of water resources, planners often rely on complex numerical models that relate some or all of these systems using mathematical abstractions. The accuracy of these models relies on how well the abstractions capture the true nature of the systems interactions. Typically, these abstractions are based on analyses of observations and/or experiments that account only for the statistical mean behavior of each system. This limits the approach in two important ways: 1) It cannot capture cross-system disruptive events, such as major drought, significant policy change, or terrorist attack, and 2) it cannot resolve sub-system level responses. To overcome these limitations, we are developing an agent-based water resources model that includes the systems of hydrology, climate, demographics, economics, and policy, to examine water demand during normal and extraordinary conditions. Agent-based modeling (ABM) develops functional relationships between systems by modeling the interaction between individuals (agents), who behave according to a probabilistic set of rules. ABM is a "bottom-up" modeling approach in that it defines macro-system behavior by modeling the micro-behavior of individual agents. While each agent's behavior is often simple and predictable, the aggregate behavior of all agents in each system can be complex, unpredictable, and different than behaviors observed in mean-behavior models. Furthermore, the ABM approach creates a virtual laboratory where the effects of policy changes and/or extraordinary events can be simulated. Our model, which is based on the demographics and hydrology of the Middle Rio Grande Basin in the state of New Mexico, includes agent groups of residential, agricultural, and industrial users. Each agent within each group determines its water usage
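The bottom-up mechanism described above can be illustrated with a minimal sketch (not the authors' Middle Rio Grande model; the agent groups, decision rule, and parameters here are invented for illustration). Each agent applies a simple probabilistic usage rule, and aggregate demand emerges from the individual decisions:

```python
import random

class WaterUser:
    """One agent: a baseline demand plus a simple probabilistic rule."""
    def __init__(self, group, base_demand):
        self.group = group  # e.g. 'residential', 'agricultural', 'industrial'
        self.base = base_demand

    def demand(self, price, drought):
        # The chance of cutting usage rises with price and drought severity.
        cut_prob = min(1.0, 0.1 * price + 0.3 * drought)
        factor = 0.6 if random.random() < cut_prob else 1.0
        return self.base * factor

def total_demand(agents, price, drought):
    # Macro-level demand is simply the aggregate of micro-level decisions.
    return sum(a.demand(price, drought) for a in agents)

random.seed(1)
agents = [WaterUser('residential', 1.0) for _ in range(500)]
normal = total_demand(agents, price=1.0, drought=0.0)
crisis = total_demand(agents, price=2.0, drought=1.0)
```

Even with identical, trivially simple agents, the aggregate response to a simulated price shock or drought differs from what a single mean-behavior equation would give, which is the point of the ABM "virtual laboratory".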

  9. An Approach for Leukemia Classification Based on Cooperative Game Theory

    Directory of Open Access Journals (Sweden)

    Atefeh Torkaman

    2011-01-01

    Full Text Available Hematological malignancies are the types of cancer that affect blood, bone marrow, and lymph nodes. As these tissues are naturally connected through the immune system, a disease affecting one of them will often affect the others as well. The hematological malignancies include leukemia, lymphoma, and multiple myeloma. Among them, leukemia is a serious malignancy that starts in blood-forming tissues, especially the bone marrow, where blood is made. Research shows that leukemia is one of the most common cancers in the world, so an emphasis on diagnostic techniques and the best treatments would provide better prognosis and survival for patients. In this paper, an automatic diagnosis recommender system for classifying leukemia based on cooperative game theory is presented. Throughout this research, we analyze flow cytometry data toward the classification of leukemia into eight classes. We work on a real data set of different types of leukemia collected at the Iran Blood Transfusion Organization (IBTO). The data set contains 400 samples taken from human leukemic bone marrow. This study uses a cooperative game for classification according to different weights assigned to the markers. The proposed method is versatile, as there are no constraints on what the input or output represent; it can be used to classify a population according to their contributions, and it applies equally to other groups of data. The experimental results show an accuracy rate of 93.12% for classification, compared with 90.16% for a decision tree (C4.5). The result demonstrates that the cooperative game is very promising for direct use in leukemia classification, as part of an active medical decision support system for the interpretation of flow cytometry readouts. This system could assist clinical hematologists to properly recognize different kinds of leukemia by providing suggestions, and this could improve the treatment
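The abstract says weights are assigned to markers via a cooperative game but does not spell out the solution concept. The standard way to credit players (here, markers) for their contribution to a coalition's value is the Shapley value, sketched below on a toy game; the game and marker names are illustrative, not the paper's data:

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values: average each player's marginal contribution
    over all orderings in which the coalition can be assembled."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            phi[p] += value(frozenset(coalition)) - before
    for p in players:
        phi[p] /= len(perms)
    return phi

# Toy game: a coalition of markers has value 1 iff it contains marker 'a',
# i.e. marker 'a' carries all of the discriminative power.
v = lambda coalition: 1.0 if 'a' in coalition else 0.0
phi = shapley_values(['a', 'b', 'c'], v)
```

The resulting weights (all credit to 'a', none to 'b' or 'c') could then rank markers by importance for the classifier; the exact enumeration above is exponential in the number of players, so real marker sets need sampling approximations.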

  10. An approach for leukemia classification based on cooperative game theory.

    Science.gov (United States)

    Torkaman, Atefeh; Charkari, Nasrollah Moghaddam; Aghaeipour, Mahnaz

    2011-01-01

    Hematological malignancies are the types of cancer that affect blood, bone marrow, and lymph nodes. As these tissues are naturally connected through the immune system, a disease affecting one of them will often affect the others as well. The hematological malignancies include leukemia, lymphoma, and multiple myeloma. Among them, leukemia is a serious malignancy that starts in blood-forming tissues, especially the bone marrow, where blood is made. Research shows that leukemia is one of the most common cancers in the world, so an emphasis on diagnostic techniques and the best treatments would provide better prognosis and survival for patients. In this paper, an automatic diagnosis recommender system for classifying leukemia based on cooperative game theory is presented. Throughout this research, we analyze flow cytometry data toward the classification of leukemia into eight classes. We work on a real data set of different types of leukemia collected at the Iran Blood Transfusion Organization (IBTO). The data set contains 400 samples taken from human leukemic bone marrow. This study uses a cooperative game for classification according to different weights assigned to the markers. The proposed method is versatile, as there are no constraints on what the input or output represent; it can be used to classify a population according to their contributions, and it applies equally to other groups of data. The experimental results show an accuracy rate of 93.12% for classification, compared with 90.16% for a decision tree (C4.5). The result demonstrates that the cooperative game is very promising for direct use in leukemia classification, as part of an active medical decision support system for the interpretation of flow cytometry readouts. This system could assist clinical hematologists to properly recognize different kinds of leukemia by providing suggestions, and this could improve the treatment of leukemic

  11. Moral Guilt : An Agent-Based Model Analysis

    OpenAIRE

    Gaudou , Benoit; Lorini , Emiliano; Mayor , Eunate

    2013-01-01

    International audience; In this article we analyze the influence of a concrete moral emotion (i.e., moral guilt) on strategic decision making. We present a normal-form Prisoner's Dilemma with a moral component. We assume that agents evaluate the game's outcomes with respect to their ideality degree (i.e., how much a given outcome conforms to the player's moral values), based on two proposed notions of ethical preferences: Harsanyi's and Rawls'. Based on such a game, we construct an agent-based m...

  12. Methods for Model-Based Reasoning within Agent-Based Ambient Intelligence Applications

    NARCIS (Netherlands)

    Bosse, T.; Both, F.; Gerritsen, C.; Hoogendoorn, M.; Treur, J.

    2012-01-01

    Within agent-based Ambient Intelligence applications agents react to humans based on information obtained by sensoring and their knowledge about human functioning. Appropriate types of reactions depend on the extent to which an agent understands the human and is able to interpret the available

  13. Agent-based simulation of a financial market

    Science.gov (United States)

    Raberto, Marco; Cincotti, Silvano; Focardi, Sergio M.; Marchesi, Michele

    2001-10-01

    This paper introduces an agent-based artificial financial market in which heterogeneous agents trade one single asset through a realistic trading mechanism for price formation. Agents are initially endowed with a finite amount of cash and a given finite portfolio of assets. There is no money-creation process; the total available cash is conserved in time. In each period, agents make random buy and sell decisions that are constrained by available resources, subject to clustering, and dependent on the volatility of previous periods. The model proposed herein is able to reproduce the leptokurtic shape of the probability density of log price returns and the clustering of volatility. Implemented using extreme programming and object-oriented technology, the simulator is a flexible computational experimental facility that can find applications in both academic and industrial research projects.
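The trading mechanism described can be sketched roughly as follows (a simplified illustration, not the authors' simulator: the decision rule, order matching, and price-impact constant are assumptions). Only matched buy/sell orders execute, so cash just moves between agents and the total is conserved, while the price reacts to excess demand:

```python
import random

def simulate(n_agents=100, steps=300, seed=7):
    rng = random.Random(seed)
    cash = [1000.0] * n_agents   # finite initial cash endowment
    asset = [10] * n_agents      # finite initial portfolio
    price = 100.0
    prices = [price]
    for _ in range(steps):
        buyers = [i for i in range(n_agents)
                  if rng.random() < 0.5 and cash[i] >= price]
        buyer_set = set(buyers)
        sellers = [i for i in range(n_agents)
                   if i not in buyer_set and asset[i] > 0]
        # Only matched orders execute: total cash is conserved.
        for b, s in zip(buyers, sellers):
            cash[b] -= price; asset[b] += 1
            cash[s] += price; asset[s] -= 1
        # Price impact of excess demand (constant 0.001 is arbitrary).
        price *= 1.0 + 0.001 * (len(buyers) - len(sellers))
        prices.append(price)
    return cash, asset, prices

cash, asset, prices = simulate()
```

The paper's model adds clustering of decisions and volatility feedback to a skeleton like this; those ingredients are what produce the fat-tailed return distribution and volatility clustering the abstract reports.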

  14. A role based coordination model in agent systems

    Institute of Scientific and Technical Information of China (English)

    ZHANG Ya-ying; YOU Jin-yuan

    2005-01-01

    Coordination technology addresses the construction of open, flexible systems from active and independent software agents in concurrent and distributed systems. In most open distributed applications, multiple agents need interaction and communication to achieve their overall goal. Coordination technologies for the Internet are typically concerned with enabling interaction among agents and helping them cooperate with each other. At the same time, access control should also be considered to constrain interaction and make it harmless. Access control should be regarded as the security counterpart of coordination. At present, the combination of coordination and access control remains an open problem. Thus, we propose a role-based coordination model with policy enforcement in agent application systems. In this model, coordination is combined with access control so as to fully characterize the interactions in agent systems. A set of agents interacting with each other for a common global system task constitutes a coordination group. Role-based access control is applied in this model to prevent unauthorized accesses. Coordination policy is enforced in a distributed manner so that the model can be applied to open distributed systems such as the Internet. An Internet online auction system is presented as a case study to illustrate the proposed coordination model, and finally a performance analysis of the model is introduced.
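The role-based access control side of such a coordination model can be sketched minimally; the roles, resources, and permissions below are hypothetical, loosely inspired by the paper's online-auction case study:

```python
# Role-based access check for agent interactions: an agent may perform an
# action on a resource only if one of its roles grants that permission.
ROLE_PERMS = {
    'bidder':     {('auction', 'place_bid'), ('auction', 'read')},
    'auctioneer': {('auction', 'read'), ('auction', 'close')},
}

def authorized(agent_roles, resource, action):
    """True if any of the agent's roles grants (resource, action)."""
    return any((resource, action) in ROLE_PERMS.get(r, set())
               for r in agent_roles)

can_bid = authorized(['bidder'], 'auction', 'place_bid')
can_close = authorized(['bidder'], 'auction', 'close')
```

In a coordination group, a check like this would gate every interaction primitive before it takes effect, making access control the "security counterpart" of the coordination layer.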

  15. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    ... technique involves model structure, system representation, and the degree of validity, coupled with the simplicity of the overall model. ABM is best suited ... system representation of the air combat system. We feel that a simulation model that combines ABM with equation-based representation of weapons and ... (Agent-Based Modeling Methodology for Analyzing Weapons Systems, thesis, Casey D. Connors, Major, USA)

  16. MATT: Multi Agents Testing Tool Based Nets within Nets

    Directory of Open Access Journals (Sweden)

    Sara Kerraoui

    2016-12-01

    As part of this effort, we propose a model-based testing approach for multi-agent systems built on reference nets, and develop a tool that aims to provide a uniform and automated approach. The feasibility and advantages of the proposed approach are shown through a short case study.

  17. Classification of Noisy Data: An Approach Based on Genetic Algorithms and Voronoi Tessellation

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Knudsen, Torben

    Classification is one of the major constituents of the data-mining toolkit. The well-known methods for classification are built on either the principle of logic or statistical/mathematical reasoning. In this article we propose: (1) a different strategy, which is based on the po...

  18. A technology path to tactical agent-based modeling

    Science.gov (United States)

    James, Alex; Hanratty, Timothy P.

    2017-05-01

    Wargaming is a process of thinking through and visualizing events that could occur during a possible course of action. Over the past 200 years, wargaming has matured into a set of formalized processes. One area of growing interest is the application of agent-based modeling. Agent-based modeling and its supporting technologies have the potential to introduce a third-generation wargaming capability to the Army, creating a positive overmatch decision-making capability. In its simplest form, agent-based modeling is a computational technique that helps the modeler understand and simulate how the "whole of a system" responds to change over time. It provides a decentralized method of looking at situations where individual agents are instantiated within an environment, interact with each other, and are empowered to make their own decisions. However, this technology is not without its own risks and limitations. This paper explores a technology roadmap, identifying research topics that could realize agent-based modeling within a tactical wargaming context.

  19. Agent-based simulation of electricity markets : a literature review

    International Nuclear Information System (INIS)

    Sensfuss, F.; Genoese, M.; Genoese, M.; Most, D.

    2007-01-01

    The electricity sector in Europe and North America is undergoing considerable changes as a result of deregulation, issues related to climate change, and the integration of renewable resources within the electricity grid. This article reviewed agent-based simulation methods of analyzing electricity markets. The paper provided an analysis of research currently being conducted on electricity market designs and examined methods of modelling agent decisions. Methods of coupling long term and short term decisions were also reviewed. Issues related to single and multiple market analysis methods were discussed, as well as different approaches to integrating agent-based models with models of other commodities. The integration of transmission constraints within agent-based models was also discussed, and methods of measuring market efficiency were evaluated. Other topics examined in the paper included approaches to integrating investment decisions, carbon dioxide (CO2) trading, and renewable support schemes. It was concluded that agent-based models serve as a test bed for the electricity sector, and will help to provide insights for future policy decisions. 74 refs., 6 figs

  20. Contaminant classification using cosine distances based on multiple conventional sensors.

    Science.gov (United States)

    Liu, Shuming; Che, Han; Smith, Kate; Chang, Tian

    2015-02-01

    Emergent contamination events have a significant impact on water systems. After contamination detection, it is important to classify the type of contaminant quickly to provide support for remediation attempts. Conventional methods generally rely either on laboratory-based analysis, which requires a long analysis time, or on multivariable-based geometry analysis and sequence analysis, which are prone to being affected by the contaminant concentration. This paper proposes a new contaminant classification method, which discriminates contaminants in real time, independently of the contaminant concentration. The proposed method quantifies the similarities or dissimilarities between sensors' responses to different types of contaminants. The performance of the proposed method was evaluated using data from contaminant injection experiments in a laboratory and compared with a Euclidean distance-based method. The robustness of the proposed method was evaluated using an uncertainty analysis. The results show that the proposed method performed better in identifying the type of contaminant than the Euclidean distance-based method, and that it could classify the type of contaminant in minutes without significantly compromising the correct classification rate (CCR).
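The property that makes cosine distance attractive here is scale invariance: multiplying a sensor-response vector by a constant (a higher concentration) leaves its angle to a library pattern unchanged, whereas Euclidean distance grows with the scaling. A minimal sketch (the library patterns are invented for illustration):

```python
import math

def cosine_distance(u, v):
    """1 - cosine similarity; 0 means the vectors point the same way."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / norm

# Reference response patterns of the conventional sensors, one per contaminant.
library = {
    'contaminant_A': [1.0, 0.2, 0.5, 0.1],
    'contaminant_B': [0.1, 0.9, 0.3, 0.8],
}

def classify(response):
    # Nearest library pattern by cosine distance.
    return min(library, key=lambda c: cosine_distance(response, library[c]))

# A scaled version of pattern A (i.e. a different concentration of the
# same contaminant) still classifies as A.
label = classify([2.0, 0.4, 1.0, 0.2])
```

This is why the abstract can claim classification "independent of the contaminant concentration": concentration mainly rescales the multi-sensor response, and the angle ignores the rescaling.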

  1. Application of Bayesian Classification to Content-Based Data Management

    Science.gov (United States)

    Lynnes, Christopher; Berrick, S.; Gopalan, A.; Hua, X.; Shen, S.; Smith, P.; Yang, K-Y.; Wheeler, K.; Curry, C.

    2004-01-01

    The high volume of Earth Observing System data has proven to be challenging to manage for data centers and users alike. At the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC), about 1 TB of new data are archived each day. Distribution to users is also about 1 TB/day. A substantial portion of this distribution is MODIS calibrated radiance data, which has a wide variety of uses. However, much of the data is not useful for a particular user's needs: for example, ocean color users typically need oceanic pixels that are free of cloud and sun-glint. The GES DAAC is using a simple Bayesian classification scheme to rapidly classify each pixel in the scene in order to support several experimental content-based data services for near-real-time MODIS calibrated radiance products (from Direct Readout stations). Content-based subsetting would allow distribution of, say, only clear pixels to the user if desired. Content-based subscriptions would distribute data to users only when they fit the user's usability criteria in their area of interest within the scene. Content-based cache management would retain more useful data on disk for easy online access. The classification may even be exploited in an automated quality assessment of the geolocation product. Though initially to be demonstrated at the GES DAAC, these techniques have applicability in other resource-limited environments, such as spaceborne data systems.
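A per-pixel Bayesian classifier of the kind described can be sketched as follows (single radiance band, Gaussian class-conditional likelihoods; the class parameters and priors are illustrative assumptions, not the GES DAAC's actual model):

```python
import math

# Hypothetical per-class Gaussian parameters (mean, std) for one band.
CLASS_PARAMS = {
    'clear_ocean': (0.05, 0.02),
    'cloud':       (0.60, 0.15),
}
PRIORS = {'clear_ocean': 0.7, 'cloud': 0.3}

def log_gaussian(x, mu, sigma):
    """Log of the Gaussian density N(x; mu, sigma)."""
    return (-0.5 * ((x - mu) / sigma) ** 2
            - math.log(sigma * math.sqrt(2 * math.pi)))

def classify_pixel(radiance):
    # Bayes rule: pick the class maximizing log prior + log likelihood.
    return max(CLASS_PARAMS,
               key=lambda c: math.log(PRIORS[c])
                             + log_gaussian(radiance, *CLASS_PARAMS[c]))

label = classify_pixel(0.07)
```

Running a cheap rule like this over every pixel is what makes content-based subsetting feasible at ingest time: a scene can be reduced to, say, its clear-ocean pixels before distribution, without a full science-quality retrieval.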

  2. Macromolecular and dendrimer-based magnetic resonance contrast agents

    Energy Technology Data Exchange (ETDEWEB)

    Bumb, Ambika; Brechbiel, Martin W. (Radiation Oncology Branch, National Cancer Inst., National Inst. of Health, Bethesda, MD (United States)), e-mail: pchoyke@mail.nih.gov; Choyke, Peter (Molecular Imaging Program, National Cancer Inst., National Inst. of Health, Bethesda, MD (United States))

    2010-09-15

    Magnetic resonance imaging (MRI) is a powerful imaging modality that can provide an assessment of function or molecular expression in tandem with anatomic detail. Over the last 20-25 years, a number of gadolinium-based MR contrast agents have been developed to enhance signal by altering proton relaxation properties. This review explores a range of these agents, from small-molecule chelates, such as Gd-DTPA and Gd-DOTA, to macromolecular structures composed of albumin, polylysine, polysaccharides (dextran, inulin, starch), poly(ethylene glycol), copolymers of cystamine and cystine with Gd-DTPA, and various dendritic structures based on polyamidoamine and polylysine (Gadomers). The synthesis, structure, biodistribution, and targeting of dendrimer-based MR contrast agents are also discussed.

  3. A knowledge base architecture for distributed knowledge agents

    Science.gov (United States)

    Riedesel, Joel; Walls, Bryan

    1990-01-01

    A tuple space based object oriented model for knowledge base representation and interpretation is presented. An architecture for managing distributed knowledge agents is then implemented within the model. The general model is based upon a database implementation of a tuple space. Objects are then defined as an additional layer upon the database. The tuple space may or may not be distributed depending upon the database implementation. A language for representing knowledge and inference strategy is defined whose implementation takes advantage of the tuple space. The general model may then be instantiated in many different forms, each of which may be a distinct knowledge agent. Knowledge agents may communicate using tuple space mechanisms as in the LINDA model as well as using more well known message passing mechanisms. An implementation of the model is presented describing strategies used to keep inference tractable without giving up expressivity. An example applied to a power management and distribution network for Space Station Freedom is given.
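A minimal, non-blocking sketch of the Linda-style tuple-space operations mentioned (`out` writes a tuple, `rd` reads a matching tuple, `in` removes one). In real Linda, `rd`/`in` block until a match appears; this sketch returns `None` instead, and the tuple contents are invented for illustration:

```python
import threading

class TupleSpace:
    """Minimal Linda-style tuple space with wildcard pattern matching."""
    def __init__(self):
        self._tuples = []
        self._lock = threading.Lock()

    def out(self, tup):
        """Write a tuple into the space."""
        with self._lock:
            self._tuples.append(tup)

    def _match(self, pattern, tup):
        # None in the pattern acts as a wildcard field.
        return len(pattern) == len(tup) and all(
            p is None or p == t for p, t in zip(pattern, tup))

    def rd(self, pattern):
        """Read (without removing) the first matching tuple, else None."""
        with self._lock:
            return next((t for t in self._tuples
                         if self._match(pattern, t)), None)

    def in_(self, pattern):
        """Remove and return the first matching tuple, else None."""
        with self._lock:
            for t in self._tuples:
                if self._match(pattern, t):
                    self._tuples.remove(t)
                    return t
            return None

space = TupleSpace()
space.out(('sensor', 'power_bus_1', 28.1))
reading = space.rd(('sensor', 'power_bus_1', None))
```

Because agents communicate only through the shared space, a knowledge agent never needs to know which other agent produced a tuple, which is what decouples the distributed agents in the architecture described.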

  4. Object-based Dimensionality Reduction in Land Surface Phenology Classification

    Directory of Open Access Journals (Sweden)

    Brian E. Bunker

    2016-11-01

    Full Text Available Unsupervised classification or clustering of multi-decadal land surface phenology provides a spatio-temporal synopsis of natural and agricultural vegetation response to environmental variability and anthropogenic activities. Notwithstanding the detailed temporal information available in calibrated bi-monthly normalized difference vegetation index (NDVI) and comparable time series, typical pre-classification workflows average a pixel's bi-monthly index within the larger multi-decadal time series. While this process is one practical way to reduce the dimensionality of time series with many hundreds of image epochs, it effectively dampens temporal variation from both intra- and inter-annual observations related to land surface phenology. Through a novel application of object-based segmentation aimed at spatial (not temporal) dimensionality reduction, all 294 image epochs from a Moderate Resolution Imaging Spectroradiometer (MODIS) bi-monthly NDVI time series covering the northern Fertile Crescent were retained (in homogeneous landscape units) as unsupervised classification inputs. Given the inherent challenges of in situ or manual image interpretation of land surface phenology classes, a cluster validation approach based on transformed divergence enabled comparison between traditional and novel techniques. Improved intra-annual contrast was clearly manifest in rain-fed agriculture, and inter-annual trajectories showed increased cluster cohesion, reducing the overall number of classes identified in the Fertile Crescent study area from 24 to 10. Given careful segmentation parameters, this spatial dimensionality reduction technique augments the value of unsupervised learning to generate homogeneous land surface phenology units. By combining recent scalable computational approaches to image segmentation, future work can pursue new global land surface phenology products based on the high temporal resolution signatures of vegetation index time series.

  5. Hydrophobicity classification of polymeric materials based on fractal dimension

    Directory of Open Access Journals (Sweden)

    Daniel Thomazini

    2008-12-01

    Full Text Available This study proposes a new method to obtain the hydrophobicity classification (HC) of high-voltage polymer insulators. In the proposed method, the HC was analyzed by fractal dimension (fd), and its processing time was evaluated with a view to application on mobile devices. Texture images were created by spraying solutions produced from mixtures of isopropyl alcohol and distilled water in proportions ranging from 0 to 100% volume of alcohol (%AIA). Based on these solutions, the contact angles of the drops were measured, and the textures were used as patterns for fractal dimension calculations.
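The fractal dimension of a texture is commonly estimated by box counting: count how many grid boxes the pattern occupies at several scales, then regress log(count) on log(scale). Whether this matches the paper's exact estimator is not stated in the abstract; a generic sketch on a 2-D point set (a stand-in for thresholded texture pixels):

```python
import math

def box_counting_dimension(points, scales=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of points in the unit square by
    regressing log(occupied box count) against log(grid resolution)."""
    xs, ys = [], []
    for s in scales:
        # Boxes occupied when the unit square is cut into an s-by-s grid.
        boxes = {(int(x * s), int(y * s)) for x, y in points}
        xs.append(math.log(s))
        ys.append(math.log(len(boxes)))
    # Least-squares slope of log(count) vs log(scale).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# A densely filled square should come out near dimension 2.
grid = [(i / 100, j / 100) for i in range(100) for j in range(100)]
d = box_counting_dimension(grid)
```

A space-filling droplet texture (hydrophilic surface) drives the estimate toward 2, while sparse, bead-like droplets (hydrophobic surface) yield lower values, which is what lets fd separate the hydrophobicity classes.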

  6. Parametric classification of handvein patterns based on texture features

    Science.gov (United States)

    Al Mahafzah, Harbi; Imran, Mohammad; Supreetha Gowda H., D.

    2018-04-01

    In this paper, we have developed a biometric recognition system adopting the hand-based modality handvein, which has a unique pattern for each individual and is impossible to counterfeit or fabricate as it is an internal feature. We chose feature extraction algorithms such as LBP (a visual descriptor), LPQ (a blur-insensitive texture operator), and Log-Gabor (a texture descriptor), and the well-known classifiers KNN and SVM for classification. We experimented with single-algorithm recognition rates for handvein under different distance measures and kernel options and tabulated the results. Feature-level fusion was carried out, which increased the performance.

  7. MODEL-BASED CLUSTERING FOR CLASSIFICATION OF AQUATIC SYSTEMS AND DIAGNOSIS OF ECOLOGICAL STRESS

    Science.gov (United States)

    Clustering approaches were developed using the classification likelihood, the mixture likelihood, and also using a randomization approach with a model index. Using a clustering approach based on the mixture and classification likelihoods, we have developed an algorithm that...

  8. Complex between lignin and a Ti-based coupling agent

    DEFF Research Database (Denmark)

    Rasmussen, Jonas Stensgaard; Barsberg, Søren Talbro; Felby, Claus

    2014-01-01

    ... coating formulations would have a better performance if the adhesion to wood could be improved. In the present work, the chemical interaction between a titanium-based coupling agent, isopropyl triisostearoyl titanate (titanium agent, TA), and lignin has been studied by means of attenuated total reflectance-Fourier transform infrared spectroscopy in combination with first-principles predictions based on density functional theory (DFT). In the infrared spectra, a new band at 1586 cm-1 was identified, and the DFT predictions confirmed that the new band is due to covalent bonds in the form of ether linkages ...

  9. Intelligent Agent-Based Intrusion Detection System Using Enhanced Multiclass SVM

    Science.gov (United States)

    Ganapathy, S.; Yogesh, P.; Kannan, A.

    2012-01-01

    Intrusion detection systems have been used in the past along with various techniques to detect intrusions in networks effectively. However, most of these systems are able to detect intruders only with a high false alarm rate. In this paper, we propose a new intelligent agent-based intrusion detection model for mobile ad hoc networks using a combination of attribute selection, outlier detection, and enhanced multiclass SVM classification methods. For this purpose, an effective preprocessing technique is proposed that improves the detection accuracy and reduces the processing time. Moreover, two new algorithms, namely, an Intelligent Agent Weighted Distance Outlier Detection algorithm and an Intelligent Agent-based Enhanced Multiclass Support Vector Machine algorithm, are proposed for detecting intruders in a distributed database environment that uses intelligent agents for trust management and coordination in transaction processing. The experimental results of the proposed model show that this system detects anomalies with a low false alarm rate and a high detection rate when tested with the KDD Cup 99 data set. PMID:23056036

  10. Forest Classification Based on Forest texture in Northwest Yunnan Province

    Science.gov (United States)

    Wang, Jinliang; Gao, Yan; Wang, Xiaohua; Fu, Lei

    2014-03-01

    Forest texture is an intrinsic characteristic and an important visual feature of a forest ecological system. Full utilization of forest texture will be a great help in increasing the accuracy of forest classification based on remotely sensed data. Taking Shangri-La as a study area, forest classification has been carried out based on texture. The results show that: (1) In terms of texture abundance, texture boundary, entropy, and visual interpretation, the combination of the grayscale-gradient co-occurrence matrix and the wavelet transformation is much better than either way of forest texture information extraction alone; (2) During forest texture information extraction, the size of the most suitable texture window determined by the semi-variogram method depends on the forest type (evergreen broadleaf forest is 3×3, deciduous broadleaf forest is 5×5, etc.); (3) While classifying forest based on forest texture information, the texture factor assembly differs among forests: Variance, Heterogeneity, and Correlation should be selected when the window is between 3×3 and 5×5; Mean, Correlation, and Entropy should be used when the window is in the range of 7×7 to 19×19; and Correlation, Second Moment, and Variance should be used when the window is larger than 21×21.

  11. Forest Classification Based on Forest texture in Northwest Yunnan Province

    International Nuclear Information System (INIS)

    Wang, Jinliang; Gao, Yan; Fu, Lei; Wang, Xiaohua

    2014-01-01

    Forest texture is an intrinsic characteristic and an important visual feature of a forest ecological system. Full utilization of forest texture will be a great help in increasing the accuracy of forest classification based on remotely sensed data. Taking Shangri-La as a study area, forest classification has been carried out based on texture. The results show that: (1) In terms of texture abundance, texture boundary, entropy, and visual interpretation, the combination of the grayscale-gradient co-occurrence matrix and the wavelet transformation is much better than either way of forest texture information extraction alone; (2) During forest texture information extraction, the size of the most suitable texture window determined by the semi-variogram method depends on the forest type (evergreen broadleaf forest is 3×3, deciduous broadleaf forest is 5×5, etc.); (3) While classifying forest based on forest texture information, the texture factor assembly differs among forests: Variance, Heterogeneity, and Correlation should be selected when the window is between 3×3 and 5×5; Mean, Correlation, and Entropy should be used when the window is in the range of 7×7 to 19×19; and Correlation, Second Moment, and Variance should be used when the window is larger than 21×21.

  12. Task Classification Based Energy-Aware Consolidation in Clouds

    Directory of Open Access Journals (Sweden)

    HeeSeok Choi

    2016-01-01

    Full Text Available We consider a cloud data center, in which the service provider supplies virtual machines (VMs) on hosts or physical machines (PMs) to its subscribers for computation in an on-demand fashion. For the cloud data center, we propose a task consolidation algorithm based on task classification (i.e., computation-intensive and data-intensive) and resource utilization (e.g., CPU and RAM). Furthermore, we design a VM consolidation algorithm to balance task execution time and energy consumption without violating a predefined service level agreement (SLA). Unlike the existing research on VM consolidation or scheduling that applies either no threshold or a single-threshold scheme, we focus on a double-threshold (upper and lower) scheme for VM consolidation. More specifically, when a host operates with resource utilization below the lower threshold, all the VMs on the host will be scheduled to be migrated to other hosts and then the host will be powered down, while when a host operates with resource utilization above the upper threshold, a VM will be migrated to avoid using 100% of resource utilization. Based on experimental performance evaluations with real-world traces, we show that our task classification based energy-aware consolidation algorithm (TCEA) achieves a significant energy reduction without incurring predefined SLA violations.
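The double-threshold rule described above can be sketched in a few lines; the threshold values below are illustrative, as the abstract does not give concrete numbers:

```python
def consolidation_action(utilization, lower=0.2, upper=0.8):
    """Double-threshold consolidation policy: below the lower
    threshold, migrate every VM away and power the host down; above
    the upper threshold, migrate one VM so the host never reaches
    100% utilization; otherwise leave the host alone."""
    if utilization < lower:
        return "migrate_all_and_power_down"
    if utilization > upper:
        return "migrate_one_vm"
    return "keep"

print(consolidation_action(0.1))   # underloaded host
print(consolidation_action(0.5))   # host inside the band
print(consolidation_action(0.95))  # host near saturation
```

The point of the two thresholds is that a single threshold can only address overload or underload, not both; the band between them is where hosts are considered efficiently utilized.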

  13. Feature selection gait-based gender classification under different circumstances

    Science.gov (United States)

    Sabir, Azhin; Al-Jawad, Naseer; Jassim, Sabah

    2014-05-01

    This paper proposes a gender classification based on human gait features and investigates the problem of two variations, clothing (wearing coats) and carrying a bag, in addition to the normal gait sequence. The feature vectors in the proposed system are constructed after applying a wavelet transform. Three different feature sets are proposed in this method. The first, spatio-temporal distance, deals with the distances between different parts of the human body (such as the feet, knees, hands, height and shoulders) during one gait cycle. The second and third feature sets are constructed from the approximation and non-approximation coefficients of the human body, respectively. To extract these two feature sets, we divided the human body into two parts, the upper and lower body, based on the golden ratio proportion. In this paper, we have adopted a statistical method for constructing the feature vector from the above sets. The dimension of the constructed feature vector is reduced based on the Fisher score as a feature selection method to optimize its discriminating significance. Finally, k-Nearest Neighbor is applied as the classification method. Experimental results demonstrate that our approach provides a more realistic scenario and relatively better performance compared with existing approaches.
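The Fisher score used above for feature selection ranks each feature by between-class separation over within-class spread; a minimal sketch for one feature (the toy values are illustrative):

```python
def fisher_score(values, labels):
    """Fisher score of a single feature: sum over classes of
    n_c * (class mean - overall mean)^2, divided by the summed
    within-class variance. Higher = more discriminative."""
    mu = sum(values) / len(values)
    num = den = 0.0
    for c in set(labels):
        vc = [v for v, l in zip(values, labels) if l == c]
        mc = sum(vc) / len(vc)
        var = sum((v - mc) ** 2 for v in vc) / len(vc)
        num += len(vc) * (mc - mu) ** 2
        den += len(vc) * var
    return num / den if den else float("inf")

# A well-separated feature scores higher than a noisy one.
labels = [0, 0, 0, 1, 1, 1]
good = [1.0, 1.1, 0.9, 5.0, 5.1, 4.9]
noisy = [1.0, 5.0, 3.0, 1.1, 4.9, 3.2]
print(fisher_score(good, labels) > fisher_score(noisy, labels))  # → True
```

Selecting the top-scoring features by this criterion is what reduces the dimension of the gait feature vector before the k-NN step.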

  14. Joint Probability-Based Neuronal Spike Train Classification

    Directory of Open Access Journals (Sweden)

    Yan Chen

    2009-01-01

    Full Text Available Neuronal spike trains are used by the nervous system to encode and transmit information. Euclidean distance-based methods (EDBMs) have been applied to quantify the similarity between temporally-discretized spike trains and model responses. In this study, using the same discretization procedure, we developed and applied a joint probability-based method (JPBM) to classify individual spike trains of slowly adapting pulmonary stretch receptors (SARs). The activity of individual SARs was recorded in anaesthetized, paralysed adult male rabbits, which were artificially ventilated at a constant rate and one of three different volumes. Two-thirds of the responses to the 600 stimuli presented at each volume were used to construct three response models (one for each stimulus volume) consisting of a series of time bins, each with spike probabilities. The remaining one-third of the responses were used as test responses to be classified into one of the three model responses. This was done by computing the joint probability of observing the same series of events (spikes or no spikes, dictated by the test response) in a given model and determining which of the three probabilities was highest. The JPBM generally produced better classification accuracy than the EDBM, and both performed well above chance. Both methods were similarly affected by variations in discretization parameters, response epoch duration, and two different response alignment strategies. Increasing bin widths increased classification accuracy, which also improved with increased observation time, but primarily during periods of increasing lung inflation. Thus, the JPBM is a simple and effective method for performing spike-train classification.
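The JPBM decision rule reduces to a small computation once trains are binned to 0/1 events; a sketch under the assumption of independent bins (the toy trains and the probability clamp `eps` are illustrative):

```python
import math

def train_model(trains):
    """Per-bin spike probability model from binary (0/1) spike trains."""
    n = len(trains)
    return [sum(bins) / n for bins in zip(*trains)]

def log_joint_probability(test, model, eps=1e-6):
    """Log joint probability of the test train's exact spike /
    no-spike sequence under a bin-probability model."""
    lp = 0.0
    for x, p in zip(test, model):
        p = min(max(p, eps), 1 - eps)  # clamp to avoid log(0)
        lp += math.log(p if x else 1 - p)
    return lp

def classify(test, models):
    """Assign the test train to the model giving it highest probability."""
    scores = {name: log_joint_probability(test, m) for name, m in models.items()}
    return max(scores, key=scores.get)

models = {
    "low":  train_model([[0, 0, 1, 0], [0, 1, 1, 0], [0, 0, 1, 0]]),
    "high": train_model([[1, 1, 1, 1], [1, 1, 0, 1], [1, 1, 1, 1]]),
}
print(classify([1, 1, 1, 1], models))  # → high
```

Working in log space keeps the product of many small per-bin probabilities numerically stable, which matters once responses span hundreds of bins.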

  15. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    Science.gov (United States)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.

  16. Stability of subsystem solutions in agent-based models

    Science.gov (United States)

    Perc, Matjaž

    2018-01-01

    The fact that relatively simple entities, such as particles or neurons, or even ants or bees or humans, give rise to fascinatingly complex behaviour when interacting in large numbers is the hallmark of complex systems science. Agent-based models are frequently employed for modelling and obtaining a predictive understanding of complex systems. Since the sheer number of equations that describe the behaviour of an entire agent-based model often makes it impossible to solve such models exactly, Monte Carlo simulation methods must be used for the analysis. However, unlike pairwise interactions among particles that typically govern solid-state physics systems, interactions among agents that describe systems in biology, sociology or the humanities often involve group interactions, and they also involve a larger number of possible states even for the most simplified description of reality. This begets the question: when can we be certain that an observed simulation outcome of an agent-based model is actually stable and valid in the large system-size limit? The latter is key for the correct determination of phase transitions between different stable solutions, and for the understanding of the underlying microscopic processes that led to these phase transitions. We show that a satisfactory answer can only be obtained by means of a complete stability analysis of subsystem solutions. A subsystem solution can be formed by any subset of all possible agent states. The winner between two subsystem solutions can be determined by the average moving direction of the invasion front that separates them, yet it is crucial that the competing subsystem solutions are characterised by a proper composition and spatiotemporal structure before the competition starts. We use the spatial public goods game with diverse tolerance as an example, but the approach has relevance for a wide variety of agent-based models.

  17. Soft computing based feature selection for environmental sound classification

    NARCIS (Netherlands)

    Shakoor, A.; May, T.M.; Van Schijndel, N.H.

    2010-01-01

    Environmental sound classification has a wide range of applications, like hearing aids, mobile communication devices, portable media players, and auditory protection devices. Sound classification systems typically extract features from the input sound. Using too many features increases complexity

  18. Chemometric classification of casework arson samples based on gasoline content.

    Science.gov (United States)

    Sinkov, Nikolai A; Sandercock, P Mark L; Harynuk, James J

    2014-02-01

    Detection and identification of ignitable liquids (ILs) in arson debris is a critical part of arson investigations. The challenge of this task is due to the complex and unpredictable chemical nature of arson debris, which also contains pyrolysis products from the fire. ILs, most commonly gasoline, are complex chemical mixtures containing hundreds of compounds that will be consumed or otherwise weathered by the fire to varying extents depending on factors such as temperature, air flow, the surface on which the IL was placed, etc. While methods such as ASTM E-1618 are effective, data interpretation can be a costly bottleneck in the analytical process for some laboratories. In this study, we address this issue through the application of chemometric tools. Prior to the application of chemometric tools such as PLS-DA and SIMCA, issues of chromatographic alignment and variable selection need to be addressed. Here we use an alignment strategy based on a ladder consisting of perdeuterated n-alkanes. Variable selection and model optimization were automated using a hybrid backward elimination (BE) and forward selection (FS) approach guided by the cluster resolution (CR) metric. In this work, we demonstrate the automated construction, optimization, and application of chemometric tools to casework arson data. The resulting PLS-DA and SIMCA classification models, trained with 165 training set samples, provided classification of 55 validation set samples based on gasoline content with 100% specificity and sensitivity. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Interactive classification and content-based retrieval of tissue images

    Science.gov (United States)

    Aksoy, Selim; Marchisio, Giovanni B.; Tusk, Carsten; Koperski, Krzysztof

    2002-11-01

    We describe a system for interactive classification and retrieval of microscopic tissue images. Our system models tissues in pixel, region and image levels. Pixel level features are generated using unsupervised clustering of color and texture values. Region level features include shape information and statistics of pixel level feature values. Image level features include statistics and spatial relationships of regions. To reduce the gap between low-level features and high-level expert knowledge, we define the concept of prototype regions. The system learns the prototype regions in an image collection using model-based clustering and density estimation. Different tissue types are modeled using spatial relationships of these regions. Spatial relationships are represented by fuzzy membership functions. The system automatically selects significant relationships from training data and builds models which can also be updated using user relevance feedback. A Bayesian framework is used to classify tissues based on these models. Preliminary experiments show that the spatial relationship models we developed provide a flexible and powerful framework for classification and retrieval of tissue images.

  20. Drunk driving detection based on classification of multivariate time series.

    Science.gov (United States)

    Li, Zhenlong; Jin, Xue; Zhao, Xiaohua

    2015-09-01

    This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent the multivariate time series, and a bottom-up algorithm was then employed to segment them. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify the driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
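The bottom-up piecewise linear step works by repeatedly merging the pair of adjacent segments whose merged line fit stays cheapest; a univariate sketch (the paper applies it per channel; the error threshold and test series here are illustrative):

```python
def fit_error(t, y):
    """Least-squares line through (t, y); returns (SSR, slope)."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    den = sum((a - mt) ** 2 for a in t)
    slope = sum((a - mt) * (b - my) for a, b in zip(t, y)) / den if den else 0.0
    icept = my - slope * mt
    return sum((b - (slope * a + icept)) ** 2 for a, b in zip(t, y)), slope

def bottom_up_segments(t, y, max_error=0.5):
    """Bottom-up piecewise linear representation: start from the
    finest segments and merge the cheapest adjacent pair until every
    remaining merge would exceed max_error. Returns the per-segment
    (slope, time interval) features used for classification."""
    segs = [[i, i + 1] for i in range(len(t) - 1)]
    while len(segs) > 1:
        costs = []
        for k in range(len(segs) - 1):
            a, c = segs[k][0], segs[k + 1][1]
            costs.append(fit_error(t[a:c + 1], y[a:c + 1])[0])
        k = min(range(len(costs)), key=costs.__getitem__)
        if costs[k] > max_error:
            break
        segs[k] = [segs[k][0], segs[k + 1][1]]
        del segs[k + 1]
    feats = []
    for a, c in segs:
        _, slope = fit_error(t[a:c + 1], y[a:c + 1])
        feats.append((round(slope, 2), t[c] - t[a]))
    return feats

# A rise-then-fall series collapses into two segments.
print(bottom_up_segments(list(range(7)), [0, 1, 2, 3, 2, 1, 0]))
```

Each segment's (slope, interval) pair is exactly the kind of compact feature the abstract feeds to the SVM.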

  1. Multiple kernel boosting framework based on information measure for classification

    International Nuclear Information System (INIS)

    Qi, Chengming; Wang, Yuping; Tian, Wenjie; Wang, Qun

    2016-01-01

    The performance of kernel-based methods, such as the support vector machine (SVM), is greatly affected by the choice of kernel function. Multiple kernel learning (MKL) is a promising family of machine learning algorithms and has attracted much attention in recent years. MKL combines multiple sub-kernels to seek better results than single kernel learning. In order to improve the efficiency of SVM and MKL, in this paper the Kullback–Leibler kernel function is derived to develop SVM. The proposed method employs an improved ensemble learning framework, named KLMKB, which applies Adaboost to learn a multiple kernel-based classifier. In the experiment on hyperspectral remote sensing image classification, we employ features selected through the Optional Index Factor (OIF) to classify the satellite image. We extensively examine the performance of our approach in comparison to some relevant and state-of-the-art algorithms on a number of benchmark classification data sets and a hyperspectral remote sensing image data set. Experimental results show that our method has stable behavior and noticeable accuracy for different data sets.
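The core MKL idea of combining sub-kernels can be illustrated independently of the paper's KL kernel: a convex combination of valid kernels is itself a valid kernel. The base kernels, `gamma`, and weights below are illustrative, not the paper's:

```python
import math

def linear_kernel(x, y):
    return sum(a * b for a, b in zip(x, y))

def rbf_kernel(x, y, gamma=0.5):
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def combined_kernel(x, y, weights=(0.3, 0.7)):
    """MKL-style sub-kernel combination: a weighted sum of base
    kernels with non-negative weights is again a positive
    semi-definite kernel, so it can drive an SVM directly."""
    return weights[0] * linear_kernel(x, y) + weights[1] * rbf_kernel(x, y)

x, y = [1.0, 0.0], [0.0, 1.0]
print(round(combined_kernel(x, y), 3))  # → 0.258
```

In a full MKL system the weights themselves are learned (here, by Adaboost over kernel-based weak classifiers) rather than fixed.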

  2. [Galaxy/quasar classification based on nearest neighbor method].

    Science.gov (United States)

    Li, Xiang-Ru; Lu, Yu; Zhou, Jian-Ming; Wang, Yong-Jun

    2011-09-01

    With the wide application of high-quality CCDs in celestial spectrum imagery and the implementation of many large sky survey programs (e.g., the Sloan Digital Sky Survey (SDSS), the Two-degree-Field Galaxy Redshift Survey (2dF), the Spectroscopic Survey Telescope (SST), the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) program and the Large Synoptic Survey Telescope (LSST) program, etc.), celestial observational data are coming into the world like torrential rain. Therefore, to utilize them effectively and fully, research on automated processing methods for celestial data is imperative. In the present work, we investigated how to recognize galaxies and quasars from spectra based on the nearest neighbor method. Galaxies and quasars are extragalactic objects; they are far away from Earth, and their spectra are usually contaminated by various kinds of noise. Therefore, it is a typical problem to recognize these two types of spectra in automatic spectra classification. Furthermore, the utilized method, nearest neighbor, is one of the most typical, classic and mature algorithms in pattern recognition and data mining, and is often used as a benchmark in developing novel algorithms. For applicability in practice, it is shown that the recognition ratio of the nearest neighbor method (NN) is comparable to the best results reported in the literature based on more complicated methods, and the superiority of NN is that this method does not need to be trained, which is useful in incremental learning and parallel computation in mass spectral data processing. In conclusion, the results in this work are helpful for studying galaxy and quasar spectra classification.
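The benchmark method itself is tiny: label a query with the label of its closest training example. A 1-NN sketch on toy three-bin "spectra" (the vectors and labels are illustrative, not survey data):

```python
def nearest_neighbour(query, training):
    """1-NN: return the label of the training spectrum closest to the
    query under squared Euclidean distance. There is no training
    phase, which is why NN suits incremental and parallel settings."""
    def d2(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(training, key=lambda item: d2(query, item[0]))[1]

training = [
    ([1.0, 0.2, 0.1], "galaxy"),
    ([0.9, 0.3, 0.2], "galaxy"),
    ([0.1, 0.8, 0.9], "quasar"),
]
print(nearest_neighbour([0.2, 0.7, 0.8], training))  # → quasar
```

Adding a newly labelled spectrum is just appending to `training`, which is the incremental-learning property the abstract highlights.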

  3. Robust Pedestrian Classification Based on Hierarchical Kernel Sparse Representation

    Directory of Open Access Journals (Sweden)

    Rui Sun

    2016-08-01

    Full Text Available Vision-based pedestrian detection has become an active topic in computer vision and autonomous vehicles. It aims at detecting pedestrians appearing ahead of the vehicle using a camera so that autonomous vehicles can assess the danger and take action. Due to varied illumination and appearance, complex backgrounds and occlusion, pedestrian detection in outdoor environments is a difficult problem. In this paper, we propose a novel hierarchical feature extraction and weighted kernel sparse representation model for pedestrian classification. Initially, hierarchical feature extraction based on a CENTRIST descriptor is used to capture discriminative structures. A max pooling operation is used to enhance the invariance to varying appearance. Then, a kernel sparse representation model is proposed to fully exploit the discrimination information embedded in the hierarchical local features, with a Gaussian weight function as the measure to effectively handle occlusion in pedestrian images. Extensive experiments are conducted on benchmark databases, including INRIA, Daimler, an artificially generated dataset and a real occluded dataset, demonstrating the more robust performance of the proposed method compared to state-of-the-art pedestrian classification methods.

  4. Style-based classification of Chinese ink and wash paintings

    Science.gov (United States)

    Sheng, Jiachuan; Jiang, Jianmin

    2013-09-01

    Following the fact that a large collection of ink and wash paintings (IWP) is being digitized and made available on the Internet, their automated content description, analysis, and management are attracting attention across research communities. While existing research in relevant areas is primarily focused on image processing approaches, a style-based algorithm is proposed to classify IWPs automatically by their authors. As IWPs do not have colors or even tones, the proposed algorithm applies edge detection to locate local regions and detect painting strokes, enabling histogram-based feature extraction to capture important cues that reflect the styles of different artists. Such features are then used to drive a number of neural networks in parallel to complete the classification, and an information-entropy-balanced fusion is proposed to make an integrated decision from the multiple neural network classification results, in which the entropy is used as a pointer to combine the global and local features. Evaluations via experiments show that the proposed algorithm achieves good performance, providing excellent potential for computerized analysis and management of IWPs.
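The abstract does not give the fusion formula; one plausible reading of "entropy-balanced" is to weight each network's class-probability vector by its inverse entropy, so confident (low-entropy) networks count more. A sketch under that assumption, with illustrative outputs:

```python
import math

def entropy(p):
    """Shannon entropy (nats) of a probability vector."""
    return -sum(q * math.log(q) for q in p if q > 0)

def entropy_balanced_fusion(outputs):
    """Fuse per-network class-probability vectors using
    inverse-entropy weights, then return the arg-max class.
    This weighting is an assumption, not the paper's exact rule."""
    weights = [1.0 / (entropy(p) + 1e-6) for p in outputs]
    total = sum(weights)
    k = len(outputs[0])
    fused = [sum(w * p[i] for w, p in zip(weights, outputs)) / total
             for i in range(k)]
    return fused.index(max(fused))

nets = [
    [0.9, 0.05, 0.05],  # confident network: artist 0
    [0.4, 0.35, 0.25],  # uncertain network
]
print(entropy_balanced_fusion(nets))  # → 0
```

The confident network dominates the decision, which is the intended balancing behaviour.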

  5. A Coupled Simulation Architecture for Agent-Based/Geohydrological Modelling

    Science.gov (United States)

    Jaxa-Rozen, M.

    2016-12-01

    The quantitative modelling of social-ecological systems can provide useful insights into the interplay between social and environmental processes, and their impact on emergent system dynamics. However, such models should acknowledge the complexity and uncertainty of both of the underlying subsystems. For instance, the agent-based models which are increasingly popular for groundwater management studies can be made more useful by directly accounting for the hydrological processes which drive environmental outcomes. Conversely, conventional environmental models can benefit from an agent-based depiction of the feedbacks and heuristics which influence the decisions of groundwater users. From this perspective, this work describes a Python-based software architecture which couples the popular NetLogo agent-based platform with the MODFLOW/SEAWAT geohydrological modelling environment. This approach enables users to implement agent-based models in NetLogo's user-friendly platform, while benefiting from the full capabilities of MODFLOW/SEAWAT packages or reusing existing geohydrological models. The software architecture is based on the pyNetLogo connector, which provides an interface between the NetLogo agent-based modelling software and the Python programming language. This functionality is then extended and combined with Python's object-oriented features, to design a simulation architecture which couples NetLogo with MODFLOW/SEAWAT through the FloPy library (Bakker et al., 2016). The Python programming language also provides access to a range of external packages which can be used for testing and analysing the coupled models, which is illustrated for an application of Aquifer Thermal Energy Storage (ATES).

  6. On the Feature Selection and Classification Based on Information Gain for Document Sentiment Analysis

    Directory of Open Access Journals (Sweden)

    Asriyanti Indah Pratiwi

    2018-01-01

    Full Text Available Sentiment analysis of movie reviews is a need of today's lifestyle. Unfortunately, an enormous number of features makes sentiment analysis slow and less sensitive. Finding the optimal feature selection and classification scheme is still a challenge. In order to handle an enormous number of features and provide better sentiment classification, an information-based feature selection and classification method is proposed. The proposed method removes more than 90% of the unnecessary features, while the proposed classification scheme achieves 96% accuracy of sentiment classification. From the experimental results, it can be concluded that the combination of the proposed feature selection and classification achieves the best performance so far.
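Information gain, the criterion named in the title, scores a feature by how much knowing it reduces class-label entropy; a sketch for binary term-presence features (the toy labels and features are illustrative):

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a label sequence."""
    n = len(labels)
    counts = {}
    for l in labels:
        counts[l] = counts.get(l, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def information_gain(feature, labels):
    """IG of a feature: label entropy minus the expected label
    entropy remaining once the feature's value is known."""
    n = len(labels)
    gain = entropy(labels)
    for value in set(feature):
        subset = [l for f, l in zip(feature, labels) if f == value]
        gain -= len(subset) / n * entropy(subset)
    return gain

labels  = ["pos", "pos", "neg", "neg"]
perfect = [1, 1, 0, 0]  # term appears only in positive reviews
useless = [1, 0, 1, 0]  # term uncorrelated with sentiment
print(information_gain(perfect, labels), information_gain(useless, labels))
```

Keeping only terms above an IG threshold is what discards the bulk of the vocabulary while preserving the discriminative features.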

  7. Cluster Validity Classification Approaches Based on Geometric Probability and Application in the Classification of Remotely Sensed Images

    Directory of Open Access Journals (Sweden)

    LI Jian-Wei

    2014-08-01

    Full Text Available On the basis of the cluster validity function based on geometric probability in literature [1, 2], we propose a cluster analysis method based on geometric probability to process large amounts of data in a rectangular area. The basic idea is top-down stepwise refinement: first into categories, then into subcategories. On all clustering levels, the cluster validity function based on geometric probability is used first to determine the clusters and the gathering direction, and then the cluster centers and borders are determined. Through TM remote sensing image classification examples, the method is compared with supervised and unsupervised classification in ERDAS and with the cluster analysis method based on geometric probability in a two-dimensional square proposed in literature [2]. Results show that the proposed method can significantly improve the classification accuracy.

  8. Bearing Fault Classification Based on Conditional Random Field

    Directory of Open Access Journals (Sweden)

    Guofeng Wang

    2013-01-01

    Full Text Available Condition monitoring of rolling element bearings is paramount for predicting the lifetime and performing effective maintenance of mechanical equipment. To overcome the drawbacks of the hidden Markov model (HMM) and improve diagnosis accuracy, a conditional random field (CRF) model based classifier is proposed. In this model, the feature vector sequences and the fault categories are linked by an undirected graphical model in which their relationship is represented by a global conditional probability distribution. In comparison with the HMM, the main advantage of the CRF model is that it can depict the temporal dynamic information between the observation sequences and state sequences without assuming the independence of the input feature vectors. Therefore, the interrelationship between adjacent observation vectors can also be depicted and integrated into the model, which makes the classifier more robust and accurate than the HMM. To evaluate the effectiveness of the proposed method, four kinds of bearing vibration signals, which correspond to normal, inner race pit, outer race pit and roller pit conditions respectively, are collected from a test rig. The CRF and HMM models are then built to perform fault classification by taking the sub-band energy features of wavelet packet decomposition (WPD) as the observation sequences. Moreover, the K-fold cross validation method is adopted to improve the evaluation accuracy of the classifier. The analysis and comparison under different fold times show that the accuracy rate of classification using the CRF model is higher than with the HMM. This method sheds some new light on the accurate classification of bearing faults.
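The sub-band energy features from WPD can be illustrated with a minimal Haar-based packet decomposition (the paper's wavelet choice and depth are not stated; Haar and two levels are assumptions here):

```python
def haar_step(x):
    """One orthonormal Haar step: low-pass (scaled pairwise sum) and
    high-pass (scaled pairwise difference) half-bands."""
    s = 2 ** -0.5
    a = [s * (x[i] + x[i + 1]) for i in range(0, len(x) - 1, 2)]
    d = [s * (x[i] - x[i + 1]) for i in range(0, len(x) - 1, 2)]
    return a, d

def wpd_band_energies(x, levels=2):
    """Wavelet packet decomposition: unlike the plain DWT, every band
    (not just the approximation) is split again at each level. The
    feature vector is the energy of each terminal sub-band."""
    bands = [x]
    for _ in range(levels):
        bands = [half for b in bands for half in haar_step(b)]
    return [sum(v * v for v in b) for b in bands]

signal = [1.0, 2.0, 3.0, 4.0, 4.0, 3.0, 2.0, 1.0]
features = wpd_band_energies(signal)
print(len(features))  # 2 levels -> 4 sub-bands
```

Because the Haar step is orthonormal, the band energies sum to the signal energy, so the features partition the vibration energy across frequency bands.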

  9. Agent-based Personal Network (PN) service architecture

    DEFF Research Database (Denmark)

    Jiang, Bo; Olesen, Henning

    2004-01-01

    In this paper we propose a new concept for a centralized agent system as the solution for the PN service architecture, which aims to efficiently control and manage the PN resources and enable the PN based services to run seamlessly over different networks and devices. The working principle

  10. Emotion based Agent Architectures for Tutoring Systems : The INES Architecture

    NARCIS (Netherlands)

    Poel, Mannes; op den Akker, Rieks; Heylen, Dirk; Nijholt, Anton; Trappl, Robert

    2004-01-01

    In this paper we discuss our approach to integrate emotions in the agent based tutoring system INES (Intelligent Nursing Education System). First we discuss the INES system where we emphasize the emotional component of the system. Afterwards we show how a more advanced emotion generation

  11. Agent-based Security and Efficiency Estimation in Airport Terminals

    NARCIS (Netherlands)

    Janssen, S.A.M.

    We investigate the use of an Agent-based framework to identify and quantify the relationship between security and efficiency within airport terminals. In this framework, we define a novel Security Risk Assessment methodology that explicitly models attacker and defender behavior in a security

  12. Resource Based Multi Agent Plan Merging : Framework and application

    NARCIS (Netherlands)

    De Weerdt, M.M.; Van der Krogt, R.P.J.; Witteveen, C.

    2003-01-01

    We discuss a resource-based planning framework where agents are able to merge plans by exchanging resources. In this framework, plans are specified as structured objects composed of resource consuming and resource producing processes (actions). A plan itself can also be conceived as a process

  13. Structuring Qualitative Data for Agent-Based Modelling

    NARCIS (Netherlands)

    Ghorbani, Amineh; Dijkema, Gerard P.J.; Schrauwen, Noortje

    2015-01-01

    Using ethnography to build agent-based models may result in more empirically grounded simulations. Our study on innovation practice and culture in the Westland horticulture sector served to explore what information and data from ethnographic analysis could be used in models and how. MAIA, a

  14. An agent-based model for diffusion of electric vehicles

    NARCIS (Netherlands)

    Kangur, Ayla; Jager, Wander; Verbrugge, Rineke; Bockarjova, Marija

    2017-01-01

    The transition from fuel cars to electric cars is a large-scale process involving many interactions between consumers and other stakeholders over decades. To explore how policies may interact with consumer behavior over such a long time period, we developed an agent-based social simulation model. In

  15. Emotion based Agent Architectures for Tutoring Systems: The INES Architecture

    NARCIS (Netherlands)

    Poel, Mannes; op den Akker, Hendrikus J.A.; Heylen, Dirk K.J.; Nijholt, Antinus; Trappl, R.

    2004-01-01

    In this paper we discuss our approach to integrate emotions in the agent based tutoring system INES (Intelligent Nursing Education System). First we discuss the INES system where we emphasize the emotional component of the system. Afterwards we show how a more advanced emotion generation

  16. A review of Agent Based Modeling for agricultural policy evaluation

    NARCIS (Netherlands)

    Kremmydas, Dimitris; Athanasiadis, I.N.; Rozakis, Stelios

    2018-01-01

    Farm level scale policy analysis is receiving increased attention due to a changing agricultural policy orientation. Agent based models (ABM) are farm level models that have appeared in the end of 1990's, having several differences from traditional farm level models, like the consideration of

  17. Ontology-based intelligent fuzzy agent for diabetes application

    NARCIS (Netherlands)

    Acampora, G.; Lee, C.-S.; Wang, M.-H.; Hsu, C.-Y.; Loia, V.

    2009-01-01

    It is widely pointed out that classical ontologies are not sufficient to deal with imprecise and vague knowledge for some real world applications, but the fuzzy ontology can effectively solve data and knowledge with uncertainty. In this paper, an ontology-based intelligent fuzzy agent (OIFA),

  18. Solution of partial differential equations by agent-based simulation

    International Nuclear Information System (INIS)

    Szilagyi, Miklos N

    2014-01-01

    The purpose of this short note is to demonstrate that partial differential equations can be quickly solved by agent-based simulation with high accuracy. There is no need for the solution of large systems of algebraic equations. This method is especially useful for quick determination of potential distributions and demonstration purposes in teaching electromagnetism. (letters and comments)
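    The idea above can be illustrated (as a sketch, not the note's actual implementation) by the classical random-walk interpretation of Laplace's equation: the potential at an interior grid point equals the expected boundary potential at which an agent starting there exits. Grid size, boundary values, and agent count below are invented for the example:

    ```python
    import random

    def laplace_mc(nx, ny, boundary, x0, y0, n_agents=2000, seed=1):
        """Estimate the Laplace-equation potential at (x0, y0) on an
        nx-by-ny grid: each agent random-walks until it reaches the
        boundary, and the potential is the mean boundary value at the
        agents' exit points."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_agents):
            x, y = x0, y0
            while 0 < x < nx - 1 and 0 < y < ny - 1:
                dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
                x, y = x + dx, y + dy
            total += boundary(x, y)
        return total / n_agents

    # One side of the square held at 100 V, the other three grounded.
    top_hot = lambda x, y: 100.0 if y == 20 else 0.0
    ```

    By symmetry the exact potential at the centre of this square is 25 V, so the Monte Carlo estimate should land close to that without ever assembling a system of algebraic equations.
    
    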

  19. An agent-based architecture for multimodal interaction

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.; Wijngaards, W.C.A.

    In this paper, an executable generic process model is proposed for combined verbal and non-verbal communication processes and their interaction. The agent-based architecture can be used to create multimodal interaction. The generic process model has been designed, implemented and used to simulate

  20. An agent-based architecture for multimodal interaction

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.; Wijngaards, W.C.A.

    2001-01-01

    In this paper, an executable generic process model is proposed for combined verbal and non-verbal communication processes and their interaction. The agent-based architecture can be used to create multimodal interaction. The generic process model has been designed, implemented and used to simulate

  1. The Geographic Information Grid System Based on Mobile Agent

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    We analyze the deficiencies of current application systems and discuss the key requirements of distributed Geographic Information Service (GIS). We construct the distributed GIS on a grid platform. For flexibility and efficiency, we integrate mobile agent technology into the system. We propose a new prototype system, the Geographic Information Grid System (GIGS), based on mobile agents. This system offers flexible services and high performance, and improves the sharing of distributed resources. The service strategy of the system and examples are also presented.

  2. A novel method for human age group classification based on

    Directory of Open Access Journals (Sweden)

    Anuradha Yarlagadda

    2015-10-01

    Full Text Available In the computer vision community, easy categorization of a person’s facial image into various age groups is often not quite precise and has not been pursued effectively. To address this problem, which is an important area of research, the present paper proposes an innovative age group classification system based on the Correlation Fractal Dimension of complex facial images. Wrinkles appear on the face with aging, thereby changing the facial edges of the image. The proposed method is rotation and pose invariant. The present paper concentrates on developing an innovative technique that classifies facial images into four categories, i.e. child image (0–15), young adult image (15–30), middle-aged adult image (31–50), and senior adult image (>50), based on the correlation FD value of a facial edge image.
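    The Correlation Fractal Dimension used above is related to the simpler box-counting dimension; a minimal box-counting estimator for a binary edge image (a stand-in for illustration, not the paper's exact measure) can be written as:

    ```python
    import numpy as np

    def box_counting_dimension(img, sizes=(1, 2, 4, 8, 16)):
        """Estimate the fractal dimension of a binary image by counting
        occupied s-by-s boxes at several scales s and fitting
        log N(s) = -D log s + c."""
        counts = []
        for s in sizes:
            h = (img.shape[0] // s) * s
            w = (img.shape[1] // s) * s
            blocks = img[:h, :w].reshape(h // s, s, w // s, s)
            counts.append((blocks.sum(axis=(1, 3)) > 0).sum())
        slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
        return -slope

    # Sanity check input: a straight line has dimension close to 1,
    # while a wrinkled, space-filling edge map scores higher.
    line = np.eye(64, dtype=np.uint8)
    ```

    In an age-classification pipeline of this kind, the scalar dimension of the facial edge map would feed the four-way age-group decision rule.
    
    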

  3. Automatic classification of visual evoked potentials based on wavelet decomposition

    Science.gov (United States)

    Stasiakiewicz, Paweł; Dobrowolski, Andrzej P.; Tomczykiewicz, Kazimierz

    2017-04-01

    Diagnosis of the part of the visual system that is responsible for conducting compound action potentials is generally based on visual evoked potentials generated as a result of stimulation of the eye by an external light source. The condition of the patient's visual path is assessed by a set of parameters that describe the characteristic extremes, called waves, in the time domain. The decision process is complex, so the diagnosis depends significantly on the experience of the doctor. The authors developed a procedure - based on wavelet decomposition and linear discriminant analysis - that ensures automatic classification of visual evoked potentials. The algorithm assigns an individual case to the normal or pathological class. The proposed classifier has a 96.4% sensitivity at a 10.4% probability of false alarm in a group of 220 cases, and the area under the ROC curve equals 0.96, which, from the medical point of view, is a very good result.

  4. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    Science.gov (United States)

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of supervised methods. In this sense, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms we evaluated K-means, Fuzzy K-means and the Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM improves the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453
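    At the heart of the GMM variant above is expectation-maximization fitting. A minimal 1-D, k-component EM loop (deterministic quantile initialization, synthetic intensities - not the paper's multiparametric, tissue-prior-supported pipeline) looks like:

    ```python
    import numpy as np

    def gmm_em_1d(x, k=2, iters=50):
        """Fit a k-component 1-D Gaussian mixture with EM; returns
        (weights, means, stds). Hard class labels then follow from the
        argmax of the responsibilities."""
        mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # deterministic init
        sigma = np.full(k, x.std())
        w = np.full(k, 1.0 / k)
        for _ in range(iters):
            # E-step: responsibility of each component for each sample
            dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
            r = dens / dens.sum(axis=1, keepdims=True)
            # M-step: re-estimate weights, means, and variances
            n = r.sum(axis=0)
            w = n / len(x)
            mu = (r * x[:, None]).sum(axis=0) / n
            sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n)
        return w, mu, sigma
    ```

    A real MR pipeline would run this per voxel intensity vector over several tissue classes; the structured GHMRF additionally couples neighbouring voxels, which this sketch omits.
    
    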

  5. Personalized E- learning System Based on Intelligent Agent

    Science.gov (United States)

    Duo, Sun; Ying, Zhou Cai

    Lack of personalized learning is the key shortcoming of traditional e-learning systems. This paper analyzes the personal characteristics in e-learning activity. To support personalized e-learning, a personalized e-learning system based on an intelligent agent is proposed and realized in the paper. The structure of the system, its work process, and the design and realization of the intelligent agent are introduced. After trial use of the system by a network school, we found that the system could improve the learner's active participation and provide learners with a personalized knowledge service. Thus, we consider it a practical solution for realizing self-learning and self-promotion in the age of lifelong education.

  6. Neighborhood Hypergraph Based Classification Algorithm for Incomplete Information System

    Directory of Open Access Journals (Sweden)

    Feng Hu

    2015-01-01

    Full Text Available The problem of classification in incomplete information systems is a hot issue in intelligent information processing. The hypergraph is a new intelligent method for machine learning. However, it is hard to process an incomplete information system with the traditional hypergraph, for two reasons: (1) the hyperedges are generated randomly in the traditional hypergraph model; (2) the existing methods are unsuitable for incomplete information systems because of their missing values. In this paper, we propose a novel classification algorithm for incomplete information systems based on the hypergraph model and rough set theory. First, we initialize the hypergraph. Second, we classify the training set by the neighborhood hypergraph. Third, under the guidance of rough sets, we replace the poor hyperedges. After that, we obtain a good classifier. The proposed approach is tested on 15 data sets from the UCI machine learning repository and compared with some existing methods, such as C4.5, SVM, Naive Bayes, and KNN. The experimental results show that the proposed algorithm has better performance in terms of precision, recall, AUC, and F-measure.

  7. Can agent based models effectively reduce fisheries management implementation uncertainty?

    Science.gov (United States)

    Drexler, M.

    2016-02-01

    Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single-species and ecosystem-based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent-based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro-scale behaviors of the US West Coast Groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent-based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations that properly consider implementation uncertainty in fisheries management.
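    The vessel-level decision making described above can be caricatured in a few lines. This is a toy illustration only: the prices, costs, catch rates, and quota rule are all invented, and the study's actual model is far richer:

    ```python
    class Vessel:
        """One fishing agent with its own cost structure."""
        def __init__(self, daily_cost, catch_rate):
            self.daily_cost = daily_cost    # cost of a day at sea
            self.catch_rate = catch_rate    # tonnes landed per day at sea
            self.profit = 0.0

    def simulate(fleet, days, price, quota):
        """Each day a vessel fishes only if its expected revenue covers
        its cost and fleet-wide quota remains; returns total landings."""
        landed = 0.0
        for _ in range(days):
            for v in fleet:
                if landed >= quota:
                    break  # quota exhausted: a catch restriction binds
                if price * v.catch_rate > v.daily_cost:
                    take = min(v.catch_rate, quota - landed)
                    landed += take
                    v.profit += price * take - v.daily_cost
        return landed
    ```

    Even this toy reproduces the qualitative point of the abstract: heterogeneous costs make some vessels stay in port at a given price, and a quota caps the aggregate outcome without every agent behaving identically.
    
    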

  8. Gd-HOPO Based High Relaxivity MRI Contrast Agents

    Energy Technology Data Exchange (ETDEWEB)

    Datta, Ankona; Raymond, Kenneth

    2008-11-06

    Tris-bidentate HOPO-based ligands developed in our laboratory were designed to complement the coordination preferences of Gd³⁺, especially its oxophilicity. The HOPO ligands provide a hexadentate coordination environment for Gd³⁺ in which all the donor atoms are oxygen. Because Gd³⁺ favors eight or nine coordination, this design provides two to three open sites for inner-sphere water molecules. These water molecules rapidly exchange with bulk solution, hence affecting the relaxation rates of bulk water molecules. The parameters affecting the efficiency of these contrast agents have been tuned to improve contrast while still maintaining a high thermodynamic stability for Gd³⁺ binding. The Gd-HOPO-based contrast agents surpass current commercially available agents because of a higher number of inner-sphere water molecules, rapid exchange of inner-sphere water molecules via an associative mechanism, and a long electronic relaxation time. The contrast enhancement provided by these agents is at least twice that of commercial contrast agents, which are based on polyaminocarboxylate ligands.

  9. Comparison Effectiveness of Pixel Based Classification and Object Based Classification Using High Resolution Image In Floristic Composition Mapping (Study Case: Gunung Tidar Magelang City)

    Science.gov (United States)

    Ardha Aryaguna, Prama; Danoedoro, Projo

    2016-11-01

    Developments in remote sensing analysis have kept pace with developments in technology, especially in sensors and platforms. Many images now offer high spatial and radiometric resolution, and therefore much more information. Vegetation analysis, such as floristic composition mapping, benefits greatly from these developments. Floristic composition can be interpreted with several methods, such as pixel-based classification and object-based classification. A problem for pixel-based methods on high-spatial-resolution imagery is the salt-and-pepper noise that appears in the classification result. The purpose of this research is to compare the effectiveness of pixel-based classification and object-based classification for vegetation composition mapping on the high-resolution WorldView-2 image. The results show that pixel-based classification with a 5×5 majority-filter kernel gives the highest accuracy among the classifications tested, 73.32%, on WorldView-2 imagery radiometrically corrected to surface reflectance; for per-class accuracy, however, object-based classification is the best among the methods. In terms of effectiveness, pixel-based classification is more effective than object-based classification for vegetation composition mapping in the Tidar forest.

  10. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    Science.gov (United States)

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
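    The core step shared by the SRC variants above is coding a test vector over a dictionary of training samples and comparing class-wise reconstruction residuals. The sketch below substitutes ridge-regularized (collaborative) coding for true l1 sparse coding, on invented toy data - it is not the paper's adaptive dictionary-update scheme:

    ```python
    import numpy as np

    def src_classify(D, labels, y, lam=0.01):
        """Code test vector y over dictionary D (columns are training
        samples), then return the class whose atoms reconstruct y with
        the smallest residual."""
        A = D.T @ D + lam * np.eye(D.shape[1])
        x = np.linalg.solve(A, D.T @ y)   # ridge coding (l2, not l1)
        classes = np.unique(labels)
        residuals = [np.linalg.norm(y - D[:, labels == c] @ x[labels == c])
                     for c in classes]
        return classes[int(np.argmin(residuals))]
    ```

    The adaptive schemes in the paper additionally update the dictionary columns with new test data; the residual-comparison decision rule itself stays as above.
    
    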

  11. Agent-Based Coordination Model for Designing Transportation Applications

    OpenAIRE

    BADEIG, F; BALBO, F; SCEMAMA, G; ZARGAYOUNA, M

    2008-01-01

    This paper presents an environment-centered approach to design multi-agent solutions to transportation problems. Based on the Property-based Coordination Principle (PbC), the objective of our approach is to solve three recurrent issues in the design of these solutions: the knowledge problem, the space-time dimension and the dynamics of the real environment. To demonstrate the benefits of our approach, two completely different applications, a demand-responsive transportation system and a simul...

  12. Radiological classification of renal angiomyolipomas based on 127 tumors

    Directory of Open Access Journals (Sweden)

    Prando Adilson

    2003-01-01

    Full Text Available PURPOSE: To demonstrate the radiological findings of 127 angiomyolipomas (AMLs) and propose a classification based on the radiological evidence of fat. MATERIALS AND METHODS: The imaging findings of 85 consecutive patients with AMLs - isolated (n = 73), multiple without tuberous sclerosis (TS) (n = 4) and multiple with TS (n = 8) - were retrospectively reviewed. Eighteen AMLs (14%) presented with hemorrhage. All patients underwent a dedicated helical CT or magnetic resonance study. All hemorrhagic and non-hemorrhagic lesions were grouped together, since our objective was to analyze the presence of detectable fat. Of the 85 patients, 53 were monitored and 32 were treated surgically, due to a large perirenal component (n = 13), hemorrhage (n = 11) or the impossibility of an adequate preoperative characterization (n = 8). There was no case of renal cell carcinoma (RCC) with a fat component in this group of patients. RESULTS: Based on the presence and amount of detectable fat within the lesion, AMLs were classified into 4 distinct radiological patterns: Pattern-I, predominantly fatty (usually less than 2 cm in diameter and intrarenal): 54%; Pattern-II, partially fatty (intrarenal or exophytic): 29%; Pattern-III, minimally fatty (mostly exophytic and perirenal): 11%; and Pattern-IV, without fat (mostly exophytic and perirenal): 6%. CONCLUSIONS: This proposed classification might be useful to understand the imaging manifestations of AMLs and their differential diagnosis, and to determine when further radiological evaluation is necessary. Small (< 1.5 cm, pattern-I) AMLs tend to be intrarenal, homogeneous and predominantly fatty. As they grow, they tend to become partially or completely exophytic and heterogeneous (patterns II and III). The rare pattern-IV AMLs, however, can be small or large, intrarenal or exophytic, but are always a homogeneous, hyperdense mass.
Since no renal cell carcinoma was found in our series, from an evidence-based practice, all renal mass with detectable

  13. Empirical agent-based modelling challenges and solutions

    CERN Document Server

    Barreteau, Olivier

    2014-01-01

    This instructional book showcases techniques to parameterise human agents in empirical agent-based models (ABM). In doing so, it provides a timely overview of key ABM methodologies and the most innovative approaches through a variety of empirical applications.  It features cutting-edge research from leading academics and practitioners, and will provide a guide for characterising and parameterising human agents in empirical ABM.  In order to facilitate learning, this text shares the valuable experiences of other modellers in particular modelling situations. Very little has been published in the area of empirical ABM, and this contributed volume will appeal to graduate-level students and researchers studying simulation modeling in economics, sociology, ecology, and trans-disciplinary studies, such as topics related to sustainability. In a similar vein to the instruction found in a cookbook, this text provides the empirical modeller with a set of 'recipes'  ready to be implemented. Agent-based modeling (AB...

  14. Deep neural network and noise classification-based speech enhancement

    Science.gov (United States)

    Shi, Wenhua; Zhang, Xiongwei; Zou, Xia; Han, Wei

    2017-07-01

    In this paper, a speech enhancement method using noise classification and Deep Neural Network (DNN) was proposed. Gaussian mixture model (GMM) was employed to determine the noise type in speech-absent frames. DNN was used to model the relationship between noisy observation and clean speech. Once the noise type was determined, the corresponding DNN model was applied to enhance the noisy speech. GMM was trained with mel-frequency cepstrum coefficients (MFCC) and the parameters were estimated with an iterative expectation-maximization (EM) algorithm. Noise type was updated by spectrum entropy-based voice activity detection (VAD). Experimental results demonstrate that the proposed method could achieve better objective speech quality and smaller distortion under stationary and non-stationary conditions.
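    The spectrum-entropy VAD mentioned above rests on a simple observation: a flat, noise-like spectrum has high normalized entropy, while a peaky voiced spectrum has a low one. A minimal frame-level sketch (the 0.8 threshold is an invented placeholder, not a value from the paper):

    ```python
    import numpy as np

    def spectral_entropy(frame):
        """Normalized entropy of a frame's power spectrum, in [0, 1]."""
        p = np.abs(np.fft.rfft(frame)) ** 2
        p = p / p.sum()
        h = -np.sum(p * np.log(p + 1e-12))
        return h / np.log(len(p))

    def speech_absent(frame, threshold=0.8):
        """Flag noise-only frames; in the pipeline above, these are the
        frames on which the GMM re-identifies the noise type."""
        return spectral_entropy(frame) > threshold
    ```

    On speech-absent frames identified this way, the pipeline would run the MFCC-GMM noise classifier and switch to the matching noise-specific DNN enhancement model.
    
    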

  15. A robust probabilistic collaborative representation based classification for multimodal biometrics

    Science.gov (United States)

    Zhang, Jing; Liu, Huanxi; Ding, Derui; Xiao, Jianli

    2018-04-01

    Most traditional biometric recognition systems perform recognition with a single biometric indicator. These systems suffer from noisy data, interclass variations, unacceptable error rates, forged identities, and so on. Because of these inherent problems, it is not feasible to substantially enhance the performance of unimodal biometric systems using single features. Thus, multimodal biometrics is investigated to reduce some of these defects. This paper proposes a new multimodal biometric recognition approach that fuses faces and fingerprints. To obtain more recognizable features, the proposed method extracts block local binary pattern features for all modalities and then combines them into a single framework. For better classification, it employs the robust probabilistic collaborative representation based classifier to recognize individuals. Experimental results indicate that the proposed method improves recognition accuracy compared to unimodal biometrics.

  16. Machine Learning Based Localization and Classification with Atomic Magnetometers

    Science.gov (United States)

    Deans, Cameron; Griffin, Lewis D.; Marmugi, Luca; Renzoni, Ferruccio

    2018-01-01

    We demonstrate identification of position, material, orientation, and shape of objects imaged by a Rb 85 atomic magnetometer performing electromagnetic induction imaging supported by machine learning. Machine learning maximizes the information extracted from the images created by the magnetometer, demonstrating the use of hidden data. Localization 2.6 times better than the spatial resolution of the imaging system and successful classification up to 97% are obtained. This circumvents the need of solving the inverse problem and demonstrates the extension of machine learning to diffusive systems, such as low-frequency electrodynamics in media. Automated collection of task-relevant information from quantum-based electromagnetic imaging will have a relevant impact from biomedicine to security.

  17. Fines Classification Based on Sensitivity to Pore-Fluid Chemistry

    KAUST Repository

    Jang, Junbong

    2015-12-28

    The 75-μm particle size is used to discriminate between fine and coarse grains. Further analysis of fine grains is typically based on the plasticity chart. Whereas pore-fluid-chemistry-dependent soil response is a salient and distinguishing characteristic of fine grains, pore-fluid chemistry is not addressed in current classification systems. Liquid limits obtained with electrically contrasting pore fluids (deionized water, 2-M NaCl brine, and kerosene) are combined to define the soil "electrical sensitivity." Liquid limit and electrical sensitivity can be effectively used to classify fine grains according to their fluid-soil response into no-, low-, intermediate-, or high-plasticity fine grains of low, intermediate, or high electrical sensitivity. The proposed methodology benefits from the accumulated experience with liquid limit in the field and addresses the needs of a broader range of geotechnical engineering problems. © ASCE.

  18. DNA methylation-based classification of central nervous system tumours

    DEFF Research Database (Denmark)

    Capper, David; Jones, David T.W.; Sill, Martin

    2018-01-01

    Accurate pathological diagnosis is crucial for optimal management of patients with cancer. For the approximately 100 known tumour types of the central nervous system, standardization of the diagnostic process has been shown to be particularly challenging - with substantial inter-observer variability in the histopathological diagnosis of many tumour types. Here we present a comprehensive approach for the DNA methylation-based classification of central nervous system tumours across all entities and age groups, and demonstrate its application in a routine diagnostic setting. We show...

  19. Fines classification based on sensitivity to pore-fluid chemistry

    Science.gov (United States)

    Jang, Junbong; Santamarina, J. Carlos

    2016-01-01

    The 75-μm particle size is used to discriminate between fine and coarse grains. Further analysis of fine grains is typically based on the plasticity chart. Whereas pore-fluid-chemistry-dependent soil response is a salient and distinguishing characteristic of fine grains, pore-fluid chemistry is not addressed in current classification systems. Liquid limits obtained with electrically contrasting pore fluids (deionized water, 2-M NaCl brine, and kerosene) are combined to define the soil “electrical sensitivity.” Liquid limit and electrical sensitivity can be effectively used to classify fine grains according to their fluid-soil response into no-, low-, intermediate-, or high-plasticity fine grains of low, intermediate, or high electrical sensitivity. The proposed methodology benefits from the accumulated experience with liquid limit in the field and addresses the needs of a broader range of geotechnical engineering problems.

  20. New classification system-based visual outcome in Eales' disease

    Directory of Open Access Journals (Sweden)

    Saxena Sandeep

    2007-01-01

    Full Text Available Purpose: A retrospective tertiary care center-based study was undertaken to evaluate the visual outcome in Eales' disease, based on a new classification system, for the first time. Materials and Methods: One hundred and fifty-nine consecutive cases of Eales' disease were included. All the eyes were staged according to the new classification: Stage 1: periphlebitis of small (1a) and large (1b) caliber vessels with superficial retinal hemorrhages; Stage 2a: capillary non-perfusion, 2b: neovascularization elsewhere/of the disc; Stage 3a: fibrovascular proliferation, 3b: vitreous hemorrhage; Stage 4a: traction/combined rhegmatogenous retinal detachment and 4b: rubeosis iridis, neovascular glaucoma, complicated cataract and optic atrophy. Visual acuity was graded as: Grade I, 20/20 or better; Grade II, 20/30 to 20/40; Grade III, 20/60 to 20/120; and Grade IV, 20/200 or worse. All the cases were managed by medical therapy, photocoagulation and/or vitreoretinal surgery. Visual acuity was converted into a decimal scale, denoting 20/20 = 1 and 20/800 = 0.01. Paired t-tests / Wilcoxon signed-rank tests were used for statistical analysis. Results: Vitreous hemorrhage was the commonest presenting feature (49.32%). Cases with Stages 1 to 3, 4a and 4b achieved final visual acuity ranging from 20/15 to 20/40, 20/80 to 20/400 and 20/200 to 20/400, respectively. Statistically significant improvement in visual acuity was observed in all stages of the disease except Stages 1a and 4b. Conclusion: Significant improvement in visual acuity was observed in the majority of stages of Eales' disease following treatment. This study adds to the limited available evidence of treatment effects in the literature and may have an effect on patient care and health policy in Eales' disease.

  1. Quality-Oriented Classification of Aircraft Material Based on SVM

    Directory of Open Access Journals (Sweden)

    Hongxia Cai

    2014-01-01

    Full Text Available Existing material classifications are proposed to improve inventory management. However, different materials have different quality-related attributes, especially in the aircraft industry. In order to reduce cost without sacrificing quality, we propose a quality-oriented material classification system considering material quality character, quality cost, and quality influence. The Analytic Hierarchy Process helps to make the feature selection and classification decisions. We use the improved Kraljic Portfolio Matrix to establish a three-dimensional classification model. The aircraft materials can be divided into eight types, including general, key, risk, and leveraged types. To improve the classification accuracy for various materials, the Support Vector Machine algorithm is introduced. Finally, we compare the SVM and a BP neural network in the application. The results show that the SVM algorithm is more efficient and accurate, and that the quality-oriented material classification is valuable.
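    A linear SVM of the kind used for the final classification step can be trained with the Pegasos sub-gradient method in a few lines. This is a generic sketch on synthetic 2-D data, not the paper's three-dimensional material model:

    ```python
    import numpy as np

    def pegasos_svm(X, y, lam=0.01, epochs=200, seed=0):
        """Train a linear SVM (hinge loss, L2 regularization) with the
        Pegasos sub-gradient method; labels y must be in {-1, +1}."""
        rng = np.random.default_rng(seed)
        w = np.zeros(X.shape[1])
        b = 0.0
        t = 0
        for _ in range(epochs):
            for i in rng.permutation(len(X)):
                t += 1
                eta = 1.0 / (lam * t)   # decreasing step size
                if y[i] * (X[i] @ w + b) < 1:   # margin violated
                    w = (1 - eta * lam) * w + eta * y[i] * X[i]
                    b += eta * y[i]
                else:
                    w = (1 - eta * lam) * w
        return w, b

    def svm_predict(w, b, X):
        return np.sign(X @ w + b)
    ```

    A multi-class material classifier would wrap this in a one-vs-rest loop, one binary machine per material type.
    
    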

  2. GA Based Optimal Feature Extraction Method for Functional Data Classification

    OpenAIRE

    Jun Wan; Zehua Chen; Yingwu Chen; Zhidong Bai

    2010-01-01

    Classification is an interesting problem in functional data analysis (FDA), because many science and application problems end up as classification problems, such as recognition, prediction, control, decision making, management, etc. Owing to the high dimension and high correlation of functional data (FD), a key problem is to extract features from FD while keeping its global characteristics, which strongly affects classification efficiency and precision. In this paper...

  3. Classification of arterial and venous cerebral vasculature based on wavelet postprocessing of CT perfusion data.

    Science.gov (United States)

    Havla, Lukas; Schneider, Moritz J; Thierfelder, Kolja M; Beyer, Sebastian E; Ertl-Wagner, Birgit; Reiser, Maximilian F; Sommer, Wieland H; Dietrich, Olaf

    2016-02-01

    The purpose of this study was to propose and evaluate a new wavelet-based technique for the classification of arterial and venous vessels using time-resolved cerebral CT perfusion data sets. Fourteen consecutive patients (mean age 73 yr, range 17-97) with suspected stroke but no pathology in follow-up MRI were included. A CT perfusion scan with 32 dynamic phases was performed during intravenous bolus contrast-agent application. After rigid-body motion correction, a Paul wavelet (order 1) was used to calculate voxelwise the wavelet power spectrum (WPS) of each attenuation-time course. The angiographic intensity A was defined as the maximum of the WPS, located at the coordinates T (time axis) and W (scale/width axis) within the WPS. Using these three parameters (A, T, W) separately as well as combined by (1) Fisher's linear discriminant analysis (FLDA), (2) logistic regression (LogR) analysis, or (3) support vector machine (SVM) analysis, their potential to classify 18 different arterial and venous vessel segments per subject was evaluated. The best vessel classification was obtained using all three parameters A, T, and W [area under the curve (AUC): 0.953 with FLDA and 0.957 with LogR or SVM]. In direct comparison, the wavelet-derived parameters provided performance at least equal to conventional attenuation-time-course parameters. The maximum AUC obtained from the proposed wavelet parameters was slightly (although not statistically significantly) higher than the maximum AUC (0.945) obtained from the conventional parameters. A new method to classify arterial and venous cerebral vessels with high statistical accuracy was introduced, based on the time-domain wavelet transform of dynamic CT perfusion data in combination with linear or nonlinear multidimensional classification techniques.
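
    The (A, T, W) feature idea can be sketched as follows. This is a toy under stated assumptions: synthetic bolus curves stand in for attenuation-time courses, a real Morlet-style kernel stands in for the Paul wavelet, and Fisher's LDA separates the two vessel classes.

    ```python
    # Hedged sketch: locate the maximum of a crude wavelet power spectrum to
    # get amplitude A, time T, and scale W per curve, then classify with LDA.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    t = np.arange(32)  # 32 dynamic phases, as in the study

    def bolus(peak_time, width):
        """Synthetic contrast enhancement curve with a little noise."""
        return (np.exp(-0.5 * ((t - peak_time) / width) ** 2)
                + 0.05 * rng.standard_normal(32))

    def wavelet_features(signal):
        """Return [A, T, W] from a simple Morlet-style power spectrum."""
        scales = np.arange(1, 9)
        power = np.empty((len(scales), len(t)))
        for i, s in enumerate(scales):
            u = (t - t.mean()) / s
            kernel = np.cos(1.75 * u) * np.exp(-0.5 * u ** 2)
            power[i] = np.convolve(signal, kernel, mode="same") ** 2
        w_idx, t_idx = np.unravel_index(np.argmax(power), power.shape)
        return [power[w_idx, t_idx], t_idx, scales[w_idx]]

    # Assumed contrast dynamics: arteries enhance early and narrowly,
    # veins later and more broadly.
    X = np.array([wavelet_features(bolus(8, 2)) for _ in range(30)]
                 + [wavelet_features(bolus(18, 5)) for _ in range(30)])
    y = np.array([0] * 30 + [1] * 30)

    lda = LinearDiscriminantAnalysis().fit(X, y)
    print("training accuracy:", lda.score(X, y))
    ```

    With well-separated bolus timings the three features separate the classes almost perfectly; the real paper's curves and Paul wavelet differ in detail.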

  4. Calibrating emergent phenomena in stock markets with agent based models.

    Science.gov (United States)

    Fievet, Lucas; Sornette, Didier

    2018-01-01

    Since the 2008 financial crisis, agent-based models (ABMs), which account for out-of-equilibrium dynamics, heterogeneous preferences, time horizons and strategies, have often been envisioned as the new frontier that could revolutionise and displace the more standard models and tools in economics. However, their adoption and generalisation is drastically hindered by the absence of general reliable operational calibration methods. Here, we start with a different calibration angle that qualifies an ABM for its ability to achieve abnormal trading performance with respect to the buy-and-hold strategy when fed with real financial data. Starting from the common definition of standard minority and majority agents with binary strategies, we prove their equivalence to optimal decision trees. This efficient representation allows us to exhaustively test all meaningful single-agent models for their potential anomalous investment performance, which we apply to the NASDAQ Composite index over the last 20 years. We uncover large significant predictive power, with anomalous Sharpe ratio and directional accuracy, in particular during the dotcom bubble and crash and the 2008 financial crisis. A principal component analysis reveals transient convergence between the anomalous minority and majority models. A novel combination of the optimal single-agent models of both classes into a two-agent model leads to remarkably superior investment performance, especially during periods of bubbles and crashes. Our design opens the field of ABMs to the construction of novel types of advanced warning systems for market crises, based on the emergent collective intelligence of ABMs built on carefully designed optimal decision trees that can be reverse engineered from real financial data.

  5. Calibrating emergent phenomena in stock markets with agent based models

    Science.gov (United States)

    Sornette, Didier

    2018-01-01

    Since the 2008 financial crisis, agent-based models (ABMs), which account for out-of-equilibrium dynamics, heterogeneous preferences, time horizons and strategies, have often been envisioned as the new frontier that could revolutionise and displace the more standard models and tools in economics. However, their adoption and generalisation is drastically hindered by the absence of general reliable operational calibration methods. Here, we start with a different calibration angle that qualifies an ABM for its ability to achieve abnormal trading performance with respect to the buy-and-hold strategy when fed with real financial data. Starting from the common definition of standard minority and majority agents with binary strategies, we prove their equivalence to optimal decision trees. This efficient representation allows us to exhaustively test all meaningful single-agent models for their potential anomalous investment performance, which we apply to the NASDAQ Composite index over the last 20 years. We uncover large significant predictive power, with anomalous Sharpe ratio and directional accuracy, in particular during the dotcom bubble and crash and the 2008 financial crisis. A principal component analysis reveals transient convergence between the anomalous minority and majority models. A novel combination of the optimal single-agent models of both classes into a two-agent model leads to remarkably superior investment performance, especially during periods of bubbles and crashes. Our design opens the field of ABMs to the construction of novel types of advanced warning systems for market crises, based on the emergent collective intelligence of ABMs built on carefully designed optimal decision trees that can be reverse engineered from real financial data. PMID:29499049

  6. A Framework for Agent-based Human Interaction Support

    Directory of Open Access Journals (Sweden)

    Axel Bürkle

    2008-10-01

    Full Text Available In this paper we describe an agent-based infrastructure for multimodal perceptual systems which aims at developing and realizing computer services that are delivered to humans in an implicit and unobtrusive way. The framework presented here supports the implementation of human-centric context-aware applications providing non-obtrusive assistance to participants in events such as meetings, lectures, conferences and presentations taking place in indoor "smart spaces". We emphasize the design and implementation of an agent-based framework that supports "pluggable" service logic in the sense that the service developer can concentrate on coding the service logic independently of the underlying middleware. Furthermore, we give an example of the architecture's ability to support the cooperation of multiple services in a meeting scenario using an intelligent connector service and a semantic web oriented travel service.

  7. Agent-based distributed hierarchical control of dc microgrid systems

    DEFF Research Database (Denmark)

    Meng, Lexuan; Vasquez, Juan Carlos; Guerrero, Josep M.

    2014-01-01

    In order to enable distributed control and management for microgrids, this paper explores the application of information consensus and local decision-making methods, formulating an agent-based distributed hierarchical control system. A droop-controlled paralleled DC/DC converter system is taken as a case study. The objective is to enhance the system efficiency by finding the optimal sharing ratio of load current. Virtual resistances in local control systems are taken as decision variables. Consensus algorithms are applied for global information discovery and local control system coordination. A standard genetic algorithm is applied in each local control system in order to search for a global optimum. Hardware-in-the-loop simulation results are shown to demonstrate the effectiveness of the method.
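
    The genetic-algorithm step can be sketched in a toy form: search for the load-current sharing ratio between two converters that minimizes total conduction loss. The quadratic i²R loss model, the resistance values, and all GA settings below are assumptions for illustration, not the paper's converter model.

    ```python
    # Hedged sketch: a tiny genetic algorithm searching the load-sharing
    # ratio of a two-converter system for minimum conduction loss.
    import random

    random.seed(5)
    I_LOAD = 10.0        # total load current (A), assumed
    R1, R2 = 0.2, 0.1    # converter output-path resistances (ohm), assumed

    def loss(alpha):
        """Total i^2*R loss when converter 1 carries fraction alpha."""
        i1, i2 = alpha * I_LOAD, (1 - alpha) * I_LOAD
        return i1 ** 2 * R1 + i2 ** 2 * R2

    # Population of candidate sharing ratios in [0, 1].
    pop = [random.random() for _ in range(20)]
    for _ in range(50):
        pop.sort(key=loss)              # rank by fitness (lower loss better)
        parents = pop[:10]              # elitist selection
        children = []
        while len(children) < 10:
            a, b = random.sample(parents, 2)
            # Averaging crossover plus small Gaussian mutation, clipped.
            child = min(max((a + b) / 2 + random.gauss(0, 0.02), 0.0), 1.0)
            children.append(child)
        pop = parents + children

    best = min(pop, key=loss)
    print("best sharing ratio for converter 1:", round(best, 3))
    ```

    For this loss model the analytic optimum is R2/(R1+R2) = 1/3, which the GA should approach; the paper's actual objective is system efficiency over droop-controlled converters.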

  8. Engineering large-scale agent-based systems with consensus

    Science.gov (United States)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  9. DAIDS: a Distributed, Agent-based Information Dissemination System

    Directory of Open Access Journals (Sweden)

    Pete Haglich

    2007-10-01

    Full Text Available The Distributed Agent-Based Information Dissemination System (DAIDS) concept was motivated by the need to share information among the members of a military tactical team in an atmosphere of extremely limited or intermittent bandwidth. The DAIDS approach recognizes that in many cases communications limitations will preclude the complete sharing of all tactical information between the members of the tactical team. Communications may be limited by obstructions to the line of sight between platforms, electronic warfare, environmental conditions, or simply contention from other users of that bandwidth. Since it may not be possible to achieve a complete information exchange, it is important to prioritize transmissions so that the most critical information from the standpoint of the recipient is disseminated first. The challenge is to be able to determine which elements of information are the most important to each teammate. The key innovation of the DAIDS concept is the use of software proxy agents to represent the information needs of the recipient of the information. The DAIDS approach uses these proxy agents to evaluate the content of a message in accordance with the context and information needs of the recipient platform (the agent's principal) and to prioritize the message for dissemination. In our research we implemented this approach and demonstrated that it reduces transmission times for critical tactical reports by up to a factor of 30 under severe bandwidth limitations.

  10. Advanced Contrast Agents for Multimodal Biomedical Imaging Based on Nanotechnology.

    Science.gov (United States)

    Calle, Daniel; Ballesteros, Paloma; Cerdán, Sebastián

    2018-01-01

    Clinical imaging modalities have reached a prominent role in medical diagnosis and patient management in recent decades. Different imaging methodologies such as Positron Emission Tomography, Single Photon Emission Tomography, X-Rays, or Magnetic Resonance Imaging are in continuous evolution to satisfy the increasing demands of current medical diagnosis. Progress in these methodologies has been favored by the parallel development of increasingly more powerful contrast agents. These are molecules that enhance the intrinsic contrast of the images in the tissues where they accumulate, revealing noninvasively the presence of characteristic molecular targets or differential physiopathological microenvironments. The contrast agent field is currently moving to improve the performance of these molecules by incorporating the advantages that modern nanotechnology offers. These include, mainly, the possibility of combining imaging and therapeutic capabilities on the same theranostic platform or improving the targeting efficiency in vivo by molecular engineering of the nanostructures. In this review, we provide an introduction to multimodal imaging methods in biomedicine, the sub-nanometric imaging agents previously used, and the development of advanced multimodal and theranostic imaging agents based on nanotechnology. We conclude by providing some illustrative examples from our own laboratories, including recent progress in theranostic formulations of magnetoliposomes containing ω-3 poly-unsaturated fatty acids to treat inflammatory diseases, or the use of stealth liposomes engineered with a pH-sensitive nanovalve to release their cargo specifically in the acidic extracellular pH microenvironment of tumors.

  11. The generalization ability of online SVM classification based on Markov sampling.

    Science.gov (United States)

    Xu, Jie; Yan Tang, Yuan; Zou, Bin; Xu, Zongben; Li, Luoqing; Lu, Yang

    2015-03-01

    In this paper, we consider online support vector machine (SVM) classification learning algorithms with uniformly ergodic Markov chain (u.e.M.c.) samples. We establish a bound on the misclassification error of an online SVM classification algorithm with u.e.M.c. samples based on reproducing kernel Hilbert spaces and obtain a satisfactory convergence rate. We also introduce a novel online SVM classification algorithm based on Markov sampling, and present numerical studies on the learning ability of online SVM classification based on Markov sampling for benchmark repositories. The numerical studies show that the learning performance of the online SVM classification algorithm based on Markov sampling is better than that of classical online SVM classification based on random sampling when the size of the training sample set is larger.
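
    The general idea of online SVM-style learning from Markov-chain samples can be sketched with scikit-learn's stochastic hinge-loss classifier. The two-state label chain and Gaussian features below are illustrative assumptions, not the paper's algorithm or data.

    ```python
    # Hedged sketch: a linear classifier trained online with hinge loss,
    # fed examples whose labels follow a two-state Markov chain rather
    # than being drawn i.i.d.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(1)
    means = {0: np.array([-2.0, 0.0]), 1: np.array([2.0, 0.0])}

    def markov_stream(n, stay=0.8):
        """Yield (x, y) pairs whose labels follow a 2-state Markov chain."""
        y = 0
        for _ in range(n):
            if rng.random() > stay:   # switch class with probability 1-stay
                y = 1 - y
            yield means[y] + rng.standard_normal(2), y

    clf = SGDClassifier(loss="hinge", alpha=1e-3, random_state=0)
    for x, y in markov_stream(2000):
        clf.partial_fit(x.reshape(1, -1), [y], classes=[0, 1])

    # Evaluate on a fresh i.i.d. test set.
    X_test = np.vstack([means[c] + rng.standard_normal((100, 2))
                        for c in (0, 1)])
    y_test = np.array([0] * 100 + [1] * 100)
    print("test accuracy:", clf.score(X_test, y_test))
    ```

    Despite the correlated stream, the classifier converges to a good separator here; the paper's contribution is the theoretical error bound and the specific Markov sampling scheme.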

  12. Simulating classroom lessons : an agent-based attempt

    OpenAIRE

    Ingram, Fred; Brooks, Roger John

    2018-01-01

    This is an interim report on a project to construct an agent-based simulation that reproduces some of the interactions between students and their teacher in classroom lessons. In a pilot study, the activities of 67 students and 7 teachers during 40 lessons were recorded using a data collection instrument that currently captures 17 student states and 15 teacher states. These data enabled various conceptual models to be explored, providing empirical values and distributions for the model parame...

  13. AGENT-BASED NEGOTIATION PLATFORM IN COLLABORATIVE NETWORKED ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Adina-Georgeta CREȚAN

    2014-05-01

    Full Text Available This paper proposes an agent-based platform to model and support parallel and concurrent negotiations among organizations acting in the same industrial market. The underlying complexity is to model the dynamic environment in which multi-attribute and multi-participant negotiations are racing over a set of heterogeneous resources. The metaphor of Interaction Abstract Machines (IAMs) is used to model the parallelism and the non-deterministic aspects of the negotiation processes that occur in a Collaborative Networked Environment.

  14. Financial Regulation in an Agent Based Macroeconomic Model

    OpenAIRE

    Riccetti, Luca; Russo, Alberto; Mauro, Gallegati

    2013-01-01

    Starting from the agent-based decentralized matching macroeconomic model proposed in Riccetti et al. (2012), we explore the effects of banking regulation on macroeconomic dynamics. In particular, we study the overall credit exposure and the lending concentration towards a single counterparty, finding that the portfolio composition seems to be more relevant than the overall exposure for banking stability, even if both features are very important. We show that too tight a regulation is dangerous...

  15. SETH: A Hierarchical, Agent-based Architecture for Smart Spaces

    OpenAIRE

    Marsá Maestre, Iván

    2008-01-01

    The ultimate goal of any smart environment is to release users from the tasks they usually perform to achieve comfort, efficiency, and service personalization. To achieve this goal, we propose to use multiagent systems. In this report we describe the SETH architecture: a hierarchical, agent-based solution intended to be applicable to different smart space scenarios, ranging from small environments, like smart homes or smart offices, to large smart spaces like cities.

  16. Macroprudential policies in an agent-based artificial economy

    OpenAIRE

    Raberto, Marco; Teglio, Andrea; Cincotti, Silvano

    2012-01-01

    Basel III is a recently-agreed regulatory standard for bank capital adequacy with a focus on the macroprudential dimension of banking regulation, i.e., the system-wide implications of banks’ lending and risk. An important Basel III provision is to reduce the procyclicality of present banking regulation and promote countercyclical capital buffers for banks. The Eurace agent-based macroeconomic model and simulator has recently been shown to be able to reproduce a credit-fueled boom-bust dynamics ...

  17. An Agent Based Approach for Project Management in Construction

    OpenAIRE

    Sencer, Safiye; Turgay, Tahsin

    2013-01-01

    Project management has an important role in terms of time, cost, and flexibility. An agent-based architecture provides additional robustness, scalability, and flexibility, which is particularly appropriate for problems with a dynamic and distributed nature. Integrated agent-based project management covers design and construction planning. It is combined with plan execution, tolerating changes to both the design and the plan as necessary. For this reason, the decision-making process requires tha...

  18. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causality relationships and feedback loops from different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  19. Classification of types of stuttering symptoms based on brain activity.

    Directory of Open Access Journals (Sweden)

    Jing Jiang

    Full Text Available Among the non-fluencies seen in speech, some are more typical (MT) of stuttering speakers, whereas others are less typical (LT) and are common to both stuttering and fluent speakers. No neuroimaging work has evaluated the neural basis for grouping these symptom types. Another long-debated issue is which type (LT, MT) whole-word repetitions (WWR) should be placed in. In this study, a sentence completion task was performed by twenty stuttering patients who were scanned using an event-related design. This task elicited stuttering in these patients. Each stuttered trial from each patient was sorted into the MT or LT types with WWR put aside. Pattern classification was employed to train a patient-specific single trial model to automatically classify each trial as MT or LT using the corresponding fMRI data. This model was then validated by using test data that were independent of the training data. In a subsequent analysis, the classification model, just established, was used to determine which type the WWR should be placed in. The results showed that the LT and the MT could be separated with high accuracy based on their brain activity. The brain regions that made the most contribution to the separation of the types were: the left inferior frontal cortex and bilateral precuneus, both of which showed higher activity in the MT than in the LT; and the left putamen and right cerebellum, which showed the opposite activity pattern. The results also showed that the brain activity for WWR was more similar to that of the LT and fluent speech than to that of the MT. These findings provide a neurological basis for separating the MT and the LT types, and support the widely-used MT/LT symptom grouping scheme. In addition, WWR play a similar role to the LT, and thus should be placed in the LT type.

  20. Sequence-based classification using discriminatory motif feature selection.

    Directory of Open Access Journals (Sweden)

    Hao Xiong

    Full Text Available Most existing methods for sequence-based classification use exhaustive feature generation, employing, for example, all k-mer patterns. The motivation behind such (enumerative) approaches is to minimize the potential for overlooking important features. However, there are shortcomings to this strategy. First, practical constraints limit the scope of exhaustive feature generation to patterns of length ≤ k, such that potentially important, longer (> k) predictors are not considered. Second, features so generated exhibit strong dependencies, which can complicate understanding of derived classification rules. Third, and most importantly, numerous irrelevant features are created. These concerns can compromise prediction and interpretation. While remedies have been proposed, they tend to be problem-specific and not broadly applicable. Here, we develop a generally applicable methodology, and an attendant software pipeline, that is predicated on discriminatory motif finding. In addition to the traditional training and validation partitions, our framework entails a third level of data partitioning, a discovery partition. A discriminatory motif finder is used on sequences and associated class labels in the discovery partition to yield a (small) set of features. These features are then used as inputs to a classifier in the training partition. Finally, performance assessment occurs on the validation partition. Important attributes of our approach are its modularity (any discriminatory motif finder and any classifier can be deployed) and its universality (all data, including sequences that are unaligned and/or of unequal length, can be accommodated). We illustrate our approach on two nucleosome occupancy datasets and a protein solubility dataset, previously analyzed using enumerative feature generation. Our method achieves excellent performance results, with and without optimization of classifier tuning parameters. A Python pipeline implementing the approach is
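
    The three-way partitioning scheme described above (discovery, training, validation) can be sketched on synthetic DNA sequences. Everything here is an assumption for illustration: a planted `GATTACA` motif, a naive frequency-contrast motif finder, and a trivial presence/absence classifier stand in for the pipeline's pluggable components.

    ```python
    # Hedged sketch of discovery/training/validation partitioning with a
    # simple discriminatory k-mer motif finder.
    import random
    from collections import Counter

    random.seed(4)
    MOTIF = "GATTACA"  # planted motif, an assumption for this toy

    def make_seq(has_motif):
        seq = "".join(random.choice("ACGT") for _ in range(60))
        if has_motif:
            i = random.randrange(0, 60 - len(MOTIF))
            seq = seq[:i] + MOTIF + seq[i + len(MOTIF):]
        return seq

    data = [(make_seq(c == 1), c) for c in ([1] * 60 + [0] * 60)]
    random.shuffle(data)
    discovery, training, validation = data[:40], data[40:80], data[80:]

    def kmer_counts(seqs, k=7):
        c = Counter()
        for s in seqs:
            c.update(s[i:i + k] for i in range(len(s) - k + 1))
        return c

    # Discovery partition: keep k-mers far more frequent in the positive class.
    pos = kmer_counts(s for s, c in discovery if c == 1)
    neg = kmer_counts(s for s, c in discovery if c == 0)
    motifs = [m for m, n in pos.most_common(5) if n > 3 * (neg[m] + 1)]

    # Training is trivial for this rule-based toy: predict positive
    # whenever any discovered motif is present.
    predict = lambda s: int(any(m in s for m in motifs))
    acc = sum(predict(s) == c for s, c in validation) / len(validation)
    print("discovered motifs:", motifs)
    print("validation accuracy:", acc)
    ```

    In the real pipeline, the motif finder and classifier are both swappable modules, which is exactly the modularity the abstract emphasizes.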

  1. User Classification in Crowdsourcing-Based Cooperative Spectrum Sensing

    Directory of Open Access Journals (Sweden)

    Linbo Zhai

    2017-07-01

    Full Text Available This paper studies cooperative spectrum sensing based on crowdsourcing in cognitive radio networks. Since intelligent mobile devices such as smartphones and tablets can sense the wireless spectrum, channel sensing tasks can be assigned to these mobile users. This is referred to as the crowdsourcing method. However, there may be some malicious mobile users that deliberately send false sensing reports for their own purposes. False sensing reports will influence decisions about the channel state. Therefore, it is necessary to classify mobile users in order to distinguish malicious users. According to the sensing reports, mobile users should not just be divided into two classes (honest and malicious). There are two reasons for this: on the one hand, honest users in different positions may have different sensing outcomes, as shadowing, multi-path fading, and other issues may influence the sensing results; on the other hand, there may be more than one type of malicious user, acting differently in the network. Therefore, it is necessary to classify mobile users into more than two classes. Due to the lack of prior information on the number of user classes, this paper casts the problem of mobile user classification as a dynamic clustering problem, which is NP-hard. The paper uses the interdistance-to-intradistance ratio of clusters as the fitness function, and aims to maximize the fitness function. To solve this optimization problem, this paper proposes a distributed algorithm for user classification in order to obtain bounded close-to-optimal solutions, and analyzes the approximation ratio of the proposed algorithm. Simulations show the distributed algorithm achieves higher performance than other algorithms.
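
    The interdistance-to-intradistance fitness idea can be sketched centrally (the paper's algorithm is distributed): score candidate numbers of user classes by the ratio of mean inter-centroid distance to mean point-to-centroid distance and keep the best. KMeans and the synthetic sensing-report vectors are assumptions for illustration.

    ```python
    # Hedged sketch: pick the number of user classes by maximizing an
    # inter-cluster-to-intra-cluster distance ratio.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    # Synthetic sensing reports: honest users report near 1.0, one
    # malicious group reports near 0.0 (assumed for this toy).
    reports = np.vstack([rng.normal(1.0, 0.05, (40, 5)),
                         rng.normal(0.0, 0.05, (20, 5))])

    def fitness(X, labels, centers):
        """Mean inter-centroid distance over mean point-to-centroid distance."""
        inter = np.mean([np.linalg.norm(a - b)
                         for i, a in enumerate(centers)
                         for b in centers[i + 1:]])
        intra = np.mean(np.linalg.norm(X - centers[labels], axis=1))
        return inter / intra

    best_k, best_score = None, -np.inf
    for k in range(2, 6):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(reports)
        score = fitness(reports, km.labels_, km.cluster_centers_)
        if score > best_score:
            best_k, best_score = k, score
    print("chosen number of user classes:", best_k)
    ```

    With the two well-separated synthetic groups the ratio peaks at two classes; with several differently-behaving malicious groups the same criterion would favor more.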

  2. Classification of Types of Stuttering Symptoms Based on Brain Activity

    Science.gov (United States)

    Jiang, Jing; Lu, Chunming; Peng, Danling; Zhu, Chaozhe; Howell, Peter

    2012-01-01

    Among the non-fluencies seen in speech, some are more typical (MT) of stuttering speakers, whereas others are less typical (LT) and are common to both stuttering and fluent speakers. No neuroimaging work has evaluated the neural basis for grouping these symptom types. Another long-debated issue is which type (LT, MT) whole-word repetitions (WWR) should be placed in. In this study, a sentence completion task was performed by twenty stuttering patients who were scanned using an event-related design. This task elicited stuttering in these patients. Each stuttered trial from each patient was sorted into the MT or LT types with WWR put aside. Pattern classification was employed to train a patient-specific single trial model to automatically classify each trial as MT or LT using the corresponding fMRI data. This model was then validated by using test data that were independent of the training data. In a subsequent analysis, the classification model, just established, was used to determine which type the WWR should be placed in. The results showed that the LT and the MT could be separated with high accuracy based on their brain activity. The brain regions that made most contribution to the separation of the types were: the left inferior frontal cortex and bilateral precuneus, both of which showed higher activity in the MT than in the LT; and the left putamen and right cerebellum which showed the opposite activity pattern. The results also showed that the brain activity for WWR was more similar to that of the LT and fluent speech than to that of the MT. These findings provide a neurological basis for separating the MT and the LT types, and support the widely-used MT/LT symptom grouping scheme. In addition, WWR play a similar role as the LT, and thus should be placed in the LT type. PMID:22761887

  3. Classification of biological agents

    NARCIS (Netherlands)

    Klein MR; LIS; cib

    2012-01-01

    This report comprises an inventory of the issues surrounding the classification of biological agents that can cause disease in humans.

    The conclusion is that the European list of classifications is outdated and should be updated and expanded. The list contains

  4. Sequence-based classification and identification of Fungi.

    Science.gov (United States)

    Hibbett, David; Abarenkov, Kessy; Kõljalg, Urmas; Öpik, Maarja; Chai, Benli; Cole, James; Wang, Qiong; Crous, Pedro; Robert, Vincent; Helgason, Thorunn; Herr, Joshua R; Kirk, Paul; Lueschow, Shiloh; O'Donnell, Kerry; Nilsson, R Henrik; Oono, Ryoko; Schoch, Conrad; Smyth, Christopher; Walker, Donald M; Porras-Alfaro, Andrea; Taylor, John W; Geiser, David M

    Fungal taxonomy and ecology have been revolutionized by the application of molecular methods, and both have increasing connections to genomics and functional biology. However, data streams from traditional specimen- and culture-based systematics are not yet fully integrated with those from metagenomic and metatranscriptomic studies, which limits understanding of the taxonomic diversity and metabolic properties of fungal communities. This article reviews current resources, needs, and opportunities for sequence-based classification and identification (SBCI) in fungi as well as related efforts in prokaryotes. To realize the full potential of fungal SBCI it will be necessary to make advances in multiple areas. Improvements in sequencing methods, including long-read and single-cell technologies, will empower fungal molecular ecologists to look beyond ITS and current shotgun metagenomics approaches. Data quality and accessibility will be enhanced by attention to data and metadata standards and rigorous enforcement of policies for deposition of data and workflows. Taxonomic communities will need to develop best practices for molecular characterization in their focal clades, while also contributing to globally useful datasets including ITS. Changes to nomenclatural rules are needed to enable valid publication of sequence-based taxon descriptions. Finally, cultural shifts are necessary to promote adoption of SBCI and to accord professional credit to individuals who contribute to community resources.

  5. Markov chain aggregation for agent-based models

    CERN Document Server

    Banisch, Sven

    2016-01-01

    This self-contained text develops a Markov chain approach that makes the rigorous analysis of a class of microscopic models that specify the dynamics of complex systems at the individual level possible. It presents a general framework of aggregation in agent-based and related computational models, one which makes use of lumpability and information theory in order to link the micro and macro levels of observation. The starting point is a microscopic Markov chain description of the dynamical process in complete correspondence with the dynamical behavior of the agent-based model (ABM), which is obtained by considering the set of all possible agent configurations as the state space of a huge Markov chain. An explicit formal representation of a resulting “micro-chain” including microscopic transition rates is derived for a class of models by using the random mapping representation of a Markov process. The type of probability distribution used to implement the stochastic part of the model, which defines the upd...

  6. A Multi Agent Based Model for Airport Service Planning

    Directory of Open Access Journals (Sweden)

    W.H. Ip

    2010-09-01

    Full Text Available The aviation industry is highly dynamic and demanding in nature: time and safety are the two most important factors, while one of the major sources of delay is aircraft on the ground, owing to the complexity of ground operations, the amount of machinery such as vehicles involved, and the extensive communication required. As one of the aircraft ground services providers at Hong Kong International Airport (HKIA), China Aircraft Services Limited (CASL) aims to increase its competitiveness by improving the services it provides while also minimizing cost. One of the ways to do so is to optimize the number of maintenance vehicles allocated, in order to minimize both the chance of delay and operating costs. In this paper, an agent-based model is proposed to support decision making in vehicle allocation. An overview of aircraft ground service procedures is first given, together with different optimization methods suggested by researchers. The agent-based approach is then introduced, and in the latter part of the paper a multi-agent system is built and proposed that supports CASL's decisions in optimizing the allocation of maintenance vehicles. The application provides flexibility for inputting the number of different kinds of vehicles, the simulation duration, and the aircraft arrival rate, in order to simulate different scenarios that occur at HKIA.

  7. Improving Generalization Based on l1-Norm Regularization for EEG-Based Motor Imagery Classification

    Directory of Open Access Journals (Sweden)

    Yuwei Zhao

    2018-05-01

Full Text Available Multichannel electroencephalography (EEG) is widely used in typical brain-computer interface (BCI) systems. In general, a number of parameters are essential for an EEG classification algorithm because of the redundant features involved in EEG signals. However, the generalization of an EEG method is often adversely affected by model complexity, which is closely tied to the number of undetermined parameters and can lead to heavy overfitting. To decrease the complexity and improve the generalization of the EEG method, we present a novel l1-norm-based approach that combines the decision values obtained from each EEG channel directly. By extracting the information from different channels on independent frequency bands (FBs) with l1-norm regularization, the proposed method fits the training data with far fewer parameters than common spatial pattern (CSP) methods, thereby reducing overfitting. Moreover, an effective and efficient solution to minimize the optimization objective is proposed. The experimental results on dataset IVa of BCI competition III and dataset I of BCI competition IV show that the proposed method achieves high classification accuracy and increases generalization performance for the classification of MI EEG. As the training-set ratio decreases from 80 to 20%, the average classification accuracy on the two datasets changes from 85.86 and 86.13% to 84.81 and 76.59%, respectively. The classification performance and generalization of the proposed method contribute to the practical application of MI-based BCI systems.
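The sparsity idea in this abstract can be sketched on synthetic data. The following is a minimal illustration (not the paper's algorithm or its datasets): per-channel decision values are combined by an l1-penalized logistic regression, so the weights of uninformative channels are driven toward zero. All dimensions and signal strengths are made-up assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical per-channel decision values: 200 trials x 16 channel scores,
# where only the first 3 channels actually carry class information.
n_trials, n_channels = 200, 16
y = rng.integers(0, 2, n_trials)
X = rng.normal(size=(n_trials, n_channels))
X[:, :3] += 1.5 * (2 * y - 1)[:, None]  # informative channels

# The l1 penalty zeroes out many channel weights, which is the
# overfitting-control mechanism the abstract describes.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
n_active = np.count_nonzero(clf.coef_)
print(f"{n_active} of {n_channels} channel weights are non-zero")
```

Decreasing `C` strengthens the l1 penalty and prunes more channels, trading training fit for generalization.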

  8. Investigating the feasibility of a BCI-driven robot-based writing agent for handicapped individuals

    Science.gov (United States)

    Syan, Chanan S.; Harnarinesingh, Randy E. S.; Beharry, Rishi

    2014-07-01

Brain-Computer Interfaces (BCIs) predominantly employ output actuators such as virtual keyboards and wheelchair controllers to enable handicapped individuals to interact and communicate with their environment. However, BCI-based assistive technologies are limited in their application. There is minimal research geared towards granting disabled individuals the ability to communicate using written words. This is a drawback because involving a human attendant in writing tasks can entail a breach of personal privacy where the task entails sensitive and private information such as banking matters. BCI-driven robot-based writing, however, can provide a safeguard for user privacy where it is required. This study investigated the feasibility of a BCI-driven writing agent using the 3-degree-of-freedom Phantom Omnibot. A full alphanumerical English character set was developed and validated using a teach pendant program in MATLAB. The Omnibot was subsequently interfaced to a P300-based BCI. Three subjects utilised the BCI in the online context to communicate words to the writing robot over a Local Area Network (LAN). The average online letter-wise classification accuracy was 91.43%. The writing agent legibly constructed the communicated letters with minor errors in trajectory execution. The developed system therefore provided a feasible platform for BCI-based writing.

  9. Investigating the feasibility of a BCI-driven robot-based writing agent for handicapped individuals

    International Nuclear Information System (INIS)

    Syan, Chanan S; Harnarinesingh, Randy E S; Beharry, Rishi

    2014-01-01

Brain-Computer Interfaces (BCIs) predominantly employ output actuators such as virtual keyboards and wheelchair controllers to enable handicapped individuals to interact and communicate with their environment. However, BCI-based assistive technologies are limited in their application. There is minimal research geared towards granting disabled individuals the ability to communicate using written words. This is a drawback because involving a human attendant in writing tasks can entail a breach of personal privacy where the task entails sensitive and private information such as banking matters. BCI-driven robot-based writing, however, can provide a safeguard for user privacy where it is required. This study investigated the feasibility of a BCI-driven writing agent using the 3-degree-of-freedom Phantom Omnibot. A full alphanumerical English character set was developed and validated using a teach pendant program in MATLAB. The Omnibot was subsequently interfaced to a P300-based BCI. Three subjects utilised the BCI in the online context to communicate words to the writing robot over a Local Area Network (LAN). The average online letter-wise classification accuracy was 91.43%. The writing agent legibly constructed the communicated letters with minor errors in trajectory execution. The developed system therefore provided a feasible platform for BCI-based writing.

  10. Data Stream Classification Based on the Gamma Classifier

    Directory of Open Access Journals (Sweden)

    Abril Valeria Uriarte-Arcia

    2015-01-01

Full Text Available Ever-increasing data generation confronts us with the problem of handling massive amounts of information online. One of the biggest challenges is how to extract valuable information from these massive, continuous data streams in a single scan. In a data-stream context, data arrive continuously at high speed; the algorithms developed for this context must therefore be efficient in memory and time management and capable of detecting changes over time in the underlying distribution that generated the data. This work describes a novel method for pattern classification over a continuous data stream based on an associative model. The proposed method builds on the Gamma classifier, which is inspired by the Alpha-Beta associative memories; both are supervised pattern recognition models. The proposed method can handle the space and time constraints inherent to data-stream scenarios. The Data Streaming Gamma (DS-Gamma) classifier implements a sliding-window approach to provide concept-drift detection and a forgetting mechanism. To test the classifier, several experiments were performed using different data-stream scenarios with real and synthetic data streams. The experimental results show that the method exhibits competitive performance when compared to other state-of-the-art algorithms.
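The sliding-window-with-forgetting mechanism described here can be sketched independently of the Gamma classifier itself. The class below is a hypothetical stand-in (a 1-nearest-neighbour rule over a bounded window, not DS-Gamma): old examples fall out of the window automatically, so the model tracks concept drift.

```python
from collections import deque
import numpy as np

class SlidingWindowClassifier:
    """Minimal sliding-window stream classifier (illustrative, not DS-Gamma):
    the bounded deque is the forgetting mechanism."""

    def __init__(self, window_size=100):
        self.window = deque(maxlen=window_size)  # old samples are forgotten

    def partial_fit(self, x, y):
        self.window.append((np.asarray(x, dtype=float), y))

    def predict(self, x):
        # 1-nearest-neighbour over the current window only
        nearest = min(self.window, key=lambda p: np.linalg.norm(p[0] - x))
        return nearest[1]

clf = SlidingWindowClassifier(window_size=50)
# Synthetic concept drift: the label of feature x flips halfway through the stream.
for t in range(200):
    x = [float(t % 2)]
    y = (t % 2) if t < 100 else 1 - (t % 2)
    clf.partial_fit(x, y)
print(clf.predict([1.0]))  # reflects the post-drift concept: 0
```

Because only the most recent 50 samples survive, predictions follow the post-drift labeling without any explicit drift detector; real stream classifiers add explicit detection on top of this.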

  11. DNA methylation-based classification of central nervous system tumours.

    Science.gov (United States)

    Capper, David; Jones, David T W; Sill, Martin; Hovestadt, Volker; Schrimpf, Daniel; Sturm, Dominik; Koelsche, Christian; Sahm, Felix; Chavez, Lukas; Reuss, David E; Kratz, Annekathrin; Wefers, Annika K; Huang, Kristin; Pajtler, Kristian W; Schweizer, Leonille; Stichel, Damian; Olar, Adriana; Engel, Nils W; Lindenberg, Kerstin; Harter, Patrick N; Braczynski, Anne K; Plate, Karl H; Dohmen, Hildegard; Garvalov, Boyan K; Coras, Roland; Hölsken, Annett; Hewer, Ekkehard; Bewerunge-Hudler, Melanie; Schick, Matthias; Fischer, Roger; Beschorner, Rudi; Schittenhelm, Jens; Staszewski, Ori; Wani, Khalida; Varlet, Pascale; Pages, Melanie; Temming, Petra; Lohmann, Dietmar; Selt, Florian; Witt, Hendrik; Milde, Till; Witt, Olaf; Aronica, Eleonora; Giangaspero, Felice; Rushing, Elisabeth; Scheurlen, Wolfram; Geisenberger, Christoph; Rodriguez, Fausto J; Becker, Albert; Preusser, Matthias; Haberler, Christine; Bjerkvig, Rolf; Cryan, Jane; Farrell, Michael; Deckert, Martina; Hench, Jürgen; Frank, Stephan; Serrano, Jonathan; Kannan, Kasthuri; Tsirigos, Aristotelis; Brück, Wolfgang; Hofer, Silvia; Brehmer, Stefanie; Seiz-Rosenhagen, Marcel; Hänggi, Daniel; Hans, Volkmar; Rozsnoki, Stephanie; Hansford, Jordan R; Kohlhof, Patricia; Kristensen, Bjarne W; Lechner, Matt; Lopes, Beatriz; Mawrin, Christian; Ketter, Ralf; Kulozik, Andreas; Khatib, Ziad; Heppner, Frank; Koch, Arend; Jouvet, Anne; Keohane, Catherine; Mühleisen, Helmut; Mueller, Wolf; Pohl, Ute; Prinz, Marco; Benner, Axel; Zapatka, Marc; Gottardo, Nicholas G; Driever, Pablo Hernáiz; Kramm, Christof M; Müller, Hermann L; Rutkowski, Stefan; von Hoff, Katja; Frühwald, Michael C; Gnekow, Astrid; Fleischhack, Gudrun; Tippelt, Stephan; Calaminus, Gabriele; Monoranu, Camelia-Maria; Perry, Arie; Jones, Chris; Jacques, Thomas S; Radlwimmer, Bernhard; Gessi, Marco; Pietsch, Torsten; Schramm, Johannes; Schackert, Gabriele; Westphal, Manfred; Reifenberger, Guido; Wesseling, Pieter; Weller, Michael; Collins, Vincent Peter; Blümcke, 
Ingmar; Bendszus, Martin; Debus, Jürgen; Huang, Annie; Jabado, Nada; Northcott, Paul A; Paulus, Werner; Gajjar, Amar; Robinson, Giles W; Taylor, Michael D; Jaunmuktane, Zane; Ryzhova, Marina; Platten, Michael; Unterberg, Andreas; Wick, Wolfgang; Karajannis, Matthias A; Mittelbronn, Michel; Acker, Till; Hartmann, Christian; Aldape, Kenneth; Schüller, Ulrich; Buslei, Rolf; Lichter, Peter; Kool, Marcel; Herold-Mende, Christel; Ellison, David W; Hasselblatt, Martin; Snuderl, Matija; Brandner, Sebastian; Korshunov, Andrey; von Deimling, Andreas; Pfister, Stefan M

    2018-03-22

Accurate pathological diagnosis is crucial for optimal management of patients with cancer. For the approximately 100 known tumour types of the central nervous system, standardization of the diagnostic process has been shown to be particularly challenging, with substantial inter-observer variability in the histopathological diagnosis of many tumour types. Here we present a comprehensive approach for the DNA methylation-based classification of central nervous system tumours across all entities and age groups, and demonstrate its application in a routine diagnostic setting. We show that the availability of this method may have a substantial impact on diagnostic precision compared to standard methods, resulting in a change of diagnosis in up to 12% of prospective cases. For broader accessibility, we have designed a free online classifier tool, the use of which does not require any additional onsite data processing. Our results provide a blueprint for the generation of machine-learning-based tumour classifiers across other cancer entities, with the potential to fundamentally transform tumour pathology.

  12. Estimation of Compaction Parameters Based on Soil Classification

    Science.gov (United States)

    Lubis, A. S.; Muis, Z. A.; Hastuty, I. P.; Siregar, I. M.

    2018-02-01

Factors that must be considered in soil compaction works are the type of soil material, field control, maintenance, and the availability of funds. These problems raised the idea of estimating soil density with an implementation system that is proper, fast, and economical. This study aims to estimate the compaction parameters, i.e. the maximum dry unit weight (γdmax) and the optimum water content (wopt), based on soil classification. Each of the 30 samples was tested for its index properties and compaction behaviour. All of the data from the laboratory test results were used to estimate the compaction parameter values using linear regression and the Goswami model. From the results, the soil types were A-4, A-6, and A-7 according to AASHTO, and SC, SC-SM, and CL according to USCS. By linear regression, the estimate of the maximum dry unit weight is (γdmax*) = 1.862 - 0.005*FINES - 0.003*LL and the estimate of the optimum water content is (wopt*) = -0.607 + 0.362*FINES + 0.161*LL. By the Goswami model (with equation Y = m*logG + k), the maximum dry unit weight is estimated with m = -0.376 and k = 2.482, and the optimum water content with m = 21.265 and k = -32.421. For both of these equations a 95% confidence interval was obtained.
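The regression equations above are simple enough to evaluate directly. The sketch below encodes them with the coefficients as reported in the abstract (decimal commas read as decimal points); the sample values (FINES = 55%, LL = 35) and the Goswami input G are purely illustrative, not data from the paper.

```python
import math

def gamma_dmax_linear(fines, ll):
    """Maximum dry unit weight from the reported linear regression."""
    return 1.862 - 0.005 * fines - 0.003 * ll

def w_opt_linear(fines, ll):
    """Optimum water content from the reported linear regression."""
    return -0.607 + 0.362 * fines + 0.161 * ll

def goswami(G, m, k):
    """Goswami model Y = m*log10(G) + k (G value below is illustrative)."""
    return m * math.log10(G) + k

# Hypothetical soil sample: 55% fines, liquid limit 35.
print(round(gamma_dmax_linear(55, 35), 3))  # 1.482
print(round(w_opt_linear(55, 35), 3))       # 24.938
print(round(goswami(50, m=-0.376, k=2.482), 3))
```

Note that such regression estimates are only screening values; the 95% confidence intervals reported in the study bound how far a laboratory Proctor test may deviate from them.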

  13. Toward a Safety Risk-Based Classification of Unmanned Aircraft

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2016-01-01

    There is a trend of growing interest and demand for greater access of unmanned aircraft (UA) to the National Airspace System (NAS) as the ongoing development of UA technology has created the potential for significant economic benefits. However, the lack of a comprehensive and efficient UA regulatory framework has constrained the number and kinds of UA operations that can be performed. This report presents initial results of a study aimed at defining a safety-risk-based UA classification as a plausible basis for a regulatory framework for UA operating in the NAS. Much of the study up to this point has been at a conceptual high level. The report includes a survey of contextual topics, analysis of safety risk considerations, and initial recommendations for a risk-based approach to safe UA operations in the NAS. The next phase of the study will develop and leverage deeper clarity and insight into practical engineering and regulatory considerations for ensuring that UA operations have an acceptable level of safety.

  14. Superpixel-based classification of gastric chromoendoscopy images

    Science.gov (United States)

    Boschetto, Davide; Grisan, Enrico

    2017-03-01

Chromoendoscopy (CH) is a gastroenterology imaging modality that involves staining tissues with methylene blue, which reacts with the internal walls of the gastrointestinal tract, improving the visual contrast of mucosal surfaces and thus enhancing a doctor's ability to screen for precancerous lesions or early cancer. This technique helps identify areas that can be targeted for biopsy or treatment; in this work we focus on gastric cancer detection. Gastric chromoendoscopy for cancer detection has several taxonomies available, one of which classifies CH images into three classes (normal, metaplasia, dysplasia) based on the color, shape and regularity of pit patterns. Computer-assisted diagnosis is desirable to improve the reliability of tissue classification and abnormality detection. However, traditional computer vision methodologies, mainly segmentation, do not translate well to the specific visual characteristics of a gastroenterology imaging scenario. We propose exploiting a first unsupervised segmentation via superpixels, which group pixels into perceptually meaningful atomic regions that replace the rigid structure of the pixel grid. For each superpixel, a set of features is extracted and then fed to a random-forest-based classifier, which computes a model used to predict the class of each superpixel. The average general accuracy of our model is 92.05% in the pixel domain (86.62% in the superpixel domain), while detection accuracies for the normal and abnormal classes are 85.71% and 95%, respectively. Eventually, the class of the whole image can be predicted through a majority vote over each superpixel's predicted class.
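The structure of this pipeline (over-segment, extract per-region features, classify regions) can be sketched on synthetic data. A real implementation would use a perceptual over-segmentation such as SLIC and clinically labeled images; here a fixed grid stands in for the superpixels and the labels are random, purely to show the fit/predict step.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
image = rng.random((64, 64, 3))  # synthetic stand-in for a CH image

# 1) "Superpixels": an 8x8 block grid stands in for SLIC segmentation.
block = 8
features, labels = [], []
for i in range(0, 64, block):
    for j in range(0, 64, block):
        patch = image[i:i + block, j:j + block]
        features.append(patch.reshape(-1, 3).mean(axis=0))  # mean colour feature
        labels.append(int(rng.integers(0, 3)))  # normal/metaplasia/dysplasia (random)

# 2) Random-forest classification of the per-superpixel feature vectors.
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(features, labels)
pred = clf.predict(features)
print(len(pred))  # one predicted class per superpixel: 64
```

The paper's final step, predicting a whole-image class, would then be a majority vote over `pred`.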

  15. Comprehensive Study on Lexicon-based Ensemble Classification Sentiment Analysis

    Directory of Open Access Journals (Sweden)

    Łukasz Augustyniak

    2015-12-01

Full Text Available We propose a novel method for counting sentiment orientation that outperforms supervised learning approaches in time and memory complexity and is not statistically significantly different from them in accuracy. Our method consists of a novel approach to generating unigram, bigram and trigram lexicons. The proposed method, called frequentiment, is based on calculating the frequency of features (words) in the document and averaging their impact on the sentiment score, as opposed to documents that do not contain these features. Afterwards, we use ensemble classification to improve the overall accuracy of the method. Importantly, the frequentiment-based lexicons with sentiment-threshold selection outperform other popular lexicons and some supervised learners, while being 3–5 times faster than the supervised approach. We compare 37 methods (lexicons, ensembles with lexicon predictions as input, and supervised learners) applied to 10 Amazon review data sets and provide the first statistical comparison of sentiment annotation methods that includes ensemble approaches. It is one of the most comprehensive comparisons of domain sentiment analysis in the literature.
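The lexicon-generation idea can be sketched in a few lines. This is a simplified reading of the frequentiment principle, not the paper's exact formula: a word's sentiment is taken as the average score of documents containing it minus the average score of documents lacking it. The toy corpus is invented for illustration.

```python
# Toy (document, rating) corpus; the real method uses Amazon review data sets.
docs = [("great battery great screen", 5),
        ("terrible battery short life", 1),
        ("great value", 4),
        ("terrible support", 2)]

# Simplified frequentiment-style lexicon: mean score of documents with the
# word minus mean score of documents without it.
vocab = {w for text, _ in docs for w in text.split()}
lexicon = {}
for w in vocab:
    with_w = [s for t, s in docs if w in t.split()]
    without_w = [s for t, s in docs if w not in t.split()]
    lexicon[w] = sum(with_w) / len(with_w) - sum(without_w) / len(without_w)

def predict(text):
    # Unigram scoring with a zero sentiment threshold.
    score = sum(lexicon.get(w, 0.0) for w in text.split())
    return "positive" if score > 0 else "negative"

print(predict("great screen"))   # positive
print(predict("terrible life"))  # negative
```

The paper extends this to bigrams and trigrams, tunes the sentiment threshold, and then feeds the lexicon predictions into an ensemble classifier.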

  16. Agent-based modeling of noncommunicable diseases: a systematic review.

    Science.gov (United States)

    Nianogo, Roch A; Arah, Onyebuchi A

    2015-03-01

    We reviewed the use of agent-based modeling (ABM), a systems science method, in understanding noncommunicable diseases (NCDs) and their public health risk factors. We systematically reviewed studies in PubMed, ScienceDirect, and Web of Sciences published from January 2003 to July 2014. We retrieved 22 relevant articles; each had an observational or interventional design. Physical activity and diet were the most-studied outcomes. Often, single agent types were modeled, and the environment was usually irrelevant to the studied outcome. Predictive validation and sensitivity analyses were most used to validate models. Although increasingly used to study NCDs, ABM remains underutilized and, where used, is suboptimally reported in public health studies. Its use in studying NCDs will benefit from clarified best practices and improved rigor to establish its usefulness and facilitate replication, interpretation, and application.

  17. Statistical Agent Based Modelization of the Phenomenon of Drug Abuse

    Science.gov (United States)

    di Clemente, Riccardo; Pietronero, Luciano

    2012-07-01

We introduce a statistical agent-based model to describe the phenomenon of drug abuse and its dynamical evolution at the individual and global levels. The agents are heterogeneous with respect to their intrinsic inclination to drugs, their budget attitude and their social environment. The various levels of drug use were inspired by the professional description of the phenomenon, which permits a direct comparison with all available data. We show that certain elements are very important in starting drug use, for example the rare events in personal experience that allow the barrier to occasional drug use to be overcome. Analyzing how the system reacts to perturbations is very important for understanding its key elements, and it provides strategies for effective policy making. The present model represents the first step towards a realistic description of this phenomenon and can easily be generalized in various directions.

  18. Capacity Analysis for Parallel Runway through Agent-Based Simulation

    Directory of Open Access Journals (Sweden)

    Yang Peng

    2013-01-01

Full Text Available Parallel runways are the mainstream configuration at China's hub airports, and the runway is often the bottleneck of an airport, so evaluating its capacity is of great importance to airport management. This study outlines a model, a multi-agent architecture, an implementation approach, and a software prototype of a simulation system for evaluating runway capacity. Agent Unified Modeling Language (AUML) is applied to illustrate the inbound and departure procedures of aircraft and to design the agent-based model. The model is evaluated experimentally, and its quality is studied in comparison with models created with SIMMOD and Arena. The results appear highly efficient, so the method can be applied to parallel-runway capacity evaluation, and the model offers favorable flexibility and extensibility.
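The capacity-evaluation idea can be illustrated with a much simpler model than the paper's multi-agent system: a single runway as a server with a fixed occupancy time and Poisson arrivals. All parameters below (occupancy time, arrival rates) are invented for illustration and are not SIMMOD/Arena or HKIA values.

```python
import random

random.seed(42)

def simulate(arrival_rate_per_hr, occupancy_s=90, horizon_s=3600):
    """One hour of Poisson arrivals to a single runway; returns the number
    of aircraft served and their mean queueing delay in seconds."""
    t, total_delay, served = 0.0, 0.0, 0
    runway_free_at = 0.0
    while True:
        t += random.expovariate(arrival_rate_per_hr / 3600.0)  # next arrival
        if t > horizon_s:
            break
        start = max(t, runway_free_at)       # wait if the runway is occupied
        total_delay += start - t
        runway_free_at = start + occupancy_s
        served += 1
    return served, total_delay / max(served, 1)

# Delay grows sharply as demand approaches capacity (3600/90 = 40 movements/hr).
for rate in (20, 30, 38):
    served, mean_delay = simulate(rate)
    print(rate, served, round(mean_delay, 1))
```

An agent-based model like the paper's replaces these aggregate assumptions with individual aircraft and controller agents, but the capacity question it answers is the same.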

  19. Tissue-based standoff biosensors for detecting chemical warfare agents

    Science.gov (United States)

    Greenbaum, Elias; Sanders, Charlene A.

    2003-11-18

    A tissue-based, deployable, standoff air quality sensor for detecting the presence of at least one chemical or biological warfare agent, includes: a cell containing entrapped photosynthetic tissue, the cell adapted for analyzing photosynthetic activity of the entrapped photosynthetic tissue; means for introducing an air sample into the cell and contacting the air sample with the entrapped photosynthetic tissue; a fluorometer in operable relationship with the cell for measuring photosynthetic activity of the entrapped photosynthetic tissue; and transmitting means for transmitting analytical data generated by the fluorometer relating to the presence of at least one chemical or biological warfare agent in the air sample, the sensor adapted for deployment into a selected area.

  20. Hypercompetitive Environments: An Agent-based model approach

    Science.gov (United States)

    Dias, Manuel; Araújo, Tanya

Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both for individual firms and for management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows for modeling business strategies from a bottom-up perspective — understood as resulting from repeated and local interaction of economic agents — without disregarding the consequences of the business strategies themselves for the individual behavior of enterprises and for the emergence of interaction patterns between firms and management environments. Agent-based models are the leading approach in this attempt.

  1. Complexity and agent-based modelling in urban research

    DEFF Research Database (Denmark)

    Fertner, Christian

    influence on the bigger system. Traditional scientific methods or theories often tried to simplify, not accounting complex relations of actors and decision-making. The introduction of computers in simulation made new approaches in modelling, as for example agent-based modelling (ABM), possible, dealing......Urbanisation processes are results of a broad variety of actors or actor groups and their behaviour and decisions based on different experiences, knowledge, resources, values etc. The decisions done are often on a micro/individual level but resulting in macro/collective behaviour. In urban research...

  2. An Intelligent Fleet Condition-Based Maintenance Decision Making Method Based on Multi-Agent

    OpenAIRE

    Bo Sun; Qiang Feng; Songjie Li

    2012-01-01

To meet the demand for online condition-based-maintenance decision making within a mission-oriented fleet, an intelligent maintenance decision making method based on multi-agent systems and heuristic rules is proposed. The process of condition-based maintenance within an aircraft fleet (each aircraft containing one or more Line Replaceable Modules) based on multiple maintenance thresholds is analyzed. The process is then abstracted into a multi-agent model, a 2-layer model structure containing host negoti...

  3. Improving Agent Based Models and Validation through Data Fusion.

    Science.gov (United States)

    Laskowski, Marek; Demianyk, Bryan C P; Friesen, Marcia R; McLeod, Robert D; Mukhi, Shamir N

    2011-01-01

    This work is contextualized in research in modeling and simulation of infection spread within a community or population, with the objective to provide a public health and policy tool in assessing the dynamics of infection spread and the qualitative impacts of public health interventions. This work uses the integration of real data sources into an Agent Based Model (ABM) to simulate respiratory infection spread within a small municipality. Novelty is derived in that the data sources are not necessarily obvious within ABM infection spread models. The ABM is a spatial-temporal model inclusive of behavioral and interaction patterns between individual agents on a real topography. The agent behaviours (movements and interactions) are fed by census / demographic data, integrated with real data from a telecommunication service provider (cellular records) and person-person contact data obtained via a custom 3G Smartphone application that logs Bluetooth connectivity between devices. Each source provides data of varying type and granularity, thereby enhancing the robustness of the model. The work demonstrates opportunities in data mining and fusion that can be used by policy and decision makers. The data become real-world inputs into individual SIR disease spread models and variants, thereby building credible and non-intrusive models to qualitatively simulate and assess public health interventions at the population level.
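The SIR-style agent-based core of such models can be sketched without any of the fused data sources. The snippet below is a deliberately minimal stand-in: agents mix uniformly at random instead of following the census, cellular, and Bluetooth contact data the study fuses, and all rates are illustrative.

```python
import random

random.seed(7)
N, beta, gamma = 500, 0.3, 0.1   # population, infection prob. per contact, recovery prob.
state = ["S"] * N
for i in range(3):
    state[i] = "I"               # seed infections

history = []
for day in range(100):
    infected = [i for i, s in enumerate(state) if s == "I"]
    for i in infected:
        # Each infectious agent contacts one uniformly random agent per day;
        # the data-fusion ABM replaces this line with empirical contact patterns.
        j = random.randrange(N)
        if state[j] == "S" and random.random() < beta:
            state[j] = "I"
        if random.random() < gamma:
            state[i] = "R"       # recovery
    history.append(state.count("I"))

print(max(history), state.count("R"))  # epidemic peak and final recovered count
```

Swapping the uniform-contact line for empirically derived contact networks is precisely where the cellular and Bluetooth data enter the authors' model.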

  4. Agent-based modelling of consumer energy choices

    Science.gov (United States)

    Rai, Varun; Henry, Adam Douglas

    2016-06-01

    Strategies to mitigate global climate change should be grounded in a rigorous understanding of energy systems, particularly the factors that drive energy demand. Agent-based modelling (ABM) is a powerful tool for representing the complexities of energy demand, such as social interactions and spatial constraints. Unlike other approaches for modelling energy demand, ABM is not limited to studying perfectly rational agents or to abstracting micro details into system-level equations. Instead, ABM provides the ability to represent behaviours of energy consumers -- such as individual households -- using a range of theories, and to examine how the interaction of heterogeneous agents at the micro-level produces macro outcomes of importance to the global climate, such as the adoption of low-carbon behaviours and technologies over space and time. We provide an overview of ABM work in the area of consumer energy choices, with a focus on identifying specific ways in which ABM can improve understanding of both fundamental scientific and applied aspects of the demand side of energy to aid the design of better policies and programmes. Future research needs for improving the practice of ABM to better understand energy demand are also discussed.

  5. Using the Agent-Based Modeling in Economic Field

    Directory of Open Access Journals (Sweden)

    Nora Mihail

    2006-12-01

Full Text Available The last ten years of the 20th century witnessed the emergence of a new scientific field, usually defined as the study of "complex adaptive systems". This field, generically named the Complexity Sciences, shares its subject, the general properties of complex systems across traditional disciplinary boundaries, with cybernetics and general systems theory. But the development of Complexity Sciences approaches is driven by the extensive use of agent-based models (ABM) as a research tool and by an emphasis on systems, such as markets, populations or ecologies, that are less integrated or "organized" than those, such as companies and economies, intensively studied by the traditional disciplines. For ABM, a complex system is a system of individual agents who have the freedom to act in ways that are not always totally predictable, and whose actions are interconnected such that one agent's actions change the context (environment) for other agents. There are many examples of such complex systems: the stock market, the human body's immune system, a business organization, an institution, a work team, a family, etc.

  6. Knowledge-based sea ice classification by polarimetric SAR

    DEFF Research Database (Denmark)

    Skriver, Henning; Dierking, Wolfgang

    2004-01-01

    Polarimetric SAR images acquired at C- and L-band over sea ice in the Greenland Sea, Baltic Sea, and Beaufort Sea have been analysed with respect to their potential for ice type classification. The polarimetric data were gathered by the Danish EMISAR and the US AIRSAR which both are airborne...... systems. A hierarchical classification scheme was chosen for sea ice because our knowledge about magnitudes, variations, and dependences of sea ice signatures can be directly considered. The optimal sequence of classification rules and the rules themselves depend on the ice conditions/regimes. The use...... of the polarimetric phase information improves the classification only in the case of thin ice types but is not necessary for thicker ice (above about 30 cm thickness)...

  7. Palm-vein classification based on principal orientation features.

    Directory of Open Access Journals (Sweden)

    Yujia Zhou

Full Text Available Personal recognition using palm-vein patterns has emerged as a promising alternative for human recognition because of its uniqueness, stability, live-body identification, flexibility, and resistance to spoofing. With the expanding application of palm-vein pattern recognition, the corresponding growth of the database has resulted in long response times. To shorten the response time of identification, this paper proposes a simple and useful classification scheme for palm-vein identification based on principal direction features. In the registration process, the Gaussian-Radon transform is adopted to extract the orientation matrix and then compute the principal direction of a palm-vein image based on the orientation matrix. The database can be classified into six bins based on the value of the principal direction. In the identification process, the principal direction of the test sample is first extracted to ascertain the corresponding bin. One-by-one matching with the training samples is then performed within the bin. To improve recognition efficiency while maintaining good recognition accuracy, the two neighbouring bins of the corresponding bin are also searched to identify the input palm-vein image. Evaluation experiments were conducted on three different databases, namely PolyU, CASIA, and the database of this study. Experimental results show that the search range for one test sample in PolyU, CASIA and our database can be reduced by the proposed method to 14.29%, 14.50%, and 14.28%, with retrieval accuracies of 96.67%, 96.00%, and 97.71%, respectively. With 10,000 training samples in the database, the execution time of the identification process by the traditional method is 18.56 s, while that by the proposed approach is 3.16 s. The experimental results confirm that the proposed approach is more efficient than the traditional method, especially for a large database.
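The binning scheme (six orientation bins, searched together with their two neighbours) can be sketched independently of the Gaussian-Radon transform. In the sketch below a plain gradient-based orientation estimate stands in for the paper's principal-direction computation; the test image is a synthetic gradient pattern.

```python
import numpy as np

def principal_direction(img):
    """Dominant edge orientation in [0, 180) degrees, estimated from image
    gradients (a stand-in for the paper's Gaussian-Radon transform)."""
    gy, gx = np.gradient(img.astype(float))
    angles = np.degrees(np.arctan2(gy, gx)) % 180.0
    weights = np.hypot(gx, gy)  # weight orientations by gradient magnitude
    hist, edges = np.histogram(angles, bins=180, range=(0, 180), weights=weights)
    return edges[np.argmax(hist)]

def bin_index(direction_deg, n_bins=6):
    """Assign a principal direction to one of six 30-degree bins."""
    return int(direction_deg // (180 / n_bins))

def search_bins(idx, n_bins=6):
    """The corresponding bin plus its two neighbours, as in the paper."""
    return [(idx - 1) % n_bins, idx, (idx + 1) % n_bins]

img = np.tile(np.arange(32), (32, 1))  # synthetic pattern with gradient along x
d = principal_direction(img)
print(bin_index(d), search_bins(bin_index(d)))
```

Searching 3 of 6 bins is how the reported ~14.3% search range arises once samples spread over the bins; matching itself still happens one-by-one inside the selected bins.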

  8. Trace elements based classification on clinkers. Application to Spanish clinkers

    Directory of Open Access Journals (Sweden)

    Tamás, F. D.

    2001-12-01

Full Text Available A qualitative identification method to determine the origin (i.e., the manufacturing factory) of Spanish clinkers is described. The classification of clinkers produced in different factories can be based on their trace-element content. Approximately fifteen clinker sorts, collected from 11 Spanish cement factories, were analysed to determine their Mg, Sr, Ba, Mn, Ti, Zr, Zn and V content. An expert system formulated as a binary decision tree was designed based on the collected data. The performance of the obtained classifier was measured by ten-fold cross validation. The results show that the proposed method yields an easy-to-use expert system that can determine the origin of a clinker from its trace-element content.

This paper describes a procedure for the qualitative identification of Spanish clinkers in order to determine their origin (factory). This classification of the clinkers is based on their trace-element content. Fifteen different clinkers from 11 Spanish cement factories were analysed, determining their Mg, Sr, Ba, Mn, Ti, Zr, Zn and V contents. An expert system was designed by means of a binary decision tree based on the collected data. The resulting classification was examined by ten-fold cross validation. The results obtained show that the proposed model is valid for easily building an expert system capable of determining the origin of a clinker from its trace-element content.
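The evaluation protocol in this record (a binary decision tree validated by ten-fold cross validation on eight trace elements) is easy to sketch. The data below are synthetic stand-ins: per-factory element concentrations are drawn at random and do not reproduce the paper's measurements.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the trace-element table: 4 hypothetical factories,
# 30 clinker samples each, 8 elements (Mg, Sr, Ba, Mn, Ti, Zr, Zn, V).
rng = np.random.default_rng(0)
n_factories, per_factory = 4, 30
centres = rng.uniform(10, 100, size=(n_factories, 8))   # factory signatures
X = np.vstack([c + rng.normal(0, 2, size=(per_factory, 8)) for c in centres])
y = np.repeat(np.arange(n_factories), per_factory)      # factory labels

# Binary decision tree evaluated with ten-fold cross validation,
# mirroring the protocol the abstract describes.
tree = DecisionTreeClassifier(random_state=0)
scores = cross_val_score(tree, X, y, cv=10)
print(round(scores.mean(), 2))
```

When factory signatures are well separated, as assumed here, the tree recovers the origin almost perfectly; the paper's contribution is showing that real clinker trace elements are separable in the same way.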

  9. Event-Based User Classification in Weibo Media

    Directory of Open Access Journals (Sweden)

    Liang Guo

    2014-01-01

    Full Text Available Weibo media, known for its real-time microblogging services, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events. Users who post different contents, and exhibit different behaviors or attitudes, may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under these circumstances, in order to effectively organize and manage the huge number of users, and thereby further manage their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organization/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  10. Event-based user classification in Weibo media.

    Science.gov (United States)

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known for its real-time microblogging services, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events. Users who post different contents, and exhibit different behaviors or attitudes, may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under these circumstances, in order to effectively organize and manage the huge number of users, and thereby further manage their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organization/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  11. Radar-Derived Quantitative Precipitation Estimation Based on Precipitation Classification

    Directory of Open Access Journals (Sweden)

    Lili Yang

    2016-01-01

    Full Text Available A method for improving radar-derived quantitative precipitation estimation is proposed. Tropical vertical profiles of reflectivity (VPRs) are first determined from multiple VPRs. Upon identifying a tropical VPR, the event can be further classified as either tropical-stratiform or tropical-convective rainfall by a fuzzy logic (FL) algorithm. Based on the precipitation-type fields, the reflectivity values are converted into rainfall rates using a Z-R relationship. In order to evaluate the performance of this rainfall classification scheme, three experiments were conducted using three months of data and two study cases. In Experiment I, the Weather Surveillance Radar-1988 Doppler (WSR-88D) default Z-R relationship was applied. In Experiment II, the precipitation regime was separated into convective and stratiform rainfall using the FL algorithm, and the corresponding Z-R relationships were used. In Experiment III, the precipitation regime was separated into convective, stratiform, and tropical rainfall, and the corresponding Z-R relationships were applied. The results show that the rainfall rates obtained from all three experiments match closely with the gauge observations; Experiment II partially corrected the underestimation observed in Experiment I, while Experiment III significantly reduced this underestimation and generated the most accurate radar estimates of rain rate among the three experiments.
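
    The per-class Z-R conversion works as sketched below. The (a, b) coefficient pairs are commonly cited textbook values (e.g. the WSR-88D convective default Z = 300 R^1.4); the relationships fitted in the paper may differ:

```python
# Sketch of converting radar reflectivity (dBZ) to rain rate R (mm/h) via a
# Z-R power law Z = a * R**b, with coefficients chosen per precipitation type.
# The (a, b) values are commonly cited defaults, not the paper's fitted ones.

ZR_COEFFS = {
    "convective": (300.0, 1.4),   # WSR-88D default
    "tropical":   (250.0, 1.2),   # Rosenfeld tropical
    "stratiform": (200.0, 1.6),   # Marshall-Palmer
}

def rain_rate(dbz, ptype):
    """Invert Z = a * R**b after converting dBZ to linear reflectivity."""
    a, b = ZR_COEFFS[ptype]
    z_linear = 10.0 ** (dbz / 10.0)   # dBZ -> mm^6 / m^3
    return (z_linear / a) ** (1.0 / b)

print(round(rain_rate(40.0, "convective"), 2))
```

    Note that the tropical relationship yields a higher rain rate for the same reflectivity, which is why applying the convective default to tropical rainfall underestimates precipitation.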

  12. Treatment of esophageal motility disorders based on the chicago classification.

    Science.gov (United States)

    Maradey-Romero, Carla; Gabbard, Scott; Fass, Ronnie

    2014-12-01

    The Chicago Classification divides esophageal motor disorders based on the recorded value of the integrated relaxation pressure (IRP). The first group includes those with an elevated mean IRP that is associated with peristaltic abnormalities such as achalasia and esophagogastric junction outflow obstruction. The second group includes those with a normal mean IRP that is associated with esophageal hypermotility disorders such as distal esophageal spasm, hypercontractile esophagus (jackhammer esophagus), and hypertensive peristalsis (nutcracker esophagus). The third group includes those with a normal mean IRP that is associated with esophageal hypomotility peristaltic abnormalities such as absent peristalsis, weak peristalsis with small or large breaks, and frequent failed peristalsis. The therapeutic options vary greatly between the different groups of esophageal motor disorders. In achalasia patients, potential treatment strategies comprise medical therapy (calcium channel blockers, nitrates, and phosphodiesterase 5 inhibitors), endoscopic procedures (botulinum toxin A injection, pneumatic dilation, or peroral endoscopic myotomy) or surgery (Heller myotomy). Patients with a normal IRP and esophageal hypermotility disorder are candidates for medical therapy (nitrates, calcium channel blockers, phosphodiesterase 5 inhibitors, cimetropium/ipratropium bromide, proton pump inhibitors, benzodiazepines, tricyclic antidepressants, trazodone, selective serotonin reuptake inhibitors, and serotonin-norepinephrine reuptake inhibitors), endoscopic procedures (botulinum toxin A injection and peroral endoscopic myotomy), or surgery (Heller myotomy). Lastly, in patients with a normal IRP and esophageal hypomotility disorder, treatment is primarily focused on controlling the presence of gastroesophageal reflux with proton pump inhibitors and lifestyle modifications (soft and liquid diet and eating in the upright position) to address patient's dysphagia.

  13. China's Classification-Based Forest Management: Procedures, Problems, and Prospects

    Science.gov (United States)

    Dai, Limin; Zhao, Fuqiang; Shao, Guofan; Zhou, Li; Tang, Lina

    2009-06-01

    China’s new Classification-Based Forest Management (CFM) is a two-class system, comprising Commodity Forest (CoF) and Ecological Welfare Forest (EWF) lands, so named according to differences in their distinct functions and services. The purposes of CFM are to improve forestry economic systems, strengthen resource management in a market economy, ease the conflicts between wood demands and public welfare, and meet the diversified needs for forest services in China. The formative process of China’s CFM has involved a series of trials and revisions. China’s central government accelerated the reform of CFM in the year 2000 and completed the final version in 2003. CFM was implemented at the provincial level with the aid of subsidies from the central government. About a quarter of the forestland in China was approved as National EWF land by the State Forestry Administration in 2006 and 2007. Logging is prohibited on National EWF lands, and their landowners or managers receive subsidies of about 70 RMB (US$10) per hectare from the central government. CFM represents a new forestry strategy in China, and its implementation inevitably faces challenges in promoting the understanding of forest ecological services, generalizing nationwide criteria for identifying EWF and CoF lands, setting up forest-specific compensation mechanisms for ecological benefits, enhancing the knowledge of administrators and the general public about CFM, and sustaining EWF lands under China’s current forestland tenure system. CFM does, however, offer a viable pathway toward sustainable forest management in China.

  14. Classification of CT brain images based on deep learning networks.

    Science.gov (United States)

    Gao, Xiaohong W; Hui, Rui; Tian, Zengmin

    2017-01-01

    While computerised tomography (CT) may have been the first imaging tool used to study the human brain, it has not yet been incorporated into the clinical decision-making process for the diagnosis of Alzheimer's disease (AD). On the other hand, being prevalent, inexpensive and non-invasive, CT does present diagnostic features of AD to a great extent. This study explores the significance and impact of applying the burgeoning deep learning techniques to the task of classifying CT brain images, in particular utilising a convolutional neural network (CNN), aiming at providing supplementary information for the early diagnosis of Alzheimer's disease. Towards this end, CT images (N = 285) are clustered into three groups: AD, lesion (e.g. tumour) and normal ageing. In addition, considering the characteristics of this collection, with larger thickness along the depth (z) direction (~3-5 mm), an advanced CNN architecture is established integrating both 2D and 3D CNN networks. The fusion of the two CNN networks is subsequently coordinated based on the average of the Softmax scores obtained from both networks, consolidating 2D images along the spatial axial direction and 3D segmented blocks respectively. As a result, the classification accuracy rates rendered by this elaborated CNN architecture are 85.2%, 80% and 95.3% for the AD, lesion and normal classes respectively, with an average of 87.6%. Additionally, this improved CNN network appears to outperform the 2D-only version of the CNN network as well as a number of state-of-the-art hand-crafted approaches, which deliver accuracy rates (in percent) of 86.3, 85.6 ± 1.10, 86.3 ± 1.04, 85.2 ± 1.60 and 83.1 ± 0.35 for 2D CNN, 2D SIFT, 2D KAZE, 3D SIFT and 3D KAZE respectively. The two major contributions of the paper constitute a new 3-D approach while applying deep learning technique to extract signature information
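
    The fusion rule described above, averaging the Softmax scores of the 2D and 3D branches, can be sketched as follows; the logit vectors are made-up placeholders for actual network outputs:

```python
import numpy as np

# Sketch of the score-level fusion: class probabilities from the 2D and 3D
# CNN branches are averaged, and the fused prediction is the argmax.
# The logit values are hypothetical stand-ins for real network outputs.

def softmax(z):
    e = np.exp(z - z.max())          # subtract max for numerical stability
    return e / e.sum()

logits_2d = np.array([2.0, 0.5, 0.1])   # hypothetical 2D branch (AD, lesion, normal)
logits_3d = np.array([1.2, 1.5, 0.2])   # hypothetical 3D branch

fused = (softmax(logits_2d) + softmax(logits_3d)) / 2.0
pred = int(np.argmax(fused))
print(pred)
```

    Averaging probabilities (rather than raw logits) keeps the two branches on a common scale regardless of how confident each network's logits are.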

  15. Basic Hand Gestures Classification Based on Surface Electromyography

    Directory of Open Access Journals (Sweden)

    Aleksander Palkowski

    2016-01-01

    Full Text Available This paper presents an innovative classification system for hand gestures using 2-channel surface electromyography analysis. The system developed uses the Support Vector Machine classifier, for which the kernel function and parameter optimisation are conducted additionally by the Cuckoo Search swarm algorithm. The system developed is compared with standard Support Vector Machine classifiers with various kernel functions. The average classification rate of 98.12% has been achieved for the proposed method.
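
    A rough sketch of the classification stage, with a random search over the RBF kernel parameters standing in for the Cuckoo Search optimisation used in the paper, and synthetic feature vectors standing in for the sEMG features:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Sketch: tune (C, gamma) of an RBF-kernel SVM by random search (a simple
# stand-in for the Cuckoo Search swarm optimisation) on synthetic 4-feature
# samples for three hypothetical gesture classes.

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.5, size=(40, 4)) for m in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 40)           # three gesture classes, 40 samples each

best_score, best_params = 0.0, None
for _ in range(20):                    # random search over kernel parameters
    C = 10 ** rng.uniform(-1, 2)
    gamma = 10 ** rng.uniform(-2, 1)
    score = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_params = score, (C, gamma)

print(best_score, best_params)
```

    A swarm optimiser such as Cuckoo Search explores the same (C, gamma) space, but guides new candidates using the best solutions found so far instead of sampling blindly.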

  16. Renoprotection and the Bardoxolone Methyl Story - Is This the Right Way Forward? A Novel View of Renoprotection in CKD Trials: A New Classification Scheme for Renoprotective Agents

    Directory of Open Access Journals (Sweden)

    Macaulay Onuigbo

    2013-04-01

    Full Text Available In the June 2011 issue of the New England Journal of Medicine, the BEAM (Bardoxolone Methyl Treatment: Renal Function in CKD/Type 2 Diabetes) trial investigators rekindled new interest, and also some controversy, regarding the concept of renoprotection and the role of renoprotective agents, when they reported significant increases in the mean estimated glomerular filtration rate (eGFR) in diabetic chronic kidney disease (CKD) patients with an eGFR of 20-45 ml/min/1.73 m2 of body surface area at enrollment who received the trial drug bardoxolone methyl versus placebo. Unfortunately, subsequent phase IIIb trials failed to show that the drug is a safe alternative renoprotective agent. Current renoprotection paradigms depend wholly and entirely on angiotensin blockade; however, these agents [angiotensin-converting enzyme (ACE) inhibitors and angiotensin receptor blockers (ARBs)] have proved to be imperfect renoprotective agents. In this review, we examine the mechanistic limitations of the various previous randomized controlled trials on CKD renoprotection, including the paucity of veritable, elaborate and systematic assessment methods for the documentation and reporting of individual patient-level, drug-related adverse events. We review the evidence base for the presence of putative, multiple independent and unrelated pathogenetic mechanisms that drive (diabetic and non-diabetic) CKD progression. Furthermore, we examine the validity, or lack thereof, of the hyped notion that the blockade of a single molecule (angiotensin II), which can only antagonize the angiotensin cascade, would veritably successfully, consistently and unfailingly deliver adequate and qualitative renoprotection results in (diabetic and non-diabetic) CKD patients.
We clearly posit that there is this overarching impetus to arrive at the inference that multiple, disparately diverse and independent pathways, including any veritable combination of the mechanisms that we examine in this review

  17. Ligand and structure-based classification models for Prediction of P-glycoprotein inhibitors

    DEFF Research Database (Denmark)

    Klepsch, Freya; Poongavanam, Vasanthanathan; Ecker, Gerhard Franz

    2014-01-01

    an algorithm based on Euclidean distance. Results show that random forest and SVM performed best for classification of P-gp inhibitors and non-inhibitors, correctly predicting 73/75 % of the external test set compounds. Classification based on the docking experiments using the scoring function Chem...

  18. Tweet-based Target Market Classification Using Ensemble Method

    Directory of Open Access Journals (Sweden)

    Muhammad Adi Khairul Anshary

    2016-09-01

    Full Text Available Target market classification is aimed at focusing marketing activities on the right targets. Classification of target markets can be done through data mining and by utilizing data from social media, e.g. Twitter. The end results of data mining are learning models that can classify new data. Ensemble methods can improve the accuracy of the models and therefore provide better results. In this study, classification of target markets was conducted on a dataset of 3000 tweets from which features were extracted. Classification models were constructed by manipulating the training data using two ensemble methods (bagging and boosting). To investigate the effectiveness of the ensemble methods, this study used the CART (classification and regression tree) algorithm for comparison. Three categories of consumer goods (computers, mobile phones and cameras) and three categories of sentiment (positive, negative and neutral) were classified with respect to three target-market categories. Machine learning was performed using Weka 3.6.9. The results on the test data showed that the bagging method improved the accuracy of CART by 1.9% (to 85.20%). On the other hand, for sentiment classification, the ensemble methods were not successful in increasing the accuracy of CART. The results of this study may be taken into consideration by companies who approach their customers through social media, especially Twitter.
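
    The CART-versus-bagging comparison can be reproduced in miniature on synthetic data; the feature vectors here merely stand in for the extracted tweet features, and the study itself used Weka rather than this toolchain:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

# Sketch: a single CART-style decision tree versus a bagged ensemble of trees
# on synthetic, noisy feature vectors standing in for tweet features.

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 10))
# Label depends on the first three features plus noise.
y = (X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=600) > 0).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

tree_acc = DecisionTreeClassifier(random_state=0).fit(Xtr, ytr).score(Xte, yte)
bag_acc = BaggingClassifier(n_estimators=50, random_state=0).fit(Xtr, ytr).score(Xte, yte)
print(tree_acc, bag_acc)
```

    Bagging trains each tree on a bootstrap resample and votes across them, which typically smooths out the variance of a single deep tree on noisy data.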

  19. Agent-Based Model of Information Security System: Architecture and Formal Framework for Coordinated Intelligent Agents Behavior Specification

    National Research Council Canada - National Science Library

    Gorodetski, Vladimir

    2001-01-01

    The contractor will research and further develop the technology supporting an agent-based architecture for an information security system and a formal framework to specify a model of distributed knowledge...

  20. Research on Remote Sensing Image Classification Based on Feature Level Fusion

    Science.gov (United States)

    Yuan, L.; Zhu, G.

    2018-04-01

    Remote sensing image classification, as an important direction of remote sensing image processing and application, has been widely studied. However, existing classification algorithms still suffer from misclassification and missed detections, so the final classification accuracy is not high. In this paper, we selected Sentinel-1A and Landsat8 OLI images as data sources and propose a classification method based on feature-level fusion. We compare three feature-level fusion algorithms (Gram-Schmidt spectral sharpening, Principal Component Analysis transform and Brovey transform) and then select the best fused image for the classification experiments. In the classification process, we choose four image classification algorithms (Minimum distance, Mahalanobis distance, Support Vector Machine and ISODATA) for a contrast experiment. We use overall classification precision and the Kappa coefficient as the classification accuracy evaluation criteria, and the four classification results of the fused image are analysed. The experimental results show that the fusion effect of Gram-Schmidt spectral sharpening is better than that of the other methods. Among the four classification algorithms, the fused image is most suitable for Support Vector Machine classification, with an overall classification precision of 94.01 % and a Kappa coefficient of 0.91. The image fused from Sentinel-1A and Landsat8 OLI not only has more spatial information and spectral texture characteristics, but also enhances the distinguishing features of the images. The proposed method is beneficial to improving the accuracy and stability of remote sensing image classification.

  1. Agent-based simulation of electricity markets. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Sensfuss, F.; Ragwitz, M. [Fraunhofer-Institut fuer Systemtechnik und Innovationsforschung (ISI), Karlsruhe (Germany); Genoese, M.; Moest, D. [Karlsruhe Univ. (T.H.) (Germany). Inst. fuer Industriebetriebslehre und Industrielle Produktion

    2007-07-01

    Liberalisation, climate policy and promotion of renewable energy are challenges to players of the electricity sector in many countries. Policy makers have to consider issues like market power, bounded rationality of players and the appearance of fluctuating energy sources in order to provide adequate legislation. Furthermore the interactions between markets and environmental policy instruments become an issue of increasing importance. A promising approach for the scientific analysis of these developments is the field of agent-based simulation. The goal of this article is to provide an overview of the current work applying this methodology to the analysis of electricity markets. (orig.)

  2. Agent-based Algorithm for Spatial Distribution of Objects

    KAUST Repository

    Collier, Nathan

    2012-06-02

    In this paper we present an agent-based algorithm for the spatial distribution of objects. The algorithm is a generalization of the bubble mesh algorithm, initially created for the point insertion stage of the meshing process of the finite element method. The bubble mesh algorithm treats objects in space as bubbles, which repel and attract each other. The dynamics of each bubble are approximated by solving a series of ordinary differential equations. We present numerical results for a meshing application as well as a graph visualization application.
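
    A toy one-dimensional version of the bubble dynamics, with a spring-like repulsion below a target spacing and damped explicit Euler integration; the force law and all constants are illustrative choices, not the paper's formulation:

```python
import numpy as np

# Toy 1-D "bubble" relaxation: each object pushes neighbours closer than the
# target spacing d0 apart, and positions are advanced by damped explicit
# Euler steps until the forces die out. Constants are illustrative.

def relax_bubbles(x, d0=1.0, k=1.0, damping=0.8, steps=400, dt=0.05):
    x = np.array(x, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        f = np.zeros_like(x)
        for i in range(len(x)):
            for j in range(len(x)):
                if i == j:
                    continue
                r = x[i] - x[j]
                dist = abs(r)
                if dist < d0:          # too close: repel toward spacing d0
                    f[i] += k * (d0 - dist) * np.sign(r)
        v = damping * (v + dt * f)     # damped velocity update
        x += dt * v
    return np.sort(x)

x = relax_bubbles([0.0, 0.1, 0.2, 1.5])
print(np.diff(x))
```

    In the full algorithm the bubbles live in 2-D or 3-D, the force law also attracts bubbles that drift too far apart, and the resulting point distribution seeds mesh generation or graph layout.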

  3. Web of Data Evolution by Exploiting Agent Based-Argumentation

    OpenAIRE

    Chamekh , Fatma; Boulanger , Danielle; Talens , Guilaine

    2015-01-01

    International audience; Sharing knowledge and data coming from different sources is one of the biggest advantages of linked data. Keeping this knowledge graph up to date must take into account both ontology vocabularies and data, since they should remain consistent. Our general problem is to deal with the evolution of the web of data; in particular, we aim at assisting the user in such a complex process. In this research work, we propose an agent-based argumentation framework to help the user manage linked data changes. We assi...

  4. Intelligence system based classification approach for medical disease diagnosis

    Science.gov (United States)

    Sagir, Abdu Masanawa; Sathasivam, Saratha

    2017-08-01

    The prediction of breast cancer in women who have no signs or symptoms of the disease, as well as of survivability after undergoing certain surgery, has been a challenging problem for medical researchers. The decision about the presence or absence of disease often depends more on the physician's intuition, experience and skill in comparing current indicators with previous ones than on the knowledge-rich data hidden in a database, which makes it a crucial and challenging task. The goal is to predict the patient's condition by using an adaptive neuro-fuzzy inference system (ANFIS) pre-processed by grid partitioning. To achieve an accurate diagnosis at this complex stage of symptom analysis, the physician may need an efficient diagnosis system. A framework describes the methodology for designing and evaluating the classification performance of two discrete ANFIS systems with hybrid learning algorithms (least-squares estimation combined with Modified Levenberg-Marquardt and with gradient descent) that can be used by physicians to accelerate the diagnosis process. The proposed method's performance was evaluated on training and test datasets from the mammographic mass and Haberman's survival datasets obtained from the benchmark datasets of the University of California at Irvine (UCI) machine learning repository. The robustness of the performance, measuring total accuracy, sensitivity and specificity, is examined. In comparison, the proposed method achieves superior performance when compared to the conventional ANFIS based on the gradient descent algorithm and some related existing methods. The software used for the implementation is MATLAB R2014a (version 8.3), executed on a PC with an Intel Pentium IV E7400 processor with 2.80 GHz speed and 2.0 GB of RAM.

  5. Identification of walking human model using agent-based modelling

    Science.gov (United States)

    Shahabpoor, Erfan; Pavic, Aleksandar; Racic, Vitomir

    2018-03-01

    The interaction of walking people with large vibrating structures, such as footbridges and floors, in the vertical direction is an important yet challenging phenomenon to describe mathematically. Several different models have been proposed in the literature to simulate the interaction of stationary people with vibrating structures. However, research on moving (walking) human models, explicitly identified for vibration serviceability assessment of civil structures, is still sparse. In this study, the results of a comprehensive set of FRF-based modal tests were used, in which over a hundred test subjects walked in different group sizes and walking patterns on a test structure. An agent-based model was used to simulate discrete traffic-structure interactions. The modal parameters of the occupied structure found in the tests were used to identify the parameters of the walking individual's single-degree-of-freedom (SDOF) mass-spring-damper model using a 'reverse engineering' methodology. The analysis of the results suggested that a normal distribution with an average of μ = 2.85 Hz and a standard deviation of σ = 0.34 Hz can describe the natural frequency of the human SDOF model. Similarly, a normal distribution with μ = 0.295 and σ = 0.047 can describe the damping ratio of the human model. Compared to previous studies, the agent-based modelling methodology proposed in this paper offers significant flexibility in simulating multi-pedestrian walking traffic and external forces, and in simulating different mechanisms of human-structure and human-environment interaction at the same time.
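
    Sampling per-pedestrian SDOF parameters from the identified distributions, and converting them to stiffness and damping, can be sketched as below; the body mass value is an assumption of this sketch, not a figure from the study:

```python
import numpy as np

# Draw per-pedestrian SDOF mass-spring-damper parameters from the normal
# distributions identified in the study (f ~ N(2.85, 0.34) Hz for natural
# frequency, zeta ~ N(0.295, 0.047) for damping ratio), then convert to
# stiffness k and damping c for an assumed body mass m (assumption: 75 kg).

rng = np.random.default_rng(1)

def sample_pedestrian(m=75.0):
    f = rng.normal(2.85, 0.34)          # natural frequency [Hz]
    zeta = rng.normal(0.295, 0.047)     # damping ratio [-]
    k = m * (2 * np.pi * f) ** 2        # stiffness [N/m]
    c = 2 * zeta * np.sqrt(k * m)       # damping coefficient [N s/m]
    return f, zeta, k, c

fs = np.array([sample_pedestrian()[0] for _ in range(2000)])
print(fs.mean(), fs.std())
```

    An agent-based simulation would attach one such sampled SDOF system to each walking agent crossing the structure, so the population statistics of the traffic match the identified distributions.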

  6. Hierarchical structure for audio-video based semantic classification of sports video sequences

    Science.gov (United States)

    Kolekar, M. H.; Sengupta, S.

    2005-07-01

    A hierarchical structure for sports event classification based on audio and video content analysis is proposed in this paper. Compared to the event classifications in other games, those of cricket are very challenging and yet unexplored. We have successfully solved cricket video classification problem using a six level hierarchical structure. The first level performs event detection based on audio energy and Zero Crossing Rate (ZCR) of short-time audio signal. In the subsequent levels, we classify the events based on video features using a Hidden Markov Model implemented through Dynamic Programming (HMM-DP) using color or motion as a likelihood function. For some of the game-specific decisions, a rule-based classification is also performed. Our proposed hierarchical structure can easily be applied to any other sports. Our results are very promising and we have moved a step forward towards addressing semantic classification problems in general.

  7. Router Agent Technology for Policy-Based Network Management

    Science.gov (United States)

    Chow, Edward T.; Sudhir, Gurusham; Chang, Hsin-Ping; James, Mark; Liu, Yih-Chiao J.; Chiang, Winston

    2011-01-01

    This innovation can be run as a standalone network application on any computer in a networked environment. This design can be configured to control one or more routers (one instance per router), and can also be configured to listen to a policy server over the network to receive new policies based on the policy-based network management technology. The Router Agent Technology transforms the received policies into suitable Access Control List syntax for the routers it is configured to control. It commits the newly generated access control lists to the routers and provides feedback regarding any errors that were encountered. The innovation also automatically generates a time-stamped log file recording all updates to the router it is configured to control. This technology, once installed on a local network computer and started, is autonomous because it has the capability to keep listening for new policies from the policy server, transforming those policies into router-compliant access lists, and committing those access lists to a specified interface on the specified router on the network, with error feedback on the commitment process. The stand-alone application is named RouterAgent and is currently realized as a fully functional (version 1) implementation for the Windows operating system and for CISCO routers.
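
    The policy-to-ACL transformation step can be illustrated as below; the policy field names and the simplified Cisco-style syntax are assumptions for the sketch, not RouterAgent's real data formats:

```python
# Hypothetical sketch of the policy-to-ACL step: a high-level policy record
# is rendered as a Cisco-style extended access-list entry. The policy fields
# and the simplified ACL grammar are illustrative assumptions only.

def policy_to_acl(policy, acl_id=101):
    """Render one policy record as an access-list line."""
    action = "permit" if policy["allow"] else "deny"
    return (f"access-list {acl_id} {action} {policy['protocol']} "
            f"{policy['src']} {policy['dst']} eq {policy['port']}")

policy = {"allow": False, "protocol": "tcp",
          "src": "10.0.0.0 0.0.0.255", "dst": "any", "port": 23}
print(policy_to_acl(policy))
```

    A real agent would additionally validate the generated lines, commit them to the router's configured interface, and log the outcome with a timestamp, as the record describes.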

  8. An agent-based approach to financial stylized facts

    Science.gov (United States)

    Shimokawa, Tetsuya; Suzuki, Kyoko; Misawa, Tadanobu

    2007-06-01

    An important challenge for financial theory in recent years is to construct more sophisticated models that are consistent with as many as possible of the financial stylized facts that cannot be explained by traditional models. Recently, psychological studies on decision making under uncertainty, which originate in Kahneman and Tversky's research, have attracted a lot of interest as key factors for explaining the financial stylized facts. These psychological results have been applied to the theory of investors' decision making and to financial equilibrium modeling. This paper, following these behavioral finance studies, proposes an agent-based equilibrium model with prospect-theoretic features of investors. Our goal is to point out the possibility that the loss-averse feature of investors explains a vast number of financial stylized facts and plays a crucial role in the price formation of financial markets. The price process endogenously generated through our model is consistent with not only the equity premium puzzle and the volatility puzzle, but also excess kurtosis, asymmetry of the return distribution, auto-correlation of return volatility, and cross-correlation between return volatility and trading volume. Moreover, by using agent-based simulations, the paper also provides a rigorous explanation, from the viewpoint of a lack of market liquidity, of the size effect, i.e. that small-sized stocks enjoy excess returns compared to large-sized stocks.

  9. Strengthening Theoretical Testing in Criminology Using Agent-based Modeling.

    Science.gov (United States)

    Johnson, Shane D; Groff, Elizabeth R

    2014-07-01

    The Journal of Research in Crime and Delinquency (JRCD) has published important contributions to both criminological theory and associated empirical tests. In this article, we consider some of the challenges associated with traditional approaches to social science research, and discuss a complementary approach that is gaining popularity, agent-based computational modeling, which may offer new opportunities to strengthen theories of crime and develop insights into phenomena of interest. Two literature reviews are completed. The aim of the first is to identify those articles published in JRCD that have been the most influential and to classify the theoretical perspectives taken. The second is intended to identify those studies that have used an agent-based model (ABM) to examine criminological theories and to identify which theories have been explored. Ecological theories of crime pattern formation have received the most attention from researchers using ABMs, but many other criminological theories are amenable to testing using such methods. Traditional methods of theory development and testing suffer from a number of potential issues that a more systematic use of ABMs, not without its own issues, may help to overcome. ABMs should become another method in the criminologist's toolbox to aid theory testing and falsification.

  10. SPAM CLASSIFICATION BASED ON SUPERVISED LEARNING USING MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Hamsapriya

    2011-12-01

    Full Text Available E-mail is one of the most popular and frequently used ways of communication due to its worldwide accessibility, relatively fast message transfer, and low sending cost. The flaws in the e-mail protocols and the increasing amount of electronic business and financial transactions directly contribute to the increase in e-mail-based threats. Email spam is one of the major problems of today's Internet, bringing financial damage to companies and annoying individual users. Spam emails invade users' mail boxes without their consent. They consume network capacity as well as time spent checking and deleting spam mails. The vast majority of Internet users are outspoken in their disdain for spam, although enough of them respond to commercial offers that spam remains a viable source of income to spammers. While most users want to do the right thing to avoid and get rid of spam, they need clear and simple guidelines on how to behave. In spite of all the measures taken to eliminate spam, it has not yet been eradicated, and when the countermeasures are over-sensitive, even legitimate emails are eliminated. Among the approaches developed to stop spam, filtering is one of the most important techniques. Much research in spam filtering has centered on more sophisticated classifier-related issues. In recent years, machine learning for spam classification has become an important research issue. This work explores and identifies the use of different learning algorithms for classifying spam messages from e-mail, and a comparative analysis among the algorithms is also presented.
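
    One of the classic supervised learners compared in this line of work, multinomial Naive Bayes, can be written from scratch in a few lines; the corpus below is a toy stand-in for a real labelled e-mail dataset:

```python
import math
from collections import Counter

# Minimal from-scratch multinomial Naive Bayes spam filter with Laplace
# smoothing, trained on a toy labelled corpus.

train = [("win money now", "spam"), ("free prize win", "spam"),
         ("meeting agenda today", "ham"), ("project meeting notes", "ham")]

counts = {"spam": Counter(), "ham": Counter()}
docs = Counter()
for text, label in train:
    docs[label] += 1
    counts[label].update(text.split())

vocab = {w for c in counts.values() for w in c}

def classify(text):
    """Pick the class with the highest log posterior for the message."""
    scores = {}
    for label in counts:
        score = math.log(docs[label] / sum(docs.values()))   # log prior
        total = sum(counts[label].values())
        for w in text.split():
            # Laplace-smoothed log likelihood of each token
            score += math.log((counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("win free money"))
```

    Real spam filters of this family work the same way, only with far larger vocabularies, token preprocessing, and thresholds tuned to keep legitimate mail from being discarded.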

  11. Classification of urine sediment based on convolution neural network

    Science.gov (United States)

    Pan, Jingjing; Jiang, Cunbo; Zhu, Tiantian

    2018-04-01

    By designing a new convolution neural network framework, this paper removes the constraints of the original convolution neural network framework, which requires large training samples of the same size. The input images are shifted and cropped to generate sub-images of identical size. Dropout is then applied to the generated sub-images, increasing the diversity of samples and preventing overfitting. Proper subsets of the sub-image set are selected at random, ensuring that the subsets have the same number of elements but are not identical. These subsets serve as input layers for the convolution neural network. Through the convolution layers, pooling, the fully connected layer and the output layer, the classification loss rates of the test and training sets are obtained. In an experiment classifying red blood cells, white blood cells and calcium oxalate crystals, the classification accuracy reached 97% or more.
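    The shift-and-crop step described above can be sketched independently of any deep learning framework: slide a fixed-size window over the image with a stride and collect equally sized sub-images. The window size and stride below are illustrative, not the paper's values.

```python
def crop_subimages(image, size, stride):
    """Slide a size x size window over a 2D image (list of lists)
    with the given stride and collect equally sized sub-images."""
    h, w = len(image), len(image[0])
    subs = []
    for top in range(0, h - size + 1, stride):
        for left in range(0, w - size + 1, stride):
            subs.append([row[left:left + size] for row in image[top:top + size]])
    return subs

# A 4x4 toy "image": 2x2 crops with stride 2 give four non-overlapping tiles.
img = [[r * 4 + c for c in range(4)] for r in range(4)]
tiles = crop_subimages(img, size=2, stride=2)
print(len(tiles))   # 4
print(tiles[0])     # [[0, 1], [4, 5]]
```

    Random proper subsets of `tiles` would then be drawn as the network's input batches, as the abstract describes.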

  12. Data Clustering and Evolving Fuzzy Decision Tree for Data Base Classification Problems

    Science.gov (United States)

    Chang, Pei-Chann; Fan, Chin-Yuan; Wang, Yen-Wen

    Data base classification suffers from two well known difficulties, i.e., the high dimensionality and non-stationary variations within the large historic data. This paper presents a hybrid classification model integrating a case based reasoning technique, a Fuzzy Decision Tree (FDT), and Genetic Algorithms (GA) to construct a decision-making system for data classification in various data base applications. The model is mainly based on the idea that the historic data base can be transformed into a smaller case-base together with a group of fuzzy decision rules. As a result, the model can respond more accurately to the current data under classification through the inductions of these smaller case-based fuzzy decision trees. Hit rate is applied as a performance measure, and the effectiveness of our proposed model is demonstrated by experimental comparison with other approaches on different data base classification applications. The average hit rate of our proposed model is the highest among the others.

  13. Data classification based on the hybrid intellectual technology

    Directory of Open Access Journals (Sweden)

    Demidova Liliya

    2018-01-01

    Full Text Available In this paper a data classification technique implying the consistent application of the SVM and Parzen classifiers has been suggested. The Parzen classifier is applied to data which can be both correctly and erroneously classified using the SVM classifier, and which are located in experimentally defined subareas near the hyperplane separating the classes. Herewith, the SVM classifier is used with the default parameter values, and the optimal parameter values of the Parzen classifier are determined using a genetic algorithm. Experimental results confirming the effectiveness of the proposed hybrid intellectual data classification technology have been presented.
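    The two-stage idea can be sketched as follows: samples whose decision value falls inside a band around the separating hyperplane are deferred to a Parzen-window (Gaussian kernel density) classifier. In this sketch a fixed linear decision function stands in for the trained SVM, and all data, the bandwidth `h` and the band width `tau` are illustrative assumptions.

```python
import math

def parzen_score(x, points, h=0.5):
    """Parzen-window density estimate at 2D point x with a Gaussian kernel."""
    return sum(math.exp(-((x[0] - p[0]) ** 2 + (x[1] - p[1]) ** 2) / (2 * h * h))
               for p in points) / len(points)

def hybrid_classify(x, w, b, class_points, tau=1.0):
    """Linear decision f(x) = w.x + b stands in for the SVM; |f(x)| < tau
    means x lies in the uncertain band near the hyperplane, so the
    Parzen densities of the two classes decide instead."""
    f = w[0] * x[0] + w[1] * x[1] + b
    if abs(f) >= tau:
        return 1 if f > 0 else -1
    dens_pos = parzen_score(x, class_points[1])
    dens_neg = parzen_score(x, class_points[-1])
    return 1 if dens_pos >= dens_neg else -1

# Illustrative data: class +1 clusters around (2, 2), class -1 around (-2, -2).
pts = {1: [(2, 2), (2.5, 1.5), (1.5, 2.5)],
       -1: [(-2, -2), (-2.5, -1.5), (-1.5, -2.5)]}
w, b = (1.0, 1.0), 0.0                          # hyperplane x + y = 0
print(hybrid_classify((3, 3), w, b, pts))       # far from boundary -> 1
print(hybrid_classify((0.1, 0.1), w, b, pts))   # near boundary -> Parzen decides: 1
```

    In the paper the band boundaries and the Parzen parameters are found experimentally and by a genetic algorithm, respectively; here they are fixed constants for clarity.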

  14. Woven fabric defects detection based on texture classification algorithm

    International Nuclear Information System (INIS)

    Ben Salem, Y.; Nasri, S.

    2011-01-01

    In this paper we have compared two well-known methods in texture classification to solve the problem of recognition and classification of defects occurring in textile manufacture. We have compared the local binary patterns (LBP) method with the co-occurrence matrix. The classifier used is the support vector machine (SVM). The system has been tested using the TILDA database. The results obtained are interesting and show that LBP is a good method for the problems of recognition and classification of defects; it also gives a good running time, especially for real-time applications.
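    The basic LBP operator compares each pixel with its eight neighbours and encodes the comparisons as an 8-bit code; a histogram of codes over the image is the texture descriptor fed to the classifier. A minimal sketch (bit ordering is one common convention among several):

```python
def lbp_code(image, r, c):
    """8-neighbour local binary pattern at pixel (r, c): each neighbour
    >= centre contributes one bit, clockwise from the top-left neighbour."""
    centre = image[r][c]
    neighbours = [image[r-1][c-1], image[r-1][c], image[r-1][c+1],
                  image[r][c+1], image[r+1][c+1], image[r+1][c],
                  image[r+1][c-1], image[r][c-1]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= centre:
            code |= 1 << bit
    return code

def lbp_histogram(image):
    """256-bin histogram of LBP codes over all interior pixels;
    this histogram is the texture feature vector."""
    hist = [0] * 256
    for r in range(1, len(image) - 1):
        for c in range(1, len(image[0]) - 1):
            hist[lbp_code(image, r, c)] += 1
    return hist

img = [[10, 10, 10],
       [10, 20, 10],
       [10, 10, 10]]
print(lbp_code(img, 1, 1))   # all neighbours below the centre -> 0
```

    The resulting histograms (one per image patch) would then be classified with an SVM, as in the paper.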

  15. Classification of Gait Types Based on the Duty-factor

    DEFF Research Database (Denmark)

    Fihl, Preben; Moeslund, Thomas B.

    2007-01-01

    on the speed of the human, the camera setup etc. and hence a robust descriptor for gait classification. The duty-factor is basically a matter of measuring the ground support of the feet with respect to the stride. We estimate this by comparing the incoming silhouettes to a database of silhouettes with known...... ground support. Silhouettes are extracted using the Codebook method and represented using Shape Contexts. The matching with database silhouettes is done using the Hungarian method. While manually estimated duty-factors show a clear classification, the presented system contains misclassifications due...

  16. SVM-based Partial Discharge Pattern Classification for GIS

    Science.gov (United States)

    Ling, Yin; Bai, Demeng; Wang, Menglin; Gong, Xiaojin; Gu, Chao

    2018-01-01

    Partial discharges (PD) occur when there are localized dielectric breakdowns in small regions of gas insulated substations (GIS). It is of high importance to recognize PD patterns, through which we can diagnose the defects caused by different sources so that predictive maintenance can be conducted to prevent unplanned power outages. In this paper, we propose an approach to perform partial discharge pattern classification. It first recovers the PRPD matrices from the PRPD2D images; then statistical features are extracted from the recovered PRPD matrices and fed into an SVM for classification. Experiments conducted on a dataset containing thousands of images demonstrate the high effectiveness of the method.
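    The abstract does not list which statistical features are used; common choices for phase-resolved distributions include the mean, standard deviation, skewness and kurtosis of each phase window. A sketch under that assumption:

```python
import math

def stat_features(values):
    """Mean, standard deviation, skewness and excess kurtosis of a 1-D
    distribution, e.g. one phase window of a PRPD matrix."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = math.sqrt(var)
    if std == 0:
        return mean, 0.0, 0.0, 0.0
    skew = sum((v - mean) ** 3 for v in values) / (n * std ** 3)
    kurt = sum((v - mean) ** 4 for v in values) / (n * std ** 4) - 3.0
    return mean, std, skew, kurt

# One illustrative phase window of discharge magnitudes.
window = [0, 1, 2, 3, 4]
mean, std, skew, kurt = stat_features(window)
print(mean)   # 2.0
print(skew)   # 0.0 (symmetric distribution)
```

    Concatenating such features across phase windows yields the fixed-length vector an SVM can consume.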

  17. Multispace Behavioral Model for Face-Based Affective Social Agents

    Directory of Open Access Journals (Sweden)

    DiPaola Steve

    2007-01-01

    Full Text Available This paper describes a behavioral model for affective social agents based on three independent but interacting parameter spaces: knowledge, personality, and mood. These spaces control a lower-level geometry space that provides parameters at the facial feature level. Personality and mood use findings in behavioral psychology to relate the perception of personality types and emotional states to the facial actions and expressions through two-dimensional models for personality and emotion. Knowledge encapsulates the tasks to be performed and the decision-making process using a specially designed XML-based language. While the geometry space provides an MPEG-4 compatible set of parameters for low-level control, the behavioral extensions available through the triple spaces provide flexible means of designing complicated personality types, facial expression, and dynamic interactive scenarios.

  18. Agent Based Modeling on Organizational Dynamics of Terrorist Network

    Directory of Open Access Journals (Sweden)

    Bo Li

    2015-01-01

    Full Text Available Modeling the organizational dynamics of a terrorist network is a critical issue in the computational analysis of terrorism research. The first step for effective counterterrorism and strategic intervention is to investigate how the terrorists operate within the relational network and what affects its performance. In this paper, we investigate the organizational dynamics by employing a computational experimentation methodology. The hierarchical cellular network model and the organizational dynamics model are developed for modeling the hybrid relational structure and complex operational processes, respectively. To intuitively elucidate this method, agent based modeling is used to simulate the terrorist network and test its performance in diverse scenarios. Based on the experimental results, we show how changes in operational environments affect the development of a terrorist organization in terms of its recovery and capacity to perform future tasks. The potential strategies are also discussed, which can be used to restrain the activities of terrorists.

  19. Multispace Behavioral Model for Face-Based Affective Social Agents

    Directory of Open Access Journals (Sweden)

    Ali Arya

    2007-03-01

    Full Text Available This paper describes a behavioral model for affective social agents based on three independent but interacting parameter spaces: knowledge, personality, and mood. These spaces control a lower-level geometry space that provides parameters at the facial feature level. Personality and mood use findings in behavioral psychology to relate the perception of personality types and emotional states to the facial actions and expressions through two-dimensional models for personality and emotion. Knowledge encapsulates the tasks to be performed and the decision-making process using a specially designed XML-based language. While the geometry space provides an MPEG-4 compatible set of parameters for low-level control, the behavioral extensions available through the triple spaces provide flexible means of designing complicated personality types, facial expression, and dynamic interactive scenarios.

  20. Ensemble Classification of Data Streams Based on Attribute Reduction and a Sliding Window

    Directory of Open Access Journals (Sweden)

    Yingchun Chen

    2018-04-01

    Full Text Available With the current increasing volume and dimensionality of data, traditional data classification algorithms are unable to satisfy the demands of practical classification applications of data streams. To deal with noise and concept drift in data streams, we propose an ensemble classification algorithm based on attribute reduction and a sliding window in this paper. Using mutual information, an approximate attribute reduction algorithm based on rough sets is used to reduce data dimensionality and increase the diversity of reduced results in the algorithm. A double-threshold concept drift detection method and a three-stage sliding window control strategy are introduced to improve the performance of the algorithm when dealing with both noise and concept drift. The classification precision is further improved by updating the base classifiers and their nonlinear weights. Experiments on synthetic datasets and actual datasets demonstrate the performance of the algorithm in terms of classification precision, memory use, and time efficiency.
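    The nonlinear weighting of base classifiers over a sliding window can be sketched as follows. Each base classifier's weight is its recent accuracy raised to a power; squaring, the window length, and the stub classifiers are all illustrative assumptions, not the paper's exact scheme.

```python
from collections import deque

class WindowedEnsemble:
    """Weighted-vote ensemble: each base classifier's weight is its
    squared accuracy over a sliding window of recent labelled samples."""
    def __init__(self, classifiers, window=50):
        self.classifiers = classifiers
        self.history = [deque(maxlen=window) for _ in classifiers]

    def update(self, x, y):
        # Record, per classifier, whether it got the latest labelled sample right.
        for hist, clf in zip(self.history, self.classifiers):
            hist.append(1 if clf(x) == y else 0)

    def weight(self, i):
        hist = self.history[i]
        acc = sum(hist) / len(hist) if hist else 0.5
        return acc ** 2          # nonlinear (squared) weighting

    def predict(self, x):
        votes = {}
        for i, clf in enumerate(self.classifiers):
            label = clf(x)
            votes[label] = votes.get(label, 0) + self.weight(i)
        return max(votes, key=votes.get)

# Two stub base classifiers on a parity stream: one always right, one always 0.
good = lambda x: x % 2
bad = lambda x: 0
ens = WindowedEnsemble([good, bad], window=10)
for x in range(10):
    ens.update(x, x % 2)        # stream of labelled samples
print(ens.predict(3))           # the accurate classifier dominates -> 1
```

    In the full algorithm the window also drives drift detection and classifier replacement; here only the weighting mechanism is shown.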

  1. An implementation of norm-based agent negotiation.

    NARCIS (Netherlands)

    Dijkstra, Pieter; Prakken, H.; Vey Mestdagh, C.N.J. de

    2007-01-01

    In this paper, we develop our previous outline of a multi-agent architecture for regulated information exchange in crime investigations. Interactions about information exchange between agents (representing police officers) are further analysed as negotiation dialogues with embedded persuasion

  2. An application-based classification to understand buyer-seller interaction in business services

    NARCIS (Netherlands)

    Valk, van der W.; Wynstra, J.Y.F.; Axelsson, B.

    2006-01-01

    Abstract: Purpose – Most existing classifications of business services have taken the perspective of the supplier as opposed to that of the buyer. To address this imbalance, the purpose of this paper is to propose a classification of business services based on how the buying company applies the

  3. Initial steps towards an evidence-based classification system for golfers with a physical impairment

    NARCIS (Netherlands)

    Stoter, Inge K.; Hettinga, Florentina J.; Altmann, Viola; Eisma, Wim; Arendzen, Hans; Bennett, Tony; van der Woude, Lucas H.; Dekker, Rienk

    2017-01-01

    Purpose: The present narrative review aims to make a first step towards an evidence-based classification system in handigolf following the International Paralympic Committee (IPC). It intends to create a conceptual framework of classification for handigolf and an agenda for future research. Method:

  4. Vision-Based Perception and Classification of Mosquitoes Using Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Masataka Fuchida

    2017-01-01

    Full Text Available The need for a novel automated mosquito perception and classification method has become increasingly essential in recent years, with a steeply increasing number of mosquito-borne diseases and associated casualties. There exist remote sensing and GIS-based methods for mapping potential mosquito habitats and locations that are prone to mosquito-borne diseases, but these methods generally do not account for species-wise identification of mosquitoes in closed-perimeter regions. Traditional methods for mosquito classification involve highly manual processes requiring tedious sample collection and supervised laboratory analysis. In this research work, we present the design and experimental validation of an automated vision-based mosquito classification module that can be deployed in closed-perimeter mosquito habitats. The module is capable of identifying mosquitoes among other bugs such as bees and flies by extracting morphological features, followed by support vector machine-based classification. In addition, this paper presents the results of three variants of the support vector machine classifier in the context of the mosquito classification problem. This vision-based approach presents an efficient alternative to the conventional methods for mosquito surveillance, mapping and sample image collection. Experimental results involving classification between mosquitoes and a predefined set of other bugs using multiple classification strategies demonstrate the efficacy and validity of the proposed approach with a maximum recall of 98%.

  5. Modern approaches to agent-based complex automated negotiation

    CERN Document Server

    Bai, Quan; Ito, Takayuki; Zhang, Minjie; Ren, Fenghui; Aydoğan, Reyhan; Hadfi, Rafik

    2017-01-01

    This book addresses several important aspects of complex automated negotiations and introduces a number of modern approaches for facilitating agents to conduct complex negotiations. It demonstrates that autonomous negotiation is one of the most important areas in the field of autonomous agents and multi-agent systems. Further, it presents complex automated negotiation scenarios that involve negotiation encounters that may have, for instance, a large number of agents, a large number of issues with strong interdependencies and/or real-time constraints.

  6. Agent-Based Decentralized Control Method for Islanded Microgrids

    DEFF Research Database (Denmark)

    Li, Qiang; Chen, Feixiong; Chen, Minyou

    2016-01-01

    as a local control processor together with communication devices, so agents can collect present states of distributed generators and loads, when communication lines are added between two layers. Moreover, each agent can also exchange information with its neighboring agents of the network. After information...

  7. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  8. Multi-label literature classification based on the Gene Ontology graph

    Directory of Open Access Journals (Sweden)

    Lu Xinghua

    2008-12-01

    Full Text Available Abstract Background The Gene Ontology is a controlled vocabulary for representing knowledge related to genes and proteins in a computable form. The current effort of manually annotating proteins with the Gene Ontology is outpaced by the rate of accumulation of biomedical knowledge in literature, which urges the development of text mining approaches to facilitate the process by automatically extracting the Gene Ontology annotation from literature. The task is usually cast as a text classification problem, and contemporary methods are confronted with unbalanced training data and the difficulties associated with multi-label classification. Results In this research, we investigated the methods of enhancing automatic multi-label classification of biomedical literature by utilizing the structure of the Gene Ontology graph. We have studied three graph-based multi-label classification algorithms, including a novel stochastic algorithm and two top-down hierarchical classification methods for multi-label literature classification. We systematically evaluated and compared these graph-based classification algorithms to a conventional flat multi-label algorithm. The results indicate that, through utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods can significantly improve predictions of the Gene Ontology terms implied by the analyzed text. Furthermore, the graph-based multi-label classifiers are capable of suggesting Gene Ontology annotations (to curators that are closely related to the true annotations even if they fail to predict the true ones directly. A software package implementing the studied algorithms is available for the research community. 
Conclusion: Through utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods have better potential than the conventional flat multi-label classification approach to facilitate
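    A top-down hierarchical method of the kind studied can be sketched generically: a term's classifier is consulted only if its parent term was predicted, so the predicted label set stays consistent with the ontology graph. The toy DAG and the stub per-term classifier below are illustrative, not the paper's models.

```python
def topdown_predict(dag, root, node_clf, doc):
    """Top-down multi-label prediction over a term DAG: a child term is
    tested only if its parent was predicted, keeping labels consistent
    with the hierarchy. node_clf(term, doc) -> bool is a per-term classifier."""
    predicted = set()
    frontier = [root]
    while frontier:
        term = frontier.pop()
        if node_clf(term, doc):
            predicted.add(term)
            frontier.extend(dag.get(term, []))
    return predicted

# Toy ontology fragment (term -> children) and a stub per-term classifier.
dag = {"GO:root": ["GO:a", "GO:b"], "GO:a": ["GO:a1"]}
positive = {"GO:root", "GO:a", "GO:a1", "GO:b"}
clf = lambda term, doc: term in positive and term != "GO:b"
print(sorted(topdown_predict(dag, "GO:root", clf, doc="abstract text")))
# GO:b is rejected, so only the GO:a branch survives
```

    Because a rejected parent prunes its whole subtree, top-down methods trade some recall for hierarchy-consistent predictions, which is the property the paper exploits.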

  9. Colour based off-road environment and terrain type classification

    NARCIS (Netherlands)

    Jansen, P.; Mark, W. van der; Heuvel, J.C. van den; Groen, F.C.A.

    2005-01-01

    Terrain classification is an important problem that still remains to be solved for off-road autonomous robot vehicle guidance. Often, obstacle detection systems are used which cannot distinguish between solid obstacles such as rocks or soft obstacles such as tall patches of grass. Terrain

  10. Emotion of Physiological Signals Classification Based on TS Feature Selection

    Institute of Scientific and Technical Information of China (English)

    Wang Yujing; Mo Jianlin

    2015-01-01

    This paper proposes a TS-MLP method for emotion recognition from physiological signals. Emotions are recognized by Tabu search, which selects features of the emotional physiological signals, and a multilayer perceptron, which classifies the emotions. Simulations show that the method achieves good emotion classification performance.
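    The Tabu search component can be sketched as a subset search that flips one feature per step and forbids recently flipped features. In the real system the evaluation function would be MLP classification accuracy; the toy objective below is an illustrative stand-in.

```python
def tabu_select(n_features, evaluate, iters=50, tabu_len=5):
    """Tabu search over feature subsets: flip one feature per step,
    forbid recently flipped features, keep the best subset seen."""
    current = frozenset()                  # start with no features selected
    best, best_score = current, evaluate(current)
    tabu = []
    for _ in range(iters):
        moves = []
        for f in range(n_features):
            if f in tabu:
                continue
            neigh = current ^ {f}          # flip feature f in or out
            moves.append((evaluate(neigh), f, neigh))
        if not moves:
            break                          # every move is tabu
        score, f, current = max(moves)
        tabu.append(f)
        if len(tabu) > tabu_len:
            tabu.pop(0)
        if score > best_score:
            best, best_score = current, score
    return set(best)

# Toy objective: features 0 and 2 are informative, the rest add a noise cost.
useful = {0, 2}
evaluate = lambda s: len(s & useful) - 0.5 * len(s - useful)
print(sorted(tabu_select(4, evaluate)))   # [0, 2]
```

    Replacing `evaluate` with cross-validated MLP accuracy on the candidate feature subset recovers the TS-MLP pipeline the abstract describes.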

  11. A vegetation-based hierarchical classification for seasonally pulsed ...

    African Journals Online (AJOL)

    A classification scheme is presented for seasonal floodplains of the Boro-Xudum distributary of the Okavango Delta, Botswana. This distributary is subject to an annual flood-pulse, the inundated area varying from a mean low of 3 600 km2 to a mean high of 5 400 km2 between 2000 and 2006. A stratified random sample of ...

  12. A Classification System for Hospital-Based Infection Outbreaks

    Directory of Open Access Journals (Sweden)

    Paul S. Ganney

    2010-01-01

    Full Text Available Outbreaks of infection within semi-closed environments such as hospitals, whether inherent in the environment (such as Clostridium difficile (C.Diff) or Methicillin-resistant Staphylococcus aureus (MRSA)) or imported from the wider community (such as Norwalk-like viruses (NLVs)), are difficult to manage. As part of our work on modelling such outbreaks, we have developed a classification system to describe the impact of a particular outbreak upon an organization. This classification system may then be used in comparing appropriate computer models to real outbreaks, as well as in comparing different real outbreaks in, for example, the comparison of differing management and containment techniques and strategies. Data from NLV outbreaks in the Hull and East Yorkshire Hospitals NHS Trust (the Trust) over several previous years are analysed and classified, both for infection within staff (where the end of infection date may not be known) and within patients (where it generally is known). A classification system consisting of seven elements is described, along with a goodness-of-fit method for comparing a new classification to previously known ones, for use in evaluating a simulation against history and thereby determining how ‘realistic’ (or otherwise) it is.

  13. A classification system for hospital-based infection outbreaks.

    Science.gov (United States)

    Ganney, Paul S; Madeo, Maurice; Phillips, Roger

    2010-12-01

    Outbreaks of infection within semi-closed environments such as hospitals, whether inherent in the environment (such as Clostridium difficile (C.Diff) or Methicillin-resistant Staphylococcus aureus (MRSA) or imported from the wider community (such as Norwalk-like viruses (NLVs)), are difficult to manage. As part of our work on modelling such outbreaks, we have developed a classification system to describe the impact of a particular outbreak upon an organization. This classification system may then be used in comparing appropriate computer models to real outbreaks, as well as in comparing different real outbreaks in, for example, the comparison of differing management and containment techniques and strategies. Data from NLV outbreaks in the Hull and East Yorkshire Hospitals NHS Trust (the Trust) over several previous years are analysed and classified, both for infection within staff (where the end of infection date may not be known) and within patients (where it generally is known). A classification system consisting of seven elements is described, along with a goodness-of-fit method for comparing a new classification to previously known ones, for use in evaluating a simulation against history and thereby determining how 'realistic' (or otherwise) it is.

  14. Agent-based modelling of heating system adoption in Norway

    Energy Technology Data Exchange (ETDEWEB)

    Sopha, Bertha Maya; Kloeckner, Christian A.; Hertwich, Edgar G.

    2010-07-01

    Full text: This paper introduces agent-based modelling as a methodological approach to understand the effect of decision-making mechanisms on the adoption of heating systems in Norway. The model is used as an experimental/learning tool to design possible interventions, not for prediction. The intended users of the model are therefore policy designers. Electric heating, heat pumps and wood pellet heating were selected as the primary heating system adoptions under study. A random topology was chosen to represent the social network among households. Agents were households with a certain location, number of peers, current adopted heating system, employed decision strategy, and degree of social influence in decision making. The overall framework of decision-making integrated theories from different disciplines (customer behavior theory, behavioral economics, the theory of planned behavior, and diffusion of innovation) in order to capture possible decision-making processes in households. A mail survey of 270 Norwegian households conducted in 2008 was designed specifically for acquiring data for the simulation. The model represents a real geographic area of households and simulates the overall fraction of adopted heating systems under study. The model was calibrated with historical data from Statistics Norway (SSB). Interventions with respect to total cost, norms, indoor air quality, reliability, supply security, and required work could be explored using the model. For instance, the model demonstrates that a considerable increase in the total cost (investment and operating cost) of electric heating and heat pumps, rather than a reduction of wood pellet heating's total cost, is required to initiate and speed up wood pellet adoption. (Author)
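    The core mechanism, households adopting under social influence from peers in a random network, can be sketched as a simple threshold model. All parameters below (network degree, threshold, seed count) are illustrative defaults, not the calibrated Norwegian model.

```python
import random

def simulate_adoption(n=200, k=4, threshold=0.3, seeds=10, steps=20, rng_seed=1):
    """Threshold adoption on a random network: a household adopts the new
    heating system once the adopting fraction among its k peers exceeds
    `threshold`. Returns the final adopter fraction."""
    rng = random.Random(rng_seed)
    # Each household gets k randomly chosen peers (random topology).
    peers = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    adopted = [False] * n
    for i in rng.sample(range(n), seeds):   # initial adopters
        adopted[i] = True
    for _ in range(steps):
        nxt = adopted[:]
        for i in range(n):
            if not adopted[i]:
                frac = sum(adopted[j] for j in peers[i]) / k
                if frac > threshold:
                    nxt[i] = True
        adopted = nxt
    return sum(adopted) / n

frac = simulate_adoption()
print(0.0 <= frac <= 1.0)    # True
```

    A policy experiment then amounts to re-running the simulation while varying the parameters an intervention would change, for example the effective adoption threshold under a cost subsidy.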

  15. Naturally Occurring Wound Healing Agents: An Evidence-Based Review.

    Science.gov (United States)

    Karapanagioti, E G; Assimopoulou, A N

    2016-01-01

    Nature has constituted a pool of medicines for thousands of years. Nowadays, trust in nature is growing steadily, as many effective medicines are naturally derived. Over the last decades, the potential of plants as wound healing agents has been investigated. Wounds and ulcers affect patients' quality of life and often lead to amputations. Approximately 43,000,000 patients suffer from diabetic foot ulcers worldwide. Annually, $25 billion is expended on the treatment of chronic wounds, with the number growing due to the aging population and the increased incidence of diabetes and obesity. Therefore timely, orderly and effective wound management and treatment is crucial. This paper aims to systematically review natural products, mainly plants, with scientifically well documented wound healing activity, focusing on articles based on animal and clinical studies performed worldwide and on approved medicinal products. Moreover, a brief description of the wound healing mechanism is presented, to provide a better understanding. Although a plethora of natural products have been evaluated in vitro and in vivo for wound healing activity, only a few go through clinical trials and even fewer reach the market as approved medicines. Most of them rely on traditional medicine, indicating that ethnopharmacology is a successful strategy for drug development. Since only 6% of plants have been systematically investigated pharmacologically, more intensified efforts and emerging advancements are needed to exploit the potential of nature for the development of novel medicines. This paper aims to provide a reliable database and matrix for thorough further investigation towards the discovery of wound healing agents.

  16. An agent-based model for energy service companies

    International Nuclear Information System (INIS)

    Robinson, Marguerite; Varga, Liz; Allen, Peter

    2015-01-01

    Highlights: • An agent-based model for household energy efficiency upgrades is considered. • Energy service companies provide an alternative to traditional utility providers. • Household self-financing is a limiting factor to widespread efficiency upgrading. • Longer term service contracts can lead to reduced household energy costs. • Future energy price increases enable service providers to retain their customer base. - Abstract: The residential housing sector is a major consumer of energy accounting for approximately one third of carbon emissions in the United Kingdom. Achieving a sustainable, low-carbon infrastructure necessitates a reduced and more efficient use of domestic energy supplies. Energy service companies offer an alternative to traditional providers, which supply a single utility product to satisfy the unconstrained demand of end users, and have been identified as a potentially important actor in sustainable future economies. An agent-based model is developed to examine the potential of energy service companies to contribute to the large scale upgrading of household energy efficiency, which would ultimately lead to a more sustainable and secure energy infrastructure. The migration of households towards energy service companies is described by an attractiveness array, through which potential customers can evaluate the future benefits, in terms of household energy costs, of changing provider. It is shown that self-financing is a limiting factor to the widespread upgrading of residential energy efficiency. Greater reductions in household energy costs could be achieved by committing to longer term contracts, allowing upgrade costs to be distributed over greater time intervals. A steadily increasing cost of future energy usage lends an element of stability to the market, with energy service companies displaying the ability to retain customers on contract expiration. The model highlights how a greater focus on the provision of energy services, as

  17. Proposing a Hybrid Model Based on Robson's Classification for Better Impact on Trends of Cesarean Deliveries.

    Science.gov (United States)

    Hans, Punit; Rohatgi, Renu

    2017-06-01

    To construct a hybrid model classification for cesarean section (CS) deliveries based on woman-characteristics (Robson's classification with additional layers of indications for CS, keeping in view the low-resource settings available in India). This is a cross-sectional study conducted at Nalanda Medical College, Patna. All the women who delivered from January 2016 to May 2016 in the labor ward were included. The results obtained were compared with the values for India obtained from a secondary analysis of the WHO multi-country survey (2010-2011) by Joshua Vogel and colleagues, published in "The Lancet Global Health." The three classifications (indication-based, Robson's and the hybrid model) were applied to categorize the cesarean deliveries from the same sample of data, and a semiqualitative evaluation was done considering the main characteristics, strengths and weaknesses of each classification system. The total number of women who delivered during the study period was 1462, of which 471 were CS deliveries. The overall CS rate calculated for NMCH hospital in this period was 32.21% (p = 0.001). The hybrid model scored 23/23, while the Robson classification and the indication-based classification scored 21/23 and 10/23, respectively. A single study centre and referral bias are the limitations of the study. Given the flexibility of the classifications, we constructed a hybrid model based on the woman-characteristics system with additional layers of other classifications. Indication-based classification answers why, Robson classification answers on whom, while through our hybrid model we learn both why and on whom cesarean deliveries are being performed.

  18. Applying Topographic Classification, Based on the Hydrological Process, to Design Habitat Linkages for Climate Change

    Directory of Open Access Journals (Sweden)

    Yongwon Mo

    2017-11-01

    Full Text Available The use of biodiversity surrogates has been discussed in the context of designing habitat linkages to support the migration of species affected by climate change. Topography has been proposed as a useful surrogate in the coarse-filter approach, as the hydrological processes caused by topography, such as erosion and accumulation, are the basis of ecological processes. However, studies that have designed topographic linkages as habitat linkages have so far focused mainly on the shape of the topography (morphometric topographic classification), with little emphasis on the hydrological processes (generic topographic classification) in finding such linkages. We aimed to understand whether generic classification is valid for designing these linkages. First, we evaluated which topographic classification is more appropriate for describing actual (coniferous and deciduous) and potential (mammals and amphibians) habitat distributions. Second, we analyzed the difference in the linkages between the morphometric and generic topographic classifications. The results showed that the generic classification represented the actual distribution of the trees, but neither the morphometric nor the generic classification could represent the potential animal distributions adequately. Our study demonstrated that the topographic classes according to the generic classification were arranged successively along the flow of water, nutrients, and sediment; therefore, it would be advantageous to secure linkages with a width of 1 km or more. In addition, the edge effect would be smaller than with the morphometric classification. Accordingly, we suggest that topographic characteristics based on the hydrological process are required to design topographic linkages for climate change.

  19. Agent-based Modelling, a new kind of research

    DEFF Research Database (Denmark)

    Held, Fabian P.; Wilkinson, Ian F.; Marks, Robert E.

    2014-01-01

We discuss the use of Agent-based Modelling for the development and testing of theories about emergent social phenomena in marketing and the social sciences in general. We address both theoretical aspects about the types of phenomena that are suitably addressed with this approach and practical guidelines to help plan and structure the development of a theory about the causes of such a phenomenon in conjunction with a matching ABM. We argue that research about complex social phenomena is still largely fundamental research and therefore requires an iterative and cyclical development process of both theory and model. The main goal of this paper is to make research on complex social systems more accessible and to help anticipate and structure the research process.

  20. Climate Shocks and Migration: An Agent-Based Modeling Approach

    Science.gov (United States)

    Entwisle, Barbara; Williams, Nathalie E.; Verdery, Ashton M.; Rindfuss, Ronald R.; Walsh, Stephen J.; Malanson, George P.; Mucha, Peter J.; Frizzelle, Brian G.; McDaniel, Philip M.; Yao, Xiaozheng; Heumann, Benjamin W.; Prasartkul, Pramote; Sawangdee, Yothin; Jampaklay, Aree

    2016-01-01

    This is a study of migration responses to climate shocks. We construct an agent-based model that incorporates dynamic linkages between demographic behaviors, such as migration, marriage, and births, and agriculture and land use, which depend on rainfall patterns. The rules and parameterization of our model are empirically derived from qualitative and quantitative analyses of a well-studied demographic field site, Nang Rong district, Northeast Thailand. With this model, we simulate patterns of migration under four weather regimes in a rice economy: 1) a reference, ‘normal’ scenario; 2) seven years of unusually wet weather; 3) seven years of unusually dry weather; and 4) seven years of extremely variable weather. Results show relatively small impacts on migration. Experiments with the model show that existing high migration rates and strong selection factors, which are unaffected by climate change, are likely responsible for the weak migration response. PMID:27594725

  1. Agent-Based Modeling in Molecular Systems Biology.

    Science.gov (United States)

    Soheilypour, Mohammad; Mofrad, Mohammad R K

    2018-06-08

    Molecular systems orchestrating the biology of the cell typically involve a complex web of interactions among various components and span a vast range of spatial and temporal scales. Computational methods have advanced our understanding of the behavior of molecular systems by enabling us to test assumptions and hypotheses, explore the effect of different parameters on the outcome, and eventually guide experiments. While several different mathematical and computational methods are developed to study molecular systems at different spatiotemporal scales, there is still a need for methods that bridge the gap between spatially-detailed and computationally-efficient approaches. In this review, we summarize the capabilities of agent-based modeling (ABM) as an emerging molecular systems biology technique that provides researchers with a new tool in exploring the dynamics of molecular systems/pathways in health and disease. © 2018 WILEY Periodicals, Inc.

  2. Affordability and Paradigms in Agent-Based Systems

    Directory of Open Access Journals (Sweden)

    Boldur E. Barbat

    2007-07-01

The paper aims at substantiating that in universities with scarce resources, applied Information Technology (IT) research is affordable, even in its most advanced and dynamic sub-domains. This target is split into four specific objectives: a) to set up a framework for IT research affordability in universities representative of current East-European circumstances; b) to outline a workable approach based on synergistic leverage and to assess the paradigms prevalent in modern artificial intelligence through this "affordability filter"; c) to describe the evolution and the current stages of two undertakings exploiting paradigms founded on emergence (the sub-domains are stigmergic coordination and agent self-awareness); d) to summarise, for both sub-domains, the mechanisms and the architectonics (the focus is on computer science aspects; implementation details will be given in future papers). The results in both directions appear promising and reveal significant potential for transdisciplinarity. From this perspective, the paper is a call to improved cooperation.

  3. System-Awareness for Agent-based Power System Control

    DEFF Research Database (Denmark)

    Heussen, Kai; Saleem, Arshad; Lind, Morten

    2010-01-01

Operational intelligence in electric power systems is focused in a small number of control rooms that coordinate their actions. A clear division of responsibility and a command hierarchy organize system operation. With multi-agent based control systems, this control paradigm may be shifted to a more decentralized, open-access, collaborative control paradigm. This shift cannot happen at once, but must also fit with current operation principles. In order to establish a scalable and transparent system control architecture, organizing principles have to be identified that allow for a smooth transition. This paper presents a concept for the representation and organization of control and resource allocation, enabling computational reasoning and system awareness. The principles are discussed with respect to a recently proposed Subgrid operation concept.

  4. Agent Based Control of Electric Power Systems with Distributed Generation

    DEFF Research Database (Denmark)

    Saleem, Arshad

The situation in Denmark is even more interesting: with a current 20% penetration of wind energy, it is moving towards an ambitious goal of 50% penetration by the year 2050. Realization of these concepts requires that power systems be of a distributed nature, consisting of autonomous components and subsystems that are able to coordinate, communicate, cooperate, adapt to emerging situations, and self-organize in an intelligent way. At the same time, rapid developments in information and communication technologies (ICT) have brought new opportunities and elucidations. New technologies and standards … control strategies. The results are discussed through case studies of multi-agent based distributed control scenarios in electric power systems. The main contribution of this work is a proposed system design methodology for the application of intelligent agent technology in power systems.

  5. An Agent Based approach to design Serious Game

    Directory of Open Access Journals (Sweden)

    Manuel Gentile

    2014-06-01

Serious games are designed to train and educate learners, opening up new learning approaches such as exploratory learning and situated cognition. Despite growing interest in these games, their design is still an artisan process. On the basis of experience in designing computer simulations, this paper proposes an agent-based approach to guide the design process of a serious game. The proposed methodology allows the designer to strike the right equilibrium between educational effectiveness and entertainment, realism and complexity. The design of the PNPVillage game is used as a case study. The PNPVillage game aims to introduce and foster an entrepreneurial mindset among young students. It was implemented within the framework of the European project “I can… I cannot… I go!” Rev.2

  6. Distributed Research Project Scheduling Based on Multi-Agent Methods

    Directory of Open Access Journals (Sweden)

    Constanta Nicoleta Bodea

    2011-01-01

Different project planning and scheduling approaches have been developed. Operational Research (OR) provides two major planning techniques: CPM (Critical Path Method) and PERT (Program Evaluation and Review Technique). Due to project complexity and the difficulty of using classical methods, new approaches were developed. Artificial Intelligence (AI) initially promoted the automatic planner concept, but model-based planning and scheduling methods emerged later on. The paper addresses the project scheduling optimization problem when projects are seen as Complex Adaptive Systems (CAS). Taking into consideration two different approaches for project scheduling optimization, TCPSP (Time-Constrained Project Scheduling) and RCPSP (Resource-Constrained Project Scheduling), the paper focuses on a multi-agent implementation in MATLAB for TCPSP. Using a research project as a case study, the paper includes a comparison between two multi-agent methods: the Genetic Algorithm (GA) and the Ant Colony Algorithm (ACO).

  7. Formalizing Knowledge in Multi-Scale Agent-Based Simulations.

    Science.gov (United States)

    Somogyi, Endre; Sluka, James P; Glazier, James A

    2016-10-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused.

  8. Agent-Based Computational Modeling of Cell Culture ...

    Science.gov (United States)

    Quantitative characterization of cellular dose in vitro is needed for alignment of doses in vitro and in vivo. We used the agent-based software, CompuCell3D (CC3D), to provide a stochastic description of cell growth in culture. The model was configured so that isolated cells assumed a “fried egg shape” but became increasingly cuboidal with increasing confluency. The surface area presented by each cell to the overlying medium varies from cell-to-cell and is a determinant of diffusional flux of toxicant from the medium into the cell. Thus, dose varies among cells for a given concentration of toxicant in the medium. Computer code describing diffusion of H2O2 from medium into each cell and clearance of H2O2 was calibrated against H2O2 time-course data (25, 50, or 75 uM H2O2 for 60 min) obtained with the Amplex Red assay for the medium and the H2O2-sensitive fluorescent reporter, HyPer, for cytosol. Cellular H2O2 concentrations peaked at about 5 min and were near baseline by 10 min. The model predicted a skewed distribution of surface areas, with between cell variation usually 2 fold or less. Predicted variability in cellular dose was in rough agreement with the variation in the HyPer data. These results are preliminary, as the model was not calibrated to the morphology of a specific cell type. Future work will involve morphology model calibration against human bronchial epithelial (BEAS-2B) cells. Our results show, however, the potential of agent-based modeling

  9. Hydrologic-Process-Based Soil Texture Classifications for Improved Visualization of Landscape Function

    Science.gov (United States)

    Groenendyk, Derek G.; Ferré, Ty P.A.; Thorp, Kelly R.; Rice, Amy K.

    2015-01-01

    Soils lie at the interface between the atmosphere and the subsurface and are a key component that control ecosystem services, food production, and many other processes at the Earth’s surface. There is a long-established convention for identifying and mapping soils by texture. These readily available, georeferenced soil maps and databases are used widely in environmental sciences. Here, we show that these traditional soil classifications can be inappropriate, contributing to bias and uncertainty in applications from slope stability to water resource management. We suggest a new approach to soil classification, with a detailed example from the science of hydrology. Hydrologic simulations based on common meteorological conditions were performed using HYDRUS-1D, spanning textures identified by the United States Department of Agriculture soil texture triangle. We consider these common conditions to be: drainage from saturation, infiltration onto a drained soil, and combined infiltration and drainage events. Using a k-means clustering algorithm, we created soil classifications based on the modeled hydrologic responses of these soils. The hydrologic-process-based classifications were compared to those based on soil texture and a single hydraulic property, Ks. Differences in classifications based on hydrologic response versus soil texture demonstrate that traditional soil texture classification is a poor predictor of hydrologic response. We then developed a QGIS plugin to construct soil maps combining a classification with georeferenced soil data from the Natural Resource Conservation Service. The spatial patterns of hydrologic response were more immediately informative, much simpler, and less ambiguous, for use in applications ranging from trafficability to irrigation management to flood control. The ease with which hydrologic-process-based classifications can be made, along with the improved quantitative predictions of soil responses and visualization of landscape
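The clustering step described above can be sketched as follows: a minimal, illustrative example (not the paper's HYDRUS-1D workflow) in which synthetic drainage curves stand in for the modeled hydrologic responses and are grouped with k-means. All curve parameters are invented for the demonstration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 48, 25)                  # hours after saturation

# Hypothetical drainage responses: water content decays toward a residual
# value with a time constant tau; fast-draining and slow-draining "soils".
taus = np.concatenate([rng.uniform(2, 4, 20),     # fast drainage
                       rng.uniform(30, 50, 20)])  # slow drainage
curves = np.array([0.15 + 0.30 * np.exp(-t / tau) for tau in taus])

# Classify soils by the shape of their simulated hydrologic response,
# rather than by texture.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(curves)
labels = km.labels_
```

Because the classes are defined by response curves rather than particle-size fractions, two soils of different texture can land in the same hydrologic class if they drain alike.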

  10. Comparison of hand-craft feature based SVM and CNN based deep learning framework for automatic polyp classification.

    Science.gov (United States)

    Younghak Shin; Balasingham, Ilangko

    2017-07-01

Colonoscopy is a standard method for screening polyps by highly trained physicians. Polyps missed during colonoscopy are a potential risk factor for colorectal cancer. In this study, we investigate an automatic polyp classification framework. We aim to compare two different approaches: a hand-crafted feature method and a convolutional neural network (CNN) based deep learning method. Combined shape and color features are used for hand-crafted feature extraction, and a support vector machine (SVM) is adopted for classification. For the CNN approach, a deep learning framework with three convolution and pooling layers is used for classification. The proposed framework is evaluated using three public polyp databases. The experimental results show that the CNN-based deep learning framework achieves better classification performance than the hand-crafted feature based method, with over 90% classification accuracy, sensitivity, specificity, and precision.
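The hand-crafted-feature branch of such a comparison can be sketched with scikit-learn. The patch statistics below are invented stand-ins for the paper's combined shape and color features, and the synthetic "images" exist only to make the example runnable.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def features(img):
    # Toy hand-crafted descriptors: intensity mean, spread, and a crude
    # edge-energy statistic (stand-ins for real shape/color features).
    return np.array([img.mean(), img.std(),
                     np.abs(np.diff(img, axis=0)).mean()])

# Synthetic bright "polyp" patches vs. darker "normal" patches.
polyps  = [rng.normal(0.7, 0.05, (16, 16)) for _ in range(60)]
normals = [rng.normal(0.3, 0.05, (16, 16)) for _ in range(60)]
X = np.array([features(p) for p in polyps + normals])
y = np.array([1] * 60 + [0] * 60)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

A CNN pipeline would instead learn its features directly from pixels, which is the source of the accuracy gap the paper reports.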

  11. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    Science.gov (United States)

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  12. In Defense of Agent-Based Virtue Ethics | Van Zyl | Philosophical ...

    African Journals Online (AJOL)

    In 'Against agent-based virtue ethics' (2004) Michael Brady rejects agent-based virtue ethics on the grounds that it fails to capture the commonsense distinction between an agent's doing the right thing, and her doing it for the right reason. In his view, the failure to account for this distinction has paradoxical results, making it ...

  13. Cell-based therapy technology classifications and translational challenges

    Science.gov (United States)

    Mount, Natalie M.; Ward, Stephen J.; Kefalas, Panos; Hyllner, Johan

    2015-01-01

    Cell therapies offer the promise of treating and altering the course of diseases which cannot be addressed adequately by existing pharmaceuticals. Cell therapies are a diverse group across cell types and therapeutic indications and have been an active area of research for many years but are now strongly emerging through translation and towards successful commercial development and patient access. In this article, we present a description of a classification of cell therapies on the basis of their underlying technologies rather than the more commonly used classification by cell type because the regulatory path and manufacturing solutions are often similar within a technology area due to the nature of the methods used. We analyse the progress of new cell therapies towards clinical translation, examine how they are addressing the clinical, regulatory, manufacturing and reimbursement requirements, describe some of the remaining challenges and provide perspectives on how the field may progress for the future. PMID:26416686

  14. Support Vector Machine Based Tool for Plant Species Taxonomic Classification

    OpenAIRE

    Manimekalai .K; Vijaya.MS

    2014-01-01

    Plant species are living things and are generally categorized in terms of Domain, Kingdom, Phylum, Class, Order, Family, Genus and name of Species in a hierarchical fashion. This paper formulates the taxonomic leaf categorization problem as the hierarchical classification task and provides a suitable solution using a supervised learning technique namely support vector machine. Features are extracted from scanned images of plant leaves and trained using SVM. Only class, order, family of plants...

  15. Image Analysis and Classification Based on Soil Strength

    Science.gov (United States)

    2016-08-01

Impact Hammer, which is light, easy to operate, and cost-effective. The Clegg Impact Hammer measures stiffness of the soil surface by dropping a... effect on out-of-scene classifications. More statistical analysis should, however, be done to compare the measured field spectra, the WV2 training...

  16. Three-Class Mammogram Classification Based on Descriptive CNN Features

    Directory of Open Access Journals (Sweden)

    M. Mohsin Jadoon

    2017-01-01

In this paper, a novel classification technique for a large data set of mammograms using a deep learning method is proposed. The proposed model targets a three-class classification study (normal, malignant, and benign cases). We present two methods, namely convolutional neural network-discrete wavelet (CNN-DW) and convolutional neural network-curvelet transform (CNN-CT). An augmented data set is generated by using mammogram patches. To enhance the contrast of mammogram images, the data set is filtered by contrast-limited adaptive histogram equalization (CLAHE). In the CNN-DW method, enhanced mammogram images are decomposed into four subbands by means of the two-dimensional discrete wavelet transform (2D-DWT), while in the second method the discrete curvelet transform (DCT) is used. In both methods, dense scale-invariant feature (DSIFT) descriptors are extracted for all subbands. An input data matrix containing these subband features of all the mammogram patches is created and processed as input to a convolutional neural network (CNN). A softmax layer and a support vector machine (SVM) layer are used to train the CNN for classification. The proposed methods have been compared with existing methods in terms of accuracy rate, error rate, and various validation assessment measures. CNN-DW and CNN-CT achieved accuracy rates of 81.83% and 83.74%, respectively. Simulation results clearly validate the significance and impact of the proposed model compared to other well-known existing techniques.
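The 2D-DWT decomposition into four subbands, as used in the CNN-DW method, can be illustrated with a single-level Haar transform written directly in NumPy. This is a simplification: real pipelines use a wavelet library and richer wavelet families.

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar wavelet transform, returning the four
    subbands (LL, LH, HL, HH). Illustrative only."""
    a = img.astype(float)
    # Average/difference along rows (columns paired two at a time).
    lo = (a[:, 0::2] + a[:, 1::2]) / 2
    hi = (a[:, 0::2] - a[:, 1::2]) / 2
    # Then along columns, yielding approximation + three detail subbands.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2    # approximation
    lh = (lo[0::2, :] - lo[1::2, :]) / 2    # horizontal-edge detail
    hl = (hi[0::2, :] + hi[1::2, :]) / 2    # vertical-edge detail
    hh = (hi[0::2, :] - hi[1::2, :]) / 2    # diagonal detail
    return ll, lh, hl, hh

patch = np.arange(64, dtype=float).reshape(8, 8)  # stand-in mammogram patch
ll, lh, hl, hh = haar_dwt2(patch)
```

Each subband is a quarter-size image; in the paper's pipeline, DSIFT features are then extracted from each of the four subbands.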

  17. Interrater reliability of a Pilates movement-based classification system.

    Science.gov (United States)

    Yu, Kwan Kenny; Tulloch, Evelyn; Hendrick, Paul

    2015-01-01

To determine the interrater reliability for identification of a specific movement pattern using a Pilates classification system, videos of 5 subjects performing specific movement tasks were sent to raters trained in the DMA-CP classification system. Ninety-six raters completed the survey. Interrater reliability for the detection of a directional bias was excellent (Pi = 0.92, K(free) = 0.89). Interrater reliability for classifying an individual into a specific subgroup was moderate (Pi = 0.64, K(free) = 0.55); however, raters who had completed levels 1-4 of the DMA-CP training and reported using the assessment daily demonstrated excellent reliability (Pi = 0.89, K(free) = 0.87). The classification system demonstrated almost perfect agreement in determining the existence of a specific movement pattern and classifying into a subgroup for experienced raters. There was a trend toward greater reliability with increased levels of training and experience of the raters. Copyright © 2014 Elsevier Ltd. All rights reserved.
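The K(free) statistic reported here is Randolph's free-marginal multirater kappa, which can be computed directly from the per-subject rating counts; the toy ratings below are invented for illustration.

```python
def free_marginal_kappa(ratings, k):
    """Randolph's free-marginal multirater kappa. `ratings` is a list of
    per-subject lists of category labels (one label per rater); `k` is
    the number of available categories, so chance agreement is 1/k."""
    po_terms = []
    for subj in ratings:
        n = len(subj)
        # Number of agreeing rater pairs for this subject.
        agree = sum(subj.count(c) * (subj.count(c) - 1) for c in set(subj))
        po_terms.append(agree / (n * (n - 1)))
    po = sum(po_terms) / len(po_terms)   # observed agreement
    pe = 1.0 / k                         # free-marginal chance agreement
    return (po - pe) / (1 - pe)

# Toy data: 4 raters classify 3 subjects into one of 2 directional biases.
ratings = [["flexion"] * 4,
           ["extension"] * 4,
           ["flexion", "flexion", "flexion", "extension"]]
kappa = free_marginal_kappa(ratings, k=2)
```

Unlike Fleiss' kappa, the free-marginal variant assumes raters are not constrained to match fixed category frequencies, which suits classification tasks like this one.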

  18. A Computational Agent-Based Modeling Approach for Competitive Wireless Service Market

    KAUST Repository

    Douglas, C C; Hyoseop Lee,; Wonsuck Lee,

    2011-01-01

    Using an agent-based modeling method, we study market dynamism with regard to wireless cellular services that are in competition for a greater market share and profit. In the proposed model, service providers and consumers are described as agents

  19. SB certification handout material requirements, test methods, responsibilities, and minimum classification levels for mixture-based specification for flexible base.

    Science.gov (United States)

    2012-10-01

A handout with tables representing the material requirements, test methods, responsibilities, and minimum classification levels for the mixture-based specification for flexible base, and details on aggregate and test methods employed, along with agency and co...

  20. AN AGENT BASED TRANSACTION PROCESSING SCHEME FOR DISCONNECTED MOBILE NODES

    Directory of Open Access Journals (Sweden)

    J.L. Walter Jeyakumar

    2010-12-01

We present a mobile transaction framework in which mobile users can share data stored in the cache of a mobile agent. This mobile agent is a special mobile node which coordinates the sharing process. The proposed framework allows mobile affiliation work groups to be formed dynamically with a mobile agent and mobile hosts. Using short-range wireless communication technology, mobile users can simultaneously access the data from the cache of the mobile agent. The Data Access Manager module at the mobile agent enforces concurrency control using a cache invalidation technique. This model supports disconnected mobile computing, allowing the mobile agent to move along with the mobile hosts. The proposed transaction framework has been simulated in Java 2 and its performance is compared with existing frameworks.

  1. Shape Effects in Nanoparticle-Based Imaging Agents

    Science.gov (United States)

    Culver, Kayla Shani Brook

    At the nanoscale, material properties become highly size and shape dependent. These properties can be manipulated and exploited for a variety of biomedical applications, including sensing, drug delivery, diagnostics, and imaging. In particular, nanoparticles of different materials, sizes and shapes have been developed as high-performance contrast agents for optical, electron, and medical imaging. In this thesis, I focus on gold nanoparticles because they are widely used as contrast agents in multiple types of imaging modalities. Additionally, the surface of gold can be readily functionalized with ligands and the structure of the particles can be manipulated to modulate their performance as imaging agents. The properties of nanoparticles can generate contrast directly. For example, the light scattering properties of gold particles can be visualized in optical microscopy, the high electron density of gold produces contrast in electron microscopy, and the x-ray absorption properties of gold can be detected in medical x-ray and computed tomography imaging. Alternatively, the properties of the nanomaterial can be exploited to modulate the signal produced by other molecules that are bound to the particle surface. The light emission of molecular fluorophores can be quenched or dramatically increased by coupling to the optical field enhancements of gold nanoparticles, and the performance of gadolinium (Gd(III))-based magnetic resonance imaging (MRI) contrast agents can be increased by coupling to the rotational motion of nanoparticles. In this dissertation, I focus specifically on how the structure of star-shaped gold particles (nanostars) can be exploited as single-particle optical probes and to dramatically enhance the relaxivity of Gd(III) bound to the surface. Differential interference contrast (DIC) is a type of wide-field diffraction-limited optical microscopy that is commonly used by biologists to image cells without labels. 
Here, I demonstrate that DIC can be used

  2. Smart Agent Based Mobile Tutoring and Querying System

    Directory of Open Access Journals (Sweden)

    Suresh Sankaranarayanan

    2012-08-01

With our busy schedules today and the rising cost of education, there is a need to find a convenient and cost-effective means of maximizing our educational and training experiences. New trends in the delivery of and access to information are becoming more technology based in all areas of society, with education being no exception. The ubiquitous use of mobile devices has led to a boom in m-commerce. Mobile devices provide many services in commercial environments, such as mobile banking, mobile purchasing, and mobile learning. It is therefore fitting that we seek to use mobile devices as a platform for delivering a convenient and cost-effective solution. The proposed agent-based mobile tutoring system seeks to provide students with a rich learning experience, supplying them with reading material relevant to their stage of development and allowing them to move at their own pace. The system allows the user to ask questions and get explanations as if they were interacting with a human tutor, with the added benefit of being able to do this anytime, in any location, via their mobile phone.

  3. Support Vector Machine and Parametric Wavelet-Based Texture Classification of Stem Cell Images

    National Research Council Canada - National Science Library

    Jeffreys, Christopher

    2004-01-01

    .... Since colony texture is a major discriminating feature in determining quality, we introduce a non-invasive, semi-automated texture-based stem cell colony classification methodology to aid researchers...

  4. Single-labelled music genre classification using content-based features

    CSIR Research Space (South Africa)

    Ajoodha, R

    2015-11-01

In this paper we use content-based features to perform automatic classification of music pieces into genres. We categorise these features into four groups: features extracted from the Fourier transform’s magnitude spectrum, features designed...
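Features from the Fourier transform's magnitude spectrum, the first of the four groups mentioned, can be sketched as follows. Spectral centroid and rolloff are standard descriptors of this kind, though the exact feature set used in the paper may differ.

```python
import numpy as np

def spectral_features(signal, sr):
    """Two magnitude-spectrum descriptors: spectral centroid (the
    'centre of mass' of the spectrum) and 85% rolloff frequency."""
    mag = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    centroid = (freqs * mag).sum() / mag.sum()
    cum = np.cumsum(mag)
    rolloff = freqs[np.searchsorted(cum, 0.85 * cum[-1])]
    return centroid, rolloff

sr = 8000
t = np.arange(sr) / sr
low  = np.sin(2 * np.pi * 220 * t)     # bass-heavy "track"
high = np.sin(2 * np.pi * 2200 * t)    # treble-heavy "track"
c_low, _ = spectral_features(low, sr)
c_high, _ = spectral_features(high, sr)
```

For a pure tone the centroid sits at the tone's frequency, so such features separate bass-heavy from treble-heavy material, one axis along which genres differ.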

  5. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science, and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI’s ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system, and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  6. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science, and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system, and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  7. Agent-Based Health Monitoring System, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose combination of software intelligent agents to achieve decentralized reasoning, with fault detection and diagnosis using PCA, neural nets, and maximum...

  8. Desert plains classification based on Geomorphometrical parameters (Case study: Aghda, Yazd)

    Science.gov (United States)

    Tazeh, mahdi; Kalantari, Saeideh

    2013-04-01

This research focuses on plains. Several methods and classifications have been presented for plains. One natural-resource-based classification, mostly used in Iran, divides plains into three types: erosional pediment, denudational pediment, and aggradational piedmont. Qualitative and quantitative factors are used to differentiate them from each other. In this study, geomorphometrical parameters effective in differentiating landforms were applied to plains. Geomorphometrical parameters are calculable and can be extracted using mathematical equations and the corresponding relations on a digital elevation model. The geomorphometrical parameters used in this study included percent of slope, plan curvature, profile curvature, minimum curvature, maximum curvature, cross-sectional curvature, longitudinal curvature, and Gaussian curvature. The results indicated that the most important geomorphometrical parameters for plain and desert classification include percent of slope, minimum curvature, profile curvature, and longitudinal curvature. Key Words: Plain, Geomorphometry, Classification, Biophysical, Yazd Khezarabad.
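Geomorphometric parameters such as percent slope are extracted from a digital elevation model by finite differences over the elevation grid. A minimal sketch (the 30 m cell size is an assumed value, not from the study):

```python
import numpy as np

def slope_percent(dem, cell=30.0):
    """Percent slope from a DEM grid via central differences.
    `cell` is the grid spacing in metres (assumed here)."""
    dz_dy, dz_dx = np.gradient(dem, cell)          # elevation change per metre
    return 100.0 * np.sqrt(dz_dx ** 2 + dz_dy ** 2)

# Tilted plane rising 1 m per cell in x: slope = 1/30 per metre ≈ 3.33 %.
dem = np.tile(np.arange(10, dtype=float), (10, 1))
s = slope_percent(dem)
```

The curvature parameters listed in the abstract are obtained analogously from the second derivatives of the fitted elevation surface.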

  9. [Classification of cell-based medicinal products and legal implications: An overview and an update].

    Science.gov (United States)

    Scherer, Jürgen; Flory, Egbert

    2015-11-01

    In general, cell-based medicinal products do not represent a uniform class of medicinal products, but instead comprise medicinal products with diverse regulatory classification as advanced-therapy medicinal products (ATMP), medicinal products (MP), tissue preparations, or blood products. Due to the legal and scientific consequences of the development and approval of MPs, classification should be clarified as early as possible. This paper describes the legal situation in Germany and highlights specific criteria and concepts for classification, with a focus on, but not limited to, ATMPs and non-ATMPs. Depending on the stage of product development and the specific application submitted to a competent authority, legally binding classification is done by the German Länder Authorities, Paul-Ehrlich-Institut, or European Medicines Agency. On request by the applicants, the Committee for Advanced Therapies may issue scientific recommendations for classification.

  10. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors

    Science.gov (United States)

    Cenek, Martin; Dahl, Spencer K.

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.
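    The framework's recording of "the probability of agents changing their behavior from one pattern of behavior to another" amounts to estimating an empirical transition matrix over behaviour labels. The following is a minimal sketch of that idea only; the behaviour names and the observed sequence are invented, not taken from the paper.

```python
from collections import Counter

def transition_probabilities(behavior_sequence):
    """Empirical probability of moving from one behaviour pattern to the
    next, estimated from an observed sequence of per-step pattern labels."""
    pair_counts = Counter(zip(behavior_sequence, behavior_sequence[1:]))
    from_counts = Counter(behavior_sequence[:-1])
    return {(a, b): n / from_counts[a] for (a, b), n in pair_counts.items()}

# One agent observed over eight steps: it mostly forages, occasionally flees.
seq = ["forage", "forage", "flee", "forage", "forage", "flee", "forage", "forage"]
P = transition_probabilities(seq)
```

In a full analysis, these per-agent transition estimates could then feed the network-based comparison of behaviours across an ABM's executions.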

  11. Polsar Land Cover Classification Based on Hidden Polarimetric Features in Rotation Domain and Svm Classifier

    Science.gov (United States)

    Tao, C.-S.; Chen, S.-W.; Li, Y.-Z.; Xiao, S.-P.

    2017-09-01

    Land cover classification is an important application for polarimetric synthetic aperture radar (PolSAR) data utilization. Roll-invariant polarimetric features such as H / Ani / α / Span are commonly adopted in PolSAR land cover classification. However, the target orientation diversity effect makes PolSAR image understanding and interpretation difficult. Using only the roll-invariant polarimetric features may introduce ambiguity in the interpretation of targets' scattering mechanisms and limit the subsequent classification accuracy. To address this problem, this work firstly focuses on hidden polarimetric feature mining in the rotation domain along the radar line of sight, using the recently reported uniform polarimetric matrix rotation theory and the visualization and characterization tool of the polarimetric coherence pattern. The former rotates the acquired polarimetric matrix along the radar line of sight and fully describes the rotation characteristics of each entry of the matrix. Sets of new polarimetric features are derived to describe the hidden scattering information of the target in the rotation domain. The latter extends the traditional polarimetric coherence at a given rotation angle to the rotation domain for complete interpretation. A visualization and characterization tool is established to derive new polarimetric features for hidden information exploration. Then, a classification scheme is developed combining both the selected new hidden polarimetric features in the rotation domain and the commonly used roll-invariant polarimetric features with a support vector machine (SVM) classifier. Comparison experiments based on AIRSAR and multi-temporal UAVSAR data demonstrate that, compared with the conventional classification scheme which only uses the roll-invariant polarimetric features, the proposed classification scheme achieves both higher classification accuracy and better robustness. For AIRSAR data, the overall classification

  12. POLSAR LAND COVER CLASSIFICATION BASED ON HIDDEN POLARIMETRIC FEATURES IN ROTATION DOMAIN AND SVM CLASSIFIER

    Directory of Open Access Journals (Sweden)

    C.-S. Tao

    2017-09-01

    Full Text Available Land cover classification is an important application for polarimetric synthetic aperture radar (PolSAR) data utilization. Roll-invariant polarimetric features such as H / Ani / α / Span are commonly adopted in PolSAR land cover classification. However, the target orientation diversity effect makes PolSAR image understanding and interpretation difficult. Using only the roll-invariant polarimetric features may introduce ambiguity in the interpretation of targets' scattering mechanisms and limit the subsequent classification accuracy. To address this problem, this work firstly focuses on hidden polarimetric feature mining in the rotation domain along the radar line of sight, using the recently reported uniform polarimetric matrix rotation theory and the visualization and characterization tool of the polarimetric coherence pattern. The former rotates the acquired polarimetric matrix along the radar line of sight and fully describes the rotation characteristics of each entry of the matrix. Sets of new polarimetric features are derived to describe the hidden scattering information of the target in the rotation domain. The latter extends the traditional polarimetric coherence at a given rotation angle to the rotation domain for complete interpretation. A visualization and characterization tool is established to derive new polarimetric features for hidden information exploration. Then, a classification scheme is developed combining both the selected new hidden polarimetric features in the rotation domain and the commonly used roll-invariant polarimetric features with a support vector machine (SVM) classifier. Comparison experiments based on AIRSAR and multi-temporal UAVSAR data demonstrate that, compared with the conventional classification scheme which only uses the roll-invariant polarimetric features, the proposed classification scheme achieves both higher classification accuracy and better robustness. For AIRSAR data, the overall classification accuracy

  13. Serious games experiment toward agent-based simulation

    Science.gov (United States)

    Wein, Anne; Labiosa, William

    2013-01-01

    We evaluate the potential for serious games to be used as a scientifically based decision-support product that supports the United States Geological Survey’s (USGS) mission--to provide integrated, unbiased scientific information that can make a substantial contribution to societal well-being for a wide variety of complex environmental challenges. Serious or pedagogical games are an engaging way to educate decisionmakers and stakeholders about environmental challenges that are usefully informed by natural and social scientific information and knowledge and can be designed to promote interactive learning and exploration in the face of large uncertainties, divergent values, and complex situations. We developed two serious games that use challenging environmental-planning issues to demonstrate and investigate the potential contributions of serious games to inform regional-planning decisions. Delta Skelta is a game emulating long-term integrated environmental planning in the Sacramento-San Joaquin Delta, California, that incorporates natural hazards (flooding and earthquakes) and consequences for California water supplies amidst conflicting water interests. Age of Ecology is a game that simulates interactions between economic and ecologic processes, as well as natural hazards while implementing agent-based modeling. The content of these games spans the USGS science mission areas related to water, ecosystems, natural hazards, land use, and climate change. We describe the games, reflect on design and informational aspects, and comment on their potential usefulness. During the process of developing these games, we identified various design trade-offs involving factual information, strategic thinking, game-winning criteria, elements of fun, number and type of players, time horizon, and uncertainty. We evaluate the two games in terms of accomplishments and limitations. Overall, we demonstrated the potential for these games to usefully represent scientific information

  14. A classification model of Hyperion image base on SAM combined decision tree

    Science.gov (United States)

    Wang, Zhenghai; Hu, Guangdao; Zhou, YongZhang; Liu, Xin

    2009-10-01

    Monitoring the Earth using imaging spectrometers has necessitated more accurate analyses and new applications of remote sensing. A very high dimensional input space requires an exponentially large amount of data to adequately and reliably represent the classes in that space. On the other hand, as the input dimensionality increases the hypothesis space grows exponentially, which makes the classification performance highly unreliable. Classification of hyperspectral images is therefore challenging, and new algorithms have to be developed for hyperspectral data classification. The Spectral Angle Mapper (SAM) is a physically based spectral classification that uses an n-dimensional angle to match pixels to reference spectra. The algorithm determines the spectral similarity between two spectra by calculating the angle between them, treating them as vectors in a space with dimensionality equal to the number of bands. The key difficulty is that the SAM threshold must be defined manually, and the classification precision depends on how rational this threshold is. In order to resolve this problem, this paper proposes a new automatic classification model for remote sensing images using SAM combined with a decision tree. It can automatically choose an appropriate SAM threshold and improve the classification precision of SAM based on the analysis of field spectra. The test area, located in Heqing, Yunnan, was imaged by the EO-1 Hyperion imaging spectrometer using 224 bands in the visible and near infrared. The area included limestone areas, rock fields, soil and forests, and was classified into four different vegetation and soil types. The results show that this method chooses an appropriate SAM threshold and effectively eliminates the disturbance and influence of unwanted objects, thereby improving the classification precision. Compared with the likelihood classification by field survey data, the classification precision of this model
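    The core SAM computation described above, the angle between a pixel spectrum and a reference spectrum with a threshold deciding acceptance, can be sketched as follows. This is a generic illustration of the standard SAM rule, not the paper's automatic threshold-selection model; the toy spectra and threshold value are invented.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a reference
    spectrum, treating both as vectors in band space."""
    cos_theta = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def sam_classify(pixel, references, threshold):
    """Assign the pixel to the reference class with the smallest angle,
    or -1 (unclassified) if no angle falls below the threshold."""
    angles = [spectral_angle(pixel, r) for r in references]
    best = int(np.argmin(angles))
    return best if angles[best] <= threshold else -1

# Two toy 3-band reference spectra and a pixel close to the first one.
refs = [np.array([1.0, 2.0, 3.0]), np.array([3.0, 2.0, 1.0])]
pixel = np.array([1.1, 2.0, 2.9])
label = sam_classify(pixel, refs, threshold=0.1)
```

The paper's contribution is choosing `threshold` automatically per class from field spectra via a decision tree, rather than fixing it by hand as done here.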

  15. Agent-Based Modeling of Consumer Decision making Process Based on Power Distance and Personality

    NARCIS (Netherlands)

    Roozmand, O.; Ghasem-Aghaee, N.; Hofstede, G.J.; Nematbakhsh, M.A.; Baraani, A.; Verwaart, T.

    2011-01-01

    Simulating consumer decision making processes involves different disciplines such as: sociology, social psychology, marketing, and computer science. In this paper, we propose an agent-based conceptual and computational model of consumer decision-making based on culture, personality and human needs.

  16. Agent-based mapping of credit risk for sustainable microfinance.

    Directory of Open Access Journals (Sweden)

    Joung-Hun Lee

    Full Text Available By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk--a major obstacle to the sustainability of lenders outreaching to the poor. Specifically, using the elements of network theory, we constructed an agent-based model that obeys the stylized rules of microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital.

  17. Agent-based mapping of credit risk for sustainable microfinance.

    Science.gov (United States)

    Lee, Joung-Hun; Jusup, Marko; Podobnik, Boris; Iwasa, Yoh

    2015-01-01

    By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk--a major obstacle to the sustainability of lenders outreaching to the poor. Specifically, using the elements of network theory, we constructed an agent-based model that obeys the stylized rules of microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital.

  18. A learning-based agent for home neurorehabilitation.

    Science.gov (United States)

    Lydakis, Andreas; Meng, Yuanliang; Munroe, Christopher; Wu, Yi-Ning; Begum, Momotaz

    2017-07-01

    This paper presents the iterative development of an artificially intelligent system to promote home-based neurorehabilitation. Although proper, structured practice of rehabilitation exercises at home is the key to successful recovery of motor functions, there is no home program available that can monitor a patient's exercise-related activities and provide corrective feedback in real time. To this end, we designed a Learning from Demonstration (LfD) based home-rehabilitation framework that combines advanced robot learning algorithms with commercially available wearable technologies. The proposed system uses exercise-related motion information and electromyography (EMG) signals of a patient to train a Markov Decision Process (MDP). The trained MDP model can enable an agent to serve as a coach for a patient. On a system level, this is, to the best of our knowledge, the first initiative to employ LfD in a health-care application to enable lay users to program an intelligent system. From a rehabilitation research perspective, this is a completely novel initiative to employ machine learning to provide interactive corrective feedback to a patient in home settings.
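    To make the MDP idea concrete, a coaching policy over a small finite MDP can be obtained by value iteration. This toy sketch is not the paper's model (which is trained from motion and EMG demonstrations); the two states, two actions, transition probabilities and rewards below are entirely invented for illustration.

```python
import numpy as np

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """Optimal state values and greedy policy for a small finite MDP.
    P[a, s, s2] is the transition probability, R[a, s] the immediate reward."""
    V = np.zeros(P.shape[1])
    while True:
        Q = R + gamma * (P @ V)      # action-values, shape (n_actions, n_states)
        V_new = Q.max(axis=0)
        if np.abs(V_new - V).max() < tol:
            return V_new, Q.argmax(axis=0)
        V = V_new

# Toy coaching MDP: states 0 = "wrong posture", 1 = "correct posture";
# actions 0 = "no feedback", 1 = "give corrective cue" (cueing costs 0.2).
P = np.array([
    [[0.9, 0.1],    # no feedback: wrong posture tends to persist
     [0.2, 0.8]],
    [[0.2, 0.8],    # cue: wrong posture is usually corrected
     [0.1, 0.9]],
])
R = np.array([
    [0.0, 1.0],     # reward for being in the correct posture
    [-0.2, 0.8],    # same reward minus the cueing cost
])
V, policy = value_iteration(P, R)
```

Under these invented numbers the optimal policy cues only when the posture is wrong, which is the qualitative behaviour a coaching agent should learn.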

  19. Agent-based simulation for human-induced hazard analysis.

    Science.gov (United States)

    Bulleit, William M; Drewek, Matthew W

    2011-02-01

    Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, the need for predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that used agent-based modeling (ABM) to analyze terrorist attacks. The basic approach in this article of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes seem to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.

  20. Performance Evaluation of Frequency Transform Based Block Classification of Compound Image Segmentation Techniques

    Science.gov (United States)

    Selwyn, Ebenezer Juliet; Florinabel, D. Jemi

    2018-04-01

    Compound image segmentation plays a vital role in the compression of computer screen images, which are images mixed with textual, graphical, or pictorial contents. In this paper, we present a comparison of two transform-based block classification methods for compound images, using metrics such as speed of classification, precision and recall rate. Block-based classification approaches normally divide the compound image into non-overlapping blocks of fixed size. A frequency transform such as the Discrete Cosine Transform (DCT) or Discrete Wavelet Transform (DWT) is then applied over each block. The mean and standard deviation are computed for each 8 × 8 block and used as the feature set to classify the compound image into text/graphics and picture/background blocks. The classification accuracy of block-classification-based segmentation techniques is measured by evaluation metrics such as precision and recall rate. Compound images with smooth and complex backgrounds containing text of varying size, colour and orientation are considered for testing. Experimental evidence shows that DWT-based segmentation improves recall rate and precision by approximately 2.3% over DCT-based segmentation, with an increase in block classification time, for both smooth and complex background images.
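    The block pipeline above, tiling the image into non-overlapping 8 × 8 blocks, computing mean and standard deviation per block, and classifying each block as text/graphics or picture/background, can be sketched minimally. For simplicity this sketch computes the features on raw pixel blocks rather than DCT/DWT coefficients, and the variance threshold and toy image are invented.

```python
import numpy as np

def block_features(image, block=8):
    """Split a grayscale image into non-overlapping block-by-block tiles
    and return the (mean, std) feature pair for each tile, row-major."""
    h, w = image.shape
    feats = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            tile = image[i:i+block, j:j+block]
            feats.append((tile.mean(), tile.std()))
    return feats

def classify_block(mean, std, std_threshold=20.0):
    """Toy rule: high local variance suggests sharp text/graphics edges,
    low variance suggests smooth picture/background content."""
    return "text/graphics" if std > std_threshold else "picture/background"

# 16x16 image: left half flat background, right half text-like stripes.
img = np.zeros((16, 16))
img[:, 8::2] = 255.0
feats = block_features(img)     # four 8x8 tiles
labels = [classify_block(m, s) for m, s in feats]
```

In the paper the same mean/std features are taken from transform coefficients, which is what makes the DCT-versus-DWT comparison meaningful.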

  1. Bidding Strategies in Agent-based Continuous Double Auctions

    NARCIS (Netherlands)

    H. Ma (Huiye); H.-F. Leung

    2008-01-01

    Online auctions are a platform to trade goods on the Internet. In this context, negotiation capabilities for software agents in continuous double auctions (CDAs) are a central concern. Agents need to be able to prepare bids for and evaluate offers on behalf of the users they represent

  2. Preference-based reasoning in BDI Agent Systems

    NARCIS (Netherlands)

    Visser, Simeon; Thangarajah, John; Harland, James; Dignum, F.P.M.

    2016-01-01

    An important feature of BDI agent systems is the number of different ways in which an agent can achieve its goals. The choice of means to achieve a goal is made by the system at run time, depending on contextual information that is not available in advance. In this article, we explore ways that the

  3. Emergence of heterogeneity in an agent-based model

    OpenAIRE

    Abdullah, Wan Ahmad Tajuddin Wan

    2002-01-01

    We study an interacting agent model of a game-theoretical economy. The agents play a minority-subsequently-majority game and they learn, using backpropagation networks, to obtain higher payoffs. We study the relevance of heterogeneity to performance, and how heterogeneity emerges.

  4. An Agent-Based Auction Protocol on Mobile Devices

    Directory of Open Access Journals (Sweden)

    Yu-Fang Chung

    2014-01-01

    Full Text Available This paper proposes an English auction protocol to preserve a secure, fair, and effective online auction environment, where the operations are integrated with mobile agent technology for bidders participating in online auctions. The protocol consists of four participants, namely, registration manager, agent house, auction house, and bidder.

  5. The institutional stance in agent-based simulations

    NARCIS (Netherlands)

    Sileno, G.; Boer, A.; van Engers, T.; Filipe, J.; Fred, A.L.N.

    2013-01-01

    This paper presents a multi-agent framework intended to animate scenarios of compliance and non-compliance in a normative system. With the purpose of describing social human behaviour, we choose to reduce social complexity by creating models of the involved agents starting from stories, and

  6. An Analysis of Social Class Classification Based on Linguistic Variables

    Institute of Scientific and Technical Information of China (English)

    QU Xia-sha

    2016-01-01

    Since language is an influential tool in social interaction, the relationship between speech and social factors such as social class, gender, and even age is worth studying. People employ different linguistic variables to signal their social class, status and identity in social interaction; this linguistic variation involves vocabulary, sounds, grammatical constructions, dialects and so on. As a result, classification by social class draws people's attention. Linguistic variables in speech interactions indicate the social relationship between people. This paper attempts to illustrate three main linguistic variables which influence social class and with which further sociolinguistic studies need to be more concerned.

  7. Extreme Facial Expressions Classification Based on Reality Parameters

    Science.gov (United States)

    Rahim, Mohd Shafry Mohd; Rad, Abdolvahab Ehsani; Rehman, Amjad; Altameem, Ayman

    2014-09-01

    Extreme expressions are a type of emotional expression stimulated by strong emotion; one example is an expression accompanied by tears. To provide these types of features, additional elements like a fluid mechanism (particle system) and physics techniques such as Smoothed Particle Hydrodynamics (SPH) are introduced. The fusion of facial animation with SPH exhibits promising results. Accordingly, the proposed fluid technique combined with facial animation is the core of this research for obtaining complex expressions, such as laughing, smiling, crying (the emergence of tears) or sadness up to strong crying, as a classification of the extreme expressions that occur on the human face in some cases.

  8. Conditional Mutual Information Based Feature Selection for Classification Task

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana; Somol, Petr; Haindl, Michal; Pudil, Pavel

    2007-01-01

    Roč. 45, č. 4756 (2007), s. 417-426 ISSN 0302-9743 R&D Projects: GA MŠk 1M0572; GA AV ČR IAA2075302 EU Projects: European Commission(XE) 507752 - MUSCLE Grant - others:GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords : Pattern classification * feature selection * conditional mutual information * text categorization Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.402, year: 2005

  9. A canonical correlation analysis based EMG classification algorithm for eliminating electrode shift effect.

    Science.gov (United States)

    Zhe Fan; Zhong Wang; Guanglin Li; Ruomei Wang

    2016-08-01

    Motion classification systems based on surface electromyography (sEMG) pattern recognition have achieved good results under experimental conditions, but clinical implementation and practical application remain a challenge. Many factors contribute to the difficulty of clinical use of EMG-based dexterous control; the most obvious and important is noise in the EMG signal caused by electrode shift, muscle fatigue, motion artifact, the inherent instability of the signal, and biological signals such as the electrocardiogram. In this paper, a novel method based on Canonical Correlation Analysis (CCA) was developed to eliminate the reduction in classification accuracy caused by electrode shift. The average classification accuracy of our method was above 95% for the healthy subjects. In the process, we validated the influence of electrode shift on motion classification accuracy and discovered a strong correlation, with a correlation coefficient above 0.9, between shifted-position data and normal-position data.
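    The canonical correlation the authors exploit, the alignment between shifted-position and normal-position recordings, can be computed, in its simplest form, from the SVD of the whitened cross-covariance of two data matrices. This is a textbook sketch of the first canonical correlation only, not the paper's electrode-shift compensation pipeline; the random data and the linear relationship between X and Y are invented.

```python
import numpy as np

def first_canonical_correlation(X, Y):
    """First canonical correlation between two data matrices (rows are
    samples), via SVD of the whitened cross-covariance."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Sxx, Syy, Sxy = Xc.T @ Xc, Yc.T @ Yc, Xc.T @ Yc

    def inv_sqrt(S):
        # Inverse matrix square root via eigendecomposition (S must be SPD).
        vals, vecs = np.linalg.eigh(S)
        return vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)[0]

# When Y is an exact linear function of X, the first canonical correlation is 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Y = X @ np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
r = first_canonical_correlation(X, Y)
```

In the paper's setting, a high canonical correlation between shifted and normal electrode positions is what allows the shift's effect to be projected out before classification.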

  10. Persuasion Model and Its Evaluation Based on Positive Change Degree of Agent Emotion

    Science.gov (United States)

    Jinghua, Wu; Wenguang, Lu; Hailiang, Meng

    Because it can meet the needs of negotiations among organizations that take place at different times and places, and can make the course of negotiation more rational and its result closer to ideal, agent-based persuasion can improve cooperation among organizations. Integrating emotion change into agent persuasion can further exploit the artificial-intelligence advantages of agents. The emotions involved in agent persuasion are classified, and the concept of positive change degree is introduced. On this basis, a persuasion model based on the positive change degree of agent emotion is constructed and explained through an example. Finally, a method of relative evaluation is given and verified through a calculation example.

  11. Multi-agent based modeling for electric vehicle integration in a distribution network operation

    DEFF Research Database (Denmark)

    Hu, Junjie; Morais, Hugo; Lind, Morten

    2016-01-01

    The purpose of this paper is to present a multi-agent based modeling technology for simulating and operating a hierarchical energy management of a power distribution system with focus on EVs integration. The proposed multi-agent system consists of four types of agents: i) Distribution system operator (DSO) technical agent and ii) DSO market agents that both belong to the top layer of the hierarchy and their roles are to manage the distribution network by avoiding grid congestions and using congestion prices to coordinate the energy scheduled; iii) Electric vehicle virtual power plant agents...

  12. Dynamic Allocation of a Domestic Heating Task to Gas-Based and Heatpump-Based Heating Agents

    NARCIS (Netherlands)

    Treur, J.

    2013-01-01

    In this paper a multi-agent model for a domestic heating task is introduced and analysed. The model includes two alternative heating agents (for gas-based heating and for heatpump-based heating), and a third allocation agent which determines the most economic allocation of the heating task to these

  13. Ship Classification with High Resolution TerraSAR-X Imagery Based on Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Zhi Zhao

    2013-01-01

    Full Text Available Ship surveillance using space-borne synthetic aperture radar (SAR), taking advantage of high resolution over wide swaths and all-weather working capability, has attracted worldwide attention. Recent activity in this field has concentrated mainly on the study of ship detection, but classification is largely still open. In this paper, we propose a novel ship classification scheme based on the analytic hierarchy process (AHP) in order to achieve better performance. The main idea is to apply AHP to both feature selection and the classification decision. On one hand, AHP-based feature selection constructs a selection decision problem based on several feature evaluation measures (e.g., discriminability, stability, and information measure) and provides objective criteria for making comprehensive, quantitative decisions about their combinations. On the other hand, we take the selected feature sets as the input of KNN classifiers and fuse the multiple classification results based on AHP, in which the feature sets' confidence is taken into account when the AHP-based classification decision is made. We analyze the proposed classification scheme and demonstrate its results on a ship dataset that comes from TerraSAR-X SAR images.
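    The AHP machinery used above reduces, at its core, to turning a pairwise-comparison matrix of judgments into priority weights via its principal eigenvector. The sketch below shows only that standard step, not the paper's full feature-selection and fusion scheme; the three compared criteria and the judgment values are invented.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix, taken as
    the principal eigenvector normalized to sum to 1."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    weights = np.abs(principal)      # eigenvector sign is arbitrary
    return weights / weights.sum()

# Toy comparison of three feature-evaluation measures: discriminability is
# judged 3x as important as stability and 5x as important as information.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(A)
```

In the paper these weights would score candidate feature sets (and later classifier outputs); consistency of the judgment matrix would normally also be checked via the consistency ratio.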

  14. A space-based classification system for RF transients

    International Nuclear Information System (INIS)

    Moore, K.R.; Call, D.; Johnson, S.; Payne, T.; Ford, W.; Spencer, K.; Wilkerson, J.F.; Baumgart, C.

    1993-01-01

    The FORTE (Fast On-Orbit Recording of Transient Events) small satellite is scheduled for launch in mid 1995. The mission is to measure and classify VHF (30--300 MHz) electromagnetic pulses, primarily due to lightning, within a high noise environment dominated by continuous wave carriers such as TV and FM stations. The FORTE Event Classifier will use specialized hardware to implement signal processing and neural network algorithms that perform onboard classification of RF transients and carriers. Lightning events will also be characterized with optical data telemetered to the ground. A primary mission science goal is to develop a comprehensive understanding of the correlation between the optical flash and the VHF emissions from lightning. By combining FORTE measurements with ground measurements and/or active transmitters, other science issues can be addressed. Examples include the correlation of global precipitation rates with lightning flash rates and location, the effects of large scale structures within the ionosphere (such as traveling ionospheric disturbances and horizontal gradients in the total electron content) on the propagation of broad bandwidth RF signals, and various areas of lightning physics. Event classification is a key feature of the FORTE mission. Neural networks are promising candidates for this application. The authors describe the proposed FORTE Event Classifier flight system, which consists of a commercially available digital signal processing board and a custom board, and discuss work on signal processing and neural network algorithms

  15. Classification of high resolution imagery based on fusion of multiscale texture features

    International Nuclear Information System (INIS)

    Liu, Jinxiu; Liu, Huiping; Lv, Ying; Xue, Xiaojuan

    2014-01-01

    In the classification of high resolution data, combining texture features with spectral bands can effectively improve classification accuracy. However, the window size, which is difficult to choose, is an important factor influencing overall accuracy in textural classification, and current approaches to image texture analysis depend on a single moving window, which ignores the different scale features of various land cover types. In this paper, we propose a new method based on the fusion of multiscale texture features to overcome these problems. The main steps of the new method are the classification of spectral/textural images with fixed window sizes from 3×3 to 15×15 and the comparison of all posterior probability values for every pixel; the largest probability value is then assigned to the pixel, so the pixel is automatically labeled with a certain land cover type. The proposed approach is tested on University of Pavia ROSIS data. The results indicate that the new method improves classification accuracy compared to methods based on a fixed window size for textural classification
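    The fusion step described above, comparing per-scale posterior probabilities for each pixel and keeping the class behind the single largest value, can be sketched as follows. The posterior values below are invented; in the paper they would come from classifiers run at window sizes 3×3 through 15×15.

```python
import numpy as np

def fuse_multiscale(posteriors):
    """posteriors: array of shape (n_scales, n_pixels, n_classes) holding
    per-scale class posterior probabilities. For each pixel, the class with
    the single highest posterior across all scales wins."""
    n_scales, n_pixels, n_classes = posteriors.shape
    flat = posteriors.transpose(1, 0, 2).reshape(n_pixels, n_scales * n_classes)
    best = flat.argmax(axis=1)      # flat index = scale * n_classes + class
    return best % n_classes

# Two scales, two pixels, three classes.
p = np.array([
    [[0.2, 0.5, 0.3], [0.6, 0.2, 0.2]],   # e.g. 3x3-window posteriors
    [[0.1, 0.2, 0.7], [0.4, 0.3, 0.3]],   # e.g. 15x15-window posteriors
])
labels = fuse_multiscale(p)
```

Pixel 0 is decided by the larger window (its 0.7 posterior dominates) and pixel 1 by the smaller one, which is exactly how the method lets each land cover type pick its own best scale.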

  16. Graph-Based Semi-Supervised Hyperspectral Image Classification Using Spatial Information

    Science.gov (United States)

    Jamshidpour, N.; Homayouni, S.; Safari, A.

    2017-09-01

    Hyperspectral image classification has been one of the most popular research areas in the remote sensing community in the past decades. However, there are still some problems that need specific attention. For example, the lack of enough labeled samples and the high dimensionality problem are two of the most important issues, which degrade the performance of supervised classification dramatically. The main idea of semi-supervised learning is to overcome these issues through the contribution of unlabeled samples, which are available in an enormous amount. In this paper, we propose a graph-based semi-supervised classification method, which uses both spectral and spatial information for hyperspectral image classification. More specifically, two graphs were designed and constructed in order to exploit the relationships among pixels in the spectral and spatial spaces, respectively. Then, the Laplacians of both graphs were merged to form a weighted joint graph. The experiments were carried out on two different benchmark hyperspectral data sets. The proposed method performed significantly better than well-known supervised classification methods, such as SVM. The assessments consisted of both accuracy and homogeneity analyses of the produced classification maps. The proposed spectral-spatial SSL method considerably increased the classification accuracy when the labeled training data set is too scarce. When there were only five labeled samples for each class, the performance improved by 5.92% and 10.76% compared to spatial graph-based SSL, for the AVIRIS Indian Pines and Pavia University data sets respectively.
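    A minimal numerical sketch of the joint-graph idea follows. All edge weights are invented, and the harmonic-function label propagation step is a standard stand-in, not necessarily the authors' exact formulation:

    ```python
    import numpy as np

    def laplacian(W):
        """Unnormalized graph Laplacian L = D - W."""
        return np.diag(W.sum(axis=1)) - W

    def joint_graph_ssl(W_spec, W_spat, labels, labeled_idx, alpha=0.5):
        """Merge spectral and spatial graph Laplacians, then propagate the
        few labels to unlabeled pixels via the harmonic-function solution
        f_u = -L_uu^{-1} L_ul Y_l."""
        L = alpha * laplacian(W_spec) + (1 - alpha) * laplacian(W_spat)
        n = L.shape[0]
        u = np.setdiff1d(np.arange(n), labeled_idx)
        Yl = np.eye(labels.max() + 1)[labels]          # one-hot labeled targets
        Fu = np.linalg.solve(L[np.ix_(u, u)], -L[np.ix_(u, labeled_idx)] @ Yl)
        pred = np.empty(n, dtype=int)
        pred[labeled_idx] = labels
        pred[u] = Fu.argmax(axis=1)
        return pred

    # Four pixels forming two clusters; only pixels 0 and 3 are labeled
    W_spec = np.array([[0.0, 1.0, 0.0, 0.0],
                       [1.0, 0.0, 0.1, 0.0],
                       [0.0, 0.1, 0.0, 1.0],
                       [0.0, 0.0, 1.0, 0.0]])
    W_spat = np.array([[0.0, 0.8, 0.00, 0.0],
                       [0.8, 0.0, 0.05, 0.0],
                       [0.0, 0.05, 0.0, 0.9],
                       [0.0, 0.0, 0.9, 0.0]])
    pred = joint_graph_ssl(W_spec, W_spat,
                           labels=np.array([0, 1]), labeled_idx=np.array([0, 3]))
    # -> [0, 0, 1, 1]
    ```

    The scarce-label setting in the abstract corresponds to `labeled_idx` holding only a handful of pixels per class.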

  17. GRAPH-BASED SEMI-SUPERVISED HYPERSPECTRAL IMAGE CLASSIFICATION USING SPATIAL INFORMATION

    Directory of Open Access Journals (Sweden)

    N. Jamshidpour

    2017-09-01

    Full Text Available Hyperspectral image classification has been one of the most popular research areas in the remote sensing community in the past decades. However, there are still some problems that need specific attention. For example, the lack of enough labeled samples and the high dimensionality problem are two of the most important issues, which degrade the performance of supervised classification dramatically. The main idea of semi-supervised learning is to overcome these issues through the contribution of unlabeled samples, which are available in an enormous amount. In this paper, we propose a graph-based semi-supervised classification method, which uses both spectral and spatial information for hyperspectral image classification. More specifically, two graphs were designed and constructed in order to exploit the relationships among pixels in the spectral and spatial spaces, respectively. Then, the Laplacians of both graphs were merged to form a weighted joint graph. The experiments were carried out on two different benchmark hyperspectral data sets. The proposed method performed significantly better than well-known supervised classification methods, such as SVM. The assessments consisted of both accuracy and homogeneity analyses of the produced classification maps. The proposed spectral-spatial SSL method considerably increased the classification accuracy when the labeled training data set is too scarce. When there were only five labeled samples for each class, the performance improved by 5.92% and 10.76% compared to spatial graph-based SSL, for the AVIRIS Indian Pines and Pavia University data sets respectively.

  18. Reliability of a treatment-based classification system for subgrouping people with low back pain.

    Science.gov (United States)

    Henry, Sharon M; Fritz, Julie M; Trombley, Andrea R; Bunn, Janice Y

    2012-09-01

    Observational, cross-sectional reliability study. To examine the interrater reliability of novice raters in their use of the treatment-based classification (TBC) system for low back pain and to explore the patterns of disagreement in classification errors. Although the interrater reliability of individual test items in the TBC system is moderate to good, some error persists in classification decision making. Understanding which classification errors are common could direct further refinement of the TBC system. Using previously recorded patient data (n = 24), 12 novice raters classified patients according to the TBC schema. These classification results were combined with those of 7 other raters, allowing examination of the overall agreement using the kappa statistic, as well as agreement/disagreement among pairwise comparisons in classification assignments. A chi-square test examined differences in percent agreement between the novice and more experienced raters and differences in classification distributions between these 2 groups of raters. Among 12 novice raters, there was 80.9% agreement in the pairs of classification (κ = 0.62; 95% confidence interval: 0.59, 0.65) and an overall 75.5% agreement (κ = 0.57; 95% confidence interval: 0.55, 0.69) for the combined data set. Raters were least likely to agree on a classification of stabilization (77.5% agreement). The overall percentage of pairwise classification judgments that disagreed was 24.5%, with the most common disagreement being between manipulation and stabilization (11.0%), followed by a mismatch between stabilization and specific exercise (8.2%). Additional refinement is needed to reduce rater disagreement that persists in the TBC decision-making algorithm, particularly in the stabilization category. J Orthop Sports Phys Ther 2012;42(9):797-805, Epub 7 June 2012. doi:10.2519/jospt.2012.4078.
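    For readers unfamiliar with the statistic, Cohen's kappa for one pair of raters discounts the agreement expected by chance. The category labels and ratings below are invented for illustration and are not the study's data:

    ```python
    from collections import Counter

    def cohens_kappa(r1, r2):
        """Cohen's kappa: observed agreement corrected for chance agreement."""
        n = len(r1)
        po = sum(a == b for a, b in zip(r1, r2)) / n               # observed
        c1, c2 = Counter(r1), Counter(r2)
        pe = sum(c1[c] * c2[c] for c in set(c1) | set(c2)) / n**2  # by chance
        return (po - pe) / (1 - pe)

    # Two hypothetical raters assigning TBC categories to eight patients
    rater_a = ["manip", "stab", "stab", "exer", "manip", "stab", "traction", "exer"]
    rater_b = ["manip", "manip", "stab", "exer", "manip", "stab", "traction", "stab"]
    kappa = cohens_kappa(rater_a, rater_b)   # 0.75 observed agreement -> kappa ~ 0.65
    ```

    As in the study, raw percent agreement (here 75%) overstates reliability relative to kappa once chance agreement is removed.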

  19. An agent-based information management model of the Chinese pig sector

    NARCIS (Netherlands)

    Osinga, S.A.; Kramer, M.R.; Hofstede, G.J.; Roozmand, O.; Beulens, A.J.M.

    2010-01-01

    This paper investigates the effect of a selected top-down measure (what-if scenario) on actual agent behaviour and total system behaviour by means of an agent-based simulation model, when agents’ behaviour cannot fully be managed because the agents are autonomous. The Chinese pork sector serves as

  20. Socially rational agents in spatial land use planning: a heuristic proposal based negotiation mechanism

    NARCIS (Netherlands)

    Ghavami, S.M.; Taleai, M.; Arentze, T.A.

    2016-01-01

    This paper introduces a novel heuristic-based negotiation model for urban land use planning using multi-agent systems. The model features two kinds of agents: facilitator and advocate. The facilitator agent runs the negotiation according to a certain protocol that defines the procedure. Two roles are

  1. Agent-based models for higher-order theory of mind

    NARCIS (Netherlands)

    de Weerd, Harmen; Verbrugge, Rineke; Verheij, Bart; Kamiński, Bogumił; Koloch, Grzegorz

    2014-01-01

    Agent-based models are a powerful tool for explaining the emergence of social phenomena in a society. In such models, individual agents typically have little cognitive ability. In this paper, we model agents with the cognitive ability to make use of theory of mind. People use this ability to reason

  2. A Multi Agent Based Approach for Prehospital Emergency Management.

    Science.gov (United States)

    Safdari, Reza; Shoshtarian Malak, Jaleh; Mohammadzadeh, Niloofar; Danesh Shahraki, Azimeh

    2017-07-01

    To demonstrate an architecture that automates the prehospital emergency process and categorizes specialized care according to the situation at the right time, reducing patient mortality and morbidity. The prehospital emergency process was analyzed using existing prehospital management systems and frameworks, and the extracted processes were modeled using sequence diagrams in Rational Rose software. The system's main agents were identified and modeled via a component diagram, considering the main system actors and logically dividing business functionalities; finally, a conceptual architecture for prehospital emergency management was proposed. The proposed architecture was simulated using the AnyLogic simulation software; the AnyLogic Agent Model, State Chart and Process Model were used to model the system. Multi-agent systems (MAS) have had great success in distributed, complex and dynamic problem-solving environments, and utilizing autonomous agents provides intelligent decision-making capabilities. The proposed architecture presents prehospital management operations. The main identified agents are: EMS Center, Ambulance, Traffic Station, Healthcare Provider, Patient, Consultation Center, National Medical Record System and a quality-of-service monitoring agent. In a critical condition like a prehospital emergency, we are coping with sophisticated processes such as ambulance navigation, healthcare provider and service assignment, consultation, recalling a patient's past medical history through a centralized EHR system, and monitoring healthcare quality in a real-time manner. The main advantage of our work has been the utilization of a multi-agent system. Our future work will include implementing the proposed architecture and evaluating its impact on improving patient quality of care.

  3. A review of supervised object-based land-cover image classification

    Science.gov (United States)

    Ma, Lei; Li, Manchun; Ma, Xiaoxue; Cheng, Liang; Du, Peijun; Liu, Yongxue

    2017-08-01

    Object-based image classification for land-cover mapping purposes using remote-sensing imagery has attracted significant attention in recent years. Numerous studies conducted over the past decade have investigated a broad array of sensors, feature selection, classifiers, and other factors of interest. However, these research results have not yet been synthesized to provide coherent guidance on the effect of different supervised object-based land-cover classification processes. In this study, we first construct a database with 28 fields using qualitative and quantitative information extracted from 254 experimental cases described in 173 scientific papers. Second, the results of the meta-analysis are reported, including general characteristics of the studies (e.g., the geographic range of relevant institutes, preferred journals) and the relationships between factors of interest (e.g., spatial resolution and study area or optimal segmentation scale, accuracy and number of targeted classes), especially with respect to the classification accuracy of different sensors, segmentation scale, training set size, supervised classifiers, and land-cover types. Third, useful data on supervised object-based image classification are determined from the meta-analysis. For example, we find that supervised object-based classification is currently experiencing rapid advances, while development of the fuzzy technique is limited in the object-based framework. Furthermore, spatial resolution correlates with the optimal segmentation scale and study area, and Random Forest (RF) shows the best performance in object-based classification. The area-based accuracy assessment method can obtain stable classification performance, and indicates a strong correlation between accuracy and training set size, while the accuracy of the point-based method is likely to be unstable due to mixed objects. In addition, the overall accuracy benefits from higher spatial resolution images (e.g., unmanned aerial

  4. Enzymatic Synthesis of Lignin-Based Concrete Dispersing Agents.

    Science.gov (United States)

    Jankowska, Dagmara; Heck, Tobias; Schubert, Mark; Yerlikaya, Alpaslan; Weymuth, Christophe; Rentsch, Daniel; Schober, Irene; Richter, Michael

    2018-03-15

    Lignin is the most abundant aromatic biopolymer, functioning as an integral component of woody materials. In its unmodified form, it shows limited water solubility and is relatively unreactive, so biotechnological lignin valorisation for high-performance applications remains greatly underexploited. Lignin can be obtained from the pulp and paper industry as a by-product. To expand its application, a new synthesis route to dispersing agents for use as concrete additives was developed. The route is based on lignin functionalisation by enzymatic transformation. Screening of lignin-modifying systems resulted in functionalised lignin polymers with improved solubility in aqueous systems. Through grafting of sulfanilic acid or p-aminobenzoic acid by fungal laccases, lignin became soluble in water at pH≤4 or pH≤7, respectively. The products were analysed and evaluated in miniaturised application tests in cement paste and mortar. Their dispersing properties match the performance criteria of commercially available lignosulfonates. The study provides examples of new perspectives for the use of lignin. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. MAST – A Mobile Agent-based Security Tool

    Directory of Open Access Journals (Sweden)

    Marco Carvalho

    2004-08-01

    Full Text Available One of the chief computer security problems is not the long list of viruses and other potential vulnerabilities, but the vast number of systems that continue to be easy prey, as their system administrators or owners simply are not able to keep up with all of the available patches, updates, or needed configuration changes required to protect them from known vulnerabilities. Even up-to-date systems can become vulnerable to attacks due to inappropriate configuration or the combined use of applications and services. Our mobile agent-based security tool (MAST) is designed to bridge this gap and provide automated methods to make sure that all of the systems in a specific domain or network are secured and up-to-date with all patches and updates. The tool is also designed to check systems for misconfigurations that make them vulnerable. Additionally, the user interface is presented in a domain knowledge model known as a Concept Map that provides a continuous learning experience for the system administrator.

  6. E-laboratories : agent-based modeling of electricity markets

    International Nuclear Information System (INIS)

    North, M.; Conzelmann, G.; Koritarov, V.; Macal, C.; Thimmapuram, P.; Veselka, T.

    2002-01-01

    Electricity markets are complex adaptive systems that operate under a wide range of rules spanning a variety of time scales. These rules are imposed both from above, by society, and from below, by physics. Many electricity markets are undergoing or are about to undergo a transition from centrally regulated systems to decentralized markets. Furthermore, several electricity markets have recently undergone this transition with extremely unsatisfactory results, most notably in California. These high-stakes transitions require the introduction of largely untested regulatory structures. Suitable laboratories are needed in which regulatory structures can be tested before they are applied to real systems. Agent-based models can provide such electronic laboratories, or "e-laboratories." To better understand the requirements of an electricity market e-laboratory, a live electricity market simulation was created. This experience helped to shape the development of the Electricity Market Complex Adaptive Systems (EMCAS) model. To explore EMCAS' potential as an e-laboratory, several variations of the live simulation were created. These variations probed the possible effects of changing power plant outages and price-setting rules on electricity market prices.

  7. Improved Concrete Materials with Hydrogel-Based Internal Curing Agents

    Directory of Open Access Journals (Sweden)

    Matthew J. Krafcik

    2017-11-01

    Full Text Available This research article describes the design and use of polyelectrolyte hydrogel particles as internal curing agents in concrete and presents new results on relevant hydrogel-ion interactions. When incorporated into concrete, hydrogel particles release their stored water to fuel the curing reaction, resulting in reduced volumetric shrinkage and cracking and thus increasing concrete service life. The hydrogel’s swelling performance and mechanical properties are strongly sensitive to multivalent cations that are naturally present in concrete mixtures, including calcium and aluminum. Model poly(acrylic acid (AA)-acrylamide (AM))-based hydrogel particles with different chemical compositions (AA:AM monomer ratios) were synthesized and immersed in sodium, calcium, and aluminum salt solutions. The presence of multivalent cations resulted in decreased swelling capacity and altered swelling kinetics, to the point where some hydrogel compositions displayed rapid deswelling behavior and the formation of a mechanically stiff shell. Interestingly, when incorporated into mortar, hydrogel particles reduced mixture shrinkage while encouraging the formation of specific inorganic phases (calcium hydroxide and calcium silicate hydrate) within the void space previously occupied by the swollen particle.

  8. Dynamic calibration of agent-based models using data assimilation.

    Science.gov (United States)

    Ward, Jonathan A; Evans, Andrew J; Malleson, Nicolas S

    2016-04-01

    A widespread approach to investigating the dynamical behaviour of complex social systems is via agent-based models (ABMs). In this paper, we describe how such models can be dynamically calibrated using the ensemble Kalman filter (EnKF), a standard method of data assimilation. Our goal is twofold. First, we want to present the EnKF in a simple setting for the benefit of ABM practitioners who are unfamiliar with it. Second, we want to illustrate to data assimilation experts the value of using such methods in the context of ABMs of complex social systems and the new challenges these types of model present. We work towards these goals within the context of a simple question of practical value: how many people are there in Leeds (or any other major city) right now? We build a hierarchy of exemplar models that we use to demonstrate how to apply the EnKF and calibrate these using open data of footfall counts in Leeds.
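    One analysis step of the ensemble Kalman filter, in the common perturbed-observations form, can be sketched as below. The "footfall" numbers are invented, and this is not the paper's actual model hierarchy:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def enkf_update(X, obs, obs_std, H):
        """One EnKF analysis step (perturbed observations).

        X:   (n_members, n_state) forecast ensemble
        obs: (n_obs,) observation vector
        H:   (n_obs, n_state) linear observation operator
        """
        A = X - X.mean(axis=0)                    # state anomalies
        HX = X @ H.T
        HA = HX - HX.mean(axis=0)                 # observed-space anomalies
        m = X.shape[0]
        P_hh = HA.T @ HA / (m - 1) + obs_std**2 * np.eye(len(obs))
        P_xh = A.T @ HA / (m - 1)
        K = P_xh @ np.linalg.inv(P_hh)            # Kalman gain
        perturbed = obs + rng.normal(0.0, obs_std, size=(m, len(obs)))
        return X + (perturbed - HX) @ K.T

    # Ensemble guess of "people in the city centre": mean ~1000, wide spread
    ensemble = rng.normal(1000.0, 200.0, size=(100, 1))
    updated = enkf_update(ensemble, obs=np.array([1500.0]), obs_std=20.0,
                          H=np.eye(1))
    # the analysis ensemble mean moves close to the observed count of 1500
    ```

    With a precise observation relative to the forecast spread, the gain is close to one, so the analysis ensemble collapses toward the observed footfall count.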

  9. An Agent Based Model of Household Water Use

    Directory of Open Access Journals (Sweden)

    Clinton J. Andrews

    2013-07-01

    Full Text Available Households consume a significant fraction of total potable water production. Strategies to improve the efficiency of water use tend to emphasize technological interventions to reduce or shift water demand. Behavioral water use reduction strategies can also play an important role, but a flexible framework for exploring the “what-ifs” has not been available. This paper introduces such a framework, presenting an agent-based model of household water-consuming behavior. The model simulates hourly water-using activities of household members within a rich technological and behavioral context, calibrated with appropriate data. Illustrative experiments compare the resulting water usage of U.S. and Dutch households and their associated water-using technologies, different household types (singles, families with children, and retired couples), different water metering regimes, and educational campaigns. All else equal, Dutch and metered households use less water. Retired households use more water because they are more often at home. Water-saving educational campaigns are effective for the part of the population that is receptive. Important interactions among these factors, both technological and behavioral, highlight the value of this framework for integrated analysis of the human-technology-water system.
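    A toy sketch of the hourly-activity idea follows; all fixture volumes, probabilities, and schedules are invented, not the model's calibrated data:

    ```python
    import random

    random.seed(7)

    # Invented per-use volumes; a real model would calibrate these with data.
    FIXTURES = {"shower": 60.0, "toilet": 6.0, "tap": 2.0}   # litres per use

    def simulate_day(members, at_home_hours, use_prob=0.15):
        """Hourly loop: each member at home may trigger one water-using
        activity per hour with probability use_prob."""
        total = 0.0
        for hour in range(24):
            if hour not in at_home_hours:
                continue
            for _ in range(members):
                if random.random() < use_prob:
                    total += FIXTURES[random.choice(sorted(FIXTURES))]
        return total

    # A retired couple at home most of the day vs. a single working adult
    retired = simulate_day(members=2, at_home_hours=range(7, 23))
    single = simulate_day(members=1, at_home_hours={7, 8, 19, 20, 21, 22})
    ```

    Even this crude sketch reproduces the abstract's qualitative finding that households at home more hours use more water; the paper's model adds technology choice, metering, and behavioral feedbacks on top.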

  10. Agents Based e-Commerce and Securing Exchanged Information

    Science.gov (United States)

    Al-Jaljouli, Raja; Abawajy, Jemal

    Mobile agents have been implemented in e-Commerce to search and filter information of interest from electronic markets. When the information is very sensitive and critical, it is important to develop a novel security protocol that can efficiently protect the information from malicious tampering as well as unauthorized disclosure, or at least detect any malicious act of intruders. In this chapter, we describe robust security techniques that ensure sound security of the information gathered throughout the agent’s itinerary against various security attacks, including truncation attacks. A sound security protocol is described, which implements the various security techniques that jointly prevent, or at least detect, any malicious act of intruders. We reason about the soundness of the protocol using the Symbolic Trace Analyzer (STA), a formal verification tool based on symbolic techniques. We analyze the protocol in key configurations and show that it is free of flaws. We also show that the protocol fulfils the various security requirements of exchanged information in MAS, including data integrity, data confidentiality, data authenticity, origin confidentiality and data non-repudiability.

  11. A SIMULATION OF CONTRACT FARMING USING AGENT BASED MODELING

    Directory of Open Access Journals (Sweden)

    Yuanita Handayati

    2016-12-01

    Full Text Available This study aims to simulate the effects of contract farming and farmer commitment to contract farming on supply chain performance by using agent based modeling as a methodology. Supply chain performance is represented by profits and service levels. The simulation results indicate that farmers should pay attention to customer requirements and plan their agricultural activities in order to fulfill these requirements. Contract farming helps farmers deal with demand and price uncertainties. We also find that farmer commitment is crucial to fulfilling contract requirements. This study contributes to this field from a conceptual as well as a practical point of view. From the conceptual point of view, our simulation results show that different levels of farmer commitment have an impact on farmer performance when implementing contract farming. From a practical point of view, the uncertainty faced by farmers and the market can be managed by implementing cultivation and harvesting scheduling, information sharing, and collective learning as ways of committing to contract farming.

  12. Agent-Based Crowd Simulation Considering Emotion Contagion for Emergency Evacuation Problem

    Science.gov (United States)

    Faroqi, H.; Mesgari, M.-S.

    2015-12-01

    During emergencies, emotions greatly affect human behaviour. For more realistic multi-agent systems in simulations of emergency evacuations, it is important to incorporate emotions and their effects on the agents. In short, emotional contagion is a process in which a person or group influences the emotions or behavior of another person or group through the conscious or unconscious induction of emotion states and behavioral attitudes. In this study, we simulate an emergency situation in an open square area with three exits, considering Adult and Child agents with different behavior. Security agents are also considered, in order to guide Adults and Children to the exits and keep them calm. Six emotion levels are considered for each agent in different scenarios and situations. The agent-based model is initialized with a random scattering of the agent population; when an alarm occurs, each agent reacts to the situation based on its own and its neighbors' current circumstances. The main goal of each agent is first to find the exit, and then to help other agents find their way. The numbers of exited and injured agents, along with their emotion levels, are compared across scenarios with different initializations in order to evaluate the simulated model. NetLogo 5.2 is used as the multi-agent simulation framework, with R as the development language.
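    A minimal sketch of one emotion-contagion step is shown below. The update rule, emotion scale (0-5), damping value, and all agent names are hypothetical, intended only to illustrate neighbour influence and the calming role of Security agents:

    ```python
    # Each agent's emotion level drifts toward the mean of its neighbours,
    # and a Security agent in the neighbourhood damps panic by one level.
    def contagion_step(emotions, neighbours, security, damping=1):
        new = {}
        for agent, level in emotions.items():
            nb = neighbours.get(agent, [])
            if nb:
                mean_nb = sum(emotions[n] for n in nb) / len(nb)
                level = round((level + mean_nb) / 2)   # pull toward neighbours
            if any(n in security for n in nb):
                level = max(0, level - damping)        # calming influence
            new[agent] = min(5, max(0, level))         # clamp to 0..5 scale
        return new

    emotions = {"a": 5, "b": 1, "c": 3, "s": 0}        # "s" is a Security agent
    neighbours = {"a": ["b", "s"], "b": ["a"], "c": ["b"]}
    out = contagion_step(emotions, neighbours, security={"s"})
    # panicked agent "a" calms down; calm agent "b" is agitated by "a"
    ```

    Iterating such a step each tick, alongside movement toward exits, gives the basic loop an evacuation model of this kind runs after the alarm.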

  13. AGENT-BASED CROWD SIMULATION CONSIDERING EMOTION CONTAGION FOR EMERGENCY EVACUATION PROBLEM

    Directory of Open Access Journals (Sweden)

    H. Faroqi

    2015-12-01

    Full Text Available During emergencies, emotions greatly affect human behaviour. For more realistic multi-agent systems in simulations of emergency evacuations, it is important to incorporate emotions and their effects on the agents. In short, emotional contagion is a process in which a person or group influences the emotions or behavior of another person or group through the conscious or unconscious induction of emotion states and behavioral attitudes. In this study, we simulate an emergency situation in an open square area with three exits, considering Adult and Child agents with different behavior. Security agents are also considered, in order to guide Adults and Children to the exits and keep them calm. Six emotion levels are considered for each agent in different scenarios and situations. The agent-based model is initialized with a random scattering of the agent population; when an alarm occurs, each agent reacts to the situation based on its own and its neighbors' current circumstances. The main goal of each agent is first to find the exit, and then to help other agents find their way. The numbers of exited and injured agents, along with their emotion levels, are compared across scenarios with different initializations in order to evaluate the simulated model. NetLogo 5.2 is used as the multi-agent simulation framework, with R as the development language.

  14. Characterization of nanoparticle-based contrast agents for molecular magnetic resonance imaging

    International Nuclear Information System (INIS)

    Shan, Liang; Chopra, Arvind; Leung, Kam; Eckelman, William C.; Menkens, Anne E.

    2012-01-01

    The development of molecular imaging agents is currently undergoing a dramatic expansion. As of October 2011, ∼4,800 newly developed agents have been synthesized and characterized in vitro and in animal models of human disease. Despite this rapid progress, the transfer of these agents to clinical practice is rather slow. To address this issue, the National Institutes of Health launched the Molecular Imaging and Contrast Agents Database (MICAD) in 2005 to provide freely accessible online information regarding molecular imaging probes and contrast agents for the imaging community. While compiling information regarding imaging agents published in peer-reviewed journals, the MICAD editors have observed that some important information regarding the characterization of a contrast agent is not consistently reported. This makes it difficult for investigators to evaluate and meta-analyze data generated from different studies of imaging agents, especially for the agents based on nanoparticles. This article is intended to serve as a guideline for new investigators for the characterization of preclinical studies performed with nanoparticle-based MRI contrast agents. The common characterization parameters are summarized into seven categories: contrast agent designation, physicochemical properties, magnetic properties, in vitro studies, animal studies, MRI studies, and toxicity. Although no single set of parameters is suitable to define the properties of the various types of contrast agents, it is essential to ensure that these agents meet certain quality control parameters at the preclinical stage, so that they can be used without delay for clinical studies.

  15. Can aquatic macrophytes be biofilters for gadolinium based contrasting agents?

    Science.gov (United States)

    Braun, Mihály; Zavanyi, Györgyi; Laczovics, Attila; Berényi, Ervin; Szabó, Sándor

    2018-05-15

    The use of gadolinium-based contrast agents (GBCAs) is increasing because of the intensive usage of these agents in magnetic resonance imaging (MRI). Wastewater treatment does not reduce anthropogenic Gd concentrations significantly, and anomalous Gd concentrations in surface waters have been reported worldwide. However, the removal of GBCAs by aquatic macrophytes has hardly been investigated. Four aquatic plant species (Lemna gibba, Ceratophyllum demersum, Elodea nuttallii, E. canadensis) were investigated as potential biological filters for the removal of commonly used but structurally different GBCAs (Omniscan, Dotarem) from water. These plant species are known to accumulate heavy metals and are used for removing pollutants in constructed wetlands. The Gd uptake and release of the plants were examined under laboratory conditions. Concentration-dependent infiltration of Gd into the body of the macrophytes was measured; however, significant bioaccumulation was not observed. The tissue concentration of Gd reached its maximum value between day one and day four in L. gibba and C. demersum, respectively, and its value was significantly higher in C. demersum than in L. gibba. In C. demersum, the open-chain ligand Omniscan caused a two-times-higher tissue Gd concentration than the macrocyclic ligand Dotarem. Gadolinium was released from Gd-treated duckweeds into the water as they were grown further in Gd-free nutrient solution. Tissue Gd concentration dropped by 50% in duckweed treated with Omniscan and with Dotarem within 1.9 and 2.9 days respectively. None of the macrophytes had a significant impact on the Gd concentration of the water at low and medium concentration levels (1-256 μg L⁻¹). Biofiltration of GBCAs by common macrophytes could not be detected in our experiments. Therefore it seems that in constructed wetlands, aquatic plants are not able to reduce the concentration of GBCAs in the water. Furthermore there is a low risk that these plants cause the

  16. Improving the Computational Performance of Ontology-Based Classification Using Graph Databases

    Directory of Open Access Journals (Sweden)

    Thomas J. Lampoltshammer

    2015-07-01

    Full Text Available The increasing availability of very high-resolution remote sensing imagery (i.e., from satellites, airborne laser scanning, or aerial photography) represents both a blessing and a curse for researchers. The manual classification of these images, or other similar geo-sensor data, is time-consuming and leads to subjective and non-deterministic results. Due to this fact, (semi-)automated classification approaches are in high demand in affected research areas. Ontologies provide a proper way of automated classification for various kinds of sensor data, including remotely sensed data. However, the processing of data entities—so-called individuals—is one of the most cost-intensive computational operations within ontology reasoning. Therefore, an approach based on graph databases is proposed to overcome the issue of high time consumption in the classification task. The introduced approach shifts the classification task from the classical Protégé environment and its common reasoners to the proposed graph-based approaches. For validation, the authors tested the approach on a simulation scenario based on a real-world example. The results demonstrate a quite promising improvement in classification speed—up to 80,000 times faster than the Protégé-based approach.

  17. An object-oriented classification method of high resolution imagery based on improved AdaTree

    International Nuclear Information System (INIS)

    Xiaohe, Zhang; Liang, Zhai; Jixian, Zhang; Huiyong, Sang

    2014-01-01

    With the growing application of high-spatial-resolution remote sensing imagery, more and more studies have paid attention to object-oriented classification, covering both image segmentation and automatic classification after segmentation. This paper proposes a fast method of object-oriented automatic classification. First, edge-based or FNEA-based segmentation is used to identify image objects, and the values of the image-object attributes most suitable for classification are calculated. Then a certain number of image objects are selected as training data for the improved AdaTree algorithm, which produces classification rules. Finally, the image objects can be classified easily using these rules. In the AdaTree, we mainly modified the final hypothesis to obtain the classification rules. In an experiment with a WorldView-2 image, the method based on the AdaTree showed a clear improvement in accuracy and efficiency compared with a method based on SVM, with the kappa coefficient reaching 0.9242.
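    The paper's improved AdaTree is not spelled out in the abstract; as a rough stand-in, a minimal classical AdaBoost over decision stumps (the boosted-tree family AdaTree builds on) looks like this, with synthetic one-attribute "image objects":

    ```python
    import numpy as np

    def stump_fit(X, y, w):
        """Best decision stump (feature j, threshold t, polarity) under weights w."""
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, pol)
        return best

    def adaboost(X, y, rounds=5):
        """Minimal AdaBoost with stump weak learners; y must be in {-1, +1}."""
        m = len(y)
        w = np.full(m, 1.0 / m)
        model = []
        for _ in range(rounds):
            err, j, t, pol = stump_fit(X, y, w)
            err = max(err, 1e-12)                  # avoid log(0)
            alpha = 0.5 * np.log((1 - err) / err)  # weight of this stump
            pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
            w *= np.exp(-alpha * y * pred)         # up-weight hard samples
            w /= w.sum()
            model.append((alpha, j, t, pol))
        return model

    def predict(model, X):
        score = sum(a * np.where(p * (X[:, j] - t) > 0, 1, -1)
                    for a, j, t, p in model)
        return np.sign(score)

    # Toy "image objects": one attribute, two classes, linearly separable
    X = np.array([[0.0], [1.0], [2.0], [3.0]])
    y = np.array([-1, -1, 1, 1])
    pred = predict(adaboost(X, y), X)   # -> [-1, -1, 1, 1]
    ```

    The paper's modification concerns how the final hypothesis (the weighted vote in `predict`) is turned into explicit classification rules; that rule-extraction step is omitted here.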

  18. Multi-material classification of dry recyclables from municipal solid waste based on thermal imaging.

    Science.gov (United States)

    Gundupalli, Sathish Paulraj; Hait, Subrata; Thakur, Atul

    2017-12-01

    There has been a significant rise in municipal solid waste (MSW) generation in the last few decades due to rapid urbanization and industrialization. Due to the lack of source segregation practices, a need for automated segregation of recyclables from MSW exists in developing countries. This paper reports a thermal imaging based system for classifying useful recyclables from a simulated MSW sample. Experimental results demonstrate the possibility of using the thermal imaging technique for classification, together with a robotic system for sorting recyclables, in a single process step. The reported classification system yields an accuracy in the range of 85-96% and is comparable with existing single-material recyclable classification techniques. We believe that the reported thermal imaging based system can emerge as a viable and inexpensive large-scale classification-cum-sorting technology in recycling plants for processing MSW in developing countries. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. A Novel Imbalanced Data Classification Approach Based on Logistic Regression and Fisher Discriminant

    Directory of Open Access Journals (Sweden)

    Baofeng Shi

    2015-01-01

    Full Text Available We introduce an imbalanced data classification approach based on logistic regression significant discriminant and Fisher discriminant analysis. First, a key-indicator extraction model based on logistic regression significant discriminant and correlation analysis is derived to extract features for customer classification. Second, a customer scoring model is established on the basis of linear weighting using the Fisher discriminant. A customer rating model, in which the customer numbers across ratings follow a normal distribution, is then constructed. The performance of the proposed model and the classical SVM classification method are evaluated in terms of their ability to correctly classify customers as default or non-default. Empirical results using data on 2157 customers in financial engineering suggest that the proposed approach performs better than the SVM model in dealing with imbalanced data classification. Moreover, our approach helps locate qualified customers for banks and bond investors.
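The Fisher-discriminant scoring step of such a pipeline can be sketched as follows. This is a generic implementation on synthetic imbalanced data, not the paper's full model; the logistic-regression feature selection stage is omitted, and all class means and sample sizes below are invented:

```python
import numpy as np

# Generic Fisher discriminant scoring on synthetic imbalanced data.

def fisher_direction(X0, X1):
    """Fisher direction w = Sw^-1 (mu1 - mu0); small ridge for stability."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    return np.linalg.solve(Sw + 1e-6 * np.eye(len(mu0)), mu1 - mu0)

def score(X, w):
    return X @ w  # higher scores lean toward class 1 (default customers)

rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, (200, 2))   # majority: non-default customers
X1 = rng.normal(2.5, 1.0, (20, 2))    # minority: default customers

w = fisher_direction(X0, X1)
thr = (score(X0, w).mean() + score(X1, w).mean()) / 2  # midpoint cut-off
minority_recall = (score(X1, w) > thr).mean()
majority_specificity = (score(X0, w) <= thr).mean()
```

Placing the cut-off at the midpoint of the projected class means, rather than at a frequency-weighted point, is one simple way the imbalance between the two customer groups can be kept from dominating the decision.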

  20. UAV-Based Crops Classification with Joint Features from Orthoimage and DSM Data

    Science.gov (United States)

    Liu, B.; Shi, Y.; Duan, Y.; Wu, W.

    2018-04-01

    Accurate crop classification remains a challenging task because the same crop can exhibit different spectra and different crops can share the same spectrum. Recently, UAV-based remote sensing has gained popularity not only for its high spatial and temporal resolution, but also for its ability to obtain spectral and spatial data at the same time. This paper focuses on how to take full advantage of spatial and spectral features to improve crop classification accuracy, based on a UAV platform equipped with a general digital camera. Texture and spatial features extracted from the RGB orthoimage and the digital surface model of the monitored area are analysed and integrated within an SVM classification framework. Extensive experimental results indicate that the overall classification accuracy improves dramatically, from 72.9 % to 94.5 %, when the spatial features are combined, which verifies the feasibility and effectiveness of the proposed method.

  1. Resting State fMRI Functional Connectivity-Based Classification Using a Convolutional Neural Network Architecture.

    Science.gov (United States)

    Meszlényi, Regina J; Buza, Krisztian; Vidnyánszky, Zoltán

    2017-01-01

    Machine learning techniques have become increasingly popular in the field of resting state fMRI (functional magnetic resonance imaging) network based classification. However, the application of convolutional networks has been proposed only very recently and has remained largely unexplored. In this paper we describe a convolutional neural network architecture for functional connectome classification called connectome-convolutional neural network (CCNN). Our results on simulated datasets and on a publicly available dataset for amnestic mild cognitive impairment classification demonstrate that the CCNN model can efficiently distinguish between subject groups. We also show that the connectome-convolutional network is capable of combining information from diverse functional connectivity metrics, and that models using a combination of different connectivity descriptors outperform classifiers using only one metric. This flexibility means that the proposed CCNN model can easily be adapted to a wide range of connectome-based classification or regression tasks by varying which connectivity descriptor combinations are used to train the network.

  2. Cancer Classification Based on Support Vector Machine Optimized by Particle Swarm Optimization and Artificial Bee Colony.

    Science.gov (United States)

    Gao, Lingyun; Ye, Mingquan; Wu, Changrong

    2017-11-29

    Intelligent optimization algorithms have advantages in dealing with complex nonlinear problems, offering good flexibility and adaptability. In this paper, the FCBF (Fast Correlation-Based Feature selection) method is used to filter out irrelevant and redundant features in order to improve the quality of cancer classification. Classification is then performed with an SVM (Support Vector Machine) optimized by PSO (Particle Swarm Optimization) combined with ABC (Artificial Bee Colony), an approach denoted PA-SVM. The proposed PA-SVM method is applied to nine cancer datasets, including five outcome-prediction datasets and a protein dataset of ovarian cancer. Comparison with other classification methods demonstrates the effectiveness and robustness of the proposed PA-SVM method in handling various types of data for cancer classification.
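The PSO component of such a hybrid optimizer can be sketched with a minimal swarm loop. Here it minimises a simple surrogate function with a known optimum; in the paper's setting the objective would instead be cross-validated SVM error over the kernel parameters, and the inertia and acceleration constants below are common defaults, not the paper's values:

```python
import numpy as np

# Minimal particle swarm optimisation loop (illustrative sketch).

def pso(objective, bounds, n_particles=20, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    pos = rng.uniform(lo, hi, (n_particles, len(lo)))
    vel = np.zeros_like(pos)
    pbest = pos.copy()                                  # per-particle best
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()            # swarm-wide best
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Surrogate objective with known minimum at (1, 2); in PA-SVM this would be
# the cross-validated SVM error as a function of (C, gamma).
best, val = pso(lambda p: (p[0] - 1) ** 2 + (p[1] - 2) ** 2,
                bounds=[(-5, 5), (-5, 5)])
```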

  3. Sparse Representation Based Multi-Instance Learning for Breast Ultrasound Image Classification.

    Science.gov (United States)

    Bing, Lu; Wang, Wei

    2017-01-01

    We propose a novel method based on sparse representation for breast ultrasound image classification under the framework of multi-instance learning (MIL). After image enhancement and segmentation, concentric circles are used to extract global and local features to improve accuracy in diagnosis and prediction. The ultrasound image classification problem is converted into a sparse-representation-based MIL problem. Each instance of a bag is represented as a sparse linear combination of all basis vectors in the dictionary, and the bag is then represented by a single feature vector obtained via the sparse representations of all instances within the bag. The sparse MIL problem is further converted into a conventional learning problem that is solved by a relevance vector machine (RVM). The results of the single classifiers are combined for classification. Experimental results on breast cancer datasets demonstrate the superiority of the proposed method in classification accuracy compared with state-of-the-art MIL methods.

  4. Accurate crop classification using hierarchical genetic fuzzy rule-based systems

    Science.gov (United States)

    Topaloglou, Charalampos A.; Mylonas, Stelios K.; Stavrakoudis, Dimitris G.; Mastorocostas, Paris A.; Theocharis, John B.

    2014-10-01

    This paper investigates the effectiveness of an advanced classification system for accurate crop classification using very high resolution (VHR) satellite imagery. Specifically, a recently proposed genetic fuzzy rule-based classification system (GFRBCS) is employed, namely, the Hierarchical Rule-based Linguistic Classifier (HiRLiC). HiRLiC's model comprises a small set of simple IF-THEN fuzzy rules, easily interpretable by humans. One of its most important attributes is that its learning algorithm requires minimum user interaction, since the most important learning parameters affecting the classification accuracy are determined by the learning algorithm automatically. HiRLiC is applied in a challenging crop classification task, using a SPOT5 satellite image over an intensively cultivated area in a lake-wetland ecosystem in northern Greece. A rich set of higher-order spectral and textural features is derived from the initial bands of the (pan-sharpened) image, resulting in an input space comprising 119 features. The experimental analysis proves that HiRLiC compares favorably to other interpretable classifiers in the literature, both in terms of structural complexity and classification accuracy. Its testing accuracy was very close to that obtained by complex state-of-the-art classification systems, such as the support vector machine (SVM) and random forest (RF) classifiers. Nevertheless, visual inspection of the derived classification maps shows that HiRLiC has better generalization properties, providing more homogeneous classifications than its competitors. Moreover, its runtime requirements for producing the thematic map were orders of magnitude lower than those of the competitors.

  5. Efficacy measures associated to a plantar pressure based classification system in diabetic foot medicine.

    Science.gov (United States)

    Deschamps, Kevin; Matricali, Giovanni Arnoldo; Desmet, Dirk; Roosen, Philip; Keijsers, Noel; Nobels, Frank; Bruyninckx, Herman; Staes, Filip

    2016-09-01

    The concept of 'classification' has, as for many other diseases, been found to be fundamental in the field of diabetic medicine. In the current study, we aimed at determining efficacy measures of a recently published plantar pressure based classification system. Technical efficacy of the classification system was investigated by applying a high-resolution, pixel-level analysis to the normalized plantar pressure pedobarographic fields of the original experimental dataset consisting of 97 patients with diabetes and 33 persons without diabetes. Clinical efficacy was assessed by considering the occurrence of foot ulcers at the plantar aspect of the forefoot in this dataset. Classification efficacy was assessed by determining the classification recognition rate as well as its sensitivity and specificity using cross-validation subsets of the experimental dataset together with a novel cohort of 12 patients with diabetes. Pixel-level comparison of the four groups associated with the classification system highlighted distinct regional differences. Retrospective analysis showed the occurrence of eleven foot ulcers in the experimental dataset since their gait analysis. Eight of the eleven ulcers developed in the region of the foot with the highest forces. The overall classification recognition rate exceeded 90% for all cross-validation subsets. Sensitivity and specificity of the four groups associated with the classification system exceeded the 0.7 and 0.8 levels, respectively, in all cross-validation subsets. The results of the current study support the use of the novel plantar pressure based classification system in diabetic foot medicine. It may particularly serve in communication, diagnosis and clinical decision making. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Case base classification on digital mammograms: improving the performance of case base classifier

    Science.gov (United States)

    Raman, Valliappan; Then, H. H.; Sumari, Putra; Venkatesa Mohan, N.

    2011-10-01

    Breast cancer continues to be a significant public health problem in the world. Early detection is the key to improving breast cancer prognosis. The aim of the research presented here is twofold. The first stage involves machine learning techniques that segment and extract features from masses in digital mammograms. The second stage is a problem-solving approach that classifies the mass with a performance-based case base classifier. In this paper we build a case-based classifier in order to diagnose mammographic images, and we explain the different methods and behaviors that have been added to the classifier to improve its performance. The paper proposes an initial performance-based classifier with bagging, whose implementation shows an improvement in specificity and sensitivity.

  7. Voting-based Classification for E-mail Spam Detection

    Directory of Open Access Journals (Sweden)

    Bashar Awad Al-Shboul

    2016-06-01

    Full Text Available The problem of spam e-mail has gained a tremendous amount of attention. Although entities tend to use e-mail spam filter applications to filter out received spam e-mails, marketing companies still send unsolicited e-mails in bulk, and users still receive a considerable amount of spam despite those filtering applications. This work proposes a new method for classifying e-mails into spam and non-spam. First, several e-mail content features are extracted, and those features are then used to classify each e-mail individually. The classification results of three different classifiers (i.e. Decision Trees, Random Forests and k-Nearest Neighbor) are combined in various voting schemes (i.e. majority vote, average probability, product of probabilities, minimum probability and maximum probability) to make the final decision. To validate our method, two different spam e-mail collections were used.
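The five voting schemes named above can be sketched directly. The probabilities below are invented stand-ins for the spam probabilities a trained Decision Tree, Random Forest and k-NN would output for one e-mail:

```python
# Combining per-classifier spam probabilities under the five voting schemes.

def combine(probs, scheme, threshold=0.5):
    """Return True (spam) or False (non-spam) for one e-mail."""
    if scheme == "majority":
        votes = sum(p > threshold for p in probs)
        return votes > len(probs) / 2
    if scheme == "average":
        return sum(probs) / len(probs) > threshold
    if scheme == "product":
        prod = 1.0
        for p in probs:
            prod *= p
        # compare against the product of per-classifier thresholds
        return prod > threshold ** len(probs)
    if scheme == "minimum":
        return min(probs) > threshold   # spam only if every classifier agrees
    if scheme == "maximum":
        return max(probs) > threshold   # spam if any classifier is confident
    raise ValueError(f"unknown scheme: {scheme}")

# Hypothetical DT, RF and k-NN spam probabilities for one e-mail:
probs = [0.9, 0.6, 0.4]
```

On this example the minimum-probability scheme is the only one that votes non-spam, illustrating how the schemes trade false positives against false negatives.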

  8. A CNN Based Approach for Garments Texture Design Classification

    Directory of Open Access Journals (Sweden)

    S.M. Sofiqul Islam

    2017-05-01

    Full Text Available Identifying garment texture designs automatically for recommending fashion trends is important nowadays because of the rapid growth of online shopping. By learning the properties of images efficiently, a machine can classify them more accurately. Several hand-engineered feature-coding schemes exist for identifying garment design classes. Recently, Deep Convolutional Neural Networks (CNNs) have shown better performance on various object recognition tasks. A deep CNN uses multiple levels of representation and abstraction that help a machine understand the data more accurately. In this paper, a CNN model for identifying garment design classes is proposed. Experimental results on two different datasets show better results than two well-known existing CNN models (AlexNet and VGGNet) and some state-of-the-art hand-engineered feature extraction methods.

  9. Agent Based Modeling of Human Gut Microbiome Interactions and Perturbations.

    Directory of Open Access Journals (Sweden)

    Tatiana Shashkova

    Full Text Available Intestinal microbiota plays an important role in human health. It is involved in digestion and protects the host against external pathogens. Examination of intestinal microbiome interactions is required for understanding the community's influence on host health. Studies of the microbiome can provide insight into methods of improving health, including specific clinical procedures for modifying an individual's microbial community composition and correcting the microbiota by colonization with new bacterial species or by dietary changes. In this work we report an agent-based model of interactions between two bacterial species and between the species and the gut. The model is based on reactions describing bacterial fermentation of polysaccharides to acetate and propionate and fermentation of acetate to butyrate. Antibiotic treatment was chosen as the disturbance factor and used to investigate the stability of the system. System recovery after antibiotic treatment was analyzed as a function of the number of feedback interactions inside the community, the therapy duration, and the amount of antibiotics. Bacterial species are known to mutate and acquire resistance to antibiotics. The ability to mutate was modeled as a stochastic process, and under this assumption the ratio of sensitive to resistant bacteria was calculated during antibiotic therapy and recovery. The model confirms the hypothesis that feedback mechanisms are necessary for the functionality and stability of the system after a disturbance. A high fraction of the bacterial community was shown to mutate during antibiotic treatment, though sensitive strains could become dominant again after recovery. The recovery of sensitive strains is explained by the fitness cost of resistance. The model demonstrates not only the quantitative dynamics of the bacterial species, but also makes it possible to observe the emergent spatial structure and its alteration, depending on various feedback mechanisms.
Visual version of the model shows that spatial
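The antibiotic-perturbation experiment described above can be sketched with a minimal stochastic agent model: sensitive cells may mutate to resistant ones at division, resistance carries a growth fitness cost, and an antibiotic pulse kills sensitive cells. This is not the paper's two-species fermentation model; all rates and population sizes below are hypothetical:

```python
import random

# Minimal illustrative sketch of an antibiotic pulse acting on a population
# of agents (True = resistant, False = sensitive). Rates are invented.
random.seed(1)

CAP, MUTATION, KILL, COST = 1000, 0.01, 0.6, 0.3

def step(pop, antibiotic):
    """Advance the population by one time step."""
    new = []
    crowding = max(1 - len(pop) / CAP, 0)        # logistic growth limit
    for resistant in pop:
        if antibiotic and not resistant and random.random() < KILL:
            continue                             # sensitive cell killed
        new.append(resistant)
        growth = 0.5 * ((1 - COST) if resistant else 1.0) * crowding
        if random.random() < growth:             # cell divides
            # resistant parents breed true; sensitive ones may mutate
            new.append(resistant or random.random() < MUTATION)
    return new

pop = [False] * 500                              # start all-sensitive
for t in range(60):
    pop = step(pop, antibiotic=(10 <= t < 30))   # antibiotic pulse
resistant_frac = sum(pop) / len(pop) if pop else 0.0
```

Running the pulse for longer, or removing the fitness cost, shifts the post-recovery balance toward the resistant strain, which is the qualitative behavior the abstract reports.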

  10. An Agent-Based Model of Evolving Community Flood Risk.

    Science.gov (United States)

    Tonn, Gina L; Guikema, Seth D

    2017-11-17

    Although individual behavior plays a major role in community flood risk, traditional flood risk models generally do not capture information on how community policies and individual decisions impact the evolution of flood risk over time. The purpose of this study is to improve the understanding of the temporal aspects of flood risk through a combined analysis of the behavioral, engineering, and physical hazard aspects of flood risk. Additionally, the study aims to develop a new modeling approach for integrating behavior, policy, flood hazards, and engineering interventions. An agent-based model (ABM) is used to analyze the influence of flood protection measures, individual behavior, and the occurrence of floods and near-miss flood events on community flood risk. The ABM focuses on the following decisions and behaviors: dissemination of flood management information, installation of community flood protection, elevation of household mechanical equipment, and elevation of homes. The approach is place based, with a case study area in Fargo, North Dakota, but is focused on generalizable insights. Generally, community mitigation results in reduced future damage, and individual action, including mitigation and movement into and out of high-risk areas, can have a significant influence on community flood risk. The results of this study provide useful insights into the interplay between individual and community actions and how it affects the evolution of flood risk. This study lends insight into priorities for future work, including the development of more in-depth behavioral and decision rules at the individual and community level. © 2017 Society for Risk Analysis.

  11. New approaches in agent-based modeling of complex financial systems

    Science.gov (United States)

    Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei

    2017-12-01

    Agent-based modeling is a powerful simulation technique to understand the collective behavior and microscopic interaction in complex financial systems. Recently, the concept for determining the key parameters of agent-based models from empirical data instead of setting them artificially was suggested. We first review several agent-based models and the new approaches to determine the key model parameters from historical market data. Based on the agents' behaviors with heterogeneous personal preferences and interactions, these models are successful in explaining the microscopic origination of the temporal and spatial correlations of financial markets. We then present a novel paradigm combining big-data analysis with agent-based modeling. Specifically, from internet query and stock market data, we extract the information driving forces and develop an agent-based model to simulate the dynamic behaviors of complex financial systems.

  12. Confidence and the stock market: an agent-based approach.

    Science.gov (United States)

    Bertella, Mario A; Pires, Felipe R; Feng, Ling; Stanley, Harry Eugene

    2014-01-01

    Using a behavioral finance approach, we study the impact of behavioral bias. We construct an artificial market consisting of fundamentalists and chartists to model the decision-making process of various agents. The agents differ in their strategies for evaluating stock prices, and exhibit differing memory lengths and confidence levels. When we increase the heterogeneity of the strategies used by the agents, in particular the memory lengths, we observe excess volatility and kurtosis, in agreement with real market fluctuations, indicating that agents in real-world financial markets exhibit widely differing memory lengths. We incorporate the behavioral trait of adaptive confidence and observe a positive correlation between average confidence and return rate, indicating that market sentiment is an important driver of price fluctuations. The introduction of market confidence increases price volatility, reflecting the negative effect of irrationality on market behavior.

  13. Plan Validation Using DES and Agent-based Simulation

    National Research Council Canada - National Science Library

    Wong, Teck H; Ong, Kim S

    2008-01-01

    .... This thesis explores the possibility of using a multi-agent system (MAS) to generate the aggressor's air strike plans, which could be coupled with a low resolution Discrete Event Simulation (DES...

  14. An Agent Based Software Approach towards Building Complex Systems

    Directory of Open Access Journals (Sweden)

    Latika Kharb

    2015-08-01

    Full Text Available Agent-oriented techniques represent an exciting new means of analyzing, designing and building complex software systems. They have the potential to significantly improve current practice in software engineering and to extend the range of applications that can feasibly be tackled. Yet, to date, there have been few serious attempts to cast agent systems as a software engineering paradigm. This paper seeks to rectify this omission. Specifically, two points are argued: first, the conceptual apparatus of agent-oriented systems is well suited to building software solutions for complex systems; and second, agent-oriented approaches represent a genuine advance over the current state of the art for engineering complex systems. Following from this view, the major issues raised by adopting an agent-oriented approach to software engineering are highlighted and discussed in this paper.

  15. Enhancement of force patterns classification based on Gaussian distributions.

    Science.gov (United States)

    Ertelt, Thomas; Solomonovs, Ilja; Gronwald, Thomas

    2018-01-23

    Description of the patterns of ground reaction force is a standard method in areas such as medicine, biomechanics and robotics. The fundamental parameter is the time course of the force, which, particularly in clinical diagnostics, is classified visually; the knowledge and experience of the diagnostician are decisive for its assessment. For an objective and valid discrimination of ground reaction force patterns, a generic method for describing the qualities of the time course is absolutely necessary, especially in the medical field. The aim of the presented method was to combine two existing procedures, from the fields of machine learning and Gaussian approximation, in order to take advantage of both for the classification of ground reaction force patterns; the current limitations of each could be eliminated by an overarching method. Twenty-nine male athletes from different sports were examined. Each participant was given the task of performing a one-legged stopping maneuver on a force plate from the maximum possible starting speed. The individual time course of each subject's ground reaction force was registered and approximated by a sum of eight Gaussian distributions. The descriptive coefficients were then classified using Bayesian regularized neural networks, with the different sports serving as the distinguishing feature. Although the athletes were all given the same task, each sport exhibited a distinct quality in the time course of the ground reaction force, while within each sport the athletes were homogeneous. With an overall prediction performance of R = 0.938, all subjects/sports were classified correctly with 94.29% accuracy. The combination of the two methods, the mathematical description of the time course of ground reaction forces on the basis of Gaussian distributions and their classification by means of Bayesian regularized neural networks, seems an adequate and promising method to discriminate the
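The Gaussian-approximation step described above might look like the following sketch. It fits three Gaussian components rather than the paper's eight, to a synthetic force-time curve; all amplitudes, centres, and widths are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

# Approximating a force-time curve by a sum of Gaussian components.

def gaussians(t, *p):
    """Sum of Gaussians; p holds (amplitude, centre, width) triples."""
    out = np.zeros_like(t)
    for a, c, s in zip(p[0::3], p[1::3], p[2::3]):
        out += a * np.exp(-((t - c) ** 2) / (2 * s ** 2))
    return out

# Synthetic "measured" force curve: three known components plus noise.
t = np.linspace(0, 1, 200)
true = gaussians(t, 800, 0.15, 0.05, 400, 0.45, 0.10, 300, 0.75, 0.08)
force = true + np.random.default_rng(0).normal(0, 5, t.size)

# Fit; a rough initial guess per component keeps the optimiser on track.
p0 = [700, 0.2, 0.05, 350, 0.5, 0.1, 250, 0.7, 0.1]
popt, _ = curve_fit(gaussians, t, force, p0=p0)
rmse = np.sqrt(np.mean((gaussians(t, *popt) - force) ** 2))
```

The fitted coefficients in `popt` play the role of the paper's descriptive coefficients: a fixed-length vector per subject that can then be fed to a classifier.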

  16. Deep Galaxy: Classification of Galaxies based on Deep Convolutional Neural Networks

    OpenAIRE

    Khalifa, Nour Eldeen M.; Taha, Mohamed Hamed N.; Hassanien, Aboul Ella; Selim, I. M.

    2017-01-01

    In this paper, a deep convolutional neural network architecture for galaxy classification is presented. A galaxy can be classified based on its features into three main categories: Elliptical, Spiral, and Irregular. The proposed deep galaxy architecture consists of 8 layers: one main convolutional layer for feature extraction with 96 filters, followed by two principal fully connected layers for classification. It is trained on 1356 images and achieves 97.272% testing accuracy. A c...

  17. [Surgical treatment of chronic pancreatitis based on classification of M. Buchler and coworkers].

    Science.gov (United States)

    Krivoruchko, I A; Boĭko, V V; Goncharova, N N; Andreeshchev, S A

    2011-08-01

    The results of surgical treatment of 452 patients suffering from chronic pancreatitis (CHP) were analyzed. The CHP classification elaborated by M. Buchler and coworkers (2009), based on clinical signs, morphological peculiarities and analysis of pancreatic function, contains scientifically substantiated recommendations for the choice of diagnostic methods and complex treatment of the disease. The proposed classification is simple to apply and constitutes an instrument for studying and comparing the severity of the CHP course, the patients' prognosis and treatment.

  18. An agent-based intelligent environmental monitoring system

    OpenAIRE

    Athanasiadis, Ioannis N; Mitkas, Pericles A

    2004-01-01

    Fairly rapid environmental changes call for continuous surveillance and on-line decision making. There are two main areas where IT technologies can be valuable. In this paper we present a multi-agent system for monitoring and assessing air-quality attributes, which uses data coming from a meteorological station. A community of software agents is assigned to monitor and validate measurements coming from several sensors, to assess air-quality, and, finally, to fire alarms to appropriate recipie...

  19. Using the Agent-Based Modeling in Economic Field

    Directory of Open Access Journals (Sweden)

    Nora Mihail

    2006-10-01

    For ABM, a complex system is a system of individual agents who have the freedom to act in ways that are not always totally predictable, and whose actions are interconnected such that one agent's actions change the context (environment) for other agents. There are many examples of such complex systems: the stock market, the human body's immune system, a business organization, an institution, a work team, a family, etc.

  20. Classification of right-hand grasp movement based on EMOTIV Epoc+

    Science.gov (United States)

    Tobing, T. A. M. L.; Prawito, Wijaya, S. K.

    2017-07-01

    Combinations of BCT elements for right-hand grasp movement have been obtained, providing the average values of their classification accuracy. The aim of this study is to find a suitable combination for the best classification accuracy of right-hand grasp movement based on an EEG headset, the EMOTIV Epoc+. There are three movement classes: grasping hand, relax, and opening hand. The classification takes advantage of the Event-Related Desynchronization (ERD) phenomenon, which makes it possible to distinguish the relaxation, imagery, and movement states from one another. The elements combined are the use of Independent Component Analysis (ICA), spectrum analysis by Fast Fourier Transform (FFT), maximum mu and beta power with their frequencies as features, and the Probabilistic Neural Network (PNN) and Radial Basis Function (RBF) classifiers. The average classification accuracy is ± 83% for training and ± 57% for testing. To give a better understanding of the signal quality recorded by the EMOTIV Epoc+, the classification accuracy for left- or right-hand grasping movement EEG signals (provided by Physionet) is also given: ± 85% for training and ± 70% for testing. A comparison of the accuracy values for each combination, experimental condition, and the external EEG data is provided for the purpose of analyzing classification accuracy.
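Extracting the maximum mu and beta power with their frequencies via FFT, as the abstract describes, can be sketched on a synthetic signal. The sampling rate matches the EMOTIV Epoc+, but the tone frequencies, amplitudes, and epoch length are assumptions for illustration:

```python
import numpy as np

# FFT-based mu (8-12 Hz) and beta (13-30 Hz) peak-power features from a
# synthetic EEG epoch.

fs = 128                                    # EMOTIV Epoc+ sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)                 # one 2-second epoch
eeg = (np.sin(2 * np.pi * 10 * t)           # mu-band component at 10 Hz
       + 0.5 * np.sin(2 * np.pi * 20 * t)   # beta-band component at 20 Hz
       + 0.1 * np.random.default_rng(0).normal(size=t.size))

freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
power = np.abs(np.fft.rfft(eeg)) ** 2

def peak(band):
    """Maximum power inside the band, with the frequency where it occurs."""
    lo, hi = band
    idx = (freqs >= lo) & (freqs <= hi)
    k = np.argmax(power[idx])
    return power[idx][k], freqs[idx][k]

mu_power, mu_freq = peak((8, 12))
beta_power, beta_freq = peak((13, 30))
features = [mu_power, mu_freq, beta_power, beta_freq]   # classifier input
```

The four-element `features` vector is the kind of input the abstract's PNN or RBF classifier would receive, one vector per epoch and channel.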

  1. Multi-Frequency Polarimetric SAR Classification Based on Riemannian Manifold and Simultaneous Sparse Representation

    Directory of Open Access Journals (Sweden)

    Fan Yang

    2015-07-01

    Full Text Available Normally, polarimetric SAR classification is a high-dimensional nonlinear mapping problem. In the realm of pattern recognition, sparse representation is a very efficacious and powerful approach. As classical descriptors of polarimetric SAR, covariance and coherency matrices are Hermitian positive semidefinite and form a Riemannian manifold. Conventional Euclidean metrics are not suitable for a Riemannian manifold, and hence normal sparse representation classification cannot be applied to polarimetric SAR directly. This paper proposes a new land cover classification approach for polarimetric SAR. There are two principal novelties in this paper. First, a Stein kernel on the Riemannian manifold, instead of Euclidean metrics, is combined with sparse representation for polarimetric SAR land cover classification; this approach is named Stein sparse representation-based classification (Stein-SRC). Second, using simultaneous sparse representation and reasonable assumptions about the correlation of representations among different frequency bands, Stein-SRC is generalized to simultaneous Stein-SRC for multi-frequency polarimetric SAR classification. These classifiers are assessed using polarimetric SAR images from the Airborne Synthetic Aperture Radar (AIRSAR) sensor of the Jet Propulsion Laboratory (JPL) and the Electromagnetics Institute Synthetic Aperture Radar (EMISAR) sensor of the Technical University of Denmark (DTU). Experiments on both single-band and multi-band data show that these approaches achieve more accurate classification results than many conventional and advanced classifiers.

  2. The development of a classification schema for arts-based approaches to knowledge translation.

    Science.gov (United States)

    Archibald, Mandy M; Caine, Vera; Scott, Shannon D

    2014-10-01

    Arts-based approaches to knowledge translation are emerging as powerful interprofessional strategies with potential to facilitate evidence uptake, communication, knowledge, attitude, and behavior change across healthcare provider and consumer groups. These strategies are in the early stages of development. To date, no classification system for arts-based knowledge translation exists, which limits development and understandings of effectiveness in evidence syntheses. We developed a classification schema of arts-based knowledge translation strategies based on two mechanisms by which these approaches function: (a) the degree of precision in key message delivery, and (b) the degree of end-user participation. We demonstrate how this classification is necessary to explore how context, time, and location shape arts-based knowledge translation strategies. Classifying arts-based knowledge translation strategies according to their core attributes extends understandings of the appropriateness of these approaches for various healthcare settings and provider groups. The classification schema developed may enhance understanding of how, where, and for whom arts-based knowledge translation approaches are effective, and enable theorizing of essential knowledge translation constructs, such as the influence of context, time, and location on utilization strategies. The classification schema developed may encourage systematic inquiry into the effectiveness of these approaches in diverse interprofessional contexts. © 2014 Sigma Theta Tau International.

  3. Non-target adjacent stimuli classification improves performance of classical ERP-based brain computer interface

    Science.gov (United States)

    Ceballos, G. A.; Hernández, L. F.

    2015-04-01

    Objective. The classical ERP-based speller, or P300 Speller, is one of the most commonly used paradigms in the field of Brain Computer Interfaces (BCI). Several alterations to the visual stimulus presentation system have been developed to avoid unfavorable effects elicited by adjacent stimuli. However, little if any attention has been paid to the useful information about the spatial location of target symbols contained in responses to adjacent stimuli. This paper aims to demonstrate that combining the classification of non-target adjacent stimuli with standard classification (target versus non-target) significantly improves classical ERP-based speller efficiency. Approach. Four SWLDA classifiers were trained and combined with the standard classifier: the lower-row, upper-row, right-column and left-column classifiers. This new feature extraction procedure and the classification method were carried out on three open databases: the UAM P300 database (Universidad Autonoma Metropolitana, Mexico), BCI competition II (dataset IIb) and BCI competition III (dataset II). Main results. The inclusion of the classification of non-target adjacent stimuli improves target classification in the classical row/column paradigm. A gain in mean single-trial classification of 9.6% and an overall improvement of 25% in simulated spelling speed were achieved. Significance. We have provided further evidence that the ERPs produced by adjacent stimuli present discriminable features, which could provide additional information about the spatial location of intended symbols. This work promotes the search for information in peripheral stimulation responses to improve the performance of emerging visual ERP-based spellers.
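The combination step can be pictured as a weighted score fusion across classifiers. The sketch below uses invented symbols, scores, and weight; SWLDA training and the remapping of adjacent-stimulus responses to candidate symbols are assumed, not reproduced:

```python
# Hypothetical per-symbol scores; higher means more target-like.
standard = {"A": 0.2, "B": 0.9, "C": 0.4}
# Hypothetical evidence from the adjacent-stimulus classifiers, already
# remapped to the symbols they implicate (e.g. an "upper row" response
# implicates the symbol in the row below the flashed one).
adjacent = {"A": 0.1, "B": 0.5, "C": 0.6}

def fuse(standard, adjacent, w=0.5):
    # Weighted sum of the standard target/non-target score and the
    # remapped adjacent-stimulus score for each candidate symbol.
    return {s: standard[s] + w * adjacent[s] for s in standard}

def decide(scores):
    # Spell the symbol with the highest fused score.
    return max(scores, key=scores.get)

fused = fuse(standard, adjacent)
print(decide(fused))  # "B" under these example scores
```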

  4. AN ADABOOST OPTIMIZED CCFIS BASED CLASSIFICATION MODEL FOR BREAST CANCER DETECTION

    Directory of Open Access Journals (Sweden)

    CHANDRASEKAR RAVI

    2017-06-01

    Full Text Available Classification is a data mining technique used for building a prototype of the data behaviour, using which unseen data can be classified into one of the defined classes. Several researchers have proposed classification techniques, but most of them place little emphasis on misclassified instances and storage space. In this paper, a classification model is proposed that takes both into account. The classification model is developed efficiently using a tree structure to reduce the storage complexity, and uses a single scan of the dataset. During the training phase, Class-based Closed Frequent ItemSets (CCFIS) were mined from the training dataset in the form of a tree structure. The classification model has been developed using the CCFIS and a similarity measure based on the Longest Common Subsequence (LCS). Further, the Particle Swarm Optimization algorithm is applied to the generated CCFIS, which assigns weights to the itemsets and their associated classes. Most classifiers correctly classify the common instances but misclassify the rare ones. In view of that, the AdaBoost algorithm has been used to boost the weights of the instances misclassified in the previous round so as to include them in the training phase and classify the rare instances. This improves the accuracy of the classification model. During the testing phase, the classification model is used to classify the instances of the test dataset. The Breast Cancer dataset from the UCI repository is used for the experiment. Experimental analysis shows that the accuracy of the proposed classification model outperforms the PSOAdaBoost-Sequence classifier by 7% and is superior to other approaches such as the Naïve Bayes classifier, support vector machine classifier, instance-based classifier, ID3 classifier, J48 classifier, etc.
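The AdaBoost reweighting that shifts emphasis onto misclassified (rare) instances is standard and can be sketched in a few lines; the instance weights and the single misclassification below are illustrative, not data from the Breast Cancer experiment:

```python
import math

def adaboost_reweight(weights, correct):
    # One AdaBoost round: raise the weights of misclassified instances,
    # lower those of correctly classified ones, then renormalize.
    err = sum(w for w, c in zip(weights, correct) if not c)
    err = max(min(err, 1 - 1e-10), 1e-10)       # guard against 0/1 error
    alpha = 0.5 * math.log((1 - err) / err)      # classifier vote weight
    new = [w * math.exp(alpha if not c else -alpha)
           for w, c in zip(weights, correct)]
    total = sum(new)
    return [w / total for w in new], alpha

weights = [0.25, 0.25, 0.25, 0.25]
correct = [True, True, True, False]   # one misclassified (rare) instance
weights, alpha = adaboost_reweight(weights, correct)
print([round(w, 3) for w in weights])  # [0.167, 0.167, 0.167, 0.5]
```

After one round the rare instance carries half the total weight, so the next weak learner is pushed to classify it correctly.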

  5. Agent based Particle Swarm Optimization for Load Frequency Control of Distribution Grid

    DEFF Research Database (Denmark)

    Cha, Seung-Tae; Saleem, Arshad; Wu, Qiuwei

    2012-01-01

    This paper presents a Particle Swarm Optimization (PSO) based multi-agent controller. A real-time digital simulator (RTDS) is used for modelling the power system, while a PSO-based multi-agent LFC algorithm is developed in JAVA for communicating with resource agents and determines the scenario...... to stabilize the frequency and voltage after the system enters the islanding operation mode. The proposed algorithm is based on the formulation of an optimization problem using agent-based PSO. The modified IEEE 9-bus system is employed to illustrate the performance of the proposed controller via RTDS...
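A minimal, generic PSO loop is sketched below for a one-dimensional objective; the particle count, inertia and acceleration coefficients are common textbook defaults, not the parameters of the LFC controller described above:

```python
import random

def pso(f, n_particles=10, iters=50, lo=-5.0, hi=5.0, w=0.7, c1=1.5, c2=1.5):
    # Minimal single-dimension PSO: each particle tracks its personal
    # best position; the swarm tracks the global best.
    rng = random.Random(0)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]
    gbest = min(pos, key=f)
    for _ in range(iters):
        for i in range(n_particles):
            vel[i] = (w * vel[i]
                      + c1 * rng.random() * (pbest[i] - pos[i])
                      + c2 * rng.random() * (gbest - pos[i]))
            pos[i] += vel[i]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i]
            if f(pos[i]) < f(gbest):
                gbest = pos[i]
    return gbest

# Toy objective standing in for a control-cost function.
best = pso(lambda x: (x - 1.0) ** 2)
print(round(best, 3))  # close to the optimum x = 1
```

In the multi-agent setting, each resource agent would evaluate the objective for its own subsystem and report fitness back to the coordinating PSO agent.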

  6. The selection of adhesive systems for resin-based luting agents.

    Science.gov (United States)

    Carville, Rebecca; Quinn, Frank

    2008-01-01

    The use of resin-based luting agents is ever expanding with the development of adhesive dentistry. A multitude of different adhesive systems are used with resin-based luting agents, and new products are introduced to the market frequently. Traditional adhesives generally required a multiple-step bonding procedure prior to cementing with active resin-based luting materials; however, combined agents offer a simpler application procedure. Self-etching 'all-in-one' systems claim that there is no need for a separate adhesive process. The following review addresses the advantages and disadvantages of the available adhesive systems used with resin-based luting agents.

  7. An approach for classification of hydrogeological systems at the regional scale based on groundwater hydrographs

    Science.gov (United States)

    Haaf, Ezra; Barthel, Roland

    2016-04-01

    When assessing hydrogeological conditions at the regional scale, the analyst is often confronted with uncertainty of structures, inputs and processes while having to base inference on scarce and patchy data. Haaf and Barthel (2015) proposed a concept for handling this predicament by developing a groundwater systems classification framework, where information is transferred from similar but well-explored and better-understood systems to poorly described ones. The concept is based on the central hypothesis that similar systems react similarly to the same inputs, and vice versa. It is conceptually related to PUB (Prediction in Ungauged Basins), where organization of systems and processes by quantitative methods is used to improve understanding and prediction. Furthermore, using the framework, it is expected that regional conceptual and numerical models can be checked or enriched by ensemble-generated data from neighborhood-based estimators. In a first step, groundwater hydrographs from a large dataset in Southern Germany are compared in an effort to identify structural similarity in groundwater dynamics. A number of approaches to grouping hydrographs, mostly based on a similarity measure, can be found in the literature, although they have previously only been used in local-scale studies. These are tested alongside different global feature extraction techniques. The resulting classifications are then compared to a visual "expert assessment"-based classification which serves as a reference. A ranking of the classification methods is carried out and differences are shown. Selected groups from the classifications are related to geological descriptors. Here we present the most promising results from a comparison of classifications based on series correlation, different series distances and series features, such as the coefficients of the discrete Fourier transform and the intrinsic mode functions of empirical mode decomposition. Additionally, we show examples of classes
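One of the similarity measures mentioned, series correlation, can be sketched directly; the hydrograph values and system labels below are invented for illustration, not data from the Southern Germany dataset:

```python
def pearson(x, y):
    # Pearson correlation as a similarity measure between two hydrographs.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def most_similar(target, reference_systems):
    # Transfer information from the best analogue: pick the well-explored
    # reference hydrograph most correlated with the poorly described one.
    return max(reference_systems,
               key=lambda k: pearson(target, reference_systems[k]))

refs = {
    "karst":  [1.0, 3.0, 1.5, 0.8, 0.5],    # flashy response (hypothetical)
    "porous": [1.0, 1.2, 1.3, 1.25, 1.2],   # damped response (hypothetical)
}
target = [2.0, 5.8, 3.1, 1.7, 1.1]
print(most_similar(target, refs))  # "karst": similar flashy dynamics
```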

  8. Optimization of Cholinesterase-Based Catalytic Bioscavengers Against Organophosphorus Agents.

    Science.gov (United States)

    Lushchekina, Sofya V; Schopfer, Lawrence M; Grigorenko, Bella L; Nemukhin, Alexander V; Varfolomeev, Sergei D; Lockridge, Oksana; Masson, Patrick

    2018-01-01

    Organophosphorus agents (OPs) are irreversible inhibitors of acetylcholinesterase (AChE). OP poisoning causes major cholinergic syndrome. Current medical counter-measures mitigate the acute effects but have limited action against OP-induced brain damage. Bioscavengers are an appealing alternative therapeutic approach because they neutralize OPs in the bloodstream before they reach physiological targets. First-generation bioscavengers are stoichiometric bioscavengers. However, stoichiometric neutralization requires administration of huge doses of enzyme. Second-generation bioscavengers are catalytic bioscavengers capable of detoxifying OPs with a turnover. High bimolecular rate constants (kcat/Km > 10⁶ M⁻¹ min⁻¹) are required so that low enzyme doses can be administered. Cholinesterases (ChEs) are attractive candidates because OPs are hemi-substrates. Moderate OP hydrolase (OPase) activity has been observed for certain natural ChEs and for G117H-based human BChE mutants made by site-directed mutagenesis. However, before mutated ChEs can become operational catalytic bioscavengers, their dephosphylation rate constant must be increased by several orders of magnitude. New strategies for converting ChEs into fast OPases are based either on combinational approaches or on computer redesign of the enzyme. The keystone for rational conversion of ChEs into OPases is to understand the reaction mechanisms with OPs. In the present work we propose that efficient OP hydrolysis can be achieved by re-designing the configuration of enzyme active center residues and by creating specific routes for attack of water molecules and proton transfer. Four directions for nucleophilic attack of water on the phosphorus atom were defined. Changes must lead to a novel enzyme wherein OP hydrolysis wins over competing aging reactions. Kinetic, crystallographic, and computational data have been accumulated that describe mechanisms of reactions involving ChEs. From these studies, it appears that introducing

  9. Optimization of Cholinesterase-Based Catalytic Bioscavengers Against Organophosphorus Agents

    Directory of Open Access Journals (Sweden)

    Sofya V. Lushchekina

    2018-03-01

    Full Text Available Organophosphorus agents (OPs) are irreversible inhibitors of acetylcholinesterase (AChE). OP poisoning causes major cholinergic syndrome. Current medical counter-measures mitigate the acute effects but have limited action against OP-induced brain damage. Bioscavengers are an appealing alternative therapeutic approach because they neutralize OPs in the bloodstream before they reach physiological targets. First-generation bioscavengers are stoichiometric bioscavengers. However, stoichiometric neutralization requires administration of huge doses of enzyme. Second-generation bioscavengers are catalytic bioscavengers capable of detoxifying OPs with a turnover. High bimolecular rate constants (kcat/Km > 10⁶ M⁻¹ min⁻¹) are required so that low enzyme doses can be administered. Cholinesterases (ChEs) are attractive candidates because OPs are hemi-substrates. Moderate OP hydrolase (OPase) activity has been observed for certain natural ChEs and for G117H-based human BChE mutants made by site-directed mutagenesis. However, before mutated ChEs can become operational catalytic bioscavengers, their dephosphylation rate constant must be increased by several orders of magnitude. New strategies for converting ChEs into fast OPases are based either on combinational approaches or on computer redesign of the enzyme. The keystone for rational conversion of ChEs into OPases is to understand the reaction mechanisms with OPs. In the present work we propose that efficient OP hydrolysis can be achieved by re-designing the configuration of enzyme active center residues and by creating specific routes for attack of water molecules and proton transfer. Four directions for nucleophilic attack of water on the phosphorus atom were defined. Changes must lead to a novel enzyme wherein OP hydrolysis wins over competing aging reactions. Kinetic, crystallographic, and computational data have been accumulated that describe mechanisms of reactions involving ChEs. From these studies, it appears that

  10. Polyaniline emeraldine base nanofibers as a radiostabilizing agent for PMMA

    International Nuclear Information System (INIS)

    Araujo, Patricia L.B.; Ferreira, Carlas C.; Araujo, Elmo S.

    2007-01-01

    Polyaniline (PANI) presents antioxidant and radical-scavenging properties. Substances having these characteristics are good candidates for radioprotecting agents. Some studies have also shown results pointing to the biocompatibility and biodegradability of PANI. These characteristics are desirable for substances in contact with biological tissues and have important implications for the inclusion of PANI in physical mixtures with conventional radiosterilizable polymers. In this work, nanofibers of polyaniline emeraldine doped with (±)-camphor-10-sulfonic acid (PANI-(±)-CSA) were prepared by a self-assembly method. Polyaniline emeraldine base (PANI-EB) nanofibers were obtained after dedoping with NH4OH and used as additives in films of commercial poly(methyl methacrylate) (PMMA). In order to assess possible radiostabilizing effects of PANI-EB and its aniline monomer (An) on the PMMA matrix, films containing 0.075 and 0.15% (wt/wt) of these substances were submitted to gamma irradiation at doses from 25 to 75 kGy. Variation of the viscosity-average molar mass (Mv) of the PMMA matrix at a 25 kGy dose showed that samples containing An and PANI-EB nanofibers in amounts of 0.15% (wt/wt) underwent less degradation than the control sample. When nanofibers were used as additives, no measurable variation of Mv could be detected in PMMA samples at this dose. At 75 kGy, all composites containing PANI-EB nanofibers underwent less degradation than control samples, suggesting that these additives are able to retain their action at doses higher than the standard sterilization dose. This evidence shows that PANI-EB nanofibers could be useful additives in commercial PMMA used in medical applications. FTIR spectroscopic characterization and scanning electron microscopy (SEM) of PANI samples were also performed. (author)

  11. Optimizing agent-based transmission models for infectious diseases.

    Science.gov (United States)

    Willem, Lander; Stijven, Sean; Tijskens, Engelbert; Beutels, Philippe; Hens, Niel; Broeckhove, Jan

    2015-06-02

    Infectious disease modeling and computational power have evolved such that large-scale agent-based models (ABMs) have become feasible. However, the increasing hardware complexity requires adapted software designs to achieve the full potential of current high-performance workstations. We have found large performance differences with a discrete-time ABM for close-contact disease transmission due to data locality. Sorting the population according to the social contact clusters reduced simulation time by a factor of two. Data locality and model performance can also be improved by storing person attributes separately instead of using person objects. Next, decreasing the number of operations by sorting people by health status before processing disease transmission also has a large impact on model performance. Depending on the clinical attack rate, target population and computer hardware, the introduction of the sort phase decreased the run time by 26% up to more than 70%. We have investigated the application of parallel programming techniques and found that the speedup is significant but drops quickly with the number of cores. We observed that the effect of scheduling and workload chunk size is model specific and can make a large difference. Investment in performance optimization of ABM simulator code can lead to significant run-time reductions. The key steps are straightforward: the data structure for the population, and sorting people on health status before effecting disease propagation. We believe these conclusions to be valid for a wide range of infectious disease ABMs. We recommend that future studies evaluate the impact of data management, algorithmic procedures and parallelization on model performance.
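The key optimization, sorting people by health status before effecting disease propagation, can be sketched as a toy discrete-time step; the population size, transmission probability beta, and contact rule are invented, not the authors' simulator:

```python
import random

def step_transmission(population, beta, rng):
    # Sort by health status so infectious ('I') and susceptible ('S')
    # individuals are contiguous and the loop touches only the relevant
    # slice -- the data-locality optimization described above.
    population.sort(key=lambda p: p["state"])   # 'I' sorts before 'S'
    infectious = [p for p in population if p["state"] == "I"]
    susceptible = [p for p in population if p["state"] == "S"]
    for s in susceptible:
        # Toy contact rule: each susceptible has one chance of infection
        # per step as long as anyone is infectious.
        if infectious and rng.random() < beta:
            s["state"] = "I"

rng = random.Random(42)
population = [{"id": i, "state": "I" if i < 5 else "S"} for i in range(100)]
step_transmission(population, beta=0.3, rng=rng)
print(sum(p["state"] == "I" for p in population))  # more than the initial 5
```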

  12. A fingerprint classification algorithm based on combination of local and global information

    Science.gov (United States)

    Liu, Chongjin; Fu, Xiang; Bian, Junjie; Feng, Jufu

    2011-12-01

    Fingerprint recognition is one of the most important technologies in biometric identification and has been widely applied in commercial and forensic areas. Fingerprint classification, as the fundamental procedure in fingerprint recognition, can sharply decrease the number of candidates for fingerprint matching and improve the efficiency of fingerprint recognition. Most fingerprint classification algorithms are based on the number and position of singular points. Because singular-point detection commonly considers only local information, such classification algorithms are sensitive to noise. In this paper, we propose a novel fingerprint classification algorithm combining the local and global information of a fingerprint. First, we use local information to detect singular points and measure their quality, considering orientation structure and image texture in adjacent areas. Furthermore, a global orientation model is adopted to measure the reliability of the singular-point group. Finally, the local quality and global reliability are weighted to classify the fingerprint. Experiments demonstrate the accuracy and effectiveness of our algorithm, especially for poor-quality fingerprint images.
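The weighting of local quality against global reliability might be sketched as below; the weight, threshold, attribute names and singular-point class rules are hypothetical simplifications, not the algorithm of the paper:

```python
def classify_fingerprint(singular_points, global_reliability, w=0.6):
    # Combine the mean local quality of detected singular points with the
    # global orientation-model reliability of the whole group.
    local = sum(sp["quality"] for sp in singular_points) / len(singular_points)
    score = w * local + (1 - w) * global_reliability
    if score < 0.5:
        return "rejected"          # too unreliable to classify
    # Classic coarse classes by singular-point counts (simplified).
    n_cores = sum(sp["type"] == "core" for sp in singular_points)
    n_deltas = sum(sp["type"] == "delta" for sp in singular_points)
    if n_cores == 0 and n_deltas == 0:
        return "arch"
    if n_cores == 1 and n_deltas == 1:
        return "loop"
    if n_cores == 2 or n_deltas == 2:
        return "whorl"
    return "other"

sps = [{"type": "core", "quality": 0.9}, {"type": "delta", "quality": 0.7}]
print(classify_fingerprint(sps, global_reliability=0.8))  # "loop"
print(classify_fingerprint(sps, global_reliability=0.0))  # "rejected"
```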

  13. Hyperspectral Image Classification Based on the Combination of Spatial-spectral Feature and Sparse Representation

    Directory of Open Access Journals (Sweden)

    YANG Zhaoxia

    2015-07-01

    Full Text Available In order to avoid the problem of being over-dependent on high-dimensional spectral features in traditional hyperspectral image classification, a novel approach based on the combination of spatial-spectral features and sparse representation is proposed in this paper. Firstly, we extract the spatial-spectral feature by reorganizing the local image patch with the first d principal components (PCs) into a vector representation, followed by a sorting scheme to make the vector invariant to local image rotation. Secondly, we learn the dictionary through a supervised method, and use it to code the features from test samples afterwards. Finally, we embed the resulting sparse feature coding into a support vector machine (SVM) for hyperspectral image classification. Experiments using three hyperspectral data sets show that the proposed method can effectively improve the classification accuracy compared with traditional classification methods.
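The sorting scheme that makes the patch vector invariant to local rotation can be shown directly on a single principal-component band; the patch values are illustrative:

```python
def spatial_spectral_feature(patch):
    # Flatten a local patch of principal-component values into a vector,
    # then sort it: rotation permutes pixel positions but not the
    # multiset of values, so the sorted vector is rotation-invariant.
    flat = [v for row in patch for v in row]
    return sorted(flat)

patch = [[0.2, 0.5, 0.1],
         [0.9, 0.4, 0.3],
         [0.6, 0.8, 0.7]]

# Rotating the patch by 90 degrees leaves the sorted feature unchanged.
rotated = [list(row) for row in zip(*patch[::-1])]
print(spatial_spectral_feature(patch) == spatial_spectral_feature(rotated))  # True
```

In the full method, one such vector per principal component is concatenated, sparse-coded against the learned dictionary, and the codes are fed to the SVM.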

  14. A Discrete Wavelet Based Feature Extraction and Hybrid Classification Technique for Microarray Data Analysis

    Directory of Open Access Journals (Sweden)

    Jaison Bennet

    2014-01-01

    Full Text Available In earlier days, cancer classification by doctors and radiologists was based on morphological and clinical features and had limited diagnostic ability. The recent arrival of DNA microarray technology has enabled the concurrent monitoring of thousands of gene expressions on a single chip, which stimulates progress in cancer classification. In this paper, we propose a hybrid approach for microarray data classification based on k-nearest neighbor (KNN), naive Bayes, and support vector machine (SVM) classifiers. Feature selection prior to classification plays a vital role, and a feature selection technique which combines the discrete wavelet transform (DWT) and a moving window technique (MWT) is used. The performance of the proposed method is compared with that of conventional classifiers such as the support vector machine, nearest neighbor, and naive Bayes. Experiments have been conducted on both real and benchmark datasets, and the results indicate that the ensemble approach produces higher classification accuracy than the conventional classifiers. This work serves as an automated system for the classification of cancer that can be applied by doctors in real cases, a boon to the medical community, and it further reduces the misclassification of cancers, which is unacceptable in cancer detection.
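Two building blocks of the approach, a one-level Haar DWT for feature extraction and a majority vote for the hybrid ensemble, can be sketched with toy data; the paper's moving-window step and the actual classifier training are omitted:

```python
def haar_dwt(signal):
    # One level of the discrete Haar wavelet transform: pairwise
    # averages (approximation) and differences (detail).
    approx = [(signal[i] + signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    return approx, detail

def majority_vote(predictions):
    # Hybrid ensemble step: combine the KNN, naive Bayes and SVM votes.
    return max(set(predictions), key=predictions.count)

expression = [4.0, 2.0, 6.0, 8.0]            # toy gene-expression profile
approx, detail = haar_dwt(expression)
print(approx, detail)                         # [3.0, 7.0] [1.0, -1.0]
print(majority_vote(["tumor", "normal", "tumor"]))  # tumor
```

Keeping the approximation coefficients (and discarding small details) is what shrinks thousands of expression values to a compact feature vector.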

  15. Rule-based land cover classification from very high-resolution satellite image with multiresolution segmentation

    Science.gov (United States)

    Haque, Md. Enamul; Al-Ramadan, Baqer; Johnson, Brian A.

    2016-07-01

    Multiresolution segmentation and rule-based classification techniques are used to classify objects from very high-resolution satellite images of urban areas. Custom rules are developed using different spectral, geometric, and textural features with five scale parameters, which yield varying classification accuracy. Principal component analysis is used to select the most important features out of a total of 207 different features. In particular, seven different object types are considered for classification. The overall classification accuracy achieved for the rule-based method is 95.55% and 98.95% for seven and five classes, respectively. Other classifiers that do not use rules perform at 84.17% and 97.3% accuracy for seven and five classes, respectively. The results reflect coarser segmentation at higher scale parameters and finer segmentation at lower scale parameters. The major contribution of this research is the development of rule sets and the identification of major features for satellite image classification, where the rule sets are transferable and the parameters are tunable for different types of imagery. Additionally, the individual object-wise classification and principal component analysis help to identify the required object from an arbitrary number of objects within images, given ground truth data for training.
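Rule-based classification of segmented objects reduces to ordered threshold tests on object features. The sketch below uses invented feature names and thresholds, not the paper's rule sets:

```python
def classify_object(obj):
    # Toy rule set over spectral/geometric object features; thresholds
    # and classes are illustrative, not those derived in the paper.
    if obj["ndvi"] > 0.4:
        return "vegetation"
    if obj["ndwi"] > 0.3:
        return "water"
    if obj["brightness"] > 200 and obj["rectangularity"] > 0.8:
        return "building"
    if obj["elongation"] > 3.0:
        return "road"
    return "bare_soil"

objects = [
    {"ndvi": 0.6, "ndwi": 0.0, "brightness": 90,  "rectangularity": 0.3, "elongation": 1.2},
    {"ndvi": 0.1, "ndwi": 0.0, "brightness": 230, "rectangularity": 0.9, "elongation": 1.5},
    {"ndvi": 0.1, "ndwi": 0.0, "brightness": 120, "rectangularity": 0.2, "elongation": 5.0},
]
print([classify_object(o) for o in objects])  # vegetation, building, road
```

Because the rules test named, physically meaningful features, the same rule set can be transferred to other imagery by retuning only the thresholds.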

  16. Classification of Two Comic Books based on Convolutional Neural Networks

    Directory of Open Access Journals (Sweden)

    Miki UENO

    2017-03-01

    Full Text Available Non-photographic images are powerful representations of various situations. Thus, understanding intellectual products such as comics and picture books is one of the important topics in the field of artificial intelligence. Hence, stepwise analysis of a comic story was pursued: features of parts of an image, information features, features relating to continuous scenes, etc. In particular, the length and scenes of four-scene comics are limited so as to ensure a clear interpretation of the contents. In this study, as a first step in this direction, the problem of classifying two four-scene comics by the same artists was taken as an example. Several classifiers were constructed utilizing a Convolutional Neural Network (CNN), and the results of classification by a human annotator and by a computational method were compared. From these experiments, we have clearly shown that a CNN is an efficient way to classify non-photographic grayscale images, and we identified characteristic features of incorrectly classified images.

  17. MR imaging-based diagnosis and classification of meniscal tears.

    Science.gov (United States)

    Nguyen, Jie C; De Smet, Arthur A; Graf, Ben K; Rosas, Humberto G

    2014-01-01

    Magnetic resonance (MR) imaging is currently the modality of choice for detecting meniscal injuries and planning subsequent treatment. A thorough understanding of the imaging protocols, normal meniscal anatomy, surrounding anatomic structures, and anatomic variants and pitfalls is critical to ensure diagnostic accuracy and prevent unnecessary surgery. High-spatial-resolution imaging of the meniscus can be performed using fast spin-echo and three-dimensional MR imaging sequences. Normal anatomic structures that can mimic a tear include the meniscal ligament, meniscofemoral ligaments, popliteomeniscal fascicles, and meniscomeniscal ligament. Anatomic variants and pitfalls that can mimic a tear include discoid meniscus, meniscal flounce, a meniscal ossicle, and chondrocalcinosis. When a meniscal tear is identified, accurate description and classification of the tear pattern can guide the referring clinician in patient education and surgical planning. For example, longitudinal tears are often amenable to repair, whereas horizontal and radial tears may require partial meniscectomy. Tear patterns include horizontal, longitudinal, radial, root, complex, displaced, and bucket-handle tears. Occasionally, meniscal tears can be difficult to detect at imaging; however, secondary indirect signs, such as a parameniscal cyst, meniscal extrusion, or linear subchondral bone marrow edema, should increase the radiologist's suspicion for an underlying tear. Awareness of common diagnostic errors can ensure accurate diagnosis of meniscal tears. Online supplemental material is available for this article. ©RSNA, 2014.

  18. Object-Based Classification as an Alternative Approach to the Traditional Pixel-Based Classification to Identify Potential Habitat of the Grasshopper Sparrow

    Science.gov (United States)

    Jobin, Benoît; Labrecque, Sandra; Grenier, Marcelle; Falardeau, Gilles

    2008-01-01

    The traditional method of identifying wildlife habitat distribution over large regions consists of pixel-based classification of satellite images into a suite of habitat classes used to select suitable habitat patches. Object-based classification is a newer method that can achieve the same objective based on the segmentation of the spectral bands of the image, creating polygons that are homogeneous with regard to spatial or spectral characteristics. The segmentation algorithm does not rely solely on the single pixel value, but also on shape, texture, and pixel spatial continuity. Object-based classification is a knowledge-based process in which an interpretation key is developed using ground control points, and objects are assigned to specific classes according to threshold values of determined spectral and/or spatial attributes. We developed a model using the eCognition software to identify suitable habitats for the Grasshopper Sparrow, a rare and declining species found in southwestern Québec. The model was developed in a region with known breeding sites and applied to other images covering adjacent regions where potential breeding habitats may be present. We were successful in locating potential habitats in areas where dairy farming prevailed but failed in an adjacent region covered by a distinct Landsat scene and dominated by annual crops. We discuss the added value of this method, such as the possibility of using the contextual information associated with objects and the ability to eliminate unsuitable areas in the segmentation and land cover classification processes, as well as technical and logistical constraints. A series of recommendations on the use of this method and on conservation issues of Grasshopper Sparrow habitat is also provided.

  19. Waste-acceptance criteria and risk-based thinking for radioactive-waste classification

    International Nuclear Information System (INIS)

    Lowenthal, M.D.

    1998-01-01

    The US system of radioactive-waste classification and its development provide a reference point for the discussion of risk-based thinking in waste classification. The official US system is described, and waste-acceptance criteria for disposal sites are introduced because they constitute a form of de facto waste classification. Risk-based classification is explored, and it is found that a truly risk-based system is context-dependent: risk depends not only on the waste-management activity but, for some activities such as disposal, on the specific physical context. Some of the elements of the official US system incorporate risk-based thinking but, like many proposed alternative schemes, ignore the physical context of disposal. The waste-acceptance criteria for disposal sites do account for this context dependence and could be used as a risk-based classification scheme for disposal. While different classes would be necessary for different management activities, the waste-acceptance criteria would obviate the need for the current system and could better match wastes to disposal environments, saving money, improving safety, or both.

  20. Color Independent Components Based SIFT Descriptors for Object/Scene Classification

    Science.gov (United States)

    Ai, Dan-Ni; Han, Xian-Hua; Ruan, Xiang; Chen, Yen-Wei

    In this paper, we present a novel color independent components based SIFT descriptor (termed CIC-SIFT) for object/scene classification. We first learn an efficient color transformation matrix based on independent component analysis (ICA), which is adaptive to each category in a database. The ICA-based color transformation can enhance contrast between the objects and the background in an image. Then we compute CIC-SIFT descriptors over all three transformed color independent components. Since the ICA-based color transformation can boost the objects and suppress the background, the proposed CIC-SIFT can extract more effective and discriminative local features for object/scene classification. The comparison is performed among seven SIFT descriptors, and the experimental classification results show that our proposed CIC-SIFT is superior to other conventional SIFT descriptors.
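Applying a learned 3×3 color transformation to an RGB pixel is a single matrix-vector product. The matrix below is an illustrative placeholder, not an actual ICA result learned from a category:

```python
def transform_pixel(matrix, rgb):
    # Map one RGB pixel through a 3x3 color transformation, producing
    # three "color independent component" channels over which the
    # SIFT descriptors would then be computed.
    return [sum(matrix[i][j] * rgb[j] for j in range(3)) for i in range(3)]

# Hypothetical category-adaptive transformation matrix (stand-in for
# the ICA-learned matrix of the paper).
W = [[0.5,  0.5,  0.0],
     [0.5, -0.5,  0.0],
     [0.0,  0.5, -0.5]]

pixel = [0.8, 0.4, 0.2]
print(transform_pixel(W, pixel))  # three transformed channel values
```

In the full pipeline this transform is applied to every pixel of the image, and a SIFT descriptor is extracted from each of the three resulting channels.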