WorldWideScience

Sample records for regime classification approach

  1. A Global Classification of Contemporary Fire Regimes

    Science.gov (United States)

    Norman, S. P.; Kumar, J.; Hargrove, W. W.; Hoffman, F. M.

    2014-12-01

    Fire regimes provide a sensitive indicator of changes in climate and human use, as the concept encompasses fire extent, season, frequency, and intensity. Fires that occur outside the distribution of one or more aspects of a fire regime may affect ecosystem resilience. However, global-scale data related to these varied aspects of fire regimes are highly inconsistent due to incomplete or inconsistent reporting. In this study, we derive a globally applicable approach to characterizing similar fire regimes using long geophysical time series, namely MODIS hotspots since 2000. K-means non-hierarchical clustering was used to generate empirically based groups that minimize within-cluster variability. Satellite-based fire detections are known to have shortcomings, including under-detection caused by obscuring smoke, clouds, dense canopy cover, or rapid spread rates, as often occurs with flashy fuels or during extreme weather. Nevertheless, the resulting regions are free from preconceptions, and the empirical, data-mining approach applied to this relatively uniform data source allows the region structures to emerge from the data themselves. Comparing such an empirical classification to expectations from climate, phenology, land-use or development-based models can help us interpret the similarities and differences among places and how they provide different indicators of changes of concern. Classifications can help identify ahead of time where large, infrequent mega-fires are likely to occur, such as in the boreal forest and portions of the interior US West, and where fire reports are incomplete, such as in less industrialized countries.
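The clustering recipe described above can be sketched in a few lines. This is a hypothetical illustration, not the study's pipeline: the feature names and values are invented stand-ins for per-cell MODIS-derived fire-regime descriptors.

```python
# Hypothetical sketch: k-means clustering of per-cell fire-regime features.
# Feature columns are illustrative, not the study's actual MODIS variables.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Each row is one map cell; columns are illustrative descriptors:
# annual hotspot count, mean fire month, inter-annual variability.
features = rng.random((500, 3))

# Standardize so no single descriptor dominates the Euclidean distance.
X = StandardScaler().fit_transform(features)

# Group cells into k empirically similar fire regimes,
# minimizing within-cluster variability.
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)
labels = kmeans.labels_  # regime id per map cell
print(labels.shape)
```

The choice of k and of the descriptors would drive the resulting regime map; the paper's empirical approach lets the region structure emerge from the data rather than from predefined class signatures.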

  2. A Time Series Regime Classification Approach for Short-Term Forecasting; Identificacion de Mecanismos en Series Temporales para la Prediccion a Corto Plazo

    Energy Technology Data Exchange (ETDEWEB)

    Gallego, C. J.

    2010-03-08

    Abstract: This technical report focuses on the analysis of stochastic processes that switch between different dynamics (also called regimes or mechanisms) over time. So-called switching-regime models consider several underlying functions instead of one. In this case, a classification problem arises, as the current regime has to be assessed at each time step. Identifying the regimes enables the use of regime-switching models for short-term forecasting purposes. Within this framework, the aim of this work is to identify the different regimes shown by time series. The proposed approach is based on a statistical tool called the Gamma test. One of the main advantages of this methodology is that no mathematical definition of the different underlying functions is required. Applications with both simulated and real wind power data have been considered. Results on simulated time series show that regimes can be successfully identified under certain hypotheses. Nevertheless, this work highlights that further research has to be done on real wind power time series, which usually show different behaviours (e.g. fluctuations or ramps followed by low-variance periods). A better understanding of these events will eventually improve wind power forecasting. (Author) 15 refs.

  3. Spectral classification of coupling regimes in the quantum Rabi model

    Science.gov (United States)

    Rossatto, Daniel Z.; Villas-Bôas, Celso J.; Sanz, Mikel; Solano, Enrique

    2017-07-01

    The quantum Rabi model is in the scientific spotlight due to recent theoretical and experimental progress. Nevertheless, a full-fledged classification of its coupling regimes remains a relevant open question. We propose a spectral classification dividing the coupling regimes into three regions based on the validity of perturbative criteria on the quantum Rabi model, which allows us to use exactly solvable effective Hamiltonians. These coupling regimes are (i) the perturbative ultrastrong coupling regime, which comprises the Jaynes-Cummings model, (ii) a region where nonperturbative ultrastrong and nonperturbative deep strong coupling regimes coexist, and (iii) the perturbative deep strong coupling regime. We show that this spectral classification depends not only on the ratio between the coupling strength and the natural frequencies of the unperturbed parts, but also on the energy the system can access. These regimes additionally discriminate the completely different behaviors of several static physical properties, namely the total number of excitations, the photon statistics of the field, and the cavity-qubit entanglement. Finally, we explain the dynamical properties traditionally associated with the deep strong coupling regime, such as the collapses and revivals of the state population, within the frame of the proposed spectral classification.

  4. Automatic Classification of Offshore Wind Regimes With Weather Radar Observations

    DEFF Research Database (Denmark)

    Trombe, Pierre-Julien; Pinson, Pierre; Madsen, Henrik

    2014-01-01

    Weather radar observations are expected to play an important role in offshore wind energy. In particular, they can enable the monitoring of weather conditions in the vicinity of large-scale offshore wind farms and thereby signal the arrival of precipitation systems associated with severe wind...... and amplitude) using reflectivity observations from a single weather radar system. A categorical sequence of most likely wind regimes is estimated from a wind speed time series by combining a Markov-switching model and a global decoding technique, the Viterbi algorithm. In parallel, attributes of precipitation...... systems are extracted from weather radar images. These attributes describe the global intensity, spatial continuity and motion of precipitation echoes on the images. Finally, a CART classification tree is used to find the broad relationships between precipitation attributes and wind regimes...
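The global decoding step mentioned above is standard Viterbi decoding: given state-transition probabilities and per-observation state likelihoods, recover the single most likely regime sequence. Below is a generic sketch with made-up two-regime numbers, not the paper's fitted Markov-switching model.

```python
# Illustrative Viterbi decoding of a most-likely regime sequence from a
# Markov-switching model; all probabilities here are invented.
import numpy as np

def viterbi(log_pi, log_A, log_B):
    """Most likely state path. log_B[t, s] = log-likelihood of obs t in state s."""
    T, S = log_B.shape
    delta = np.empty((T, S))          # best log-score ending in each state
    psi = np.zeros((T, S), dtype=int) # backpointers
    delta[0] = log_pi + log_B[0]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A  # S x S: prev state x next state
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[t]
    path = np.empty(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):    # backtrack
        path[t] = psi[t + 1, path[t + 1]]
    return path

# Toy two-regime example: "calm" (0) vs. "windy" (1).
log_pi = np.log([0.5, 0.5])
log_A = np.log([[0.9, 0.1], [0.2, 0.8]])                  # persistent regimes
obs_loglik = np.log([[0.8, 0.2], [0.7, 0.3], [0.1, 0.9], [0.2, 0.8]])
print(viterbi(log_pi, log_A, obs_loglik))  # → [0 0 1 1]
```

Viterbi is "global" in the sense that it decodes the whole sequence jointly rather than picking the locally most probable regime at each step.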

  5. A feasibility study on precipitation regime classification by meteorological states

    Science.gov (United States)

    Hamada, A.; Takayabu, Y. N.

    2012-04-01

    Appropriate microphysical models of rainfall systems are essential for accurate precipitation retrievals from satellite measurements. For a better estimate of rainfall from the microwave imager satellites in the Global Satellite Mapping of Precipitation (GSMaP) project, Takayabu (2008, GEWEX Newsletter; hereinafter T08) produced 3-monthly maps of dominant rainfall systems, utilizing TRMM Precipitation Radar (PR) and Lightning Imaging Sensor (LIS) data. It would be worthwhile if we could classify different types of rainfall systems not from satellite rainfall data themselves but from the environmental meteorological states. In this feasibility study, precipitation regime classification over the oceans is performed by constructing a look-up table (LUT) for estimating precipitation types in terms of the local state of the atmosphere and ocean. We chose four variables to construct the LUTs: sea surface temperature (SST), pressure vertical velocity at 500 hPa (ω500), lower-tropospheric baroclinicity at 900 hPa (dT900/dy), and lower-tropospheric stability (LTS), obtained from ERA-Interim and OISST. The LUTs are trained with the precipitation types defined by T08. The four-dimensional probability density functions for each precipitation type were utilized to reconstruct precipitation types at each point. The constructed four-dimensional LUT is shown to have reasonably good skill in estimation over the oceans. The probability of detection (POD) is above 60%, and up to 90%, for all seasons. The estimation skill depends little on the month even though the LUT was trained with only a one-month climatology, indicating that the choice of these state variables is reasonable. The LUT can also describe interannual variations of precipitation regimes, e.g., the differences between El Niño and La Niña periods. The separation by the selected environmental states is mostly meteorologically reasonable, although some representative variables leave room for improvement, especially in the midlatitudes.
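The LUT idea reduces to binning the environmental predictors, counting class occupancy per bin (an empirical PDF), and assigning each bin its most probable class. A schematic two-predictor version is sketched below; the predictors, bin count, and synthetic labels are all invented for illustration, not the study's four-variable LUT.

```python
# Schematic look-up-table classifier: bin environmental variables, build a
# per-class occupancy histogram (empirical PDF), pick the modal class per bin.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# Two illustrative environmental predictors (think SST and omega500),
# rescaled to [0, 1).
env = rng.random((n, 2))
# Synthetic "precipitation type" labels loosely tied to the first predictor.
labels = (env[:, 0] + 0.2 * rng.standard_normal(n) > 0.5).astype(int)

bins = 10
idx = np.minimum((env * bins).astype(int), bins - 1)  # bin index per sample

# Per-class occupancy counts over the 2-D grid -> empirical PDFs.
counts = np.zeros((2, bins, bins))
for (i, j), y in zip(idx, labels):
    counts[y, i, j] += 1

lut = counts.argmax(axis=0)        # most likely class for each bin
pred = lut[idx[:, 0], idx[:, 1]]   # classify samples through the LUT
accuracy = (pred == labels).mean()
print(round(accuracy, 3))
```

The real study does this in four dimensions and evaluates skill (POD) against independently defined precipitation types, but the mechanics are the same.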

  6. Regime variance testing - a quantile approach

    CERN Document Server

    Gajda, Janusz; Wyłomańska, Agnieszka

    2012-01-01

    This paper is devoted to testing time series that exhibit behavior related to two or more regimes with different statistical properties. The motivation for our study is two real data sets from plasma physics with an observable two-regime structure. We develop an estimation procedure for the critical point at which the structure of the time series changes, and we propose three tests for recognizing such behavior. The presented methodology is based on the empirical second moment, and its main advantage is that it requires no distributional assumption. Moreover, we express the examined statistical properties in the language of empirical quantiles of the squared data, so the methodology extends the approach known from the literature. We confirm the theoretical results by simulations and by analysis of real data from turbulent laboratory plasma.
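A change in variance shows up as a change of slope in the cumulative sum of squared data. The sketch below estimates a two-regime switch point in that spirit; it is a generic illustration built on the empirical second moment, not the paper's actual statistic or tests.

```python
# Illustrative change-point estimate for a two-regime variance switch:
# fit the cumulative sum of squares with two linear segments and pick the
# split minimizing the residual error. (Not the paper's exact procedure.)
import numpy as np

rng = np.random.default_rng(2)
# Two regimes: unit variance for 300 samples, then variance 9 for 200.
x = np.concatenate([rng.normal(0, 1.0, 300), rng.normal(0, 3.0, 200)])

css = np.cumsum(x ** 2)  # cumulative empirical second moment
n = len(x)
t = np.arange(1, n + 1)

def sse_linear(u, v):
    """Residual sum of squares of the best-fit line v ~ a*u + b."""
    a, b = np.polyfit(u, v, 1)
    r = v - (a * u + b)
    return float(r @ r)

best_k, best_err = None, np.inf
for k in range(20, n - 20):  # guard band at both ends
    err = sse_linear(t[:k], css[:k]) + sse_linear(t[k:], css[k:])
    if err < best_err:
        best_k, best_err = k, err

print(best_k)  # close to the true switch at index 300
```

Because the statistic is built from the data's own squares, no distributional assumption is needed, which mirrors the advantage the authors emphasize.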

  7. Combinatorial Approach of Associative Classification

    OpenAIRE

    P. R. Pal; R.C. Jain

    2010-01-01

    Association rule mining and classification are two important techniques of data mining in the knowledge discovery process. Their integration has produced class association rule mining, or associative classification, techniques, which in many cases have shown better classification accuracy than conventional classifiers. Motivated by this, we explore and apply combinatorial mathematics to class association rule mining in this paper. Our algorithm is based on producing co...

  8. Quantum regime of a free-electron laser: relativistic approach

    Science.gov (United States)

    Kling, Peter; Sauerbrey, Roland; Preiss, Paul; Giese, Enno; Endrich, Rainer; Schleich, Wolfgang P.

    2017-01-01

    In the quantum regime of the free-electron laser, the dynamics of the electrons is not governed by continuous trajectories but by discrete jumps in momentum. In this article, we rederive the two crucial conditions to enter this quantum regime: (1) a large quantum mechanical recoil of the electron caused by the scattering with the laser and the wiggler field and (2) a small energy spread of the electron beam. In contrast to our recent approach based on nonrelativistic quantum mechanics in a co-moving frame of reference, we now pursue a model in the laboratory frame employing relativistic quantum electrodynamics.

  9. Multivariate Approaches to Classification in Extragalactic Astronomy

    Science.gov (United States)

    Fraix-Burnet, Didier; Thuillard, Marc; Chattopadhyay, Asis Kumar

    2015-08-01

    Clustering objects into synthetic groups is a natural activity of any science. Astrophysics is not an exception and is now facing a deluge of data. For galaxies, the one-century old Hubble classification and the Hubble tuning fork are still largely in use, together with numerous mono- or bivariate classifications most often made by eye. However, a classification must be driven by the data, and sophisticated multivariate statistical tools are used more and more often. In this paper we review these different approaches in order to situate them in the general context of unsupervised and supervised learning. We insist on the astrophysical outcomes of these studies to show that multivariate analyses provide an obvious path toward a renewal of our classification of galaxies and are invaluable tools to investigate the physics and evolution of galaxies.

  10. Multivariate Approaches to Classification in Extragalactic Astronomy

    Directory of Open Access Journals (Sweden)

    Didier Fraix-Burnet

    2015-08-01

    Clustering objects into synthetic groups is a natural activity of any science. Astrophysics is not an exception and is now facing a deluge of data. For galaxies, the one-century old Hubble classification and the Hubble tuning fork are still largely in use, together with numerous mono- or bivariate classifications most often made by eye. However, a classification must be driven by the data, and sophisticated multivariate statistical tools are used more and more often. In this paper we review these different approaches in order to situate them in the general context of unsupervised and supervised learning. We insist on the astrophysical outcomes of these studies to show that multivariate analyses provide an obvious path toward a renewal of our classification of galaxies and are invaluable tools to investigate the physics and evolution of galaxies.

  11. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered a cornerstone of the research area Music Information Retrieval (MIR) and is closely linked to the other areas in MIR. It is thought...... that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast...... to systems which use e.g. a symbolic representation or textual information about the music. The approach taken here to music genre classification systems is system-oriented. In other words, all the different aspects of the systems have been considered, and it is emphasized that the systems should...

  12. Multivariate Approaches to Classification in Extragalactic Astronomy

    CERN Document Server

    Fraix-Burnet, Didier; Chattopadhyay, Asis Kumar

    2015-01-01

    Clustering objects into synthetic groups is a natural activity of any science. Astrophysics is not an exception and is now facing a deluge of data. For galaxies, the one-century old Hubble classification and the Hubble tuning fork are still largely in use, together with numerous mono- or bivariate classifications most often made by eye. However, a classification must be driven by the data, and sophisticated multivariate statistical tools are used more and more often. In this paper we review these different approaches in order to situate them in the general context of unsupervised and supervised learning. We insist on the astrophysical outcomes of these studies to show that multivariate analyses provide an obvious path toward a renewal of our classification of galaxies and are invaluable tools to investigate the physics and evolution of galaxies.

  13. Text Classification: A Sequential Reading Approach

    CERN Document Server

    Dulac-Arnold, Gabriel; Gallinari, Patrick

    2011-01-01

    We propose to model the text classification process as a sequential decision process. In this process, an agent learns to classify documents into topics while reading the document's sentences sequentially, and learns to stop as soon as enough information has been read to make a decision. The proposed algorithm is based on modeling text classification as a Markov Decision Process and learns using Reinforcement Learning. Experiments on four classical mono-label corpora show that the proposed approach performs comparably to classical SVM approaches for large training sets, and better for small training sets. In addition, the model automatically adapts its reading process to the quantity of training information provided.
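The "read until confident, then stop" idea can be illustrated with a much simpler stand-in: score the growing prefix of a document with a pretrained classifier and stop once the posterior for some topic passes a threshold. The agent below is a fixed confidence rule, not the paper's learned reinforcement-learning policy, and the tiny corpus is invented.

```python
# Toy sketch of sequential reading with early stopping (fixed rule, not RL).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_docs = [
    "the striker scored a late goal in the match",
    "the team won the league after a tense final game",
    "fans cheered as the match went to extra time",
    "chop the onions and simmer the sauce gently",
    "season the soup and simmer for ten minutes",
    "whisk the eggs before folding in the flour",
]
train_labels = ["sports"] * 3 + ["cooking"] * 3

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(train_docs, train_labels)

sentences = [
    "start by warming the pan",
    "add the onions and simmer the sauce",
    "season to taste before serving",
]

read, label = [], None
for sent in sentences:
    read.append(sent)
    proba = clf.predict_proba([" ".join(read)])[0]
    if proba.max() >= 0.75:  # confident enough: stop reading early
        label = clf.classes_[proba.argmax()]
        break
else:  # never confident: classify on the full document
    label = clf.predict([" ".join(read)])[0]

print(label, len(read))
```

The MDP formulation replaces this hand-set threshold with a stopping action whose value is learned jointly with the classification actions.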

  14. A statistical approach to root system classification.

    Directory of Open Access Journals (Sweden)

    Gernot Bodner

    2013-08-01

    Plant root systems have a key role in ecology and agronomy. Despite the fast increase in root studies, there is still no classification that allows distinguishing among the distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for plant functional type identification in ecology can be applied to the classification of root systems. We demonstrate that combining principal component and cluster analysis yields a meaningful classification of rooting types based on morphological traits. The classification method presented is based on a data-defined statistical procedure without a priori decisions on the classifiers. Biplot inspection is used to determine key traits and to ensure stability in cluster-based grouping. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture the most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. The adequacy of commonly available morphological traits for classification is supported by field data. Three rooting types emerged from the measured data, distinguished by diameter/weight, density and spatial distribution respectively. Similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We conclude that the data-defined classification is appropriate for the integration of knowledge obtained with different root measurement methods and at various scales. Currently, root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture details of root diversity, efforts in architectural measurement
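The PCA-plus-clustering recipe described above is a standard pattern. The sketch below illustrates it on invented data: the four "morphological traits" and the three synthetic groups are stand-ins, not the study's measurements.

```python
# Hedged sketch of PCA-plus-clustering: reduce correlated morphological
# traits to principal components, then cluster in PC space.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# Three synthetic "rooting types", each with 4 correlated traits
# (e.g. mean diameter, total weight, length density, rooting depth).
centers = np.array([[0.5, 2.0, 1.0, 30.0],
                    [1.5, 6.0, 0.5, 60.0],
                    [0.8, 3.0, 2.5, 20.0]])
X = np.vstack([c + rng.normal(0, [0.1, 0.4, 0.1, 4.0], (50, 4))
               for c in centers])

Z = StandardScaler().fit_transform(X)            # traits on comparable scales
pcs = PCA(n_components=2).fit_transform(Z)       # data-defined key-trait axes
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pcs)
print(np.bincount(groups))  # roughly 50 samples per recovered type
```

As in the paper, the classifiers here are data-defined: no trait is designated a classifier a priori; the principal components and cluster structure emerge from the measurements.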

  15. A Shallow Approach to Subjectivity Classification

    NARCIS (Netherlands)

    Raaijmakers, S.A.; Kraaij, W.

    2008-01-01

    We present a shallow linguistic approach to subjectivity classification. Using multinomial kernel machines, we demonstrate that a data representation based on counting character n-grams is able to improve on results previously attained on the MPQA corpus using word-based n-grams and syntactic inform
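A character-n-gram representation with a linear kernel machine is easy to sketch. The miniature corpus below is invented for illustration; the MPQA corpus itself is not used here.

```python
# Minimal sketch of character-n-gram subjectivity classification.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

docs = [
    "I absolutely loved this film, a stunning achievement",
    "what a dreadful, boring waste of an evening",
    "the committee meets on the first Tuesday of each month",
    "the report lists revenue figures for the third quarter",
]
labels = ["subjective", "subjective", "objective", "objective"]

# Character 2-4-grams instead of word n-grams, as in the shallow approach.
clf = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(2, 4)),
    LinearSVC(),
)
clf.fit(docs, labels)
print(clf.predict(["an utterly charming performance"])[0])
```

Character n-grams capture sub-word cues (affixes, punctuation patterns) that word-based features miss, which is the representational point the abstract makes.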

  16. Exploring different approaches for music genre classification

    Directory of Open Access Journals (Sweden)

    Antonio Jose Homsi Goulart

    2012-07-01

    In this letter, we present different approaches for music genre classification. The proposed techniques, which are composed of a feature extraction stage followed by a classification procedure, explore both the variations of parameters used as input and the classifier architecture. Tests were carried out with three styles of music, namely blues, classical, and lounge, which are considered informally by some musicians as being “big dividers” among music genres, showing the efficacy of the proposed algorithms and establishing a relationship between the relevance of each set of parameters for each music style and each classifier. In contrast to other works, entropies and fractal dimensions are the features adopted for the classifications.

  17. Information theoretic approach for accounting classification

    CERN Document Server

    Ribeiro, E M S

    2014-01-01

    In this paper we consider an information theoretic approach to the accounting classification process. We propose a matrix formalism and an algorithm for calculating information theoretic measures associated with accounting classification. The formalism may be useful for further generalizations and computer-based implementation. Two information theoretic measures, mutual information and symmetric uncertainty, were evaluated for daily transactions recorded in the chart of accounts of a small company during two years. Variation in the information measures due to the aggregation of data in the process of accounting classification is observed. In particular, the symmetric uncertainty seems to be a useful parameter for comparing companies over time or in different sectors, or different accounting choices and standards.

  18. A statistical approach to root system classification.

    Science.gov (United States)

    Bodner, Gernot; Leitner, Daniel; Nakhforoosh, Alireza; Sobotik, Monika; Moder, Karl; Kaul, Hans-Peter

    2013-01-01

    Plant root systems have a key role in ecology and agronomy. Despite the fast increase in root studies, there is still no classification that allows distinguishing among the distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for "plant functional type" identification in ecology can be applied to the classification of root systems. The classification method presented is based on a data-defined statistical procedure without a priori decisions on the classifiers. The study demonstrates that principal-component-based rooting types provide efficient and meaningful multi-trait classifiers. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture the most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. The adequacy of commonly available morphological traits for classification is supported by field data. Rooting types emerging from the measured data were mainly distinguished into diameter/weight- and density-dominated types. Similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We conclude that the data-defined classification is appropriate for the integration of knowledge obtained with different root measurement methods and at various scales. Currently, root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture details of root diversity, efforts in architectural measurement techniques are essential.

  19. Classification of regimes of internal solitary waves transformation over a shelf-slope topography

    Science.gov (United States)

    Terletska, Kateryna; Maderich, Vladimir; Talipova, Tatiana; Brovchenko, Igor; Jung, Kyung Tae

    2015-04-01

    Internal waves shoal and dissipate as they cross abrupt changes of topography in the coastal ocean, estuaries and enclosed water bodies. Near the coast they can form internal bores propagating into the shallows and re-suspend seabed pollutants, which may have serious ecological consequences. Internal solitary waves (ISWs) with a trapped core can transport masses of water and marine organisms over some distance. The transport of cold, low-oxygen waters results in nutrient pumping. These facts require the development of a classification of regimes of ISW transformation over a shelf-slope topography in order to recognize 'hot spots' of wave energy dissipation on the continental shelf. A new classification of regimes of internal solitary wave interaction with shelf-slope topography in the framework of a two-layer fluid is proposed. We introduce a new three-dimensional diagram based on the parameters α, β, γ. Here α is the nondimensional wave amplitude normalized by the thermocline thickness, α = ain/h1 (α > 0); β is the blocking parameter introduced in (Talipova et al., 2013), the ratio of the height of the bottom layer on the shelf step h2+ to the incident wave amplitude ain, β = h2+/ain (β > -3); and γ is the inverse of the slope inclination (γ > 0.01). Two mechanisms are important during wave shoaling: (i) wave breaking, resulting in mixing, and (ii) a change of polarity of the initial wave of depression on the slope. The range of parameters at which wave breaking occurs can be defined using criteria obtained empirically (Vlasenko and Hutter, 2002). In the three-dimensional diagram this criterion is represented by the surface f1(β,γ) = 0 that separates the region of parameters where breaking takes place from the region without breaking. The polarity-change surface f2(α,β) = 0 is obtained from the condition that the depth of the upper layer h1 equals the depth of the lower layer h2. In the two-layer stratification waves of
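Restated compactly from the abstract, the three diagram parameters are as below; the slope-angle symbol θ is our notation, since the abstract only describes γ as the inverse of the slope inclination:

```latex
% Nondimensional parameters of the ISW transformation-regime diagram
\alpha = \frac{a_{\mathrm{in}}}{h_1} > 0, \qquad
\beta  = \frac{h_2^{+}}{a_{\mathrm{in}}} > -3, \qquad
\gamma = \frac{1}{\tan\theta} > 0.01
```

The empirical breaking surface f1(β, γ) = 0 and the polarity-change surface f2(α, β) = 0 (from h1 = h2) then partition this (α, β, γ) space into the interaction regimes.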

  20. Flow Analysis: A Novel Approach For Classification.

    Science.gov (United States)

    Vakh, Christina; Falkova, Marina; Timofeeva, Irina; Moskvin, Alexey; Moskvin, Leonid; Bulatov, Andrey

    2016-09-01

    We suggest a novel approach for classification of flow analysis methods according to the conditions under which the mass transfer processes and chemical reactions take place in the flow mode: dispersion-convection flow methods and forced-convection flow methods. The first group includes continuous flow analysis, flow injection analysis, all injection analysis, sequential injection analysis, sequential injection chromatography, cross injection analysis, multi-commutated flow analysis, multi-syringe flow injection analysis, multi-pumping flow systems, loop flow analysis, and simultaneous injection effective mixing flow analysis. The second group includes segmented flow analysis, zone fluidics, flow batch analysis, sequential injection analysis with a mixing chamber, stepwise injection analysis, and multi-commutated stepwise injection analysis. The offered classification allows systematizing a large number of flow analysis methods. Recent developments and applications of dispersion-convection flow methods and forced-convection flow methods are presented.

  1. AUTOMATIC APPROACH TO VHR SATELLITE IMAGE CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    P. Kupidura

    2016-06-01

    In this paper, we present a fully automatic classification of VHR satellite images. Unlike the most widespread approaches, supervised classification, which requires prior definition of class signatures, and unsupervised classification, which must be followed by an interpretation of its results, the proposed method requires no human intervention except for the setting of the initial parameters. The presented approach is based on both spectral and textural analysis of the image and consists of 3 steps. The first step, the analysis of spectral data, relies on NDVI values. Its purpose is to distinguish between basic classes, such as water, vegetation and non-vegetation, which all differ significantly spectrally and thus can be easily extracted based on spectral analysis. The second step relies on granulometric maps. These are the product of local granulometric analysis of an image and present information on the texture of each pixel's neighbourhood, depending on the texture grain. The purpose of texture analysis is to distinguish between classes that are spectrally similar but of different texture, e.g. bare soil from a built-up area, or low vegetation from a wooded area. Because the granulometric analysis is based on mathematical morphology opening and closing, the results are resistant to the border effect (qualifying borders of objects in an image as spaces of high texture), which affects other methods of texture analysis such as GLCM statistics or fractal analysis. Therefore, the effectiveness of the analysis is relatively high. Several indices based on the values of different granulometric maps have been developed to simplify the extraction of classes of different texture. The third and final step of the process relies on a vegetation index based on the near-infrared and blue bands. Its purpose is to correct partially misclassified pixels. All the indices used in the classification model developed relate to reflectance values, so the
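The first, spectral step reduces to computing NDVI per pixel and thresholding it into coarse classes. The sketch below uses common illustrative thresholds, not the paper's settings, and random arrays in place of real bands.

```python
# Step-1 sketch: NDVI thresholding into coarse water / non-veg / vegetation
# classes; thresholds are illustrative, not the paper's values.
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, safe against zero division."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-9)

rng = np.random.default_rng(4)
nir = rng.random((4, 4))  # stand-in for a near-infrared reflectance band
red = rng.random((4, 4))  # stand-in for a red reflectance band
v = ndvi(nir, red)

# Coarse spectral classes; the texture step (granulometric maps) would
# then split spectrally similar classes such as bare soil vs. built-up.
classes = np.where(v < 0.0, "water",
                   np.where(v < 0.3, "non-veg", "vegetation"))
print(classes.shape)
```

NDVI is bounded in [-1, 1] for non-negative reflectances, which is what makes fixed thresholds workable across scenes when the inputs are true reflectance values, as the abstract notes.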

  2. Automatic Approach to Vhr Satellite Image Classification

    Science.gov (United States)

    Kupidura, P.; Osińska-Skotak, K.; Pluto-Kossakowska, J.

    2016-06-01

    In this paper, we present a fully automatic classification of VHR satellite images. Unlike the most widespread approaches, supervised classification, which requires prior definition of class signatures, and unsupervised classification, which must be followed by an interpretation of its results, the proposed method requires no human intervention except for the setting of the initial parameters. The presented approach is based on both spectral and textural analysis of the image and consists of 3 steps. The first step, the analysis of spectral data, relies on NDVI values. Its purpose is to distinguish between basic classes, such as water, vegetation and non-vegetation, which all differ significantly spectrally and thus can be easily extracted based on spectral analysis. The second step relies on granulometric maps. These are the product of local granulometric analysis of an image and present information on the texture of each pixel's neighbourhood, depending on the texture grain. The purpose of texture analysis is to distinguish between classes that are spectrally similar but of different texture, e.g. bare soil from a built-up area, or low vegetation from a wooded area. Because the granulometric analysis is based on mathematical morphology opening and closing, the results are resistant to the border effect (qualifying borders of objects in an image as spaces of high texture), which affects other methods of texture analysis such as GLCM statistics or fractal analysis. Therefore, the effectiveness of the analysis is relatively high. Several indices based on the values of different granulometric maps have been developed to simplify the extraction of classes of different texture. The third and final step of the process relies on a vegetation index based on the near-infrared and blue bands. Its purpose is to correct partially misclassified pixels. All the indices used in the classification model developed relate to reflectance values, so the preliminary step

  3. Weather regimes over Senegal during the summer monsoon season using self-organizing maps and hierarchical ascendant classification. Part II: interannual time scale

    Energy Technology Data Exchange (ETDEWEB)

    Gueye, A.K. [ESP, UCAD, Dakar (Senegal); Janicot, Serge; Sultan, Benjamin [LOCEAN/IPSL, IRD, Universite Pierre et Marie Curie, Paris cedex 05 (France); Niang, A. [LTI, ESP/UCAD, Dakar (Senegal); Sawadogo, S. [LTI, EPT, Thies (Senegal); Diongue-Niang, A. [ANACIM, Dakar (Senegal); Thiria, S. [LOCEAN/IPSL, UPMC, Paris (France)

    2012-11-15

    The aim of this work is to define, over the period 1979-2002, the main synoptic weather regimes relevant for understanding the daily variability of rainfall during the summer monsoon season over Senegal. "Interannual" synoptic weather regimes are defined by removing the influence of the mean 1979-2002 seasonal cycle. This differs from Part I, where the seasonal evolution of each year was removed, which also removed the contribution of interannual variability. As in Part I, the self-organizing maps approach, a clustering methodology based on a non-linear artificial neural network, is combined with a hierarchical ascendant classification to compute these regimes. Nine weather regimes are identified using mean sea level pressure and the 850 hPa wind field as variables. The composite circulation patterns of these nine weather regimes are very consistent with the associated anomaly patterns of precipitable water, mid-troposphere vertical velocity and rainfall. They are also consistent with the distribution of rainfall extremes. These regimes have then been gathered into different groups. A first group of four regimes is included in an inner circuit and is characterized by a modulation of the semi-permanent trough located along the western coast of West Africa and an opposite modulation in the east. This circuit is important because it associates the two wettest and highly persistent weather regimes over Senegal with the driest and most persistent one. One branch of this circuit, including the two driest regimes and the most persistent one, is highlighted, as it can produce long dry sequences. An exit from this circuit is characterised by a filling of the Saharan heat low. An entry into the main circuit includes a southward location of the Saharan heat low followed by its deepening. The last weather regime is isolated from the other ones and has no significant impact on Senegal. It is present in June and September, and

  4. Non-invasive classification of gas-liquid two-phase horizontal flow regimes using an ultrasonic Doppler sensor and a neural network

    Science.gov (United States)

    Musa Abbagoni, Baba; Yeung, Hoi

    2016-08-01

    The identification of flow pattern is a key issue in multiphase flow, which is encountered in the petrochemical industry, yet identifying gas-liquid flow regimes objectively remains difficult. This paper presents the feasibility of a clamp-on instrument for objective flow regime classification of two-phase flow using an ultrasonic Doppler sensor and an artificial neural network, which records and processes the ultrasonic signals reflected from the two-phase flow. Experimental data were obtained on a horizontal test rig with a total pipe length of 21 m and 5.08 cm internal diameter carrying air-water two-phase flow under slug, elongated bubble, stratified-wavy and stratified flow regimes. Multilayer perceptron neural networks (MLPNNs) are used to develop the classification model. The classifier requires input features that are representative of the signals; ultrasound signal features are extracted by applying both power spectral density (PSD) and discrete wavelet transform (DWT) methods to the flow signals. A 1-of-C coding scheme was adopted to classify the extracted features into one of the four flow regime categories. To improve the performance of the flow regime classifier, a second-level neural network was incorporated, using the output of the first-level network as an input feature. Combining the two networks yielded a model with higher accuracy than either single network. Classification accuracies are evaluated for both the PSD and DWT features. The success rates of the two models are: (1) using PSD features, the classifier missed 3 of 24 test datasets, scoring 87.5% accuracy; (2) with the DWT features, the network misclassified only one data point, classifying the flow patterns with 95.8% accuracy. This approach has demonstrated the
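A minimal sketch of the feature-extraction and classification chain (Welch PSD features feeding an MLP with 1-of-C targets), using synthetic signals in place of ultrasonic Doppler records. The four "regimes" here are just tones at different frequencies, and the sampling rate and all other parameters are hypothetical.

```python
import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
fs = 1000.0  # Hz, hypothetical sampling rate

def synth_signal(f0):
    """Toy stand-in for a Doppler record of one flow regime."""
    t = np.arange(2048) / fs
    return np.sin(2 * np.pi * f0 * t) + 0.5 * rng.normal(size=t.size)

# Four regimes (slug, elongated bubble, stratified-wavy, stratified)
# are mimicked by four dominant frequencies.
X, y = [], []
for label, f0 in enumerate([50, 120, 200, 350]):
    for _ in range(40):
        f, pxx = welch(synth_signal(f0), fs=fs, nperseg=256)
        X.append(pxx / pxx.sum())  # normalized PSD feature vector
        y.append(label)
X, y = np.asarray(X), np.asarray(y)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(Xtr, ytr)  # 1-of-C handled internally
print(f"accuracy: {clf.score(Xte, yte):.2f}")
```

Normalizing each PSD by its total power makes the classifier respond to spectral shape rather than signal energy, which is the role the PSD features play in the paper's pipeline.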

  5. Age Classification Based On Integrated Approach

    Directory of Open Access Journals (Sweden)

    Pullela. SVVSR Kumar

    2014-05-01

    Full Text Available The present paper presents a new age classification method by integrating features derived from the Grey Level Co-occurrence Matrix (GLCM) with a new structural approach derived from four distinct LBPs (4-DLBP) on a 3 x 3 image. The present paper derives four distinct patterns called Left Diagonal (LD), Right Diagonal (RD), Vertical Centre (VC) and Horizontal Centre (HC) LBPs. For all the LBPs the central pixel value of the 3 x 3 neighbourhood is significant. For this reason, in the present research LBP values are evaluated by comparing all 9 pixels of the 3 x 3 neighbourhood with the average value of the neighbourhood. The four distinct LBPs are grouped into two distinct LBPs. Based on these two distinct LBPs a GLCM is computed and features are evaluated to classify human age into four age groups, i.e., Child (0-15), Young adult (16-30), Middle-aged adult (31-50) and Senior adult (>50). The co-occurrence features extracted from the 4-DLBP provide complete texture information about an image, which is useful for classification. The proposed 4-DLBP reduces the size of the LBP from 6561 to 79 in the case of the original texture spectrum and from 2020 to 79 in the case of the Fuzzy Texture approach.
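The average-thresholding idea (compare all 9 pixels of a 3 x 3 window with the window mean, then read off directional bit patterns) can be illustrated in a few lines of numpy. The paper does not specify the exact bit layout of its 4-DLBP, so the encoding below is illustrative only.

```python
import numpy as np

def directional_codes(win):
    """Threshold a 3x3 window against its mean and read four
    3-bit directional patterns (illustrative layout, not the
    paper's exact encoding)."""
    bits = (win >= win.mean()).astype(int)
    ld = bits[[0, 1, 2], [0, 1, 2]]   # left diagonal
    rd = bits[[0, 1, 2], [2, 1, 0]]   # right diagonal
    vc = bits[:, 1]                   # vertical centre column
    hc = bits[1, :]                   # horizontal centre row
    to_int = lambda b: int("".join(map(str, b)), 2)
    return {"LD": to_int(ld), "RD": to_int(rd),
            "VC": to_int(vc), "HC": to_int(hc)}

win = np.array([[10, 200, 10],
                [200, 200, 200],
                [10, 200, 10]])
print(directional_codes(win))  # → {'LD': 2, 'RD': 2, 'VC': 7, 'HC': 7}
```

Because the threshold is the window mean rather than the centre pixel, the centre contributes to every code, which is the property the abstract emphasizes.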

  6. The Mechanistic Approach to Psychiatric Classification

    Directory of Open Access Journals (Sweden)

    Elisabetta Sirgiovanni

    2009-01-01

    Full Text Available A Kuhnian reformulation of the recent debate in psychiatric nosography suggested that the current psychiatric classification system (the DSM) is in crisis and that a sort of paradigm shift is awaited (Aragona, 2009). Among possible revolutionary alternatives, the proposed five-axes etiopathogenetic taxonomy (Charney et al., 2002) emphasizes the primacy of the genotype over the phenomenological level as the relevant basis for psychiatric nosography. Such a position is along the lines of the micro-reductionist perspective of E. Kandel (1998, 1999), which sees mental disorders as reducible to explanations at a fundamental epistemic level of genes and neurotransmitters. This form of micro-reductionism has been criticized as a form of genetic-molecular fundamentalism (e.g. Murphy, 2006) and a multi-level approach, in the form of the burgeoning Cognitive Neuropsychiatry, was proposed. This article focuses on multi-level mechanistic explanations, coming from Cognitive Science, as a possible alternative etiopathogenetic basis for psychiatric classification. The idea of a mechanistic approach to psychiatric taxonomy is here defended on the basis of a better conception of levels and causality. Nevertheless some critical remarks on Mechanism as a general psychiatric view are also offered.

  7. Spatio-temporal analysis of discharge regimes based on hydrograph classification techniques in an agricultural catchment

    Science.gov (United States)

    Chen, Xiaofei; Bloeschl, Guenter; Blaschke, Alfred Paul; Silasari, Rasmiaditya; Exner-Kittridge, Mike

    2016-04-01

    Stream discharge and groundwater hydrographs integrate the spatial and temporal variations of small-scale hydrological responses. Characterizing the discharge response regime of drained farmland is essential for irrigation strategies and hydrologic modeling. For agricultural basins in particular, diurnal hydrographs from drainage discharges have been investigated to infer drainage processes across varying magnitudes. To explore the variability of discharge responses, we developed an objective method to characterize and classify discharge hydrographs based on magnitude and time-series features. Cluster analysis (hierarchical k-means) and principal components analysis are applied to discharge time series and groundwater level hydrographs to analyze their event characteristics, using 8 different discharge and 18 groundwater level hydrographs as a test set. Owing to the variability of rainfall activity, system location, discharge regime and pre-event soil moisture in the catchment, three main clusters of discharge hydrographs are identified. The results show that: (1) the hydrographs from these drainage discharges had similar shapes but different magnitudes for individual rainstorms, a similarity also seen in overland flow discharge and the spring system; (2) within each cluster the shapes remained similar, but the rising slopes differed due to different antecedent wetness conditions, while differences in regression slope can be explained by system location and discharge area; and (3) surface water always has a close proportional relation with soil moisture throughout the year, whereas the outflow of tile drainage systems is directly proportional to soil moisture and inversely related to groundwater levels only after soil moisture exceeds a certain threshold. Finally, we discuss the potential application of hydrograph classification in a wider range of
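The classification workflow (peak-normalize event hydrographs, compress with PCA, then cluster) can be sketched on synthetic storm responses. The gamma-shaped hydrographs and every parameter below are hypothetical stand-ins for the paper's drainage data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

def event_hydrograph(peak, rise):
    """Toy gamma-shaped storm response (hypothetical units)."""
    t = np.arange(48.0)  # hours
    q = peak * (t / rise) ** 2 * np.exp(-t / rise)
    return q + 0.05 * peak * rng.normal(size=t.size)

# Three response types differing in magnitude and rising-limb speed.
X = np.array([event_hydrograph(p, r)
              for p, r in [(1, 3)] * 10 + [(1, 8)] * 10 + [(5, 3)] * 10])

# Dividing each hydrograph by its peak makes shape, not magnitude,
# drive the clustering; PCA then compresses the 48-hour time axis.
shape = X / X.max(axis=1, keepdims=True)
scores = PCA(n_components=3).fit_transform(shape)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(labels)
```

With peak normalization, the fast-rising low-flow and high-flow events collapse into one shape cluster, mirroring the abstract's finding of similar shapes at different magnitudes.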

  8. A Nonparametric Approach to Estimate Classification Accuracy and Consistency

    Science.gov (United States)

    Lathrop, Quinn N.; Cheng, Ying

    2014-01-01

    When cut scores for classifications occur on the total score scale, popular methods for estimating classification accuracy (CA) and classification consistency (CC) require assumptions about a parametric form of the test scores or about a parametric response model, such as item response theory (IRT). This article develops an approach to estimate CA…

  9. Automatic lung nodule classification with radiomics approach

    Science.gov (United States)

    Ma, Jingchen; Wang, Qian; Ren, Yacheng; Hu, Haibo; Zhao, Jun

    2016-03-01

    Lung cancer is the leading cause of cancer death. Malignant lung nodules carry extremely high mortality, while some benign nodules need no treatment at all; accurate discrimination between benign and malignant nodules is therefore essential. Although an additional invasive biopsy or a second CT scan 3 months later may currently help radiologists make a judgment, easier diagnostic approaches are urgently needed. In this paper, we propose a novel CAD method to distinguish benign from malignant lung nodules directly from CT images, which can not only improve the efficiency of tumor diagnosis but also greatly decrease the pain and risk patients face in the biopsy collection process. Briefly, following the state-of-the-art radiomics approach, 583 features were used at the first step to measure the nodules' intensity, shape, heterogeneity and multi-frequency information. Then, with the Random Forest method, we distinguish benign from malignant nodules by analyzing all these features. Notably, our proposed scheme was tested on all 79 CT scans with diagnosis data available in The Cancer Imaging Archive (TCIA), which contain 127 nodules, each annotated by at least one of four radiologists participating in the project. This method achieved 82.7% accuracy in classifying malignant primary lung nodules versus benign nodules. We believe it would bring much value to routine lung cancer diagnosis in CT imaging and improve decision support at much lower cost.
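The final classification step (a Random Forest over radiomics features) can be sketched with synthetic feature vectors. The 20 toy features below stand in for the paper's 583 radiomics features, and the labels are random rather than radiologist annotations, so the accuracy is unrelated to the reported 82.7%.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n, p = 127, 20  # 127 nodules as in the paper; 20 toy features (not 583)

y = rng.integers(0, 2, size=n)        # 0 = benign, 1 = malignant
X = rng.normal(size=(n, p))
X[:, :5] += y[:, None] * 1.5          # make five features informative

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

Cross-validation is the natural evaluation here because 127 nodules are too few to spare a large held-out test set.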

  10. Music Genre Classification Systems - A Computational Approach

    OpenAIRE

    Ahrendt, Peter; Hansen, Lars Kai

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular...

  11. A practicable approach for periodontal classification

    Directory of Open Access Journals (Sweden)

    Vishnu Mittal

    2013-01-01

    Full Text Available The diagnosis and classification of periodontal diseases has long remained a dilemma. Two distinct concepts have been used to define diseases: essentialism and nominalism. The essentialistic concept implies the real existence of a disease, whereas the nominalistic concept states that the names of diseases are a convenient way of stating concisely the endpoint of a diagnostic process. It generally advances from assessment of symptoms and signs toward knowledge of causation, and gives a feasible option to name a disease for which the etiology is either unknown or too complex to assess in routine clinical practice. Various classifications have been proposed by the American Academy of Periodontology (AAP) in 1986, 1989 and 1999. The AAP 1999 classification is among the most widely used, but it too has demerits that impede its use in day-to-day practice. Hence a classification and diagnostic system is required which can help the clinician assess the patient's needs and provide treatment in harmony with the diagnosis for that particular case. Here we attempt to propose a practicable classification and diagnostic system of periodontal diseases for better treatment outcomes.

  12. Multiple Spectral-Spatial Classification Approach for Hyperspectral Data

    Science.gov (United States)

    Tarabalka, Yuliya; Benediktsson, Jon Atli; Chanussot, Jocelyn; Tilton, James C.

    2010-01-01

    A new multiple classifier approach for spectral-spatial classification of hyperspectral images is proposed. Several classifiers are used independently to classify an image. For every pixel, if all the classifiers have assigned this pixel to the same class, the pixel is kept as a marker, i.e., a seed of the spatial region, with the corresponding class label. We propose to use spectral-spatial classifiers at the preliminary step of the marker selection procedure, each of them combining the results of a pixel-wise classification and a segmentation map. Different segmentation methods based on dissimilar principles lead to different classification results. Furthermore, a minimum spanning forest is built, where each tree is rooted on a classification-driven marker and forms a region in the spectral-spatial classification map. Experimental results are presented for two hyperspectral airborne images. The proposed method significantly improves classification accuracies when compared to previously proposed classification techniques.
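The unanimous-agreement marker selection rule reduces to a comparison over stacked label maps; a small numpy sketch with toy label maps (the labels and image size are invented for illustration):

```python
import numpy as np

# Label maps from three independent classifiers over a 4x4 image
# (toy class labels 0-2; real maps would come from the
# spectral-spatial classifiers described in the abstract).
maps = np.array([
    [[0, 0, 1, 1], [0, 0, 1, 1], [2, 2, 1, 1], [2, 2, 2, 1]],
    [[0, 0, 1, 1], [0, 1, 1, 1], [2, 2, 1, 1], [2, 2, 2, 2]],
    [[0, 0, 1, 1], [0, 0, 1, 1], [2, 0, 1, 1], [2, 2, 2, 1]],
])

agree = (maps == maps[0]).all(axis=0)   # unanimous agreement mask
markers = np.where(agree, maps[0], -1)  # -1 marks undecided pixels
print(markers)
```

The undecided (-1) pixels are exactly the ones later labeled by growing the minimum spanning forest from the agreed markers.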

  13. Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.

    Science.gov (United States)

    Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel

    2017-06-01

    Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.

  14. An Efficient Audio Classification Approach Based on Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Lhoucine Bahatti

    2016-05-01

    Full Text Available In order to achieve an audio classification aimed at identifying the composer, the use of adequate and relevant features is important to improve performance, especially when the classification algorithm is based on support vector machines. As opposed to conventional approaches that often use timbral features based on a time-frequency representation of the musical signal with a constant window, this paper deals with a new audio classification method which improves feature extraction using the Constant Q Transform (CQT) approach and includes original audio features related to the musical context in which the notes appear. A further contribution of this work is an optimal feature selection procedure which combines filter and wrapper strategies. Experimental results show the accuracy and efficiency of the adopted approach in binary as well as multi-class classification.
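A compressed sketch of the filter-plus-classifier idea: a univariate filter feeding an SVM. The features below are synthetic stand-ins for CQT-derived descriptors, and the single filter stage only echoes, not reproduces, the paper's combined filter and wrapper selection.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
# Hypothetical CQT-derived features for two composers (binary case).
n, p = 120, 40
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, p))
X[:, :4] += y[:, None]        # a few informative dimensions

# Filter stage (univariate F-test) keeps the 8 most relevant
# features before the RBF-kernel SVM sees them.
model = make_pipeline(SelectKBest(f_classif, k=8), SVC(kernel="rbf"))
acc = cross_val_score(model, X, y, cv=5).mean()
print(f"accuracy: {acc:.2f}")
```

Putting the selector inside the pipeline matters: it is refit on each training fold, so the filter never sees the fold's test data.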

  15. Transparent electrodes in the terahertz regime – a new approach

    DEFF Research Database (Denmark)

    Malureanu, Radu; Song, Z.; Zalkovskij, Maksim

    We suggest a new possibility for obtaining a transparent metallic film, thus allowing for completely transparent electrodes. By placing a complementary composite layer on top of the electrode, we can cancel the back-scattering of the latter, thus obtaining a perfectly transparent structure. For ease of fabrication, we performed the first experiments in the THz regime, but the concept is applicable to the entire electromagnetic spectrum. We show that experiment and theory match perfectly.

  16. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  17. Graduates employment classification using data mining approach

    Science.gov (United States)

    Aziz, Mohd Tajul Rizal Ab; Yusof, Yuhanis

    2016-08-01

    Data mining is a platform to extract hidden knowledge from a collection of data. This study investigates a suitable classification model to classify graduate employment for one of the MARA Professional Colleges (KPM) in Malaysia. The aim is to classify graduates as employed, unemployed or in further study. Five data mining algorithms offered in WEKA were used: Naïve Bayes, Logistic regression, Multilayer perceptron, k-nearest neighbor and Decision tree J48. Based on the results obtained, Logistic regression produces the highest classification accuracy, 92.5%, achieved using 80% of the data for training and 20% for testing. The produced classification model will benefit the management of the college as it provides insight into the quality of graduates they produce and how the curriculum can be improved to meet the needs of industry.
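The five-algorithm comparison can be mirrored with scikit-learn equivalents (J48 corresponds roughly to a decision tree learner). The graduate records below are synthetic, so the printed accuracies have no relation to the paper's 92.5%.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
# Toy stand-in for graduate records: 3 classes
# (employed / unemployed / further study), 6 numeric attributes.
n = 300
y = rng.integers(0, 3, size=n)
X = rng.normal(size=(n, 6)) + np.eye(3)[y] @ rng.normal(size=(3, 6))

# Same 80/20 split as in the study.
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.2, random_state=0)
models = {
    "Naive Bayes": GaussianNB(),
    "Logistic regression": LogisticRegression(max_iter=1000),
    "Multilayer perceptron": MLPClassifier(max_iter=2000, random_state=0),
    "k-nearest neighbor": KNeighborsClassifier(),
    "Decision tree (J48-like)": DecisionTreeClassifier(random_state=0),
}
for name, m in models.items():
    print(f"{name}: {m.fit(Xtr, ytr).score(Xte, yte):.3f}")
```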

  18. Eating Disorder Diagnoses: Empirical Approaches to Classification

    Science.gov (United States)

    Wonderlich, Stephen A.; Joiner, Thomas E., Jr.; Keel, Pamela K.; Williamson, Donald A.; Crosby, Ross D.

    2007-01-01

    Decisions about the classification of eating disorders have significant scientific and clinical implications. The eating disorder diagnoses in the Diagnostic and Statistical Manual of Mental Disorders (4th ed.; DSM-IV; American Psychiatric Association, 1994) reflect the collective wisdom of experts in the field but are frequently not supported in…

  19. Toward a common classification approach for biorefinery systems

    NARCIS (Netherlands)

    Cherubini, F.; Jungmeier, G.; Wellisch, M.; Wilke, T.; Skiadas, I.; Ree, van R.; Jong, de E.

    2009-01-01

    This paper deals with a biorefinery classification approach developed within International Energy Agency (IEA) Bioenergy Task 42. Since production of transportation biofuels is seen as the driving force for future biorefinery developments, a selection of the most interesting transportation biofuels

  1. A Psycholinguistic Approach to the Classification, Evaluation and ...

    African Journals Online (AJOL)

    KATEVG

    clinical practice, such groups of patients are then subject to similar assessment and ... Phonetic rules are applied to generate the phonetic level .... recently, a linguistic approach toward classification has been adopted, in which the clinical.

  2. Analysis of Kernel Approach in Fuzzy-Based Image Classifications

    Directory of Open Access Journals (Sweden)

    Mragank Singhal

    2013-03-01

    Full Text Available This paper presents a framework for the kernel approach in the field of fuzzy-based image classification in remote sensing. The goal of image classification is to separate images according to their visual content into two or more disjoint classes. Fuzzy logic is a relatively young theory. A major advantage of this theory is that it allows the natural description, in linguistic terms, of problems that should be solved, rather than in terms of relationships between precise numerical values. This paper describes how remote sensing data with uncertainty are handled by fuzzy-based classification using a kernel approach for land use/land cover map generation. The introduction of fuzzification using a kernel approach provides the basis for the development of more robust approaches to the remote sensing classification problem. The kernel explicitly defines a similarity measure between two samples and implicitly represents the mapping of the input space to the feature space.
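The closing point (an explicit similarity that implicitly corresponds to a feature-space mapping) can be made concrete with a Gaussian (RBF) kernel. The pixel reflectances and the gamma value below are hypothetical.

```python
import numpy as np

def rbf_kernel(a, b, gamma=5.0):
    """Explicit similarity K(a, b) = exp(-gamma * ||a - b||^2),
    which implicitly equals an inner product in a (here never
    constructed) high-dimensional feature space."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

# Two spectrally similar pixels and one dissimilar pixel
# (toy 3-band reflectances, hypothetical values).
p1 = np.array([0.20, 0.35, 0.50])
p2 = np.array([0.22, 0.33, 0.52])
p3 = np.array([0.80, 0.10, 0.05])

print(round(rbf_kernel(p1, p2), 3))  # → 0.994 (high similarity)
print(round(rbf_kernel(p1, p3), 3))  # → 0.044 (low similarity)
```

In a kernelized fuzzy classifier, memberships would be computed from these similarities instead of from raw Euclidean distances.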

  3. A new classification of large-scale climate regimes around the Tibetan Plateau based on seasonal circulation patterns

    Directory of Open Access Journals (Sweden)

    Xin-Gang Dai

    2017-03-01

    Full Text Available This study aims to develop a large-scale climate classification for investigating the characteristics of the climate regimes around the Tibetan Plateau based on seasonal precipitation, moisture transport and moisture divergence, using in situ observations and ERA40 reanalysis data. The results indicate that the climate around the Plateau can be attributed to four regimes. They are situated in East Asia, South Asia, Central Asia and the semi-arid zone in northern Central Asia throughout the dryland of northwestern China, in addition to the Köppen climate classification. There are different collocations of seasonal temperature and precipitation: (1) in phase for the East and South Asia monsoon regimes, (2) anti-phase for the Central Asia regime, and (3) out-of-phase for the westerly regime. The seasonal precipitation concentrations are coupled with moisture divergence, i.e., moisture convergence coincides with the Asian monsoon zone while divergence appears over the Mediterranean-like arid climate region and the westerly-controlled area in the warm season, with the pattern reversing in the cold season. In addition, moisture divergence is associated with meridional moisture transport: northward/southward moisture transport corresponds to moisture convergence/divergence, indicating that the wet and dry seasons are, to a great extent, dominated by meridional moisture transport in these regions. The climate-mean southward transport results in the dry-cold season of the Asian monsoon zone and the dry-warm season, leading to desertification or land degradation in Central Asia and the westerly regime zone. Mean-wind moisture transport (MMT) is the major contributor to total moisture transport, while persistent northward transient eddy moisture transport (TEMT) plays a key role in dry-season precipitation, especially in the Asian monsoon zone. The persistent TEMT divergence is an additional mechanism of the out-of-phase collocation in the westerly regime zone. In addition

  4. Non-Linear Dynamical Classification of Short Time Series of the Rössler System in High Noise Regimes

    Science.gov (United States)

    Lainscsek, Claudia; Weyhenmeyer, Jonathan; Hernandez, Manuel E.; Poizner, Howard; Sejnowski, Terrence J.

    2013-01-01

    Time series analysis with delay differential equations (DDEs) reveals non-linear properties of the underlying dynamical system and can serve as a non-linear time-domain classification tool. Here global DDE models were used to analyze short segments of simulated time series from a known dynamical system, the Rössler system, in high noise regimes. In a companion paper, we apply the DDE model developed here to classify short segments of encephalographic (EEG) data recorded from patients with Parkinson’s disease and healthy subjects. Nine simulated subjects in each of two distinct classes were generated by varying the bifurcation parameter b and keeping the other two parameters (a and c) of the Rössler system fixed. All choices of b were in the chaotic parameter range. We diluted the simulated data using white noise ranging from 10 to −30 dB signal-to-noise ratios (SNR). Structure selection was supervised by selecting the number of terms, delays, and order of non-linearity of the DDE model that best linearly separated the two classes of data. The distance d from the linear dividing hyperplane was then used to assess the classification performance by computing the area A′ under the ROC curve. The selected model was tested on untrained data using repeated random sub-sampling validation. DDEs were able to accurately distinguish the two dynamical conditions and, moreover, to quantify the changes in the dynamics. There was a significant correlation between the dynamical bifurcation parameter b of the simulated data and the classification parameter d from our analysis. This correlation still held for new simulated subjects with new dynamical parameters selected from each of the two dynamical regimes. Furthermore, the correlation was robust to added noise, being significant even when the noise was greater than the signal. We conclude that DDE models may be used as a generalizable and reliable classification tool for even small segments of noisy data.

  5. Non-linear Dynamical Classification of Short Time Series of the Rössler System in High Noise Regimes

    Directory of Open Access Journals (Sweden)

    Claudia eLainscsek

    2013-11-01

    Full Text Available Time series analysis with delay differential equations (DDEs) reveals nonlinear properties of the underlying dynamical system and can serve as a non-linear time-domain classification tool. Here global DDE models were used to analyze short segments of simulated time series from a known dynamical system, the Rössler system, in high noise regimes. In a companion paper, we apply the DDE model developed here to classify short segments of encephalographic (EEG) data recorded from patients with Parkinson's disease and healthy subjects. Nine simulated subjects in each of two distinct classes were generated by varying the bifurcation parameter b and keeping the other two parameters (a and c) of the Rössler system fixed. All choices of b were in the chaotic parameter range. We diluted the simulated data using white noise ranging from 10 dB to -30 dB signal-to-noise ratios (SNR). Structure selection was supervised by selecting the number of terms, delays, and order of nonlinearity of the DDE model that best linearly separated the two classes of data. The distance d from the linear dividing hyperplane was then used to assess the classification performance by computing the area A' under the ROC curve. The selected model was tested on untrained data using repeated random sub-sampling validation. DDEs were able to accurately distinguish the two dynamical conditions and, moreover, to quantify the changes in the dynamics. There was a significant correlation between the dynamical bifurcation parameter b of the simulated data and the classification parameter d from our analysis. This correlation still held for new simulated subjects with new dynamical parameters selected from each of the two dynamical regimes. Furthermore, the correlation was robust to added noise, being significant even when the noise was greater than the signal. We conclude that DDE models may be used as a generalizable and reliable classification tool for even small segments of noisy data.
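The data-generation step common to both records (integrate the Rössler system in the chaotic range, then dilute with white noise to a target SNR) can be sketched as follows. The parameters use the standard chaotic values (a = 0.2, b = 0.2, c = 5.7), whereas the study varied b; the SNR dilution uses the usual power-ratio definition.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rossler(t, xyz, a=0.2, b=0.2, c=5.7):
    """Standard Rössler equations (the study varied b)."""
    x, y, z = xyz
    return [-y - z, x + a * y, b + z * (x - c)]

# Integrate, discard a transient, keep a segment of x(t).
t_eval = np.linspace(0, 300, 30000)
sol = solve_ivp(rossler, (0, 300), [1.0, 1.0, 0.0], t_eval=t_eval,
                rtol=1e-8, atol=1e-8)
x = sol.y[0][10000:]

def dilute(signal, snr_db, seed=0):
    """Add white noise so that 10*log10(P_signal/P_noise) = snr_db."""
    rng = np.random.default_rng(seed)
    p_signal = np.var(signal)
    p_noise = p_signal / 10 ** (snr_db / 10)
    return signal + rng.normal(scale=np.sqrt(p_noise), size=signal.size)

noisy = dilute(x, -10.0)  # noise power 10x signal power
print(x.std(), noisy.std())
```

At -10 dB the noise variance is ten times the signal variance, which is the "noise greater than the signal" regime the abstracts refer to.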

  6. An improved TF-IDF approach for text classification

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yun-tao; GONG Ling; WANG Yong-cheng

    2005-01-01

    This paper presents a new improved term frequency/inverse document frequency (TF-IDF) approach which uses confidence, support and characteristic words to enhance the recall and precision of text classification. Synonyms defined by a lexicon are processed in the improved TF-IDF approach. We discuss and analyze in detail the relationship among confidence, recall and precision. Experiments on science and technology documents gave promising results: the new TF-IDF approach improves the precision and recall of text classification compared with the conventional TF-IDF approach.
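The conventional TF-IDF baseline that the paper improves on can be computed directly; a minimal sketch on a toy corpus, without the confidence, support, or characteristic-word extensions:

```python
import math
from collections import Counter

docs = [
    "neural networks improve text classification",
    "text classification with support vector machines",
    "networks of sensors transmit data",
]

def tf_idf(term, doc_tokens, all_docs):
    """Classic TF-IDF: term frequency times log inverse document frequency."""
    tf = Counter(doc_tokens)[term] / len(doc_tokens)
    df = sum(1 for d in all_docs if term in d.split())
    idf = math.log(len(all_docs) / df)
    return tf * idf

tokens = docs[0].split()
print(round(tf_idf("text", tokens, docs), 3))    # → 0.081
print(round(tf_idf("neural", tokens, docs), 3))  # → 0.22
```

"neural" outscores "text" because it appears in only one document: rarity across the collection is exactly what IDF rewards.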

  7. Classification approach based on association rules mining for unbalanced data

    CERN Document Server

    Ndour, Cheikh

    2012-01-01

    This paper deals with supervised classification when the response variable is binary and its class distribution is unbalanced. In such a situation, it is not possible to build a powerful classifier using standard methods such as logistic regression, classification trees, discriminant analysis, etc. To overcome this shortcoming of these methods, which provide classifiers with low sensitivity, we tackle the classification problem through an approach based on association rule learning, because this approach has the advantage of allowing the identification of the patterns that are well correlated with the target class. Association rule learning is a well-known method in the area of data mining, used on large databases for unsupervised discovery of local patterns that express hidden relationships between variables. In considering association rules from a supervised learning point of view, a relevant set of weak classifiers is obtained from which one derives a classification rule...
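The support and confidence statistics that association rule learning builds on are easy to compute directly; a pure-Python sketch on a toy unbalanced dataset, where the rule of interest predicts the minority class:

```python
# Toy transactions: attribute sets plus a binary class that is
# heavily unbalanced (only two "1" cases), as in the paper's setting.
data = [
    ({"a", "b"}, 0), ({"a"}, 0), ({"b", "c"}, 0), ({"a", "c"}, 0),
    ({"c"}, 0), ({"a", "b"}, 0), ({"b"}, 0), ({"a", "c"}, 0),
    ({"b", "c"}, 1), ({"b", "c"}, 1),
]

def rule_stats(antecedent, target, data):
    """Support and confidence of the rule: antecedent -> class == target."""
    covered = [cls for items, cls in data if antecedent <= items]
    support = len(covered) / len(data)
    confidence = covered.count(target) / len(covered) if covered else 0.0
    return support, confidence

print(rule_stats({"b", "c"}, 1, data))  # support 0.3, confidence 2/3
```

A rule like {b, c} -> 1 can have high confidence even though class 1 is rare overall, which is why rule-based classifiers retain sensitivity on unbalanced data.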

  8. Precipitation regime classification for the Mojave Desert: Implications for Fire Occurrence

    Energy Technology Data Exchange (ETDEWEB)

    Tagestad, Jerry D.; Brooks, Matthew L.; Cullinan, Valerie I.; Downs, Janelle L.; McKinley, Randy

    2016-01-05

    Mojave Desert ecosystem processes depend on the amount and seasonality of precipitation. Multi-decadal periods of drought or above-average rainfall affect landscape vegetation condition, biomass and susceptibility to fire. The seasonality of precipitation events can also affect the likelihood of lightning, a key ignition source for fires. To develop an understanding of precipitation regimes and fire patterns, we used monthly average precipitation data and GIS data representing areas burned from 1971 to 2010. We applied a K-means cluster analysis to the monthly precipitation data, identifying three distinct precipitation seasons: winter (October-March), spring (April-June) and summer (July-September), and four discrete precipitation regimes within the Mojave ecoregion.
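The K-means step can be sketched on synthetic monthly climatologies. The station profiles below (winter-dominant vs summer-dominant) and the two-cluster setup are illustrative, not the study's four Mojave regimes.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)

# Hypothetical monthly precipitation climatologies (mm, Jan-Dec) for
# stations with winter-dominant vs summer-dominant regimes.
winter = np.array([30, 28, 22, 12, 6, 2, 2, 4, 6, 12, 20, 28], float)
summer = np.array([4, 4, 6, 8, 10, 14, 30, 32, 20, 8, 5, 4], float)
stations = np.vstack([winter + rng.normal(0, 2, 12) for _ in range(15)] +
                     [summer + rng.normal(0, 2, 12) for _ in range(15)])

# Normalizing each station by its annual total makes the clustering
# respond to seasonality rather than to total precipitation amount.
profiles = stations / stations.sum(axis=1, keepdims=True)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profiles)
print(labels)
```

Clustering raw amounts instead of normalized profiles would instead group stations by wetness, which is a different (and here unwanted) regime definition.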

  9. Inguinal hernia recurrence: Classification and approach

    Directory of Open Access Journals (Sweden)

    Campanelli Giampiero

    2006-01-01

    Full Text Available The authors reviewed the records of 2,468 operations for groin hernia in 2,350 patients, including 277 recurrent hernias, updated to January 2005. The data obtained - evaluating technique, results and complications - were used to propose a simple anatomo-clinical classification into three types which could be used to plan the surgical strategy:
    Type R1: first recurrence 'high,' oblique external, reducible hernia with a small (<2 cm) defect in non-obese patients, after pure tissue or mesh repair.
    Type R2: first recurrence 'low,' direct, reducible hernia with a small (<2 cm) defect in non-obese patients, after pure tissue or mesh repair.
    Type R3: all other recurrences, including femoral recurrences; recurrent groin hernias with a big defect (inguinal eventration); multirecurrent hernias; nonreducible hernias; those linked with a contralateral primitive or recurrent hernia; and situations compromised by aggravating factors (for example obesity) or otherwise not easily included in R1 or R2, after pure tissue or mesh repair.

  10. Discovering Fuzzy Censored Classification Rules (FCCRs): A Genetic Algorithm Approach

    Directory of Open Access Journals (Sweden)

    Renu Bala

    2012-08-01

    Full Text Available Classification Rules (CRs) are often discovered in the form of 'If-Then' Production Rules (PRs). PRs, being high-level symbolic rules, are comprehensible and easy to implement. However, they are not capable of dealing with cognitive uncertainties like vagueness and ambiguity, which are imperative to real-world decision making situations. Fuzzy Classification Rules (FCRs) based on fuzzy logic provide a framework for flexible human-like reasoning involving linguistic variables. Moreover, a classification system consisting of simple 'If-Then' rules is not competent in handling exceptional circumstances. In this paper, we propose a Genetic Algorithm approach to discover Fuzzy Censored Classification Rules (FCCRs). An FCCR is a Fuzzy Classification Rule (FCR) augmented with censors. Here, censors are exceptional conditions in which the behaviour of a rule gets modified. The proposed algorithm works in two phases. In the first phase, the Genetic Algorithm discovers Fuzzy Classification Rules. Subsequently, these Fuzzy Classification Rules are mutated to produce FCCRs in the second phase. Appropriate encoding schemes, fitness functions and genetic operators are designed for the discovery of FCCRs. The proposed approach for discovering FCCRs is then illustrated on a synthetic dataset.

  11. Optimal and Sustainable Exchange Rate Regimes: A Simple Game-Theoretic Approach

    OpenAIRE

    Masahiro Kawai

    1992-01-01

    This paper examines the question of how to design an optimal and sustainable exchange rate regime in a world economy of two interdependent countries. It develops a Barro-Gordon type two-country model and compares noncooperative equilibria under different assumptions of monetary policy credibility and different exchange rate regimes. Using a two-stage game approach to the strategic choice of policy instruments, it identifies optimal (in a Pareto sense) and sustainable (self-enforcing) exchang...

  12. A Hybrid Sensing Approach for Pure and Adulterated Honey Classification

    Directory of Open Access Journals (Sweden)

    Ammar Zakaria

    2012-10-01

    Full Text Available This paper presents a comparison between data from single-modality and fusion methods to classify Tualang honey as pure or adulterated using Linear Discriminant Analysis (LDA) and Principal Component Analysis (PCA) statistical classification approaches. Ten different brands of certified pure Tualang honey were obtained throughout peninsular Malaysia and Sumatera, Indonesia. Various concentrations of two types of sugar solution (beet and cane sugar) were used in this investigation to create honey samples of 20%, 40%, 60% and 80% adulteration concentrations. Honey data extracted from an electronic nose (e-nose) and Fourier Transform Infrared Spectroscopy (FTIR) were gathered, analyzed and compared based on fusion methods. Visual observation of classification plots revealed that the PCA approach was able to distinguish pure and adulterated honey samples better than the LDA technique. Overall, the validated classification results based on FTIR data (88.0%) gave higher classification accuracy than e-nose data (76.5%) using the LDA technique. Honey classification based on normalized low-level and intermediate-level FTIR and e-nose fusion data scored classification accuracies of 92.2% and 88.7%, respectively, using the Stepwise LDA method. The results suggested that pure and adulterated honey samples were better classified using FTIR and e-nose fusion data than single-modality data.
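
The low-level fusion step described above - normalize each modality's feature vector, then concatenate before classification - can be sketched as follows. The feature values are synthetic and a nearest-centroid classifier stands in for the paper's LDA/PCA methods:

```python
import numpy as np

def normalize(v):
    """Zero-mean, unit-variance scaling of one modality's feature vector."""
    v = np.asarray(v, dtype=float)
    return (v - v.mean()) / (v.std() + 1e-12)

def fuse(enose, ftir):
    """Low-level fusion: normalize each modality separately, then concatenate."""
    return np.concatenate([normalize(enose), normalize(ftir)])

def nearest_centroid(train_X, train_y, x):
    """Classify x by the label of the closest class centroid."""
    labels = sorted(set(train_y))
    centroids = {c: np.mean([xi for xi, yi in zip(train_X, train_y) if yi == c], axis=0)
                 for c in labels}
    return min(labels, key=lambda c: np.linalg.norm(x - centroids[c]))
```

Per-modality normalization before concatenation keeps one sensor's scale from dominating the fused vector, which is the usual motivation for low-level fusion.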

  13. A Hybrid Ensemble Learning Approach to Star-Galaxy Classification

    CERN Document Server

    Kim, Edward J; Kind, Matias Carrasco

    2015-01-01

    There exist a variety of star-galaxy classification techniques, each with their own strengths and weaknesses. In this paper, we present a novel meta-classification framework that combines and fully exploits different techniques to produce a more robust star-galaxy classification. To demonstrate this hybrid, ensemble approach, we combine a purely morphological classifier, a supervised machine learning method based on random forest, an unsupervised machine learning method based on self-organizing maps, and a hierarchical Bayesian template fitting method. Using data from the CFHTLenS survey, we consider different scenarios: when a high-quality training set is available with spectroscopic labels from DEEP2, SDSS, VIPERS, and VVDS, and when the demographics of sources in a low-quality training set do not match the demographics of objects in the test data set. We demonstrate that our Bayesian combination technique improves the overall performance over any individual classification method in these scenarios. Thus, s...

  14. Classification of Simple Oxides: A Polarizability Approach

    Science.gov (United States)

    Dimitrov, Vesselin; Komatsu, Takayuki

    2002-01-01

    A simple oxide classification has been proposed on the basis of correlation between electronic polarizabilities of the ions and their binding energies determined by XPS. Three groups of oxides have been considered taking into account the values obtained on refractive-index- or energy-gap-based oxide ion polarizability, cation polarizability, optical basicity, O 1s binding energy, metal (or nonmetal) binding energy, and Yamashita-Kurosawa's interaction parameter of the oxides. The group of semicovalent predominantly acidic oxides includes BeO, B2O3, P2O5, SiO2, Al2O3, GeO2, and Ga2O3 with low oxide ion polarizability, high O 1s binding energy, low cation polarizability, high metal (or nonmetal) outermost binding energy, comparatively low optical basicity, and strong interionic interaction, leading to the formation of strong covalent bonds. Some main group oxides so-called ionic or basic such as CaO, In2O3, SnO2, and TeO2 and most transition metal oxides show relatively high oxide ion polarizability, O 1s binding energy in a very narrow medium range, high cation polarizability, and low metal (or nonmetal) binding energy. Their optical basicity varies in a narrow range and it is close to that of CaO. The group of very ionic or very basic oxides includes CdO, SrO, and BaO as well as PbO, Sb2O3, and Bi2O3, which possess very high oxide ion polarizability, low O 1s binding energy, very high cation polarizability, and very low metal (or nonmetal) binding energy. Their optical basicity is higher than that of CaO and the interionic interaction is very weak, giving rise to the formation of very ionic chemical bonds.

  15. Paving the Way for an Institutional Approach towards an Ethical Migration Regime

    Directory of Open Access Journals (Sweden)

    Johan Rochel

    2013-08-01

    Full Text Available In thinking about moral principles for an international regime on migration, international lawyers and political theorists wishing to provide practical guidance should adopt a specific methodological approach suitable for international institutions. This paper proposes a methodological tool entitled “normative reflexive dialogue” to support theorists in dealing with the current institutional realities while developing and justifying moral principles that international institutions should follow. After describing the basic features of this approach, which links legal analysis with moral reasoning, GATS Mode 4 will be used as an example of a methodological approach to generating some substantive moral principles for a migration regime.

  16. Approaches to Substance of Social Infrastructure and to Its Classification

    Directory of Open Access Journals (Sweden)

    Kyrychenko Sergiy О.

    2016-03-01

    Full Text Available The article is concerned with studying and analyzing approaches to both the substance and the classification of social infrastructure objects as a specific constellation of subsystems and components. To address this purpose, the following tasks have been formulated: analysis of existing methods for classifying social infrastructure; classification of the branches of social infrastructure using a functional-dedicated approach; and formulation of the author's own definition of the substance of social infrastructure. It has been determined that, to date, social infrastructure classifications are most often carried out depending on functional tasks, although there are other approaches to classification. The author's definition of the substance of social infrastructure has been formulated as follows: social infrastructure is a body of economic branches (public utilities, management, public safety and environment, socio-economic services), the purpose of which is to impact the reproductive potential and overall conditions of human activity in the spheres of work, everyday living, family, social-political, spiritual and intellectual development, as well as life activity.

  17. Toward a common classification approach for biorefinery systems

    DEFF Research Database (Denmark)

    Cherubini, Francesco; Jungmeier, Gerfried; Wellisch, Maria

    2009-01-01

    until 2020 is based on their characteristics to be mixed with gasoline, diesel and natural gas, reflecting the main advantage of using the already-existing infrastructure for easier market introduction. This classification approach relies on four main features: (1) platforms; (2) products; (3) feedstock...

  18. Diagnosing Unemployment: The 'Classification' Approach to Multiple Causation

    NARCIS (Netherlands)

    Rodenburg, P.

    2002-01-01

    The establishment of appropriate policy measures for fighting unemployment has always been difficult since causes of unemployment are hard to identify. This paper analyses an approach used mainly in the 1960s and 1970s in economics, in which classification is used as a way to deal with such a

  19. About decomposition approach for solving the classification problem

    Science.gov (United States)

    Andrianova, A. A.

    2016-11-01

    This article describes the features of applying an algorithm that uses decomposition methods to solve the binary classification problem of constructing a linear classifier based on the Support Vector Machine method. Decomposition reduces the volume of calculations, in particular by opening up possibilities to build parallel versions of the algorithm, which is a very important advantage for solving problems with big data. The results of computational experiments conducted using the decomposition approach are analysed. The experiments use a known data set for the binary classification problem.
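
The working-set idea behind decomposition can be illustrated with the simplest possible case: dual coordinate descent for a linear SVM, where each step optimizes a single dual variable while all others stay fixed. This is a bare sketch on toy data, without the shrinking or working-set-selection heuristics a real solver would use, and it is not the paper's specific algorithm:

```python
def train_svm_decomposition(X, y, C=1.0, epochs=50):
    """Dual coordinate descent for a linear SVM: each step solves the
    subproblem for a working set of one alpha_i, holding the rest fixed -
    the smallest instance of the decomposition idea."""
    n, d = len(X), len(X[0])
    w = [0.0] * d            # primal weights maintained as w = sum alpha_i y_i x_i
    alpha = [0.0] * n
    for _ in range(epochs):
        for i in range(n):
            xi, yi = X[i], y[i]
            g = yi * sum(wj * xj for wj, xj in zip(w, xi)) - 1.0  # dual gradient
            q = sum(xj * xj for xj in xi)                         # curvature x_i . x_i
            if q == 0:
                continue
            new_a = min(max(alpha[i] - g / q, 0.0), C)            # clipped Newton step
            delta = new_a - alpha[i]
            if delta:
                alpha[i] = new_a
                w = [wj + delta * yi * xj for wj, xj in zip(w, xi)]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1
```

Because each update touches only one alpha, the data can be partitioned and the subproblems scheduled independently, which is where the parallelization advantage mentioned in the abstract comes from.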

  20. Knowledge-based approach to video content classification

    Science.gov (United States)

    Chen, Yu; Wong, Edward K.

    2001-01-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.
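
A rule base with MYCIN-style certainty-factor combination can be sketched as follows. The rules, feature names and CF values here are invented for illustration and are not the paper's CLIPS rule set:

```python
def combine_cf(cf1, cf2):
    """MYCIN-style combination of two positive certainty factors."""
    return cf1 + cf2 * (1 - cf1)

def classify_video(features):
    """Toy rule base: each rule maps feature evidence to a (class, CF) pair;
    evidence for the same class is accumulated with combine_cf."""
    rules = [
        (lambda f: f["text_overlay"] and f["low_motion"], "news", 0.7),
        (lambda f: f["dominant_green"] and f["high_motion"], "football", 0.8),
        (lambda f: f["text_overlay"], "news", 0.4),
    ]
    scores = {}
    for condition, label, cf in rules:
        if condition(features):
            scores[label] = combine_cf(scores.get(label, 0.0), cf)
    return max(scores, key=scores.get) if scores else None
```

Note that `combine_cf` is commutative and bounded by 1, so several weak pieces of evidence for the same class reinforce each other without ever reaching certainty.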

  1. Modelling of eddy-current interaction with cracks in the thin-skin regime. Two approaches

    Energy Technology Data Exchange (ETDEWEB)

    Mastorchio, S. [Electricite de France, 78 - Chatou (France). Research and Development Div.; Harfield, N. [Surrey Univ. (United Kingdom). Dept. of Physics

    1998-02-01

    EDF uses the TRIFOU code for eddy-current testing modelling. This general electromagnetic code is being further adapted to Non-Destructive Testing applications, not only for nuclear NDT but also in other fields such as aeronautics. This paper compares experimental data for aluminium and steel specimens with two methods of solving the forward problem in the thin-skin regime. The first approach is a 3D Finite Element / Boundary Integral Element method (TRIFOU) developed by the EDF Research and Development Division (France). The second approach, specialized for the treatment of surface cracks in the thin-skin regime, was developed by the University of Surrey (England). In the thin-skin regime, the electromagnetic skin depth is small compared with the depth of the crack. Such conditions are common in tests on steels and sometimes on aluminium. (K.A.) 4 refs.

  2. STAR POLYMERS IN GOOD SOLVENTS FROM DILUTE TO CONCENTRATED REGIMES: CROSSOVER APPROACH

    Directory of Open Access Journals (Sweden)

    S.B.Kiselev

    2002-01-01

    Full Text Available An introduction is given to the crossover theory of the conformational and thermodynamic properties of star polymers in good solvents. The crossover theory is tested against Monte Carlo simulation data for the structure and thermodynamics of model star polymers. In good solvent conditions, star polymers approach a "universal" limit as N → ∞; however, there are two types of approach towards this limit. In the dilute regime, a critical degree of polymerization N* is found to play a similar role to the Ginzburg number in the crossover theory for critical phenomena in simple fluids. A rescaled penetration function is found to control the free energy of star polymer solutions in the dilute and semidilute regions. This equation of state captures the scaling behaviour of polymer solutions in the dilute/semidilute regimes and also performs well in the concentrated regimes, where the details of the monomer-monomer interactions become important.

  3. Surgical approach to impacted mandibular third molars--operative classification.

    Science.gov (United States)

    Abu-El Naaj, Imad; Braun, Refael; Leiser, Yoav; Peled, Micha

    2010-03-01

    The aim of the present study is to suggest a convenient way to classify the position of the impacted mandibular third molar relative to the mandibular canal and to suggest indications for the use of each surgical approach for mandibular third molar extraction. The presented new typing system, Third Molar Classification (TMC), is a simple and easy-to-apply method for the surgical management of mandibular third molars and can be extended for any ectopic or impacted mandibular tooth. There are 3 major types of third molar positions. The second type is subdivided further into 2 subtypes. In the present study, 9 patients with high-risk mandibular third molars were treated according to the present classification and are presented and discussed. Patients typed as TMC IIb were treated with a sagittal split osteotomy approach and patients typed as TMC III were treated with an extraoral approach. The operative classification was successfully implemented in very rare cases of deeply impacted mandibular third molars. In 3 of 9 cases (33%), minor complications, including some degree of hypoesthesia, occurred with the extraoral approach; these complications resolved spontaneously without the need for any intervention. The present study describes the use of a new surgical classification system for treatment planning in all types of mandibular third molar extractions. We believe that the present classification could help the oral and maxillofacial surgeon in decision-making and limit the possible risks that are present when attempting to extract impacted mandibular third molars. Copyright (c) 2010 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  4. An Adaptive Approach to Schema Classification for Data Warehouse Modeling

    Institute of Scientific and Technical Information of China (English)

    Hong-Ding Wang; Yun-Hai Tong; Shao-Hua Tan; Shi-Wei Tang; Dong-Qing Yang; Guo-Hui Sun

    2007-01-01

    Data warehouse (DW) modeling is a complicated task, involving both knowledge of business processes and familiarity with the structure and behavior of operational information systems. Existing DW modeling techniques suffer from the following major drawbacks: the data-driven approach requires high levels of expertise and neglects the requirements of end users, while the demand-driven approach lacks enterprise-wide vision and disregards existing models of the underlying operational systems. In order to make up for those shortcomings, a method of classifying schema elements for DW modeling is proposed in this paper. We first put forward the vector space models for subjects and schema elements, then present an adaptive approach with self-tuning theory to construct context vectors of subjects, and finally classify the source schema elements into the different subjects of the DW automatically. Benefiting from the results of the schema element classification, designers can model and construct a DW more easily.

  5. Artificial Neural Network Approach in Radar Target Classification

    Directory of Open Access Journals (Sweden)

    N. K. Ibrahim

    2009-01-01

    Full Text Available Problem statement: This study unveils the potential and utilization of Neural Networks (NNs) in radar applications for target classification. The radar system under test is a special one of its kind, known as Forward Scattering Radar (FSR). In this study the target is a ground vehicle, represented by typical public road transport. The features from the raw radar signal were extracted manually prior to the classification process using a Neural Network (NN). Features given to the proposed network model were identified through radar theoretical analysis. A Multi-Layer Perceptron (MLP) back-propagation neural network trained with three back-propagation algorithms was implemented and analyzed. In the NN classifier, the unknown target is sent to the network trained by the known targets to attain the accurate output. Approach: Two types of classification were analyzed. The first is to classify the exact type of vehicle; four vehicle types were selected. The second is to group vehicles into their categories. The proposed NN architecture is compared to the K-Nearest Neighbor (KNN) classifier and the performance is evaluated. Results: Based on the results, the proposed NN provides a higher percentage of successful classification than the KNN classifier. Conclusion/Recommendation: The results presented here show that NNs can be effectively employed in radar classification applications.

  6. Sustainability in product development: a proposal for classification of approaches

    Directory of Open Access Journals (Sweden)

    Patrícia Flores Magnago

    2012-06-01

    Full Text Available Product development is a process that addresses sustainability issues inside companies. Many approaches concerning sustainability have been discussed in academia, such as Natural Capitalism, Design for Environment (DfE) and Life Cycle Analysis (LCA), but a question arises: which is indicated for what circumstance? The aim of this article is to propose a classification, based on a literature review, of 15 of these approaches. The criteria were: (i) approach nature, (ii) organization level, (iii) integration level in the Product Development Process (PDP), and (iv) approach relevance for sustainability dimensions. Common terms allowed the establishment of connections among the approaches. As a result, the researchers concluded that, although the approaches come from distinct knowledge areas, they are not mutually exclusive; on the contrary, they may be used in a complementary way by managers. The combined use of complementary approaches is finally suggested in the paper.

  7. Classification using sparse representations: a biologically plausible approach.

    Science.gov (United States)

    Spratling, M W

    2014-02-01

    Representing signals as linear combinations of basis vectors sparsely selected from an overcomplete dictionary has proven to be advantageous for many applications in pattern recognition, machine learning, signal processing, and computer vision. While this approach was originally inspired by insights into cortical information processing, biologically plausible approaches have been limited to exploring the functionality of early sensory processing in the brain, while more practical applications have employed non-biologically plausible sparse coding algorithms. Here, a biologically plausible algorithm is proposed that can be applied to practical problems. This algorithm is evaluated using standard benchmark tasks in the domain of pattern classification, and its performance is compared to a wide range of alternative algorithms that are widely used in signal and image processing. The results show that for the classification tasks performed here, the proposed method is competitive with the best of the alternative algorithms that have been evaluated. This demonstrates that classification using sparse representations can be performed in a neurally plausible manner, and hence, that this mechanism of classification might be exploited by the brain.
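
The classification-by-sparse-representation idea can be sketched with a plain greedy matching pursuit standing in for the paper's biologically plausible sparse coding algorithm; the dictionary, labels and atom budget below are illustrative assumptions:

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=3):
    """Greedy sparse coding: at each step pick the dictionary atom (column)
    most correlated with the current residual. Columns assumed unit-norm."""
    residual = np.array(signal, dtype=float)
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_atoms):
        scores = dictionary.T @ residual
        k = int(np.argmax(np.abs(scores)))
        coeffs[k] += scores[k]
        residual = residual - scores[k] * dictionary[:, k]
    return coeffs

def classify_sparse(signal, dictionary, atom_labels, n_atoms=3):
    """Assign the class whose atoms carry the most energy in the sparse code."""
    coeffs = matching_pursuit(signal, dictionary, n_atoms)
    energy = {}
    for label, c in zip(atom_labels, coeffs):
        energy[label] = energy.get(label, 0.0) + c * c
    return max(energy, key=energy.get)
```

With class-labeled atoms, the classifier is simply "which class's atoms reconstruct the signal best", the common pattern behind sparse-representation classification.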

  8. Shape variability and classification of human hair: a worldwide approach.

    Science.gov (United States)

    De la Mettrie, Roland; Saint-Léger, Didier; Loussouarn, Geneviève; Garcel, Annelise; Porter, Crystal; Langaney, André

    2007-06-01

    Human hair has been commonly classified according to three conventional ethnic human subgroups, that is, African, Asian, and European. Such broad classification hardly accounts for the high complexity of human biological diversity, resulting from both multiple and past or recent mixed origins. The research reported here is intended to develop a more factual and scientific approach based on physical features of human hair. The aim of the study is dual: (1) to define hair types according to specific shape criteria through objective and simple measurements taken on hairs from 1442 subjects from 18 different countries and (2) to define such hair types without referring to human ethnicity. The driving principle is simple: Because hair can be found in many different human subgroups, defining a straight or a curly hair should provide a more objective approach than a debatable ethnicity-based classification. The proposed method is simple to use and requires the measurement of only three easily accessible descriptors of hair shape: curve diameter (CD), curl index (i), and number of waves (w). This method leads to a worldwide coherent classification of hair in eight well-defined categories. The new hair categories, as described, should be more appropriate and more reliable than conventional standards in cosmetic and forensic sciences. Furthermore, the classification can be useful for testing whether hair shape diversity follows the continuous geographic and historical pattern suggested for human genetic variation or presents major discontinuities between some large human subdivisions, as claimed by earlier classical anthropology.
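
A toy version of assigning a hair category from the three descriptors might look like the following. The scoring formula and every cut-off value are illustrative placeholders, not the published category boundaries:

```python
def hair_type(curve_diameter_mm, curl_index, n_waves):
    """Map the three shape descriptors - curve diameter (CD), curl index (i)
    and number of waves (w) - to a category 1 (type I, straightest) through
    8 (type VIII, curliest). All constants are hypothetical."""
    # Each term is scaled to [0, 1], where 1 means "straighter".
    cd_term = min(curve_diameter_mm, 80) / 80.0        # wide curves -> straighter
    ci_term = min(curl_index, 1.0)                     # curl index near 1 -> straight
    wave_term = 1.0 - min(n_waves, 10) / 10.0          # fewer waves -> straighter
    straightness = (cd_term + ci_term + wave_term) / 3.0
    return 8 - int(straightness * 7.999)               # bucket into eight categories
```

The point of the sketch is the paper's key claim: three objective, easily measured descriptors suffice to place any hair sample in one of eight ordered categories, with no reference to ethnicity.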

  9. ADHD classification using bag of words approach on network features

    Science.gov (United States)

    Solmaz, Berkan; Dey, Soumyabrata; Rao, A. Ravishankar; Shah, Mubarak

    2012-02-01

    Attention Deficit Hyperactivity Disorder (ADHD) is receiving a lot of attention nowadays, mainly because it is one of the more common brain disorders among children and not much is known about its cause. In this study, we propose a novel approach for automatic classification of ADHD-conditioned subjects and control subjects using functional Magnetic Resonance Imaging (fMRI) data of resting-state brains. For this purpose, we compute the correlation between every possible voxel pair within a subject over the time frame of the experimental protocol. A network of voxels is constructed by representing a high correlation value between any two voxels as an edge. A Bag-of-Words (BoW) approach is used to represent each subject as a histogram of network features, such as the number of degrees per voxel. The classification is done using a Support Vector Machine (SVM). We also investigate the use of raw intensity values in the time series for each voxel. Here, every subject is represented as a combined histogram of network and raw intensity features. Experimental results verified that the classification accuracy improves when the combined histogram is used. We tested our approach on a highly challenging dataset released by NITRC for the ADHD-200 competition and obtained promising results. The dataset not only has a large size but also includes subjects from different demographic and age groups. To the best of our knowledge, this is the first paper to propose the BoW approach for any functional brain disorder classification, and we believe that this approach will be useful in the analysis of many brain-related conditions.
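
The network-feature step can be sketched directly: threshold the voxel-voxel correlation matrix into a graph, read off node degrees, and histogram them into a BoW-style feature vector (toy time series; the correlation threshold and bin count are assumptions, not the paper's settings):

```python
import numpy as np

def degree_histogram(ts, corr_thresh=0.8, n_bins=5):
    """BoW-style feature for one subject: threshold the voxel correlation
    matrix into an (undirected) network, then histogram the node degrees.

    ts has shape (n_voxels, n_timepoints)."""
    ts = np.asarray(ts, dtype=float)
    corr = np.corrcoef(ts)                 # voxel-by-voxel correlation matrix
    np.fill_diagonal(corr, 0.0)            # ignore self-correlation
    adjacency = corr > corr_thresh         # edge = high correlation
    degrees = adjacency.sum(axis=1)        # one degree count per voxel
    hist, _ = np.histogram(degrees, bins=n_bins, range=(0, ts.shape[0]))
    return hist / hist.sum()               # normalized histogram ("bag of words")
```

Each subject's histogram would then be fed to the SVM; the paper additionally concatenates a histogram of raw intensity features.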

  10. Local fractal dimension based approaches for colonic polyp classification.

    Science.gov (United States)

    Häfner, Michael; Tamaki, Toru; Tanaka, Shinji; Uhl, Andreas; Wimmer, Georg; Yoshida, Shigeto

    2015-12-01

    This work introduces texture analysis methods that are based on computing the local fractal dimension (LFD; also called the local density function) and applies them to colonic polyp classification. The methods are tested on 8 HD-endoscopic image databases, where each database was acquired using a different imaging modality (Pentax's i-Scan technology combined with or without staining the mucosa), and on a zoom-endoscopic image database using narrow band imaging. In this paper, we present three novel extensions to an LFD based approach. These extensions additionally extract shape and/or gradient information from the image to enhance the discriminativity of the original approach. To compare the results of the LFD based approaches with the results of other approaches, five state-of-the-art approaches for colonic polyp classification are applied to the employed databases. Experiments show that LFD based approaches are well suited for colonic polyp classification, especially the three proposed extensions, which are the best performing methods, or at least among the best performing methods, for each of the employed databases. The methods are additionally tested by means of a public texture image database, the UIUCtex database. With this database, the viewpoint invariance of the methods is assessed, an important feature for the employed endoscopic image databases. Results imply that most of the LFD based methods are more viewpoint invariant than the other methods. However, the shape, size and orientation adapted LFD approaches (which are especially designed to enhance viewpoint invariance) are in general not more viewpoint invariant than the other LFD based approaches.
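
Fractal-dimension estimation of this kind can be illustrated with a global box-counting sketch. The paper's LFD is computed locally per pixel; this only shows the counting-and-slope idea on a small non-empty binary image with a power-of-two side:

```python
import numpy as np

def box_counting_dimension(img):
    """Estimate the box-counting dimension of a non-empty binary image:
    count occupied boxes at dyadic scales, then fit a line in log-log space."""
    img = np.asarray(img, dtype=bool)
    size = img.shape[0]                    # assume square, power-of-two side
    scales, counts = [], []
    s = size
    while s >= 1:
        n_boxes = 0
        for i in range(0, size, s):
            for j in range(0, size, s):
                if img[i:i + s, j:j + s].any():
                    n_boxes += 1
        scales.append(s)
        counts.append(n_boxes)
        s //= 2
    # slope of log(count) vs log(1/scale) is the dimension estimate
    slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
    return slope
```

A filled image yields a dimension near 2 and an isolated pixel near 0; texture methods exploit the values in between as discriminative features.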

  11. A wrapper-based approach to image segmentation and classification.

    Science.gov (United States)

    Farmer, Michael E; Jain, Anil K

    2005-12-01

    The traditional processing flow of segmentation followed by classification in computer vision assumes that the segmentation is able to successfully extract the object of interest from the background image. It is extremely difficult to obtain a reliable segmentation without any prior knowledge about the object that is being extracted from the scene. This is further complicated by the lack of any clearly defined metrics for evaluating the quality of segmentation or for comparing segmentation algorithms. We propose a method of segmentation that addresses both of these issues, by using the object classification subsystem as an integral part of the segmentation. This will provide contextual information regarding the objects to be segmented, as well as allow us to use the probability of correct classification as a metric to determine the quality of the segmentation. We view traditional segmentation as a filter operating on the image that is independent of the classifier, much like the filter methods for feature selection. We propose a new paradigm for segmentation and classification that follows the wrapper methods of feature selection. Our method wraps the segmentation and classification together, and uses the classification accuracy as the metric to determine the best segmentation. By using shape as the classification feature, we are able to develop a segmentation algorithm that relaxes the requirement that the object of interest to be segmented must be homogeneous in some low-level image parameter, such as texture, color, or grayscale. This represents an improvement over other segmentation methods that have used classification information only to modify the segmenter parameters, since these algorithms still require an underlying homogeneity in some parameter space. 
Rather than considering our method as yet another segmentation algorithm, we propose that our wrapper method can be considered as an image segmentation framework, within which existing image segmentation
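
The wrapper loop itself is simple to sketch: candidate segmentation parameters are scored by downstream classification accuracy rather than by any segmentation-only metric. The toy images, thresholding segmenter and area-based classifier below are hypothetical stand-ins:

```python
def wrapper_select_threshold(images, labels, thresholds, classify):
    """Wrapper paradigm: for each candidate segmentation parameter, segment
    every image and keep the parameter whose masks the classifier labels
    most accurately."""
    best_t, best_acc = None, -1.0
    for t in thresholds:
        masks = [[[1 if px > t else 0 for px in row] for row in img]
                 for img in images]
        acc = sum(classify(m) == y for m, y in zip(masks, labels)) / len(labels)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc
```

Any existing segmenter can be dropped into the inner loop, which is exactly the "framework" reading the authors propose: segmentation quality is defined by the classifier it serves.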

  12. Domain Adaptation for Opinion Classification: A Self-Training Approach

    Directory of Open Access Journals (Sweden)

    Yu, Ning

    2013-03-01

    Full Text Available Domain transfer is a widely recognized problem for machine learning algorithms because models built upon one data domain generally do not perform well in another data domain. This is especially a challenge for tasks such as opinion classification, which often has to deal with insufficient quantities of labeled data. This study investigates the feasibility of self-training in dealing with the domain transfer problem in opinion classification via leveraging labeled data in non-target data domain(s) and unlabeled data in the target domain. Specifically, self-training is evaluated for effectiveness in sparse data situations and feasibility for domain adaptation in opinion classification. Three types of Web content are tested: edited news articles, semi-structured movie reviews, and the informal and unstructured content of the blogosphere. Findings of this study suggest that, when there are limited labeled data, self-training is a promising approach for opinion classification, although the contributions vary across data domains. Significant improvement was demonstrated for the most challenging data domain - the blogosphere - when a domain transfer-based self-training strategy was implemented.
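
A generic self-training loop of the kind evaluated here can be sketched as follows, with a toy 1-D centroid classifier standing in for the real opinion classifier; the confidence formula and threshold are assumptions:

```python
def self_train(labeled, unlabeled, train, predict, threshold=0.6, max_rounds=10):
    """Self-training: train on the labeled pool, pseudo-label the unlabeled
    pool, promote confident predictions into the labeled pool, retrain."""
    labeled, unlabeled = list(labeled), list(unlabeled)
    model = train(labeled)
    for _ in range(max_rounds):
        keep, promoted = [], []
        for x in unlabeled:
            label, conf = predict(model, x)
            (promoted if conf >= threshold else keep).append((x, label) if conf >= threshold else x)
        if not promoted:
            break
        labeled += promoted
        unlabeled = keep
        model = train(labeled)
    return model

# Toy stand-in classifier: one centroid per class, confidence from the margin.
def train_centroids(pairs):
    groups = {}
    for x, y in pairs:
        groups.setdefault(y, []).append(x)
    return {y: sum(v) / len(v) for y, v in groups.items()}

def predict_centroid(model, x):
    dists = {y: abs(x - c) for y, c in model.items()}
    best = min(dists, key=dists.get)
    d = sorted(dists.values())
    conf = 1.0 if len(d) == 1 else (d[1] - d[0]) / (d[1] + d[0] + 1e-12)
    return best, conf
```

In the cross-domain setting described above, `labeled` would hold source-domain documents and `unlabeled` the target-domain documents; only confidently pseudo-labeled target examples migrate into the training pool.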

  13. A hybrid ensemble learning approach to star-galaxy classification

    Science.gov (United States)

    Kim, Edward J.; Brunner, Robert J.; Carrasco Kind, Matias

    2015-10-01

    There exist a variety of star-galaxy classification techniques, each with their own strengths and weaknesses. In this paper, we present a novel meta-classification framework that combines and fully exploits different techniques to produce a more robust star-galaxy classification. To demonstrate this hybrid, ensemble approach, we combine a purely morphological classifier, a supervised machine learning method based on random forest, an unsupervised machine learning method based on self-organizing maps, and a hierarchical Bayesian template-fitting method. Using data from the CFHTLenS survey (Canada-France-Hawaii Telescope Lensing Survey), we consider different scenarios: when a high-quality training set is available with spectroscopic labels from DEEP2 (Deep Extragalactic Evolutionary Probe Phase 2), SDSS (Sloan Digital Sky Survey), VIPERS (VIMOS Public Extragalactic Redshift Survey), and VVDS (VIMOS VLT Deep Survey), and when the demographics of sources in a low-quality training set do not match the demographics of objects in the test data set. We demonstrate that our Bayesian combination technique improves the overall performance over any individual classification method in these scenarios. Thus, strategies that combine the predictions of different classifiers may prove to be optimal in currently ongoing and forthcoming photometric surveys, such as the Dark Energy Survey and the Large Synoptic Survey Telescope.

  14. A Two Stage Classification Approach for Handwritten Devanagari Characters

    CERN Document Server

    Arora, Sandhya; Nasipuri, Mita; Malik, Latesh

    2010-01-01

    The paper presents a two-stage classification approach for handwritten Devanagari characters. The first stage uses structural properties like the shirorekha and spine of a character, and the second stage exploits some intersection features of characters, which are fed to a feedforward neural network. A simple histogram-based method does not work for finding the shirorekha or vertical bar (spine) in handwritten Devanagari characters, so we designed a differential-distance-based technique to find a near-straight line for the shirorekha and spine. This approach has been tested on 50,000 samples and we obtained 89.12% success

  15. Morphological Analysis as Classification: An Inductive-Learning Approach

    CERN Document Server

    Van den Bosch, A; Weijters, T; Bosch, Antal van den; Daelemans, Walter; Weijters, Ton

    1996-01-01

    Morphological analysis is an important subtask in text-to-speech conversion, hyphenation, and other language engineering tasks. The traditional approach to performing morphological analysis is to combine a morpheme lexicon, sets of (linguistic) rules, and heuristics to find a most probable analysis. In contrast, we present an inductive learning approach in which morphological analysis is reformulated as a segmentation task. We report on a number of experiments in which five inductive learning algorithms are applied to three variations of the task of morphological analysis. Results show (i) that the generalisation performance of the algorithms is good, and (ii) that the lazy learning algorithm IB1-IG performs best on all three tasks. We conclude that lazy learning of morphological analysis as a classification task is indeed a viable approach; moreover, it has the strong advantages over the traditional approach of avoiding the knowledge-acquisition bottleneck, being fast and deterministic in learning and process…

  16. Linking river flow regimes to riparian plant guilds: a community-wide modeling approach.

    Science.gov (United States)

    Lytle, David A; Merritt, David M; Tonkin, Jonathan D; Olden, Julian D; Reynolds, Lindsay V

    2017-06-01

    Modeling riparian plant dynamics along rivers is complicated by the fact that plants have different edaphic and hydrologic requirements at different life stages. With intensifying human demands for water and continued human alteration of rivers, there is a growing need for predicting responses of vegetation to flow alteration, including responses related to climate change and river flow management. We developed a coupled structured population model that combines stage-specific responses of plant guilds with specific attributes of river hydrologic regime. The model uses information on the vital rates of guilds as they relate to different hydrologic conditions (flood, drought, and baseflow), but deliberately omits biotic interactions from the structure (interaction neutral). Our intent was to (1) consolidate key vital rates concerning plant population dynamics and to incorporate these data into a quantitative framework, (2) determine whether complex plant stand dynamics, including biotic interactions, can be predicted from basic vital rates and river hydrology, and (3) project how altered flow regimes might affect riparian communities. We illustrated the approach using five flow-response guilds that encompass much of the river floodplain community: hydroriparian tree, xeroriparian shrub, hydroriparian shrub, mesoriparian meadow, and desert shrub. We also developed novel network-based tools for predicting community-wide effects of climate-driven shifts and deliberately altered flow regimes. The model recovered known patterns of hydroriparian tree vs. xeroriparian shrub dominance, including the relative proportion of these two guilds as a function of river flow modification. By simulating flow alteration scenarios ranging from increased drought to shifts in flood timing, the model predicted that mature hydroriparian forest should be most abundant near the observed natural flow regime. Multiguild sensitivity analysis identified substantial network connectivity (many

  17. Vessel-guided airway tree segmentation: A voxel classification approach

    DEFF Research Database (Denmark)

    Ashraf, Haseem; Pedersen, Jesper J H; Lo, Pechin Chien Pau;

    2010-01-01

    This paper presents a method for airway tree segmentation that uses a combination of a trained airway appearance model, vessel and airway orientation information, and region growing. We propose a voxel classification approach for the appearance model, which uses a classifier that is trained… The method is evaluated on 250 low-dose computed tomography images from a lung cancer screening trial. Our experiments showed that applying the region growing algorithm on the airway appearance model produces more complete airway segmentations, leading to on average 20% longer trees and 50% less leakage…

  18. Exploring the physical controls of regional patterns of flow duration curves - Part 3: A catchment classification system based on regime curve indicators

    Science.gov (United States)

    Coopersmith, E.; Yaeger, M. A.; Ye, S.; Cheng, L.; Sivapalan, M.

    2012-11-01

    Predictions of hydrological responses in ungauged catchments can benefit from a classification scheme that can organize and pool together catchments that exhibit a level of hydrologic similarity, especially similarity in some key variable or signature of interest. Since catchments are complex systems with a level of self-organization arising from co-evolution of climate and landscape properties, including vegetation, there is much to be gained from developing a classification system based on a comparative study of a population of catchments across climatic and landscape gradients. The focus of this paper is on climate seasonality and seasonal runoff regime, as characterized by the ensemble mean of within-year variation of climate and runoff. The work on regime behavior is part of an overall study of the physical controls on regional patterns of flow duration curves (FDCs), motivated by the fact that regime behavior leaves a major imprint upon the shape of FDCs, especially the slope of the FDCs. As an exercise in comparative hydrology, the paper seeks to assess the regime behavior of 428 catchments from the MOPEX database simultaneously, classifying and regionalizing them into homogeneous or hydrologically similar groups. A decision tree is developed on the basis of a metric chosen to characterize similarity of regime behavior, using a variant of the Iterative Dichotomiser 3 (ID3) algorithm to form a classification tree and associated catchment classes. In this way, several classes of catchments are distinguished, in which the connection between the catchments' regime behavior and climate and catchment properties becomes clearer. Only four similarity indices are entered into the algorithm, all of which are obtained from smoothed daily regime curves of climatic variables and runoff. Results demonstrate that climate seasonality plays the most significant role in the classification of US catchments, with rainfall timing and climatic aridity index playing somewhat…
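    The core of an ID3-style split on a single similarity index can be sketched as follows; the six "catchments", their seasonality values and class labels are invented, not the paper's data.

```python
import math
from collections import Counter

# Hedged sketch of one ID3-style split: pick the threshold on a single
# similarity index that maximizes information gain over catchment classes.

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    """Return (threshold, information_gain) of the best binary split."""
    base = entropy(labels)
    best = (None, 0.0)
    for t in sorted(set(values))[:-1]:
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        rem = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
        if base - rem > best[1]:
            best = (t, base - rem)
    return best

# Toy seasonality index vs. regime class for six catchments
t, gain = best_split([0.1, 0.2, 0.3, 0.7, 0.8, 0.9],
                     ["A", "A", "A", "B", "B", "B"])
```

    Applying the same gain criterion recursively to each resulting subset yields the classification tree.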

  19. Exploring the physical controls of regional patterns of flow duration curves - Part 3: A catchment classification system based on seasonality and runoff regime

    Science.gov (United States)

    Coopersmith, E.; Yaeger, M.; Ye, S.; Cheng, L.; Sivapalan, M.

    2012-06-01

    Predictions of hydrological responses in ungauged catchments can benefit from a classification scheme that can organize and pool together catchments that exhibit a level of hydrologic similarity, especially similarity in some key variable or signature of interest. Since catchments are complex systems with a level of self-organization arising from co-evolution of climate and landscape properties, including vegetation, there is much to be gained from developing a classification system based on a comparative study of a population of catchments across climatic and landscape gradients. The focus of this paper is on climate seasonality and seasonal runoff regime, as characterized by the ensemble mean of within-year variation of climate and runoff. The work on regime behavior is part of an overall study of the physical controls on regional patterns of Flow Duration Curves (FDCs), motivated by the fact that regime behavior leaves a major imprint upon the shape of FDCs, especially the slope of the FDCs. As an exercise in comparative hydrology, the paper seeks to assess the regime behavior of 428 catchments from the MOPEX database simultaneously, classifying and regionalizing them into homogeneous or hydrologically similar groups. A decision tree is developed on the basis of a metric chosen to characterize similarity of regime behavior, using a variant of the Iterative Dichotomiser 3 (ID3) algorithm to form a classification tree and associated catchment classes. In this way, several classes of catchments are distinguished, in which the connection between the catchments' regime behavior and climate and catchment properties becomes self-evident. Only four similarity indices are entered into the algorithm, all of which are obtained from smoothed daily regime curves of climatic variables and runoff. Results demonstrate that climate seasonality plays the most significant role in the classification of US catchments, with rainfall timing and climatic aridity index playing somewhat…

  20. Overcoming Privacy Preserving Federalism: A multiscalar approach to the Swiss and German shifts in Gender Regime

    Directory of Open Access Journals (Sweden)

    Olivier Giraud

    2014-12-01

    Full Text Available This contribution provides a comparative overview of the transformation of gender regulations in two federal countries, Germany and Switzerland, drawing upon federalist analysis and gender regime approaches. Both countries have witnessed important legislative reforms towards gender equality since the end of the 1990s. Mobilizing a multiscalar analytical grid, we situate this shift within the broader context of a social-historical transformation of gender regimes that articulates changes in the relations between public and private regulation as well as between different territorial levels. We demonstrate how the federalist privacy deadlock, historically characterizing both countries, has been overcome. Nevertheless, in the field of childcare, the multiscalar grid reveals that the two countries follow different trajectories towards equality. In Germany, top-down regulation has eventually supplanted bottom-up processes and there is a modest influence of public regulation over the definition of gender roles. Such changes were not achieved in a similar way in Switzerland.

  1. Hydrometeor classification from polarimetric radar measurements: a clustering approach

    Directory of Open Access Journals (Sweden)

    J. Grazioli

    2015-01-01

    Full Text Available A data-driven approach to the classification of hydrometeors from measurements collected with polarimetric weather radars is proposed. In a first step, the optimal number of hydrometeor classes (nopt) that can be reliably identified from a large set of polarimetric data is determined. This is done by means of an unsupervised clustering technique guided by criteria related both to data similarity and to spatial smoothness of the classified images. In a second step, the nopt clusters are assigned to the appropriate hydrometeor class by means of human interpretation and comparisons with the output of other classification techniques. The main innovation in the proposed method is the unsupervised part: the hydrometeor classes are not defined a priori, but they are learned from data. The approach is applied to data collected by an X-band polarimetric weather radar during two field campaigns (from which about 50 precipitation events are used in the present study). Seven hydrometeor classes (nopt = 7) have been found in the data set, and they have been identified as light rain (LR), rain (RN), heavy rain (HR), melting snow (MS), ice crystals/small aggregates (CR), aggregates (AG), and rimed-ice particles (RI).
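    A toy version of the unsupervised step, choosing the number of clusters by comparing within-cluster scatter across candidate k, can be sketched as follows; the paper's criteria also include spatial smoothness of the classified images, which this sketch omits, and the 2-D "polarimetric" features are synthetic.

```python
import numpy as np

# Hedged sketch: k-means over a toy 2-D feature space, scoring each k by
# within-cluster sum of squares (a crude "elbow" criterion).

def kmeans(X, k, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def within_cluster_ss(X, labels, centers):
    return float(((X - centers[labels]) ** 2).sum())

# Two well-separated synthetic blobs standing in for two hydrometeor types
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (30, 2)), rng.normal(3, 0.1, (30, 2))])
scores = {k: within_cluster_ss(X, *kmeans(X, k)) for k in (1, 2, 3)}
```

    The scatter drops sharply from k = 1 to k = 2 and flattens afterwards, which is the signal a cluster-count criterion looks for.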

  2. Vocabulary Length Experiments for Binary Image Classification Using BOV Approach

    Directory of Open Access Journals (Sweden)

    S.P.Vimal

    2013-12-01

    Full Text Available The Bag-of-Visual-Words (BoV) approach to image classification is popular among computer vision scientists. The visual words come from a visual vocabulary which is constructed using the key points extracted from the image database. Unlike natural language, the length of such a vocabulary for image classification is task dependent. The visual words capture the local invariant features of the image. The region of the image over which a visual word is constrained forms the spatial content for the visual word. Spatial pyramid representation of images is an approach to handle spatial information. In this paper, we study the role of vocabulary lengths at the levels of a simple two-level spatial pyramid to perform binary classifications. Two binary classification problems, namely detecting the presence of persons and of cars, are studied. Relevant images from the PASCAL dataset are used for the learning activities involved in this work.

  3. Rule based fuzzy logic approach for classification of fibromyalgia syndrome.

    Science.gov (United States)

    Arslan, Evren; Yildiz, Sedat; Albayrak, Yalcin; Koklukaya, Etem

    2016-06-01

    Fibromyalgia syndrome (FMS) is a chronic muscle and skeletal system disease observed generally in women, manifesting itself with widespread pain and impairing the individual's quality of life. FMS diagnosis is made based on the American College of Rheumatology (ACR) criteria. However, recently the employability and sufficiency of the ACR criteria have come under debate. In this context, several evaluation methods, including clinical evaluation methods, were proposed by researchers. Accordingly, ACR had to update its criteria, announced in 1990, 2010 and 2011. The proposed rule-based fuzzy logic method aims to evaluate FMS from a different angle as well. This method contains a rule base derived from the 1990 ACR criteria and the individual experiences of specialists. The study was conducted using data collected from 60 inpatients and 30 healthy volunteers. Several tests and physical examinations were administered to the participants. The fuzzy logic rule base was structured using the parameters of tender point count, chronic widespread pain period, pain severity, fatigue severity and sleep disturbance level, which were deemed important in FMS diagnosis. It has been observed that the fuzzy predictor was generally 95.56% consistent with at least one of the specialists, who were not creators of the fuzzy rule base. Thus, in diagnosis classification, where the severity of FMS was classified as well, consistent findings were obtained from the comparison of interpretations and experiences of specialists and the fuzzy logic approach. The study proposes a rule base which could eliminate the shortcomings of the 1990 ACR criteria during the FMS evaluation process. Furthermore, the proposed method presents a classification of the severity of the disease, which was not available with the ACR criteria. The study was not limited to disease classification alone; the probability of occurrence and the severity were classified at the same time. In addition, those who were not suffering from FMS were…

  4. A new multi criteria classification approach in a multi agent system applied to SEEG analysis.

    Science.gov (United States)

    Kinié, A; Ndiaye, M; Montois, J J; Jacquelet, Y

    2007-01-01

    This work focuses on the study of the organization of SEEG signals during epileptic seizures with a multi-agent system approach. This approach is based on cooperative mechanisms of self-organization at the micro level and the emergence of a global function at the macro level. In order to evaluate this approach, we propose a distributed collaborative approach for the classification of the signals of interest. This new multi-criteria classification method is able to provide a relevant organization of brain area structures and to bring out elements of epileptogenic networks. The method is compared to another classification approach, a fuzzy classification, and gives better results when applied to SEEG signals.

  5. A hierarchical approach for speech-instrumental-song classification.

    Science.gov (United States)

    Ghosal, Arijit; Chakraborty, Rudrasis; Dhara, Bibhas Chandra; Saha, Sanjoy Kumar

    2013-01-01

    Audio classification acts as the fundamental step for many applications such as content-based audio retrieval and audio indexing. In this work, we present a novel scheme for classifying an audio signal into three categories, namely speech, music without voice (instrumental) and music with voice (song). A hierarchical approach has been adopted to classify the signals. At the first stage, signals are categorized as speech or music using audio texture derived from simple features like ZCR and STE. The proposed audio texture captures contextual information and summarizes the frame-level features. At the second stage, music is further classified as instrumental/song based on Mel-frequency cepstral coefficients (MFCC). A classifier based on Random Sample Consensus (RANSAC), capable of handling a wide variety of data, has been utilized. Experimental results indicate the effectiveness of the proposed scheme.
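    The first-stage features named above, short-time energy (STE) and zero-crossing rate (ZCR), are straightforward to compute per frame; the frame length, sample rate and test tone below are illustrative choices, not the paper's.

```python
import numpy as np

# Hedged sketch: per-frame STE and ZCR. Speech alternates high- and
# low-energy frames with variable ZCR; a sustained tone is steady in both.

def frame_features(signal, frame_len=256):
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    ste = (frames ** 2).mean(axis=1)                       # short-time energy
    zcr = (np.diff(np.sign(frames), axis=1) != 0).mean(axis=1)  # crossings
    return ste, zcr

# A steady 440 Hz tone (a crude "instrumental" proxy) at 8192 samples/s
t = np.linspace(0, 1, 8192, endpoint=False)
tone = np.sin(2 * np.pi * 440 * t)
ste, zcr = frame_features(tone)
```

    For the tone, STE stays near 0.5 and ZCR near 2 × 440 / 8192 in every frame; the "audio texture" of the paper summarizes how such frame features vary over longer windows.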

  6. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel coordinates based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  7. An information theoretic approach to the functional classification of neurons

    CERN Document Server

    Schneidman, E; Berry, M J; Schneidman, Elad; Bialek, William; Berry, Michael J.

    2002-01-01

    A population of neurons typically exhibits a broad diversity of responses to sensory inputs. The intuitive notion of functional classification is that cells can be clustered so that most of the diversity is captured in the identity of the clusters rather than by individuals within clusters. We show how this intuition can be made precise using information theory, without any need to introduce a metric on the space of stimuli or responses. Applied to the retinal ganglion cells of the salamander, this approach recovers classical results, but also provides clear evidence for subclasses beyond those identified previously. Further, we find that each of the ganglion cells is functionally unique, and that even within the same subclass only a few spikes are needed to reliably distinguish between cells.

  8. An Approach for Automatic Classification of Radiology Reports in Spanish.

    Science.gov (United States)

    Cotik, Viviana; Filippo, Darío; Castaño, José

    2015-01-01

    Automatic detection of relevant terms in medical reports is useful for educational purposes and for clinical research. Natural language processing (NLP) techniques can be applied in order to identify them. In this work we present an approach to classify radiology reports written in Spanish into two sets: the ones that indicate pathological findings and the ones that do not. In addition, the entities corresponding to pathological findings are identified in the reports. We use RadLex, a lexicon of English radiology terms, and NLP techniques to identify the occurrence of pathological findings. Reports are classified using a simple algorithm based on the presence of pathological findings, negation and hedge terms. The implemented algorithms were tested with a test set of 248 reports annotated by an expert, obtaining a best result of 0.72 F1 measure. The output of the classification task can be used to look for specific occurrences of pathological findings.
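    The classification rule described above can be sketched in a few lines; the Spanish term lists and the context window below are illustrative, and the paper itself maps terms to RadLex and handles negation and hedging more carefully.

```python
# Hedged sketch: a report is "pathological" if it contains a finding term
# that is not preceded, within a small window, by a negation/hedge term.

FINDINGS = {"nodulo", "fractura", "derrame"}   # illustrative, not RadLex
NEGATIONS = {"sin", "no", "descarta"}          # illustrative negation cues

def classify_report(text, window=3):
    tokens = text.lower().split()
    for i, tok in enumerate(tokens):
        if tok in FINDINGS:
            context = tokens[max(0, i - window):i]
            if not any(neg in context for neg in NEGATIONS):
                return "pathological"
    return "non-pathological"

classify_report("se observa nodulo en lobulo superior")  # -> "pathological"
classify_report("sin evidencia de nodulo")               # -> "non-pathological"
```

    The window-based negation test is the part such rule systems tune most; too small a window misses negations, too large a window cancels genuine findings.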

  9. Two-Stage Approach for Protein Superfamily Classification

    Directory of Open Access Journals (Sweden)

    Swati Vipsita

    2013-01-01

    Full Text Available We deal with the problem of protein superfamily classification, in which the family membership of a newly discovered amino acid sequence is predicted. Correct prediction is a matter of great concern for researchers and drug analysts, as it helps them in the discovery of new drugs. As this problem falls broadly under the category of pattern classification problems, we have made all efforts to optimize feature extraction in the first stage and classifier design in the second stage, with an overall objective to maximize the performance accuracy of the classifier. In the feature extraction phase, a Genetic Algorithm (GA)-based wrapper approach is used to select a few eigenvectors from the principal component analysis (PCA) space, which are encoded as binary strings in the chromosome. On the basis of the positions of 1's in the chromosome, the eigenvectors are selected to build the transformation matrix, which then maps the original high-dimensional feature space to a lower-dimensional feature space. Using PCA-NSGA-II (non-dominated sorting GA), the non-dominated solutions obtained from the Pareto front solve the trade-off problem by compromising between the number of eigenvectors selected and the accuracy obtained by the classifier. In the second stage, a recursive orthogonal least squares algorithm (ROLSA) is used for training a radial basis function network (RBFN) to select an optimal number of hidden centres as well as to update the output layer weighting matrix. This approach can be applied to large data sets with much lower requirements of computer memory. Thus, very small architectures having few hidden centres are obtained, showing a high level of performance accuracy.

  10. A simplified approach for the molecular classification of glioblastomas.

    Directory of Open Access Journals (Sweden)

    Marie Le Mercier

    Full Text Available Glioblastoma (GBM) is the most common malignant primary brain tumor in adults and exhibits striking aggressiveness. Although GBMs constitute a single histological entity, they exhibit considerable variability in biological behavior, resulting in significant differences in terms of prognosis and response to treatment. In an attempt to better understand the biology of GBM, many groups have performed high-scale profiling studies based on gene or protein expression. These studies have revealed the existence of several GBM subtypes. Although a clear consensus has yet to emerge, two to four major subtypes have been identified. Interestingly, these different subtypes are associated with both differential prognoses and responses to therapy. In the present study, we investigated an alternative immunohistochemistry (IHC)-based approach to achieve a molecular classification for GBM. For this purpose, a cohort of 100 surgical GBM samples was retrospectively evaluated by immunohistochemical analysis of EGFR, PDGFRA and p53. The quantitative analysis of these immunostainings allowed us to identify the following two GBM subtypes: the "Classical-like" (CL) subtype, characterized by EGFR-positive and p53- and PDGFRA-negative staining, and the "Proneural-like" (PNL) subtype, characterized by p53- and/or PDGFRA-positive staining. This classification represents an independent prognostic factor in terms of overall survival compared to age, extent of resection and adjuvant treatment, with a significantly longer survival associated with the PNL subtype. Moreover, these two GBM subtypes exhibited different responses to chemotherapy. The addition of temozolomide to conventional radiotherapy significantly improved the survival of patients belonging to the CL subtype, but it did not affect the survival of patients belonging to the PNL subtype. We have thus shown that it is possible to differentiate between clinically relevant subtypes of GBM by using IHC.

  11. A Two Step Data Mining Approach for Amharic Text Classification

    Directory of Open Access Journals (Sweden)

    Seffi Gebeyehu

    2016-08-01

    Full Text Available Traditionally, text classifiers are built from labeled training examples (supervised learning). Labeling is usually done manually by human experts (or the users), which is a labor-intensive and time-consuming process. In the past few years, researchers have investigated various forms of semi-supervised learning to reduce the burden of manual labeling. This paper aims to show that the accuracy of learned text classifiers can be improved by augmenting a small number of labeled training documents with a large pool of unlabeled documents. This is important because in many text classification problems obtaining training labels is expensive, while large quantities of unlabeled documents are readily available. We implement an algorithm for learning from labeled and unlabeled documents based on the combination of Expectation-Maximization (EM) and two classifiers: Naive Bayes (NB) and locally weighted learning (LWL). NB first trains a classifier using the available labeled documents and probabilistically labels the unlabeled documents, while LWL uses a class of function approximation to build a model around the current point of interest. An experiment conducted on a mixture of labeled and unlabeled Amharic text documents showed that the new method achieved a significant performance improvement in comparison with supervised LWL and NB. The results also pointed out that the use of unlabeled data with EM reduces the classification absolute error by 27.6%. In general, since unlabeled documents are much less expensive and easier to collect than labeled documents, this method will be useful for text categorization tasks including online data sources such as web pages, e-mails and newsgroup postings. With this method, building text categorization systems will be significantly faster and less expensive than with the supervised learning approach.
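    The EM + Naive Bayes part of the scheme can be sketched as follows: train NB on the labeled documents, probabilistically label the unlabeled ones (E-step), then re-estimate the parameters from the fractional counts (M-step). The toy documents and the omission of the LWL component are ours, not the paper's.

```python
import math
from collections import Counter

# Hedged sketch: multinomial Naive Bayes with Laplace smoothing that accepts
# soft (fractional) class responsibilities, enabling one EM round.

def train_nb(docs, resp):
    """docs: token lists; resp: per-doc soft labels [P(c0|d), P(c1|d)]."""
    prior = [sum(r[c] for r in resp) for c in (0, 1)]
    counts = [Counter(), Counter()]
    for d, r in zip(docs, resp):
        for c in (0, 1):
            for tok in d:
                counts[c][tok] += r[c]
    vocab = {t for d in docs for t in d}
    total = [sum(counts[c].values()) for c in (0, 1)]

    def posterior(doc):
        ll = [math.log(prior[c]) + sum(
                  math.log((counts[c][t] + 1) / (total[c] + len(vocab)))
                  for t in doc)
              for c in (0, 1)]
        p1 = 1 / (1 + math.exp(ll[0] - ll[1]))
        return [1 - p1, p1]

    return posterior

labeled = [(["goal", "match"], [1, 0]), (["vote", "party"], [0, 1])]
unlabeled = [["goal", "team"], ["party", "election"]]

posterior = train_nb([d for d, _ in labeled], [r for _, r in labeled])
resp = [posterior(d) for d in unlabeled]                        # E-step
posterior = train_nb([d for d, _ in labeled] + unlabeled,
                     [r for _, r in labeled] + resp)            # M-step
```

    Iterating the E- and M-steps to convergence gives the full EM procedure; the retrained model now also knows words ("team", "election") that only occurred in unlabeled documents.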

  12. An attempt of classification of theoretical approaches to national identity

    Directory of Open Access Journals (Sweden)

    Milošević-Đorđević Jasna S.

    2003-01-01

    Full Text Available Complex social concepts are inevitably defined in different ways and approached from the perspective of different scientific disciplines. It is therefore difficult to define them precisely without overlap in meaning with other similar concepts. This paper attempts a theoretical classification of national identity and differentiates that concept from related concepts (race, ethnic group, nation, national background, authoritarianism, patriarchy). Theoretical assessments are classified into two groups: those dealing with the nature of national identity, and those stating one or more dimensions of national identity crucial for its determination. In contrast to the primordialist concept of national identity, which describes it as a fundamental, deeply rooted human feature, numerous contemporary theoretical approaches (instrumentalist, constructivist, functionalist) emphasize the changeable, fluid, instrumental function of national identity. Fundamental determinants of national identity are: language, culture (music, traditional myths), state symbols (territory, citizenship), self-categorization, religion, and a set of personal characteristics and values.

  13. Unified Approach to Thermodynamic Optimization of Generic Objective Functions in the Linear Response Regime

    Directory of Open Access Journals (Sweden)

    Yan Wang

    2016-04-01

    Full Text Available While many efforts have been devoted to optimizing the power output for a finite-time thermodynamic process, thermodynamic optimization under realistic situations is not necessarily concerned with power alone; rather, it may be of great relevance to optimize generic objective functions that are combinations of power, entropy production, and/or efficiency. One can optimize the objective function for a given model; generally the obtained results are strongly model dependent. However, if the thermodynamic process in question is operated in the linear response regime, then we show in this work that it is possible to adopt a unified approach to optimizing the objective function, thanks to Onsager’s theory of linear irreversible thermodynamics. A dissipation bound is derived, and based on it, the efficiency associated with the optimization problem, which is universal in the linear response regime and irrespective of model details, can be obtained in a unified way. Our results are in good agreement with previous findings. Moreover, we unveil that the ratio between the stopping time of a finite-time process and the optimized duration time plays a pivotal role in determining the corresponding efficiency in the case of linear response.
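    The linear response setting invoked above can be stated in standard Onsager notation, with fluxes $J_i$, thermodynamic forces $X_i$, and a symmetric kinetic matrix $L$ (the notation here is generic and not necessarily the paper's):

```latex
\begin{aligned}
  J_1 &= L_{11} X_1 + L_{12} X_2, \\
  J_2 &= L_{21} X_1 + L_{22} X_2, \qquad L_{12} = L_{21}, \\
  \dot{S} &= J_1 X_1 + J_2 X_2 \ge 0 .
\end{aligned}
```

    Because both the power output and the entropy production rate $\dot{S}$ are bilinear in the forces in this regime, generic objective functions built from them can be optimized without reference to model details, which is what makes the unified treatment possible.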

  14. Flow adjustment inside large finite-size wind farms approaching the infinite wind farm regime

    Science.gov (United States)

    Wu, Ka Ling; Porté-Agel, Fernando

    2017-04-01

    Due to the increasing number and the growing size of wind farms, the distance between them continues to decrease. Thus, it is necessary to understand how these large finite-size wind farms and their wakes could interfere with the atmospheric boundary layer (ABL) dynamics and adjacent wind farms. Fully developed flow inside wind farms has been extensively studied through numerical simulations of infinite wind farms, in which the transport of momentum and energy is only vertical and their advection is neglected. However, less attention has been paid to the length of wind farms required to reach such an asymptotic regime and to the ABL dynamics at the leading and trailing edges of large finite-size wind farms. Large eddy simulations are performed in this study to investigate the flow adjustment inside large finite-size wind farms in a conventionally neutral boundary layer, including the effect of the Coriolis force and free-atmosphere stratification from 1 to 5 K/km. For the large finite-size wind farms considered in the present work, when the potential temperature lapse rate is 5 K/km, the wind farms must exceed the ABL height in length by two orders of magnitude for the incoming flow inside the farms to approach the fully developed regime. An entrance fetch of approximately 40 times the ABL height is also required for such flow adjustment. In the fully developed flow regime of the large finite-size wind farms, the flow characteristics match those of infinite wind farms even though they have different adjustment length scales. The role of advection in the entrance and exit regions of the large finite-size wind farms is also examined. The interaction between the internal boundary layer developed above the large finite-size wind farms and the ABL under different potential temperature lapse rates is compared. It is shown that the potential temperature lapse rate plays a role in whether the flow inside the large finite-size wind farms adjusts to the fully developed regime.

  15. Optimal choice of an exchange rate regime: a critical literature review

    OpenAIRE

    Ouchen, Mariam

    2013-01-01

    This paper sets out to review the main theories and empirical methods employed in selecting an appropriate exchange rate regime. In order to achieve this, the paper is organized as follows: Section 2 introduces the distinct classifications of exchange rate regimes (de jure versus de facto exchange rate regimes) and the different theoretical approaches which illustrate how an optimal exchange rate regime is determined. Despite their initial popularity, the theoretical consi…

  16. Identifying regime shifts in Indian stock market: A Markov switching approach

    OpenAIRE

    Wasim, Ahmad; Bandi, Kamaiah

    2011-01-01

    To examine the existence of bull and bear regimes in the Indian stock market, a two-state Markov switching autoregressive model (MS(2)-AR(2)) is used to identify bull and bear market regimes. The model predicts that the Indian stock market will remain in the bull regime with very high probability compared to the bear regime. The results also identify the bear phases during all major global economic crises, including the recent US sub-prime crisis (2008) and the European debt crisis (2010). The paper concludes that...
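    The core of such a two-state Markov-switching model is the Hamilton filter, which turns observed returns into filtered regime probabilities. A minimal sketch with Gaussian regimes; the bull/bear means, volatilities and transition matrix below are invented for demonstration and are not the paper's estimates:

    ```python
    import numpy as np

    def hamilton_filter(y, mu, sigma, P):
        """Filtered regime probabilities for a 2-state Gaussian switching model.

        y     : observed returns (T,)
        mu    : per-regime means (2,)
        sigma : per-regime standard deviations (2,)
        P     : 2x2 transition matrix, P[i, j] = Pr(s_t = j | s_{t-1} = i)
        """
        T = len(y)
        xi = np.empty((T, 2))                 # filtered Pr(s_t = j | y_1..y_t)
        pi = np.array([P[1, 0], P[0, 1]])     # stationary distribution of P
        pred = pi / pi.sum()
        for t in range(T):
            # Gaussian likelihood of y[t] under each regime
            lik = np.exp(-0.5 * ((y[t] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
            num = pred * lik
            xi[t] = num / num.sum()
            pred = xi[t] @ P                  # one-step-ahead regime prediction
        return xi

    rng = np.random.default_rng(0)
    # simulate: regime 0 "bull" (positive mean, low vol), regime 1 "bear" (negative mean, high vol)
    P = np.array([[0.95, 0.05], [0.10, 0.90]])
    mu, sigma = np.array([0.1, -0.2]), np.array([0.5, 1.5])
    states = [0]
    for _ in range(499):
        states.append(rng.choice(2, p=P[states[-1]]))
    states = np.array(states)
    y = mu[states] + sigma[states] * rng.standard_normal(500)

    xi = hamilton_filter(y, mu, sigma, P)
    acc = np.mean((xi[:, 1] > 0.5) == (states == 1))
    print(f"regimes recovered: {acc:.0%}")
    ```

    In practice the parameters are estimated (e.g. by maximizing the likelihood that the filter computes as a by-product) rather than assumed known as here.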

  17. Development of internal solitary waves in various thermocline regimes - a multi-modal approach

    Directory of Open Access Journals (Sweden)

    T. Gerkema

    2003-01-01

    A numerical analysis is made of the appearance of oceanic internal solitary waves in a multi-modal setting. This is done for observed profiles of stratification from the Sulu Sea and the Bay of Biscay, in which thermocline motion is dominated by the first and third mode, respectively. The results show that persistent solitary waves occur only in the former case, in accordance with the observations. In the Bay of Biscay much energy is transferred from the third mode to lower modes, implying that a uni-modal approach would not have been appropriate. To elaborate on these results in a systematic way, a simple model for the stratification is used; an interpretation is given in terms of regimes of thermocline strength.

  18. Unified semiclassical approach to electronic transport from diffusive to ballistic regimes

    Science.gov (United States)

    Geng, Hao; Deng, Wei-Yin; Ren, Yue-Jiao; Sheng, Li; Xing, Ding-Yu

    2016-09-01

    We show that by integrating out the electric field and incorporating proper boundary conditions, a Boltzmann equation can describe electron transport properties, continuously from the diffusive to ballistic regimes. General analytical formulas of the conductance in D = 1,2,3 dimensions are obtained, which recover the Boltzmann-Drude formula and Landauer-Büttiker formula in the diffusive and ballistic limits, respectively. This intuitive and efficient approach can be applied to investigate the interplay of system size and impurity scattering in various charge and spin transport phenomena, when the quantum interference effect is not important. Project supported by the National Basic Research Program of China (Grant Nos. 2015CB921202 and 2014CB921103) and the National Natural Science Foundation of China (Grant No. 11225420).
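    As a generic illustration of the diffusive-to-ballistic crossover (not the paper's D-dimensional formulas), the textbook per-channel transmission T = l/(l + L) interpolates between the Landauer-Büttiker limit at L much smaller than the mean free path l and Ohmic 1/L scaling at L much larger than l. The channel count and mean free path below are arbitrary:

    ```python
    # Toy crossover sketch: conductance of N channels, each with
    # transmission T = mfp / (mfp + L), in the Landauer picture.
    G0 = 7.748e-5                      # conductance quantum 2e^2/h, in siemens

    def conductance(L, N=10, mfp=100e-9):
        """Conductance of a sample of length L (meters) with N channels."""
        return G0 * N * mfp / (mfp + L)

    print(conductance(1e-9) / G0)      # L << mfp: approaches N (ballistic limit)
    print(conductance(1e-5) / G0)      # L >> mfp: ~ N * mfp / L (Ohmic scaling)
    ```

    Doubling L deep in the diffusive regime halves the conductance, recovering the Drude-like 1/L behaviour, while at L = 0 the ballistic value G0*N is reached exactly.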

  19. Modeling and forecasting of wind power generation - Regime-switching approaches

    DEFF Research Database (Denmark)

    Trombe, Pierre-Julien

    ... of more renewable energy into power systems, since these systems are required to maintain a strict balance between electricity consumption and production at any time. For this purpose, wind power forecasts offer essential support to power system operators. In particular, there is a growing demand... for improved forecasts over very short lead times, from a few minutes up to a few hours, because these forecasts, when generated with traditional approaches, are characterized by large uncertainty. In this thesis, this issue is considered from a statistical perspective, with time series models. The primary... of high and low variability. They also yield substantial gains in probabilistic forecast accuracy for lead times of a few minutes. However, these models only integrate historical and local measurements of wind power and thus have a limited ability to signal regime changes at longer lead times...

  20. A probabilistic approach to quantifying spatial patterns of flow regimes and network-scale connectivity

    Science.gov (United States)

    Garbin, Silvia; Alessi Celegon, Elisa; Fanton, Pietro; Botter, Gianluca

    2017-04-01

    The temporal variability of river flow regime is a key feature structuring and controlling fluvial ecological communities and ecosystem processes. In particular, streamflow variability induced by climate/landscape heterogeneities or other anthropogenic factors significantly affects the connectivity between streams with notable implication for river fragmentation. Hydrologic connectivity is a fundamental property that guarantees species persistence and ecosystem integrity in riverine systems. In riverine landscapes, most ecological transitions are flow-dependent and the structure of flow regimes may affect ecological functions of endemic biota (i.e., fish spawning or grazing of invertebrate species). Therefore, minimum flow thresholds must be guaranteed to support specific ecosystem services, like fish migration, aquatic biodiversity and habitat suitability. In this contribution, we present a probabilistic approach aiming at a spatially-explicit, quantitative assessment of hydrologic connectivity at the network-scale as derived from river flow variability. Dynamics of daily streamflows are estimated based on catchment-scale climatic and morphological features, integrating a stochastic, physically based approach that accounts for the stochasticity of rainfall with a water balance model and a geomorphic recession flow model. The non-exceedance probability of ecologically meaningful flow thresholds is used to evaluate the fragmentation of individual stream reaches, and the ensuing network-scale connectivity metrics. A multi-dimensional Poisson Process for the stochastic generation of rainfall is used to evaluate the impact of climate signature on reach-scale and catchment-scale connectivity. The analysis shows that streamflow patterns and network-scale connectivity are influenced by the topology of the river network and the spatial variability of climatic properties (rainfall, evapotranspiration). 
The framework offers a robust basis for the prediction of the impact of
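    The stochastic machinery described in this record can be sketched in miniature: Poisson storm arrivals with exponentially distributed depths routed through a linear reservoir, with the non-exceedance probability of an ecological flow threshold estimated by Monte Carlo. All parameter values (`lam`, `alpha`, `k`, `threshold`) are invented for illustration and are not taken from the study:

    ```python
    import numpy as np

    def simulate_streamflow(days=10000, lam=0.3, alpha=10.0, k=0.05, seed=1):
        """Daily streamflow from Poisson rainfall over a linear reservoir.

        lam   : mean storm arrival rate (events/day)
        alpha : mean rainfall depth per event (mm)
        k     : reservoir recession constant (1/day)
        """
        rng = np.random.default_rng(seed)
        storms = rng.poisson(lam, days)                 # storm events per day
        rain = np.array([rng.exponential(alpha, n).sum() for n in storms])
        q = np.empty(days)
        s = alpha * lam / k                             # start near mean storage
        for t in range(days):
            s += rain[t]
            q[t] = k * s                                # linear-reservoir outflow
            s -= q[t]
        return q

    q = simulate_streamflow()
    threshold = 1.0                                     # ecological flow threshold (mm/day)
    p_below = np.mean(q < threshold)                    # non-exceedance probability
    print(f"non-exceedance probability of {threshold} mm/day: {p_below:.3f}")
    ```

    In the framework above, such reach-scale non-exceedance probabilities would then be combined over the river network topology to obtain network-scale connectivity metrics.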

  1. Nanomedical science and laser-driven particle acceleration: promising approaches in the prethermal regime

    Science.gov (United States)

    Gauduel, Y. A.

    2017-05-01

    A major challenge of spatio-temporal radiation biomedicine concerns the understanding of biophysical events triggered by an initial energy deposition inside confined ionization tracks. This contribution deals with an interdisciplinary approach to cutting-edge advances in real-time radiation events, considering the potential of innovative strategies based on ultrafast laser science, from femtosecond photon sources to advanced techniques of ultrafast TW laser-plasma acceleration. Recent advances in powerful TW laser sources (~10^19 W cm^-2) and laser-plasma interactions providing ultra-short relativistic particle beams in the 5-200 MeV energy domain open promising opportunities for the development of high energy radiation femtochemistry (HERF) in the prethermal regime of secondary low-energy electrons and for the real-time imaging of radiation-induced biomolecular alterations at the nanoscopic scale. New developments would make it possible to correlate early radiation events triggered by ultrashort radiation sources with a molecular approach to Relative Biological Effectiveness (RBE). These emerging research developments are crucial for understanding simultaneously, at the sub-picosecond and nanometric scales, the early consequences of ultra-short-pulsed radiation on biomolecular environments or integrated biological entities. This innovative approach could be applied to biomedically relevant concepts such as the emerging domain of real-time nanodosimetry for targeted pro-drug activation and pulsed radio-chemotherapy of cancers.

  2. Subgrouping patients with low back pain: evolution of a classification approach to physical therapy.

    Science.gov (United States)

    Fritz, Julie M; Cleland, Joshua A; Childs, John D

    2007-06-01

    The development of valid classification methods to assist the physical therapy management of patients with low back pain has been recognized as a research priority. There is also growing evidence that the use of a classification approach to physical therapy results in better clinical outcomes than the use of alternative management approaches. In 1995 Delitto and colleagues proposed a classification system intended to inform and direct the physical therapy management of patients with low back pain. The system described 4 classifications of patients with low back pain (manipulation, stabilization, specific exercise, and traction). Each classification could be identified by a unique set of examination criteria, and was associated with an intervention strategy believed to result in the best outcomes for the patient. The system was based on expert opinion and research evidence available at the time. A substantial amount of research has emerged in the years since the introduction of this classification system, including the development of clinical prediction rules, providing new evidence for the examination criteria used to place a patient into a classification and for the optimal intervention strategies for each classification. New evidence should continually be incorporated into existing classification systems. The purpose of this clinical commentary is to review this classification system, its evolution and current status, and to discuss its implications for the classification of patients with low back pain.

  3. Classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    This article presents and discusses definitions of the term "classification" and the related concepts "concept/conceptualization," "categorization," "ordering," "taxonomy" and "typology." It further presents and discusses theories of classification, including the influences of Aristotle... and Wittgenstein. It presents different views on forming classes, including logical division, numerical taxonomy, historical classification, and hermeneutical and pragmatic/critical views. Finally, issues related to artificial versus natural classification and taxonomic monism versus taxonomic pluralism are briefly...

  4. State Traditions and Language Regimes: A Historical Institutionalism Approach to Language Policy

    Directory of Open Access Journals (Sweden)

    Sonntag Selma K.

    2015-12-01

    This paper is an elaboration of a theoretical framework we developed in the introductory chapter of our co-edited volume, State Traditions and Language Regimes (McGill-Queen's University Press, 2015). Using a historical institutionalism approach derived from political science, we argue that language policies need to be understood in terms of their historical and institutional context. The concept of 'state tradition' focuses our attention on the relative autonomy of the state in terms of its normative and institutional traditions that lead to particular path dependencies of language policy choices, subject to change at critical junctures. 'Language regime' is the conceptual link between state traditions and language policy choices: it allows us to conceptualize analytically how and why these choices are made and how and why they change. We suggest that our framework offers a more robust analysis of language politics than other approaches found in sociolinguistics and normative theory. It also challenges political science to become more engaged with scholarly debate on language policy and linguistic diversity.

  5. Soft computing approach to pattern classification and object recognition a unified concept

    CERN Document Server

    Ray, Kumar S

    2012-01-01

    Soft Computing Approach to Pattern Classification and Object Recognition establishes an innovative, unified approach to supervised pattern classification and model-based occluded object recognition. The book also surveys various soft computing tools, including fuzzy relational calculus (FRC), genetic algorithms (GA) and multilayer perceptrons (MLP), to provide a strong foundation for the reader. The supervised approach to pattern classification and the model-based approach to occluded object recognition are treated in one framework, based on either a conventional interpretation or a new interpretation of

  6. Visual words based approach for tissue classification in mammograms

    Science.gov (United States)

    Diamant, Idit; Goldberger, Jacob; Greenspan, Hayit

    2013-02-01

    The presence of microcalcifications (MC) is an important indicator of developing breast cancer. Additional indicators of cancer risk exist, such as breast tissue density type. Different methods have been developed for breast tissue classification for use in computer-aided diagnosis systems. Recently, the visual words (VW) model has been successfully applied to various classification tasks. The goal of our work is to explore VW-based methodologies for various mammography classification tasks. We start with the challenge of classifying breast density and then focus on classification of normal tissue versus microcalcifications. The presented methodology is based on a patch-based visual words model, which includes building a dictionary for a training set using local descriptors and representing each image as a visual word histogram. Classification is then performed using k-nearest-neighbour (KNN) and support vector machine (SVM) classifiers. We tested our algorithm on the publicly available MIAS and DDSM datasets. The input is a representative region of interest per mammography image, manually selected and labelled by an expert. In the tissue density task, classification accuracy reached 85% using KNN and 88% using SVM, which competes with state-of-the-art results. For MC versus normal tissue, accuracy reached 95.6% using SVM. These results demonstrate the feasibility of classifying breast tissue with our model. We are currently improving the results further while also investigating the capability of VW to handle additional important mammogram classification problems. We expect that the presented methodology will enable high levels of classification accuracy, suggesting new means for automated mammography diagnosis support tools.
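    A toy version of the patch-based visual-words pipeline described here (patch extraction, k-means dictionary, word histogram, nearest-neighbour classification), run on synthetic "smooth" versus "speckled" textures. Patch size, dictionary size and the textures themselves are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def extract_patches(img, size=4, step=4):
        """Flatten non-overlapping size x size patches (the local descriptors)."""
        h, w = img.shape
        return np.array([img[i:i+size, j:j+size].ravel()
                         for i in range(0, h - size + 1, step)
                         for j in range(0, w - size + 1, step)])

    def kmeans(pts, k=8, iters=20):
        """Plain k-means to build the visual-word dictionary."""
        centers = pts[rng.choice(len(pts), k, replace=False)]
        for _ in range(iters):
            d = ((pts[:, None, :] - centers[None]) ** 2).sum(-1)
            labels = d.argmin(1)
            for j in range(k):
                if (labels == j).any():
                    centers[j] = pts[labels == j].mean(0)
        return centers

    def bow_histogram(img, centers):
        """Represent an image as a normalized visual-word histogram."""
        p = extract_patches(img)
        d = ((p[:, None, :] - centers[None]) ** 2).sum(-1)
        hist = np.bincount(d.argmin(1), minlength=len(centers)).astype(float)
        return hist / hist.sum()

    def make_image(speckled):
        """Two synthetic 'tissue' classes: smooth vs. speckled texture."""
        img = rng.normal(0.5, 0.05, (32, 32))
        if speckled:
            img[rng.random((32, 32)) < 0.1] = 1.0   # bright spots
        return img

    train = [(make_image(s), s) for s in [0, 1] * 10]
    centers = kmeans(np.vstack([extract_patches(im) for im, _ in train]))
    X = np.array([bow_histogram(im, centers) for im, _ in train])
    y = np.array([lab for _, lab in train])

    def knn_predict(hist):                 # 1-NN on histogram distance
        return y[((X - hist) ** 2).sum(1).argmin()]

    test_imgs = [(make_image(s), s) for s in [0, 1] * 5]
    acc = np.mean([knn_predict(bow_histogram(im, centers)) == lab
                   for im, lab in test_imgs])
    print(f"toy accuracy: {acc:.0%}")
    ```

    The real system replaces the raw-pixel patches with richer local descriptors, the 1-NN vote with KNN/SVM, and the synthetic textures with expert-labelled regions of interest.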

  7. Predicting ecological regime shift under climate change: New modelling techniques and potential of molecular-based approaches

    Directory of Open Access Journals (Sweden)

    Richard STAFFORD, V. Anne SMITH, Dirk HUSMEIER, Thomas GRIMA, Barbara-ann GUINN

    2013-06-01

    Ecological regime shift is the rapid transition from one stable community structure to another, often ecologically inferior, stable community. Such regime shifts are especially common in shallow marine communities, such as the transition of kelp forests to algal turfs that harbour far lower biodiversity. Stable regimes in communities are a result of balanced interactions between species, and predicting new regimes therefore requires an evaluation of new species interactions, as well as the resilience of the 'stable' position. While computational optimisation techniques can predict new potential regimes, predicting the most likely community state of the various options produced is currently educated guess work. In this study we integrate a stable regime optimisation approach with a Bayesian network used to infer prior knowledge of the likely stress of climate change (or, in practice, any other disturbance) on each component species of a representative rocky shore community model. Combining the results, by calculating the product of the match between resilient computational predictions and the posterior probabilities of the Bayesian network, gives a refined set of model predictors, and demonstrates the use of the process in determining community changes, as might occur through processes such as climate change. To inform Bayesian priors, we conduct a review of molecular approaches applied to the analysis of the transcriptome of rocky shore organisms, and show how such an approach could be linked to measurable stress variables in the field. Hence species-specific microarrays could be designed as biomarkers of in situ stress, and used to inform predictive modelling approaches such as those described here [Current Zoology 59 (3): 403-417, 2013].

  8. Predicting ecological regime shift under climate change:New modelling techniques and potential of molecular-based approaches

    Institute of Scientific and Technical Information of China (English)

    Richard STAFFORD; V.Anne SMITH; Dirk HUSMEIER; Thomas GRIMA; Barbara-ann GUINN

    2013-01-01

    Ecological regime shift is the rapid transition from one stable community structure to another, often ecologically inferior, stable community. Such regime shifts are especially common in shallow marine communities, such as the transition of kelp forests to algal turfs that harbour far lower biodiversity. Stable regimes in communities are a result of balanced interactions between species, and predicting new regimes therefore requires an evaluation of new species interactions, as well as the resilience of the 'stable' position. While computational optimisation techniques can predict new potential regimes, predicting the most likely community state of the various options produced is currently educated guess work. In this study we integrate a stable regime optimisation approach with a Bayesian network used to infer prior knowledge of the likely stress of climate change (or, in practice, any other disturbance) on each component species of a representative rocky shore community model. Combining the results, by calculating the product of the match between resilient computational predictions and the posterior probabilities of the Bayesian network, gives a refined set of model predictors, and demonstrates the use of the process in determining community changes, as might occur through processes such as climate change. To inform Bayesian priors, we conduct a review of molecular approaches applied to the analysis of the transcriptome of rocky shore organisms, and show how such an approach could be linked to measurable stress variables in the field. Hence species-specific microarrays could be designed as biomarkers of in situ stress, and used to inform predictive modelling approaches such as those described here.

  9. Text Categorization Based on K-Nearest Neighbor Approach for Web Site Classification.

    Science.gov (United States)

    Kwon, Oh-Woog; Lee, Jong-Hyeok

    2003-01-01

    Discusses text categorization and Web site classification and proposes a three-step classification system that includes the use of Web pages linked with the home page. Highlights include the k-nearest neighbor (k-NN) approach; improving performance with a feature selection method and a term weighting scheme using HTML tags; and similarity…
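    The k-NN text categorization idea reduces to: vectorize documents by term frequency, rank training documents by cosine similarity, and take a majority vote among the top k. A self-contained sketch with an invented toy corpus (no feature selection or HTML-tag weighting, which the paper adds on top):

    ```python
    import math
    from collections import Counter

    def tf_vector(text):
        """Length-normalized term-frequency vector (whitespace tokenization)."""
        counts = Counter(text.lower().split())
        norm = math.sqrt(sum(c * c for c in counts.values()))
        return {w: c / norm for w, c in counts.items()}

    def cosine(u, v):
        """Cosine similarity of two normalized sparse vectors."""
        return sum(u[w] * v.get(w, 0.0) for w in u)

    def knn_classify(doc, training, k=3):
        """Majority vote among the k most similar training documents."""
        v = tf_vector(doc)
        ranked = sorted(training, key=lambda t: cosine(v, tf_vector(t[0])),
                        reverse=True)
        votes = Counter(label for _, label in ranked[:k])
        return votes.most_common(1)[0][0]

    training = [
        ("stock market prices fell sharply today", "finance"),
        ("the central bank raised interest rates", "finance"),
        ("investors sold shares amid market fears", "finance"),
        ("the team won the championship game", "sports"),
        ("the striker scored twice in the final", "sports"),
        ("fans celebrated the league title win", "sports"),
    ]
    print(knn_classify("bank shares fell as the market opened", training))
    ```

    The three-step system described above would additionally bring in the text of pages linked from the home page before running this kind of vote.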

  10. A texton-based approach for the classification of lung parenchyma in CT images

    DEFF Research Database (Denmark)

    Gangeh, Mehrdad J.; Sørensen, Lauge; Shaker, Saher B.

    2010-01-01

    In this paper, a texton-based classification system based on raw pixel representation along with a support vector machine with radial basis function kernel is proposed for the classification of emphysema in computed tomography images of the lung. The proposed approach is tested on 168 annotated...

  11. A novel approach for classification of abnormalities in digitized mammograms

    Indian Academy of Sciences (India)

    S Shanthi; V Murali Bhaskaran

    2014-10-01

    Feature extraction is an important determinant of overall system performance in classification. The objective of this article is to show the effectiveness of texture feature analysis for detecting abnormalities in digitized mammograms using a Self Adaptive Resource Allocation Network (SRAN) classifier. We propose a feature set based on Gabor filters, fractal analysis and the multiscale surrounding region dependence method (MSRDM) to identify the most common appearances of breast cancer, namely microcalcifications, masses and architectural distortion. The experimental results indicate that the proposed features with the SRAN classifier can improve classification performance: the SRAN classifier achieves a classification accuracy of 98.44% for the proposed features on 192 images from the MIAS dataset.

  12. Simulation of the oscillation regimes of bowed bars: a non-linear modal approach

    Science.gov (United States)

    Inácio, Octávio; Henrique, Luís.; Antunes, José

    2003-06-01

    It is still a challenge to properly simulate the complex stick-slip behavior of multi-degree-of-freedom systems. In the present paper we investigate the self-excited non-linear responses of bowed bars, using a time-domain modal approach, coupled with an explicit model for the frictional forces, which is able to emulate stick-slip behavior. This computational approach can provide very detailed simulations and is well suited to deal with systems presenting a dispersive behavior. The effects of the bar supporting fixture are included in the model, as well as a velocity-dependent friction coefficient. We present the results of numerical simulations, for representative ranges of the bowing velocity and normal force. Computations have been performed for constant-section aluminum bars, as well as for real vibraphone bars, which display a central undercutting, intended to help tuning the first modes. Our results show limiting values for the normal force FN and bowing velocity ẏbow for which the "musical" self-sustained solutions exist. Beyond this "playability space", double period and even chaotic regimes were found for specific ranges of the input parameters FN and ẏbow. As also displayed by bowed strings, the vibration amplitudes of bowed bars also increase with the bow velocity. However, in contrast to string instruments, bowed bars "slip" during most of the motion cycle. Another important difference is that, in bowed bars, the self-excited motions are dominated by the system's first mode. Our numerical results are qualitatively supported by preliminary experimental results.

  13. Genome classification by gene distribution: An overlapping subspace clustering approach

    Directory of Open Access Journals (Sweden)

    Halgamuge Saman K

    2008-04-01

    Background: Genomes of lower organisms have been observed with a large amount of horizontal gene transfer, which causes difficulties in their evolutionary study. Bacteriophage genomes are a typical example. One recent approach that addresses this problem is the unsupervised clustering of genomes based on gene order and genome position, which helps to reveal species relationships that may not be apparent from traditional phylogenetic methods. Results: We propose the use of an overlapping subspace clustering algorithm for such genome classification problems. The advantage of subspace clustering over traditional clustering is that it can associate clusters with gene arrangement patterns, preserving genomic information in the clusters produced. Additionally, overlapping capability is desirable for the discovery of multiple conserved patterns within a single genome, such as those acquired from different species via horizontal gene transfers. The proposed method involves a novel strategy to vectorize genomes based on their gene distribution. A number of existing subspace clustering and biclustering algorithms were evaluated to identify the best framework upon which to develop our algorithm; we extended a generic subspace clustering algorithm called HARP to incorporate overlapping capability. The proposed algorithm was assessed and applied to bacteriophage genomes. The phage grouping results are consistent overall with the Phage Proteomic Tree and showed common genomic characteristics among the TP901-like, Sfi21-like and sk1-like phage groups. Among 441 phage genomes, we identified four significantly conserved distribution patterns structured by the terminase, portal, integrase, holin and lysin genes. We also observed a subgroup of Sfi21-like phages comprising a distinctive divergent genome organization and identified nine new phage members of the Sfi21-like genus: Staphylococcus 71, phiPVL108, Listeria A118, 2389, Lactobacillus phi AT3, A2

  14. INTRODUCTION OF A SECTORAL APPROACH TO TRANSPORT SECTOR FOR POST-2012 CLIMATE REGIME

    Directory of Open Access Journals (Sweden)

    Atit TIPPICHAI

    2009-01-01

    Recently, the concept of sectoral approaches has been discussed actively under the UNFCCC framework, as it could realize GHG mitigation for the Kyoto Protocol and beyond. However, most studies have not introduced this approach to the transport sector explicitly or analyzed its impacts quantitatively. In this paper, we introduce a sectoral approach that aims to set sector-specific emission reduction targets for the transport sector for the post-2012 climate regime. We suppose that developed countries will commit to the sectoral reduction target and that key developing countries such as China and India will have sectoral no-lose targets (no penalties for failure to meet targets, but the right to sell reductions exceeding them) for the medium-term commitment, i.e. 2013-2020. Six scenarios for the total CO2 emission reduction target in the transport sector in 2020, varying from 5% to 30% reductions from the 2005 level, are established. The paper preliminarily analyzes shares of emission reductions and abatement costs to meet the targets for key developed countries, including the USA, EU-15, Russia, Japan and Canada. To analyze the impacts of the proposed approach, we generate sectoral marginal abatement cost (MAC) curves by region by extending a top-down economic model, namely the AIM/CGE model. The total emission reduction targets are analyzed against the developed MAC curves for the transport sector in order to obtain an equal marginal abatement cost, which yields the optimal emission reduction for each country and minimizes total abatement cost. The results indicate that the USA will play a crucial role in GHG mitigation in the transport sector, as it is most responsible for emission reductions (accounting for more than 70%), while Japan will reduce least (accounting for about 3%) for all scenarios. In the case of a 5% reduction, the total abatement is 171.1 MtCO2 with a total cost of 1.61 billion USD; and in the case of a 30

  15. The effects of crude oil shocks on stock market shifts behaviour A regime switching approach

    Energy Technology Data Exchange (ETDEWEB)

    Aloui, Chaker; Jammazi, Rania [International Finance Group-Tunisia, Faculty of Management and Economic Sciences of Tunis, Boulevard du 7 novembre, El Manar University, B.P. 248, C.P. 2092, Tunis Cedex (Tunisia)

    2009-09-15

    In this paper we develop the two-regime Markov-switching EGARCH model introduced by Henry [Henry, O., 2009. Regime switching in the relationship between equity returns and short-term interest rates. Journal of Banking and Finance 33, 405-414] to examine the relationship between crude oil shocks and stock markets. An application to the stock markets of the UK, France and Japan over the sample period January 1989 to December 2007 yields plausible results. We detect two episodes of series behaviour: one corresponding to a low mean/high variance regime and the other to a high mean/low variance regime. Furthermore, there is evidence that common recessions coincide with the low mean/high variance regime. In addition, we allow both real stock returns and the probability of transition from one regime to another to depend on the net oil price increase variable. The findings show that rises in the oil price have a significant role in determining both the volatility of stock returns and the probability of transition across regimes. (author)

  16. Machine Learning Approaches for High-resolution Urban Land Cover Classification: A Comparative Study

    Energy Technology Data Exchange (ETDEWEB)

    Vatsavai, Raju [ORNL; Chandola, Varun [ORNL; Cheriyadat, Anil M [ORNL; Bright, Eddie A [ORNL; Bhaduri, Budhendra L [ORNL; Graesser, Jordan B [ORNL

    2011-01-01

    The proliferation of machine learning approaches makes it difficult to identify a suitable classification technique for analyzing high-resolution remote sensing images. In this study, ten classification techniques from five broad machine learning categories were compared. Surprisingly, the performance of simple statistical classification schemes such as maximum likelihood and logistic regression is very close to that of complex, more recent techniques. Given that these two classifiers require little input from the user, they should still be considered for most classification tasks. Multiple classifier systems are a good choice if resources permit.

  17. An approach for mechanical fault classification based on generalized discriminant analysis

    Institute of Scientific and Technical Information of China (English)

    LI Wei-hua; SHI Tie-lin; YANG Shu-zi

    2006-01-01

    To deal with pattern classification of complicated mechanical faults, an approach to multi-fault classification based on generalized discriminant analysis is presented. Compared with linear discriminant analysis (LDA), generalized discriminant analysis (GDA), one of the nonlinear discriminant analysis methods, is more suitable for classifying linearly non-separable problems. The connection and difference between kernel principal component analysis (KPCA) and GDA is discussed. KPCA is good at detecting machine abnormality, while GDA performs well in multi-fault classification based on collections of historical fault symptoms. When the proposed method is applied to air compressor condition classification and gear fault classification, it shows excellent performance in complicated multi-fault classification.

  18. A neural network based approach to social touch classification

    NARCIS (Netherlands)

    van Wingerden, Siewart; Uebbing, Tobias J.; Jung, Merel Madeleine; Poel, Mannes

    Touch is an important interaction modality in social interaction, for instance touch can communicate emotions and can intensify emotions communicated by other modalities. In this paper we explore the use of Neural Networks for the classification of touch. The exploration and assessment of Neural

  19. Spectral transform approaches of 3D coordinates for object classification

    OpenAIRE

    Semenov, N.; Leontiev, A.

    2008-01-01

    This article describes a method of preparing data for subsequent classification: spectral processing of three-dimensional coordinate data. This processing makes it possible, using a minimal amount of computation, to translate an object's coordinates to the origin, as well as to rotate the object around any axis and normalize its size.

  20. New Approaches to Object Classification in Synoptic Sky Surveys

    CERN Document Server

    Donalek, C; Djorgovski, S G; Marney, S; Drake, A; Glikman, E; Graham, M J; Williams, R

    2008-01-01

    Digital synoptic sky surveys pose several new object classification challenges. In surveys where real-time detection and classification of transient events is a science driver, there is a need for an effective elimination of instrument-related artifacts which can masquerade as transient sources in the detection pipeline, e.g., unremoved large cosmic rays, saturation trails, reflections, crosstalk artifacts, etc. We have implemented such an Artifact Filter, using a supervised neural network, for the real-time processing pipeline in the Palomar-Quest (PQ) survey. After the training phase, for each object it takes as input a set of measured morphological parameters and returns the probability of it being a real object. Despite the relatively low number of training cases for many kinds of artifacts, the overall artifact classification rate is around 90%, with no genuine transients misclassified during our real-time scans. Another question is how to assign an optimal star-galaxy classification in a multi-pass surv...

  1. Using hydrogeomorphic criteria to classify wetlands on Mt. Desert Island, Maine - approach, classification system, and examples

    Science.gov (United States)

    Nielsen, Martha G.; Guntenspergen, Glenn R.; Neckles, Hilary A.

    2005-01-01

    A wetland classification system was designed for Mt. Desert Island, Maine, to help categorize the large number of wetlands (over 1,200 mapped units) as an aid to understanding their hydrologic functions. The classification system, developed by the U.S. Geological Survey (USGS), in cooperation with the National Park Service, uses a modified hydrogeomorphic (HGM) approach, and assigns categories based on position in the landscape, soils and surficial geologic setting, and source of water. A dichotomous key was developed to determine a preliminary HGM classification of wetlands on the island. This key is designed for use with USGS topographic maps and 1:24,000 geographic information system (GIS) coverages as an aid to the classification, but may also be used with field data. Hydrologic data collected from a wetland monitoring study were used to determine whether the preliminary classification of individual wetlands using the HGM approach yielded classes that were consistent with actual hydroperiod data. Preliminary HGM classifications of the 20 wetlands in the monitoring study were consistent with the field hydroperiod data. The modified HGM classification approach appears robust, although the method apparently works somewhat better with undisturbed wetlands than with disturbed wetlands. This wetland classification system could be applied to other hydrogeologically similar areas of northern New England.

  2. Human Rights Promotion through Transnational Investment Regimes: An International Political Economy Approach

    National Research Council Canada - National Science Library

    A Claire Cutler

    2013-01-01

      International investment agreements are foundational instruments in a transnational investment regime that governs how states regulate the foreign-owned assets and the foreign investment activities of private actors...

  3. MULTI-TEMPORAL REMOTE SENSING IMAGE CLASSIFICATION - A MULTI-VIEW APPROACH

    Data.gov (United States)

    National Aeronautics and Space Administration — MULTI-TEMPORAL REMOTE SENSING IMAGE CLASSIFICATION - A MULTI-VIEW APPROACH VARUN CHANDOLA AND RANGA RAJU VATSAVAI Abstract. Multispectral remote sensing images have...

  4. Evaluation of toroidal torque by non-resonant magnetic perturbations in tokamaks for resonant transport regimes using a Hamiltonian approach

    CERN Document Server

    Albert, Christopher G; Kapper, Gernot; Kasilov, Sergei V; Kernbichler, Winfried; Martitsch, Andreas F

    2016-01-01

    Toroidal torque generated by neoclassical viscosity caused by external non-resonant, non-axisymmetric perturbations has a significant influence on toroidal plasma rotation in tokamaks. In this article, a derivation for the expressions of toroidal torque and radial transport in resonant regimes is provided within quasilinear theory in canonical action-angle variables. The proposed approach treats all low-collisional quasilinear resonant NTV regimes including superbanana plateau and drift-orbit resonances in a unified way and allows for magnetic drift in all regimes. It is valid for perturbations on toroidally symmetric flux surfaces of the unperturbed equilibrium without specific assumptions on geometry or aspect ratio. The resulting expressions are shown to match existing analytical results in the large aspect ratio limit. Numerical results from the newly developed code NEO-RT are compared to calculations by the quasilinear version of the code NEO-2 at low collisionalities. The importance of the magnetic shea...

  5. A Scenario Based Approach to Separate the Impacts of Land Use and Climate Alteration on Daily Flow Regime Indices

    Science.gov (United States)

    Darabi, Hamid; Torabi Haghighi, Ali; Fazel, Nasim; Klöve, Björn

    2017-04-01

    Land use and climate changes have important impacts on water resources such as river flow regimes, and their effects are often difficult to separate at the watershed scale. To separate these impacts, we developed a scenario-based approach using remote sensing and hydro-climatological data. Using this framework, we assessed the impacts on hydrological indices in the Marboreh watershed (a headwater of the Dez River, which is regulated by the most important hydropower plant in Iran). The analysis is based on three Landsat TM images (1988, 1998 and 2008), meteorological data (1983-2012) from Aligudarz station and hydrological data (1983-2012) from Doroud gauge station. The QUAC module and supervised classification (ML algorithm) in ENVI 5.1, the SWAT model and the Mann-Kendall method were used for remote sensing, hydrological modelling and trend analysis, respectively. To analyse the impacts of land use and climate changes, the study period was divided into three decades (1983-1992, 1993-2002 and 2003-2012). For each period, the land use map was assigned from the middle year of the decade (1988, 1998 and 2008). Then, 10 hydrological indices related to high and low flows (HDI and LDI) were analyzed for seven scenarios created by combining the predefined climatic periods and land use maps. Based on the remote sensing analysis, the major land use alterations from 1988 to 2008 were degradation of natural rangeland (-18.49%) and increases in farmland (+16.70%) and residential area (+0.80%). The Mann-Kendall test indicates a statistically significant decreasing trend in rainfall-induced runoff and an increasing trend in temperature, at the 5% and 1% significance levels, respectively. The results clearly show that hydrological indices in the Marboreh watershed are influenced more by climate variability than by land use change. The study also demonstrates that the low flow indices were affected more than the high flow indices in both climate

  6. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    Science.gov (United States)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

    For many years, the scientific community has been concerned with increasing the accuracy of different classification methods, and major achievements have been made so far. Beyond this issue, the increasing amount of data generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, performs supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool implements four classification algorithms taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes and cluster configurations demonstrate the potential of the tool, as well as the aspects that affect its performance.
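The map/reduce pattern the abstract describes can be shown in miniature. This is a toy stand-in: a threshold "classifier" replaces the WEKA algorithms, and plain Python map/reduce replaces Hadoop, but the contract is the same: the map step classifies each data partition independently, and the reduce step merges the per-partition results.

```python
from collections import Counter
from functools import reduce

def classify(sample, threshold=0.5):
    # Toy stand-in for a trained classifier.
    return "A" if sample >= threshold else "B"

def map_partition(partition):
    """Map step: classify one data partition independently."""
    return Counter(classify(x) for x in partition)

def reduce_counts(c1, c2):
    """Reduce step: merge per-partition results."""
    return c1 + c2

partitions = [[0.1, 0.9, 0.7], [0.2, 0.3], [0.8, 0.95, 0.4, 0.6]]
totals = reduce(reduce_counts, map(map_partition, partitions))
```

Because each partition is processed independently, the map step parallelizes trivially across a cluster, which is what makes the approach viable for big data.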

  7. CLASSIFICATION ALGORITHMS FOR BIG DATA ANALYSIS, A MAP REDUCE APPROACH

    Directory of Open Access Journals (Sweden)

    V. A. Ayma

    2015-03-01

    Full Text Available For many years, the scientific community has been concerned with increasing the accuracy of different classification methods, and major achievements have been made so far. Beyond this issue, the increasing amount of data generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, performs supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool implements four classification algorithms taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes and cluster configurations demonstrate the potential of the tool, as well as the aspects that affect its performance.

  8. The improvement of approaches to the classification of risks of industrial enterprises

    Directory of Open Access Journals (Sweden)

    I.M. Posokhov

    2016-12-01

    Full Text Available The aim of this article. It is established that there is a lack of uniformity in the classification of industrial enterprise risks. The existing approaches to risk classification are analyzed and their advantages and disadvantages singled out. The existing typology of industrial enterprises' economic risks is supplemented with additional features: risks are divided into internal and external according to the scope of display, and into objective and subjective according to the nature of emergence, with further specification into subspecies. This motivates a new approach to forming the risk classification system, and the classification of industrial enterprise risks is thereby improved. The results of the analysis. The systematization of risks is based on the stages of risk classification, considering the principles of division and grouping, major functions, generalization, adjustment and addition of factors, the main classifying features, and an analysis of current approaches. The division of risks into groups, types and subtypes according to the scope of display, the nature of emergence and the type of production activity yields a new approach to forming the risk classification system. The improved systematic approach to risk classification reproduces the most likely risks of an industrial enterprise's activities, which facilitates the timely use of appropriate measures to reduce their impact. A significant advantage of the systematic approach is the possibility of eliminating the multiplicity of features and finding their rational place in the system of risks affecting the enterprise's activity. It is proposed to widen the risk classification by distinguishing the specific risks of industrial enterprises according to the classifying feature “The sphere of

  9. A Common Weight Linear Optimization Approach for Multicriteria ABC Inventory Classification

    Directory of Open Access Journals (Sweden)

    S. M. Hatefi

    2015-01-01

    Full Text Available Organizations typically employ the ABC inventory classification technique to maintain efficient control over a huge number of inventory items. The ABC inventory classification problem is the classification of a large number of items into three groups: A, very important; B, moderately important; and C, relatively unimportant. Traditional ABC classification accounts for only one criterion, namely, the annual dollar usage of the items, but other important criteria in the real world strongly affect the classification. This paper proposes a novel methodology based on a common-weight linear optimization model to solve the multiple criteria inventory classification problem. The proposed methodology classifies inventory items via a set of common weights, which is essential for a fair classification. It offers remarkable computational savings compared with existing approaches while requiring no subjective information, and it is easy for managers to apply. The proposed model is applied to an illustrative example and a case study taken from the literature. Both numerical results and qualitative comparisons with existing methods reveal several merits of the proposed approach for ABC analysis.
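The common-weight scoring idea can be illustrated with a small sketch. This is our own simplification, not the paper's optimization model: each criterion is min-max normalized, every item is scored with one shared weight vector, and the ranked items are cut into A/B/C groups by fixed fractions. All names and numbers are illustrative.

```python
def abc_classify(items, weights, a_frac=0.2, b_frac=0.3):
    """items: name -> list of criterion values (higher means more important);
    weights: one common weight per criterion, shared by every item."""
    ncrit = len(weights)
    lo = [min(v[i] for v in items.values()) for i in range(ncrit)]
    hi = [max(v[i] for v in items.values()) for i in range(ncrit)]

    def score(vals):
        # Min-max normalize each criterion, then apply the common weights.
        return sum(w * (v - l) / ((h - l) if h > l else 1.0)
                   for w, v, l, h in zip(weights, vals, lo, hi))

    ranked = sorted(items, key=lambda k: score(items[k]), reverse=True)
    n = len(ranked)
    n_a = max(1, round(a_frac * n))
    n_b = max(1, round(b_frac * n))
    return {name: ("A" if i < n_a else "B" if i < n_a + n_b else "C")
            for i, name in enumerate(ranked)}

labels = abc_classify(
    {"i1": [100, 5], "i2": [10, 1], "i3": [50, 3], "i4": [5, 2], "i5": [80, 4]},
    weights=[0.7, 0.3])
```

The fairness property the abstract highlights comes from applying the same weight vector to every item, rather than letting each item choose the weights most favourable to itself.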

  10. A Novel Imbalanced Data Classification Approach Based on Logistic Regression and Fisher Discriminant

    Directory of Open Access Journals (Sweden)

    Baofeng Shi

    2015-01-01

    Full Text Available We introduce an imbalanced data classification approach based on logistic regression significant discriminant and Fisher discriminant. First, a key indicator extraction model based on logistic regression significant discriminant and correlation analysis is derived to extract features for customer classification. Secondly, on the basis of linear weighting using the Fisher discriminant, a customer scoring model is established. A customer rating model is then constructed in which the number of customers in each rating follows a normal distribution. The performance of the proposed model and the classical SVM classification method are evaluated in terms of their ability to correctly classify consumers as default or nondefault customers. Empirical results using data on 2157 customers in financial engineering suggest that the proposed approach performs better than the SVM model in dealing with imbalanced data classification. Moreover, our approach helps banks and bond investors locate qualified customers.
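The Fisher-discriminant scoring step can be sketched in a simplified, univariate form: each feature gets a weight proportional to the separation of the class means relative to the within-class variances, and a customer's score is the weighted sum of its features. This is our illustration only; the paper's full model also involves logistic-regression-based feature selection, which is omitted here.

```python
def fisher_weights(X0, X1):
    """Signed per-feature Fisher ratio (m1 - m0) / (v0 + v1), used as
    linear scoring weights; a univariate simplification."""
    def mean(col):
        return sum(col) / len(col)

    def var(col, m):
        return sum((x - m) ** 2 for x in col) / len(col)

    weights = []
    for j in range(len(X0[0])):
        c0 = [row[j] for row in X0]
        c1 = [row[j] for row in X1]
        m0, m1 = mean(c0), mean(c1)
        denom = (var(c0, m0) + var(c1, m1)) or 1.0  # guard constant features
        weights.append((m1 - m0) / denom)
    return weights

def score(x, weights):
    # Higher scores indicate the class-1 (e.g. default-customer) profile.
    return sum(w * v for w, v in zip(weights, x))
```

Features that do not separate the classes (equal means, or no variation) receive zero weight, so they do not influence the customer score.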

  11. Developing a novel approach to analyse the regimes of temporary streams and their controls on aquatic biota

    Directory of Open Access Journals (Sweden)

    F. Gallart

    2011-10-01

    Full Text Available Temporary streams are those watercourses that undergo the recurrent cessation of flow or the complete drying of their channel. The biological communities in temporary stream reaches depend strongly on the temporal changes in the aquatic habitats determined by the hydrological conditions. The structural and functional characteristics of the aquatic fauna therefore cannot be used to assess the ecological quality of a temporary stream reach without taking into account the controls imposed by the hydrological regime. This paper develops methods for analysing the aquatic regimes of temporary streams, based on the definition of six aquatic states that summarize the sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: flood, riffles, connected, pools, dry and arid. We used water discharge records from gauging stations, or simulations using rainfall-runoff models, to infer the temporal patterns of occurrence of these states in the form of an aquatic states frequency graph. The visual analysis of this graph is complemented by two metrics based on the permanence of flow and the seasonal predictability of zero-flow periods. Finally, a classification of the aquatic regimes of temporary streams in terms of their influence on the development of aquatic life is put forward, defining Permanent, Temporary-pools, Temporary-dry and Episodic regime types. All these methods were tested with data from eight temporary streams around the Mediterranean from the MIRAGE project, and their application was a precondition for assessing the ecological quality of these streams using the methods currently prescribed in the European Water Framework Directive for macroinvertebrate communities.
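The two metrics mentioned, permanence of flow and the seasonal predictability of zero-flow periods, can be approximated from a daily discharge series along the following lines. This is an illustrative sketch; the paper's exact metric definitions differ in detail.

```python
from collections import defaultdict

def flow_permanence(daily):
    """daily: list of (month, discharge). Fraction of days with flow."""
    return sum(1 for _, q in daily if q > 0) / len(daily)

def monthly_zero_flow(daily):
    """Fraction of zero-flow days per month, a basis for judging the
    seasonal predictability of dry periods."""
    days = defaultdict(int)
    zeros = defaultdict(int)
    for month, q in daily:
        days[month] += 1
        if q == 0:
            zeros[month] += 1
    return {m: zeros[m] / days[m] for m in days}
```

A stream whose zero-flow days concentrate in the same months every year (monthly fractions near 0 or 1) has a highly predictable dry season, whereas fractions near 0.5 spread across months indicate unpredictable drying.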

  12. Developing a novel approach to analyse the regimes of temporary streams and their controls on aquatic biota

    Science.gov (United States)

    Gallart, F.; Prat, N.; García-Roger, E. M.; Latron, J.; Rieradevall, M.; Llorens, P.; Barberá, G. G.; Brito, D.; de Girolamo, A. M.; Lo Porto, A.; Neves, R.; Nikolaidis, N. P.; Perrin, J. L.; Querner, E. P.; Quiñonero, J. M.; Tournoud, M. G.; Tzoraki, O.; Froebrich, J.

    2011-10-01

    Temporary streams are those watercourses that undergo the recurrent cessation of flow or the complete drying of their channel. The biological communities in temporary stream reaches depend strongly on the temporal changes in the aquatic habitats determined by the hydrological conditions. The structural and functional characteristics of the aquatic fauna therefore cannot be used to assess the ecological quality of a temporary stream reach without taking into account the controls imposed by the hydrological regime. This paper develops methods for analysing the aquatic regimes of temporary streams, based on the definition of six aquatic states that summarize the sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: flood, riffles, connected, pools, dry and arid. We used water discharge records from gauging stations, or simulations using rainfall-runoff models, to infer the temporal patterns of occurrence of these states in the form of an aquatic states frequency graph. The visual analysis of this graph is complemented by two metrics based on the permanence of flow and the seasonal predictability of zero-flow periods. Finally, a classification of the aquatic regimes of temporary streams in terms of their influence on the development of aquatic life is put forward, defining Permanent, Temporary-pools, Temporary-dry and Episodic regime types. All these methods were tested with data from eight temporary streams around the Mediterranean from the MIRAGE project, and their application was a precondition for assessing the ecological quality of these streams using the methods currently prescribed in the European Water Framework Directive for macroinvertebrate communities.

  13. A New Approach Using Data Envelopment Analysis for Ranking Classification Algorithms

    Directory of Open Access Journals (Sweden)

    A. Bazleh

    2011-01-01

    Full Text Available Problem statement: A variety of methods and algorithms for classification problems have been developed recently, but the main question is how to select an appropriate and effective classification algorithm; this has always been an important and difficult issue. Approach: Since the classification algorithm selection task needs to examine more than one criterion, such as accuracy and computational time, it can be modeled and ranked by the Data Envelopment Analysis (DEA) technique. Results: In this study, 44 standard databases were modeled with 7 well-known classification algorithms, and the algorithms were examined by the accreditation method. Conclusion/Recommendation: The results indicate that Data Envelopment Analysis (DEA) is an appropriate tool for evaluating classification algorithms.

  14. Thermal form factor approach to the ground-state correlation functions of the XXZ chain in the antiferromagnetic massive regime

    CERN Document Server

    Dugave, Maxime; Kozlowski, Karol K; Suzuki, Junji

    2016-01-01

    We use the form factors of the quantum transfer matrix in the zero-temperature limit in order to study the two-point ground-state correlation functions of the XXZ chain in the antiferromagnetic massive regime. We obtain novel form factor series representations of the correlation functions which differ from those derived either from the q-vertex-operator approach or from the algebraic Bethe Ansatz approach to the usual transfer matrix. We advocate that our novel representations are numerically more efficient and allow for a straightforward calculation of the large-distance asymptotic behaviour of the two-point functions. Keeping control over the temperature corrections to the two-point functions we see that these are of order $T^\\infty$ in the whole antiferromagnetic massive regime. The isotropic limit of our result yields a novel form factor series representation for the two-point correlation functions of the XXX chain at zero magnetic field.

  15. A bayesian approach to classification criteria for spectacled eiders

    Science.gov (United States)

    Taylor, B.L.; Wade, P.R.; Stehn, R.A.; Cochrane, J.F.

    1996-01-01

    To facilitate decisions to classify species according to risk of extinction, we used Bayesian methods to analyze trend data for the Spectacled Eider, an arctic sea duck. Trend data from three independent surveys of the Yukon-Kuskokwim Delta were analyzed individually and in combination to yield posterior distributions for population growth rates. We used classification criteria developed by the recovery team for Spectacled Eiders that seek to equalize errors of under- or overprotecting the species. We conducted both a Bayesian decision analysis and a frequentist (classical statistical inference) decision analysis. Bayesian decision analyses are computationally easier, yield basically the same results, and yield results that are easier to explain to nonscientists. With the exception of the aerial survey analysis of the 10 most recent years, both Bayesian and frequentist methods indicated that an endangered classification is warranted. The discrepancy between surveys warrants further research. Although the trend data are abundance indices, we used a preliminary estimate of absolute abundance to demonstrate how to calculate extinction distributions using the joint probability distributions for population growth rate and variance in growth rate generated by the Bayesian analysis. Recent apparent increases in abundance highlight the need for models that apply to declining and then recovering species.
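The Bayesian analysis of trend data can be illustrated with a grid posterior for the population growth rate: under a flat prior and a Gaussian likelihood for the observed yearly log-growth rates, the posterior probability of decline (r < 0) falls out directly, and a classification criterion can threshold it. The prior, likelihood, and numbers here are illustrative, not the recovery team's actual model or criteria.

```python
import math

def posterior_decline_prob(obs, sigma=0.05, lo=-0.3, hi=0.3, steps=601):
    """Grid posterior for growth rate r; returns P(r < 0 | obs).
    obs: observed yearly log-growth rates; sigma: assumed observation s.d."""
    grid = [lo + (hi - lo) * i / (steps - 1) for i in range(steps)]
    # Flat prior on r, Gaussian likelihood for each observation.
    like = [math.exp(sum(-(x - r) ** 2 / (2 * sigma ** 2) for x in obs))
            for r in grid]
    total = sum(like)
    return sum(l for r, l in zip(grid, like) if r < 0) / total
```

As the abstract notes, this kind of direct posterior statement ("the probability the population is declining is p") is computationally simple and easier to explain to nonscientists than a frequentist hypothesis test.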

  16. Human Rights Promotion through Transnational Investment Regimes: An International Political Economy Approach

    Directory of Open Access Journals (Sweden)

    Claire Cutler

    2013-05-01

    Full Text Available International investment agreements are foundational instruments in a transnational investment regime that governs how states regulate the foreign-owned assets and the foreign investment activities of private actors. Over 3,000 investment agreements between states govern key governmental powers and form the basis for an emerging transnational investment regime. This transnational regime significantly decentralizes, denationalizes, and privatizes decision-making and policy choices over foreign investment. Investment agreements set limits to state action in a number of areas of vital public concern, including the protection of human and labour rights, the environment, and sustainable development. They determine the distribution of power between foreign investors and host states and their societies. However, the societies in which they operate seldom have any input into the terms or operation of these agreements, raising crucial questions of their democratic legitimacy as mechanisms of governance. This paper draws on political science and law to explore the political economy of international investment agreements and asks whether these agreements are potential vehicles for promoting international human rights. The analysis provides an historical account of the investment regime, while a review of the political economy of international investment agreements identifies what appears to be a paradox at the core of their operation. It then examines contract theory for insight into this apparent paradox and considers whether investment agreements are suitable mechanisms for advancing international human rights.

  17. A law & economics approach to the study of integrated management regimes of estuaries

    NARCIS (Netherlands)

    Griendt, van de Wim

    2004-01-01

    In this paper it is proposed to analyse legal regimes for integrated management of estuaries with the help of institutional legal theory and the Schlager & Ostrom framework for types of ownership. Estuaries are highly valued and valuable and therefore need protection. The problem is that they qualif

  19. A deep learning approach to the classification of 3D CAD models

    Institute of Scientific and Technical Information of China (English)

    Fei-wei QIN; Lu-ye LI; Shu-ming GAO; Xiao-ling YANG; Xiang CHEN

    2014-01-01

    Model classification is essential to the management and reuse of 3D CAD models. Manual model classification is laborious and error prone, while automatic classification methods are scarce due to the intrinsic complexity of 3D CAD models. In this paper, we propose an automatic 3D CAD model classification approach based on deep neural networks. According to prior knowledge of the CAD domain, features are first selected and extracted from 3D CAD models, then pre-processed as high-dimensional input vectors for category recognition. By analogy with the thinking process of engineers, a deep neural network classifier for 3D CAD models is constructed with the aid of deep learning techniques. To obtain an optimal solution, multiple strategies are appropriately chosen and applied in the training phase, which makes our classifier achieve better performance. We demonstrate the efficiency and effectiveness of our approach through experiments on 3D CAD model datasets.

  20. Automatic training sample selection for a multi-evidence based crop classification approach

    DEFF Research Database (Denmark)

    Chellasamy, Menaka; Ferre, Ty; Greve, Mogens Humlekrog

    An approach that uses available agricultural parcel information to automatically select training samples for crop classification is investigated. Previous research addressed the multi-evidence crop classification approach using an ensemble classifier: confidence measures were first produced by three Multi-Layer Perceptron (MLP) neural networks trained separately with spectral, texture and vegetation indices, and classification labels were then assigned based on Endorsement Theory. The present study proposes an approach to feed this ensemble classifier with automatically selected training samples. The approach uses the spectral, texture and indices domains in an ensemble framework to iteratively remove mislabeled pixels from the crop clusters declared by the farmers. Once the clusters are refined, the selected border samples are used for final learning and the unknown samples...

  1. An approach for detection and family classification of malware based on behavioral analysis

    DEFF Research Database (Denmark)

    Hansen, Steven Strandlund; Larsen, Thor Mark Tampus; Stevanovic, Matija

    2016-01-01

    We propose a novel approach for detecting malware and classifying it as either known or novel, i.e., a previously unseen malware family. The approach relies on a Random Forests classifier for performing both malware detection and family classification. Furthermore, the proposed approach employs novel... Behavioral traces of the analyzed samples were harvested in a time-efficient manner using a modified version of the Cuckoo sandbox. The proposed system achieves a high malware detection rate and promising predictive performance in family classification, opening the possibility of coping...

  2. Novel approaches for the molecular classification of prostate cancer

    Institute of Scientific and Technical Information of China (English)

    Robert H. Getzenberg

    2010-01-01

    Among the urologic cancers, prostate cancer is by far the most common, and it appears to have the potential to affect almost all men throughout the world as they age. A number of studies have shown that many men with prostate cancer will not die from their disease, but rather with it, dying from other causes. These men have a form of prostate cancer that is described as "very low risk" and has often been called indolent. There is, however, a group of men who have a form of prostate cancer that is much more aggressive and life threatening. Unlike for other cancer types, we have few tools for the molecular classification of prostate cancer.

  3. A CNN Based Approach for Garments Texture Design Classification

    Directory of Open Access Journals (Sweden)

    S.M. Sofiqul Islam

    2017-05-01

    Full Text Available Automatically identifying garment texture designs for recommending fashion trends is important nowadays because of the rapid growth of online shopping. By learning the properties of images efficiently, a machine can achieve better classification accuracy. Several hand-engineered feature coding methods exist for identifying garment design classes. Recently, Deep Convolutional Neural Networks (CNNs) have shown better performance in various object recognition tasks. A deep CNN uses multiple levels of representation and abstraction, which helps a machine understand the data more accurately. In this paper, a CNN model for identifying garment design classes is proposed. Experimental results on two different datasets show better results than two well-known existing CNN models (AlexNet and VGGNet) and some state-of-the-art hand-engineered feature extraction methods.

  4. Syndromic classification of rickettsioses: an approach for clinical practice

    Directory of Open Access Journals (Sweden)

    Álvaro A. Faccini-Martínez

    2014-11-01

    Full Text Available Rickettsioses share common clinical manifestations, such as fever, malaise, exanthema, the presence or absence of an inoculation eschar, and lymphadenopathy. Some of these manifestations can be suggestive of infection by a particular Rickettsia species. Nevertheless, none of these manifestations is pathognomonic, and direct diagnostic methods to confirm the involved species are always required. A syndrome is a set of signs and symptoms that characterizes a disease with many etiologies or causes. This situation is applicable to rickettsioses, where different species can cause similar clinical presentations. We propose a syndromic classification for these diseases: exanthematic rickettsiosis syndrome with a low probability of inoculation eschar, and rickettsiosis syndrome with a probability of inoculation eschar, and their variants. In doing so, we take into account the clinical manifestations, the geographic origin, and the possible vector involved, in order to provide physicians with a guide to the most probable etiological agent.

  5. Characterizing Vocal Repertoires—Hard vs. Soft Classification Approaches

    Science.gov (United States)

    Wadewitz, Philip; Hammerschmidt, Kurt; Battaglia, Demian; Witt, Annette; Wolf, Fred; Fischer, Julia

    2015-01-01

    To understand the proximate and ultimate causes that shape acoustic communication in animals, objective characterizations of the vocal repertoire of a given species are critical, as they provide the foundation for comparative analyses among individuals, populations and taxa. Progress in this field has been hampered by a lack of standardized methodology, however. One problem is that researchers may settle on different variables to characterize the calls, which may impact the classification of calls. More importantly, there is no agreement on how best to characterize the overall structure of the repertoire in terms of the amount of gradation within and between call types. Here, we address these challenges by examining 912 calls recorded from wild chacma baboons (Papio ursinus). We extracted 118 acoustic variables from spectrograms, from which we constructed different sets of acoustic features containing 9, 38, and 118 variables, as well as 19 factors derived from principal component analysis. We compared and validated the resulting classifications of k-means and hierarchical clustering. Datasets with a higher number of acoustic features led to better clustering results than datasets with only a few features. The use of factors in the cluster analysis resulted in an extremely poor resolution of emerging call types. Another important finding is that none of the applied clustering methods gave strong support to a specific cluster solution. Instead, the cluster analysis revealed that within distinct call types, subtypes may exist. Because hard clustering methods are not well suited to capture such gradation within call types, we applied a fuzzy clustering algorithm. We found that this algorithm provides a detailed and quantitative description of the gradation within and between chacma baboon call types. In conclusion, we suggest that fuzzy clustering should be used in future studies to analyze the graded structure of vocal repertoires. Moreover, the use of factor analyses to …
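
The hard-vs-soft distinction the abstract draws can be sketched with a minimal fuzzy c-means on toy one-dimensional "calls": unlike k-means, each call receives a graded membership in every cluster, so intermediate calls are not forced into one type. This is an illustrative stand-in, not the authors' implementation (which used multi-dimensional acoustic features).

```python
def fuzzy_c_means(points, m=2.0, iters=30):
    """Two-cluster fuzzy c-means on 1-D data (soft classification sketch).

    Returns (centers, memberships); memberships[k][i] is the degree to
    which point k belongs to cluster i, so graded/intermediate calls get
    split memberships instead of a single hard label.
    """
    centers = [min(points), max(points)]            # deterministic init
    u = [[0.0, 0.0] for _ in points]
    for _ in range(iters):
        # update memberships from distances to the current centers
        for k, x in enumerate(points):
            d = [abs(x - c) + 1e-9 for c in centers]
            for i in range(2):
                u[k][i] = 1.0 / sum((d[i] / dj) ** (2.0 / (m - 1.0)) for dj in d)
        # update centers as membership-weighted means
        for i in range(2):
            w = [u[k][i] ** m for k in range(len(points))]
            centers[i] = sum(wk * x for wk, x in zip(w, points)) / sum(w)
    return centers, u

# two clear call types plus one graded, intermediate call (index 6)
calls = [0.0, 0.1, 0.2, 2.0, 2.1, 2.2, 1.1]
centers, u = fuzzy_c_means(calls)
```

The intermediate call ends up with roughly equal membership in both clusters, which is exactly the gradation information a hard k-means label would discard.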

  6. Characterizing Vocal Repertoires--Hard vs. Soft Classification Approaches.

    Directory of Open Access Journals (Sweden)

    Philip Wadewitz

    Full Text Available To understand the proximate and ultimate causes that shape acoustic communication in animals, objective characterizations of the vocal repertoire of a given species are critical, as they provide the foundation for comparative analyses among individuals, populations and taxa. Progress in this field has been hampered by a lack of standardized methodology, however. One problem is that researchers may settle on different variables to characterize the calls, which may impact the classification of calls. More importantly, there is no agreement on how best to characterize the overall structure of the repertoire in terms of the amount of gradation within and between call types. Here, we address these challenges by examining 912 calls recorded from wild chacma baboons (Papio ursinus). We extracted 118 acoustic variables from spectrograms, from which we constructed different sets of acoustic features containing 9, 38, and 118 variables, as well as 19 factors derived from principal component analysis. We compared and validated the resulting classifications of k-means and hierarchical clustering. Datasets with a higher number of acoustic features led to better clustering results than datasets with only a few features. The use of factors in the cluster analysis resulted in an extremely poor resolution of emerging call types. Another important finding is that none of the applied clustering methods gave strong support to a specific cluster solution. Instead, the cluster analysis revealed that within distinct call types, subtypes may exist. Because hard clustering methods are not well suited to capture such gradation within call types, we applied a fuzzy clustering algorithm. We found that this algorithm provides a detailed and quantitative description of the gradation within and between chacma baboon call types. In conclusion, we suggest that fuzzy clustering should be used in future studies to analyze the graded structure of vocal repertoires. Moreover, the use of …

  7. Unified Kinetic Approach for Simulation of Gas Flows in Rarefied and Continuum Regimes

    Science.gov (United States)

    2007-06-01

    …a low-speed flow induced by temperature gradients. The nonuniform boundary temperature distribution can induce flows in a reactor: a significant flow… Rotational Spectrum and Molecular Interaction Potential, ibid.; R. R. Arslanbekov and V. I. Kolobov, Simulation of Low Pressure Plasma Processing Reactors…; Microchannel flow in the slip regime: gas-kinetic BGK-Burnett solutions, J. Fluid Mech. 513, 87 (2004); R. L. Bayut, PhD thesis, MIT, 1999.

  8. A New Approach in Teaching the Features and Classifications of Invertebrate Animals in Biology Courses

    Directory of Open Access Journals (Sweden)

    Fatih SEZEK

    2013-08-01

    Full Text Available This study examined the effectiveness of a new learning approach for teaching the classification of invertebrate animals in biology courses. In this approach, we used an impersonal style: the subject jigsaw, which differs from other jigsaws in that both course topics and student groups are divided. Students in the jigsaw group were divided into five subgroups, since teaching the features and classification of invertebrate animals is divided into five subtopics (modules A, B, C, D and E). The subtopics concern the characteristics used in the classification of invertebrate animals and the fundamental structures of the phyla Porifera and Cnidaria (module A), annelids (module B), mollusks (module C), arthropods (module D) and echinoderms (module E). The data obtained in the tests indicated that the new learning approach was more successful than teacher-centered learning.

  9. Automated classification of histopathology images of prostate cancer using a Bag-of-Words approach

    Science.gov (United States)

    Sanghavi, Foram M.; Agaian, Sos S.

    2016-05-01

    The goals of this paper are (1) to test computer-aided classification of prostate cancer histopathology images based on the Bag-of-Words (BoW) approach, (2) to evaluate the classification performance for grades 3 and 4 of the proposed method against the results of the approach proposed by Khurd et al. in [9], and (3) to classify the different grades of cancer, namely grades 0, 3, 4, and 5, using the proposed approach. The system performance is assessed on 132 prostate cancer histopathology images of different grades. The performance of SURF features is also analyzed by comparing the results with SIFT features using different cluster sizes. The results show 90.15% accuracy in detection of prostate cancer images using SURF features with 75 clusters for k-means clustering. The results showed higher sensitivity for SURF-based BoW classification compared to SIFT-based BoW.
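
The BoW pipeline described here (cluster local descriptors into a codebook with k-means, represent each image as a histogram of visual words, then classify histograms) can be sketched end-to-end on toy data. The 1-D "descriptors", grade labels, and 1-NN classifier below are illustrative stand-ins for the paper's SURF/SIFT descriptors and its classifier.

```python
def two_means(values, iters=20):
    """Toy 1-D k-means (k=2) standing in for the visual-word codebook."""
    c = [min(values), max(values)]
    for _ in range(iters):
        groups = [[], []]
        for v in values:
            groups[0 if abs(v - c[0]) <= abs(v - c[1]) else 1].append(v)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return c

def bow_histogram(descriptors, codebook):
    """Count how many local descriptors fall on each visual word."""
    hist = [0] * len(codebook)
    for d in descriptors:
        hist[min(range(len(codebook)), key=lambda i: abs(d - codebook[i]))] += 1
    return hist

def classify(test_desc, train, codebook):
    """1-NN on L1 distance between BoW histograms."""
    h = bow_histogram(test_desc, codebook)
    def dist(item):
        return sum(abs(a - b) for a, b in zip(h, bow_histogram(item[0], codebook)))
    return min(train, key=dist)[1]

# hypothetical 1-D stand-ins for SURF descriptors of two cancer grades
train = [([0.1, 0.2, 0.1, 4.9], "grade 3"), ([5.1, 4.8, 0.2, 5.0], "grade 4")]
codebook = two_means([d for img, _ in train for d in img])
label = classify([0.0, 0.3, 0.2, 0.1], train, codebook)
```

The same structure scales to the paper's setting by swapping in 64-dimensional SURF descriptors and 75 cluster centers.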

  10. Classification of real farm conditions Iberian pigs according to the feeding regime with multivariate models developed by using fatty acids composition or NIR spectral data

    Directory of Open Access Journals (Sweden)

    De Pedro, Emiliano

    2009-07-01

    Full Text Available Multivariate classification models to classify Iberian pigs reared under real farm conditions, according to feeding regime, were developed using fatty acid composition or NIR spectral data of liquid fat samples. A total of 121 subcutaneous fat samples were taken from Iberian pig carcasses belonging to 5 batches reared under different feeding systems. Once the liquid sample was extracted from each subcutaneous fat sample, the percentages of 11 fatty acids (C14:0, C16:0, C16:1, C17:0, C17:1, C18:0, C18:1, C18:2, C18:3, C20:0 and C20:1) were determined. At the same time, the Near Infrared (NIR) spectrum of each liquid sample was obtained. Linear Discriminant Analysis (LDA) was used as the pattern recognition method to develop the multivariate models. Classification errors of the LDA models were 0.0% for the model generated using NIR spectral data and 1.7% for the model generated using fatty acid composition. The results confirm the possibility of discriminating liquid fat samples from Iberian pigs reared under different feeding regimes on real farms using NIR spectral data or fatty acid composition. Classification errors obtained with models generated from NIR spectral data were lower than those obtained with models based on fatty acid composition.
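
The core of the LDA approach used here is Fisher's linear discriminant: project samples onto w = Sw⁻¹(mₐ - m_b), where Sw is the pooled within-class scatter, and assign the class whose projected mean is closer. The sketch below uses hypothetical 2-D features standing in for the 11 fatty-acid percentages, and two feeding-regime classes labeled "A" and "B".

```python
def lda_classify(point, class_a, class_b):
    """Two-class linear discriminant on 2-D samples: project on
    w = Sw^-1 (ma - mb) and assign the class whose projected mean is closer."""
    def mean(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
    ma, mb = mean(class_a), mean(class_b)
    # pooled within-class scatter matrix Sw (2x2, symmetric)
    sxx = sxy = syy = 0.0
    for pts, m in ((class_a, ma), (class_b, mb)):
        for x, y in pts:
            dx, dy = x - m[0], y - m[1]
            sxx += dx * dx; sxy += dx * dy; syy += dy * dy
    det = sxx * syy - sxy * sxy           # invert the 2x2 matrix analytically
    dmx, dmy = ma[0] - mb[0], ma[1] - mb[1]
    w = ((syy * dmx - sxy * dmy) / det, (sxx * dmy - sxy * dmx) / det)
    proj = lambda p: w[0] * p[0] + w[1] * p[1]
    x = proj(point)
    return "A" if abs(x - proj(ma)) <= abs(x - proj(mb)) else "B"

# hypothetical fat-composition features for two feeding regimes
acorn = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1)]   # class "A"
feed  = [(3.0, 3.0), (3.1, 2.9), (2.9, 3.2)]   # class "B"
label = lda_classify((1.1, 1.0), acorn, feed)
```

With 11 fatty acids (or full NIR spectra) the same formula applies, only with a larger scatter matrix solved numerically rather than analytically.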

  11. Decision tree approach for classification of remotely sensed satellite data using open source support

    Indian Academy of Sciences (India)

    Richa Sharma; Aniruddha Ghosh; P K Joshi

    2013-10-01

    In this study, an attempt has been made to develop a decision tree classification (DTC) algorithm for the classification of remotely sensed satellite data (Landsat TM) using open source support. The decision tree is constructed by recursively partitioning the spectral distribution of the training dataset using WEKA, an open source data mining software package. The classified image is compared with images classified using the classical ISODATA clustering and Maximum Likelihood Classifier (MLC) algorithms. The classification result based on the DTC method provided a better visual depiction than the results produced by ISODATA clustering or the MLC algorithm. The overall accuracy was found to be 90% (kappa = 0.88) using DTC, 76.67% (kappa = 0.72) using Maximum Likelihood and 57.5% (kappa = 0.49) using the ISODATA clustering method. Based on overall accuracy and kappa statistics, DTC was found to be the preferred classification approach.
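
The recursive partitioning that WEKA performs on the spectral distribution can be sketched with a minimal Gini-based decision tree over a single hypothetical band value per pixel; the band values, thresholds, and class labels below are illustrative, not from the study.

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def build_tree(samples):
    """Recursively partition (value, label) samples on the best threshold."""
    labels = [lab for _, lab in samples]
    if len(set(labels)) == 1:
        return labels[0]                     # pure leaf: stop splitting
    values = sorted({v for v, _ in samples})
    best = None
    for lo, hi in zip(values, values[1:]):   # candidate thresholds: midpoints
        t = (lo + hi) / 2
        left = [lab for v, lab in samples if v <= t]
        right = [lab for v, lab in samples if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(samples)
        if best is None or score < best[0]:
            best = (score, t)
    t = best[1]
    return (t,
            build_tree([s for s in samples if s[0] <= t]),
            build_tree([s for s in samples if s[0] > t]))

def predict(tree, v):
    while isinstance(tree, tuple):
        tree = tree[1] if v <= tree[0] else tree[2]
    return tree

# hypothetical single-band reflectance values with illustrative labels
train = [(0.10, "water"), (0.15, "water"), (0.50, "soil"),
         (0.55, "soil"), (0.90, "vegetation"), (0.95, "vegetation")]
tree = build_tree(train)
```

Real DTC on Landsat TM would split over all six reflective bands, but the greedy impurity-minimizing recursion is the same.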

  12. Intelligence system based classification approach for medical disease diagnosis

    Science.gov (United States)

    Sagir, Abdu Masanawa; Sathasivam, Saratha

    2017-08-01

    The prediction of breast cancer in women who have no signs or symptoms of the disease, as well as of survivability after undergoing surgery, has been a challenging problem for medical researchers. The decision about the presence or absence of disease often depends on the physician's intuition, experience and skill in comparing current indicators with previous ones, rather than on the knowledge-rich data hidden in a database. This is a crucial and challenging task. The goal is to predict patient condition using an adaptive neuro-fuzzy inference system (ANFIS) pre-processed by grid partitioning. To achieve an accurate diagnosis at this complex stage of symptom analysis, the physician may need an efficient diagnosis system. A framework describes a methodology for designing and evaluating the classification performance of two discrete ANFIS systems with hybrid learning algorithms (least-squares estimation combined with modified Levenberg-Marquardt and with gradient descent) that can be used by physicians to accelerate the diagnosis process. The proposed method's performance was evaluated on training and test data from the mammographic mass and Haberman's survival datasets obtained from the benchmark University of California at Irvine (UCI) machine learning repository. The robustness of the performance measures total accuracy, sensitivity and specificity is examined. In comparison, the proposed method achieves superior performance relative to the conventional gradient-descent-based ANFIS and some related existing methods. The software used for the implementation is MATLAB R2014a (version 8.3), executed on a PC with an Intel Pentium IV E7400 processor at 2.80 GHz and 2.0 GB of RAM.
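
At inference time, an ANFIS of this kind reduces to Sugeno fuzzy inference: each rule fires with a strength given by its membership functions, and the output is the firing-strength-weighted average of the rule consequents. The zero-order sketch below shows only that forward pass with two illustrative rules; it omits the grid partitioning and the hybrid least-squares/gradient training that ANFIS adds.

```python
def membership_low(x):
    """Illustrative 'low' membership on [0, 1]: full at 0, zero at 1."""
    return max(0.0, 1.0 - x)

def membership_high(x):
    """Illustrative 'high' membership on [0, 1]."""
    return max(0.0, min(1.0, x))

def sugeno_infer(x, out_low=2.0, out_high=10.0):
    """Zero-order Sugeno inference: weighted average of rule consequents.

    Rule 1: if x is low  then y = out_low
    Rule 2: if x is high then y = out_high
    """
    w_low, w_high = membership_low(x), membership_high(x)
    return (w_low * out_low + w_high * out_high) / (w_low + w_high)
```

In the trained system, both the membership-function parameters and the consequents would be fitted to the UCI data rather than fixed by hand.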

  13. Innovation approaches to controlling the electric regimes of electric arc furnaces

    Science.gov (United States)

    Bikeev, R. A.; Serikov, V. A.; Ognev, A. M.; Rechkalov, A. V.; Cherednichenko, V. S.

    2015-12-01

    The processes of current passage in an ac electric arc furnace (EAF) are subjected to industrial experiments and mathematical simulation. It is shown that, when a charge is melted, arcs between charge fragments exist in series with main arc discharges, and these arcs influence the stability of the main arc discharges. The measurement of instantaneous currents and voltages allowed us to perform a real-time calculation of the electrical characteristics of a three-phase circuit and to determine the θ parameter, which characterizes the nonlinearity of the circuit segment between electrodes. Based on these studies, we created an advanced system for controlling the electric regime of EAF.

  14. A transdisciplinary approach to understanding the health effects of wildfire and prescribed fire smoke regimes

    Science.gov (United States)

    Williamson, G. J.; Bowman, D. M. J. S.; Price, O. F.; Henderson, S. B.; Johnston, F. H.

    2016-12-01

    Prescribed burning is used to reduce the occurrence, extent and severity of uncontrolled fires in many flammable landscapes. However, epidemiologic evidence of the human health impacts of landscape fire smoke emissions is shaping fire management practice through increasingly stringent environmental regulation and public health policy. An unresolved question, critical for sustainable fire management, concerns the comparative human health effects of smoke from wild and prescribed fires. Here we review current knowledge of the health effects of landscape fire emissions and consider the similarities and differences in smoke from wild and prescribed fires with respect to the typical combustion conditions and fuel properties, the quality and magnitude of air pollution emissions, and the potential for dispersion to large populations. We further examine the interactions between these considerations, and how they may shape the longer term smoke regimes to which populations are exposed. We identify numerous knowledge gaps and propose a conceptual framework that describes pathways to better understanding of the health trade-offs of prescribed and wildfire smoke regimes.

  15. Evaluating machine learning classification for financial trading: An empirical approach

    OpenAIRE

    Gerlein, EA; McGinnity, M; Belatreche, A; Coleman, S.

    2016-01-01

    Technical and quantitative analysis in financial trading use mathematical and statistical tools to help investors decide on the optimum moment to initiate and close orders. While these traditional approaches have served their purpose to some extent, new techniques arising from the field of computational intelligence such as machine learning and data mining have emerged to analyse financial information. While the main financial engineering research has focused on complex computational models s...

  16. New approaches for the financial distress classification in agribusiness

    Directory of Open Access Journals (Sweden)

    Jan Vavřina

    2013-01-01

    Full Text Available After the recent financial crisis, the need for robust tools for evaluating the financial health of enterprises has become even more apparent. Apart from well-known techniques such as Z-score and logit models, new approaches have been suggested, namely a data envelopment analysis (DEA) reformulation for bankruptcy prediction and production function-based economic performance evaluation (PFEP). Being recent suggestions, these techniques have not yet been validated for common use in the financial sector, although for the DEA approach some introductory studies are available for the manufacturing and IT industries. In this contribution we focus on thorough validation calculations that evaluate these techniques for the specific agribusiness industry. To keep the data as homogeneous as possible, we limit the choice of agribusiness companies to the countries of the Visegrad Group. An extensive data set covering several hundred enterprises was collected using the Amadeus database of Bureau van Dijk. We present the validation results for each of the four methods, outline the strengths and weaknesses of each approach and discuss suggestions for the effective detection of financial problems in the specific branch of agribusiness.

  17. A novel deep learning approach for classification of EEG motor imagery signals

    Science.gov (United States)

    Rezaei Tabar, Yousef; Halici, Ugur

    2017-02-01

    Objective. Signal classification is an important issue in brain-computer interface (BCI) systems. Deep learning approaches have been used successfully in many recent studies to learn features and classify different types of data. However, the number of studies that employ these approaches in BCI applications is very limited. In this study we aim to use deep learning methods to improve the classification performance of EEG motor imagery signals. Approach. We investigate convolutional neural networks (CNN) and stacked autoencoders (SAE) to classify EEG motor imagery signals. A new form of input is introduced that combines the time, frequency and location information extracted from the EEG signal, and it is used in a CNN with one 1D convolutional layer and one max-pooling layer. We also propose a new deep network combining CNN and SAE, in which the features extracted by the CNN are classified through the deep SAE network. Main results. The classification performance obtained by the proposed method on BCI competition IV dataset 2b, in terms of kappa value, is 0.547. Our approach yields a 9% improvement over the winning algorithm of the competition. Significance. Our results show that deep learning methods provide better classification performance compared to other state-of-the-art approaches. These methods can be applied successfully to BCI systems where the amount of data is large due to daily recording.
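
The "one 1D convolutional layer and one max-pooling layer" front end can be sketched as a plain forward pass: slide a kernel over the signal, apply a nonlinearity, then downsample by taking window maxima. The toy signal and kernel below are illustrative; the paper's network learns its kernels from the combined time/frequency/location input.

```python
def conv1d(signal, kernel):
    """Valid 1-D convolution (cross-correlation, as in CNN layers)."""
    n = len(signal) - len(kernel) + 1
    return [sum(signal[i + j] * k for j, k in enumerate(kernel)) for i in range(n)]

def relu(xs):
    """Elementwise rectified linear activation."""
    return [max(0.0, x) for x in xs]

def max_pool(xs, size=2):
    """Non-overlapping max pooling with the given window size."""
    return [max(xs[i:i + size]) for i in range(0, len(xs) - size + 1, size)]

# toy 'EEG' samples convolved with an edge-detecting kernel
features = max_pool(relu(conv1d([1, 3, 2, 5, 4], [1, -1])))
```

In the full model these pooled features would feed the stacked autoencoder, which performs the actual classification.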

  18. Evaluation of toroidal torque by non-resonant magnetic perturbations in tokamaks for resonant transport regimes using a Hamiltonian approach

    Science.gov (United States)

    Albert, Christopher G.; Heyn, Martin F.; Kapper, Gernot; Kasilov, Sergei V.; Kernbichler, Winfried; Martitsch, Andreas F.

    2016-08-01

    Toroidal torque generated by neoclassical viscosity caused by external non-resonant, non-axisymmetric perturbations has a significant influence on toroidal plasma rotation in tokamaks. In this article, a derivation for the expressions of toroidal torque and radial transport in resonant regimes is provided within quasilinear theory in canonical action-angle variables. The proposed approach treats all low-collisional quasilinear resonant neoclassical toroidal viscosity regimes including superbanana-plateau and drift-orbit resonances in a unified way and allows for magnetic drift in all regimes. It is valid for perturbations on toroidally symmetric flux surfaces of the unperturbed equilibrium without specific assumptions on geometry or aspect ratio. The resulting expressions are shown to match the existing analytical results in the large aspect ratio limit. Numerical results from the newly developed code NEO-RT are compared to calculations by the quasilinear version of the code NEO-2 at low collisionalities. The importance of the magnetic shear term in the magnetic drift frequency and a significant effect of the magnetic drift on drift-orbit resonances are demonstrated.

  19. Head Pose Estimation on Eyeglasses Using Line Detection and Classification Approach

    Science.gov (United States)

    Setthawong, Pisal; Vannija, Vajirasak

    This paper proposes a unique approach for head pose estimation of subjects with eyeglasses by using a combination of line detection and classification approaches. Head pose estimation is considered as an important non-verbal form of communication and could also be used in the area of Human-Computer Interface. A major improvement of the proposed approach is that it allows estimation of head poses at a high yaw/pitch angle when compared with existing geometric approaches, does not require expensive data preparation and training, and is generally fast when compared with other approaches.

  20. Sows’ activity classification device using acceleration data – A resource constrained approach

    DEFF Research Database (Denmark)

    Marchioro, Gilberto Fernandes; Cornou, Cécile; Kristensen, Anders Ringgaard

    2011-01-01

    This paper discusses the main architectural alternatives and design decisions in order to implement a sows' activity classification model on electronic devices. The different possibilities are analyzed in practical and technical aspects, focusing on implementation metrics like cost…, performance, complexity and reliability. The target architectures are divided into: server based, where the main processing element is a central computer; and embedded based, where the processing is distributed on devices attached to the animals. The initial classification model identifies the activities… of a heuristic classification approach, focusing on the resource-constrained characteristics of embedded systems. The new approach classifies the activities performed by the sows with accuracy close to 90%. It was implemented as a hardware module that can easily be instantiated to provide preprocessed…
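
A resource-constrained heuristic of the kind described can be as simple as a per-window statistic with a threshold, which is cheap enough for an embedded device. The variance feature, threshold value, and two activity labels below are illustrative assumptions, not the paper's fitted model.

```python
def classify_activity(accel_magnitudes, var_threshold=0.05):
    """Label a window of acceleration magnitudes (in g) by its variance.

    Low variance suggests a resting posture; high variance suggests the
    sow is active. The threshold is an illustrative placeholder.
    """
    n = len(accel_magnitudes)
    mean = sum(accel_magnitudes) / n
    var = sum((a - mean) ** 2 for a in accel_magnitudes) / n
    return "active" if var > var_threshold else "resting"
```

Because only a running mean and sum of squares are needed, this maps directly onto a small fixed-point hardware module.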

  1. Reconciling the eigenmode analysis with the Maxwell-Bloch equations approach to superradiance in the linear regime

    Energy Technology Data Exchange (ETDEWEB)

    Friedberg, Richard [Department of Physics, Columbia University, New York, NY 10027 (United States); Manassah, Jamal T. [HMS Consultants, Inc., PO Box 592, New York, NY 10028 (United States)], E-mail: jmanassah@gmail.com

    2008-07-28

    The superradiance from a slab of inverted two-level atoms is theoretically analyzed in the linear regime from both the perspective of the expansion in eigenfunctions of the integral equation with the Lienard-Wiechert potential as kernel, and that of linearizing the Maxwell-Bloch equations. We show the equivalence of both approaches. We show that the so-called Reduced Maxwell-Bloch equations do not yield even approximately the correct solution when applied in the obvious way, but that they can be made to give the correct solution by adding a fictitious input field.

  2. Mapping raised bogs with an iterative one-class classification approach

    Science.gov (United States)

    Mack, Benjamin; Roscher, Ribana; Stenzel, Stefanie; Feilhauer, Hannes; Schmidtlein, Sebastian; Waske, Björn

    2016-10-01

    Land use and land cover maps are among the most commonly used remote sensing products. In many applications the user only requires a map of one particular class of interest, e.g. a specific vegetation type or an invasive species. One-class classifiers are appealing alternatives to common supervised classifiers because they can be trained with labeled training data of the class of interest only. However, training an accurate one-class classification (OCC) model is challenging, particularly when facing a large image, a small class and few training samples. To tackle these problems we propose an iterative OCC approach. The presented approach uses a biased Support Vector Machine as its core classifier. In an iterative pre-classification step, a large part of the pixels not belonging to the class of interest is classified. The remaining data are classified by a final classifier with a novel model and threshold selection approach. The specific objective of our study is the classification of raised bogs in a study site in southeast Germany, using multi-seasonal RapidEye data and a small number of training samples. Results demonstrate that the iterative OCC outperforms other state-of-the-art one-class classifiers and approaches for model selection. The study highlights the potential of the proposed approach for efficient and improved mapping of small classes such as raised bogs. Overall, the proposed method constitutes a feasible and useful modification of a regular one-class classifier.
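
The pre-classification idea (quickly discard pixels that are clearly not the class of interest, then run the expensive classifier only on what remains) can be sketched with a centroid-plus-radius one-class model. This is a simplified stand-in for the paper's biased Support Vector Machine; the slack factor and toy 2-D "pixels" are illustrative.

```python
def train_occ(targets):
    """Centroid-plus-radius one-class model from positive samples only."""
    cx = sum(p[0] for p in targets) / len(targets)
    cy = sum(p[1] for p in targets) / len(targets)
    radius = max(((p[0] - cx) ** 2 + (p[1] - cy) ** 2) ** 0.5 for p in targets)
    return (cx, cy), radius

def occ_pre_classify(model, pixels, slack=2.0):
    """Pre-classification step: keep only pixels that might belong to the
    class of interest (within slack * radius of the centroid)."""
    (cx, cy), r = model
    return [p for p in pixels
            if ((p[0] - cx) ** 2 + (p[1] - cy) ** 2) ** 0.5 <= slack * r]

bog = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3)]     # class-of-interest samples
model = train_occ(bog)
candidates = occ_pre_classify(model, [(0.15, 0.1), (5.0, 5.0), (0.3, 0.2)])
```

In the iterative scheme, the surviving candidates would then be passed to the biased SVM with the proposed model and threshold selection.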

  3. Review on Electrodynamic Energy Harvesters—A Classification Approach

    Directory of Open Access Journals (Sweden)

    Roland Lausecker

    2013-04-01

    Full Text Available Beginning with a short historical sketch, electrodynamic energy harvesters, with a focus on vibration generators and volumes below 1 dm³, are reviewed. The current challenges of generating up to several milliwatts of power from practically relevant flows and vibrations are addressed, and the variety of available solutions is sketched. Sixty-seven different harvester concepts from more than 130 publications are classified with respect to excitation, additional boundary conditions, design and fabrication. A chronological list of the harvester concepts with corresponding references provides an impression of the developments. Besides resonant harvester concepts, the review includes broadband approaches and mechanisms for harvesting from flow. Finally, a short overview of harvesters in applications and first market-ready concepts is given.

  4. Wittgenstein's philosophy and a dimensional approach to the classification of mental disorders -- a preliminary scheme.

    Science.gov (United States)

    Mackinejad, Kioumars; Sharifi, Vandad

    2006-01-01

    In this paper the importance of Wittgenstein's philosophical ideas for the justification of a dimensional approach to the classification of mental disorders is discussed. Some of his basic concepts in his Philosophical Investigations, such as 'family resemblances', 'grammar' and 'language-game' and their relations to the concept of mental disorder are explored.

  5. Theoretic-methodological approaches to determine the content and classification of innovation-investment development strategies

    Directory of Open Access Journals (Sweden)

    Gerashenkova Tatyana

    2016-01-01

    Full Text Available The article states the necessity of forming an innovation-investment strategy of enterprise development, offers an approach to its classification, determines the place of this strategy in a corporate-wide strategy, and gives the methodology of formation and the realization form of the innovation-investment development strategy.

  6. A novel information transferring approach for the classification of remote sensing images

    Science.gov (United States)

    Gao, Jianqiang; Xu, Lizhong; Shen, Jie; Huang, Fengchen; Xu, Feng

    2015-12-01

    Traditional remote sensing image classification methods focus on using a large amount of labeled target data to train an efficient classification model. However, these approaches are generally based on the target data alone, without considering the wealth of auxiliary data or the additional information it carries. If the valuable information from auxiliary data could be successfully transferred to the target data, the performance of the classification model would be improved. In addition, from the perspective of practical application, this valuable information from auxiliary data should be fully used. Therefore, in this paper, based on the transfer learning idea, we propose a novel information transferring approach to improve remote sensing image classification performance. The main rationale of this approach is as follows: first, the information of the same areas associated with each pixel is modeled as the intra-class set, and the information of different areas associated with each pixel is modeled as the inter-class set; then the texture feature information of each area obtained from the auxiliary data is transferred to the target data set such that the inter-class set is separated and the intra-class set is gathered as far as possible. Experiments show that the proposed approach is effective and feasible.

  7. A Hybrid Reduction Approach for Enhancing Cancer Classification of Microarray Data

    Directory of Open Access Journals (Sweden)

    Abeer M. Mahmoud

    2014-10-01

    Full Text Available This paper presents a novel hybrid machine learning (ML) reduction approach to enhance the cancer classification accuracy of microarray data, based on two ML gene ranking techniques (T-test and Class Separability (CS)). The proposed approach is integrated with two ML classifiers, K-nearest neighbor (KNN) and support vector machine (SVM), for mining microarray gene expression profiles. Four public cancer microarray databases are used for evaluating the proposed approach and successfully accomplish the mining process. These are Lymphoma, Leukemia, SRBCT, and Lung Cancer. The strategy of selecting genes only from the training samples, totally excluding the testing samples from the classifier building process, is utilized for more accurate and validated results. The computational experiments are illustrated in detail and comprehensively presented with literature-related results. The results showed that the proposed reduction approach reached promising results in terms of the number of genes supplied to the classifiers as well as classification accuracy.
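
The T-test ranking half of the hybrid approach scores each gene by how well it separates the two classes, keeps the top-ranked genes, and hands the reduced profiles to a classifier. The sketch below pairs a Welch-style t statistic with 1-NN on toy two-gene profiles; the data and the choice of 1-NN (in place of the paper's KNN/SVM variants) are illustrative.

```python
def t_statistic(a, b):
    """Welch-style |t| statistic for one gene across two sample groups."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return abs(ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5 + 1e-12)

def rank_genes(class_a, class_b, top_k=1):
    """Return indices of the top_k genes with the largest |t| scores."""
    n_genes = len(class_a[0])
    scores = [t_statistic([s[g] for s in class_a], [s[g] for s in class_b])
              for g in range(n_genes)]
    return sorted(range(n_genes), key=lambda g: -scores[g])[:top_k]

def knn_predict(sample, train, genes):
    """1-NN on the reduced gene subset (squared Euclidean distance)."""
    def dist(item):
        return sum((sample[g] - item[0][g]) ** 2 for g in genes)
    return min(train, key=dist)[1]

# toy expression profiles: gene 0 separates the classes, gene 1 is noise
class_a = [[0.1, 3.0], [0.2, 0.5], [0.0, 2.0]]
class_b = [[5.0, 2.5], [5.2, 0.4], [4.9, 1.9]]
genes = rank_genes(class_a, class_b)
train = [(s, "A") for s in class_a] + [(s, "B") for s in class_b]
```

Note how ranking on the training samples only, as the abstract stresses, keeps the test profiles out of the gene selection step.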

  8. An ensemble classification approach for improved Land use/cover change detection

    Science.gov (United States)

    Chellasamy, M.; Ferré, T. P. A.; Humlekrog Greve, M.; Larsen, R.; Chinnasamy, U.

    2014-11-01

    Change detection (CD) methods based on post-classification comparison approaches are claimed to provide potentially reliable results. They are considered the most obvious quantitative method in the analysis of Land Use Land Cover (LULC) changes, providing "from-to" change information. However, the performance of post-classification comparison approaches depends strongly on the accuracy of classification of the individual images used for comparison. Hence, we present a classification approach that produces accurate classified results, which aids in obtaining improved change detection results. Machine learning is part of a broader framework in change detection, where neural networks have drawn much attention. Neural network algorithms adaptively estimate continuous functions from input data without a mathematical representation of the output's dependence on the input. A common practice for classification is to use a Multi-Layer Perceptron (MLP) neural network with the backpropagation learning algorithm for prediction. To increase the ability of learning and prediction, multiple inputs (spectral, texture, topography, and multi-temporal information) are generally stacked to incorporate diversity of information. On the other hand, the literature claims that the backpropagation algorithm exhibits weak and unstable learning when multiple inputs are used on complex datasets characterized by mixed uncertainty levels. To address the problem of learning complex information, we propose an ensemble classification technique that incorporates multiple inputs for classification, unlike the traditional stacking of multiple input data. In this paper, we present an Endorsement Theory based ensemble classification that integrates multiple information sources, in terms of prediction probabilities, to produce the final classification results. Three different input datasets are used in this study: spectral, texture and indices, from SPOT-4 multispectral imagery captured in 1998 and 2003. Each SPOT image is classified
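
The ensemble idea (train one classifier per input dataset and combine their per-pixel class probabilities instead of stacking raw inputs) can be sketched with simple probability averaging. Plain averaging is an illustrative stand-in for the paper's Endorsement Theory combination, and the class names and probability values are hypothetical.

```python
def combine(prob_maps):
    """Average the class-probability maps produced by several single-input
    classifiers and pick the most supported class.

    A simple stand-in for the Endorsement Theory combination rule.
    """
    classes = prob_maps[0].keys()
    avg = {c: sum(p[c] for p in prob_maps) / len(prob_maps) for c in classes}
    return max(avg, key=avg.get), avg

# hypothetical per-input predictions for one pixel
spectral = {"forest": 0.7, "crop": 0.3}
texture  = {"forest": 0.4, "crop": 0.6}
indices  = {"forest": 0.8, "crop": 0.2}
label, avg = combine([spectral, texture, indices])
```

Combining at the probability level lets a weak texture vote be overruled by confident spectral and index votes, rather than destabilizing a single jointly trained network.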

  9. In vivo application of an optical segment tracking approach for bone loading regimes recording in humans: a reliability study.

    Science.gov (United States)

    Yang, Peng-Fei; Sanno, Maximilian; Ganse, Bergita; Koy, Timmo; Brüggemann, Gert-Peter; Müller, Lars Peter; Rittweger, Jörn

    2014-08-01

This paper demonstrates an optical segment tracking (OST) approach for assessing in vivo bone loading regimes in humans. The relative movement between retro-reflective marker clusters affixed to the tibia cortex by bone screws was tracked and expressed as tibia loading regimes in terms of segment deformation. Stable in vivo fixation of the bone screws was tested by assessing the resonance frequency of the screw-marker structure and the relative marker position changes after hopping and jumping. Tibia deformation was recorded during squatting exercises to demonstrate the reliability of the OST approach. Results indicated that the resonance frequencies remained unchanged before and after all exercises. The changes in Cardan angle between marker clusters induced by the exercises were minor, at most 0.06°. The deformation angles during squatting were reproducible, with variability remaining small (0.04°/m-0.65°/m). Most importantly, all surgical and testing procedures were well tolerated. The OST method promises to bring more insight into the mechanical loading acting on bone than was previously possible.

  10. Thermal form factor approach to the ground-state correlation functions of the XXZ chain in the antiferromagnetic massive regime

    Science.gov (United States)

    Dugave, Maxime; Göhmann, Frank; Kozlowski, Karol K.; Suzuki, Junji

    2016-09-01

We use the form factors of the quantum transfer matrix in the zero-temperature limit in order to study the two-point ground-state correlation functions of the XXZ chain in the antiferromagnetic massive regime. We obtain novel form factor series representations of the correlation functions which differ from those derived either from the q-vertex-operator approach or from the algebraic Bethe Ansatz approach to the usual transfer matrix. We advocate that our novel representations are numerically more efficient and allow for a straightforward calculation of the large-distance asymptotic behaviour of the two-point functions. Keeping control over the temperature corrections to the two-point functions, we see that these are of order T^∞ in the whole antiferromagnetic massive regime. The isotropic limit of our result yields a novel form factor series representation for the two-point correlation functions of the XXX chain at zero magnetic field. Dedicated to the memory of Petr Petrovich Kulish.

  11. Response of plant functional types to changes in the fire regime in Mediterranean ecosystems: A simulation approach

    Energy Technology Data Exchange (ETDEWEB)

    Pausas, J.G. [Centro de Estudios Ambientales del Mediterraneo (CEAM), Valencia (Spain)

    1999-10-01

    In the Mediterranean basin, the climate is predicted to be warmer and effectively drier, leading to changes in fuel conditions and fire regime. Land abandonment in the Mediterranean basin is also changing the fire regime through the increase in fuel loads. In the present study, two simulation models of vegetation dynamics were tested in order to predict changes in plant functional types due to changes in fire recurrence in eastern Spain. The two modelling approaches are the FATE-model (based on vital attributes) and the gap model BROLLA (based on the gap-phase theory). The models were arranged to simulate four functional types, based mainly on their regenerative strategies after disturbance: Quercus (resprouter), Pinus (non-resprouter with serotinous cones), Erica (resprouter), and Cistus (non-resprouter with germination stimulated by fire). The simulation results suggested a decrease in Quercus abundance, an increase in Cistus and Erica, and a maximum of Pinus at intermediate recurrence scenarios. Despite their different approaches, both models predicted a similar response to increased fire recurrence, and the results were consistent with field observations.

  12. Non-canonical distribution and non-equilibrium transport beyond weak system-bath coupling regime: A polaron transformation approach

    Science.gov (United States)

    Xu, Dazhi; Cao, Jianshu

    2016-08-01

The concept of the polaron, which emerged from condensed matter physics, describes the dynamical interaction of a moving particle with its surrounding bosonic modes. This concept has been developed into a useful method for treating open quantum systems across the complete range of system-bath coupling strengths. In particular, the polaron transformation approach is valid in the intermediate coupling regime, in which the Redfield equation and Fermi's golden rule fail. In the polaron frame, the equilibrium distribution obtained by perturbative expansion deviates from the canonical distribution, going beyond the usual weak-coupling assumption in thermodynamics. A polaron transformed Redfield equation (PTRE) not only reproduces the dissipative quantum dynamics but also provides an accurate and efficient way to calculate non-equilibrium steady states. Applications of the PTRE approach to problems such as exciton diffusion, heat transport and light-harvesting energy transfer are presented.

  13. Computer-aided diagnosis of interstitial lung disease: a texture feature extraction and classification approach

    Science.gov (United States)

    Vargas-Voracek, Rene; McAdams, H. Page; Floyd, Carey E., Jr.

    1998-06-01

An approach for the classification of normal or abnormal lung parenchyma from selected regions of interest (ROIs) of chest radiographs is presented for computer aided diagnosis of interstitial lung disease (ILD). The proposed approach uses a feed-forward neural network to classify each ROI based on a set of isotropic texture measures obtained from the joint grey-level distribution of pairs of pixels separated by a specific distance. Two hundred ROIs, each 64 x 64 pixels in size (11 x 11 mm), were extracted from digitized chest radiographs for testing. Diagnostic performance was evaluated with the leave-one-out method. Classification of independent ROIs achieved a sensitivity of 90% and a specificity of 84%, with an area under the receiver operating characteristic curve of 0.85. The diagnosis for each patient was correct in all cases when a "majority vote" criterion over the classifications of the corresponding ROIs was applied to issue a normal or ILD patient classification. The proposed approach is a simple, fast, and consistent method for computer aided diagnosis of ILD with very good performance. Further research will include additional cases, including differential diagnosis among ILD manifestations.
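The isotropic texture measures described above are derived from the joint grey-level distribution of pixel pairs at a fixed separation. A minimal sketch of that computation follows; the specific features (contrast, energy, entropy) are standard co-occurrence measures and only assumed here, since the abstract does not list the exact feature set.

```python
import math

# Sketch: isotropic co-occurrence texture measures from the joint
# grey-level distribution of pixel pairs at distance d.  Contrast,
# energy and entropy are standard Haralick-style measures; the paper's
# exact feature set is not specified in the abstract.

def cooccurrence(img, d):
    """Joint distribution of grey-level pairs separated by distance d,
    accumulated over four directions (isotropic)."""
    h, w = len(img), len(img[0])
    counts = {}
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, d), (d, 0), (d, d), (d, -d)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    pair = (img[y][x], img[ny][nx])
                    counts[pair] = counts.get(pair, 0) + 1
    total = sum(counts.values())
    return {p: c / total for p, c in counts.items()}

def texture_features(img, d=1):
    p = cooccurrence(img, d)
    contrast = sum(prob * (i - j) ** 2 for (i, j), prob in p.items())
    energy = sum(prob ** 2 for prob in p.values())
    entropy = -sum(prob * math.log(prob) for prob in p.values())
    return contrast, energy, entropy

flat  = [[1, 1], [1, 1]]   # homogeneous patch: zero contrast
noisy = [[0, 3], [3, 0]]   # high-contrast patch
```

A homogeneous ROI yields zero contrast and maximal energy, while a noisy ROI scores high contrast, which is the kind of separation the neural network exploits.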

  14. A new approach to the hazard classification of alloys based on transformation/dissolution.

    Science.gov (United States)

    Skeaff, James M; Hardy, David J; King, Pierrette

    2008-01-01

Most of the metals produced for commercial application enter into service as alloys, which, together with metals and all other chemicals in commerce, are subject to a hazard identification and classification initiative now being implemented in a number of jurisdictions worldwide, including the European Union Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) initiative, effective 1 June 2007. This initiative has considerable implications for environmental protection and market access. While a method for the hazard identification and classification of metals is available in the recently developed United Nations (UN) guidance document on the Globally Harmonized System of Hazard Classification and Labelling (GHS), an approach for alloys has yet to be formulated. Within the GHS, a transformation/dissolution protocol (T/DP) for metals and sparingly soluble metal compounds is provided as a standard laboratory method for measuring the rate and extent of the release of metals into aqueous media from metal-bearing substances. By comparison with ecotoxicity reference data, T/D data can be used to derive UN GHS classification proposals. In this study we applied the T/DP for the first time to several economically important metals and alloys: iron powder, nickel powder, copper powder, and the alloys Fe-2Cu-0.6C (copper = 2%, carbon = 0.6%), Fe-2Ni-0.6C, Stainless Steel 304, Monel, brass, Inconel, and nickel-silver. The iron and copper powders and the iron and nickel powders had been sintered to produce the Fe-2Me-0.6C (Me = copper or nickel) alloys, which made them essentially resistant to reaction with the aqueous media, so they would not classify under the GHS, although their component copper and nickel metal powders would. By forming a protective passivating film, chromium in the Stainless Steel 304 and Inconel alloys protected them from reaction with the aqueous media, so that their metal releases were minimal and would not result in GHS classification.

  15. A Data Mining Approach to the Diagnosis of Tuberculosis by Cascading Clustering and Classification

    CERN Document Server

    T, Asha; Murthy, K N B

    2011-01-01

In this paper, a methodology for the automated detection and classification of Tuberculosis (TB) is presented. Tuberculosis is a disease caused by Mycobacterium, which spreads through the air and easily attacks people with weakened immune systems. Our methodology is based on clustering and classification and classifies TB into two categories: Pulmonary Tuberculosis (PTB) and retroviral PTB (RPTB), i.e., TB in patients with Human Immunodeficiency Virus (HIV) infection. Initially, K-means clustering is used to group the TB data into two clusters and assign classes to the clusters. Subsequently, multiple classification algorithms are trained on the result set to build the final classifier model, based on the K-fold cross-validation method. This methodology is evaluated using 700 raw TB records obtained from a city hospital. The best accuracy obtained was 98.7%, from a support vector machine (SVM), compared to other classifiers. The proposed approach helps doctors in their diagnosis decisions and also in their treatment planning procedures for diff...
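The cascade described above (cluster first, then train a classifier on the cluster-labeled data) can be sketched as follows. A one-dimensional feature and a nearest-centroid classifier stand in for the real multi-attribute data and the SVM; all values are hypothetical.

```python
import random

# Sketch of the clustering-classification cascade: K-means groups
# unlabeled records into two clusters, the clusters are assigned class
# labels, and a classifier is then trained on the labeled result.  A 1-D
# feature and a nearest-centroid classifier are simplifying stand-ins
# for the real data and the SVM.

def kmeans(points, k=2, iters=20, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: abs(p - centroids[i]))].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

def classify(p, centroids, labels):
    return labels[min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))]

# Hypothetical 1-D feature separating the two TB categories
data = [1.0, 1.2, 0.9, 5.0, 5.3, 4.8]
cents = kmeans(data)
labels = {i: ("PTB" if c < 3 else "RPTB") for i, c in enumerate(cents)}
```

The paper would replace the nearest-centroid step with an SVM evaluated under K-fold cross-validation; the cascade structure is the same.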

  16. Stygoregions – a promising approach to a bioregional classification of groundwater systems

    Science.gov (United States)

    Stein, Heide; Griebler, Christian; Berkhoff, Sven; Matzke, Dirk; Fuchs, Andreas; Hahn, Hans Jürgen

    2012-01-01

Linked to diverse biological processes, groundwater ecosystems deliver essential services to mankind, the most important of which is the provision of drinking water. In contrast to surface waters, ecological aspects of groundwater systems are ignored by current European Union and national legislation. Groundwater management and protection measures refer exclusively to its good physicochemical and quantitative status. Current initiatives to develop ecologically sound integrative assessment schemes that take groundwater fauna into account depend on an initial classification of subsurface bioregions. In a large-scale survey, the regional and biogeographical distribution patterns of groundwater-dwelling invertebrates were examined for many parts of Germany. Following an exploratory approach, our results underline that the distribution patterns of invertebrates in groundwater are not in accordance with any existing bioregional classification system established for surface habitats. In consequence, we propose to develop a new classification scheme for groundwater ecosystems based on stygoregions. PMID:22993698

  17. Using blocking approach to preserve privacy in classification rules by inserting dummy Transaction

    Directory of Open Access Journals (Sweden)

    Doryaneh Hossien Afshari

    2017-03-01

Full Text Available The increasing rate of data sharing among organizations increases the risk of leaking sensitive knowledge. Attempts to solve this problem have raised the importance of privacy preservation within the data sharing process. This study focuses on privacy preservation in classification rule mining, a data mining technique. We propose a blocking algorithm for hiding sensitive classification rules. In this solution, rules are hidden by editing the set of transactions that satisfy the sensitive classification rules. The proposed approach tries to deceive and block adversaries by inserting dummy transactions. Finally, the solution has been evaluated and compared with other available solutions. Results show that limiting the number of attributes in each sensitive rule decreases both the number of lost rules and the production rate of ghost rules.
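The blocking idea (drive a sensitive rule's confidence below the mining threshold by inserting dummy transactions) can be sketched as follows. The transaction format, rule representation, and threshold are illustrative, not the paper's exact algorithm.

```python
# Sketch of rule blocking: transactions supporting a sensitive rule
# "antecedent -> cls" are diluted with dummy transactions carrying a
# different class until the rule's confidence falls below the mining
# threshold.  Transaction format and threshold are illustrative.

def confidence(transactions, antecedent, cls):
    match = [t for t in transactions if antecedent.items() <= t.items()]
    if not match:
        return 0.0
    return sum(t["class"] == cls for t in match) / len(match)

def block_rule(transactions, antecedent, cls, other_cls, threshold):
    txns = list(transactions)
    while confidence(txns, antecedent, cls) >= threshold:
        dummy = dict(antecedent)
        dummy["class"] = other_cls   # dummy transaction contradicting the rule
        txns.append(dummy)
    return txns

# Sensitive rule {age: young} -> buy, initial confidence 3/4 = 0.75
txns = [{"age": "young", "class": "buy"} for _ in range(3)]
txns.append({"age": "young", "class": "skip"})
blocked = block_rule(txns, {"age": "young"}, "buy", "skip", threshold=0.6)
```

After blocking, a rule miner using the same confidence threshold no longer discovers the sensitive rule, at the cost of extra (dummy) records in the shared data.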

  18. A simple semi-automatic approach for land cover classification from multispectral remote sensing imagery.

    Directory of Open Access Journals (Sweden)

    Dong Jiang

Full Text Available Land cover data represent a fundamental data source for various types of scientific research. The classification of land cover based on satellite data is a challenging task, and an efficient classification method is needed. In this study, an automatic scheme is proposed for the classification of land use using multispectral remote sensing images based on change detection and a semi-supervised classifier. The satellite image can be automatically classified using only the prior land cover map and existing images; therefore, human involvement is reduced to a minimum, ensuring the operability of the method. The method was tested in the Qingpu District of Shanghai, China. Using Environment Satellite 1 (HJ-1) images of 2009 with 30 m spatial resolution, the areas were classified into five main types of land cover based on previous land cover data and spectral features. The results agreed well with validation land cover maps, with a Kappa value of 0.79 and statistical area biases in proportion of less than 6%. This study proposed a simple semi-automatic approach for land cover classification using prior maps with satisfactory accuracy, which integrates the accuracy of visual interpretation and the performance of automatic classification methods. The method can be conveniently used for land cover mapping in areas lacking ground reference information or for identifying regions of rapid land cover variation (such as rapid urbanization).

  19. Brake fault diagnosis using Clonal Selection Classification Algorithm (CSCA – A statistical learning approach

    Directory of Open Access Journals (Sweden)

    R. Jegadeeshwaran

    2015-03-01

Full Text Available In automobiles, the brake system is an essential part responsible for control of the vehicle. Any failure in the brake system affects the vehicle's motion and can generate catastrophic effects on vehicle and passenger safety. Thus the brake system plays a vital role in an automobile, and hence condition monitoring of the brake system is essential. Vibration-based condition monitoring using machine learning techniques is gaining momentum. This study is one such attempt to perform condition monitoring of a hydraulic brake system through vibration analysis. In this research, the performance of a Clonal Selection Classification Algorithm (CSCA) for brake fault diagnosis is reported. A hydraulic brake system test rig was fabricated. Under good and faulty conditions of the brake system, vibration signals were acquired using a piezoelectric transducer. Statistical parameters were extracted from the vibration signal. The best feature set was identified for classification using an attribute evaluator. The selected features were then classified using the CSCA. The classification accuracy of this artificial intelligence technique has been compared with other machine learning approaches and discussed. The Clonal Selection Classification Algorithm performs better and gives the maximum classification accuracy (96%) for the fault diagnosis of a hydraulic brake system.

  20. A Simple Semi-Automatic Approach for Land Cover Classification from Multispectral Remote Sensing Imagery

    Science.gov (United States)

    Jiang, Dong; Huang, Yaohuan; Zhuang, Dafang; Zhu, Yunqiang; Xu, Xinliang; Ren, Hongyan

    2012-01-01

Land cover data represent a fundamental data source for various types of scientific research. The classification of land cover based on satellite data is a challenging task, and an efficient classification method is needed. In this study, an automatic scheme is proposed for the classification of land use using multispectral remote sensing images based on change detection and a semi-supervised classifier. The satellite image can be automatically classified using only the prior land cover map and existing images; therefore, human involvement is reduced to a minimum, ensuring the operability of the method. The method was tested in the Qingpu District of Shanghai, China. Using Environment Satellite 1 (HJ-1) images of 2009 with 30 m spatial resolution, the areas were classified into five main types of land cover based on previous land cover data and spectral features. The results agreed well with validation land cover maps, with a Kappa value of 0.79 and statistical area biases in proportion of less than 6%. This study proposed a simple semi-automatic approach for land cover classification using prior maps with satisfactory accuracy, which integrates the accuracy of visual interpretation and the performance of automatic classification methods. The method can be conveniently used for land cover mapping in areas lacking ground reference information or for identifying regions of rapid land cover variation (such as rapid urbanization). PMID:23049886
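The semi-supervised scheme described above can be sketched in simplified form: pixels whose spectral values changed little between dates keep their prior-map labels and train a classifier for the changed pixels. The single spectral band, threshold, and nearest-class-mean rule below are simplifying assumptions, not the paper's exact classifier.

```python
# Simplified sketch of the semi-supervised scheme: pixels whose values
# changed little between the prior-map date and the new image keep their
# prior labels and serve as training samples; changed pixels are
# re-classified by a nearest-class-mean rule.  The single band,
# threshold and values are hypothetical.

def classify_with_prior(prior_labels, old_img, new_img, change_thresh=0.5):
    unchanged = {i for i in range(len(new_img))
                 if abs(new_img[i] - old_img[i]) <= change_thresh}
    # class means estimated from the trusted (unchanged) pixels
    samples = {}
    for i in unchanged:
        samples.setdefault(prior_labels[i], []).append(new_img[i])
    means = {c: sum(v) / len(v) for c, v in samples.items()}
    out = list(prior_labels)
    for i in range(len(new_img)):
        if i not in unchanged:
            out[i] = min(means, key=lambda c: abs(new_img[i] - means[c]))
    return out

prior = ["water", "water", "urban", "urban"]
old   = [0.10, 0.20, 0.80, 0.90]
new   = [0.10, 0.20, 0.80, 0.15]   # last pixel's spectrum now looks like water
result = classify_with_prior(prior, old, new)
```

This captures why human involvement is minimal: the prior map supplies the training labels, and only the change-detection step decides which pixels need re-classification.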

  1. A two-layer depth-averaged approach to describe the regime stratification in collapses of dry granular columns

    Science.gov (United States)

    Sarno, L.; Carravetta, A.; Martino, R.; Tai, Y. C.

    2014-10-01

The dynamics of dry granular flows is still insufficiently understood. Several depth-averaged approaches, where the flow motion is described through hydrodynamic-like models with suitable resistance laws, have been proposed in recent decades to describe the propagation of avalanches and debris flows. Yet, some important features of granular flow dynamics are still not well captured. For example, it is very challenging to capture the progressive deposition process observed in collapses and dam-break flows over rough beds, where an upper surface flow is found to coexist with a lower creeping flow. Experimental observations of such flows suggest the existence of a flow regime stratification caused by different momentum transfer mechanisms. In this work, we propose a two-layer depth-averaged model aiming to describe such a stratification regime inside the flowing granular mass. The model equations are derived for both two-dimensional plane and axi-symmetric flows. Mass and momentum balances of each layer are considered separately, so that different constitutive laws are introduced. The proposed model is equipped with a closure equation accounting for the mass flux at the interface between the layers. Numerical results are compared with experimental data of axi-symmetric granular collapses to validate the proposed approach. The model delivers sound agreement with experimental data when the initial aspect ratios are small. In the case of large initial aspect ratios, it yields a significant improvement in predicting the final shape of the deposit and also the run-out distances. Further comparisons with different numerical models show that the two-layer approach is capable of correctly describing the main features of the final deposit also in the case of two-dimensional granular collapses.

  2. A study of earthquake-induced building detection by object oriented classification approach

    Science.gov (United States)

    Sabuncu, Asli; Damla Uca Avci, Zehra; Sunar, Filiz

    2017-04-01

Among natural hazards, earthquakes are the most destructive disasters and cause huge loss of life, heavy infrastructure damage and great financial losses every year all around the world. According to earthquake statistics, more than a million earthquakes occur each year, equivalent to two earthquakes per minute worldwide. Since 2001, natural disasters have caused more than 780,000 deaths, approximately 60% of which were due to earthquakes. A great earthquake took place at 38.75 N 43.36 E in Van Province, in the eastern part of Turkey, on October 23rd, 2011. 604 people died and about 4000 buildings were seriously damaged or collapsed in this earthquake. In recent years, the use of the object oriented classification approach based on different object features, such as spectral, textural, shape and spatial information, has gained importance and become widespread for the classification of high-resolution satellite images and orthophotos. The motivation of this study is to detect the collapsed buildings and debris areas after the earthquake by using very high-resolution satellite images and orthophotos with object oriented classification, and also to see how well remote sensing technology performed in determining the collapsed buildings. In this study, two different land surfaces were selected as homogeneous and heterogeneous case study areas. In the first step of the application, multi-resolution segmentation was applied and optimum parameters were selected to obtain the objects in each area after testing different color/shape and compactness/smoothness values. In the next step, two different classification approaches, namely "supervised" and "unsupervised", were applied and their classification performances were compared. Object-based Image Analysis (OBIA) was performed using eCognition software.

  3. Time-dependent approach for single trial classification of covert visuospatial attention

    Science.gov (United States)

    Tonin, L.; Leeb, R.; Millán, J. del R.

    2012-08-01

Recently, several studies have started to explore covert visuospatial attention as a control signal for brain-computer interfaces (BCIs). Covert visuospatial attention is the ability to shift the focus of attention from one point in space to another without overt eye movements. Nevertheless, the full potential and possible applications of this paradigm remain relatively unexplored. Voluntary covert visuospatial attention might allow a more natural and intuitive interaction with real environments, as neither stimulation nor gazing is required. In order to identify brain correlates of covert visuospatial attention, classical approaches usually rely on the whole α-band over long time intervals. In this work, we propose a more detailed analysis in the frequency and time domains to enhance classification performance. In particular, we investigate the contribution of α sub-bands and the role of time intervals in carrying information about visual attention. Previous neurophysiological studies have already highlighted the role of temporal dynamics in attention mechanisms. However, these important aspects are not yet exploited in BCIs. In this work, we studied different methods that explicitly cope with the natural brain dynamics during visuospatial attention tasks in order to enhance BCI robustness and classification performance. Results with ten healthy subjects demonstrate that our approach identifies spectro-temporal patterns that outperform the state-of-the-art classification method. On average, our time-dependent classification reaches an area under the ROC (receiver operating characteristic) curve (AUC) value of 0.74 ± 0.03, an increase of 12.3% with respect to standard methods (0.65 ± 0.4). In addition, the proposed approach allows faster classification (<1 s instead of 3 s) without compromising performance. Finally, our analysis highlights the fact that discriminant patterns are not stable for the whole trial period but are changing over short time
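The spectro-temporal features discussed above amount to band power computed in narrow α sub-bands over short time windows. A minimal sketch using a direct DFT on a synthetic signal follows; the sampling rate, band edges, and the 10 Hz test tone are illustrative, not the study's data.

```python
import cmath, math

# Sketch: band power in alpha sub-bands over a windowed signal, computed
# with a direct DFT.  Sampling rate, band edges and the synthetic 10 Hz
# test tone are illustrative, not the study's data or method.

def band_power(signal, fs, f_lo, f_hi):
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2
    return power / n

fs = 64                                                          # Hz, 1 s window
sig = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]   # 10 Hz tone
low_alpha  = band_power(sig, fs, 8, 10)    # lower alpha sub-band
high_alpha = band_power(sig, fs, 11, 13)   # upper alpha sub-band
```

Computing such band powers on a grid of sub-bands and sliding time windows yields the feature matrix on which a time-dependent classifier can be trained.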

  4. Improving Wishart Classification of Polarimetric SAR Data Using the Hopfield Neural Network Optimization Approach

    Directory of Open Access Journals (Sweden)

    Íñigo Molina

    2012-11-01

Full Text Available This paper proposes an optimization relaxation approach based on the analogue Hopfield Neural Network (HNN) for cluster refinement of pre-classified Polarimetric Synthetic Aperture Radar (PolSAR) image data. We consider the initial classification provided by the maximum-likelihood classifier based on the complex Wishart distribution, which is then supplied to the HNN optimization approach. The goal is to improve the classification results obtained by the Wishart approach. The classification improvement is verified by computing a cluster separability coefficient and a measure of homogeneity within the clusters. During the HNN optimization process, for each iteration and for each pixel, two consistency coefficients are computed, taking into account two types of relations between the pixel under consideration and its corresponding neighbors. Based on these coefficients and on the information coming from the pixel itself, the pixel under study is re-classified. Different experiments are carried out to verify that the proposed approach outperforms other strategies, achieving the best results in terms of separability and a trade-off with homogeneity, preserving relevant structures in the image. The performance is also measured in terms of computational central processing unit (CPU) times.

  5. A Similarity-Based Approach for Audiovisual Document Classification Using Temporal Relation Analysis

    Directory of Open Access Journals (Sweden)

    Ferrane Isabelle

    2011-01-01

Full Text Available We propose a novel approach for video classification based on the analysis of the temporal relationships between the basic events in audiovisual documents. Starting from basic segmentation results, we define a new representation method called the Temporal Relation Matrix (TRM). Each document is then described by a set of TRMs, the analysis of which makes higher-level events stand out. This representation was first designed to analyze any audiovisual document in order to find events that may well characterize its content and its structure. The aim of this work is to use this representation to compute a similarity measure between two documents. Approaches for audiovisual document classification are presented and discussed. Experiments are performed on a set of 242 video documents and the results show the efficiency of our proposals.

  6. Combining different classification approaches to improve off-line Arabic handwritten word recognition

    Science.gov (United States)

    Zavorin, Ilya; Borovikov, Eugene; Davis, Ericson; Borovikov, Anna; Summers, Kristen

    2008-01-01

Machine perception and recognition of handwritten text in any language is a difficult problem. Even for Latin script, most solutions are restricted to specific domains such as bank check courtesy amount recognition. Arabic script presents additional challenges for handwriting recognition systems due to its highly connected nature, the numerous forms of each letter, and other factors. In this paper we address the problem of offline Arabic handwriting recognition of pre-segmented words. Rather than focusing on a single classification approach and trying to perfect it, we propose to combine heterogeneous classification methodologies. We evaluate our system on the IFN/ENIT corpus of Tunisian village and town names and demonstrate that the combined approach yields results that are better than those of the individual classifiers.

  7. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    Science.gov (United States)

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

The increasing capability of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is toward the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well tested supervised parametric Bayesian estimator and Fuzzy Clustering. The DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used in the combination as well as some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989

  8. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing.

    Science.gov (United States)

    Guijarro, María; Pajares, Gonzalo; Herrera, P Javier

    2009-01-01

The increasing capability of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is toward the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well tested supervised parametric Bayesian estimator and Fuzzy Clustering. The DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used in the combination as well as some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm.
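The DSA combination described in the two records above iteratively relaxes soft class memberships while a temperature parameter is lowered. The sketch below is a generic mean-field annealing of two classifiers' probability maps with a neighborhood smoothness term; the energy terms, schedule, and example data are assumptions, not the authors' exact formulation.

```python
import math

# Generic mean-field annealing sketch: soft class memberships q are
# iteratively sharpened as the temperature decreases, trading off
# agreement with two simple classifiers against neighborhood smoothness.
# Energy terms and schedule are assumptions, not the authors' exact DSA.

def dsa_labels(p_bayes, p_fuzzy, neighbors, t0=2.0, t_min=0.05, alpha=0.7):
    n, k = len(p_bayes), len(p_bayes[0])
    # start from the average of the two simple classifiers
    q = [[(a + b) / 2 for a, b in zip(p_bayes[i], p_fuzzy[i])] for i in range(n)]
    t = t0
    while t > t_min:
        new_q = []
        for i in range(n):
            scores = []
            for c in range(k):
                data = (p_bayes[i][c] + p_fuzzy[i][c]) / 2
                smooth = sum(q[j][c] for j in neighbors[i]) / max(1, len(neighbors[i]))
                scores.append((data + smooth) / t)
            m = max(scores)
            z = [math.exp(s - m) for s in scores]
            total = sum(z)
            new_q.append([v / total for v in z])
        q = new_q
        t *= alpha                      # annealing schedule
    return [max(range(k), key=lambda c: qi[c]) for qi in q]

# Three pixels in a row; the middle one is ambiguous, but its neighbors
# clearly belong to class 0, so the smoothness term should pull it there.
p_bayes = [[0.9, 0.1], [0.50, 0.50], [0.9, 0.1]]
p_fuzzy = [[0.8, 0.2], [0.45, 0.55], [0.8, 0.2]]
neighbors = [[1], [0, 2], [1]]
labels = dsa_labels(p_bayes, p_fuzzy, neighbors)
```

As the temperature drops, the softmax sharpens and the memberships converge toward a labeling that balances the two classifiers against spatial homogeneity, which is the refinement effect the papers measure with their separability and homogeneity coefficients.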

  9. Extraction of rules for faulty bearing classification by a Neuro-Fuzzy approach

    Science.gov (United States)

    Marichal, G. N.; Artés, Mariano; García Prada, J. C.; Casanova, O.

    2011-08-01

In this paper, a classification system for faulty bearings based on a Neuro-Fuzzy approach is presented. The vibration signals in the frequency domain produced by the faulty bearings are taken as the inputs to the classification system. In this sense, an essential characteristic of the Neuro-Fuzzy approach used here is its ability to take a large number of inputs. The system consists of several Neuro-Fuzzy systems for determining different bearing statuses, along with measurement equipment for the vibration spectral data. In this paper, special attention is focused on the analysis of the rules obtained by the final Neuro-Fuzzy system. In fact, a rule extraction process and a rule interpretation process are discussed. Several trials have been carried out, taking into account the vibration spectral data collected by the measurement equipment, and satisfactory results have been achieved.

  10. A Novel Classification Approach through Integration of Rough Sets and Back-Propagation Neural Network

    Directory of Open Access Journals (Sweden)

    Lei Si

    2014-01-01

Full Text Available Classification is an important theme in data mining. Rough sets and neural networks are among the most common techniques applied to data mining problems. In order to extract useful knowledge and classify ambiguous patterns effectively, this paper presents a hybrid algorithm based on the integration of rough sets and a BP neural network to construct a novel classification system. The attribute values were first discretized using the PSO algorithm to establish a decision table. An attribute reduction algorithm and a rule extraction method based on rough sets were proposed, and the flowchart of the proposed approach was designed. Finally, a prototype system was developed and some simulation examples were carried out. Simulation results indicated that the proposed approach was feasible and accurate and outperformed other methods.

  11. Towards biological plausibility of electronic noses: A spiking neural network based approach for tea odour classification.

    Science.gov (United States)

    Sarkar, Sankho Turjo; Bhondekar, Amol P; Macaš, Martin; Kumar, Ritesh; Kaur, Rishemjit; Sharma, Anupma; Gulati, Ashu; Kumar, Amod

    2015-11-01

    The paper presents a novel encoding scheme for neuronal code generation for odour recognition using an electronic nose (EN). This scheme is based on channel encoding using multiple Gaussian receptive fields superimposed over the temporal EN responses. The encoded data is further applied to a spiking neural network (SNN) for pattern classification. Two forms of SNN, a back-propagation based SpikeProp and a dynamic evolving SNN are used to learn the encoded responses. The effects of information encoding on the performance of SNNs have been investigated. Statistical tests have been performed to determine the contribution of the SNN and the encoding scheme to overall odour discrimination. The approach has been implemented in odour classification of orthodox black tea (Kangra-Himachal Pradesh Region) thereby demonstrating a biomimetic approach for EN data analysis.
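The channel encoding described here can be sketched with the widely used Bohte-style Gaussian receptive-field scheme; the field count, overlap parameter, and spike-time conversion below are assumptions, not the paper's exact settings.

```python
import math

def grf_encode(x, x_min, x_max, m=6, beta=1.5):
    """Population-encode scalar x with m overlapping Gaussian receptive
    fields (centre/width spacing follows the common Bohte-style scheme)."""
    d = (x_max - x_min) / (m - 2)              # field spacing
    sigma = d / beta                           # field width
    centers = [x_min + (2 * i - 3) / 2.0 * d for i in range(1, m + 1)]
    return [math.exp(-((x - c) ** 2) / (2 * sigma ** 2)) for c in centers]

def to_spike_times(activations, t_max=10.0):
    """Time-to-first-spike conversion: stronger activation fires earlier."""
    return [(1.0 - a) * t_max for a in activations]

# One hypothetical normalized EN sensor reading at one time step.
acts = grf_encode(0.5, 0.0, 1.0)
times = to_spike_times(acts)
```

A full encoder applies this per sensor and per time window, producing the spike trains that SpikeProp or the evolving SNN then learns from.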

  12. TWO-STAGE CHARACTER CLASSIFICATION: A COMBINED APPROACH OF CLUSTERING AND SUPPORT VECTOR CLASSIFIERS

    NARCIS (Netherlands)

    Vuurpijl, L.; Schomaker, L.

    2000-01-01

    This paper describes a two-stage classification method for (1) classification of isolated characters and (2) verification of the classification result. Character prototypes are generated using hierarchical clustering. For those prototypes known to sometimes produce wrong classification results, a

  13. A bag of cells approach for antinuclear antibodies HEp-2 image classification.

    Science.gov (United States)

    Wiliem, Arnold; Hobson, Peter; Minchin, Rodney F; Lovell, Brian C

    2015-06-01

    The antinuclear antibody (ANA) test via indirect immunofluorescence applied on Human Epithelial type 2 (HEp-2) cells is a pathology test commonly used to identify connective tissue diseases (CTDs). Despite its effectiveness, the test is still considered labor intensive and time consuming. Applying image-based computer-aided diagnosis (CAD) systems is one possible way to address these issues. Ideally, a CAD system should be able to classify ANA HEp-2 images taken by a camera fitted to a fluorescence microscope. Unfortunately, most prior works have primarily focused on the HEp-2 cell image classification problem, which is one of the early essential steps in the system pipeline. In this work we directly tackle the specimen image classification problem, aiming to develop a system that can be easily scaled and has competitive accuracy. ANA HEp-2 images, or ANA images, generally comprise a number of cells; patterns exhibited in the cells are then used to make inferences about the ANA image pattern. To that end, we adapted a popular approach for general image classification problems, namely the bag of visual words approach. Each specimen is considered a visual document containing visual vocabularies represented by its cells, and a specimen image is then represented by a histogram of visual vocabulary occurrences. We name this the Bag of Cells approach. We studied the performance of the proposed approach on a set of images taken from 262 ANA-positive patient sera. The results show the proposed approach has competitive performance compared with recent state-of-the-art approaches. Our proposal can also be extended to other tests that involve examining patterns of human cells to make inferences. © 2014 International Society for Advancement of Cytometry.
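The specimen-level descriptor is simply a codeword histogram over the image's cells. A minimal sketch with a hypothetical two-word vocabulary follows; a real system would learn the vocabulary from training cell descriptors, e.g. by k-means.

```python
def nearest_word(feature, vocabulary):
    """Index of the closest codeword by squared Euclidean distance."""
    return min(range(len(vocabulary)),
               key=lambda k: sum((a - b) ** 2
                                 for a, b in zip(feature, vocabulary[k])))

def bag_of_cells(cell_features, vocabulary):
    """Specimen-level descriptor: a normalized histogram of codeword
    occurrences over the cells segmented from one ANA image."""
    hist = [0.0] * len(vocabulary)
    for f in cell_features:
        hist[nearest_word(f, vocabulary)] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]

# Hypothetical 2-D cell features and a 2-word vocabulary.
vocab = [(0.0, 0.0), (1.0, 1.0)]
cells = [(0.1, 0.0), (0.9, 1.0), (1.0, 0.8)]
hist = bag_of_cells(cells, vocab)
```

The resulting histogram is what a standard classifier consumes to predict the specimen-level ANA pattern.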

  14. Developing a regional scale approach for modelling the impacts of fertiliser regime on N2O emissions in Ireland

    Science.gov (United States)

    Zimmermann, Jesko; Jones, Michael

    2016-04-01

    error (RMSE urease inhibitors. The results suggest that modelling changes in fertiliser regime on a large scale may require a multi-model approach to ensure best performance. Ultimately, the research aims to develop a GIS-based platform to apply such an approach on a regional scale.

  15. Computational Classification Approach to Profile Neuron Subtypes from Brain Activity Mapping Data

    OpenAIRE

    Meng Li; Fang Zhao; Jason Lee; Dong Wang; Hui Kuang; Joe Z Tsien

    2015-01-01

    The analysis of cell type-specific activity patterns during behaviors is important for better understanding of how neural circuits generate cognition, but has not been well explored from in vivo neurophysiological datasets. Here, we describe a computational approach to uncover distinct cell subpopulations from in vivo neural spike datasets. This method, termed “inter-spike-interval classification-analysis” (ISICA), is comprised of four major steps: spike pattern feature-extraction, pre-cluste...

  16. Improving oil classification quality from oil spill fingerprint beyond six sigma approach.

    Science.gov (United States)

    Juahir, Hafizan; Ismail, Azimah; Mohamed, Saiful Bahri; Toriman, Mohd Ekhwan; Kassim, Azlina Md; Zain, Sharifuddin Md; Ahmad, Wan Kamaruzaman Wan; Wah, Wong Kok; Zali, Munirah Abdul; Retnam, Ananthy; Taib, Mohd Zaki Mohd; Mokhtar, Mazlin

    2017-07-15

    This study involves the use of quality engineering in oil spill classification based on oil spill fingerprinting from GC-FID and GC-MS, employing the six-sigma approach. The oil spills were recovered from various water areas of Peninsular Malaysia and Sabah (East Malaysia). The study used six-sigma methodologies that effectively serve as the problem-solving tool for oil classification extracted from the complex mixtures of the oil spill dataset. Linking the six-sigma analysis with quality engineering improved organizational performance so as to achieve the objectives of environmental forensics. The study reveals that the oil spills are discriminated into four groups, viz. diesel, hydrocarbon fuel oil (HFO), mixture of oil lubricant and fuel oil (MOLFO) and waste oil (WO), according to the similarity of their intrinsic chemical properties. Validation confirmed that the four discriminant components (diesel, HFO, MOLFO and WO) dominate the oil types with a total variance of 99.51%, with ANOVA giving Fstat > Fcritical at the 95% confidence level and a chi-square goodness-of-fit test value of 74.87. The results of this study reveal that by employing the six-sigma approach in a data-driven problem such as oil spill classification, good decision making can be expedited. Copyright © 2017. Published by Elsevier Ltd.

  17. Numerical approach to the low-doping regime of the t-J model

    Science.gov (United States)

    Bonča, J.; Maekawa, S.; Tohyama, T.

    2007-07-01

    We develop an efficient numerical method for the description of a single-hole motion in the antiferromagnetic background. The method is free of finite-size effects and allows the calculation of physical properties at an arbitrary wave vector. A methodical increase of the functional space leads to results that are valid in the thermodynamic limit. We found good agreement with cumulant expansion, exact-diagonalization approaches on finite lattices as well as self-consistent Born approximations. The method allows a straightforward addition of other inelastic degrees of freedom, such as lattice effects. Our results confirm the existence of a finite quasiparticle weight near the band minimum for a single hole and the existence of stringlike peaks in the single-hole spectral function.

  18. Automated classification of tropical shrub species: a hybrid of leaf shape and machine learning approach.

    Science.gov (United States)

    Murat, Miraemiliana; Chang, Siow-Wee; Abu, Arpah; Yap, Hwa Jen; Yong, Kien-Thai

    2017-01-01

    Plants play a crucial role in foodstuffs, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, identifying plant species is a difficult task because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable, since it can help specialists as well as the public in identifying plant species easily. Shape descriptors were applied to the myDAUN dataset, which contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on the literature review, this is the first study on the development of a tropical shrub species image dataset and on classification using a hybrid of leaf shape and a machine learning approach. Four types of shape descriptors were used in this study, namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptors, as well as combinations of hybrid descriptors, were tested and compared. The tropical shrub species were classified using six different classifiers: artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three types of feature selection methods were tested on the myDAUN dataset: Relief, correlation-based feature selection (CFS) and Pearson's correlation coefficient (PCC). The well-known Flavia and Swedish Leaf datasets were used as validation datasets for the proposed methods. The results showed that the hybrid of all descriptors with ANN outperformed the other classifiers, with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia dataset and 99
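Of the descriptors listed, Hu invariant moments are the simplest to reproduce from scratch. The sketch below computes the first two Hu moments of a small binary "leaf" mask (the image is invented) and exploits the fact that both values are exactly preserved under a 90-degree rotation.

```python
def hu_first_two(img):
    """First two Hu invariant moments of an image given as a list of rows;
    both are invariant to translation, scale, and rotation."""
    m00 = sum(v for row in img for v in row)
    xc = sum(x * v for row in img for x, v in enumerate(row)) / m00
    yc = sum(y * v for y, row in enumerate(img) for v in row) / m00
    def mu(p, q):   # central moment
        return sum((x - xc) ** p * (y - yc) ** q * v
                   for y, row in enumerate(img) for x, v in enumerate(row))
    def eta(p, q):  # scale-normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2)
    h1 = eta(2, 0) + eta(0, 2)
    h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return h1, h2

# Invented binary leaf mask and its 90-degree rotation.
leaf = [[0, 0, 1, 0],
        [0, 1, 1, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
rotated = [list(row) for row in zip(*leaf[::-1])]
```

Rotation invariance is exactly what makes these moments useful as leaf-shape features: the descriptor does not change when the leaf is photographed at a different orientation.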

  19. A Neuro-Fuzzy Approach in the Classification of Students’ Academic Performance

    Directory of Open Access Journals (Sweden)

    Quang Hung Do

    2013-01-01

    Full Text Available Classifying the student academic performance with high accuracy facilitates admission decisions and enhances educational services at educational institutions. The purpose of this paper is to present a neuro-fuzzy approach for classifying students into different groups. The neuro-fuzzy classifier used previous exam results and other related factors as input variables and labeled students based on their expected academic performance. The results showed that the proposed approach achieved a high accuracy. The results were also compared with those obtained from other well-known classification approaches, including support vector machine, Naive Bayes, neural network, and decision tree approaches. The comparative analysis indicated that the neuro-fuzzy approach performed better than the others. It is expected that this work may be used to support student admission procedures and to strengthen the services of educational institutions.

  20. A neuro-fuzzy approach in the classification of students' academic performance.

    Science.gov (United States)

    Do, Quang Hung; Chen, Jeng-Fung

    2013-01-01

    Classifying the student academic performance with high accuracy facilitates admission decisions and enhances educational services at educational institutions. The purpose of this paper is to present a neuro-fuzzy approach for classifying students into different groups. The neuro-fuzzy classifier used previous exam results and other related factors as input variables and labeled students based on their expected academic performance. The results showed that the proposed approach achieved a high accuracy. The results were also compared with those obtained from other well-known classification approaches, including support vector machine, Naive Bayes, neural network, and decision tree approaches. The comparative analysis indicated that the neuro-fuzzy approach performed better than the others. It is expected that this work may be used to support student admission procedures and to strengthen the services of educational institutions.

  1. Feature Selection and Classification of Electroencephalographic Signals: An Artificial Neural Network and Genetic Algorithm Based Approach.

    Science.gov (United States)

    Erguzel, Turker Tekin; Ozekes, Serhat; Tan, Oguz; Gultekin, Selahattin

    2015-10-01

    Feature selection is an important step in many pattern recognition systems aiming to overcome the so-called curse of dimensionality. In this study, an optimized classification method was tested in 147 patients with major depressive disorder (MDD) treated with repetitive transcranial magnetic stimulation (rTMS). The performance of the combination of a genetic algorithm (GA) and a back-propagation (BP) neural network (BPNN) was evaluated using 6-channel pre-rTMS electroencephalographic (EEG) patterns of theta and delta frequency bands. The GA was first used to eliminate the redundant and less discriminant features to maximize classification performance. The BPNN was then applied to test the performance of the feature subset. Finally, classification performance using the subset was evaluated using 6-fold cross-validation. Although the slow bands of the frontal electrodes are widely used to collect EEG data for patients with MDD and provide quite satisfactory classification results, the outcomes of the proposed approach indicate noticeably increased overall accuracy of 89.12% and an area under the receiver operating characteristic (ROC) curve (AUC) of 0.904 using the reduced feature set.
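The GA wrapper can be sketched with bit-mask chromosomes, truncation selection, one-point crossover, and bit-flip mutation. The fitness function below is a toy stand-in: the paper scores each mask by training a BP network on the selected EEG features, which is omitted here.

```python
import random

# Toy stand-in fitness: features 0, 3 and 5 are informative, every other
# selected feature costs a small penalty.
RELEVANT = {0, 3, 5}

def toy_fitness(mask):
    gain = sum(1 for i in RELEVANT if mask[i])
    cost = sum(1 for i in range(len(mask)) if i not in RELEVANT and mask[i])
    return gain - 0.1 * cost

def ga_select(fitness, n_features, pop_size=20, generations=30,
              p_mut=0.05, seed=7):
    """Minimal elitist GA over bit-mask feature subsets."""
    rng = random.Random(seed)
    pop = [[rng.random() < 0.5 for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)   # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([not g if rng.random() < p_mut else g
                             for g in child])    # bit-flip mutation
        pop = parents + children
    return max(pop, key=fitness)

best = ga_select(toy_fitness, n_features=8)
```

Swapping `toy_fitness` for "train a BPNN on the masked features and return its cross-validated accuracy" recovers the wrapper scheme the abstract describes.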

  2. Computational Classification Approach to Profile Neuron Subtypes from Brain Activity Mapping Data.

    Science.gov (United States)

    Li, Meng; Zhao, Fang; Lee, Jason; Wang, Dong; Kuang, Hui; Tsien, Joe Z

    2015-07-27

    The analysis of cell type-specific activity patterns during behaviors is important for better understanding of how neural circuits generate cognition, but has not been well explored from in vivo neurophysiological datasets. Here, we describe a computational approach to uncover distinct cell subpopulations from in vivo neural spike datasets. This method, termed "inter-spike-interval classification-analysis" (ISICA), comprises four major steps: spike-pattern feature extraction, pre-clustering analysis, clustering classification, and unbiased classification-dimensionality selection. By using two key features of spike dynamics, namely gamma-distribution shape factors and the coefficient of variation of inter-spike intervals, we show that this ISICA method provides invariant classification for dopaminergic neurons or CA1 pyramidal cell subtypes regardless of the brain states from which the spike data were collected. Moreover, we show that these ISICA-classified neuron subtypes underlie distinct physiological functions. We demonstrate that the uncovered dopaminergic neuron subtypes encoded distinct aspects of fearful experiences such as valence or value, whereas distinct hippocampal CA1 pyramidal cells responded differentially to ketamine-induced anesthesia. This ISICA method should be useful for better data mining of large-scale in vivo neural datasets, leading to novel insights into circuit dynamics associated with cognition.
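The two spike-dynamics features ISICA clusters on are easy to compute from a spike train. The method-of-moments gamma shape estimate below (k = mean²/variance) is one standard choice, not necessarily the estimator used in the paper; note that for gamma-distributed intervals it is exactly 1/CV².

```python
import statistics

def isi_features(spike_times):
    """The two ISI features ISICA clusters on: the coefficient of
    variation (CV) of inter-spike intervals and a gamma-distribution
    shape factor, estimated by the method of moments (k = mean^2/var)."""
    isis = [t1 - t0 for t0, t1 in zip(spike_times, spike_times[1:])]
    mu = statistics.mean(isis)
    var = statistics.pvariance(isis)
    cv = var ** 0.5 / mu
    shape = mu ** 2 / var if var > 0 else float("inf")
    return cv, shape

# Irregular toy spike train (seconds); a perfectly regular train would
# give CV = 0 and an unbounded shape factor.
cv, shape = isi_features([0.0, 1.0, 1.5, 3.0, 3.2])
```

ISICA then clusters neurons in this two-dimensional (CV, shape) feature space.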

  3. Stochastic thermodynamics in the strong coupling regime: An unambiguous approach based on coarse graining

    Science.gov (United States)

    Strasberg, Philipp; Esposito, Massimiliano

    2017-06-01

    We consider a classical and possibly driven composite system X ⊗Y weakly coupled to a Markovian thermal reservoir R so that an unambiguous stochastic thermodynamics ensues for X ⊗Y . This setup can be equivalently seen as a system X strongly coupled to a non-Markovian reservoir Y ⊗R . We demonstrate that only in the limit where the dynamics of Y is much faster than X , our unambiguous expressions for thermodynamic quantities, such as heat, entropy, or internal energy, are equivalent to the strong coupling expressions recently obtained in the literature using the Hamiltonian of mean force. By doing so, we also significantly extend these results by formulating them at the level of instantaneous rates and by allowing for time-dependent couplings between X and its environment. Away from the limit where Y evolves much faster than X , previous approaches fail to reproduce the correct results from the original unambiguous formulation, as we illustrate numerically for an underdamped Brownian particle coupled strongly to a non-Markovian reservoir.

  4. A Novel Approach for Cardiac Disease Prediction and Classification Using Intelligent Agents

    CERN Document Server

    Kuttikrishnan, Murugesan

    2010-01-01

    The goal is to develop a novel approach to cardiac disease prediction and diagnosis using intelligent agents. Initially, the symptoms are preprocessed using filter- and wrapper-based agents: the filter removes missing or irrelevant symptoms, while the wrapper extracts the data in the data set according to the threshold limits. The dependency of each symptom is identified using a dependency-checker agent. The classification is based on the prior and posterior probabilities of the symptoms together with the evidence value. Finally, the symptoms are classified into five classes, namely absence, starting, mild, moderate and serious. Using the cooperative approach, the cardiac problem is solved and verified.
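The prior/posterior step can be sketched as a naive-Bayes-style posterior over the five severity classes; the symptom names, priors, and likelihood tables below are invented for illustration, not taken from the paper.

```python
CLASSES = ["absence", "starting", "mild", "moderate", "serious"]

def classify(symptoms, priors, likelihoods):
    """Posterior over severity classes, assuming conditionally
    independent symptoms: P(c | S) is proportional to
    P(c) * product over s of P(s | c)."""
    scores = {}
    for c in CLASSES:
        p = priors[c]
        for s in symptoms:
            p *= likelihoods[c].get(s, 1e-6)  # smooth unseen symptoms
        scores[c] = p
    z = sum(scores.values())
    return {c: p / z for c, p in scores.items()}

# Invented priors and per-class symptom likelihoods.
priors = {c: 0.2 for c in CLASSES}
likelihoods = {
    "absence":  {"chest_pain": 0.01, "fatigue": 0.10},
    "starting": {"chest_pain": 0.10, "fatigue": 0.30},
    "mild":     {"chest_pain": 0.30, "fatigue": 0.50},
    "moderate": {"chest_pain": 0.60, "fatigue": 0.60},
    "serious":  {"chest_pain": 0.90, "fatigue": 0.70},
}
posterior = classify(["chest_pain"], priors, likelihoods)
```

The class with the highest posterior is reported, which mirrors the abstract's classification from prior and posterior probabilities with an evidence value.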

  5. Comparison of Standard and Novel Signal Analysis Approaches to Obstructive Sleep Apnoea Classification

    Directory of Open Access Journals (Sweden)

    Aoife eRoebuck

    2015-08-01

    Full Text Available Obstructive sleep apnoea (OSA) is a disorder characterised by repeated pauses in breathing during sleep, which lead to deoxygenation and voiced chokes at the end of each episode. OSA is associated with daytime sleepiness and an increased risk of serious conditions such as cardiovascular disease, diabetes and stroke. Between 2% and 7% of the adult population globally has OSA, but it is estimated that up to 90% of those are undiagnosed and untreated. Diagnosis of OSA requires expensive and cumbersome screening. Audio offers a potential non-contact alternative, particularly with the ubiquity of excellent signal processing on every phone. Previous studies have focused on the classification of snoring and apnoeic chokes. However, such approaches require accurate identification of events, which leads to limited accuracy and small study populations. In this work we propose an alternative approach which uses multiscale entropy (MSE) coefficients presented to a classifier to identify disorder in vocal patterns indicative of sleep apnoea. A database of 858 patients was used, the largest reported in this domain. Apnoeic choke, snore, and noise events encoded with speech analysis features were input into a linear classifier. Coefficients of MSE derived from the first 4 hours of each recording were used to train and test a random forest to classify patients as apnoeic or not. Standard speech analysis approaches for event classification achieved an out-of-sample accuracy (Ac) of 76.9% with a sensitivity (Se) of 29.2% and a specificity (Sp) of 88.7%, but high variance. For OSA severity classification, MSE provided an out-of-sample Ac of 79.9%, Se of 66.0% and Sp of 88.8%. Including demographic information improved the MSE-based classification performance to an Ac of 80.5%, Se of 69.2% and Sp of 87.9%. These results indicate that audio recordings could be used in screening for OSA, but are generally under-sensitive.
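Multiscale entropy itself is compact to state: coarse-grain the signal at each scale, then take the sample entropy of each coarse-grained series. A plain-Python sketch follows; the template length m, tolerance r, and test signals are illustrative choices, not the paper's settings.

```python
import math
import random

def _matches(sig, m, r):
    """Count template pairs of length m within tolerance r (Chebyshev)."""
    n = len(sig)
    count = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            if max(abs(sig[i + k] - sig[j + k]) for k in range(m)) <= r:
                count += 1
    return count

def sample_entropy(sig, m=2, r=0.2):
    """SampEn = -ln(A/B): A and B are (m+1)- and m-length match counts."""
    a, b = _matches(sig, m + 1, r), _matches(sig, m, r)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(sig, scale):
    """Non-overlapping window averages of length `scale`."""
    return [sum(sig[i:i + scale]) / scale
            for i in range(0, len(sig) - scale + 1, scale)]

def multiscale_entropy(sig, scales=(1, 2, 3), m=2, r=0.2):
    """The per-scale SampEn values are the MSE coefficients."""
    return [sample_entropy(coarse_grain(sig, s), m, r) for s in scales]

rng = random.Random(0)
regular = [math.sin(i / 2.0) for i in range(200)]
noisy = [rng.uniform(-1.0, 1.0) for _ in range(200)]
mse_regular = multiscale_entropy(regular)
mse_noisy = multiscale_entropy(noisy)
```

A quasi-periodic signal yields a lower scale-1 sample entropy than uniform noise, which is the kind of disorder contrast in vocal patterns that the random forest exploits.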

  6. Comparison of Advanced Pixel-Based (ANN and SVM) and Object-Oriented Classification Approaches Using Landsat-7 ETM+ Data

    Directory of Open Access Journals (Sweden)

    Prasun Kumar Gupta

    2010-08-01

    Full Text Available In this study, pixel-based and object-oriented image classification approaches were used to identify different land use types in Karnal district. Imagery from Landsat-7 ETM+ with six spectral bands was used to perform the image classification. Ground truth data were collected from the available maps, personal knowledge and communication with the local people. In order to prepare the land use map, two approaches were used: Artificial Neural Network (ANN) and Support Vector Machine (SVM). For the object-oriented classification, eCognition software was used: in the first step, several different sets of parameters were used for image segmentation, and in the second step a nearest-neighbour classifier was used for classification. The outcome of the classification work shows that the object-oriented approach gave more accurate results (including higher producer's and user's accuracy) for most of the land cover classes than those achieved by the pixel-based classification algorithms. It is also observed that ANN performed better than the SVM classification approach.

  7. A 3D approach to equilibrium, stability and transport studies in RFX-mod improved regimes

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, D; Bonfiglio, D; Gobbin, M; Lorenzini, R; Marrelli, L; Martines, E; Momo, B; Predebon, I; Spizzo, G; Agostini, M; Alfier, A; Apolloni, L; Auriemma, F; Baruzzo, M; Bolzonella, T [Consorzio RFX, Associazione EURATOM-ENEA sulla Fusione, Padova (Italy); Boozer, A H [Department of Applied Physics and Applied Mathematics, Columbia University, New York, NY (United States); Cooper, A W [EPFL, Association EURATOM-Confederation Suisse, Centre de Recherches en Physique des Plasmas, Lausanne (Switzerland); Hirshman, S P; Sanchez, R [ORNL Fusion Energy Division, Oak Ridge, TN (United States); Pomphrey, N, E-mail: david.terranova@igi.cnr.i [Princeton Plasma Physics Laboratory, Princeton, NJ (United States)

    2010-12-15

    The full three-dimensional (3D) approach is now becoming an important issue for all magnetic confinement configurations. It is a necessary condition for the stellarator, but the tokamak and the reversed field pinch (RFP) also cannot be completely described in an axisymmetric framework. For the RFP, the observation of self-sustained helical configurations with improved plasma performance requires a better description in order to assess a new view of this configuration. In this new framework, plasma configuration studies for RFX-mod have been carried out both with tools developed for the RFP and with codes originally developed for the stellarator and adapted to the RFP. These helical states are reached through a transition to a very low/reversed shear configuration leading to internal electron transport barriers. These states are interrupted by MHD reconnection events, and the large T{sub e} gradients at the barriers indicate that both current- and pressure-driven modes are to be considered. Furthermore, the typically flat T{sub e} profiles in the helical core have raised the issue of the role of electrostatic and electromagnetic turbulence in these reduced-chaos regions, so that a stability analysis in the correct 3D geometry is required to address an optimization of the plasma setup. In this view the VMEC code proved to be an effective way to obtain helical equilibria to be studied in terms of stability and transport with a suite of well-tested codes. In this work, the equilibrium reconstruction technique as well as the experimental evidence of 3D effects and their first interpretation in terms of stability and transport are presented using both RFP and stellarator tools.

  8. A 3D approach to equilibrium, stability and transport studies in RFX-mod improved regimes

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, D. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Bonfiglio, D. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Boozer, A. H. [Columbia University; Cooper, W Anthony [CRPP/EPFL, Association Euratom-Suisse, Lausanne, Switzerland; Gobbin, M. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Hirshman, Steven Paul [ORNL; Lorenzini, R. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Marrelli, L. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Martines, E. [RFX, Padova, Italy; Momo, B. [RFX, Padova, Italy; Pomphrey, N. [Princeton Plasma Physics Laboratory (PPPL); Predebon, I. [RFX, Padova, Italy; Sanchez, Raul [ORNL; Spizzo, G. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Agnostini, M. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Alfier, A. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Apolloni, L. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Auriemma, F. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Baruzzo, M. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Bolzonella, T. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Bonomo, F. [Consorzio RFX, Italy; Brombin, M. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Canton, A. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Cappello, S. [Association Euratom ENEA Fusion, Consorzio RFX, Padua; Carraro, L. [Association Euratom ENEA Fusion, Consorzio RFX, Padua

    2010-01-01

    The full three-dimensional (3D) approach is now becoming an important issue for all magnetic confinement configurations. It is a necessary condition for the stellarator, but the tokamak and the reversed field pinch (RFP) also cannot be completely described in an axisymmetric framework. For the RFP, the observation of self-sustained helical configurations with improved plasma performance requires a better description in order to assess a new view of this configuration. In this new framework, plasma configuration studies for RFX-mod have been carried out both with tools developed for the RFP and with codes originally developed for the stellarator and adapted to the RFP. These helical states are reached through a transition to a very low/reversed shear configuration leading to internal electron transport barriers. These states are interrupted by MHD reconnection events, and the large T(e) gradients at the barriers indicate that both current- and pressure-driven modes are to be considered. Furthermore, the typically flat T(e) profiles in the helical core have raised the issue of the role of electrostatic and electromagnetic turbulence in these reduced-chaos regions, so that a stability analysis in the correct 3D geometry is required to address an optimization of the plasma setup. In this view the VMEC code proved to be an effective way to obtain helical equilibria to be studied in terms of stability and transport with a suite of well-tested codes. In this work, the equilibrium reconstruction technique as well as the experimental evidence of 3D effects and their first interpretation in terms of stability and transport are presented using both RFP and stellarator tools.

  9. TO THE PROBLEM OF LEGAL SYSTEM CLASSIFICATION: CIVILIZED APPROACH. TENDENCIES OF LEGAL FAMILIES APPROACHING IN THE CONDITIONS OF GLOBALIZATION

    Directory of Open Access Journals (Sweden)

    Rasskazov L. P.

    2015-09-01

    Full Text Available The article discusses various criteria for the classification of legal systems. Special attention is drawn to the civilizational approach, which can be effectively used in the classification of legal systems. In accordance with the civilizational approach, there are many civilizations in the world, each developing according to its own laws (for example, the Scythian civilization, the ancient Egyptian, etc.). In accordance with this approach, the history of mankind is a history of the development of civilizations. There are different definitions of civilization; in generalized form, a civilization is a community of people with particular characteristics in socio-political organization, economy and culture. From the point of view of the civilizational approach, all states can be divided into two types: Eastern (China, India, the Empire of the Incas, etc.), characterized by Marx as the "Asian mode of production", and Western, or progressive (especially European countries). Each of these types has its historical features, and in turn each of these types has its own legal families. It appears that the basis for the classification of legal systems is the normative element of the legal system, including law, legal principles, sources of law, the legal system, legislation and legal technique. But this criterion can be applied within one and the same type of civilization. In accordance with this criterion, the countries of the Western type can be divided into two large families: the Romano-Germanic and the Anglo-Saxon. It should be noted that globalization processes in the modern world lead to the convergence of legal families. In particular this applies to the Romano-Germanic and Anglo-Saxon legal families, between which there is a gradual disappearance of the traditional differences.

  10. Classification of follicular lymphoma images: a holistic approach with symbol-based machine learning methods.

    Science.gov (United States)

    Zorman, Milan; Sánchez de la Rosa, José Luis; Dinevski, Dejan

    2011-12-01

    It is not very often that a symbol-based machine learning approach is used for the purpose of image classification and recognition. In this paper we present such an approach, which we first used on follicular lymphoma images. Lymphoma is a broad term encompassing a variety of cancers of the lymphatic system, differentiated by the type of cell that multiplies and by how the cancer presents itself. It is very important to get an exact diagnosis regarding lymphoma and to determine the treatments that will be most effective for the patient's condition. Our work was focused on the identification of lymphomas by finding follicles in microscopy images provided by the Laboratory of Pathology of the University Hospital of Tenerife, Spain. We divided our work into two stages: in the first stage we did image pre-processing and feature extraction, and in the second stage we used different symbolic machine learning approaches for pixel classification. Symbolic machine learning approaches are often neglected when looking for image analysis tools: although they are known for a very appropriate knowledge representation, they are also claimed to lack computational power. The results we obtained are very promising and show that symbolic approaches can be successful in image analysis applications.

  11. Hydrogeological approach to investigation in karst for possible modification of groundwater regime and increase of recoverable reserves

    Science.gov (United States)

    Komatina, Miomir

    1990-09-01

    An artificial contribution to groundwater reserves in areas of interest for water supply is a principal methodological target in modern hydrogeology. Investigations directed to this goal are of increasing significance all over the world to meet the growing demand for good water, which groundwater generally is. Progress has been made in the sphere of practical development in permeable rocks of intergranular porosity, which cannot be said of discontinuous karst media, although karst seems to offer greater opportunities. The ingrained notion and fear, even among specialists, of the inherent risk and uncertainty were invariably present wherever a resource was discovered in karst of a geosynclinal area; consequently progress has been limited. The reasons, however, for such a cautious approach are diminishing, because much knowledge has been gained about these aquiferous rocks, especially through investigations in the regions of surface storage reservoirs. Better knowledge of karst features and the results achieved in construction and consolidation of surface reservoirs have indicated that large amounts of groundwater can be recovered. The conventional water investigation and recovery methods have made available only small safe yields equal to the lowest natural discharge (on the order of 100 l/s). A reasonable use of a karst water resource and its better management cannot be considered without artificial control of the groundwater regime, i.e., without adjusting the regime to human demands. Groundwater flow balance in karst is becoming one of the principal problems, and future activities should be directed to the search for a bolder solution. A multidisciplinary team of geologists, geomorphologists, hydrogeologists, hydrologists, hydraulic engineers, etc., is required. In this paper a variety of solutions for water resource utilization in naked geosynclinal karst is suggested and far greater activity in this field is encouraged.

  12. Fuzzy Continuous Review Inventory Model using ABC Multi-Criteria Classification Approach: A Single Case Study

    Directory of Open Access Journals (Sweden)

    Meriastuti - Ginting

    2015-06-01

    Full Text Available Abstract. Inventory is considered the most expensive, yet important, asset for many companies. It represents approximately 50% of the total investment. Inventory cost has become one of the major contributors to inefficiency; therefore it should be managed effectively. This study aims to propose an alternative inventory model, using an ABC multi-criteria classification approach to minimize total cost. By combining FANP (Fuzzy Analytical Network Process) and TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), the ABC multi-criteria classification approach identified 12 of 69 inventory items as the "outstanding important class" that contributed 80% of total inventory cost. This finding is then used as the basis to determine the proposed continuous review inventory model. This study found that by using fuzzy trapezoidal cost, the inventory turnover ratio can be increased and inventory cost can be decreased by 78% for each item in "class A" inventory. Keywords: ABC multi-criteria classification, FANP-TOPSIS, continuous review inventory model, lead-time demand distribution, trapezoidal fuzzy number
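
    The FANP-TOPSIS combination above ranks inventory items against multiple weighted criteria. The TOPSIS half of that pipeline can be sketched in a few lines of Python; the criteria, weights, and benefit/cost flags below are illustrative placeholders, not the authors' fuzzy implementation:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank items (rows) against criteria (columns) by closeness to the ideal."""
    n_crit = len(weights)
    # vector-normalize each criterion column, then apply the criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[row[j] / norms[j] * weights[j] for j in range(n_crit)] for row in matrix]
    cols = list(zip(*v))
    # ideal point takes the best value per criterion, anti-ideal the worst
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))  # 1.0 = at the ideal point
    return scores
```

    Items with scores closest to 1 would form the "outstanding important class"; in the fuzzy variant, crisp criterion values are replaced by trapezoidal fuzzy numbers.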

  14. A New Approach for Clustered MCs Classification with Sparse Features Learning and TWSVM

    Directory of Open Access Journals (Sweden)

    Xin-Sheng Zhang

    2014-01-01

    Full Text Available In digital mammograms, an early sign of breast cancer is the existence of microcalcification clusters (MCs), which are very important for early breast cancer detection. In this paper, a new approach is proposed to detect and classify MCs. We formulate this classification problem as sparse feature learning based classification, representing the test samples with a set of training samples that are also known as a "vocabulary" of visual parts. A visual information-rich vocabulary of training samples is manually built up from a set of samples, which include MCs parts and no-MCs parts. With the prior ground truth of MCs in mammograms, the sparse feature learning is acquired by the lp-regularized least square approach with the interior-point method. We then design the sparse feature learning based MCs classification algorithm using twin support vector machines (TWSVMs). To investigate its performance, the proposed method is applied to DDSM datasets and compared with support vector machines (SVMs) on the same dataset. Experiments have shown that the performance of the proposed method is comparable to or better than that of state-of-the-art methods.
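
    The sparse codes above are obtained with an interior-point solver; a common lighter-weight alternative for the same l1-regularized least-squares problem is a proximal (ISTA-style) iteration, whose core is the scalar soft-thresholding operator. A minimal sketch of that operator (an illustration of the sparsity mechanism, not the paper's actual solver):

```python
def soft_threshold(x, lam):
    """Proximal operator of lam * |x|: shrink x toward zero by lam."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0  # values inside [-lam, lam] are zeroed out: this creates sparsity
```

    In ISTA, each gradient step on the least-squares term is followed by applying this operator coordinate-wise, which is what drives most coefficients of the learned representation to exactly zero.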

  15. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan;

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme, as an effective approach to medical diagnosis. Getting to know the advantages and disadvantages of each computational intelligence technique in the recent years, the time has come...... diagnoses. The final result is a short but robust rule based classification scheme, achieving high degree of classification accuracy (exceeding 90% of accuracy for most classes) in a meaningful and user-friendly representation form for the medical expert. The domain of application analyzed through the paper...... is the well-known Pap-Test problem, corresponding to a numerical database, which consists of 450 medical records, 25 diagnostic attributes and 5 different diagnostic classes. Experimental data are divided in two equal parts for the training and testing phase, and 8 mutually dependent rules for diagnosis...

  16. A Transform-Based Feature Extraction Approach for Motor Imagery Tasks Classification

    Science.gov (United States)

    Khorshidtalab, Aida; Mesbah, Mostefa; Salami, Momoh J. E.

    2015-01-01

    In this paper, we present a new motor imagery classification method in the context of electroencephalography (EEG)-based brain–computer interface (BCI). This method uses a signal-dependent orthogonal transform, referred to as linear prediction singular value decomposition (LP-SVD), for feature extraction. The transform defines the mapping as the left singular vectors of the LP coefficient filter impulse response matrix. Using a logistic tree-based model classifier, the extracted features are classified into one of four motor imagery movements. The proposed approach was first benchmarked against two related state-of-the-art feature extraction approaches, namely, discrete cosine transform (DCT) and adaptive autoregressive (AAR)-based methods. By achieving an accuracy of 67.35%, the LP-SVD approach outperformed the other approaches by large margins (25% compared with DCT and 6% compared with AAR-based methods). To further improve the discriminatory capability of the extracted features and reduce the computational complexity, we enlarged the extracted feature subset by incorporating two extra features, namely, the Q- and Hotelling's $T^2$ statistics of the transformed EEG, and introduced a new EEG channel selection method. The performance of the EEG classification based on the expanded feature set and channel selection method was compared with that of a number of the state-of-the-art classification methods previously reported with the BCI IIIa competition data set. Our method came second with an average accuracy of 81.38%. PMID:27170898

  17. Comparison of Advanced Pixel Based (ANN and SVM) and Object-Oriented Classification Approaches Using Landsat-7 Etm+ Data

    OpenAIRE

    Prasun Kumar Gupta; Gaurav Kalidas Pakhale

    2010-01-01

    In this study, pixel-based and object-oriented image classification approaches were used for identifying different land use types in Karnal district. Imagery from Landsat-7 ETM+ with 6 spectral bands was used to perform the image classification. Ground truth data were collected from the available maps, personal knowledge and communication with the local people. In order to prepare the land use map, different approaches: Artificial Neural Network (ANN) and Support Vector Machine (SVM) were used. F...

  18. A multi-label, semi-supervised classification approach applied to personality prediction in social media.

    Science.gov (United States)

    Lima, Ana Carolina E S; de Castro, Leandro Nunes

    2014-10-01

    Social media allow web users to create and share content pertaining to different subjects, exposing their activities, opinions, feelings and thoughts. In this context, online social media has attracted the interest of data scientists seeking to understand behaviours and trends, whilst collecting statistics for social sites. One potential application for these data is personality prediction, which aims to understand a user's behaviour within social media. Traditional personality prediction relies on users' profiles, their status updates, the messages they post, etc. Here, a personality prediction system for social media data is introduced that differs from most approaches in the literature, in that it works with groups of texts, instead of single texts, and does not take users' profiles into account. Also, the proposed approach extracts meta-attributes from texts and does not work directly with the content of the messages. The set of possible personality traits is taken from the Big Five model and allows the problem to be characterised as a multi-label classification task. The problem is then transformed into a set of five binary classification problems and solved by means of a semi-supervised learning approach, due to the difficulty in annotating the massive amounts of data generated in social media. In our implementation, the proposed system was trained with three well-known machine-learning algorithms, namely a Naïve Bayes classifier, a Support Vector Machine, and a Multilayer Perceptron neural network. The system was applied to predict the personality of Tweets taken from three datasets available in the literature, and resulted in an approximately 83% accurate prediction, with some of the personality traits presenting better individual classification rates than others.
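
    The transformation described above, from one multi-label Big Five task into five binary tasks, is known as binary relevance and is simple to sketch. The trait names follow the Big Five model; everything else is illustrative:

```python
BIG_FIVE = ("openness", "conscientiousness", "extraversion",
            "agreeableness", "neuroticism")

def binary_relevance(samples, label_sets, traits=BIG_FIVE):
    """Turn one multi-label task into one binary task per trait.

    samples: list of feature objects (e.g. meta-attribute vectors per text group)
    label_sets: parallel list of sets of trait names present for each sample
    Returns a dict mapping each trait to its own (sample, bool) training set.
    """
    return {t: [(x, t in labels) for x, labels in zip(samples, label_sets)]
            for t in traits}
```

    Each of the five binary datasets can then be handed to any base learner (the paper uses Naive Bayes, SVM, and MLP) inside the semi-supervised loop.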

  19. Proposition of novel classification approach and features for improved real-time arrhythmia monitoring.

    Science.gov (United States)

    Kim, Yoon Jae; Heo, Jeong; Park, Kwang Suk; Kim, Sungwan

    2016-08-01

    Arrhythmia refers to a group of conditions in which the heartbeat is irregular, fast, or slow due to abnormal electrical activity in the heart. Some types of arrhythmia such as ventricular fibrillation may result in cardiac arrest or death. Thus, arrhythmia detection becomes an important issue, and various studies have been conducted. Additionally, an arrhythmia detection algorithm for portable devices such as mobile phones has recently been developed because of increasing interest in e-health care. This paper proposes a novel classification approach and features, which are validated for improved real-time arrhythmia monitoring. The classification approach that was employed for arrhythmia detection is based on the concept of ensemble learning and the Taguchi method and has the advantage of being accurate and computationally efficient. The electrocardiography (ECG) data for arrhythmia detection was obtained from the MIT-BIH Arrhythmia Database (n=48). A novel feature, namely the heart rate variability calculated from 5s segments of ECG, which was not considered previously, was used. The novel classification approach and feature demonstrated arrhythmia detection accuracy of 89.13%. When the same data was classified using the conventional support vector machine (SVM), the obtained accuracy was 91.69%, 88.14%, and 88.74% for Gaussian, linear, and polynomial kernels, respectively. In terms of computation time, the proposed classifier was 5821.7 times faster than conventional SVM. In conclusion, the proposed classifier and feature showed performance comparable to those of previous studies, while the computational complexity and update interval were highly reduced.
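
    The paper's novel feature is heart rate variability computed over 5 s ECG segments. One plausible variant, SDNN (the standard deviation of RR intervals) per segment, is sketched below; the exact HRV definition used by the authors may differ:

```python
from statistics import pstdev

def hrv_per_segment(r_peaks, seg_len=5.0):
    """SDNN of RR intervals within consecutive seg_len-second windows.

    r_peaks: detected R-peak times in seconds, ascending.
    Returns {segment index: population std dev of the RR intervals ending there}.
    """
    segments = {}
    for prev, cur in zip(r_peaks, r_peaks[1:]):
        # assign each RR interval to the window containing its ending R peak
        segments.setdefault(int(cur // seg_len), []).append(cur - prev)
    return {k: pstdev(v) if len(v) > 1 else 0.0
            for k, v in sorted(segments.items())}
```

    A perfectly regular rhythm yields zero variability; arrhythmic segments produce elevated values, which is what makes the feature discriminative.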

  20. An Ensemble Rule Learning Approach for Automated Morphological Classification of Erythrocytes.

    Science.gov (United States)

    Maity, Maitreya; Mungle, Tushar; Dhane, Dhiraj; Maiti, A K; Chakraborty, Chandan

    2017-04-01

    The analysis of pathophysiological change to erythrocytes is important for early diagnosis of anaemia. The manual assessment of pathology slides is time-consuming and complicated by the variety of cell types to be identified. This paper proposes an ensemble rule-based decision-making approach for morphological classification of erythrocytes. Firstly, the digital microscopic blood smear images are pre-processed for removal of spurious regions, followed by colour normalisation and thresholding. The erythrocytes are segmented from the background image using the watershed algorithm. Shape features are then extracted from the segmented image to detect shape abnormality present in microscopic blood smear images. The decision about the abnormality is taken using the proposed multiple rule-based expert systems, with majority ensemble voting deciding for abnormally shaped erythrocytes. Here, shape-based features are considered for nine different erythrocyte types, including normal erythrocytes. Further, the adaptive boosting algorithm is used to generate multiple decision tree models, where each model tree generates an individual rule set. A supervised classification method is followed to generate rules using a C4.5 decision tree. The proposed ensemble approach is precise in detecting eight types of abnormal erythrocytes, with an overall accuracy of 97.81%, weighted sensitivity of 97.33%, weighted specificity of 99.7%, and weighted precision of 98%. This demonstrates the robustness of the proposed strategy for classifying erythrocytes into abnormal and normal classes. The article also highlights the method's potential for incorporation into point-of-care technology solutions targeting rapid clinical assistance.
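
    The majority-vote step over the boosted rule sets can be sketched as follows; the rule functions and shape features here are hypothetical placeholders, not the actual C4.5 rules:

```python
from collections import Counter

def ensemble_vote(rule_sets, features):
    """Each rule set maps a cell's feature dict to a class label; majority wins."""
    votes = [rules(features) for rules in rule_sets]
    return Counter(votes).most_common(1)[0][0]
```

    With rules derived from different boosted trees, the ensemble corrects individual rule sets that misfire on borderline cell shapes:

```python
# illustrative shape rules, not the paper's learned thresholds
r1 = lambda f: "sickle" if f["elongation"] > 2.0 else "normal"
r2 = lambda f: "sickle" if f["circularity"] < 0.5 else "normal"
r3 = lambda f: "normal"
label = ensemble_vote([r1, r2, r3], {"elongation": 3.0, "circularity": 0.3})
```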

  1. A novel approach to ECG classification based upon two-layered HMMs in body sensor networks.

    Science.gov (United States)

    Liang, Wei; Zhang, Yinlong; Tan, Jindong; Li, Yang

    2014-03-27

    This paper presents a novel approach to ECG signal filtering and classification. Unlike traditional techniques, which aim at collecting and processing the ECG signals with the patient still, lying in bed in hospital, our proposed algorithm is intentionally designed for monitoring and classifying the patient's ECG signals in the free-living environment. The patients are equipped with wearable ambulatory devices the whole day, which facilitates real-time heart attack detection. In ECG preprocessing, an integral-coefficient-band-stop (ICBS) filter is applied, which omits time-consuming floating-point computations. In addition, two-layered Hidden Markov Models (HMMs) are applied to achieve ECG feature extraction and classification. The periodic ECG waveforms are segmented into ISO intervals, P subwave, QRS complex and T subwave respectively in the first HMM layer, where an expert-annotation-assisted Baum-Welch algorithm is utilized in HMM modeling. Then the corresponding interval features are selected and applied to categorize the ECG into normal type or abnormal type (PVC, APC) in the second HMM layer. For verifying the effectiveness of our algorithm on abnormal signal detection, we have developed an ECG body sensor network (BSN) platform, whereby real-time ECG signals are collected, transmitted, displayed and the corresponding classification outcomes are deduced and shown on the BSN screen.
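
    The ICBS filter's selling point is integer-only arithmetic. One classical integer-coefficient structure with band-stop behaviour is the comb filter y[n] = x[n] - x[n-k], which places notches at every multiple of fs/k; this is an illustration of the idea, not the authors' actual filter design:

```python
import math

def comb_notch(x, k):
    """y[n] = x[n] - x[n-k]: coefficients are just +1/-1, notches at m*fs/k."""
    return [xn - (x[n - k] if n >= k else 0.0) for n, xn in enumerate(x)]
```

    With fs = 500 Hz and k = 10, the notches fall at 0, 50, 100, ... Hz, so 50 Hz powerline interference and DC baseline offset are cancelled exactly, using only additions and subtractions.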

  2. Intelligent Video Object Classification Scheme using Offline Feature Extraction and Machine Learning based Approach

    Directory of Open Access Journals (Sweden)

    Chandra Mani Sharma

    2012-01-01

    Full Text Available Classification of objects in video streams is important because of its applications in many emerging areas such as visual surveillance, content-based video retrieval and indexing. The task is far more challenging because video data is heavy and highly variable in nature, and the processing must run in real time. This paper presents a multiclass object classification technique using a machine learning approach. Haar-like features are used for training the classifier. The feature calculation is performed using the Integral Image representation, and we train the classifier offline using Stage-wise Additive Modeling using a Multiclass Exponential loss function (SAMME). The validity of the method has been verified through the implementation of a real-time human-car detector. Experimental results show that the proposed method can accurately classify objects, in video, into their respective classes. The proposed object classifier works well in outdoor environments in the presence of moderate lighting conditions and variable scene backgrounds. The proposed technique is compared with other object classification techniques based on various performance parameters.
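
    SAMME differs from binary AdaBoost only in its estimator-weight formula, which adds a ln(K-1) term so that a K-class weak learner need only beat random guessing (error below 1 - 1/K) to receive positive weight. A sketch of that weight:

```python
import math

def samme_alpha(err, n_classes):
    """SAMME estimator weight: ln((1 - err) / err) + ln(K - 1)."""
    return math.log((1 - err) / err) + math.log(n_classes - 1)
```

    For K = 2 this reduces to the classical AdaBoost weight; for the multiclass human/car/background setting, even a learner with 50% error still contributes positively.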

  4. An improved hyperspectral image classification approach based on ISODATA and SKR method

    Science.gov (United States)

    Hong, Pu; Ye, Xiao-feng; Yu, Hui; Zhang, Zhi-jie; Cai, Yu-fei; Tang, Xin; Tang, Wei; Wang, Chensheng

    2016-11-01

    Hyperspectral images provide not only spatial information but also a wealth of spectral information. A short list of applications includes environmental mapping, global change research, geological research, wetlands mapping, assessment of trafficability, plant and mineral identification and abundance estimation, crop analysis, and bathymetry. A crucial aspect of hyperspectral image analysis is the identification of materials present in an object or scene being imaged. Classification of a hyperspectral image sequence amounts to identifying which pixels contain various spectrally distinct materials that have been specified by the user. Several techniques for classification of multi-hyperspectral pixels have been used, from minimum distance and maximum likelihood classifiers to correlation matched filter-based approaches such as spectral signature matching and the spectral angle mapper. In this paper, an improved hyperspectral image classification algorithm is proposed. In the proposed method, an improved similarity measurement is applied, in which both spectrum similarity and space similarity are considered. We use two different weighted matrices to estimate the spectrum similarity and space similarity between two pixels, respectively, and then determine whether the two pixels represent the same material. In order to reduce the computational cost, the wavelet transform is also applied prior to extracting the spectral and spatial features. The proposed method is tested using hyperspectral imagery collected by the National Aeronautics and Space Administration Jet Propulsion Laboratory. Experimental results demonstrate the efficiency of this new method on hyperspectral images associated with space object material identification.
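
    Of the baseline measures mentioned, the spectral angle mapper is the simplest to sketch: it compares two pixel spectra by the angle between them as vectors, making it insensitive to overall illumination scaling. A minimal version:

```python
import math

def spectral_angle(a, b):
    """Angle (radians) between two pixel spectra; 0 means identical shape."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    # clamp to [-1, 1] to guard against floating-point rounding
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))
```

    Two pixels of the same material under different brightness give an angle near zero, while spectrally distinct materials give large angles; the proposed method augments such spectrum similarity with a spatial similarity term.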

  5. GSM-MRF based classification approach for real-time moving object detection

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Statistical and contextual information are typically used to detect moving regions in image sequences for a fixed camera. In this paper, we propose a fast and stable linear discriminant approach based on Gaussian Single Model (GSM) and Markov Random Field (MRF). The performance of GSM is analyzed first, and then two main improvements corresponding to the drawbacks of GSM are proposed: the latest filtered data based update scheme of the background model and the linear classification judgment rule based on spatial-temporal feature specified by MRF. Experimental results show that the proposed method runs more rapidly and accurately when compared with other methods.
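
    The "latest filtered data based update scheme" recursively refreshes the per-pixel Gaussian background model. A generic running-Gaussian update of this kind looks as follows; the learning rate rho and the exact update order are illustrative assumptions, not the paper's scheme:

```python
def update_gaussian_background(mu, var, x, rho=0.05):
    """Exponentially-weighted update of a single-Gaussian background pixel.

    mu, var: current background mean and variance for this pixel
    x: newest (filtered) pixel intensity; rho: learning rate in (0, 1)
    """
    mu_new = (1 - rho) * mu + rho * x
    var_new = (1 - rho) * var + rho * (x - mu_new) ** 2
    return mu_new, var_new
```

    A pixel whose new intensity lies many standard deviations from mu is flagged as foreground before the model is updated; a slowly drifting background (lighting changes) is absorbed into mu over time.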

  6. A multidimensional signal processing approach for classification of microwave measurements with application to stroke type diagnosis.

    Science.gov (United States)

    Mesri, Hamed Yousefi; Najafabadi, Masoud Khazaeli; McKelvey, Tomas

    2011-01-01

    A multidimensional signal processing method is described for detection of bleeding stroke based on microwave measurements from an antenna array placed around the head of the patient. The method is data driven and the algorithm uses samples from a healthy control group to calculate the feature used for classification. The feature is derived using a tensor approach and the higher order singular value decomposition is a key component. A leave-one-out validation method is used to evaluate the properties of the method using clinical data.

  7. A new approach for the combined chemical and mineral classification of the inorganic matter in coal. 2. Potential applications of the classification systems

    Energy Technology Data Exchange (ETDEWEB)

    Stanislav V. Vassilev; Christina G. Vassileva; David Baxter; Lars K. Andersen [Bulgarian Academy of Sciences, Sofia (Bulgaria). Central Laboratory of Mineralogy and Crystallography

    2009-02-15

    Part 1 of the present work introduced and evaluated a new approach for the combined chemical and mineral classification of the inorganic matter in coal. The benefit of these classification systems is the use of significant correlations and actual element associations, and well-defined and genetically described mineral classes and species in coal. Potential applications of the chemically and mineralogically categorized coal types and subtypes are discussed in the present part 2. The data show that various technological problems, environmental risks and health concerns of coal use are related directly or indirectly to specific mineral and chemical coal types and subtypes. Furthermore, a concept of 'self-cleaning fuels' also is introduced and developed herein based on mineral coal types. The application of these chemical and mineral classification systems and concept is proposed to both the scientific and industrial community. 54 refs., 1 fig., 3 tabs.

  8. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA.

    Science.gov (United States)

    Coughlan, Michael R

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.

  10. Improved Wetland Classification Using Eight-Band High Resolution Satellite Imagery and a Hybrid Approach

    Directory of Open Access Journals (Sweden)

    Charles R. Lane

    2014-12-01

    Full Text Available Although remote sensing technology has long been used in wetland inventory and monitoring, the accuracy and detail level of wetland maps derived with moderate resolution imagery and traditional techniques have been limited and often unsatisfactory. We explored and evaluated the utility of a newly launched high-resolution, eight-band satellite system (WorldView-2; WV2) for identifying and classifying freshwater deltaic wetland vegetation and aquatic habitats in the Selenga River Delta of Lake Baikal, Russia, using a hybrid approach and a novel application of Indicator Species Analysis (ISA). We achieved an overall classification accuracy of 86.5% (Kappa coefficient: 0.85) for 22 classes of aquatic and wetland habitats and found that additional metrics, such as the Normalized Difference Vegetation Index and image texture, were valuable for improving the overall classification accuracy and particularly for discriminating among certain habitat classes. Our analysis demonstrated that including WV2's four spectral bands from parts of the spectrum less commonly used in remote sensing analyses, along with the more traditional bandwidths, contributed an increase of ~4% in overall classification accuracy, with considerable increases in our ability to discriminate certain communities. The coastal band improved differentiating open water and aquatic (i.e., vegetated) habitats, and the yellow, red-edge, and near-infrared 2 bands improved discrimination among different vegetated aquatic and terrestrial habitats. The use of ISA provided statistical rigor in developing associations between spectral classes and field-based data. Our analyses demonstrated the utility of a hybrid approach and the benefit of additional bands and metrics in providing the first spatially explicit mapping of a large and heterogeneous wetland system.
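
    The NDVI metric cited above is a simple band ratio between near-infrared and red reflectance; the reflectance values in the example are illustrative:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    total = nir + red
    return (nir - red) / total if total else 0.0  # guard against zero reflectance
```

    Healthy vegetation reflects strongly in NIR and absorbs red, pushing NDVI toward +1, while open water absorbs NIR and yields negative values, which is why the index helps separate vegetated from aquatic habitat classes.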

  11. Macroeconomic regimes

    NARCIS (Netherlands)

    Baele, L.T.M.; Bekaert, G.R.J.; Cho, S.; Inghelbrecht, K.; Moreno, A.

    2015-01-01

    A New-Keynesian macro-model is estimated accommodating regime-switching behavior in monetary policy and macro-shocks. A key to our estimation strategy is the use of survey-based expectations for inflation and output. Output and inflation shocks shift to the low-volatility regime around 1985 and 1990.

  12. Exchange rate regimes and monetary arrangements

    Directory of Open Access Journals (Sweden)

    Ivan Ribnikar

    2005-06-01

    Full Text Available There is a close relationship between a country's exchange rate regime and its monetary arrangement, and if we are to examine monetary arrangements then exchange rate regimes must first be analysed. Within the conventional and most widely used classification of exchange rate regimes into rigid and flexible, or into polar regimes (hard peg and float) on one side and intermediate regimes on the other, there is much greater variety among the intermediate regimes. A more precise and, as will be seen, more useful classification of exchange rate regimes is the first topic of the paper. The second topic is how exchange rate regimes influence or determine monetary arrangements and monetary policy or monetary policy regimes: monetary autonomy versus monetary non-autonomy, and discretion in monetary policy versus commitment in monetary policy. Both topics are important for countries on their path to the EU and the euro area.

  13. Detection and classification of interstitial lung diseases and emphysema using a joint morphological-fuzzy approach

    Science.gov (United States)

    Chang Chien, Kuang-Che; Fetita, Catalin; Brillet, Pierre-Yves; Prêteux, Françoise; Chang, Ruey-Feng

    2009-02-01

    Multi-detector computed tomography (MDCT) has high accuracy and specificity in volumetrically capturing serial images of the lung. It increases the capability of computerized classification for lung tissue in medical research. This paper proposes a three-dimensional (3D) automated approach based on mathematical morphology and fuzzy logic for quantifying and classifying interstitial lung diseases (ILDs) and emphysema. The proposed methodology is composed of several stages: (1) an image multi-resolution decomposition scheme based on a 3D morphological filter is used to detect and analyze the different density patterns of the lung texture. Then, (2) for each pattern in the multi-resolution decomposition, six features are computed, for which fuzzy membership functions define a probability of association with a pathology class. Finally, (3) for each pathology class, the probabilities are combined according to the weight assigned to each membership function, and two threshold values are used to decide the final class of the pattern. The proposed approach was tested on 10 MDCT cases and the classification accuracy was: emphysema: 95%, fibrosis/honeycombing: 84%, and ground glass: 97%.
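
    A typical choice for such fuzzy membership functions is the trapezoid, which maps a feature value to a degree of association in [0, 1]. The breakpoints a, b, c, d below are illustrative, not the paper's calibrated values:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: rises on [a, b], flat at 1 on [b, c], falls on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)   # rising edge
    return (d - x) / (d - c)       # falling edge
```

    One such function per (feature, pathology class) pair produces the memberships that stage (3) combines with per-function weights before thresholding.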

  14. An image-based approach for classification of human micro-doppler radar signatures

    Science.gov (United States)

    Tivive, Fok Hing Chi; Phung, Son Lam; Bouzerdoum, Abdesselam

    2013-05-01

    With the advances in radar technology, there is an increasing interest in automatic radar-based human gait identification. This is because radar signals can penetrate through most dielectric materials. In this paper, an image-based approach is proposed for classifying human micro-Doppler radar signatures. The time-varying radar signal is first converted into a time-frequency representation, which is then cast as a two-dimensional image. A descriptor is developed to extract micro-Doppler features from local time-frequency patches centered along the torso Doppler frequency. Experimental results based on real data collected from a 24-GHz Doppler radar showed that the proposed approach achieves promising classification performance.
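The first processing step, turning the time-varying radar return into a time-frequency image, can be sketched with a naive short-time DFT (a plain-Python stand-in for the spectrogram computation; the signal, sampling rate, window and hop sizes are assumed for illustration):

```python
import cmath, math

def spectrogram(signal, win=64, hop=32):
    """Naive short-time Fourier magnitude: each row is the spectrum of one
    window, so the result can be treated as a 2-D time-frequency image."""
    frames = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win]
        # magnitude of the first win//2 DFT bins (a real signal is symmetric)
        col = [abs(sum(seg[n] * cmath.exp(-2j * math.pi * k * n / win)
                       for n in range(win)))
               for k in range(win // 2)]
        frames.append(col)
    return frames  # time x frequency "image"

# Simulated return: a 50 Hz micro-Doppler tone sampled at 1 kHz (illustrative)
fs, f0 = 1000, 50
sig = [math.sin(2 * math.pi * f0 * n / fs) for n in range(512)]
img = spectrogram(sig)
peak_bin = max(range(len(img[0])), key=lambda k: img[0][k])
# bin frequency = k * fs / win, so the peak should fall near 50 Hz
```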

  15. A case-comparison study of automatic document classification utilizing both serial and parallel approaches

    Science.gov (United States)

    Wilges, B.; Bastos, R. C.; Mateus, G. P.; Dantas, M. A. R.

    2014-10-01

    A well-known problem faced by any organization nowadays is the high volume of available data and the processing required to transform this volume into differential information. In this study, a case-comparison of automatic document classification (ADC) approaches is presented, utilizing both serial and parallel paradigms. The serial approach was implemented by adopting the RapidMiner software tool, which is recognized as the world-leading open-source system for data mining. On the other hand, considering the MapReduce programming model, the Hadoop software environment was used. The main goal of this case-comparison study is to exploit differences between these two paradigms, especially when large volumes of data, such as Web text documents, are utilized to build a category database. In the literature, many studies point out that distributed processing of unstructured documents with Hadoop has been yielding efficient results. Results from our research indicate a threshold to such efficiency.
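The MapReduce model underlying the Hadoop side of the comparison can be illustrated with a toy term-counting job, the kind of aggregation used when building a category database; the map/shuffle/reduce split below is schematic, not Hadoop code:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc_id, text):
    # Emit (term, 1) pairs per document; doc_id mirrors the MapReduce key.
    return [(word.lower(), 1) for word in text.split()]

def shuffle(pairs):
    # Group all values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the grouped counts per term.
    return {key: sum(values) for key, values in groups.items()}

docs = {1: "spam spam ham", 2: "ham eggs"}
pairs = list(chain.from_iterable(map_phase(d, t) for d, t in docs.items()))
counts = reduce_phase(shuffle(pairs))
```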

  16. Land cover classification of Landsat 8 satellite data based on Fuzzy Logic approach

    Science.gov (United States)

    Taufik, Afirah; Sakinah Syed Ahmad, Sharifah

    2016-06-01

    The aim of this paper is to propose a method to classify the land covers of a satellite image based on a fuzzy rule-based system approach. The study uses bands in Landsat 8 and other indices, such as the Normalized Difference Water Index (NDWI), Normalized Difference Built-up Index (NDBI) and Normalized Difference Vegetation Index (NDVI), as input for the fuzzy inference system. The three selected indices represent our three main classes, called water, built-up land, and vegetation. The combination of the original multispectral bands and the selected indices provides more information about the image. The parameter selection of the fuzzy memberships is performed using a supervised method known as ANFIS (adaptive neuro-fuzzy inference system) training. The fuzzy system is tested on the classification of a land cover image covering the Klang Valley area. The results showed that the fuzzy system approach is effective and can be explored and implemented for other areas of Landsat data.
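The three indices are standard normalized-difference ratios over Landsat 8 bands (B3 green, B4 red, B5 NIR, B6 SWIR-1); a minimal per-pixel sketch, with reflectance values assumed for illustration:

```python
def normalized_difference(a, b):
    """(a - b) / (a + b), guarding against division by zero."""
    return (a - b) / (a + b) if (a + b) != 0 else 0.0

# Standard Landsat 8 band roles: B3 green, B4 red, B5 NIR, B6 SWIR-1.
def indices(green, red, nir, swir1):
    return {
        "NDVI": normalized_difference(nir, red),     # vegetation
        "NDWI": normalized_difference(green, nir),   # water
        "NDBI": normalized_difference(swir1, nir),   # built-up land
    }

# Illustrative surface reflectances for a vegetated pixel (assumed values)
veg = indices(green=0.10, red=0.08, nir=0.45, swir1=0.20)
# a vegetated pixel gives high NDVI and negative NDWI/NDBI
```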

  17. Prediction of “Aggregation-Prone” Peptides with Hybrid Classification Approach

    Directory of Open Access Journals (Sweden)

    Bo Liu

    2015-01-01

    Full Text Available Protein aggregation is a biological phenomenon caused by the aggregation of misfolded proteins and is associated with a wide variety of diseases, such as Alzheimer’s, Parkinson’s, and prion diseases. Many studies indicate that protein aggregation is mediated by short “aggregation-prone” peptide segments. Thus, the prediction of aggregation-prone sites plays a crucial role in the research of drug targets. Compared with labor-intensive and time-consuming experimental approaches, the computational prediction of aggregation-prone sites is desirable for its convenience and high efficiency. In this study, we introduce two computational approaches, Aggre_Easy and Aggre_Balance, for predicting aggregation residues from sequence information; here, the protein samples are represented by the composition of k-spaced amino acid pairs (CKSAAP). We use a hybrid classification approach to predict aggregation-prone residues, which integrates naïve Bayes classification to reduce the number of features, and two undersampling approaches, EasyEnsemble and BalanceCascade, to deal with the sample imbalance problem. Aggre_Easy achieves a promising performance, with a sensitivity of 79.47%, a specificity of 80.70% and an MCC of 0.42; the sensitivity, specificity, and MCC of Aggre_Balance reach 70.32%, 80.70% and 0.42. Experimental results show that the performance of the Aggre_Easy and Aggre_Balance predictors is better than that of several other state-of-the-art predictors. A user-friendly web server for aggregation-prone prediction is built and freely accessible to the public at the website.
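The CKSAAP encoding mentioned above can be sketched as follows: for each gap k, count every ordered residue pair separated by k intervening positions and normalize. The toy sequence and k_max value are illustrative; the paper's exact parameterization may differ:

```python
from itertools import product

AA = "ACDEFGHIKLMNPQRSTVWY"
PAIRS = ["".join(p) for p in product(AA, repeat=2)]  # 400 ordered pairs

def cksaap(seq, k_max=2):
    """Composition of k-spaced amino acid pairs: for each gap k = 0..k_max,
    the normalized frequency of every ordered pair (x, k residues, y)."""
    features = []
    for k in range(k_max + 1):
        n_pairs = len(seq) - k - 1
        counts = dict.fromkeys(PAIRS, 0)
        for i in range(n_pairs):
            counts[seq[i] + seq[i + k + 1]] += 1
        features.extend(counts[p] / n_pairs for p in PAIRS)
    return features  # length 400 * (k_max + 1)

vec = cksaap("ACDAC", k_max=1)
# k=0 adjacent pairs: AC, CD, DA, AC -> "AC" has frequency 2/4 = 0.5
```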

  18. A novel approach to analysing the regimes of temporary streams in relation to their controls on the composition and structure of aquatic biota

    Directory of Open Access Journals (Sweden)

    F. Gallart

    2012-09-01

    Full Text Available Temporary streams are those water courses that undergo the recurrent cessation of flow or the complete drying of their channel. The structure and composition of biological communities in temporary stream reaches are strongly dependent on the temporal changes of the aquatic habitats determined by the hydrological conditions. Therefore, the structural and functional characteristics of aquatic fauna to assess the ecological quality of a temporary stream reach cannot be used without taking into account the controls imposed by the hydrological regime. This paper develops methods for analysing temporary streams' aquatic regimes, based on the definition of six aquatic states that summarize the transient sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: Hyperrheic, Eurheic, Oligorheic, Arheic, Hyporheic and Edaphic. When the hydrological conditions lead to a change in the aquatic state, the structure and composition of the aquatic community changes according to the new set of available habitats. We used the water discharge records from gauging stations or simulations with rainfall-runoff models to infer the temporal patterns of occurrence of these states in the Aquatic States Frequency Graph we developed. The visual analysis of this graph is complemented by the development of two metrics which describe the permanence of flow and the seasonal predictability of zero flow periods. Finally, a classification of temporary streams in four aquatic regimes in terms of their influence over the development of aquatic life is updated from the existing classifications, with stream aquatic regimes defined as Permanent, Temporary-pools, Temporary-dry and Episodic. While aquatic regimes describe the long-term overall variability of the hydrological conditions of the river section and have been used for many years by hydrologists and ecologists, aquatic states describe the availability of mesohabitats in given periods.

  19. A novel approach to analysing the regimes of temporary streams in relation to their controls on the composition and structure of aquatic biota

    Science.gov (United States)

    Gallart, F.; Prat, N.; García-Roger, E. M.; Latron, J.; Rieradevall, M.; Llorens, P.; Barberá, G. G.; Brito, D.; De Girolamo, A. M.; Lo Porto, A.; Buffagni, A.; Erba, S.; Neves, R.; Nikolaidis, N. P.; Perrin, J. L.; Querner, E. P.; Quiñonero, J. M.; Tournoud, M. G.; Tzoraki, O.; Skoulikidis, N.; Gómez, R.; Sánchez-Montoya, M. M.; Froebrich, J.

    2012-09-01

    Temporary streams are those water courses that undergo the recurrent cessation of flow or the complete drying of their channel. The structure and composition of biological communities in temporary stream reaches are strongly dependent on the temporal changes of the aquatic habitats determined by the hydrological conditions. Therefore, the structural and functional characteristics of aquatic fauna to assess the ecological quality of a temporary stream reach cannot be used without taking into account the controls imposed by the hydrological regime. This paper develops methods for analysing temporary streams' aquatic regimes, based on the definition of six aquatic states that summarize the transient sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: Hyperrheic, Eurheic, Oligorheic, Arheic, Hyporheic and Edaphic. When the hydrological conditions lead to a change in the aquatic state, the structure and composition of the aquatic community changes according to the new set of available habitats. We used the water discharge records from gauging stations or simulations with rainfall-runoff models to infer the temporal patterns of occurrence of these states in the Aquatic States Frequency Graph we developed. The visual analysis of this graph is complemented by the development of two metrics which describe the permanence of flow and the seasonal predictability of zero flow periods. Finally, a classification of temporary streams in four aquatic regimes in terms of their influence over the development of aquatic life is updated from the existing classifications, with stream aquatic regimes defined as Permanent, Temporary-pools, Temporary-dry and Episodic. While aquatic regimes describe the long-term overall variability of the hydrological conditions of the river section and have been used for many years by hydrologists and ecologists, aquatic states describe the availability of mesohabitats in given periods.
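One of the two metrics, the permanence of flow, can be sketched as the fraction of months with non-zero discharge; the regime thresholds below are illustrative assumptions, not the class limits derived in the paper:

```python
def flow_permanence(monthly_flow):
    """Fraction of months with non-zero flow (an illustrative version of a
    flow-permanence metric; the paper's exact definition may differ)."""
    wet = sum(1 for q in monthly_flow if q > 0)
    return wet / len(monthly_flow)

def classify_regime(permanence):
    # Illustrative thresholds only; the paper derives its own class limits.
    if permanence >= 0.99:
        return "Permanent"
    if permanence >= 0.5:
        return "Temporary-pools"
    if permanence >= 0.1:
        return "Temporary-dry"
    return "Episodic"

# Two years of monthly discharge (m3/s), dry in summer (assumed data)
flows = [1.2, 1.0, 0.8, 0.5, 0.2, 0.0, 0.0, 0.0, 0.1, 0.4, 0.9, 1.1] * 2
p = flow_permanence(flows)      # 18 of 24 months flow -> 0.75
regime = classify_regime(p)
```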

  20. A HYBRID APPROACH BASED MEDICAL IMAGE RETRIEVAL SYSTEM USING FEATURE OPTIMIZED CLASSIFICATION SIMILARITY FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Yogapriya Jaganathan

    2013-01-01

    Full Text Available For the past few years, massive upgrades have been made in the field of Content Based Medical Image Retrieval (CBMIR) for the effective utilization of medical images based on visual feature analysis, for the purposes of diagnosis and educational research. Existing medical image retrieval systems are still not optimal for solving the feature-dimensionality reduction problem, which increases computational complexity and decreases the speed of the retrieval process. The proposed CBMIR uses a hybrid approach based on feature extraction, optimization of feature vectors, classification of features and similarity measurement. This type of CBMIR is called the Feature Optimized Classification Similarity (FOCS) framework. The selected features are textures, using Gray Level Co-occurrence Matrix features (GLCM) and Tamura Features (TF), from which the extracted features form a feature-vector database. The Fuzzy based Particle Swarm Optimization (FPSO) technique is used to reduce the feature-vector dimensionality, and classification is performed using a Fuzzy based Relevance Vector Machine (FRVM) to form groups of relevant image features that provide a natural way to classify dimensionally reduced feature vectors of images. The Euclidean Distance (ED) is used as the similarity measure between the query image and the target images. The FOCS approach takes a query from the user and retrieves the needed images from the databases. Retrieval performance is estimated in terms of precision and recall. This FOCS framework offers several benefits compared to existing CBMIR: GLCM and TF are used to extract texture features and form a feature-vector database; Fuzzy-PSO reduces the feature-vector dimensionality while selecting the important features, decreasing computational complexity; and Fuzzy based RVM is used for feature classification.
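The final similarity step can be sketched as a Euclidean-distance ranking over the (dimensionally reduced) feature vectors; the vectors and image names below are hypothetical:

```python
import math

def euclidean(u, v):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def retrieve(query_vec, database, top_k=2):
    """Rank database images by distance to the query and return the closest."""
    ranked = sorted(database.items(), key=lambda kv: euclidean(query_vec, kv[1]))
    return [name for name, _ in ranked[:top_k]]

# Hypothetical reduced texture-feature vectors (e.g. after GLCM/Tamura + selection)
db = {
    "img_a": [0.2, 0.9, 0.4],
    "img_b": [0.8, 0.1, 0.7],
    "img_c": [0.25, 0.85, 0.35],
}
hits = retrieve([0.22, 0.88, 0.38], db)
# img_a and img_c are closest to the query; img_b is far away
```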

  1. CHOICE OF SURGICAL APPROACH FOR ACETABULAR COMPONENT’S IMPLANTATION USING CURRENT CLASSIFICATION FOR ARTHRITIS FOLLOWING ACETABULAR FRACTURE

    Directory of Open Access Journals (Sweden)

    R. M. Tikhilov

    2011-01-01

    Full Text Available Degenerative-dystrophic changes in the hip after treatment of an acetabular fracture develop over time in about 60% of affected patients. In such cases, total hip replacement is used. Existing classifications (for example, AO or Letournel) are well suited to fracture treatment, but not to arthritis following acetabular fracture. The group of patients with post-traumatic arthritis is heterogeneous in the severity of post-traumatic anatomic changes. The basis for choosing the surgical approach could be a current classification of post-traumatic changes that takes into account the features of anatomic and functional changes in the hip and the bone defects of the acetabulum. This article presents the radiographic and clinical basis for such a classification.

  2. Evaluating an ensemble classification approach for crop diversity verification in Danish greening subsidy control

    Science.gov (United States)

    Chellasamy, Menaka; Ferré, Ty Paul Andrew; Greve, Mogens Humlekrog

    2016-07-01

    Beginning in 2015, Danish farmers are obliged to meet specific crop diversification rules based on total land area and number of crops cultivated to be eligible for new greening subsidies. Hence, there is a need for the Danish government to extend their subsidy control system to verify farmers' declarations to warrant greening payments under the new crop diversification rules. Remote Sensing (RS) technology has been used since 1992 to control farmers' subsidies in Denmark. However, a proper RS-based approach is yet to be finalised to validate new crop diversity requirements designed for assessing compliance under the recent subsidy scheme (2014-2020). This study uses an ensemble classification approach (proposed by the authors in previous studies) for validating the crop diversity requirements of the new rules. The approach uses a neural network ensemble classification system with bi-temporal (spring and early summer) WorldView-2 imagery (WV2) and includes the following steps: (1) automatic computation of pixel-based prediction probabilities using multiple neural networks; (2) quantification of the classification uncertainty using Endorsement Theory (ET); (3) discrimination of crop pixels and validation of the crop diversification rules at farm level; and (4) identification of farmers who are violating the requirements for greening subsidies. The prediction probabilities are computed by a neural network ensemble supplied with training samples selected automatically using farmers' declared parcels (field vectors containing crop information and the field boundary of each crop). Crop discrimination is performed by considering a set of conclusions derived from individual neural networks based on ET. Verification of the diversification rules is performed by incorporating pixel-based classification uncertainty or confidence intervals with the class labels at the farmer level. 
The proposed approach was tested with WV2 imagery acquired in 2011 for a study area in Vennebjerg
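The ensemble step, multiple networks producing per-pixel class probabilities that are then combined with an uncertainty measure, can be sketched with simple probability averaging and an agreement score (a stand-in for the Endorsement Theory combination actually used; the class names and outputs are invented):

```python
def ensemble_predict(probabilities):
    """Average per-class probabilities from several networks and report a
    simple confidence measure (agreement), loosely standing in for the
    Endorsement Theory combination used in the paper."""
    classes = probabilities[0].keys()
    n = len(probabilities)
    mean = {c: sum(p[c] for p in probabilities) / n for c in classes}
    label = max(mean, key=mean.get)
    votes = sum(1 for p in probabilities if max(p, key=p.get) == label)
    return label, mean[label], votes / n   # class, probability, agreement

# Three hypothetical network outputs for one pixel
nets = [
    {"wheat": 0.7, "barley": 0.2, "grass": 0.1},
    {"wheat": 0.6, "barley": 0.3, "grass": 0.1},
    {"wheat": 0.4, "barley": 0.5, "grass": 0.1},
]
label, prob, agreement = ensemble_predict(nets)
# "wheat" wins on mean probability, but only 2 of 3 networks agree
```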

  3. Effects of a Peer Assessment System Based on a Grid-Based Knowledge Classification Approach on Computer Skills Training

    Science.gov (United States)

    Hsu, Ting-Chia

    2016-01-01

    In this study, a peer assessment system using the grid-based knowledge classification approach was developed to improve students' performance during computer skills training. To evaluate the effectiveness of the proposed approach, an experiment was conducted in a computer skills certification course. The participants were divided into three…

  4. Diagnostic classification of specific phobia subtypes using structural MRI data: a machine-learning approach.

    Science.gov (United States)

    Lueken, Ulrike; Hilbert, Kevin; Wittchen, Hans-Ulrich; Reif, Andreas; Hahn, Tim

    2015-01-01

    While neuroimaging research has advanced our knowledge about fear circuitry dysfunctions in anxiety disorders, findings based on diagnostic groups do not translate into diagnostic value for the individual patient. Machine-learning generates predictive information that can be used for single subject classification. We applied Gaussian process classifiers to a sample of patients with specific phobia as a model disorder for pathological forms of anxiety to test for classification based on structural MRI data. Gray (GM) and white matter (WM) volumetric data were analyzed in 33 snake phobics (SP; animal subtype), 26 dental phobics (DP; blood-injection-injury subtype) and 37 healthy controls (HC). Results showed good accuracy rates for GM and WM data in predicting phobia subtypes (GM: 62 % phobics vs. HC, 86 % DP vs. HC, 89 % SP vs. HC, 89 % DP vs. SP; WM: 88 % phobics vs. HC, 89 % DP vs. HC, 79 % SP vs. HC, 79 % DP vs. HC). Regarding GM, classification improved when considering the subtype compared to overall phobia status. The discriminatory brain pattern was not solely based on fear circuitry structures but included widespread cortico-subcortical networks. Results demonstrate that multivariate pattern recognition represents a promising approach for the development of neuroimaging-based diagnostic markers that could support clinical decisions. Regarding the increasing number of fMRI studies on anxiety disorders, researchers are encouraged to use functional and structural data not only for studying phenotype characteristics on a group level, but also to evaluate their incremental value for diagnostic or prognostic purposes.

  5. An integrated approach for identifying wrongly labelled samples when performing classification in microarray data.

    Directory of Open Access Journals (Sweden)

    Yuk Yee Leung

    Full Text Available BACKGROUND: Using a hybrid approach for gene selection and classification is common, as the results obtained are generally better than performing the two tasks independently. Yet, for some microarray datasets, both the classification accuracy and the stability of the gene sets obtained still have room for improvement. This may be due to the presence of samples with wrong class labels (i.e. outliers). Outlier detection algorithms proposed so far are either not suitable for microarray data, or only solve the outlier detection problem on their own. RESULTS: We tackle the outlier detection problem based on a previously proposed Multiple-Filter-Multiple-Wrapper (MFMW) model, which was demonstrated to yield promising results when compared to other hybrid approaches (Leung and Hung, 2010). To incorporate outlier detection and overcome limitations of the existing MFMW model, three new features are introduced in our proposed MFMW-outlier approach: (1) an unbiased external leave-one-out cross-validation framework is developed to replace the internal cross-validation of the previous MFMW model; (2) wrongly labeled samples are identified within the MFMW-outlier model; and (3) a stable set of genes is selected using an L1-norm SVM that removes any redundant genes present. Six binary-class microarray datasets were tested. Compared with outlier detection studies on the same datasets, MFMW-outlier detected all the outliers found in the original paper (for which the data was provided for analysis), and the genes selected after outlier removal were shown to have biological relevance. We also compared MFMW-outlier with PRAPIV (Zhang et al., 2006) on the same synthetic datasets. MFMW-outlier gave better average precision and recall values in three different settings. Lastly, artificially flipped microarray datasets were created by removing our detected outliers and flipping some of the remaining samples' labels. Almost all the 'wrong' (artificially flipped) samples were detected.
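The external leave-one-out cross-validation framework of feature (1) can be sketched as follows, with a nearest-neighbour classifier standing in for the MFMW ensemble; the toy data are invented:

```python
def loocv_accuracy(samples, labels, train_and_predict):
    """External leave-one-out cross-validation: the held-out sample is
    excluded from every training step (feature selection included), which
    avoids the optimistic bias of internal cross-validation."""
    correct = 0
    for i in range(len(samples)):
        train_x = samples[:i] + samples[i + 1:]
        train_y = labels[:i] + labels[i + 1:]
        if train_and_predict(train_x, train_y, samples[i]) == labels[i]:
            correct += 1
    return correct / len(samples)

def nearest_neighbor(train_x, train_y, test_x):
    # Stand-in classifier; the paper uses a filter/wrapper ensemble instead.
    dists = [sum((a - b) ** 2 for a, b in zip(x, test_x)) for x in train_x]
    return train_y[dists.index(min(dists))]

X = [[0.0], [0.1], [0.2], [1.0], [1.1], [1.2]]
y = ["low", "low", "low", "high", "high", "high"]
acc = loocv_accuracy(X, y, nearest_neighbor)  # perfectly separable toy set
```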

  6. Classification of fish samples via an integrated proteomics and bioinformatics approach.

    Science.gov (United States)

    Bellgard, Matthew; Taplin, Ross; Chapman, Brett; Livk, Andreja; Wellington, Crispin; Hunter, Adam; Lipscombe, Richard

    2013-11-01

    There is an increasing demand to develop cost-effective and accurate approaches to analyzing biological tissue samples. This is especially relevant in the fishing industry where closely related fish samples can be mislabeled, and the high market value of certain fish leads to the use of alternative species as substitutes, for example, Barramundi and Nile Perch (belonging to the same genus, Lates). There is a need to combine selective proteomic datasets with sophisticated computational analysis to devise a robust classification approach. This paper describes an integrated MS-based proteomics and bioinformatics approach to classifying a range of fish samples. A classifier is developed using training data that successfully discriminates between Barramundi and Nile Perch samples using a selected protein subset of the proteome. Additionally, the classifier is shown to successfully discriminate between test samples not used to develop the classifier, including samples that have been cooked, and to classify other fish species as neither Barramundi nor Nile Perch. This approach has applications to truth in labeling for fishmongers and restaurants, monitoring fish catches, and for scientific research into distances between species. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Flood mapping using VHR satellite imagery: a comparison between different classification approaches

    Science.gov (United States)

    Franci, Francesca; Boccardo, Piero; Mandanici, Emanuele; Roveri, Elena; Bitelli, Gabriele

    2016-10-01

    Various regions in Europe have suffered from severe flooding over the last decades. Flood disasters often have a broad extent and a high frequency. They are considered the most devastating natural hazards because of the tremendous fatalities, injuries, property damages, economic and social disruption that they cause. In this context, Earth Observation techniques have become a key tool for flood risk and damage assessment. In particular, remote sensing facilitates flood surveying, providing valuable information, e.g. flood occurrence, intensity and progress of flood inundation, spurs and embankments affected/threatened. The present work aims to investigate the use of Very High Resolution satellite imagery for mapping flood-affected areas. The case study is the November 2013 flood event which occurred in Sardinia region (Italy), affecting a total of 2,700 people and killing 18 persons. The investigated zone extends for 28 km2 along the Posada river, from the Maccheronis dam to the mouth in the Tyrrhenian sea. A post-event SPOT6 image was processed by means of different classification methods, in order to produce the flood map of the analysed area. The unsupervised classification algorithm ISODATA was tested. A pixel-based supervised technique was applied using the Maximum Likelihood algorithm; moreover, the SPOT 6 image was processed by means of object-oriented approaches. The produced flood maps were compared among each other and with an independent data source, in order to evaluate the performance of each method, also in terms of time demand.

  8. Land cover data from Landsat single-date archive imagery: an integrated classification approach

    Science.gov (United States)

    Bajocco, Sofia; Ceccarelli, Tomaso; Rinaldo, Simone; De Angelis, Antonella; Salvati, Luca; Perini, Luigi

    2012-10-01

    The analysis of land cover dynamics provides insight into many environmental problems. However, there are few data sources from which consistent time series can be derived, remote sensing being one of the most valuable. Because of the multi-temporal and spatial coverage required, such analysis is usually based on large land cover datasets, which demand automated, objective and repeatable procedures. The USGS Landsat archives provide free access to multispectral, high-resolution remotely sensed data starting from the mid-eighties; in many cases, however, only single-date images are available. This paper suggests an objective approach for generating land cover information from 30 m resolution, single-date Landsat archive satellite imagery. A procedure was developed integrating pixel-based and object-oriented classifiers, consisting of the following basic steps: (i) pre-processing of the satellite image, including radiance and reflectance calibration, texture analysis and derivation of vegetation indices; (ii) segmentation of the pre-processed image; (iii) classification integrating both radiometric and textural properties. The integrated procedure was tested for an area in the Sardinia Region, Italy, and compared with a purely pixel-based one. Results demonstrated that a better overall accuracy, evaluated against the available land cover cartography, was obtained with the integrated classification (86%) than with the pixel-based one (68%) at the first CORINE Land Cover level. The proposed methodology needs to be further tested to evaluate its transferability in time (constructing comparable land cover time series) and space (covering larger areas).

  9. Identifying Microphone from Noisy Recordings by Using Representative Instance One Class-Classification Approach

    Directory of Open Access Journals (Sweden)

    Huy Quan Vu

    2012-06-01

    Full Text Available The rapid growth of technical developments has created huge challenges for microphone forensics - a sub-category of audio forensic science - because of the availability of numerous digital recording devices and massive amounts of recording data. Demand for fast and efficient methods to assure the integrity and authenticity of information is becoming more and more important in criminal investigation nowadays. Machine learning has emerged as an important technique to support the audio analysis processes of microphone forensic practitioners. However, its application to real-life situations using supervised learning still faces great challenges due to the expense of collecting data and updating the system. In this paper, we introduce a new machine learning approach called One-class Classification (OCC) to be applied to microphone forensics; we demonstrate its capability on a corpus of audio samples collected from several microphones. In addition, we propose a representative instance classification framework (RICF) that can effectively improve the performance of OCC algorithms for recording signals with noise. Experiment results and analysis indicate that OCC has the potential to benefit microphone forensic practitioners in developing new tools and techniques for effective and efficient analysis.
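The one-class idea, learning only from the target microphone's recordings and rejecting anything unusual, can be sketched with a centroid-plus-threshold model; this is a deliberately minimal stand-in, not the OCC algorithms or the RICF framework of the paper, and the feature values are invented:

```python
import math

class CentroidOCC:
    """Minimal one-class classifier: learn the centroid of target-class
    feature vectors and accept new samples within a distance threshold."""
    def fit(self, X, quantile=0.95):
        n, d = len(X), len(X[0])
        self.center = [sum(x[j] for x in X) / n for j in range(d)]
        dists = sorted(self._dist(x) for x in X)
        # Threshold at the given quantile of training distances.
        self.threshold = dists[min(int(quantile * n), n - 1)]
        return self

    def _dist(self, x):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, self.center)))

    def predict(self, x):
        return "target" if self._dist(x) <= self.threshold else "outlier"

# Hypothetical 2-D acoustic features from one known microphone
train = [[1.0, 1.0], [1.1, 0.9], [0.9, 1.1], [1.0, 1.2], [1.2, 1.0]]
occ = CentroidOCC().fit(train)
# a nearby sample is accepted; a distant one is flagged as another device
```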

  10. A machine learning approach for classification of anatomical coverage in CT

    Science.gov (United States)

    Wang, Xiaoyong; Lo, Pechin; Ramakrishna, Bharath; Goldin, Johnathan; Brown, Matthew

    2016-03-01

    Automatic classification of the anatomical coverage of medical images is critical for big data mining and as a pre-processing step to automatically trigger specific computer-aided diagnosis systems. The traditional way of identifying scans through DICOM headers has various limitations due to the manual entry of series descriptions and non-standardized naming conventions. In this study, we present a machine learning approach in which multiple binary classifiers are used to classify the different anatomical coverages of CT scans. A one-vs-rest strategy was applied. For a given training set, a template scan was selected from the positive samples and all other scans were registered to it. Each registered scan was then evenly split into k × k × k non-overlapping blocks, and for each block the mean intensity was computed. This resulted in a 1 × k³ feature vector for each scan. The feature vectors were then used to train an SVM-based classifier. In this feasibility study, four classifiers were built to identify the anatomic coverages of brain, chest, abdomen-pelvis, and chest-abdomen-pelvis CT scans. Each classifier was trained and tested using a set of 300 scans from different subjects, composed of 150 positive and 150 negative samples. The area under the ROC curve (AUC) on the testing set was measured to evaluate performance in a two-fold cross-validation setting. Our results showed good classification performance, with an average AUC of 0.96.
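The block-mean feature extraction can be sketched directly from the description: split the registered scan into k × k × k blocks and take each block's mean intensity. The toy volume below is assumed:

```python
def block_mean_features(volume, k=2):
    """Split a 3-D scan into k x k x k non-overlapping blocks and return the
    mean intensity of each block as a flat 1 x k^3 feature vector."""
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    features = []
    for bz in range(k):
        for by in range(k):
            for bx in range(k):
                vals = [volume[z][y][x]
                        for z in range(bz * nz // k, (bz + 1) * nz // k)
                        for y in range(by * ny // k, (by + 1) * ny // k)
                        for x in range(bx * nx // k, (bx + 1) * nx // k)]
                features.append(sum(vals) / len(vals))
    return features

# Tiny 4x4x4 "scan" whose intensity equals its z index (assumed data)
vol = [[[z for x in range(4)] for y in range(4)] for z in range(4)]
feats = block_mean_features(vol, k=2)
# the four lower blocks (z = 0,1) average 0.5; the four upper ones average 2.5
```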

  11. Machine Learning Based Classification of Microsatellite Variation: An Effective Approach for Phylogeographic Characterization of Olive Populations.

    Science.gov (United States)

    Torkzaban, Bahareh; Kayvanjoo, Amir Hossein; Ardalan, Arman; Mousavi, Soraya; Mariotti, Roberto; Baldoni, Luciana; Ebrahimie, Esmaeil; Ebrahimi, Mansour; Hosseini-Mazinani, Mehdi

    2015-01-01

    Finding efficient analytical techniques is increasingly a bottleneck for the effective use of large biological datasets. Machine learning offers a novel and powerful tool to advance classification and modeling solutions in molecular biology. However, these methods have been less frequently used with empirical population genetics data. In this study, we developed a new combined approach to data analysis, applying machine learning algorithms to microsatellite marker data from our previous studies of olive populations. Herein, 267 olive accessions of various origins, including 21 reference cultivars, 132 local ecotypes, and 37 wild olive specimens from the Iranian plateau, together with 77 of the most represented Mediterranean varieties, were investigated using a finely selected panel of 11 microsatellite markers. We organized the data in two experiments, '4-targeted' and '16-targeted'. A strategy of assaying different machine-based analyses (i.e. data cleaning, feature selection, and machine learning classification) was devised to identify the most informative loci and the most diagnostic alleles to represent the population and the geography of each olive accession. These analyses revealed the microsatellite markers with the highest differentiating capacity and proved the efficiency of our method of clustering olive accessions to reflect their regions of origin. A distinguished highlight of this study was the discovery of the best combination of markers for better differentiation of populations via machine learning models, which can be exploited to distinguish among other biological populations.

  12. A High Throughput Ambient Mass Spectrometric Approach to Species Identification and Classification from Chemical Fingerprint Signatures

    Science.gov (United States)

    Musah, Rabi A.; Espinoza, Edgard O.; Cody, Robert B.; Lesiak, Ashton D.; Christensen, Earl D.; Moore, Hannah E.; Maleknia, Simin; Drijfhout, Falko P.

    2015-07-01

    A high throughput method for species identification and classification through chemometric processing of direct analysis in real time (DART) mass spectrometry-derived fingerprint signatures has been developed. The method entails introduction of samples to the open air space between the DART ion source and the mass spectrometer inlet, with the entire observed mass spectral fingerprint subjected to unsupervised hierarchical clustering. A range of both polar and non-polar chemotypes are instantaneously detected. The result is identification and species-level classification based on the entire DART-MS spectrum. Here, we illustrate how the method can be used to: (1) distinguish between endangered woods regulated by the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES); (2) assess the origin, and by extension the properties, of biodiesel feedstocks; (3) determine insect species from analysis of puparial casings; (4) distinguish between psychoactive plant products; and (5) differentiate between Eucalyptus species. An advantage of the hierarchical clustering approach to processing the DART-MS-derived fingerprint is that it shows both similarities and differences between species based on their chemotypes. Furthermore, full knowledge of the identities of the constituents contained within the small-molecule profile of the analyzed samples is not required.

  13. Automatic detection of photoresist residual layer in lithography using a neural classification approach

    KAUST Repository

    Gereige, Issam

    2012-09-01

    Photolithography is a fundamental process in the semiconductor industry and is considered a key element towards extreme nanoscale integration. In this technique, a photosensitive polymer mask with the desired patterns is created on the substrate to be etched. Roughly speaking, the areas to be etched are not covered with polymer; thus, no residual layer should remain on these areas in order to ensure an optimal transfer of the patterns onto the substrate. In this paper, we propose a nondestructive method, based on a classification approach using an artificial neural network, for automatic residual layer detection from an ellipsometric signature. Only the case of a regular defect, i.e. a homogeneous residual layer, is considered, and the limitations of the method are discussed. Finally, an experimental result on a 400 nm period grating manufactured with nanoimprint lithography is analyzed with our method. © 2012 Elsevier B.V. All rights reserved.

  14. Stability Assessment of Natural Caves Using Empirical Approaches and Rock Mass Classifications

    Science.gov (United States)

    Jordá-Bordehore, L.

    2017-08-01

    The stability of underground voids such as caves can be assessed, to a first approximation, with geomechanical classifications such as the Barton Q index. From a geomechanical viewpoint, the stability of 137 large-span natural caves was analyzed herein. The caves were plotted, according to their width and rock quality index Q, on existing graphs developed for tunnels and underground excavations. Many of the natural caves, when assessed with this tunnel-engineering approach, would appear unstable on the existing empirical graphs and would require reinforcements incompatible with speleothems and large chamber heights. A new graph and equation for the maximum span are proposed herein for the exclusive case of caves, resulting in a reliable representation of large stable natural caves. The main contribution is a new stability chart for natural caves consisting of two zones: one where stable caves plot and one where unstable and collapsed caves plot.
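
    For illustration, the tunnel-engineering baseline discussed above can be coded directly using Barton's classical unsupported-span rule (span ≈ 2·ESR·Q^0.4). Note this is the generic tunnelling rule of thumb, not the new cave-specific equation proposed in the paper, whose coefficients are not reproduced here.

```python
def barton_max_unsupported_span(q_index, esr=1.0):
    """Barton's empirical maximum unsupported span (m) for rock quality Q.

    esr is the Excavation Support Ratio (larger for temporary or less
    critical openings).  Classical tunnelling rule of thumb only; the
    paper proposes a different, cave-specific chart.
    """
    return 2.0 * esr * q_index ** 0.4

def is_apparently_stable(span_m, q_index, esr=1.0):
    """Flag a chamber as apparently stable if its span is below the
    empirical maximum for its rock quality."""
    return span_m <= barton_max_unsupported_span(q_index, esr)

print(barton_max_unsupported_span(10.0))        # ≈ 5.0 m for Q = 10, ESR = 1
print(is_apparently_stable(20.0, 40.0, esr=5.0))
```

    The paper's point is precisely that many demonstrably stable caves fall on the "unstable" side of such tunnel-derived rules, motivating a cave-specific chart.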

  15. Use of a Novel Grammatical Inference Approach in Classification of Amyloidogenic Hexapeptides

    Directory of Open Access Journals (Sweden)

    Wojciech Wieczorek

    2016-01-01

    Full Text Available The present paper is a novel contribution to the field of bioinformatics, using grammatical inference in the analysis of data. We developed an algorithm for generating star-free regular expressions which turned out to be good recommendation tools, as they are characterized by a relatively high correlation coefficient between the observed and predicted binary classifications. The experiments were performed on three datasets of amyloidogenic hexapeptides, and our results are compared with those obtained using graph approaches, the current state-of-the-art methods in heuristic automata induction, and a support vector machine. The results showed the superior performance of the new grammatical inference algorithm on fixed-length amyloid datasets.
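
    The correlation coefficient between observed and predicted binary classifications is conventionally the Matthews correlation coefficient (MCC); a minimal sketch with toy labels follows (the metric choice is an assumption, since the abstract does not name it).

```python
import math

def mcc(observed, predicted):
    """Matthews correlation coefficient for binary labels (0/1)."""
    tp = sum(1 for o, p in zip(observed, predicted) if o == 1 and p == 1)
    tn = sum(1 for o, p in zip(observed, predicted) if o == 0 and p == 0)
    fp = sum(1 for o, p in zip(observed, predicted) if o == 0 and p == 1)
    fn = sum(1 for o, p in zip(observed, predicted) if o == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # Convention: return 0 when any confusion-matrix margin is empty.
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Toy labels: amyloidogenic (1) vs non-amyloidogenic (0) hexapeptides.
observed  = [1, 1, 1, 0, 0, 0, 1, 0]
predicted = [1, 1, 0, 0, 0, 1, 1, 0]
print(mcc(observed, predicted))  # → 0.5
```

    MCC is well suited to this task because, unlike plain accuracy, it stays informative when the two classes are imbalanced.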

  16. ISOLATING CONTENT AND METADATA FROM WEBLOGS USING CLASSIFICATION AND RULE-BASED APPROACHES

    Energy Technology Data Exchange (ETDEWEB)

    Marshall, Eric J.; Bell, Eric B.

    2011-09-04

    The emergence and increasing prevalence of social media, such as internet forums, weblogs (blogs), wikis, etc., has created a new opportunity to measure public opinion, attitude, and social structures. A major challenge in leveraging this information is isolating the content and metadata in weblogs, as there is no standard, universally supported, machine-readable format for presenting this information. We present two algorithms for isolating this information. The first uses web block classification, where each node in the Document Object Model (DOM) for a page is classified according to one of several pre-defined attributes from a common blog schema. The second uses a set of heuristics to select web blocks. These algorithms perform at a level suitable for initial use, validating this approach for isolating content and metadata from blogs. The resultant data serves as a starting point for analytical work on the content and substance of collections of weblog pages.

  17. A multiresolution approach to automated classification of protein subcellular location images

    Directory of Open Access Journals (Sweden)

    Srinivasa Gowri

    2007-06-01

    Full Text Available Background: Fluorescence microscopy is widely used to determine the subcellular location of proteins. Efforts to determine location on a proteome-wide basis create a need for automated methods to analyze the resulting images. Over the past ten years, the feasibility of using machine learning methods to recognize all major subcellular location patterns has been convincingly demonstrated, using diverse feature sets and classifiers. On a well-studied data set of 2D HeLa single-cell images, the best performance to date, 91.5%, was obtained by including a set of multiresolution features. This demonstrates the value of multiresolution approaches to this important problem. Results: We report here a novel approach for the classification of subcellular location patterns by classifying in multiresolution subspaces. Our system is able to work with any feature set and any classifier. It consists of multiresolution (MR) decomposition, followed by feature computation and classification in each MR subspace, yielding local decisions that are then combined into a global decision. With 26 texture features alone and a neural network classifier, we obtained an increase in accuracy on the 2D HeLa data set to 95.3%. Conclusion: We demonstrate that the space-frequency localized information in the multiresolution subspaces adds significantly to the discriminative power of the system. Moreover, we show that a vastly reduced set of features is sufficient, consisting of our novel modified Haralick texture features. Our proposed system is general, allowing for any combination of feature sets and classifiers.
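
    The fusion of per-subspace local decisions into a global decision can be sketched as a confidence-weighted vote; the weighting scheme and the labels below are illustrative assumptions, not the paper's exact combiner.

```python
from collections import defaultdict

def global_decision(local_decisions):
    """Combine per-subspace (label, confidence) votes into one label.

    local_decisions: one (label, confidence) pair per multiresolution
    subspace classifier; confidences act as vote weights.
    """
    weights = defaultdict(float)
    for label, confidence in local_decisions:
        weights[label] += confidence
    return max(weights, key=weights.get)

# Three subspace classifiers disagree; the weighted vote resolves it.
votes = [("nucleolar", 0.9), ("mitochondrial", 0.4), ("nucleolar", 0.3)]
print(global_decision(votes))  # → nucleolar
```

    Because the combiner only sees (label, confidence) pairs, any feature set and classifier can be used inside each subspace, matching the paper's claim of generality.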

  18. An iterative approach to optimize change classification in SAR time series data

    Science.gov (United States)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan

    2016-10-01

    The detection of changes using remote sensing imagery has become a broad field of research with many approaches for many different applications. Besides the simple detection of changes between at least two images acquired at different times, analyses which aim at the change type or category are at least equally important. In this study, an approach for a semi-automatic classification of change segments is presented. A sparse dataset is considered to ensure fast and simple applicability for practical use. The dataset comprises 15 high resolution (HR) TerraSAR-X (TSX) amplitude images acquired over a time period of one year (11/2013 to 11/2014). The scenery contains the airport of Stuttgart (GER) and its surroundings, including urban, rural, and suburban areas. Time series imagery offers the advantage of analyzing the change frequency of selected areas. In this study, the focus is set on the analysis of small, frequently changing regions like parking areas, construction sites and collecting points consisting of high activity (HA) change objects. For each HA change object, suitable features are extracted and k-means clustering is applied as the categorization step. Resulting clusters are finally compared to a previously introduced knowledge-based class catalogue, which is modified until an optimal class description results. In other words, the subjective understanding of the scenery semantics is optimized against the reality given by the data. In this way, even a sparse dataset containing only amplitude imagery can be evaluated without requiring comprehensive training datasets. Falsely defined classes might be rejected, classes which were defined too coarsely might be divided into sub-classes, and classes which were initially defined too narrowly might be merged. An optimal classification results when the combination of previously defined key indicators (e.g., the number of clusters per class) reaches an optimum.
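
    The k-means categorization step can be sketched in pure Python; the two-dimensional toy features (e.g. change frequency, mean amplitude) are illustrative assumptions, not the features used in the study.

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on lists of feature vectors; returns cluster labels."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centre by Euclidean distance.
        labels = [min(range(k), key=lambda c: math.dist(p, centers[c]))
                  for p in points]
        # Update step: mean of each cluster's members.
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Toy change-object features, e.g. (change frequency, mean amplitude):
objs = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
print(kmeans(objs, 2))  # first two objects share one label, last two the other
```

    In the study, the number of clusters per class then serves as one of the key indicators compared against the knowledge-based class catalogue.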

  19. Web Approach for Ontology-Based Classification, Integration, and Interdisciplinary Usage of Geoscience Metadata

    Directory of Open Access Journals (Sweden)

    B Ritschel

    2012-10-01

    Full Text Available The Semantic Web is a W3C approach that integrates the different sources of semantics within documents and services using ontology-based techniques. The main objective of this approach in the geoscience domain is the improvement of understanding, integration, and usage of Earth and space science related web content in terms of data, information, and knowledge for machines and people. The modeling and representation of semantic attributes and relations within and among documents can be realized by human-readable concept maps and machine-readable OWL documents. The objectives for the usage of the Semantic Web approach in the GFZ data center ISDC project are the design of an extended classification of metadata documents for product types related to instruments, platforms, and projects as well as the integration of different types of metadata related to data product providers, users, and data centers. Sources of content and semantics for the description of Earth and space science product types and related classes are standardized metadata documents (e.g., DIF documents), publications, grey literature, and Web pages. Other sources are information provided by users, such as tagging data and social navigation information. The integration of controlled vocabularies as well as folksonomies plays an important role in the design of well-formed ontologies.

  20. Clinical features of organophosphate poisoning: A review of different classification systems and approaches

    Directory of Open Access Journals (Sweden)

    John Victor Peter

    2014-01-01

    Full Text Available Purpose: The typical toxidrome in organophosphate (OP) poisoning comprises the Salivation, Lacrimation, Urination, Defecation, Gastric cramps, Emesis (SLUDGE) symptoms. However, several other manifestations are described. We review the spectrum of symptoms and signs in OP poisoning as well as the different approaches to clinical features in these patients. Materials and Methods: Articles were obtained by electronic search of PubMed® between 1966 and April 2014 using the search terms organophosphorus compounds or phosphoric acid esters AND poison or poisoning AND manifestations. Results: Of the 5026 articles on OP poisoning, 2584 articles pertained to human poisoning; 452 articles focusing on clinical manifestations in human OP poisoning were retrieved for detailed evaluation. In addition to the traditional approach of classifying symptoms and signs of OP poisoning as peripheral (muscarinic, nicotinic) and central nervous system receptor stimulation, symptoms were alternatively approached using a time-based classification. In this, symptom onset was categorized as acute (within 24 h), delayed (24 h to 2 weeks) or late (beyond 2 weeks). Although most symptoms occur within minutes or hours following acute exposure, delayed onset symptoms, occurring after a period of minimal or mild symptoms, may impact treatment and the timing of discharge following acute exposure. Symptoms and signs were also viewed as organ specific, i.e., cardiovascular, respiratory or neurological manifestations. An organ-specific approach enables focused management of individual organ dysfunction that may vary with different OP compounds. Conclusions: Different approaches to the symptoms and signs in OP poisoning may better our understanding of the underlying mechanisms, which in turn may assist with the management of acutely poisoned patients.

  1. Clinical features of organophosphate poisoning: A review of different classification systems and approaches.

    Science.gov (United States)

    Peter, John Victor; Sudarsan, Thomas Isiah; Moran, John L

    2014-11-01

    The typical toxidrome in organophosphate (OP) poisoning comprises the Salivation, Lacrimation, Urination, Defecation, Gastric cramps, Emesis (SLUDGE) symptoms. However, several other manifestations are described. We review the spectrum of symptoms and signs in OP poisoning as well as the different approaches to clinical features in these patients. Articles were obtained by electronic search of PubMed(®) between 1966 and April 2014 using the search terms organophosphorus compounds or phosphoric acid esters AND poison or poisoning AND manifestations. Of the 5026 articles on OP poisoning, 2584 articles pertained to human poisoning; 452 articles focusing on clinical manifestations in human OP poisoning were retrieved for detailed evaluation. In addition to the traditional approach of classifying symptoms and signs of OP poisoning as peripheral (muscarinic, nicotinic) and central nervous system receptor stimulation, symptoms were alternatively approached using a time-based classification. In this, symptom onset was categorized as acute (within 24 h), delayed (24 h to 2 weeks) or late (beyond 2 weeks). Although most symptoms occur within minutes or hours following acute exposure, delayed onset symptoms, occurring after a period of minimal or mild symptoms, may impact treatment and the timing of discharge following acute exposure. Symptoms and signs were also viewed as organ specific, i.e., cardiovascular, respiratory or neurological manifestations. An organ-specific approach enables focused management of individual organ dysfunction that may vary with different OP compounds. Different approaches to the symptoms and signs in OP poisoning may better our understanding of the underlying mechanisms, which in turn may assist with the management of acutely poisoned patients.
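
    The time-based classification described in this review reduces to two cut-offs and can be encoded directly (cut-offs taken from the abstract: 24 hours and 2 weeks).

```python
def classify_onset(hours_since_exposure):
    """Time-based category for OP-poisoning symptom onset.

    Cut-offs follow the review's scheme: acute (within 24 h),
    delayed (24 h to 2 weeks), late (beyond 2 weeks).
    """
    if hours_since_exposure < 24:
        return "acute"
    if hours_since_exposure <= 14 * 24:
        return "delayed"
    return "late"

print(classify_onset(6))        # → acute
print(classify_onset(72))       # → delayed
print(classify_onset(15 * 24))  # → late
```

    The boundary handling (e.g. whether exactly 24 h counts as acute or delayed) is an assumption; the abstract does not specify it.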

  2. A support vector machine approach for classification of welding defects from ultrasonic signals

    Science.gov (United States)

    Chen, Yuan; Ma, Hong-Wei; Zhang, Guang-Ming

    2014-07-01

    Defect classification is an important issue in ultrasonic non-destructive evaluation. A layered multi-class support vector machine (LMSVM) classification system, which combines multiple SVM classifiers through a layered architecture, is proposed in this paper. The proposed LMSVM classification system is applied to the classification of welding defects from ultrasonic test signals. The measured ultrasonic defect echo signals are first decomposed into wavelet coefficients by the wavelet packet transform. The energies of the wavelet coefficients in different frequency channels are used to construct the feature vectors. The bees algorithm (BA) is then used for feature selection and SVM parameter optimisation for the LMSVM classification system. The BA-based feature selection optimises the energy feature vectors. The optimised feature vectors are input to the LMSVM classification system for training and testing. Experimental results of classifying welding defects demonstrate that the proposed technique is highly robust, precise and reliable for ultrasonic defect classification.
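
    The wavelet-packet energy features can be sketched with a Haar filter bank; the paper does not state which wavelet was used, so Haar is assumed here purely for simplicity.

```python
def haar_step(signal):
    """One Haar analysis step: (approximation, detail) half-band pair."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def wavelet_packet_energies(signal, levels):
    """Energy of every node at the deepest wavelet-packet level.

    Unlike the plain wavelet transform, the packet transform splits
    detail bands too, giving 2**levels frequency channels whose
    energies form the feature vector.
    """
    nodes = [signal]
    for _ in range(levels):
        nodes = [half for node in nodes for half in haar_step(node)]
    return [sum(x * x for x in node) for node in nodes]

# Toy "echo" of length 8 decomposed into 4 frequency channels:
echo = [1.0, 1.0, 1.0, 1.0, -1.0, -1.0, -1.0, -1.0]
print(wavelet_packet_energies(echo, 2))  # → [2.0, 0.0, 0.0, 0.0]
```

    In the paper this feature vector is then pruned by the bees algorithm before entering the layered SVM.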

  3. Assessment of the classification abilities of the CNS multi-parametric optimization approach by the method of logistic regression.

    Science.gov (United States)

    Raevsky, O A; Polianczyk, D E; Mukhametov, A; Grigorev, V Y

    2016-08-01

    Assessment of the "CNS drugs/CNS candidates" classification abilities of the multi-parametric optimization (CNS MPO) approach was performed by logistic regression. It was found that five of the six individually used physicochemical properties (topological polar surface area, number of hydrogen-bond donor atoms, basicity, and lipophilicity of the compound in neutral form and at pH = 7.4) provided recognition accuracy below 60%. Only the molecular weight (MW) descriptor could correctly classify two-thirds of the studied compounds. Aggregating all six properties into the MPO score did not improve the classification, which was worse than classification using MW alone. The results of our study demonstrate the imperfection of the CNS MPO approach; in its current form it is not very useful for the computer design of new, effective CNS drugs.
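
    Fitting a one-descriptor logistic model and reading off its classification accuracy, as done per property in this study, can be sketched as follows; the gradient-descent fit and the toy data are illustrative, not the paper's dataset or solver.

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """1-D logistic regression P(y=1|x) = sigmoid(w*x + b),
    fit by batch gradient descent on the log-loss."""
    w = b = 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            grad_w += (p - y) * x
            grad_b += (p - y)
        w -= lr * grad_w / n
        b -= lr * grad_b / n
    return w, b

def accuracy(w, b, xs, ys):
    """Fraction of labels recovered by thresholding at P = 0.5."""
    preds = [1 if w * x + b > 0 else 0 for x in xs]
    return sum(p == y for p, y in zip(preds, ys)) / len(ys)

# Toy single descriptor (e.g. a scaled MW value) vs. CNS-active label:
xs = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = fit_logistic(xs, ys)
print(accuracy(w, b, xs, ys))  # → 1.0 (toy data is separable)
```

    Repeating this per descriptor and comparing accuracies is the gist of the paper's assessment; on real CNS data the single-descriptor accuracies were mostly below 60%.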

  4. In silico prediction of ROCK II inhibitors by different classification approaches.

    Science.gov (United States)

    Cai, Chuipu; Wu, Qihui; Luo, Yunxia; Ma, Huili; Shen, Jiangang; Zhang, Yongbin; Yang, Lei; Chen, Yunbo; Wen, Zehuai; Wang, Qi

    2017-08-02

    ROCK II is an important pharmacological target linked to central nervous system disorders such as Alzheimer's disease. The purpose of this research is to generate ROCK II inhibitor prediction models by machine learning approaches. Firstly, four sets of descriptors were calculated with MOE 2010 and PaDEL-Descriptor, and optimized by F-score and linear forward selection methods. In addition, four classification algorithms were used to initially build 16 classifiers: k-nearest neighbors (k-NN), naïve Bayes, random forest, and support vector machine. Furthermore, three sets of structural fingerprint descriptors were introduced to enhance the predictive capacity of the classifiers, which were assessed with fivefold cross-validation, test set validation and external test set validation. The best two models, MFK + MACCS and MLR + SubFP, both have MCC values of 0.925 on the external test set. After that, a privileged substructure analysis was performed to reveal common chemical features of ROCK II inhibitors. Finally, binding modes were analyzed to identify relationships between molecular descriptors and activity, while the main interactions were revealed by comparing the docking interactions of the most potent and the weakest ROCK II inhibitors. To the best of our knowledge, this is the first report on ROCK II inhibitors utilizing machine learning approaches, providing a new method for discovering novel ROCK II inhibitors.

  5. A Mechanism-based 3D-QSAR Approach for Classification ...

    Science.gov (United States)

    Organophosphate (OP) and carbamate esters can inhibit acetylcholinesterase (AChE) by binding covalently to a serine residue in the enzyme active site, and their inhibitory potency depends largely on affinity for the enzyme and the reactivity of the ester. Despite this understanding, there has been no mechanism-based in silico approach for classification and prediction of the inhibitory potency of either OPs or carbamates. This prompted us to develop a three-dimensional prediction framework for OPs, carbamates, and their analogs. Inhibitory structures of a compound that can form the covalent bond were identified through analysis of docked conformations of the compound and its metabolites. Inhibitory potencies of the selected structures were then predicted using a previously developed three-dimensional quantitative structure-activity relationship. This approach was validated with a large number of structurally diverse OP and carbamate compounds encompassing widely used insecticides and structural analogs including OP flame retardants and thio- and dithiocarbamate pesticides. The modeling revealed that: (1) in addition to classical OP metabolic activation, the toxicity of carbamate compounds can be dependent on biotransformation, (2) OP and carbamate analogs such as OP flame retardants and thiocarbamate herbicides can act as AChE inhibitors, (3) hydrogen bonding at the oxyanion hole is critical for AChE inhibition through the covalent bond, and (4) π–π interaction with Trp86

  6. An Effective Big Data Supervised Imbalanced Classification Approach for Ortholog Detection in Related Yeast Species

    Directory of Open Access Journals (Sweden)

    Deborah Galpert

    2015-01-01

    Full Text Available Orthology detection requires more effective scaling algorithms. In this paper, a set of gene pair features based on similarity measures (alignment scores, sequence length, gene membership to conserved regions, and physicochemical profiles) are combined in a supervised pairwise ortholog detection approach to improve effectiveness considering low ortholog ratios in relation to the possible pairwise comparison between two genomes. In this scenario, big data supervised classifiers managing imbalance between ortholog and nonortholog pair classes allow for an effective scaling solution built from two genomes and extended to other genome pairs. The supervised approach was compared with RBH, RSD, and OMA algorithms by using the following yeast genome pairs: Saccharomyces cerevisiae-Kluyveromyces lactis, Saccharomyces cerevisiae-Candida glabrata, and Saccharomyces cerevisiae-Schizosaccharomyces pombe as benchmark datasets. Because of the large amount of imbalanced data, the building and testing of the supervised model were only possible by using big data supervised classifiers managing imbalance. Evaluation metrics taking low ortholog ratios into account were applied. From the effectiveness perspective, MapReduce Random Oversampling combined with Spark SVM outperformed RBH, RSD, and OMA, probably because of the consideration of gene pair features beyond alignment similarities combined with the advances in big data supervised classification.
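
    The random-oversampling step that balances ortholog and non-ortholog classes can be sketched on a single machine; this is a stdlib analogue of the MapReduce Random Oversampling stage, with toy values in place of real gene-pair features.

```python
import random

def random_oversample(pairs, labels, seed=0):
    """Duplicate random minority-class examples until classes balance.

    pairs: feature vectors for candidate gene pairs;
    labels: 1 (ortholog) / 0 (non-ortholog).  Single-machine analogue
    of the MapReduce Random Oversampling step.
    """
    rng = random.Random(seed)
    majority = 1 if labels.count(1) > labels.count(0) else 0
    minority = 1 - majority
    minority_pairs = [p for p, l in zip(pairs, labels) if l == minority]
    out_pairs, out_labels = list(pairs), list(labels)
    while out_labels.count(minority) < out_labels.count(majority):
        out_pairs.append(rng.choice(minority_pairs))
        out_labels.append(minority)
    return out_pairs, out_labels

pairs = [[0.9], [0.8], [0.7], [0.6], [0.2]]   # 4 non-orthologs, 1 ortholog
labels = [0, 0, 0, 0, 1]
_, balanced = random_oversample(pairs, labels)
print(balanced.count(0), balanced.count(1))  # → 4 4
```

    Oversampling before training lets a standard SVM see a balanced class distribution, which is the role this step plays ahead of the Spark SVM in the paper.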

  7. An Abstract Description Approach to the Discovery and Classification of Bioinformatics Web Sources

    Energy Technology Data Exchange (ETDEWEB)

    Rocco, D; Critchlow, T J

    2003-05-01

    The World Wide Web provides an incredible resource to genomics researchers in the form of dynamic data sources--e.g. BLAST sequence homology search interfaces. The growth rate of these sources outpaces the speed at which they can be manually classified, meaning that the available data is not being utilized to its full potential. Existing research has not addressed the problems of automatically locating, classifying, and integrating classes of bioinformatics data sources. This paper presents an overview of a system for finding classes of bioinformatics data sources and integrating them behind a unified interface. We examine an approach to classifying these sources automatically that relies on an abstract description format: the service class description. This format allows a domain expert to describe the important features of an entire class of services without tying that description to any particular Web source. We present the features of this description format in the context of BLAST sources to show how the service class description relates to Web sources that are being described. We then show how a service class description can be used to classify an arbitrary Web source to determine if that source is an instance of the described service. To validate the effectiveness of this approach, we have constructed a prototype that can correctly classify approximately two-thirds of the BLAST sources we tested. We then examine these results, consider the factors that affect correct automatic classification, and discuss future work.

  8. Using Different Approaches to Approximate a Pareto Front for a Multiobjective Evolutionary Algorithm: Optimal Thinning Regimes for Eucalyptus fastigata

    Directory of Open Access Journals (Sweden)

    Oliver Chikumbo

    2012-01-01

    Full Text Available A stand-level, multiobjective evolutionary algorithm (MOEA) for determining a set of efficient thinning regimes satisfying two objectives, that is, value production for sawlog harvesting and volume production for a pulpwood market, was successfully demonstrated for a Eucalyptus fastigata trial in Kaingaroa Forest, New Zealand. The MOEA approximated the set of efficient thinning regimes (with a discontinuous Pareto front) by employing a ranking scheme developed by Fonseca and Fleming (1993), which was a Pareto-based ranking (a.k.a. Multiobjective Genetic Algorithm, MOGA). In this paper we solve the same problem using an improved version of a fitness sharing Pareto ranking algorithm (a.k.a. Nondominated Sorting Genetic Algorithm, NSGA II), originally developed by Srinivas and Deb (1994), and examine the results. Our findings indicate that NSGA II approximates the entire Pareto front whereas MOGA only determines a subdomain of the Pareto points.
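
    The core of both MOGA and NSGA II is Pareto ranking; the nondominated-sorting idea can be sketched for two minimized objectives (the toy regimes below are illustrative, not data from the trial).

```python
def dominates(a, b):
    """a dominates b if it is no worse in every (minimized) objective
    and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def nondominated_fronts(points):
    """Sort points into successive Pareto fronts (NSGA II style):
    front 0 is nondominated, front 1 is nondominated once front 0
    is removed, and so on."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Toy thinning regimes scored as (-value, -volume) so both are minimized:
regimes = [(-10, -2), (-8, -5), (-3, -9), (-9, -1), (-2, -2)]
print(nondominated_fronts(regimes))  # → [[0, 1, 2], [3, 4]]
```

    NSGA II adds fitness sharing (crowding) within each front, which is what lets it spread solutions across the entire Pareto front rather than a subdomain.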

  9. Automatic approach to solve the morphological galaxy classification problem using the sparse representation technique and dictionary learning

    Science.gov (United States)

    Diaz-Hernandez, R.; Ortiz-Esquivel, A.; Peregrina-Barreto, H.; Altamirano-Robles, L.; Gonzalez-Bernal, J.

    2016-06-01

    The observation of celestial objects in the sky is a practice that helps astronomers to understand the way in which the Universe is structured. However, due to the large number of objects observed with modern telescopes, analyzing them by hand is a difficult task. An important part of galaxy research is the classification of morphological structure based on the Hubble sequence. In this research, we present an approach to solving the morphological galaxy classification problem automatically by using the sparse representation technique and dictionary learning with K-SVD. For the tests in this work, we use a database of galaxies extracted from the Principal Galaxy Catalog (PGC) and the APM Equatorial Catalogue of Galaxies, obtaining a total of 2403 useful galaxies. In order to represent each galaxy frame, we propose to calculate a set of 20 features, such as Hu's invariant moments, galaxy nucleus eccentricity, Gabor galaxy ratio and some other features commonly used in galaxy classification. A stage of feature relevance analysis was performed using Relief-f in order to determine the best parameters for the classification tests using 2, 3, 4, 5, 6 and 7 galaxy classes, building signal vectors of different lengths from the most important features. For the classification task, we use a 20-run random cross-validation technique to evaluate classification accuracy with all signal sets, achieving a score of 82.27 % for 2 galaxy classes and up to 44.27 % for 7 galaxy classes.

  10. The Influence of Polarimetric Parameters and an Object-Based Approach on Land Cover Classification in Coastal Wetlands

    Directory of Open Access Journals (Sweden)

    Yuanyuan Chen

    2014-12-01

    Full Text Available The purpose of this study was to examine how different polarimetric parameters and an object-based approach influence the classification results of various land use/land cover types using fully polarimetric ALOS PALSAR data over coastal wetlands in Yancheng, China. To verify the efficiency of the proposed method, five other classifications (the Wishart supervised classification, the proposed method without polarimetric parameters, the proposed method without an object-based analysis, the proposed method without textural and geometric information, and the proposed method using the nearest-neighbor classifier) were applied for comparison. The results indicated that some polarimetric parameters, such as Shannon entropy, Krogager_Kd, Alpha, HAAlpha_T11, VanZyl3_Vol, Derd, Barnes2_T33, polarization fraction, Barnes1_T33, Neuman_delta_mod and entropy, greatly improved the classification results. The shape index was a useful feature in distinguishing fish ponds and rivers. The distance to the sea can be regarded as an important factor in reducing the confusion between herbaceous wetland vegetation and grasslands. Furthermore, the decision tree algorithm increased the overall accuracy by 6.8% compared with the nearest-neighbor classifier. This research demonstrated that different polarimetric parameters and the object-based approach significantly improved the performance of land cover classification in coastal wetlands using ALOS PALSAR data.

  11. Buildings classification from airborne LiDAR point clouds through OBIA and ontology driven approach

    Science.gov (United States)

    Tomljenovic, Ivan; Belgiu, Mariana; Lampoltshammer, Thomas J.

    2013-04-01

    In the last years, airborne Light Detection and Ranging (LiDAR) data have proved to be a valuable information resource for a vast number of applications, ranging from land cover mapping to individual surface feature extraction in complex urban environments. To extract information from LiDAR data, users apply prior knowledge. Unfortunately, there is no consistent initiative for structuring this knowledge into data models that can be shared and reused across different applications and domains. The absence of such models poses great challenges to data interpretation, data fusion and integration, as well as information transferability. The intention of this work is to describe the design, development and deployment of an ontology-based system to classify buildings from airborne LiDAR data. The novelty of this approach consists in the development of a domain ontology that specifies explicitly the knowledge used to extract features from airborne LiDAR data. The overall goal is to investigate the possibility of classifying features of interest from LiDAR data by means of a domain ontology. The proposed workflow is applied to the building extraction process for the region of "Biberach an der Riss" in South Germany. Strip-adjusted and georeferenced airborne LiDAR data are processed based on geometrical and radiometric signatures stored within the point cloud. Region-growing segmentation algorithms are applied and segmented regions are exported to the GeoJSON format. Subsequently, the data are imported into the ontology-based reasoning process used to automatically classify the exported features of interest. Based on the ontology, it becomes possible to define domain concepts, associated properties and relations. As a consequence, the resulting body of knowledge restricts the possible interpretation variants. Moreover, ontologies are machine-processable, and thus it is possible to run reasoning on top of them.
Available reasoners (FACT++, JESS, Pellet) are used to check

  12. Classification of Noisy Data: An Approach Based on Genetic Algorithms and Voronoi Tessellation

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Knudsen, Torben;

    2016-01-01

    Classification is one of the major constituents of the data-mining toolkit. The well-known methods for classification are built on either the principle of logic or statistical/mathematical reasoning. In this article we propose: (1) a different strategy, based on the partitioning of the information space; and (2) the use of a genetic algorithm to solve the resulting combinatorial problems for classification. In particular, we implement our methodology to solve complex classification problems and compare the performance of our classifier with other well-known methods (SVM, KNN, and ANN). The results of this study suggest that our proposed methodology is specialized to deal with the classification of highly imbalanced classes with significant overlap.
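
    The Voronoi-tessellation side of the method amounts to nearest-prototype classification: labeled prototype points partition the feature space into cells, and a sample takes the label of the cell it falls in. In the paper a genetic algorithm places the prototypes; in this sketch they are fixed toy values.

```python
import math

def voronoi_classify(prototypes, labels, point):
    """Label a point by the Voronoi cell (nearest prototype) it falls in.

    prototypes: cell-generating points (here fixed toy values; in the
    paper's scheme a genetic algorithm would optimize their placement);
    labels: one class label per prototype.
    """
    nearest = min(range(len(prototypes)),
                  key=lambda i: math.dist(prototypes[i], point))
    return labels[nearest]

prototypes = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
labels = ["normal", "faulty", "normal"]
print(voronoi_classify(prototypes, labels, (0.9, 0.8)))  # → faulty
```

    Because several prototypes can share one label, the induced decision regions need not be convex, which is what makes the approach suited to overlapping, imbalanced classes.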

  13. Phylogeny and classification of the trapdoor spider genus Myrmekiaphila: an integrative approach to evaluating taxonomic hypotheses.

    Directory of Open Access Journals (Sweden)

    Ashley L Bailey

    Full Text Available BACKGROUND: Revised by Bond and Platnick in 2007, the trapdoor spider genus Myrmekiaphila comprises 11 species. Species delimitation and placement within one of three species groups was based on modifications of the male copulatory device. Because a phylogeny of the group was not available, these species groups might not represent monophyletic lineages; species definitions likewise were untested hypotheses. The purpose of this study is to reconstruct the phylogeny of Myrmekiaphila species using molecular data to formally test the delimitation of species and species groups. We seek to refine a set of established systematic hypotheses by integrating across molecular and morphological data sets. METHODS AND FINDINGS: Phylogenetic analyses comprising Bayesian searches were conducted for a mtDNA matrix composed of contiguous 12S rRNA, tRNA-val, and 16S rRNA genes and a nuclear DNA matrix comprising the glutamyl and prolyl tRNA synthetase gene, consisting of 1348 and 481 bp, respectively. Separate analyses of the mitochondrial and nuclear genome data and a concatenated data set yield M. torreya and M. millerae paraphyletic with respect to M. coreyi and M. howelli, and polyphyletic fluviatilis and foliata species groups. CONCLUSIONS: Despite the perception that molecular data present a solution to a crisis in taxonomy, studies like this demonstrate the efficacy of an approach that considers data from multiple sources. A DNA barcoding approach during the species discovery process would fail to recognize at least two species (M. coreyi and M. howelli), whereas a combined approach more accurately assesses species diversity and illuminates speciation pattern and process. Concomitantly, these data also demonstrate that morphological characters likewise fail to recover monophyletic species groups and result in an unnatural classification. 
Optimizations of these characters demonstrate a pattern of "Dollo evolution" wherein a complex character

  14. Toward the Improvement of Trail Classification in National Parks Using the Recreation Opportunity Spectrum Approach

    Science.gov (United States)

    Oishi, Yoshitaka

    2013-06-01

    Trail settings in national parks are essential management tools for improving both ecological conservation efforts and the quality of visitor experiences. This study proposes a plan for the appropriate maintenance of trails in Chubusangaku National Park, Japan, based on the recreation opportunity spectrum (ROS) approach. First, we distributed 452 questionnaires to determine park visitors' preferences for setting a trail (response rate = 68 %). Respondents' preferences were then evaluated according to the following seven parameters: access, remoteness, naturalness, facilities and site management, social encounters, visitor impact, and visitor management. Using nonmetric multidimensional scaling and cluster analysis, the visitors were classified into seven groups. Last, we classified the actual trails according to the visitor questionnaire criteria to examine the discrepancy between visitors' preferences and actual trail settings. The actual trail classification indicated that while most developed trails were located in accessible places, primitive trails were located in remote areas. However, interestingly, two visitor groups seemed to prefer a well-conserved natural environment and, simultaneously, easily accessible trails. This finding does not correspond to a premise of the ROS approach, which supposes that primitive trails should be located in remote areas without ready access. Based on this study's results, we propose that creating trails, which afford visitors the opportunity to experience a well-conserved natural environment in accessible areas is a useful means to provide visitors with diverse recreation opportunities. The process of data collection and analysis in this study can be one approach to produce ROS maps for providing visitors with recreational opportunities of greater diversity and higher quality.

  15. Fish Springs National Wildlife Refuge water regime map

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Water regime map for Fish Springs National Wildlife Refuge. This map of water regimes on the refuge was created along with the National Vegetation Classification...

  16. Classification of first-episode psychosis: a multi-modal multi-feature approach integrating structural and diffusion imaging.

    Science.gov (United States)

    Peruzzo, Denis; Castellani, Umberto; Perlini, Cinzia; Bellani, Marcella; Marinelli, Veronica; Rambaldelli, Gianluca; Lasalvia, Antonio; Tosato, Sarah; De Santi, Katia; Murino, Vittorio; Ruggeri, Mirella; Brambilla, Paolo

    2015-06-01

    To date, most classification studies of psychosis have focused on chronic patients and employed a single machine learning approach. To overcome these limitations, we here compare, to the best of our knowledge for the first time, different classification methods for first-episode psychosis (FEP) using multi-modal imaging data computed on several cortical and subcortical structures and white matter fiber bundles. Twenty-three FEP patients and 23 age-, gender-, and race-matched healthy participants were included in the study. An innovative multivariate approach based on multiple kernel learning (MKL) methods was implemented on structural MRI and diffusion tensor imaging. MKL provides the best classification performances in comparison with the more widely used support vector machine, enabling the definition of a reliable automatic decisional system based on the integration of multi-modal imaging information. Our results show a discrimination accuracy greater than 90 % between healthy subjects and patients with FEP. Regions with an accuracy greater than 70 % on different imaging sources and measures were middle and superior frontal gyrus, parahippocampal gyrus, uncinate fascicles, and cingulum. This study shows that multivariate machine learning approaches integrating multi-modal and multisource imaging data can classify FEP patients with high accuracy. Interestingly, specific grey matter structures and white matter bundles reach high classification reliability when using different imaging modalities and indices, potentially outlining a prefronto-limbic network impaired in FEP with particular regard to the right hemisphere.
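    The core idea of multiple kernel learning (MKL) mentioned above is to combine one kernel (Gram) matrix per imaging modality into a single kernel. The sketch below is illustrative only: it fixes the combination weights by hand, whereas MKL learns them jointly with the classifier.

```python
def combined_kernel(kernels, weights):
    """Convex combination of base Gram matrices, K = sum_m w_m * K_m,
    with w_m >= 0 and sum w_m = 1 -- the kernel an MKL classifier
    would operate on. Learning the weights is the hard part; here
    they are simply given."""
    assert abs(sum(weights) - 1.0) < 1e-9 and all(w >= 0 for w in weights)
    n = len(kernels[0])
    return [[sum(w * K[i][j] for w, K in zip(weights, kernels))
             for j in range(n)] for i in range(n)]

# Two toy 2x2 Gram matrices, e.g. one per imaging modality.
K_struct = [[1.0, 0.0], [0.0, 1.0]]
K_dti = [[2.0, 2.0], [2.0, 2.0]]
K = combined_kernel([K_struct, K_dti], [0.5, 0.5])
```

The modality weights then carry interpretable information about how much each imaging source contributes to the decision.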

  17. An Axiomatic Approach to the notion of Similarity of individual Sequences and their Classification

    CERN Document Server

    Ziv, Jacob

    2011-01-01

    An axiomatic approach to the notion of similarity of sequences, which seems natural in many cases (e.g. phylogenetic analysis), is proposed. Although it is not assumed that the sequences are realizations of a probabilistic process (e.g. a variable-order Markov process), it is demonstrated that any classifier that fully complies with the proposed similarity axioms must be based on modeling the training data contained in a (long) individual training sequence via a suffix tree with no more than O(N) leaves (or, alternatively, a table with O(N) entries), where N is the length of the test sequence. Some common classification algorithms may be slightly modified to comply with the proposed axiomatic conditions and the resulting organization of the training data, thus yielding a formal justification for their good empirical performance without relying on any a-priori (and sometimes unjustified) probabilistic assumption. One such case is discussed in detail.
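    As a rough, hypothetical illustration of the suffix-tree/table organization the abstract describes, the sketch below indexes training substrings in a table and scores a test sequence by its average longest match against the training data. The cap `max_len` and the toy sequences are assumptions for illustration, not the paper's construction.

```python
def avg_longest_match(train: str, test: str, max_len: int = 20) -> float:
    """Average, over test positions, of the longest substring starting
    there that also occurs in the training sequence (capped at max_len)."""
    # Index all training substrings up to max_len: a table standing in
    # for the O(N)-leaf suffix tree described in the abstract.
    table = set()
    for i in range(len(train)):
        for l in range(1, min(max_len, len(train) - i) + 1):
            table.add(train[i:i + l])
    total = 0
    for i in range(len(test)):
        l = 0
        while l < max_len and i + l < len(test) and test[i:i + l + 1] in table:
            l += 1
        total += l
    return total / len(test)

# Classify a test sequence by the training sequence it matches best.
train_a = "abcabcabcabc"
train_b = "xyzxyzxyzxyz"
test = "abcabcxabc"
score_a = avg_longest_match(train_a, test)
score_b = avg_longest_match(train_b, test)
label = "A" if score_a > score_b else "B"
```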

  18. Comparison of two approaches for the classification of 16S rRNA gene sequences.

    Science.gov (United States)

    Chatellier, Sonia; Mugnier, Nathalie; Allard, Françoise; Bonnaud, Bertrand; Collin, Valérie; van Belkum, Alex; Veyrieras, Jean-Baptiste; Emler, Stefan

    2014-10-01

    The use of 16S rRNA gene sequences for microbial identification in clinical microbiology is accepted widely, and requires databases and algorithms. We compared a new research database containing curated 16S rRNA gene sequences in combination with the lca (lowest common ancestor) algorithm (RDB-LCA) to a commercially available 16S rDNA Centroid approach. We used 1025 bacterial isolates characterized by biochemistry, matrix-assisted laser desorption/ionization time-of-flight MS and 16S rDNA sequencing. Nearly 80 % of isolates were identified unambiguously at the species level by both classification platforms used. The remaining isolates were mostly identified correctly at the genus level due to the limited resolution of 16S rDNA sequencing. Discrepancies between both 16S rDNA platforms were due to differences in database content and the algorithm used, and could amount to up to 10.5 %. Up to 1.4 % of the analyses were found to be inconclusive. It is important to realize that despite the overall good performance of the pipelines for analysis, some inconclusive results remain that require additional in-depth analysis performed using supplementary methods.
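    A minimal sketch of the lca (lowest common ancestor) idea mentioned above, assuming the database hits for a query come with root-to-leaf taxonomic lineages; the taxa shown are illustrative.

```python
def lowest_common_ancestor(lineages):
    """Deepest taxonomic prefix shared by all hit lineages, each given
    as a root-to-leaf list (e.g. [domain, phylum, ..., species])."""
    lca = []
    for ranks in zip(*lineages):
        if len(set(ranks)) == 1:
            lca.append(ranks[0])
        else:
            break
    return lca

# Two equally good hits that disagree at species level:
hits = [
    ["Bacteria", "Firmicutes", "Bacilli", "Staphylococcus", "S. aureus"],
    ["Bacteria", "Firmicutes", "Bacilli", "Staphylococcus", "S. epidermidis"],
]
# Identification then resolves only to the genus, mirroring the
# genus-level calls reported for ~20% of isolates.
print(lowest_common_ancestor(hits)[-1])
```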

  19. Idiopathic interstitial pneumonias and emphysema: detection and classification using a texture-discriminative approach

    Science.gov (United States)

    Fetita, C.; Chang-Chien, K. C.; Brillet, P. Y.; Prêteux, F.; Chang, R. F.

    2012-03-01

    Our study aims at developing a computer-aided diagnosis (CAD) system for fully automatic detection and classification of pathological lung parenchyma patterns in idiopathic interstitial pneumonias (IIP) and emphysema using multi-detector computed tomography (MDCT). The proposed CAD system is based on three-dimensional (3-D) mathematical morphology, texture and fuzzy logic analysis, and can be divided into four stages: (1) a multi-resolution decomposition scheme based on a 3-D morphological filter was exploited to discriminate the lung region patterns at different analysis scales. (2) An additional spatial lung partitioning based on the lung tissue texture was introduced to reinforce the spatial separation between patterns extracted at the same resolution level in the decomposition pyramid. Then, (3) a hierarchic tree structure was exploited to describe the relationship between patterns at different resolution levels, and for each pattern, six fuzzy membership functions were established for assigning a probability of association with a normal tissue or a pathological target. Finally, (4) a decision step exploiting the fuzzy-logic assignments selects the target class of each lung pattern among the following categories: normal (N), emphysema (EM), fibrosis/honeycombing (FHC), and ground glass (GDG). According to a preliminary evaluation on an extended database, the proposed method can overcome the drawbacks of a previously developed approach and achieve higher sensitivity and specificity.

  20. Comparison of Electrocardiogram Signals in Men and Women during Creativity with Classification Approaches

    Directory of Open Access Journals (Sweden)

    Sahar ZAKERI

    2016-07-01

    Full Text Available Electrocardiogram (ECG) analysis is widely used as a valuable tool in the evaluation of cognitive tasks. By taking and analyzing measurements in vast quantities, researchers are working toward a better understanding of how human physiological systems work. For the first time, this study investigated the function of the cardiovascular system during creative thinking. In addition, the differences between male/female and normal/creativity states were investigated from ECG signals. Overall, the purpose of this paper was to characterize cardiac activity during creativity and to identify creative male and female subjects. To these ends, six nonlinear features of the ECG signal were extracted to detect creativity states. During the three tasks of the Torrance Tests of Creative Thinking (TTCT - Figural B), ECG signals were recorded from 52 participants (26 men and 26 women). The proficiency of two classification approaches was then evaluated: Artificial Neural Network (ANN) and Support Vector Machine (SVM). The results indicated high discrimination accuracy between male/female (96.09%) and normal/creativity (95.84%) states using the ANN classifier. Therefore, the proposed method can be useful to detect creativity states.
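    The abstract does not name the six nonlinear ECG features, so the sketch below substitutes one common pair (Poincaré SD1/SD2 of successive RR intervals) and a nearest-centroid vote standing in for the ANN/SVM classifiers; everything here is illustrative rather than the study's pipeline.

```python
import math

def poincare_sd1_sd2(rr):
    """Poincaré descriptors SD1/SD2 of successive RR intervals (seconds),
    a common pair of nonlinear heart-rate features (assumed here; the
    study's six features are not specified in the abstract)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    sums = [b + a for a, b in zip(rr, rr[1:])]
    def sd(x):
        m = sum(x) / len(x)
        return math.sqrt(sum((v - m) ** 2 for v in x) / len(x))
    return sd(diffs) / math.sqrt(2), sd(sums) / math.sqrt(2)

def nearest_centroid(train, x):
    """Toy stand-in for the ANN/SVM classifiers compared in the study:
    predict the label of the closest class centroid in feature space."""
    centroids = {
        label: tuple(sum(f[i] for f in feats) / len(feats) for i in range(len(x)))
        for label, feats in train.items()
    }
    return min(centroids, key=lambda lab: math.dist(centroids[lab], x))

# Hypothetical (SD1, SD2) feature vectors for the two states.
train = {"normal": [(0.02, 0.10), (0.03, 0.12)],
         "creative": [(0.08, 0.30), (0.09, 0.28)]}
```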

  1. A novel approach for three dimensional dendrite spine segmentation and classification

    Science.gov (United States)

    He, Tiancheng; Xue, Zhong; Wong, Stephen T. C.

    2012-02-01

    Dendritic spines are small, bulbous cellular compartments that carry synapses. Biologists have been studying the biochemical and genetic pathways by examining the morphological changes of the dendritic spines at the intracellular level. Automatic dendritic spine detection from high resolution microscopic images is an important step for such morphological studies. In this paper, a novel approach to automated dendritic spine detection is proposed based on a nonlinear degeneration model. Dendritic spines are recognized as small objects with variable shapes attached to dendritic backbones. We explore the problem of dendritic spine detection from a different angle, i.e., the nonlinear degeneration equation (NDE) is utilized to enhance the morphological differences between the dendrite and spines. Using NDE, we simulated degeneration for dendritic spine detection. Based on the morphological features, the shrinking rate on dendrite pixels is different from that on spines, so that spines can be detected and segmented after degeneration simulation. Then, to separate spines into different types, Gaussian curvatures were employed, and the biomimetic pattern recognition theory was applied for spine classification. In the experiments, we compared quantitatively the spine detection accuracy with previous methods, and the results showed the accuracy and superiority of our methods.

  2. A Machine-learning approach to classification of X-ray sources

    Science.gov (United States)

    Hare, Jeremy; Kargaltsev, Oleg; Rangelov, Blagoy; Pavlov, George; Posselt, Bettina; Volkov, Igor

    2017-08-01

    Chandra and XMM-Newton X-ray observatories have serendipitously detected a large number of Galactic sources. Although their properties are automatically extracted and stored in catalogs, most of these sources remain unexplored. Classifying these sources can enable population studies on much larger scales and may also reveal new types of X-ray sources. For most of these sources the X-ray data alone are not enough to identify their nature, and multiwavelength data must be used. We developed a multiwavelength classification pipeline (MUWCLASS), which relies on supervised machine learning and a rich training dataset. We describe the training dataset, the pipeline and its testing, and will show/discuss how the code performs in different example environments, such as unidentified gamma-ray sources, supernova remnants, dwarf galaxies, stellar clusters, and the inner Galactic plane. We also discuss the application of this approach to the data from upcoming new X-ray observatories (e.g., eROSITA, Athena).

  3. Classification of LV wall motion in cardiac MRI using kernel Dictionary Learning with a parametric approach.

    Science.gov (United States)

    Mantilla, Juan; Paredes, Jose; Bellanger, Jean-J; Donal, Erwan; Leclercq, Christophe; Medina, Ruben; Garreau, Mireille

    2015-01-01

    In this paper, we propose a parametric approach for the assessment of wall motion in Left Ventricle (LV) function in cardiac cine-Magnetic Resonance Imaging (MRI). Time-signal intensity curves (TSICs) are identified in Spatio-temporal image profiles extracted from different anatomical segments in a cardiac MRI sequence. Different parameters are constructed from specific TSICs that present a decreasing then increasing shape reflecting dynamic information of the LV contraction. The parameters extracted from these curves are related to: 1) an average curve based on a clustering process, 2) curve skewness and 3) cross correlation values between each average clustered curve and a patient-specific reference. Several tests are performed in order to construct different vectors to train a sparse classifier based on kernel Dictionary Learning (DL). Results are compared with other classifiers like Support Vector Machine (SVM) and Discriminative Dictionary Learning. The best classification performance is obtained with information of skewness and the average curve with an accuracy about 94% using the mentioned sparse based kernel DL with a radial basis function kernel.

  4. Classification of Noisy Data: An Approach Based on Genetic Algorithms and Voronoi Tessellation

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Knudsen, Torben

    2016-01-01

    on the partitioning of information space; and (2) use of the genetic algorithm to solve combinatorial problems for classification. In particular, we will implement our methodology to solve complex classification problems and compare the performance of our classifier with other well-known methods (SVM, KNN, and ANN...

  6. Building and Solving Odd-One-Out Classification Problems: A Systematic Approach

    Science.gov (United States)

    Ruiz, Philippe E.

    2011-01-01

    Classification problems ("find the odd-one-out") are frequently used as tests of inductive reasoning to evaluate human or animal intelligence. This paper introduces a systematic method for building the set of all possible classification problems, followed by a simple algorithm for solving the problems of the R-ASCM, a psychometric test derived…

  7. A Statistical Approach of Texton Based Texture Classification Using LPboosting Classifier

    Directory of Open Access Journals (Sweden)

    C. Vivek

    2014-05-01

    Full Text Available This study addresses accurate texture classification; image texture analysis has broad potential in real-world applications. The texton co-occurrence matrix is applied to Brodatz database images to derive template texton grid images, which are then decomposed with the discrete shearlet transform. Entropy-derived parameters of redundant and interpolated regions are merged with adjacent regions on the basis of geometric properties; classification is then performed by comparing the similarity between the estimated distributions of all detail sub-bands using a strong LPBoost classifier with various weak-classifier configurations. The resulting texture features retain most of the discriminative information. Our hybrid classification method significantly outperforms existing texture descriptors and achieves state-of-the-art classification accuracy in real-world imaging applications.

  8. Sources of variation in hydrological classifications: Time scale, flow series origin and classification procedure

    Science.gov (United States)

    Peñas, Francisco J.; Barquín, José; Álvarez, César

    2016-07-01

    Classification of flow regimes in water management and hydroecological research has grown significantly in recent years. However, depending on available data and the procedures applied, there may be several credible classifications for a specific catchment. In this study, three inductive classifications derived from different initial flow data and one expert-driven classification were defined. The hydrological interpretation, statistical performance and spatial correspondence of these classifications were compared. Daily Gauged Classification (DC) was derived from daily flow data while Monthly Gauged Classification (MC) and Monthly Modeled Classification (MMC) were derived from monthly flow series, using gauged and modeled flow data, respectively. Expert-Driven Classification (EDC) was based on a Spanish nationwide hydrological classification, which is being used in the current River Basin Management Plans. The results showed that MC accounted for much of the critical hydrological information variability comprised within the DC. However, it also presented limitations regarding the inability to represent important hydroecological attributes, especially those related to droughts and high flow events. In addition, DC and MC presented an equivalent performance more than 60% of the time and obtained a mean ARI value of 0.4, indicating a similar classification structure. DC and MC outperformed MMC 100% and more than 50% of the times when they were compared by means of the classification strength and ANOVA, respectively. MMC also showed low correspondence with these classifications (ARI = 0.20). Thus, the use of modeled flow series should be limited to poorly gauged areas. Finally, the significantly reduced performance and the uneven distribution of classes found in EDC questions its application for different management objectives. This study shows that the selection of the most suitable approach according to the available data has significant implications for the
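    The ARI values quoted above are adjusted Rand indices; the agreement statistic can be sketched in plain Python for two partitions of the same set of catchments:

```python
from math import comb
from collections import Counter

def adjusted_rand_index(a, b):
    """Adjusted Rand index between two labelings of the same items:
    pair-counting agreement, corrected for chance (0 ~ random, 1 = identical)."""
    n = len(a)
    pairs = Counter(zip(a, b))                      # contingency-table cells
    sum_ij = sum(comb(c, 2) for c in pairs.values())
    sum_a = sum(comb(c, 2) for c in Counter(a).values())
    sum_b = sum(comb(c, 2) for c in Counter(b).values())
    expected = sum_a * sum_b / comb(n, 2)
    max_index = (sum_a + sum_b) / 2
    if max_index == expected:
        return 1.0
    return (sum_ij - expected) / (max_index - expected)
```

With this, an ARI of 0.4 between DC and MC (versus 0.20 for MMC) quantifies the stronger structural correspondence of the gauged classifications.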

  9. Feature Learning Based Approach for Weed Classification Using High Resolution Aerial Images from a Digital Camera Mounted on a UAV

    Directory of Open Access Journals (Sweden)

    Calvin Hung

    2014-12-01

    Full Text Available The development of low-cost unmanned aerial vehicles (UAVs) and lightweight imaging sensors has resulted in significant interest in their use for remote sensing applications. While significant attention has been paid to the collection, calibration, registration and mosaicking of data collected from small UAVs, the interpretation of these data into semantically meaningful information can still be a laborious task. A standard data collection and classification work-flow requires significant manual effort for segment size tuning, feature selection and rule-based classifier design. In this paper, we propose an alternative learning-based approach using feature learning to minimise the manual effort required. We apply this system to the classification of invasive weed species. Small UAVs are suited to this application, as they can collect data at high spatial resolutions, which is essential for the classification of small or localised weed outbreaks. In this paper, we apply feature learning to generate a bank of image filters that allows for the extraction of features that discriminate between the weeds of interest and background objects. These features are pooled to summarise the image statistics and form the input to a texton-based linear classifier that classifies an image patch as weed or background. We evaluated our approach to weed classification on three weeds of significance in Australia: water hyacinth, tropical soda apple and serrated tussock. Our results showed that collecting images at 5–10 m resulted in the highest classifier accuracy, indicated by F1 scores of up to 94%.

  10. Feature Selection as a Time and Cost-Saving Approach for Land Suitability Classification (Case Study of Shavur Plain, Iran)

    Directory of Open Access Journals (Sweden)

    Saeid Hamzeh

    2016-10-01

    Full Text Available Land suitability classification is important in planning and managing sustainable land use. Most approaches to land suitability analysis combine a large number of land and soil parameters, and are time-consuming and costly. In this study, a potentially useful technique (combined feature selection and fuzzy-AHP method) to increase the efficiency of land suitability analysis is presented. To this end, three different feature selection algorithms (random search, best search and genetic methods) were used to determine the most effective parameters for land suitability classification for the cultivation of barley in the Shavur Plain, southwest Iran. Next, land suitability classes were calculated for all methods by using the fuzzy-AHP approach. Salinity (electrical conductivity, EC), alkalinity (exchangeable sodium percentage, ESP), wetness and soil texture were selected using the random search method. Gypsum, EC, ESP, and soil texture were selected using both the best search and genetic methods. The results show strong agreement between the standard fuzzy-AHP method and the methods presented in this study. The Kappa coefficients were 0.82, 0.79 and 0.79 for the random search, best search and genetic methods, respectively, compared with the standard fuzzy-AHP method. Our results indicate that EC, ESP, soil texture and wetness are the most effective features for evaluating land suitability for the cultivation of barley in the study area, and use of these parameters, together with their appropriate weights as obtained from fuzzy-AHP, can produce good results for land suitability classification. Thus, the combined feature selection and fuzzy-AHP approach presented here has the potential to save time and money in land suitability classification.
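    The AHP weighting step underlying the fuzzy-AHP approach derives priority weights from a pairwise-comparison matrix. A minimal sketch using the row-geometric-mean approximation to the principal eigenvector; the comparison values are made up for illustration.

```python
import math

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix via the
    row geometric mean, a standard approximation to the principal
    eigenvector used in (fuzzy-)AHP weighting."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Made-up judgement: salinity rated 3x as important as wetness.
w = ahp_weights([[1.0, 3.0],
                 [1.0 / 3.0, 1.0]])
```

The resulting weights would then multiply the (fuzzy) membership scores of each selected parameter before aggregation into suitability classes.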

  11. Regimes internacionais

    OpenAIRE

    Meireles, André Bezerra

    2004-01-01

    Master's dissertation - Universidade Federal de Santa Catarina, Centro de Ciências Jurídicas, Programa de Pós-Graduação em Direito. What is the role of international regimes with respect to the behavior of the agents of contemporary international relations, in particular the State? Within negotiations in an international sphere characterized by strong economic interdependence, given the existence of multiple channels of connection between societies, and a continuous tendency...

  12. A Bayes fusion method based ensemble classification approach for Brown cloud application

    Directory of Open Access Journals (Sweden)

    M. Krishnaveni

    2014-03-01

    Full Text Available Classification is a recurrent task of determining a target function that maps each attribute set to one of the predefined class labels. Ensemble fusion is a classifier fusion technique that combines multiple classifiers to achieve higher classification accuracy than the individual classifiers. The main objective of this paper is to combine base classifiers using the ensemble fusion methods Decision Template, Dempster-Shafer and Bayes, and to compare the accuracy of each fusion method on the brown cloud dataset. The base classifiers KNN, MLP and SVM were considered in the ensemble classification, each with four different function parameters. The experimental study shows that the Bayes fusion method achieves better classification accuracy (95%) than Decision Template (80%) and Dempster-Shafer (85%) on a brown cloud image dataset.
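    A minimal sketch of Bayes-rule fusion of classifier outputs, assuming each base classifier emits class posteriors and that the classifiers are conditionally independent given the class; the numbers are made up, and the paper's exact fusion rule may differ.

```python
def bayes_fusion(posteriors, priors):
    """Naive-Bayes product rule for fusing L classifier outputs:
    P(c | all) is proportional to P(c)^(1-L) * prod_k P(c | classifier_k),
    assuming the classifiers are independent given the class."""
    L = len(posteriors)
    scores = {c: priors[c] ** (1 - L) for c in priors}
    for p in posteriors:
        for c in scores:
            scores[c] *= p[c]
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

# Made-up posteriors from three base classifiers (say KNN, MLP, SVM).
posteriors = [{"cloud": 0.7, "clear": 0.3},
              {"cloud": 0.6, "clear": 0.4},
              {"cloud": 0.4, "clear": 0.6}]
fused = bayes_fusion(posteriors, {"cloud": 0.5, "clear": 0.5})
```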

  13. Post engineered nanomaterials lifespan: nanowastes classification, legislative development/implementation challenges, and proactive approaches

    CSIR Research Space (South Africa)

    Musee, N

    2012-05-01

    Full Text Available NANOLCA Symposium, "Safety issues and regulatory challenges of nanomaterials", San Sebastian, Spain, 3-4 May 2012. Post engineered nanomaterials lifespan: nanowastes classification, legislative development/implementation challenges, and proactive...

  14. An Integrated Approach to Battery Health Monitoring using Bayesian Regression, Classification and State Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — The application of the Bayesian theory of managing uncertainty and complexity to regression and classification in the form of Relevance Vector Machine (RVM), and to...

  15. Automated, high accuracy classification of Parkinsonian disorders: a pattern recognition approach.

    Directory of Open Access Journals (Sweden)

    Andre F Marquand

    Full Text Available Progressive supranuclear palsy (PSP), multiple system atrophy (MSA) and idiopathic Parkinson's disease (IPD) can be clinically indistinguishable, especially in the early stages, despite distinct patterns of molecular pathology. Structural neuroimaging holds promise for providing objective biomarkers for discriminating these diseases at the single subject level but all studies to date have reported incomplete separation of disease groups. In this study, we employed multi-class pattern recognition to assess the value of anatomical patterns derived from a widely available structural neuroimaging sequence for automated classification of these disorders. To achieve this, 17 patients with PSP, 14 with IPD and 19 with MSA were scanned using structural MRI along with 19 healthy controls (HCs). An advanced probabilistic pattern recognition approach was employed to evaluate the diagnostic value of several pre-defined anatomical patterns for discriminating the disorders, including: (i) a subcortical motor network; (ii) each of its component regions and (iii) the whole brain. All disease groups could be discriminated simultaneously with high accuracy using the subcortical motor network. The region providing the most accurate predictions overall was the midbrain/brainstem, which discriminated all disease groups from one another and from HCs. The subcortical network also produced more accurate predictions than the whole brain and all of its constituent regions. PSP was accurately predicted from the midbrain/brainstem, cerebellum and all basal ganglia compartments; MSA from the midbrain/brainstem and cerebellum and IPD from the midbrain/brainstem only. This study demonstrates that automated analysis of structural MRI can accurately predict diagnosis in individual patients with Parkinsonian disorders, and identifies distinct patterns of regional atrophy particularly useful for this process.

  16. BClass: A Bayesian Approach Based on Mixture Models for Clustering and Classification of Heterogeneous Biological Data

    Directory of Open Access Journals (Sweden)

    Arturo Medrano-Soto

    2004-12-01

    Full Text Available Based on mixture models, we present a Bayesian method (called BClass) to classify biological entities (e.g. genes) when variables of quite heterogeneous nature are analyzed. Various statistical distributions are used to model the continuous/categorical data commonly produced by genetic experiments and large-scale genomic projects. We calculate the posterior probability of each entry belonging to each element (group) in the mixture. In this way, an original set of heterogeneous variables is transformed into a set of purely homogeneous characteristics represented by the probabilities of each entry belonging to the groups. The number of groups in the analysis is controlled dynamically by rendering the groups as 'alive' and 'dormant' depending upon the number of entities classified within them. Using standard Metropolis-Hastings and Gibbs sampling algorithms, we constructed a sampler to approximate posterior moments and grouping probabilities. Since this method does not require the definition of similarity measures, it is especially suitable for data mining and knowledge discovery in biological databases. We applied BClass to classify genes in RegulonDB, a database specialized in information about the transcriptional regulation of gene expression in the bacterium Escherichia coli. The classification obtained is consistent with current knowledge and allowed prediction of missing values for a number of genes. BClass is object-oriented and fully programmed in Lisp-Stat. The output grouping probabilities are analyzed and interpreted using graphical (dynamically linked) plots and query-based approaches. We discuss the advantages of using Lisp-Stat as a programming language as well as the problems we faced when the data volume increased exponentially due to the ever-growing number of genomic projects.

  17. Poster abstract: A machine learning approach for vehicle classification using passive infrared and ultrasonic sensors

    KAUST Repository

    Warriach, Ehsan Ullah

    2013-01-01

    This article describes the implementation of four different machine learning techniques for vehicle classification using dual ultrasonic/passive infrared traffic flow sensors. Using k-NN, Naive Bayes, SVM and KNN-SVM algorithms, we show that KNN-SVM significantly outperforms the other algorithms in terms of classification accuracy. We also show that some of these algorithms could run in real time on the prototype system. Copyright © 2013 ACM.
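    The k-NN stage of the KNN-SVM scheme can be sketched as below; the (height, length) feature pair and the idea of handing ambiguous votes to an SVM are assumptions for illustration, not details taken from the abstract.

```python
from collections import Counter
import math

def knn_predict(train, x, k=3):
    """Plain k-NN majority vote -- the first stage of a KNN-SVM scheme,
    which would pass ambiguous (split-vote) cases to an SVM (not
    reproduced here)."""
    neighbours = sorted(train, key=lambda t: math.dist(t[0], x))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical (height m, length m) readings from the ultrasonic sensor.
train = [((1.4, 4.2), "car"), ((1.5, 4.5), "car"),
         ((3.8, 12.0), "truck"), ((3.5, 10.5), "truck"),
         ((2.9, 9.0), "truck"), ((1.3, 3.9), "car")]
print(knn_predict(train, (1.45, 4.3)))  # a car-sized reading
```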

  18. The topical corticosteroid classification called into question: towards a new approach.

    Science.gov (United States)

    Humbert, Philippe; Guichard, Alexandre

    2015-05-01

    The vasoconstrictor assay described in 1962 was a useful assessment of the potency of topical corticosteroids at the beginning of these new therapies; however, knowledge and technology have evolved, and the classification should follow. A topical corticosteroid with a strong vasoconstrictor effect, as determined by the vasoconstrictor assay, does not necessarily have a strong anti-inflammatory effect. Therefore, a specific classification adapted to the therapeutic target is needed to improve efficacy and thus reduce side effects and corticophobia.

  19. Sensitivity of streamflows to hydroclimatic fluctuations: resilience and regime shifts

    Science.gov (United States)

    Botter, Gianluca; Basso, Stefano; Rodriguez-Iturbe, Ignacio; Rinaldo, Andrea

    2016-04-01

    Landscape and climate alterations foreshadow global-scale shifts of river flow regimes. However, a theory that identifies the range of foreseen impacts on streamflows resulting from inhomogeneous forcings and sensitivity gradients across diverse regimes is lacking. In this contribution, we use a dimensionless index embedding simple climate and landscape attributes (the ratio of the mean interarrival of streamflow-producing rainfall events and the mean catchment response time) to discriminate erratic regimes with enhanced intra-seasonal streamflow variability from persistent regimes endowed with regular flow patterns. The proposed classification is successfully applied to 110 seasonal streamflow distributions observed in 44 catchments of the Alps and the United States, allowing the identification of emerging patterns in space and time. In the same framework, the impact of multi-scale fluctuations of the underlying climatic drivers (temperature, precipitation) on the streamflow distributions can be analyzed. Theoretical and empirical data show that erratic regimes, typical of rivers with low mean discharges, are highly resilient in that they hold a reduced sensitivity to variations in the external forcing. Specific temporal trajectories of streamflow distributions and flow regime shifts driven by land-cover change and rainfall patterns can be also evidenced. The approach developed offers an objective basis for the analysis and prediction of the impact of climate/landscape change on water resources.
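    The dimensionless index described above is a simple ratio of climate and landscape time scales; a sketch, with the erratic/persistent threshold of 1 chosen here for illustration rather than taken from the paper:

```python
def flow_regime(mean_interarrival_days, response_time_days, threshold=1.0):
    """Dimensionless index from the abstract: mean interarrival of
    streamflow-producing rainfall events divided by the mean catchment
    response time. Large values -> few events per response time, i.e.
    erratic, highly variable flows; small values -> persistent, regular
    flows. The threshold of 1 is an illustrative choice."""
    index = mean_interarrival_days / response_time_days
    return index, ("erratic" if index > threshold else "persistent")
```

For example, rainfall events every 10 days on a catchment that drains in 2 days would fall on the erratic side; events every 2 days on a slow, 10-day catchment on the persistent side.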

  20. Semi-supervised hyperspectral classification from a small number of training samples using a co-training approach

    Science.gov (United States)

    Romaszewski, Michał; Głomb, Przemysław; Cholewa, Michał

    2016-11-01

    We present a novel semi-supervised algorithm for the classification of hyperspectral data from remote sensors. Our method is inspired by the Tracking-Learning-Detection (TLD) framework, originally applied to tracking objects in a video stream. TLD introduced the co-training approach called P-N learning, which makes use of two independent 'experts' (or learners) that score samples in different feature spaces. In a similar fashion, we formulate the hyperspectral classification task as a co-training problem that can be solved with the P-N learning scheme. Our method uses both spatial and spectral features of the data, extending a small set of initial labelled samples through a process of region growing. We show that this approach is stable and achieves very good accuracy even for small training sets. We analyse the algorithm's performance on several publicly available hyperspectral data sets.
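
    The co-training idea of two experts scoring samples in different feature views can be sketched as follows. This is a toy analogue of P-N learning, not the authors' region-growing scheme: each "expert" is a nearest-neighbour rule on one view (standing in for the spectral and spatial feature spaces), and each round the experts promote the pool sample they are most confident about into the labelled set.

```python
def dist(a, b):
    # Euclidean distance between feature vectors.
    return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

def co_train(labeled, unlabeled, view_a, view_b, rounds=3):
    """Minimal co-training loop: two 1-NN 'experts', each working in a
    different feature view, alternately label the unlabeled sample they
    are most confident about (here: nearest to any labelled sample)."""
    labeled = list(labeled)
    pool = list(unlabeled)
    for _ in range(rounds):
        if not pool:
            break
        for view in (view_a, view_b):
            if not pool:
                break
            x = min(pool, key=lambda s: min(dist(view(s), view(l)) for l, _ in labeled))
            label = min(labeled, key=lambda item: dist(view(x), view(item[0])))[1]
            labeled.append((x, label))
            pool.remove(x)
    return labeled

# Hypothetical two-class example; view_a/view_b stand in for spectral
# and spatial feature extractors.
labeled = [((0.0, 0.0), "water"), ((10.0, 10.0), "soil")]
unlabeled = [(0.5, 0.2), (9.5, 10.2)]
view_a = lambda s: (s[0],)
view_b = lambda s: (s[1],)
```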

  1. An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications

    Directory of Open Access Journals (Sweden)

    Bao-Gang Hu

    2016-02-01

    Full Text Available In this work, we propose a new approach to deriving the bounds between entropy and error from a joint distribution through an optimization means. The specific case study is given on binary classifications. Two basic types of classification errors are investigated, namely, the Bayesian and non-Bayesian errors. The consideration of non-Bayesian errors is due to the fact that most classifiers result in non-Bayesian solutions. For both types of errors, we derive the closed-form relations between each bound and the error components. When Fano’s lower bound in a diagram of “Error Probability vs. Conditional Entropy” is realized based on the approach, its interpretations are enlarged by including non-Bayesian errors and the two situations along with independent properties of the variables. A new upper bound for the Bayesian error is derived with respect to the minimum prior probability, which is generally tighter than Kovalevskij’s upper bound.
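
    For the binary case, Fano's inequality reduces to H(Y|X) <= h(Pe), where h is the binary entropy function and Pe the error probability. A minimal sketch computing the Bayes error and conditional entropy from a small joint distribution and checking the bound (an illustration of the inequality, not the paper's optimization procedure):

```python
import math

def bayes_error(joint):
    # joint[x] = [P(X=x, Y=0), P(X=x, Y=1)]; the Bayes error sums the
    # smaller class mass at each feature value x.
    return sum(min(px) for px in joint)

def conditional_entropy(joint):
    # H(Y|X) = -sum_x sum_y p(x, y) * log2 p(y | x)
    h = 0.0
    for px in joint:
        tot = sum(px)
        for p in px:
            if p > 0:
                h -= p * math.log2(p / tot)
    return h

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Example joint distribution over two feature values and two labels.
joint = [[0.3, 0.1],
         [0.15, 0.45]]
pe = bayes_error(joint)            # Bayes error probability
hyx = conditional_entropy(joint)   # H(Y|X)
# Fano's inequality (binary case): H(Y|X) <= h(Pe)
```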

  2. A wrapper-based approach for feature selection and classification of major depressive disorder-bipolar disorders.

    Science.gov (United States)

    Tekin Erguzel, Turker; Tas, Cumhur; Cebi, Merve

    2015-09-01

    Feature selection (FS) and classification are consecutive artificial intelligence (AI) methods used in data analysis, pattern classification, data mining and medical informatics. Besides promising studies in the application of AI methods to health informatics, working with more informative features is crucial in order to contribute to early diagnosis. Being one of the prevalent psychiatric disorders, depressive episodes of bipolar disorder (BD) are often misdiagnosed as major depressive disorder (MDD), leading to suboptimal therapy and poor outcomes. Therefore, discriminating MDD and BD at earlier stages of illness could help to facilitate efficient and specific treatment. In this study, a nature-inspired and novel FS algorithm based on standard Ant Colony Optimization (ACO), called improved ACO (IACO), was used to reduce the number of features by removing irrelevant and redundant data. The selected features were then fed into a support vector machine (SVM), a powerful mathematical tool for data classification, regression, function estimation and modeling processes, in order to classify MDD and BD subjects. The proposed method used coherence values, a promising quantitative electroencephalography (EEG) biomarker, calculated from the alpha, theta and delta frequency bands. The noteworthy performance of the novel IACO-SVM approach showed that it is possible to discriminate 46 BD and 55 MDD subjects using 22 of 48 features with 80.19% overall classification accuracy. The performance of the IACO algorithm was also compared to that of standard ACO, genetic algorithm (GA) and particle swarm optimization (PSO) algorithms in terms of classification accuracy and number of selected features. In order to provide an almost unbiased estimate of the classification error, the validation process was performed using a nested cross-validation (CV) procedure.

  3. Imputation And Classification Of Missing Data Using Least Square Support Vector Machines – A New Approach In Dementia Diagnosis

    Directory of Open Access Journals (Sweden)

    T R Sivapriya

    2012-07-01

    Full Text Available This paper presents a comparison of different data imputation approaches used in filling missing data and proposes a combined approach to accurately estimate missing attribute values in a patient database. The present study suggests a more robust technique that is likely to supply a value closer to the one that is missing for effective classification and diagnosis. Initially, data is clustered and the z-score method is used to select possible values of an instance with missing attribute values. Then multiple imputation using LSSVM (Least Squares Support Vector Machine) is applied to select the most appropriate values for the missing attributes. Five imputed datasets have been used to demonstrate the performance of the proposed method. Experimental results show that our method outperforms conventional methods of multiple imputation and mean substitution. Moreover, the proposed method, CZLSSVM (Clustered Z-score Least Square Support Vector Machine), has been evaluated in two classification problems for incomplete data. The efficacy of the imputation methods has been evaluated using an LSSVM classifier. Experimental results indicate that classification accuracy increases with CZLSSVM in the case of missing attribute value estimation. It is found that CZLSSVM outperforms other data imputation approaches such as decision trees, rough sets, artificial neural networks, K-NN (K-Nearest Neighbour) and SVM. Further, it is observed that CZLSSVM yields 95 per cent accuracy, higher than the other methods included and tested in the study.
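
    The cluster-then-impute idea can be illustrated with a much simpler stand-in for the LSSVM stage: fill each missing cell with the mean of that column within the instance's cluster. Cluster assignments are assumed precomputed here, and the function name is hypothetical.

```python
def impute_with_cluster_mean(rows, clusters, missing=None):
    """Fill each missing cell with its cluster's column mean.

    rows: list of feature lists, with `missing` marking absent values.
    clusters: one cluster index per row (assumed given; the paper derives
    them from a clustering step before z-score filtering and LSSVM)."""
    n_cols = len(rows[0])
    # Per-cluster column means over observed values only.
    means = {}
    for c in set(clusters):
        means[c] = []
        for j in range(n_cols):
            vals = [r[j] for r, ci in zip(rows, clusters)
                    if ci == c and r[j] is not missing]
            means[c].append(sum(vals) / len(vals) if vals else 0.0)
    return [[means[c][j] if r[j] is missing else r[j] for j in range(n_cols)]
            for r, c in zip(rows, clusters)]
```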

  4. A Numerical Modelling Approach for Time-Dependent Deformation of Hot Forming Tools under the Creep-Fatigue Regime

    Directory of Open Access Journals (Sweden)

    B. Reggiani

    2016-01-01

    Full Text Available The present study was aimed at predicting the time-dependent deformation of tools used in hot forming applications subjected to the creep-fatigue regime. Excessive accumulated plastic deformation is one of the three main causes of premature failure of tools in these critical applications; it accumulates cycle by cycle without evident marks, leading to noncompliant products. With the aim of predicting this accumulated deformation, a novel procedure was developed, presented, and applied to the extrusion process as an example. A time-hardening primary creep law was used, and novel regression equations for the law’s coefficients were developed to account not only for the induced stress-temperature state but also for the dwell-time value, which is determined by the selected set of process parameters and die design. The procedure was validated against experimental data both on a small-scale extrusion die at different stress, temperature, and load states and for different geometries, and on an industrial extrusion die which was discarded due to excessive plastic deformation after 64 cycles. Good numerical-experimental agreement was achieved.
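
    The cycle-by-cycle accumulation driven by a time-hardening law can be sketched as below. The power-law form eps = c1 * sigma^c2 * t^c3 and the constants are illustrative placeholders; the study fits the coefficients via regression on the stress-temperature state and the dwell time.

```python
def creep_strain_per_cycle(stress_mpa, dwell_time_h, c1, c2, c3):
    """Primary creep strain accumulated over one dwell, using an
    illustrative time-hardening form: eps = c1 * sigma^c2 * t^c3.
    c1..c3 are placeholders, not the paper's regressed coefficients."""
    return c1 * stress_mpa ** c2 * dwell_time_h ** c3

def accumulated_strain(n_cycles, stress_mpa, dwell_time_h, c1, c2, c3):
    # Cycle-by-cycle accumulation, as described for extrusion dies.
    return n_cycles * creep_strain_per_cycle(stress_mpa, dwell_time_h, c1, c2, c3)
```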

  5. Σpider diagram: a universal and versatile approach for system comparison and classification: application to solvent properties.

    Science.gov (United States)

    Lesellier, E

    2015-04-10

    Classification methods based on physico-chemical properties are very useful in analytical chemistry, both for extraction and separation processes. Depending on the number of parameters, several classification approaches can be used: by plotting two- or three-dimensional maps (triangles, cubes, spheres); by calculating comparison values for one system with reference to another one, i.e. the ranking factor F or the Neue selectivity difference s(2); or with chemometric methods (principal component analysis-PCA or hierarchical cluster analysis-HCA). All these methods display advantages and drawbacks: some of them are limited by the number of studied parameters (e.g. three for triangle or sphere plots); others require a new calculation when changing the reference point (F; s(2)); while for chemometric methods (PCA, HCA), the relationships between the clusters and the physico-chemical properties are not always easily understandable. From previous studies performed in supercritical fluid chromatography for stationary phase classification on the basis of linear solvation energy relationships (LSER) including five parameters, we developed a classification map called the Σpider diagram. This diagram allows the locations of varied systems to be plotted in a two-dimensional map, using as many parameters as required to obtain a satisfactory classification: three, five, eight, or any number. In the present paper, we apply this diagram, and the calculation mode used to obtain it, to different solvent classifications: the Snyder triangle, solvatochromic solvent selectivity, Hansen parameters, and also LSER Abraham descriptors and COSMO-RS parameters. The new figure based on Snyder data does not change the global view of groups, except through the use of corrected data from the literature, and allows the polarity value to be added onto the map. For the solvatochromic solvent selectivity, it provides a better view of solvents having no acidic character. For Hansen

  6. Physiotherapy movement based classification approaches to low back pain: comparison of subgroups through review and developer/expert survey

    Directory of Open Access Journals (Sweden)

    Karayannis Nicholas V

    2012-02-01

    Full Text Available Abstract Background Several classification schemes, each with its own philosophy and categorizing method, subgroup low back pain (LBP) patients with the intent to guide treatment. Physiotherapy derived schemes usually have a movement impairment focus, but the extent to which other biological, psychological, and social factors of pain are encompassed requires exploration. Furthermore, within the prevailing 'biological' domain, the overlap of subgrouping strategies within the orthopaedic examination remains unexplored. The aim of this study was "to review and clarify through developer/expert survey, the theoretical basis and content of physical movement classification schemes, determine their relative reliability and similarities/differences, and to consider the extent of incorporation of the bio-psycho-social framework within the schemes". Methods A database search for relevant articles related to LBP and subgrouping or classification was conducted. Five dominant movement-based schemes were identified: Mechanical Diagnosis and Treatment (MDT), Treatment Based Classification (TBC), Pathoanatomic Based Classification (PBC), Movement System Impairment Classification (MSI), and O'Sullivan Classification System (OCS) schemes. Data were extracted and a survey sent to the classification scheme developers/experts to clarify operational criteria, reliability, decision-making, and converging/diverging elements between schemes. Survey results were integrated into the review and approval obtained for accuracy. Results Considerable diversity exists between schemes in how movement informs subgrouping and in the consideration of broader neurosensory, cognitive, emotional, and behavioural dimensions of LBP. Despite differences in assessment philosophy, a common element lies in their objective to identify a movement pattern related to a pain reduction strategy. 
Two dominant movement paradigms emerge: (i) loading strategies (MDT, TBC, PBC) aimed at eliciting a phenomenon

  7. Physiotherapy movement based classification approaches to low back pain: comparison of subgroups through review and developer/expert survey.

    Science.gov (United States)

    Karayannis, Nicholas V; Jull, Gwendolen A; Hodges, Paul W

    2012-02-20

    Several classification schemes, each with its own philosophy and categorizing method, subgroup low back pain (LBP) patients with the intent to guide treatment. Physiotherapy derived schemes usually have a movement impairment focus, but the extent to which other biological, psychological, and social factors of pain are encompassed requires exploration. Furthermore, within the prevailing 'biological' domain, the overlap of subgrouping strategies within the orthopaedic examination remains unexplored. The aim of this study was "to review and clarify through developer/expert survey, the theoretical basis and content of physical movement classification schemes, determine their relative reliability and similarities/differences, and to consider the extent of incorporation of the bio-psycho-social framework within the schemes". A database search for relevant articles related to LBP and subgrouping or classification was conducted. Five dominant movement-based schemes were identified: Mechanical Diagnosis and Treatment (MDT), Treatment Based Classification (TBC), Pathoanatomic Based Classification (PBC), Movement System Impairment Classification (MSI), and O'Sullivan Classification System (OCS) schemes. Data were extracted and a survey sent to the classification scheme developers/experts to clarify operational criteria, reliability, decision-making, and converging/diverging elements between schemes. Survey results were integrated into the review and approval obtained for accuracy. Considerable diversity exists between schemes in how movement informs subgrouping and in the consideration of broader neurosensory, cognitive, emotional, and behavioural dimensions of LBP. Despite differences in assessment philosophy, a common element lies in their objective to identify a movement pattern related to a pain reduction strategy. 
Two dominant movement paradigms emerge: (i) loading strategies (MDT, TBC, PBC) aimed at eliciting a phenomenon of centralisation of symptoms; and (ii) modified

  8. Sentiment analysis. An example of application and evaluation of RID dictionary and Bayesian classification methods in qualitative data analysis approach

    Directory of Open Access Journals (Sweden)

    Krzysztof Tomanek

    2014-05-01

    Full Text Available The purpose of this article is to present the basic methods for classifying text data. These methods make use of achievements earned in areas such as natural language processing and the analysis of unstructured data. I introduce and compare two analytical techniques applied to text data. The first makes use of a thematic vocabulary tool (sentiment analysis). The second uses the idea of Bayesian classification and applies the so-called naive Bayes algorithm. My comparison grades the efficiency of these two analytical techniques. I emphasize solutions that can be used to build a dictionary accurate for the task of text classification. Then I compare the effectiveness of supervised classification to automated unsupervised analysis. These results reinforce the conclusion that a dictionary which has received a good evaluation as a tool for classification should be subjected to review and modification procedures if it is to be applied to new empirical material. Adapting the analytical dictionary becomes, in my proposed approach, the basic step in the methodology of textual data analysis.
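
    The naive Bayes technique referred to above can be sketched in a few lines: a multinomial model with add-one smoothing, trained on tokenized documents. This is a toy illustration of the algorithm, not the author's implementation or dictionary.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (tokens, label). Counts words per class for a
    multinomial naive Bayes model with add-one (Laplace) smoothing."""
    label_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return label_counts, word_counts, vocab, len(docs)

def classify_nb(model, tokens):
    # Pick the label maximizing log P(label) + sum log P(token | label).
    label_counts, word_counts, vocab, n = model
    best, best_lp = None, float("-inf")
    for label, lc in label_counts.items():
        lp = math.log(lc / n)
        total = sum(word_counts[label].values())
        for t in tokens:
            lp += math.log((word_counts[label][t] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Tiny hypothetical sentiment corpus.
model = train_nb([
    (["great", "wonderful", "love"], "positive"),
    (["terrible", "awful", "hate"], "negative"),
])
```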

  9. Effect of fuzzy partitioning in Crohn's disease classification: a neuro-fuzzy-based approach.

    Science.gov (United States)

    Ahmed, Sk Saddam; Dey, Nilanjan; Ashour, Amira S; Sifaki-Pistolla, Dimitra; Bălas-Timar, Dana; Balas, Valentina E; Tavares, João Manuel R S

    2017-01-01

    Crohn's disease (CD) diagnosis is a tremendously serious health problem due to the disease's ultimate effect on the gastrointestinal tract, which leads to the need for complex medical assistance. In this study, the backpropagation neural network fuzzy classifier and a neuro-fuzzy model are combined for diagnosing CD. Factor analysis is used for data dimension reduction. The effect of fuzzy partitioning and dimension reduction on system performance has been investigated. Additionally, the different levels of fuzzy partitioning are compared to reach the optimal accuracy level. The performance of the proposed system is estimated using the classification accuracy and other metrics. The experimental results revealed that classification with level-8 partitioning provides a classification accuracy of 97.67 %, with a sensitivity and specificity of 96.07 and 100 %, respectively.

  10. Classification and prognosis evaluation of individual teeth--a comprehensive approach.

    Science.gov (United States)

    Samet, Nachum; Jotkowitz, Anna

    2009-05-01

    Following a complete evaluation of the patient, treatment planning requires the analysis of individual teeth, accurate diagnosis, and prognosis evaluation. Currently, there is no accepted comprehensive, standardized, and meaningful classification system for the evaluation of individual teeth that offers a common language for dental professionals. A search was conducted reviewing existing literature relating to classification and prognostication of individual teeth. The dimensions determined to be of importance for gaining an overall perspective of the individual relative tooth prognosis were the periodontal, restorative, endodontic, and occlusal plane perspectives. The authors present a comprehensive classification system by integrating the literature and currently accepted concepts in dentistry. This easy-to-use system assesses the condition of individual teeth and enables a relative prognostic value to be attached to those teeth based on tooth condition and patient-level factors.

  11. Cuckoo search optimisation for feature selection in cancer classification: a new approach.

    Science.gov (United States)

    Gunavathi, C; Premalatha, K

    2015-01-01

    Cuckoo Search (CS) optimisation algorithm is used for feature selection in cancer classification using microarray gene expression data. Since gene expression data has thousands of genes and a small number of samples, feature selection methods can be used to select informative genes and improve the classification accuracy. Initially, the genes are ranked based on T-statistics, Signal-to-Noise Ratio (SNR) and F-statistics values. CS is then used to find the informative genes among the top-m ranked genes. The classification accuracy of the k-Nearest Neighbour (kNN) technique is used as the fitness function for CS. The proposed method was tested and analysed on ten different cancer gene expression datasets. The results show that CS gives 100% average accuracy for the DLBCL Harvard, Lung Michigan, Ovarian Cancer, AML-ALL and Lung Harvard2 datasets, and that it outperforms the existing techniques on the DLBCL outcome and prostate datasets.
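
    The initial ranking step can be sketched for the SNR criterion: score each gene by the between-class mean difference divided by the sum of within-class standard deviations, then keep the top-m. A minimal illustration, not the paper's pipeline (which follows this with CS and kNN).

```python
import math

def snr(values_a, values_b):
    """Signal-to-Noise Ratio of one gene across two classes:
    (mean_a - mean_b) / (std_a + std_b)."""
    ma = sum(values_a) / len(values_a)
    mb = sum(values_b) / len(values_b)
    sa = math.sqrt(sum((v - ma) ** 2 for v in values_a) / len(values_a))
    sb = math.sqrt(sum((v - mb) ** 2 for v in values_b) / len(values_b))
    return (ma - mb) / (sa + sb)

def top_m_genes(expr_a, expr_b, m):
    # expr_a/expr_b: per-gene expression lists for each class; rank by |SNR|.
    scores = [(abs(snr(a, b)), g) for g, (a, b) in enumerate(zip(expr_a, expr_b))]
    return [g for _, g in sorted(scores, reverse=True)[:m]]
```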

  12. Multi-criteria classification approach with polynomial aggregation function and incomplete certain information

    Institute of Scientific and Technical Information of China (English)

    Wang Jianqiang

    2006-01-01

    The relationship between the importance of a criterion and the criterion aggregation function is discussed; criterion weights and combinational weights between criteria are defined; and a multi-criteria classification method with incomplete certain information and a polynomial aggregation function is proposed. First, a linear program is constructed from the classification of a reference alternative set (assignment examples) and incomplete certain information on criterion weights. Then the coefficients of the polynomial aggregation function and the thresholds of the categories are obtained by solving the linear program. The consistency index of the alternatives is then computed, and the classification of the alternatives is achieved. Both certain and uncertain criterion values of the categories are handled by the method. Finally, an example shows the feasibility and validity of this method.

  13. Improvement of the classification quality in detection of Hashimoto's disease with a combined classifier approach.

    Science.gov (United States)

    Omiotek, Zbigniew

    2017-08-01

    The purpose of the study was to construct an efficient classifier that, along with a given reduced set of discriminant features, could be used as part of a computer system for the automatic identification and classification of ultrasound images of the thyroid gland, aimed at detecting cases affected by Hashimoto's thyroiditis. A total of 10 supervised learning techniques and a majority vote for the combined classifier were used. Two models were proposed as a result of the classifier's construction. The first is based on the K-nearest neighbours method (for K = 7). It uses three discriminant features and affords a sensitivity of 88.1%, a specificity of 66.7% and a classification error of 21.8%. The second model is a combined classifier, constructed from three component classifiers based on the K-nearest neighbours method (for K = 7), linear discriminant analysis and a boosting algorithm. The combined classifier uses 48 discriminant features and achieves a classification sensitivity of 88.1%, a specificity of 69.4% and a classification error of 20.5%. The combined classifier thus improves the classification quality compared to the single model. The models, built as part of the automatic computer system, may support the physician, especially in first-contact hospitals, in the diagnosis of cases that are difficult to recognise from ultrasound images. The high sensitivity of the constructed classification models indicates high detection accuracy for sick cases, which is beneficial to patients from a medical point of view.
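
    The majority-vote combination is straightforward to sketch. The three threshold "classifiers" below are hypothetical stand-ins for the kNN, LDA and boosting components used in the study.

```python
from collections import Counter

def majority_vote(classifiers, x):
    """Combine component classifiers by a simple majority vote over
    their predicted labels for input x."""
    votes = [clf(x) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# Hypothetical threshold-based component classifiers on two features.
clf_a = lambda x: "sick" if x[0] > 0.5 else "healthy"
clf_b = lambda x: "sick" if x[1] > 0.5 else "healthy"
clf_c = lambda x: "sick" if x[0] + x[1] > 1.0 else "healthy"
```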

  14. Biomedical literature classification using encyclopedic knowledge: a Wikipedia-based bag-of-concepts approach.

    Science.gov (United States)

    Mouriño García, Marcos Antonio; Pérez Rodríguez, Roberto; Anido Rifón, Luis E

    2015-01-01

    Automatic classification of text documents into a set of categories has many applications. Among those applications, the automatic classification of biomedical literature stands out as an important application for automatic document classification strategies. Biomedical staff and researchers have to deal with a lot of literature in their daily activities, so a system that allows access to documents of interest in a simple and effective way would be useful; thus, it is necessary that these documents are sorted based on some criteria-that is to say, they have to be classified. Documents to classify are usually represented following the bag-of-words (BoW) paradigm. Features are words in the text-thus suffering from synonymy and polysemy-and their weights are just based on their frequency of occurrence. This paper presents an empirical study of the efficiency of a classifier that leverages encyclopedic background knowledge-concretely Wikipedia-in order to create bag-of-concepts (BoC) representations of documents, understanding concept as "unit of meaning", and thus tackling synonymy and polysemy. Besides, the weighting of concepts is based on their semantic relevance in the text. For the evaluation of the proposal, empirical experiments have been conducted with one of the corpora commonly used for evaluating classification and retrieval of biomedical information, OHSUMED, and also with a purpose-built corpus of MEDLINE biomedical abstracts, UVigoMED. The results obtained show that the Wikipedia-based bag-of-concepts representation outperforms the classical bag-of-words representation by up to 157% in the single-label classification problem and up to 100% in the multi-label problem for the OHSUMED corpus, and by up to 122% in the single-label classification problem and up to 155% in the multi-label problem for the UVigoMED corpus.

  15. Biomedical literature classification using encyclopedic knowledge: a Wikipedia-based bag-of-concepts approach

    Directory of Open Access Journals (Sweden)

    Marcos Antonio Mouriño García

    2015-09-01

    Full Text Available Automatic classification of text documents into a set of categories has many applications. Among those applications, the automatic classification of biomedical literature stands out as an important application for automatic document classification strategies. Biomedical staff and researchers have to deal with a lot of literature in their daily activities, so a system that allows access to documents of interest in a simple and effective way would be useful; thus, it is necessary that these documents are sorted based on some criteria—that is to say, they have to be classified. Documents to classify are usually represented following the bag-of-words (BoW) paradigm. Features are words in the text—thus suffering from synonymy and polysemy—and their weights are just based on their frequency of occurrence. This paper presents an empirical study of the efficiency of a classifier that leverages encyclopedic background knowledge—concretely Wikipedia—in order to create bag-of-concepts (BoC) representations of documents, understanding concept as “unit of meaning”, and thus tackling synonymy and polysemy. Besides, the weighting of concepts is based on their semantic relevance in the text. For the evaluation of the proposal, empirical experiments have been conducted with one of the corpora commonly used for evaluating classification and retrieval of biomedical information, OHSUMED, and also with a purpose-built corpus of MEDLINE biomedical abstracts, UVigoMED. The results obtained show that the Wikipedia-based bag-of-concepts representation outperforms the classical bag-of-words representation by up to 157% in the single-label classification problem and up to 100% in the multi-label problem for the OHSUMED corpus, and by up to 122% in the single-label classification problem and up to 155% in the multi-label problem for the UVigoMED corpus.

  16. A Characteristics-Based Approach to Radioactive Waste Classification in Advanced Nuclear Fuel Cycles

    Science.gov (United States)

    Djokic, Denia

    The radioactive waste classification system currently used in the United States primarily relies on a source-based framework. This has led to numerous issues, such as wastes that are not categorized by their intrinsic risk, or wastes that do not fall under any category within the framework and therefore lack a legal imperative for responsible management. Furthermore, should advanced fuel cycles be deployed in the United States, the shortcomings of the source-based classification system would be exacerbated: advanced fuel cycles implement processes such as the separation of used nuclear fuel, which introduce new waste streams of varying characteristics. To manage and dispose of these potential new wastes properly, the development of a classification system that assigns an appropriate level of management to each type of waste based on its physical properties is imperative. This dissertation explores how characteristics of wastes generated from potential future nuclear fuel cycles could be coupled with a characteristics-based classification framework. A static mass flow model developed under the Department of Energy's Fuel Cycle Research & Development program, called the Fuel-cycle Integration and Tradeoffs (FIT) model, was used to calculate the composition of waste streams resulting from different nuclear fuel cycle choices: two modified open fuel cycle cases (recycle in a MOX reactor) and two continuous-recycle fast reactor cases (oxide- and metal-fuel fast reactors). This analysis focuses on the impact of waste heat load on waste classification practices, although future work could involve coupling waste heat load with metrics of radiotoxicity and longevity. The value of separating heat-generating fission products and actinides in different fuel cycles, and how it could inform long- and short-term disposal management, is discussed. It is shown that the benefits of reducing the short-term fission

  17. A K-theoretic approach to the classification of symmetric spaces

    CERN Document Server

    Bohle, Dennis

    2011-01-01

    We show that the classification of the symmetric spaces can be achieved by K-theoretical methods. We focus on Hermitian symmetric spaces of non-compact type, and define K-theory for JB*-triples along the lines of C*-theory. The K-groups have to be provided with further invariants in order to classify. Among these are the cycles obtained from so-called grids, which are intimately connected to the root systems of an underlying Lie algebra and thus reminiscent of the classical classification scheme.

  18. Fingerprint pattern classification approach based on the coordinate geometry of singularities

    CSIR Research Space (South Africa)

    Msiza, IS

    2009-10-01

    Full Text Available and the Accept-Reject Rates. A. Classification Accuracy: The fingerprint images are classified by a fingerprint classification expert before running a total (TOT) of 431 data instances through the proposed classifier. The results obtained from... I. It can be observed that not even a single PA fingerprint is mis-classified as TA and, out of 75, only 04 TA fingerprints are mis-classified as PA. B. Accept-Reject Rates: Following the classification accuracy measured in the previous section...

  19. An efficient approach of EEG feature extraction and classification for brain computer interface

    Institute of Scientific and Technical Information of China (English)

    Wu Ting; Yan Guozheng; Yang Banghua

    2009-01-01

    In the study of brain-computer interfaces, a method of feature extraction and classification for two kinds of imagination tasks is proposed. It takes the Euclidean distance between the mean traces recorded from the channels under the two kinds of imagination as a feature, and determines the imagination class using a threshold value. The background of the experiment and its theoretical foundation are analyzed with reference to the BCI 2003 data sets, and the classification precision is compared with the best result of the competition. The result shows that the method has high precision and is well suited to practical systems.
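
    The distance-to-mean-trace rule can be sketched as follows. For simplicity, the abstract's threshold on the distance is reduced here to a nearest-template decision; the trace values are hypothetical.

```python
import math

def mean_trace(trials):
    # Average the recorded traces of one imagination class, sample-wise.
    n = len(trials)
    return [sum(t[i] for t in trials) / n for i in range(len(trials[0]))]

def classify_trial(trial, template_a, template_b):
    """Assign the class whose mean trace is closer in Euclidean distance
    (a simplified, nearest-template reading of the thresholding rule)."""
    da = math.dist(trial, template_a)
    db = math.dist(trial, template_b)
    return "A" if da < db else "B"

# Hypothetical two-sample traces for each imagination class.
ta = mean_trace([[0.0, 0.0], [0.2, 0.0]])
tb = mean_trace([[1.0, 1.0], [0.8, 1.0]])
```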

  20. Transradial Amputee Gesture Classification Using an Optimal Number of sEMG Sensors: An Approach Using ICA Clustering.

    Science.gov (United States)

    Naik, Ganesh R; Al-Timemy, Ali H; Nguyen, Hung T

    2016-08-01

    Surface electromyography (sEMG)-based pattern recognition studies have been widely used to improve the classification accuracy of upper limb gestures. Information extracted from multiple sensors at the sEMG recording sites can be used as inputs to control powered upper limb prostheses. However, using multiple EMG sensors on the prosthetic hand is not practical: electrode shift/movement makes control difficult for amputees, and amputees often feel discomfort wearing an sEMG sensor array. Using fewer sensors would instead greatly improve the controllability of prosthetic devices and add dexterity and flexibility to their operation. In this paper, we propose a novel myoelectric control technique for the identification of various gestures using the minimum number of sensors, based on independent component analysis (ICA) and Icasso clustering. The proposed method is a model-based approach in which a combination of source separation and Icasso clustering is utilized to improve the classification performance of independent finger movements for transradial amputee subjects. Two sEMG sensor combinations were investigated based on muscle morphology and Icasso clustering, and compared to Sequential Forward Selection (SFS) and a greedy search algorithm. The performance of the proposed method was validated with five transradial amputees, yielding a high classification accuracy (> 95%). The outcome of this study encourages the possible extension of the proposed approach to real-time prosthetic applications.

  1. Food intake monitoring: an acoustical approach to automated food intake activity detection and classification of consumed food.

    Science.gov (United States)

    Päßler, Sebastian; Wolff, Matthias; Fischer, Wolf-Joachim

    2012-06-01

    Obesity and nutrition-related diseases are currently growing challenges for medicine. A precise and timesaving method for food intake monitoring is needed. For this purpose, an approach based on the classification of sounds produced during food intake is presented. Sounds are recorded non-invasively by miniature microphones in the outer ear canal. A database of 51 participants eating seven types of food and consuming one drink has been developed for algorithm development and model training. The database is labeled manually using a protocol with instructions for annotation. The annotation procedure is evaluated using Cohen's kappa coefficient. The food intake activity is detected by comparing the signal energy of in-ear sounds to environmental sounds recorded by a reference microphone. Hidden Markov models are used for the recognition of single chew or swallowing events. Intake cycles are modeled as event sequences in finite-state grammars. Classification of consumed food is realized by a finite-state grammar decoder based on the Viterbi algorithm. We achieved a detection accuracy of 83% and a food classification accuracy of 79% on a test set of 10% of all records. Our approach addresses the need to monitor the timing and occurrence of eating. With differentiation of consumed food, a first step toward the goal of meal weight estimation is taken.
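    The in-ear versus reference-microphone energy comparison used for intake detection might look like the following sketch; the frame length, ratio threshold, and synthetic chewing burst are assumptions, not values from the paper:

```python
import numpy as np

def frame_energy(x, win=256):
    """Short-time signal energy per non-overlapping frame."""
    n = len(x) // win
    return (x[:n * win].reshape(n, win) ** 2).sum(axis=1)

def detect_intake(in_ear, reference, ratio=2.0, win=256):
    """Flag frames where in-ear energy clearly exceeds the reference mic."""
    return frame_energy(in_ear, win) > ratio * frame_energy(reference, win)

# Synthetic demo: shared ambient noise, plus a chewing burst in the ear canal only
rng = np.random.default_rng(2)
ambient = 0.1 * rng.standard_normal(256 * 10)
in_ear = ambient.copy()
in_ear[256 * 4:256 * 6] += 0.8 * rng.standard_normal(256 * 2)  # chewing burst

flags = detect_intake(in_ear, ambient)
```

    Frames flagged here would then feed the HMM event recognizer described in the abstract.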

  2. An automated Pearson's correlation change classification (APC3) approach for GC/MS metabonomic data using total ion chromatograms (TICs).

    Science.gov (United States)

    Prakash, Bhaskaran David; Esuvaranathan, Kesavan; Ho, Paul C; Pasikanti, Kishore Kumar; Chan, Eric Chun Yong; Yap, Chun Wei

    2013-05-21

    A fully automated and computationally efficient Pearson's correlation change classification (APC3) approach is proposed and shown to have overall comparable performance (average accuracy and average AUC both 0.89 ± 0.08) while being 3.9 to 7 times faster, easier to use, and less susceptible to outliers than other dimensional reduction and classification combinations, using only the total ion chromatogram (TIC) intensities of GC/MS data. The use of only the TIC permits possible application of APC3 to other metabonomic data, such as LC/MS TICs or NMR spectra. A RapidMiner implementation is available for download at http://padel.nus.edu.sg/software/padelapc3.
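    APC3's internals are not reproduced here, but a generic Pearson-correlation classifier over TIC intensity vectors conveys the flavor of correlating a sample against per-class templates; the class names, template shapes, and noise level are hypothetical:

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation between two intensity vectors."""
    return np.corrcoef(a, b)[0, 1]

def classify_tic(tic, class_means):
    """Assign the class whose mean TIC correlates most with the sample."""
    scores = {label: pearson(tic, mean) for label, mean in class_means.items()}
    return max(scores, key=scores.get)

# Hypothetical mean TICs for two groups, plus one noisy sample
rng = np.random.default_rng(3)
base_healthy = np.abs(np.sin(np.linspace(0, 6, 200)))
base_case = np.abs(np.cos(np.linspace(0, 6, 200)))
means = {"healthy": base_healthy, "case": base_case}

sample = base_case + 0.1 * rng.standard_normal(200)
label = classify_tic(sample, means)
```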

  3. A wavelet-based time frequency analysis approach for classification of motor imagery for brain computer interface applications

    Science.gov (United States)

    Qin, Lei; He, Bin

    2005-12-01

    Electroencephalogram (EEG) recordings during motor imagery tasks are often used as input signals for brain-computer interfaces (BCIs). The translation of these EEG signals to control signals of a device is based on a good classification of various kinds of imagination. We have developed a wavelet-based time-frequency analysis approach for classifying motor imagery tasks. Time-frequency distributions (TFDs) were constructed based on wavelet decomposition and event-related (de)synchronization patterns were extracted from symmetric electrode pairs. The weighted energy difference of the electrode pairs was then compared to classify the imaginary movement. The present method has been tested in nine human subjects and reached an averaged classification rate of 78%. The simplicity of the present technique suggests that it may provide an alternative method for EEG-based BCI applications.
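    A rough sketch of classifying by the weighted energy difference of a symmetric electrode pair, using a hand-rolled Haar decomposition rather than the paper's TFD construction; the 40 Hz rhythm, amplitudes, and unit weights are illustrative assumptions:

```python
import numpy as np

def haar_energies(x, levels=3):
    """Detail-band energies from a simple Haar wavelet decomposition."""
    energies = []
    a = x.astype(float)
    for _ in range(levels):
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)   # approximation
        energies.append((d ** 2).sum())
    return np.array(energies)

def classify_pair(left, right, weights):
    """Decide imagery side from weighted band-energy difference of a C3/C4-like pair."""
    diff = weights @ (haar_energies(left) - haar_energies(right))
    return "left" if diff > 0 else "right"

# Synthetic pair: a strong 40 Hz rhythm on one electrode, attenuated on the other
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 256)
strong = np.sin(2 * np.pi * 40 * t) + 0.1 * rng.standard_normal(256)
weak = 0.2 * np.sin(2 * np.pi * 40 * t) + 0.1 * rng.standard_normal(256)

decision = classify_pair(strong, weak, weights=np.ones(3))
```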

  4. On Internet Traffic Classification: A Two-Phased Machine Learning Approach

    Directory of Open Access Journals (Sweden)

    Taimur Bakhshi

    2016-01-01

    Traffic classification utilizing flow measurement enables operators to perform essential network management. Flow accounting methods such as NetFlow are, however, considered inadequate for classification, requiring additional packet-level information, host behaviour analysis, and specialized hardware, which limits their practical adoption. This paper aims to overcome these challenges by proposing a two-phased machine learning classification mechanism with NetFlow as input. The individual flow classes are derived per application through k-means and are further used to train a C5.0 decision tree classifier. As part of validation, the initial unsupervised phase used flow records of fifteen popular Internet applications that were collected and independently subjected to k-means clustering to determine the unique flow classes generated per application. The derived flow classes were afterwards used to train and test a supervised C5.0-based decision tree. The resulting classifier reported an average accuracy of 92.37% on approximately 3.4 million test cases, increasing to 96.67% with adaptive boosting. The classifier specificity factor, which accounted for differentiating content-specific from supplementary flows, ranged between 98.37% and 99.57%. Furthermore, the computational performance and accuracy of the proposed methodology in comparison with similar machine learning techniques leads us to recommend its extension to other applications in achieving highly granular real-time traffic classification.
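    The two-phase scheme (unsupervised flow-class derivation, then supervised tree training) can be sketched with scikit-learn, substituting CART for C5.0; the three NetFlow-style features and the two synthetic applications are assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
# Hypothetical scaled flow features per record: (duration, bytes, packets)
app_a = rng.normal([2.0, 5.0, 1.0], 0.3, size=(200, 3))
app_b = rng.normal([6.0, 1.0, 4.0], 0.3, size=(200, 3))

# Phase 1 (unsupervised): derive flow classes per application via k-means
classes_a = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(app_a)
classes_b = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(app_b)

# Phase 2 (supervised): train a decision tree on the derived flow classes
# (the paper uses C5.0; scikit-learn's CART serves as a stand-in here)
X = np.vstack([app_a, app_b])
flow_class = np.array([f"a{c}" for c in classes_a] + [f"b{c}" for c in classes_b])
tree = DecisionTreeClassifier(random_state=0).fit(X, flow_class)

# Map predicted flow classes back to their originating application
pred_app = np.array([fc[0] for fc in tree.predict(X)])
true_app = np.array(["a"] * 200 + ["b"] * 200)
accuracy = (pred_app == true_app).mean()
```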

  5. Probabilistic Gait Classification in Children with Cerebral Palsy: A Bayesian Approach

    Science.gov (United States)

    Van Gestel, Leen; De Laet, Tinne; Di Lello, Enrico; Bruyninckx, Herman; Molenaers, Guy; Van Campenhout, Anja; Aertbelien, Erwin; Schwartz, Mike; Wambacq, Hans; De Cock, Paul; Desloovere, Kaat

    2011-01-01

    Three-dimensional gait analysis (3DGA) generates a wealth of highly variable data. Gait classifications help to reduce, simplify and interpret this vast amount of 3DGA data and thereby assist and facilitate clinical decision making in the treatment of CP. CP gait is often a mix of several clinically accepted distinct gait patterns. Therefore,…

  7. Comparing and optimizing land use classification in a Himalayan area using parametric and non parametric approaches

    NARCIS (Netherlands)

    Sterk, G.; Sameer Saran,; Raju, P.L.N.; Amit, Bharti

    2007-01-01

    Supervised classification is one of the important tasks in remote sensing image interpretation, in which the image pixels are classified to various predefined land use/land cover classes based on the spectral reflectance values in different bands. In reality, some classes may have very close spectral reflectance…

  8. Automated Brain Image classification using Neural Network Approach and Abnormality Analysis

    Directory of Open Access Journals (Sweden)

    P.Muthu Krishnammal

    2015-06-01

    Image segmentation of surgical images plays an important role in diagnosing and analyzing the anatomical structure of the human body. Magnetic Resonance Imaging (MRI) helps in obtaining a structural image of internal parts of the body. This paper aims at developing an automatic support system for stage classification using a learning machine, and at detecting brain tumors by fuzzy clustering methods, so as to detect brain tumors in their early stages and to analyze anatomical structures. The three stages involved are: feature extraction using GLCM, tumor classification using a PNN-RBF network, and segmentation using SFCM. Here, fast discrete curvelet transformation is used to analyze the texture of an image, which can be used as a base for a Computer Aided Diagnosis (CAD) system. The Probabilistic Neural Network with radial basis function is employed to implement automated brain tumor classification; it automatically classifies the stage of the brain tumor as benign, malignant or normal. The brain abnormality is then segmented using spatial FCM, and the severity of the tumor is analyzed using the number of tumor cells in the detected abnormal region. The proposed method reports promising results in terms of training performance and classification accuracies.

  9. A hybrid approach to software repository retrieval: Blending faceted classification and type signatures

    Science.gov (United States)

    Eichmann, David A.

    1992-01-01

    We present a user interface for software reuse repository that relies both on the informal semantics of faceted classification and the formal semantics of type signatures for abstract data types. The result is an interface providing both structural and qualitative feedback to a software reuser.

  10. A classification-based approach to monitoring the safety of dynamic systems

    DEFF Research Database (Denmark)

    Zhong, Shengtong; Langseth, Helge; Nielsen, Thomas Dyhre

    2014-01-01

    …as possible. Motivated by this problem setting, we propose a general model for classification in dynamic domains, and exemplify its use by showing how it can be employed for activity detection. We construct our model by using well-known statistical techniques as building blocks, and evaluate each step…

  11. A novel algorithm for ventricular arrhythmia classification using a fuzzy logic approach.

    Science.gov (United States)

    Weixin, Nong

    2016-12-01

    In the present study, it has been shown that an unnecessary implantable cardioverter-defibrillator (ICD) shock is often delivered to patients with an ambiguous ECG rhythm in the overlap zone between ventricular tachycardia (VT) and ventricular fibrillation (VF); these shocks significantly increase mortality. Therefore, accurate classification of the arrhythmia into VT, organized VF (OVF) or disorganized VF (DVF) is crucial to assist ICDs to deliver appropriate therapy. A classification algorithm using a fuzzy logic classifier was developed for accurately classifying the arrhythmias into VT, OVF or DVF. Compared with other studies, our method aims to combine ten ECG detectors that are calculated in the time domain and the frequency domain in addition to different levels of complexity for detecting subtle structure differences between VT, OVF and DVF. The classification in the overlap zone between VT and VF is refined by this study to avoid ambiguous identification. The present method was trained and tested using public ECG signal databases. A two-level classification was performed to first detect VT with an accuracy of 92.6 %, and then the discrimination between OVF and DVF was detected with an accuracy of 84.5 %. The validation results indicate that the proposed method has superior performance in identifying the organization level between the three types of arrhythmias (VT, OVF and DVF) and is promising for improving the appropriate therapy choice and decreasing the possibility of sudden cardiac death.
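    As an illustration of the fuzzy-logic flavor only (not the paper's ten ECG detectors), a toy two-input rule base with triangular memberships might look like this; the inputs, rules, and every threshold are invented:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0, 1)

def classify_rhythm(rate, regularity):
    """Fuzzy rules mapping (rate, regularity) to VT/OVF/DVF; thresholds illustrative."""
    vt = min(tri(rate, 100, 180, 260), tri(regularity, 0.6, 1.0, 1.4))
    ovf = min(tri(rate, 180, 280, 380), tri(regularity, 0.3, 0.6, 0.9))
    dvf = min(tri(rate, 180, 300, 420), tri(regularity, 0.0, 0.2, 0.45))
    # Defuzzify by taking the rule with the strongest activation
    return max({"VT": vt, "OVF": ovf, "DVF": dvf}.items(), key=lambda kv: kv[1])[0]

label = classify_rhythm(rate=170, regularity=0.95)
```

    The real classifier combines ten time-domain, frequency-domain, and complexity detectors in a similar rule-activation scheme.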

  12. Neuropsychological test selection for cognitive impairment classification: A machine learning approach.

    Science.gov (United States)

    Weakley, Alyssa; Williams, Jennifer A; Schmitter-Edgecombe, Maureen; Cook, Diane J

    2015-01-01

    Reducing the amount of testing required to accurately detect cognitive impairment is clinically relevant. The aim of this research was to determine the fewest number of clinical measures required to accurately classify participants as healthy older adult, mild cognitive impairment (MCI), or dementia using a suite of classification techniques. Two variable selection machine learning models (i.e., naive Bayes, decision tree), a logistic regression, and two participant datasets (i.e., clinical diagnosis; Clinical Dementia Rating, CDR) were explored. Participants classified using clinical diagnosis criteria included 52 individuals with dementia, 97 with MCI, and 161 cognitively healthy older adults. Participants classified using CDR included 154 individuals with CDR = 0, 93 individuals with CDR = 0.5, and 25 individuals with CDR = 1.0+. A total of 27 demographic, psychological, and neuropsychological variables were available for variable selection. No significant difference was observed between naive Bayes, decision tree, and logistic regression models for classification of both clinical diagnosis and CDR datasets. Participant classification (70.0-99.1%), geometric mean (60.9-98.1%), sensitivity (44.2-100%), and specificity (52.7-100%) were generally satisfactory. Unsurprisingly, the MCI/CDR = 0.5 participant group was the most challenging to classify. Through variable selection only 2-9 variables were required for classification and varied between datasets in a clinically meaningful way. The current study results reveal that machine learning techniques can accurately classify cognitive impairment and reduce the number of measures required for diagnosis.
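    The variable-selection step can be approximated by greedy forward selection wrapped around a naive Bayes classifier, sketched below on synthetic data; the real study selected from 27 clinical measures, whereas the features here are invented:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

def forward_select(X, y, max_vars=3):
    """Greedily add the variable that most improves cross-validated accuracy."""
    chosen, best_score = [], 0.0
    for _ in range(max_vars):
        scores = {
            j: cross_val_score(GaussianNB(), X[:, chosen + [j]], y, cv=5).mean()
            for j in range(X.shape[1]) if j not in chosen
        }
        j_best = max(scores, key=scores.get)
        if scores[j_best] <= best_score:
            break  # stop once no variable improves the score
        chosen.append(j_best)
        best_score = scores[j_best]
    return chosen, best_score

# Synthetic data: two informative variables (columns 0-1) plus five noise columns
rng = np.random.default_rng(6)
informative = rng.standard_normal((300, 2))
noise = rng.standard_normal((300, 5))
y = (informative.sum(axis=1) > 0).astype(int)
X = np.hstack([informative, noise])

chosen, score = forward_select(X, y)
```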

  13. A Game-Based Approach to Learning the Idea of Chemical Elements and Their Periodic Classification

    Science.gov (United States)

    Franco-Mariscal, Antonio Joaquín; Oliva-Martínez, José María; Blanco-López, Ángel; España-Ramos, Enrique

    2016-01-01

    In this paper, the characteristics and results of a teaching unit based on the use of educational games to learn the idea of chemical elements and their periodic classification in secondary education are analyzed. The method is aimed at Spanish students aged 15-16 and consists of 24 1-h sessions. The results obtained on implementing the teaching…

  14. An Object-Based Classification Approach for Mapping Migrant Housing in the Mega-Urban Area of the Pearl River Delta (China

    Directory of Open Access Journals (Sweden)

    Sebastian D’Oleire-Oltmanns

    2011-08-01

    Urban areas develop on formal and informal levels. Informal development is often highly dynamic, leading to a lag of spatial information about urban structure types. In this work, an object-based remote sensing approach is presented to map the migrant housing urban structure type in the Pearl River Delta, China. SPOT5 data were utilized for the classification (auxiliary data, particularly up-to-date cadastral data, were not available). A hierarchically structured classification process was used to create spectral independence from single satellite scenes and to arrive at a transferable classification process. Using the presented classification approach, an overall classification accuracy of migrant housing of 68.0% is attained.

  15. Making the ecosystem approach operational-Can regime shifts in ecological- and governance systems facilitate the transition?

    DEFF Research Database (Denmark)

    Österblom, H.; Gårdmark, A.; Bergström, L.

    2010-01-01

    Effectively reducing cumulative impacts on marine ecosystems requires co-evolution between science, policy and practice. Here, long-term social–ecological changes in the Baltic Sea are described, illustrating how the process of making the ecosystem approach operational in a large marine ecosystem… stimulating innovations and re-organizing governance structures at drainage basin level to the Baltic Sea catchment as a whole. Experimentation and innovation at local to regional levels is critical for a transition to ecosystem-based management. Establishing science-based learning platforms at sub-basin scales could facilitate this process.

  16. A Model-Based Approach to Infer Shifts in Regional Fire Regimes Over Time Using Sediment Charcoal Records

    Science.gov (United States)

    Itter, M.; Finley, A. O.; Hooten, M.; Higuera, P. E.; Marlon, J. R.; McLachlan, J. S.; Kelly, R.

    2016-12-01

    Sediment charcoal records are used in paleoecological analyses to identify individual local fire events and to estimate fire frequency and regional biomass burned at centennial to millennial time scales. Methods to identify local fire events based on sediment charcoal records have been well developed over the past 30 years; however, an integrated statistical framework for fire identification is still lacking. We build upon existing paleoecological methods to develop a hierarchical Bayesian point process model for local fire identification and estimation of fire return intervals. The model is unique in that it combines sediment charcoal records from multiple lakes across a region in a spatially-explicit fashion, leading to estimation of a joint, regional fire return interval in addition to lake-specific local fire frequencies. Further, the model estimates a joint regional charcoal deposition rate free from the effects of local fires that can be used as a measure of regional biomass burned over time. Finally, the hierarchical Bayesian approach allows for tractable error propagation such that estimates of fire return intervals reflect the full range of uncertainty in sediment charcoal records. Specific sources of uncertainty addressed include sediment age models, the separation of local versus regional charcoal sources, and generation of a composite charcoal record. The model is applied to sediment charcoal records from a dense network of lakes in the Yukon Flats region of Alaska. The multivariate joint modeling approach results in improved estimates of regional charcoal deposition, with reduced uncertainty in the identification of individual fire events and local fire return intervals compared to individual lake approaches. Modeled individual-lake fire return intervals range from 100 to 500 years, with a regional interval of roughly 200 years. Regional charcoal deposition to the network of lakes is correlated up to 50 kilometers. Finally, the joint regional charcoal…

  17. Machine Learning Approaches to Classification of Seafloor Features from High Resolution Sonar Data

    Science.gov (United States)

    Smith, D. G.; Ed, L.; Sofge, D.; Elmore, P. A.; Petry, F.

    2014-12-01

    Navigation charts provide topographic maps of the seafloor created from swaths of sonar data. Converting sonar data to a topographic map is a manual, labor-intensive process that can be greatly assisted by contextual information obtained from automated classification of geomorphological structures. Finding structures such as seamounts can be challenging, as there are no established rules that can be used for decision-making; often it is a determination made by human expertise. A variety of feature metrics may be useful for this task, and we use a large number of metrics relevant to the task of finding seamounts. We demonstrate the ability to locate seamounts with two related machine learning techniques. In addition to achieving good classification accuracy, we discuss the human-understandable set of metrics that are most important to the results.

  18. A multiple classifier approach for spectral-spatial classification of hyperspectral data

    OpenAIRE

    Tarabalka, Yuliya; Benediktsson, Jon Atli; Tilton, James; Chanussot, Jocelyn

    2010-01-01

    A new multiple classifier method for spectral-spatial classification of hyperspectral images is proposed. Several classifiers are used independently to classify an image. For every pixel, if all the classifiers have assigned this pixel to the same class, the pixel is kept as a marker, i.e., a seed of the spatial region, with the corresponding class label. We propose to use spectral-spatial classifiers at the preliminary step of the marker selection procedure, each of t…

  19. Probability Density Components Analysis: A New Approach to Treatment and Classification of SAR Images

    Directory of Open Access Journals (Sweden)

    Osmar Abílio de Carvalho Júnior

    2014-04-01

    Speckle noise (salt and pepper) is inherent to synthetic aperture radar (SAR); it causes the usual noise-like granular aspect and complicates image classification. In SAR image analysis, spatial information can be a particular benefit for denoising and for mapping classes characterized by a statistical distribution of pixel intensities from a complex and heterogeneous spectral response. This paper proposes Probability Density Components Analysis (PDCA), a new alternative that combines filtering and frequency histograms to improve the classification procedure for single-channel SAR images. The method was tested on L-band SAR data from the Advanced Land Observation System (ALOS) Phased-Array Synthetic-Aperture Radar (PALSAR) sensor. The study area is located in the Brazilian Amazon rainforest, northern Rondônia State (municipality of Candeias do Jamari), containing forest and land use patterns. The proposed algorithm moves a window over the image, estimating the probability density curve in different image components; a single input image therefore generates a multi-component output. Initially, the multi-components should be treated by noise-reduction methods such as maximum noise fraction (MNF) or noise-adjusted principal components (NAPCs). Both methods reduce noise and order the multi-component data in terms of image quality. In this paper, the NAPC applied to the multi-components provided large reductions in noise levels, and color composites based on the first NAPCs enhance the classification of different surface features. For the spectral classification, the Spectral Correlation Mapper and Minimum Distance were used. The results obtained were similar to the visual interpretation of optical images from TM-Landsat and Google Maps.
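    The moving-window density estimation at the heart of PDCA can be sketched as a per-pixel local histogram whose bins become output components; this is an illustrative reading of the idea, not the published algorithm, and the window size, bin edges, and synthetic two-region image are arbitrary choices:

```python
import numpy as np

def pdca_components(image, bins, win=3):
    """Per-pixel local intensity histogram: each bin becomes one output component."""
    pad = win // 2
    padded = np.pad(image, pad, mode="edge")
    h, w = image.shape
    out = np.zeros((len(bins) - 1, h, w))
    for i in range(h):
        for j in range(w):
            window = padded[i:i + win, j:j + win]
            hist, _ = np.histogram(window, bins=bins, density=True)
            out[:, i, j] = hist
    return out

# Synthetic single-channel image: two regions with distinct intensity statistics
rng = np.random.default_rng(7)
img = np.where(np.arange(64)[None, :] < 32,
               rng.normal(0.3, 0.05, (64, 64)),
               rng.normal(0.7, 0.05, (64, 64)))

comps = pdca_components(img, bins=np.linspace(0, 1, 5))
```

    The multi-component output would then be denoised (e.g. with NAPC) before classification, as the abstract describes.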

  20. MYOCLONUS IN CHILDREN: DEFINITIONS AND CLASSIFICATIONS, DIFFERENTIAL DIAGNOSIS, APPROACHES TO THERAPY (A LECTURE)

    Directory of Open Access Journals (Sweden)

    M. Yu. Bobylova

    2014-01-01

    Myoclonus is a manifestation of many neurological diseases differing in etiology and pathogenesis. The high prevalence of myoclonus in children, with cardinally different prognoses for diseases not only of the nervous system but also of other organs and systems, warrants renewed investigation of myoclonus as a syndrome: specifying its terminology and classification, improving diagnostic criteria, and optimizing additional diagnostic schemes.

  1. Neuropsychological Test Selection for Cognitive Impairment Classification: A Machine Learning Approach

    Science.gov (United States)

    Williams, Jennifer A.; Schmitter-Edgecombe, Maureen; Cook, Diane J.

    2016-01-01

    Introduction Reducing the amount of testing required to accurately detect cognitive impairment is clinically relevant. The aim of this research was to determine the fewest number of clinical measures required to accurately classify participants as healthy older adult, mild cognitive impairment (MCI) or dementia using a suite of classification techniques. Methods Two variable selection machine learning models (i.e., naive Bayes, decision tree), a logistic regression, and two participant datasets (i.e., clinical diagnosis, clinical dementia rating; CDR) were explored. Participants classified using clinical diagnosis criteria included 52 individuals with dementia, 97 with MCI, and 161 cognitively healthy older adults. Participants classified using CDR included 154 individuals with CDR = 0, 93 individuals with CDR = 0.5, and 25 individuals with CDR = 1.0+. Twenty-seven demographic, psychological, and neuropsychological variables were available for variable selection. Results No significant difference was observed between naive Bayes, decision tree, and logistic regression models for classification of both clinical diagnosis and CDR datasets. Participant classification (70.0 – 99.1%), geometric mean (60.9 – 98.1%), sensitivity (44.2 – 100%), and specificity (52.7 – 100%) were generally satisfactory. Unsurprisingly, the MCI/CDR = 0.5 participant group was the most challenging to classify. Through variable selection only 2 – 9 variables were required for classification and varied between datasets in a clinically meaningful way. Conclusions The current study results reveal that machine learning techniques can accurately classify cognitive impairment and reduce the number of measures required for diagnosis. PMID:26332171

  2. Classification of Household Appliance Operation Cycles: A Case-Study Approach

    Directory of Open Access Journals (Sweden)

    Zeyu Wang

    2015-09-01

    In recent years, a new generation of power grid system, referred to as the Smart Grid, has emerged with the aim of managing electricity demand in a sustainable, reliable, and economical manner. With greater knowledge of the operational characteristics of individual appliances, the necessary automation control strategies can be developed in the Smart Grid to operate appliances efficiently. This paper provides a way of classifying different operational cycles of a household appliance by introducing an unsupervised learning algorithm called k-means clustering. An intrinsic method known as the silhouette coefficient was used to measure classification quality. An identification process is also discussed to help users identify the operation mode that each type of operation cycle represents. A case study using a typical household refrigerator is presented to validate the proposed method. Results show that the proposed classification and identification method can partition and identify different operation cycles adequately. Classification of operation cycles for such appliances is beneficial for the Smart Grid, as it provides a clear and convincing understanding of the operation modes for effective power management.
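    The k-means-plus-silhouette evaluation can be sketched with scikit-learn; the two cycle types and their (duration, power) features are invented stand-ins for real refrigerator cycles:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(8)
# Hypothetical per-cycle features: (duration in minutes, mean power in watts)
defrost = rng.normal([20.0, 150.0], [2.0, 10.0], size=(40, 2))
cooling = rng.normal([45.0, 80.0], [3.0, 8.0], size=(60, 2))
X = np.vstack([defrost, cooling])

# Cluster the cycles, then score the partition with the silhouette coefficient
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
quality = silhouette_score(X, km.labels_)
```

    In practice the number of clusters would be chosen by comparing silhouette scores across candidate values of k.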

  3. A HYBRID APPROACH USING C MEAN AND CART FOR CLASSIFICATION IN DATA MINING

    Directory of Open Access Journals (Sweden)

    Jasbir Malik

    2012-09-01

    Data mining is a field of searching and researching data: mining data means fetching a piece of data out of a huge data block. The basic work in data mining can be categorized in two ways: one is called classification and the other clustering. Although both refer to related tasks, the terms differ; classification of the data is only possible once the clusters have been identified. In the presented research paper, our aim is to find the maximum number of clusters in a specified region by applying area-searching algorithms. Classification is always based on two things: (a) the area chosen for classification, that is, the cluster region; and (b) the kind of dataset applied to the selected region. To increase the accuracy of the searching technique, one needs to focus on two things: (a) whether the dataset has been clusterized in a proper manner or not, and (b) if the clusters are defined, whether they fit into the appropriate classified area or not.

  4. Wall-Corner Classification Using Sonar: A New Approach Based on Geometric Features

    Directory of Open Access Journals (Sweden)

    Ginés Benet

    2010-11-01

    Ultrasonic signals coming from rotary sonar sensors on a robot give us several features about the environment. This enables us to locate and classify the objects in the robot's scenario. Each object and reflector produces a series of peaks in the amplitude of the signal. The radial and angular position of the sonar sensor gives information about location, and the amplitudes offer information about the nature of the surface. Early works showed that the amplitude can be modeled and used to classify objects with very good results at short distances (80% average success in classifying both walls and corners at distances less than 1.5 m). In this paper, a new set of geometric features derived from amplitude analysis of the echo is presented. These features constitute a set of characteristics that can be used to improve classification results at distances from 1.5 m to 4 m. Also, a comparative study of classification algorithms widely used in pattern recognition has been carried out for sensor distances ranging between 0.5 and 4 m, and incidence angles ranging between 20° and 70°. Experimental results show an enhancement in classification success rates when these geometric features are considered.

  5. Wall-corner classification using sonar: a new approach based on geometric features.

    Science.gov (United States)

    Martínez, Milagros; Benet, Ginés

    2010-01-01

    Ultrasonic signals coming from rotary sonar sensors on a robot give us several features about the environment. This enables us to locate and classify the objects in the robot's scenario. Each object and reflector produces a series of peaks in the amplitude of the signal. The radial and angular position of the sonar sensor gives information about location, and the amplitudes offer information about the nature of the surface. Early works showed that the amplitude can be modeled and used to classify objects with very good results at short distances (80% average success in classifying both walls and corners at distances less than 1.5 m). In this paper, a new set of geometric features derived from amplitude analysis of the echo is presented. These features constitute a set of characteristics that can be used to improve classification results at distances from 1.5 m to 4 m. Also, a comparative study of classification algorithms widely used in pattern recognition has been carried out for sensor distances ranging between 0.5 and 4 m, and incidence angles ranging between 20° and 70°. Experimental results show an enhancement in classification success rates when these geometric features are considered.

  6. Classification of Hypertrophy of Labia Minora: Consideration of a Multiple Component Approach.

    Science.gov (United States)

    González, Pablo I

    2015-11-01

    Labia minora hypertrophy, of unknown and under-reported incidence in the general population, is considered a variant of normal anatomy. Its origin is multi-factorial, including genetic, hormonal, and infectious factors, as well as voluntary elongation of the labia minora in some cultures. Consultations with patients bothered by this condition have been increasing, with patients complaining of poor aesthetics and symptoms such as difficulty with vaginal secretions, vulvovaginitis, chronic irritation, and superficial dyspareunia, all of which can have a negative effect on these patients' sexuality and self-esteem. Surgical management of labial hypertrophy is an option for women with these physical complaints or aesthetic issues. Labia minora hypertrophy can involve multiple components, including the clitoral hood, lateral prepuce, frenulum, and the body of the labia minora. To date, there is no consensus in the literature with respect to the classification and definition of varying grades of hypertrophy, aside from measurement of the length in centimeters. In order to offer patients the most appropriate surgical technique, an objective and understandable classification that can be used as part of the preoperative evaluation is necessary. Such a classification should aim to offer patients the best cosmetic and functional results with the fewest complications.

  7. A new approach to dual-band polarimetric radar remote sensing image classification

    Institute of Scientific and Technical Information of China (English)

    XU Junyi; YANG Jian; PENG Yingning

    2005-01-01

    It is very important to efficiently represent the target scattering characteristics in applications of polarimetric radar remote sensing. Three probability mass functions are introduced in this paper for target representation: using similarity parameters to describe target average scattering mechanism, using the eigenvalues of a target coherency matrix to describe target scattering randomness, and using radar received power to describe target scattering intensity. The concept of cross-entropy is employed to measure the difference between two scatterers based on the probability mass functions. Three parts of difference between scatterers are measured separately as the difference of average scattering mechanism, the difference of scattering randomness and the difference of scattering intensity, so that the usage of polarimetric data can be highly efficient and flexible. The supervised/unsupervised image classification schemes and their simplified versions are established based on the minimum cross-entropy principle. They are demonstrated to have better classification performance than the maximum likelihood classifier based on the Wishart distribution assumption, both in supervised and in unsupervised classification.
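    The minimum cross-entropy rule can be sketched directly; the probability mass functions below are invented stand-ins, not the three representations defined in the paper:

```python
import math

# Classify a scatterer by the class reference pmf that minimizes the
# cross-entropy with the observed pmf (smaller = more similar).
def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum p_i log q_i, with q clipped away from zero."""
    return -sum(pi * math.log(max(qi, eps)) for pi, qi in zip(p, q))

def classify(pmf, references):
    return min(references, key=lambda label: cross_entropy(pmf, references[label]))

# Hypothetical 3-bin pmfs for two scattering classes
refs = {"surface": [0.7, 0.2, 0.1], "volume": [0.2, 0.3, 0.5]}
print(classify([0.65, 0.25, 0.10], refs))  # → surface
```

    In the paper this comparison is applied separately to average scattering mechanism, scattering randomness, and scattering intensity.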

  8. An automated classification approach to ranking photospheric proxies of magnetic energy build-up

    CERN Document Server

    Al-Ghraibah, Amani; McAteer, R T James

    2015-01-01

    We study the photospheric magnetic field of ~2000 active regions in solar cycle 23 to search for parameters indicative of energy build-up and subsequent release as a solar flare. We extract three sets of parameters: snapshots in space and time (total flux, magnetic gradients, and neutral lines); evolution in time (flux evolution); and structures at multiple size scales (wavelet analysis). These are combined via pattern recognition and classification techniques with a relevance vector machine to determine whether a region will flare. We consider classification performance using all 38 extracted features and several feature subsets. Classification performance is quantified using both the true positive rate and the true negative rate. Additionally, we compute the true skill score, which provides an equal weighting to the true positive and true negative rates, and the Heidke skill score to allow comparison to other flare forecasting work. We obtain a true skill score of ~0.5 for any predictive time window in the range 2-24 hr, with ...

  9. A novel approach to internal crown characterization for coniferous tree species classification

    Science.gov (United States)

    Harikumar, A.; Bovolo, F.; Bruzzone, L.

    2016-10-01

    Knowledge about individual trees in a forest is highly beneficial in forest management. High-density small-footprint multi-return airborne Light Detection and Ranging (LiDAR) data can provide very accurate information about the structural properties of individual trees in forests. Every tree species has a unique set of crown structural characteristics that can be used for tree species classification. In this paper, we use both the internal and external crown structural information of a conifer tree crown, derived from a high-density small-footprint multi-return LiDAR acquisition, for species classification. Considering the fact that branches are the major building blocks of a conifer tree crown, we obtain the internal crown structural information using a branch-level analysis. The structure of each conifer branch is represented using clusters in the LiDAR point cloud. We propose the joint use of k-means clustering and geometric shape fitting, on the LiDAR data projected onto a novel 3-dimensional space, to identify branch clusters. After mapping the identified clusters back to the original space, six internal geometric features are estimated using a branch-level analysis. The external crown characteristics are modeled by using six least-correlated features based on cone fitting and convex hull. Species classification is performed using a sparse Support Vector Machine (sparse SVM) classifier.
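    The clustering step can be pictured with a minimal k-means; a 1-D toy on point heights stands in for the paper's 3-D projected space and subsequent shape fitting:

```python
import random

# Minimal 1-D k-means: iteratively assign points to the nearest center,
# then recompute each center as its cluster mean.
def kmeans_1d(points, k, iters=20, seed=0):
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda j: abs(p - centers[j]))].append(p)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)

# Invented LiDAR point heights forming two branch-like groups
heights = [1.0, 1.2, 0.9, 5.1, 4.8, 5.3]
print(kmeans_1d(heights, 2))
```

    The real pipeline clusters 3-D points and then fits geometric shapes to each cluster to extract branch features.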

  10. [New Approach to the Mitotype Classification in Black Honeybee Apis mellifera mellifera and Iberian Honeybee Apis mellifera iberiensis].

    Science.gov (United States)

    Ilyasov, R A; Poskryakov, A V; Petukhov, A V; Nikolenko, A G

    2016-03-01

    The black honeybee Apis mellifera mellifera L. is today the only subspecies of honeybee suitable for commercial breeding in the climatic conditions of Northern Europe, with its long cold winters. The main problem for the black honeybee in Russia and European countries is preserving the purity of the indigenous gene pool, which is being lost as a result of hybridization with the subspecies A. m. caucasica, A. m. carnica, A. m. carpatica, and A. m. armeniaca introduced from southern regions. Genetic identification of the subspecies will reduce the extent of hybridization and help conserve the gene pool of the black honeybee. Modern classification of honeybee mitotypes is mainly based on the combined use of the DraI restriction endonuclease recognition site polymorphism and sequence polymorphism of the mtDNA COI-COII region. We performed a comparative analysis of the mtDNA COI-COII region sequence polymorphism in honeybees of the evolutionary lineage M from Ural and West European populations of the black honeybee A. m. mellifera and the Spanish bee A. m. iberiensis. A new approach to the classification of the honeybee M mitotypes is suggested. Using this approach, and on the basis of the seven most informative SNPs of the mtDNA COI-COII region, eight honeybee mitotype groups were identified. In addition, it is suggested that this approach will simplify the previously proposed complicated mitotype classification and will make it possible to assess the level of mitotype diversity and to identify the mitotypes that are most valuable for honeybee breeding and rearing.
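    The grouping step can be pictured as a lookup from SNP alleles to a mitotype group; the table below is entirely hypothetical (the paper's seven SNPs and eight groups are not reproduced here):

```python
# Hypothetical lookup table: tuples of informative SNP alleles in the
# mtDNA COI-COII region mapped to invented mitotype group labels.
MITOTYPE_TABLE = {
    ("A", "G", "T"): "group-1",
    ("A", "A", "T"): "group-2",
    ("G", "G", "C"): "group-3",
}

def classify_mitotype(alleles):
    """Return the mitotype group for a SNP allele tuple, if tabulated."""
    return MITOTYPE_TABLE.get(tuple(alleles), "unassigned")

print(classify_mitotype(["A", "G", "T"]))  # → group-1
print(classify_mitotype(["C", "C", "C"]))  # → unassigned
```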

  11. Interactive Naive Bayesian network: A new approach of constructing gene-gene interaction network for cancer classification.

    Science.gov (United States)

    Tian, Xue W; Lim, Joon S

    2015-01-01

    The naive Bayesian (NB) network classifier is a simple and well-known type of classifier that can be easily induced from a DNA microarray data set. However, the strong conditional independence assumption of the NB network can sometimes lead to weak classification performance. In this paper, we propose a new interactive naive Bayesian (INB) network approach to weaken the conditional independence assumption of the NB network and to classify cancers using DNA microarray data sets. We select the differentially expressed genes (DEGs) to reduce the dimension of the microarray data set. Then, for each DEG, an interactive parent, the DEG with the biggest influence on it, is identified, and a weight is calculated to represent the interactive relationship between a DEG and its parent. Finally, the gene-gene interaction network is constructed. We experimentally test the INB network in terms of classification accuracy using leukemia and colon DNA microarray data sets and compare it with the NB network. The INB network achieves higher classification accuracies than the NB network and can display the gene-gene interactions visually.
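    A hedged sketch of the idea of blending each gene's evidence with that of an interactive parent through a weight; the blending rule and all numbers here are illustrative, not the paper's INB formulation:

```python
import math

# Score a class by a naive-Bayes-style log sum, but blend each gene's
# likelihood with its parent gene's likelihood through a weight w.
def inb_score(prior, likelihoods, parents, weights, sample):
    score = math.log(prior)
    for g, p_gene in likelihoods.items():
        p = p_gene[sample[g]]
        if g in parents:  # blend with the interactive parent's evidence
            p_par = likelihoods[parents[g]][sample[parents[g]]]
            w = weights[g]
            p = w * p + (1 - w) * p_par
        score += math.log(p)
    return score

# Invented per-class likelihoods for two binarized genes g1, g2
classes = {
    "tumor":  {"g1": {0: 0.2, 1: 0.8}, "g2": {0: 0.4, 1: 0.6}},
    "normal": {"g1": {0: 0.7, 1: 0.3}, "g2": {0: 0.6, 1: 0.4}},
}
parents, weights = {"g2": "g1"}, {"g2": 0.7}
sample = {"g1": 1, "g2": 1}
best = max(classes, key=lambda c: inb_score(0.5, classes[c], parents, weights, sample))
print(best)  # → tumor
```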

  12. EMD-DWT based transform domain feature reduction approach for quantitative multi-class classification of breast lesions.

    Science.gov (United States)

    Ara, Sharmin R; Bashar, Syed Khairul; Alam, Farzana; Hasan, Md Kamrul

    2017-09-01

    Using a large set of ultrasound features does not necessarily ensure improved quantitative classification of breast tumors; rather, it often degrades the performance of a classifier. In this paper, we propose an effective feature reduction approach in the transform domain for improved multi-class classification of breast tumors. Feature transformation methods, such as empirical mode decomposition (EMD) and discrete wavelet transform (DWT), followed by a filter- or wrapper-based subset selection scheme, are used to extract a set of non-redundant and more informative transform domain features through decorrelation of an optimally ordered sequence of N ultrasonic bi-modal (i.e., quantitative ultrasound and elastography) features. The proposed transform domain bi-modal reduced feature set is used with different conventional classifiers to classify 201 breast tumors into benign-malignant as well as BI-RADS ⩽3, 4, and 5 categories. For the latter case, an inadmissible error probability is defined for the subset selection using a wrapper/filter. The classifiers use training truth from histopathology/cytology for binary (i.e., benign-malignant) separation of tumors and then bi-modal BI-RADS scores from the radiologists for separating malignant tumors into BI-RADS categories 4 and 5. A comparative performance analysis of several widely used conventional classifiers is also presented to assess their efficacy with the proposed transform domain reduced feature set for classification of breast tumors. The results show that our transform domain bi-modal reduced feature set achieves improvements of 5.35%, 3.45%, and 3.98%, respectively, in sensitivity, specificity, and accuracy compared to the original domain optimal feature set for benign-malignant classification of breast tumors. In quantitative classification of breast tumors into BI-RADS categories ⩽3, 4, and 5, the proposed transform domain reduced feature set attains improvements of 3.49%, 9.07%, and 3.06%, respectively, in
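    The DWT half of the transform step can be illustrated with a single-level Haar transform over an ordered feature sequence (the feature values are arbitrary):

```python
import math

# Single-level Haar DWT: pairwise averages (approximation) and pairwise
# differences (detail), each scaled by 1/sqrt(2), decorrelate an ordered
# feature sequence.
def haar_dwt(features):
    s = math.sqrt(2)
    approx = [(a + b) / s for a, b in zip(features[::2], features[1::2])]
    detail = [(a - b) / s for a, b in zip(features[::2], features[1::2])]
    return approx, detail

a, d = haar_dwt([4.0, 6.0, 10.0, 12.0])
print(a, d)
```

    In the paper's pipeline a subset selection scheme (filter or wrapper) then keeps only the most informative transform-domain coefficients.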

  13. Multi-phase classification by a least-squares support vector machine approach in tomography images of geological samples

    Science.gov (United States)

    Khan, Faisal; Enzmann, Frieder; Kersten, Michael

    2016-03-01

    Image processing of X-ray-computed polychromatic cone-beam micro-tomography (μXCT) data of geological samples mainly involves artefact reduction and phase segmentation. For the former, the main beam-hardening (BH) artefact is removed by applying a best-fit quadratic surface algorithm to a given image data set (reconstructed slice), which minimizes the BH offsets of the attenuation data points from that surface. A Matlab code for this approach is provided in the Appendix. The final BH-corrected image is extracted from the residual data or from the difference between the surface elevation values and the original grey-scale values. For the segmentation, we propose a novel least-squares support vector machine (LS-SVM, an algorithm for pixel-based multi-phase classification) approach. A receiver operating characteristic (ROC) analysis was performed on BH-corrected and uncorrected samples to show that BH correction is in fact an important prerequisite for accurate multi-phase classification. The combination of the two approaches was thus used to classify successfully three different more or less complex multi-phase rock core samples.

  14. Mean-field and linear regime approach to magnetic hyperthermia of core-shell nanoparticles: can tiny nanostructures fight cancer?

    Science.gov (United States)

    Carrião, Marcus S.; Bakuzis, Andris F.

    2016-04-01

    The phenomenon of heat dissipation by magnetic materials interacting with an alternating magnetic field, known as magnetic hyperthermia, is an emergent and promising therapy for many diseases, mainly cancer. Here, a magnetic hyperthermia model for core-shell nanoparticles is developed. The theoretical calculation, different from previous models, highlights the importance of heterogeneity by identifying the roles of surface and core spins in nanoparticle heat generation. We found that the most efficient nanoparticles should be obtained by selecting materials to reduce the surface-to-core damping factor ratio, increasing the interface exchange parameter, and tuning the surface-to-core anisotropy ratio for each material combination. From our results we propose a novel heat-based hyperthermia strategy that focuses on improving the heating efficiency of small nanoparticles instead of larger ones. This approach might have important implications for cancer treatment and could help improve clinical efficacy.
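    For orientation, the linear-response regime named in the title can be sketched with the standard Rosensweig-type estimate of dissipated power density; this is background context, not the paper's core-shell calculation, and all parameter values are illustrative:

```python
import math

# Linear-response estimate of magnetic hyperthermia heating:
#   P = pi * mu0 * chi'' * f * H0^2,   chi'' = chi0 * wt / (1 + wt^2),
# where wt = 2*pi*f*tau is the field frequency times the relaxation time.
MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def power_density(chi0, tau, f, H0):
    """Dissipated power density (W/m^3) in the linear regime."""
    wt = 2 * math.pi * f * tau
    chi_loss = chi0 * wt / (1 + wt * wt)
    return math.pi * MU0 * chi_loss * f * H0 ** 2

# Illustrative numbers only: chi0 = 5, tau = 10 ns, f = 300 kHz, H0 = 10 kA/m
print(power_density(5.0, 1e-8, 3e5, 1e4))  # W/m^3
```

    The paper's point is precisely that this homogeneous-particle picture misses surface/core heterogeneity, which the core-shell model captures.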

  15. Predicting allergic contact dermatitis: a hierarchical structure activity relationship (SAR) approach to chemical classification using topological and quantum chemical descriptors

    Science.gov (United States)

    Basak, Subhash C.; Mills, Denise; Hawkins, Douglas M.

    2008-06-01

    A hierarchical classification study was carried out based on a set of 70 chemicals—35 which produce allergic contact dermatitis (ACD) and 35 which do not. This approach was implemented using a regular ridge regression computer code, followed by conversion of regression output to binary data values. The hierarchical descriptor classes used in the modeling include topostructural (TS), topochemical (TC), and quantum chemical (QC), all of which are based solely on chemical structure. The concordance, sensitivity, and specificity are reported. The model based on the TC descriptors was found to be the best, while the TS model was extremely poor.
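    The ridge-regression-then-binarize scheme can be illustrated in miniature; the sketch below uses a single hypothetical descriptor with a closed-form 1-D ridge fit, not the hierarchical TS/TC/QC descriptor sets:

```python
# 1-D ridge regression of a 0/1 ACD label on a descriptor, followed by
# thresholding the continuous prediction at 0.5 to recover a binary class.
def ridge_1d(x, y, lam=1.0):
    xm, ym = sum(x) / len(x), sum(y) / len(y)
    num = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
    den = sum((xi - xm) ** 2 for xi in x) + lam  # ridge penalty on the slope
    b = num / den
    return b, ym - b * xm  # slope, intercept

def predict_binary(x_new, slope, intercept):
    return 1 if slope * x_new + intercept >= 0.5 else 0

x = [0.1, 0.3, 0.4, 0.8, 0.9, 1.0]  # invented descriptor values
y = [0, 0, 0, 1, 1, 1]              # 1 = produces ACD
b, a = ridge_1d(x, y, lam=0.1)
print([predict_binary(v, b, a) for v in x])  # → [0, 0, 0, 1, 1, 1]
```

    Concordance, sensitivity, and specificity can then be tallied from such binarized predictions against the known labels.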

  16. World Climate Classification and Search: Data Mining Approach Utilizing Dynamic Time Warping Similarity Function

    Science.gov (United States)

    Stepinski, T. F.; Netzel, P.; Jasiewicz, J.

    2014-12-01

    We have developed a novel method for classification and search of climate over the global land surface excluding Antarctica. Our method classifies climate on the basis of the outcome of time series segmentation and clustering. We use the WorldClim 30 arc sec (approx. 1 km) resolution grid data, which are based on 50 years of climatic observations. Each cell in the grid is assigned a 12-month series consisting of 50-year monthly averages of mean, maximum, and minimum temperatures as well as total precipitation. The presented method introduces several innovations in comparison with existing data-driven methods of world climate classification. First, it uses only climatic rather than bioclimatic data. Second, it employs an object-oriented methodology: the grid is first segmented before climatic segments are classified. Third, and most importantly, the similarity between climates in two given cells is computed using the dynamic time warping (DTW) measure instead of the Euclidean distance. DTW is known to be superior to the Euclidean distance for time series, but has not been utilized before in classification of global climate. To manage the computational expense of DTW we use the highly efficient GeoPAT software (http://sil.uc.edu/gitlist/) that, in the first step, segments the grid into local regions of uniform climate. In the second step, the segments are classified. We also introduce climate search, a GeoWeb-based method for interactive presentation of global climate information in the form of query-and-retrieval. A user selects a geographical location and the system returns a global map indicating the level of similarity between local climates and the climate at the selected location. The results of the search for the location "University of Cincinnati, Main Campus" are presented on the attached map. We have compared the results of our method to the Köppen classification scheme.
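    The DTW measure at the heart of the comparison can be sketched in a few lines (toy series in place of WorldClim monthly climatologies):

```python
# Classic dynamic time warping: fill a cumulative-cost table where each
# cell extends the cheapest of the three admissible warp moves.
def dtw(a, b):
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

print(dtw([1, 2, 3, 4], [1, 2, 3, 4]))  # → 0.0
print(dtw([1, 2, 3, 4], [2, 3, 4, 5]))  # → 2.0
```

    Unlike the Euclidean distance, DTW tolerates small phase shifts, e.g. a climate whose seasonal cycle is offset by a month still scores as similar.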

  17. Multi-fluid Approach to High-frequency Waves in Plasmas. II. Small-amplitude Regime in Partially Ionized Media

    Science.gov (United States)

    Martínez-Gómez, David; Soler, Roberto; Terradas, Jaume

    2017-03-01

    The presence of neutral species in a plasma has been shown to greatly affect the properties of magnetohydrodynamic waves. For instance, the interaction between ions and neutrals through momentum transfer collisions causes the damping of Alfvén waves and alters their oscillation frequency and phase speed. When the collision frequencies are larger than the frequency of the waves, single-fluid magnetohydrodynamic approximations can accurately describe the effects of partial ionization, since there is a strong coupling between the various species. However, at higher frequencies, the single-fluid models are not applicable and more complex approaches are required. Here, we use a five-fluid model with three ionized and two neutral components, which takes into consideration Hall’s current and Ohm’s diffusion in addition to the friction due to collisions between different species. We apply our model to plasmas composed of hydrogen and helium, and allow the ionization degree to be arbitrary. By analyzing the corresponding dispersion relation and numerical simulations, we study the properties of small-amplitude perturbations. We discuss the effect of momentum transfer collisions on the ion-cyclotron resonances and compare the importance of magnetic resistivity, and ion–neutral and ion–ion collisions on the wave damping at various frequency ranges. Applications to partially ionized plasmas of the solar atmosphere are performed.

  18. Multi-fluid approach to high-frequency waves in plasmas: I. Small-amplitude regime in fully ionized medium

    CERN Document Server

    Martínez-Gómez, David; Terradas, Jaume

    2016-01-01

    Ideal MHD provides an accurate description of low-frequency Alfvén waves in fully ionized plasmas. However, higher frequency waves in many plasmas of the solar atmosphere cannot be correctly described by ideal MHD and a more accurate model is required. Here, we study the properties of small-amplitude incompressible perturbations in both the low and the high frequency ranges in plasmas composed of several ionized species. We use a multi-fluid approach and take into account the effects of collisions between ions and the inclusion of Hall's term in the induction equation. Through the analysis of the corresponding dispersion relations and numerical simulations we check that at high frequencies ions of different species are not as strongly coupled as in the low frequency limit. Hence, they cannot be treated as a single fluid. In addition, elastic collisions between the distinct ionized species are not negligible for high frequency waves since an appreciable damping is obtained. Furthermore, Coulomb collisions be...

  19. Swarm Intelligence Approach Based on Adaptive ELM Classifier with ICGA Selection for Microarray Gene Expression and Cancer Classification

    Directory of Open Access Journals (Sweden)

    T. Karthikeyan

    2014-05-01

    Full Text Available The aim of this research is efficient gene selection and classification of microarray data using hybrid machine learning algorithms. The advent of microarray technology has enabled researchers to quickly measure the expression of thousands of genes in a biological tissue sample in a single experiment. One important application of this technology is to classify tissue samples using their gene expression profiles and thereby identify numerous types of cancer. Cancer is a group of diseases in which a set of cells shows uncontrolled growth that intrudes upon and destroys nearby tissues and spreads to other locations in the body via lymph or blood, and it has become one of the major diseases of our time. DNA microarrays have turned out to be an effective tool in molecular biology and cancer diagnosis: they can measure the relative quantities of mRNAs for thousands of genes in two or more biological tissue samples at the same time. In medical science, multi-category cancer classification plays a major role in classifying cancer types according to gene expression, and the need for such classification has become indispensable because the number of cancer victims has been increasing steadily in recent years. To perform this classification, a combination of an Integer-Coded Genetic Algorithm (ICGA) and the Artificial Bee Colony algorithm (ABC), coupled with an Adaptive Extreme Learning Machine (AELM), is used for gene selection and cancer classification. ICGA is used with the ABC-based AELM classifier to choose an optimal set of genes, which results in an efficient hybrid algorithm that can handle sparse data and sample imbalance. The
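    As a loose illustration of integer-coded evolutionary gene selection (far simpler than the ICGA+ABC+AELM hybrid, with a made-up fitness standing in for classifier accuracy):

```python
import random

# A chromosome is a fixed-size list of gene indices; elitism keeps the
# best half each generation and mutation perturbs copies of the elites.
random.seed(1)
N_GENES, SUBSET, POP, GENS = 20, 3, 10, 100
INFORMATIVE = {2, 7, 11}  # pretend these genes carry the signal

def fitness(chrom):
    """Stand-in for classifier accuracy: count informative genes found."""
    return len(set(chrom) & INFORMATIVE)

def mutate(chrom):
    c = list(chrom)
    c[random.randrange(SUBSET)] = random.randrange(N_GENES)
    return c

pop = [random.sample(range(N_GENES), SUBSET) for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elites = pop[:POP // 2]
    pop = elites + [mutate(random.choice(elites)) for _ in range(POP - len(elites))]
best = max(pop, key=fitness)
print(sorted(set(best)))
```

    In the actual hybrid, the fitness of a gene subset is the accuracy of the AELM classifier trained on those genes, and ABC refines the search.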

  20. Multiple classifier systems in texton-based approach for the classification of CT images of Lung

    DEFF Research Database (Denmark)

    Gangeh, Mehrdad J.; Sørensen, Lauge; Shaker, Saher B.

    2010-01-01

    In this paper, we propose using texton signatures based on raw pixel representation along with a parallel multiple classifier system for the classification of emphysema in computed tomography images of the lung. The multiple classifier system is composed of support vector machines on the texton.......e., texton size and k value in k-means. Our results show that while aggregation of single decisions by SVMs over various k values using multiple classifier systems helps to improve the results compared to single SVMs, combining over different texton sizes is not beneficial. The performance of the proposed...

  1. Evaluating an ensemble classification approach for crop diversityverification in Danish greening subsidy control

    DEFF Research Database (Denmark)

    Chellasamy, Menaka; Ferre, Ty; Greve, Mogens Humlekrog

    2016-01-01

    Beginning in 2015, Danish farmers are obliged to meet specific crop diversification rules based on total land area and number of crops cultivated to be eligible for new greening subsidies. Hence, there is a need for the Danish government to extend their subsidy control system to verify farmers...... and early summer) WorldView-2 imagery (WV2) and includes the following steps: (1) automatic computation of pixel-based prediction probabilities using multiple neural networks; (2) quantification of the classification uncertainty using Endorsement Theory (ET); (3) discrimination of crop pixels and validation...

  2. Mastectomy or breast conserving surgery? Factors affecting type of surgical treatment for breast cancer – a classification tree approach

    Directory of Open Access Journals (Sweden)

    O'Neill Terry

    2006-04-01

    Full Text Available Abstract Background A critical choice facing breast cancer patients is which surgical treatment – mastectomy or breast conserving surgery (BCS) – is most appropriate. Several studies have investigated factors that impact the type of surgery chosen, identifying features such as place of residence, age at diagnosis, tumor size, socio-economic and racial/ethnic elements as relevant. Such assessment of "propensity" is important in understanding issues such as a reported under-utilisation of BCS among women for whom such treatment was not contraindicated. Using Western Australian (WA) data, we further examine the factors associated with the type of surgical treatment for breast cancer using a classification tree approach. This approach deals naturally with complicated interactions between factors, and so allows flexible and interpretable models for treatment choice to be built that add to the current understanding of this complex decision process. Methods Data was extracted from the WA Cancer Registry on women diagnosed with breast cancer in WA from 1990 to 2000. Subjects' treatment preferences were predicted from covariates using both classification trees and logistic regression. Results Tumor size was the primary determinant of patient choice, subjects with tumors smaller than 20 mm in diameter preferring BCS. For subjects with tumors greater than 20 mm in diameter factors such as patient age, nodal status, and tumor histology become relevant as predictors of patient choice. Conclusion Classification trees perform as well as logistic regression for predicting patient choice, but are much easier to interpret for clinical use. The selected tree can inform clinicians' advice to patients.
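    The tree's top split can be rendered as a toy rule; only the 20 mm threshold comes from the abstract, and the deeper branches below are invented placeholders for the age/nodal-status splits it mentions:

```python
# Hand-written sketch of a classification tree's decision path; the fitted
# WA model's lower branches are NOT reproduced here.
def predict_surgery(tumor_mm, age=None, node_positive=None):
    if tumor_mm < 20:
        return "BCS"  # the abstract's reported top split
    # Beyond 20 mm, age and nodal status become relevant; these two
    # branches are illustrative placeholders only.
    if node_positive:
        return "mastectomy"
    return "mastectomy" if (age is not None and age >= 70) else "BCS"

print(predict_surgery(12))                                # → BCS
print(predict_surgery(35, age=75, node_positive=True))    # → mastectomy
```

    This if/else readability is exactly why the authors find trees easier than logistic-regression coefficients to communicate clinically.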

  3. Dimensional approaches to psychiatric classification: refining the research agenda for DSM-V: an introduction.

    Science.gov (United States)

    Regier, Darrel A

    2007-01-01

    The American Psychiatric Association (APA) will publish the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-V), in 2012. This paper reviews the extended, multi-faceted research planning preparations that APA has undertaken, several in collaboration with the World Health Organization and the U.S. National Institutes of Health, to assess the current state of diagnosis-relevant research and to generate short- and long-term recommendations for research needed to enrich DSM-V and future psychiatric classifications. This research review and planning process has underscored widespread interest among nosologists in the US and globally regarding the potential benefits for research and clinical practice of incorporating a dimensional component into the existing categorical, or binary, classification system in the DSM. Toward this end, the APA and its partners convened an international conference in July 2006 to critically appraise the use of dimensional constructs in psychiatric diagnostic systems. Resultant papers appear in this issue of International Journal of Methods in Psychiatric Research and in a forthcoming monograph to be published by APA.

  4. Approach for Text Classification Based on the Similarity Measurement between Normal Cloud Models

    Directory of Open Access Journals (Sweden)

    Jin Dai

    2014-01-01

    Full Text Available The similarity between objects is a core research area of data mining. In order to reduce the interference of the uncertainty of natural language, a similarity measurement between normal cloud models is adopted for text classification research. On this basis, a novel text classifier based on cloud concept jumping up (CCJU-TC) is proposed. It can efficiently accomplish conversion between qualitative concepts and quantitative data. Through the conversion from a text set to a text information table based on the VSM model, the qualitative text concepts extracted from the same category are jumped up into a whole category concept. According to the cloud similarity between the test text and each category concept, the test text is assigned to the most similar category. A comparison among different text classifiers over different feature selection sets fully proves that not only does CCJU-TC have a strong ability to adapt to different text features, but its classification performance is also better than that of the traditional classifiers.

  5. Evolutionary Feature Selection for Big Data Classification: A MapReduce Approach

    Directory of Open Access Journals (Sweden)

    Daniel Peralta

    2015-01-01

    Full Text Available Nowadays, many disciplines have to deal with big datasets that additionally involve a high number of features. Feature selection methods aim at eliminating noisy, redundant, or irrelevant features that may deteriorate classification performance. However, traditional methods lack the scalability to cope with datasets of millions of instances and to extract successful results in a limited time. This paper presents a feature selection algorithm based on evolutionary computation that uses the MapReduce paradigm to obtain subsets of features from big datasets. The algorithm decomposes the original dataset into blocks of instances to learn from them in the map phase; then, the reduce phase merges the obtained partial results into a final vector of feature weights, which allows a flexible application of the feature selection procedure using a threshold to determine the selected subset of features. The feature selection method is evaluated using three well-known classifiers (SVM, Logistic Regression, and Naive Bayes) implemented within the Spark framework to address big data problems. In the experiments, datasets of up to 67 million instances and up to 2000 attributes have been managed, showing that this is a suitable framework for evolutionary feature selection, improving both the classification accuracy and runtime when dealing with big data problems.
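    The map/reduce decomposition can be mimicked in plain Python (no Spark); the per-block scoring function below is a simple stand-in, not the evolutionary fitness used in the paper:

```python
# Map phase: each block scores features on its own slice of rows.
# Reduce phase: average the partial weight vectors; a threshold then
# selects the final feature subset.
def map_phase(block, n_features):
    # Stand-in scorer: mean feature value among positively labeled rows
    pos = [row for row, label in block if label == 1] or [[0] * n_features]
    return [sum(r[j] for r in pos) / len(pos) for j in range(n_features)]

def reduce_phase(partials):
    n = len(partials)
    return [sum(w[j] for w in partials) / n for j in range(len(partials[0]))]

# Invented rows of (features, label), split into two "map" blocks
data = [([1, 0, 1], 1), ([1, 1, 0], 0), ([1, 0, 1], 1), ([0, 1, 0], 0)]
blocks = [data[:2], data[2:]]
weights = reduce_phase([map_phase(b, 3) for b in blocks])
selected = [j for j, w in enumerate(weights) if w >= 0.5]
print(weights, selected)  # → [1.0, 0.0, 1.0] [0, 2]
```

    In the paper each map block instead runs an evolutionary search over feature subsets on its slice of instances.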

  6. Toward a multi-sensor-based approach to automatic text classification

    Energy Technology Data Exchange (ETDEWEB)

    Dasigi, V.R. [Sacred Heart Univ., Fairfield, CT (United States); Mann, R.C. [Oak Ridge National Lab., TN (United States)

    1995-10-01

    Many automatic text indexing and retrieval methods use a term-document matrix that is automatically derived from the text in question. Latent Semantic Indexing (LSI) is a method, recently proposed in the Information Retrieval (IR) literature, for approximating a large and sparse term-document matrix with a relatively small number of factors, and is based on a solid mathematical foundation. LSI appears to be quite useful in the problem of text information retrieval, rather than text classification. In this report, we outline a method that attempts to combine the strength of the LSI method with that of neural networks in addressing the problem of text classification. In doing so, we also indicate ways to improve performance by adding additional "logical sensors" to the neural network, something that is hard to do with the LSI method when employed by itself. The various programs that can be used in testing the system with the TIPSTER data set are described. Preliminary results are summarized, but much work remains to be done.

  7. Subspace projection approaches to classification and visualization of neural network-level encoding patterns.

    Directory of Open Access Journals (Sweden)

    Remus Oşan

    Full Text Available Recent advances in large-scale ensemble recordings allow monitoring of activity patterns of several hundreds of neurons in freely behaving animals. The emergence of such high-dimensional datasets poses challenges for the identification and analysis of dynamical network patterns. While several types of multivariate statistical methods have been used for integrating responses from multiple neurons, their effectiveness in pattern classification and predictive power has not been compared in a direct and systematic manner. Here we systematically employed a series of projection methods, such as Multiple Discriminant Analysis (MDA), Principal Components Analysis (PCA) and Artificial Neural Networks (ANN), and compared them with non-projection multivariate statistical methods such as Multivariate Gaussian Distributions (MGD). Our analyses of hippocampal data recorded during episodic memory events and cortical data simulated during face perception or arm movements illustrate how low-dimensional encoding subspaces can reveal the existence of network-level ensemble representations. We show how the use of regularization methods can prevent these statistical methods from over-fitting of training data sets when the trial numbers are much smaller than the number of recorded units. Moreover, we investigated the extent to which the computations implemented by the projection methods reflect the underlying hierarchical properties of the neural populations. Based on their ability to extract the essential features for pattern classification, we conclude that the typical performance ranking of these methods on under-sampled neural data of large dimension is MDA>PCA>ANN>MGD.

  8. A targeted change-detection procedure by combining change vector analysis and post-classification approach

    Science.gov (United States)

    Ye, Su; Chen, Dongmei; Yu, Jie

    2016-04-01

    In remote sensing, conventional supervised change-detection methods usually require effective training data for multiple change types. This paper introduces a more flexible and efficient procedure that seeks to identify only the changes that users are interested in, hereafter referred to as "targeted change detection". Based on the one-class classifier Support Vector Domain Description (SVDD), a novel algorithm named "Three-layer SVDD Fusion" (TLSF) is developed specifically for targeted change detection. The proposed algorithm combines one-class classification results generated from the change vector map as well as from the before- and after-change images in order to obtain a more reliable detection result. In addition, this paper introduces a detailed workflow for implementing this algorithm. This workflow has been applied to two case studies with different practical monitoring objectives: urban expansion and forest fire assessment. The experimental results of these two case studies show that the overall accuracy of our proposed algorithm is superior (Kappa statistics are 86.3% and 87.8% for Cases 1 and 2, respectively) to applying SVDD to change vector analysis or to post-classification comparison.
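
The core idea of a one-class data description like SVDD is to learn a boundary around the single "target" class and flag anything outside it. The sketch below is a deliberately crude stand-in (a quantile-radius sphere around the centroid); real SVDD solves a quadratic program for the minimal enclosing sphere, optionally in kernel space, and the toy 2-D "pixels" are hypothetical:

```python
import numpy as np

class SimpleDataDescription:
    """Crude stand-in for SVDD: enclose the training (target) class in a
    sphere around its centroid; samples falling outside the radius are
    flagged as outliers ("change")."""

    def fit(self, X, quantile=0.95):
        self.center = X.mean(axis=0)
        d = np.linalg.norm(X - self.center, axis=1)
        self.radius = np.quantile(d, quantile)
        return self

    def predict(self, X):
        d = np.linalg.norm(X - self.center, axis=1)
        return d <= self.radius          # True = inside the description

rng = np.random.default_rng(1)
no_change = rng.normal(0, 1, (200, 2))          # training: "no change" pixels
test = np.array([[0.1, -0.2], [6.0, 6.0]])      # one typical, one changed pixel
model = SimpleDataDescription().fit(no_change)
inside = model.predict(test)
```

In the TLSF setting, three such descriptions (change-vector layer, before image, after image) would be fused; here only the one-class mechanism itself is illustrated.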

  9. Binary Image Classification: A Genetic Programming Approach to the Problem of Limited Training Instances.

    Science.gov (United States)

    Al-Sahaf, Harith; Zhang, Mengjie; Johnston, Mark

    2016-01-01

    In the computer vision and pattern recognition fields, image classification represents an important yet difficult task. It is a challenge to build effective computer models to replicate the remarkable ability of the human visual system, which relies on only one or a few instances to learn a completely new class or an object of a class. Recently we proposed two genetic programming (GP) methods, one-shot GP and compound-GP, that aim to evolve a program for the task of binary classification in images. The two methods are designed to use only one or a few instances per class to evolve the model. In this study, we investigate these two methods in terms of performance, robustness, and complexity of the evolved programs. We use ten data sets that vary in difficulty to evaluate these two methods. We also compare them with two other GP and six non-GP methods. The results show that one-shot GP and compound-GP outperform or achieve results comparable to competitor methods. Moreover, the features extracted by these two methods improve the performance of other classifiers with handcrafted features and those extracted by a recently developed GP-based method in most cases.

  10. Audio Classification in Speech and Music: A Comparison between a Statistical and a Neural Approach

    Directory of Open Access Journals (Sweden)

    Bugatti Alessandro

    2002-01-01

    Full Text Available We focus attention on the problem of audio classification into speech and music for multimedia applications. In particular, we present a comparison between two different techniques for speech/music discrimination. The first method is based on the zero-crossing rate and Bayesian classification. It is very simple from a computational point of view, and gives good results in the case of pure music or speech. The simulation results show that some performance degradation arises when the music segment also contains speech superimposed on music, or strong rhythmic components. To overcome these problems, we propose a second method, which uses more features and is based on neural networks (specifically a multi-layer perceptron). In this case we obtain better performance, at the expense of a limited growth in computational complexity. In practice, the proposed neural network is simple to implement if a suitable polynomial is used as the activation function, and a real-time implementation is possible even on low-cost embedded systems.
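
The zero-crossing rate feature used by the first method is simple enough to compute in a few lines of pure Python (the two example frames are hypothetical):

```python
def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs whose signs differ.
    Speech alternates low-ZCR voiced and high-ZCR unvoiced segments,
    while music is typically more stable over time."""
    crossings = sum(
        1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0)
    )
    return crossings / (len(frame) - 1)

# A rapidly alternating frame (high ZCR) vs. a slowly varying one (low ZCR):
noisy = [1, -1, 1, -1, 1, -1, 1, -1, 1]
smooth = [1, 2, 3, 2, 1, 2, 3, 2, 1]
print(zero_crossing_rate(noisy))   # → 1.0
print(zero_crossing_rate(smooth))  # → 0.0
```

A Bayesian classifier would then threshold statistics of this rate over successive frames, which is why heavily rhythmic music (many sharp transients) degrades its accuracy.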

  11. Full hierarchic versus non-hierarchic classification approaches for mapping sealed surfaces at the rural-urban fringe using high-resolution satellite data.

    Science.gov (United States)

    De Roeck, Tim; Van de Voorde, Tim; Canters, Frank

    2009-01-01

    Since 2008, more than half of the world's population has been living in cities, and urban sprawl is continuing. Because of these developments, the mapping and monitoring of urban environments and their surroundings is becoming increasingly important. In this study two object-oriented approaches for high-resolution mapping of sealed surfaces are compared: a standard non-hierarchic approach and a full hierarchic approach, using both multi-layer perceptrons and decision trees as learning algorithms. Both methods outperform the standard nearest-neighbour classifier, which is used as a benchmark scenario. For the multi-layer perceptron approach, applying a hierarchic classification strategy substantially increases the accuracy of the classification. For the decision tree approach, a one-against-all hierarchic classification strategy does not lead to an improvement in classification accuracy compared to the standard all-against-all approach. Best results are obtained with the hierarchic multi-layer perceptron classification strategy, producing a kappa value of 0.77. A simple shadow reclassification procedure based on characteristics of neighbouring objects further increases the kappa value to 0.84.
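
A one-against-all hierarchic strategy, as contrasted with all-against-all here, amounts to a cascade: each stage peels off one class, and whatever falls through every stage gets the residual label. A minimal sketch (the class names, the `ndvi`/`brightness` attributes, and the thresholds are all hypothetical, not the paper's actual rules):

```python
def hierarchic_classify(x, stages, fallback):
    """One-against-all cascade: each stage is (label, binary_test); the
    first stage whose test accepts x assigns its label, and remaining
    samples fall through to the next stage."""
    for label, accepts in stages:
        if accepts(x):
            return label
    return fallback

# Hypothetical spectral rules for an image object described by a dict:
stages = [
    ("vegetation", lambda p: p["ndvi"] > 0.4),
    ("sealed",     lambda p: p["brightness"] > 0.6),
]
print(hierarchic_classify({"ndvi": 0.7, "brightness": 0.2}, stages, "bare soil"))
# → vegetation
```

In the study, each stage's binary test would itself be a trained multi-layer perceptron or decision tree rather than a fixed threshold.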

  12. A multitemporal probabilistic error correction approach to SVM classification of alpine glacier exploiting sentinel-1 images (Conference Presentation)

    Science.gov (United States)

    Callegari, Mattia; Marin, Carlo; Notarnicola, Claudia; Carturan, Luca; Covi, Federico; Galos, Stephan; Seppi, Roberto

    2016-10-01

    In mountain regions and their forelands, glaciers are a key source of melt water during the middle and late ablation season, when most of the winter snow has already melted. Furthermore, alpine glaciers are recognized as sensitive indicators of climatic fluctuations. Monitoring glacier extent changes and glacier surface characteristics (i.e. snow, firn and bare ice coverage) is therefore important for both hydrological applications and climate change studies. Satellite remote sensing data have been widely employed for glacier surface classification. Many approaches exploit optical data, such as that from Landsat. Despite the intuitive visual interpretation of optical images and the demonstrated capability to discriminate glacial surfaces thanks to the combination of different bands, the main disadvantages of available high-resolution optical sensors are their dependence on cloud conditions and their low revisit frequency. Therefore, operational monitoring strategies relying only on optical data have serious limitations. Since SAR data are insensitive to clouds, they are potentially a valid alternative to optical data for glacier monitoring. Compared to past SAR missions, the new Sentinel-1 mission provides a much higher revisit frequency (two acquisitions every 12 days) over the entire European Alps, and this number will double once Sentinel-1B is in orbit (April 2016). In this work we present a method for glacier surface classification by exploiting dual polarimetric Sentinel-1 data. The method consists of a supervised approach based on Support Vector Machine (SVM). In addition to the VV and VH signals, we tested the contribution of the local incidence angle, extracted from a digital elevation model and orbital information, as an auxiliary input feature in order to account for topographic effects. By exploiting impossible temporal transitions between different classes (e.g. if at a given date one pixel is classified as rock it cannot be classified as

  13. Exchange rate regimes and external financial stability

    Directory of Open Access Journals (Sweden)

    Stoica Ovidiu

    2016-01-01

    Full Text Available Financial stability within the framework of the global financial crisis has become a common topic for researchers and practitioners. In order to analyse the impact of exchange rate regimes on financial stability we use both the de jure and de facto exchange rate classifications. We apply the model to a 1999-2010 annual data sample for 135 countries and territories, grouped by the level of economic development. Our second focus is the investigation of the effects of the exchange rate regimes in three economic integration areas (member countries of the European Union 27, the Southern Common Market, and the Association of Southeast Asian Nations on financial stability. Our results generally support the central banks’ concerns that the flexibility of exchange rate regimes should be reduced in order to sustain financial stability; however, the findings are not robust when using alternative regime classifications.

  14. Projected changes in hydrological regimes and glacier coverage in the Ötztal Alps (Austria) based on a multi-model approach

    Science.gov (United States)

    Hanzer, Florian; Stoll, Elena; Förster, Kristian; Nemec, Johanna; Oesterle, Felix; Berlin, Stefan; Schöber, Johannes; Huttenlau, Matthias; Strasser, Ulrich

    2017-04-01

    Assessing the amount of water resources stored in mountain catchments as snow and ice, as well as the timing of meltwater production and the resulting streamflow runoff, is of high interest for glaciohydrological investigations and hydropower production. Quantification of the uncertainties included in predictions of future runoff regimes is important for long-term water resources planning. We present a multi-model investigation of the effects of future climate change on glaciers and hydrology for the Rofenache headwater catchment (98 km², approx. 1/3 glacierized) in the Ötztal Alps (Austria). Two independent glaciohydrological modeling approaches of differing complexity are applied: i) the semi-distributed hydrological model HQsim coupled to a zero-dimensional glacier evolution model, operating on daily time steps, and ii) the fully distributed energy and mass balance model AMUNDSEN extended with an empirical glacier retreat parameterization (Δh approach), operating on 3-hourly time steps. Statistically downscaled, bias-corrected, and (for the sub-daily model runs) temporally disaggregated EURO-CORDEX regional climate simulations covering the RCP2.6, RCP4.5, and RCP8.5 scenarios are used as meteorological forcing. Model results are evaluated in terms of the magnitude and change of the contributions of the individual runoff components (snowmelt, ice melt, rain) in the subcatchments as well as the change in glacier volume and area. The bandwidth of the results allows us to analyze and quantify both the uncertainties induced by the different RCM forcing data sets and those induced by the two glaciohydrological modeling approaches.

  15. Current Approaches to Diagnosis and Classification Features of Neuroosteoarthropathy Charcot (literature review

    Directory of Open Access Journals (Sweden)

    Balatiuk Irina

    2016-12-01

    Full Text Available The article provides an analysis of publications by domestic and foreign authors on diabetic Charcot osteoarthropathy, a complication of diabetes. It presents the modern domestic classification of diabetic foot syndrome. Diabetic foot syndrome is a serious medical and social problem: owing to the high level of disability among patients, it causes significant social and economic losses to society. The pathogenetic basis for the development of diabetic osteoarthropathy is the combination of uncontrolled bone resorption and the loss of protective sensation, which leads to the destruction of joints. The gold standard for the diagnosis of Charcot osteoarthropathy is X-ray densitometry, which allows an objective assessment of bone mineral density.

  16. Classification of Phylogenetic Profiles for Protein Function Prediction: An SVM Approach

    Science.gov (United States)

    Kotaru, Appala Raju; Joshi, Ramesh C.

    Predicting the function of an uncharacterized protein is a major challenge in the post-genomic era due to the problem's complexity and scale. Having knowledge of protein function is a crucial link in the development of new drugs, better crops, and even biochemicals such as biofuels. Recently, numerous high-throughput experimental procedures have been invented to investigate the mechanisms leading to the accomplishment of a protein's function, and the phylogenetic profile is one of them. A phylogenetic profile is a way of representing a protein that encodes its evolutionary history. In this paper we propose a method for the classification of phylogenetic profiles using a supervised machine learning method, support vector machine classification with a radial basis function kernel, for identifying functionally linked proteins. We experimentally evaluated the performance of the classifier with the linear kernel and the polynomial kernel, and compared the results with the existing tree kernel. In our study we used proteins of the budding yeast Saccharomyces cerevisiae genome. We generated the phylogenetic profiles of 2465 yeast genes, and we used the functional annotations available in the MIPS database. Our experiments show that the performance of the radial basis kernel is similar to that of the polynomial kernel in some functional classes; both are better than the linear and tree kernels, and overall the radial basis kernel outperformed the polynomial, linear, and tree kernels. From these results we conclude that it is feasible to use an SVM classifier with a radial basis function kernel to predict gene function from phylogenetic profiles.
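
The radial basis function kernel at the heart of this classifier is easy to state: K(x, y) = exp(-γ‖x − y‖²), and for binary presence/absence profiles the squared distance reduces to a Hamming distance. A small stdlib-only sketch (the six-genome profiles are hypothetical toy data):

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Radial basis function kernel K(x, y) = exp(-gamma * ||x - y||^2).
    For binary phylogenetic profiles, ||x - y||^2 is the Hamming distance:
    the number of genomes where presence/absence differs."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

# Hypothetical presence/absence profiles over six reference genomes:
profile_a = [1, 0, 1, 1, 0, 1]
profile_b = [1, 0, 1, 0, 0, 1]   # differs from a in one genome
profile_c = [0, 1, 0, 0, 1, 0]   # complementary to a

print(rbf_kernel(profile_a, profile_b))  # high similarity: exp(-0.5)
print(rbf_kernel(profile_a, profile_c))  # low similarity: exp(-3.0)
```

An SVM then operates entirely through this kernel matrix, which is why proteins with nearly identical evolutionary histories (small Hamming distance) end up classified as functionally linked.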

  17. A Multidimensional Item Response Modeling Approach for Improving Subscale Proficiency Estimation and Classification

    Science.gov (United States)

    Yao, Lihua; Boughton, Keith A.

    2007-01-01

    Several approaches to reporting subscale scores can be found in the literature. This research explores a multidimensional compensatory dichotomous and polytomous item response theory modeling approach for subscale score proficiency estimation, leading toward a more diagnostic solution. It also develops and explores the recovery of a Markov chain…

  18. A Feature Mining Based Approach for the Classification of Text Documents into Disjoint Classes.

    Science.gov (United States)

    Nieto Sanchez, Salvador; Triantaphyllou, Evangelos; Kraft, Donald

    2002-01-01

    Proposes a new approach for classifying text documents into two disjoint classes. Highlights include a brief overview of document clustering; a data mining approach called the One Clause at a Time (OCAT) algorithm which is based on mathematical logic; vector space model (VSM); and comparing the OCAT to the VSM. (Author/LRW)

  19. The impacts of different long-term fertilization regimes on the bioavailability of arsenic in soil: integrating chemical approach with Escherichia coli arsRp::luc-based biosensor.

    Science.gov (United States)

    Hou, Qi-Hui; Ma, An-Zhou; Lv, Di; Bai, Zhi-Hui; Zhuang, Xu-Liang; Zhuang, Guo-Qiang

    2014-07-01

    An Escherichia coli arsRp::luc-based biosensor was constructed to measure the bioavailability of arsenic (As) in soil. In previous induction experiments, it produced a linear response (R² = 0.96, P bioavailability of arsenic (As) in soil. Per the BCR-SEPs analysis, the application of M and M + NPK led to a significant (P bioavailable As in manure-fertilized (M and M + NPK) soil was significantly higher (P bioavailable As. More significantly, E. coli biosensor-determined As was only 18.46-85.17 % of exchangeable As and 20.68-90.1 % of reducible As based on BCR-SEPs. In conclusion, NPK fertilization was recommended as the more suitable regime in As-polluted soil, especially at high As concentrations, and this E. coli arsRp::luc-based biosensor is a more realistic approach for assessing the bioavailability of As in soil since it does not overrate the risk of As to the environment.

  20. A multi-scale superpixel classification approach to the detection of regions of interest in whole slide histopathology images

    Science.gov (United States)

    Bejnordi, Babak E.; Litjens, Geert; Hermsen, Meyke; Karssemeijer, Nico; van der Laak, Jeroen A. W. M.

    2015-03-01

    This paper presents a new algorithm for automatic detection of regions of interest in whole slide histopathological images. The proposed algorithm generates and classifies superpixels at multiple resolutions to detect regions of interest. The algorithm emulates the way the pathologist examines the whole slide histopathology image by processing the image at low magnifications and performing more sophisticated analysis only on areas requiring more detailed information. However, instead of the traditional usage of fixed sized rectangular patches for the identification of relevant areas, we use superpixels as the visual primitives to detect regions of interest. Rectangular patches can span multiple distinct structures, thus degrade the classification performance. The proposed multi-scale superpixel classification approach yields superior performance for the identification of the regions of interest. For the evaluation, a set of 10 whole slide histopathology images of breast tissue were used. Empirical evaluation of the performance of our proposed algorithm relative to expert manual annotations shows that the algorithm achieves an area under the Receiver operating characteristic (ROC) curve of 0.958, demonstrating its efficacy for the detection of regions of interest.

  1. Hybrid three-dimensional and support vector machine approach for automatic vehicle tracking and classification using a single camera

    Science.gov (United States)

    Kachach, Redouane; Cañas, José María

    2016-05-01

    Using video in traffic monitoring is one of the most active research domains in the computer vision community. TrafficMonitor, a system that employs a hybrid approach for automatic vehicle tracking and classification on highways using a simple stationary calibrated camera, is presented. The proposed system consists of three modules: vehicle detection, vehicle tracking, and vehicle classification. Moving vehicles are detected by an enhanced Gaussian mixture model background estimation algorithm. The design includes a technique to resolve the occlusion problem by combining a two-dimensional proximity tracking algorithm with the Kanade-Lucas-Tomasi feature tracking algorithm. The last module classifies the identified shapes into five vehicle categories: motorcycle, car, van, bus, and truck, by using three-dimensional templates and an algorithm based on histograms of oriented gradients and a support vector machine classifier. Several experiments have been performed using both real and simulated traffic in order to validate the system. The experiments were conducted on the GRAM-RTM dataset and on a new real video dataset which is made publicly available as part of this work.
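
The detection module above rests on per-pixel background modeling. As a hedged illustration, the sketch below keeps a single running Gaussian per pixel (the paper's system uses a full Gaussian *mixture*, which maintains several modes per pixel); the 4x4 frames and the intensity values are synthetic:

```python
import numpy as np

class RunningGaussianBackground:
    """Simplified single-Gaussian per-pixel background model. A pixel is
    foreground when it deviates from the running mean by more than
    k standard deviations; the model is updated only where the pixel
    still looks like background."""

    def __init__(self, shape, alpha=0.05, k=2.5):
        self.mean = np.zeros(shape)
        self.var = np.ones(shape)
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        d = frame - self.mean
        fg = np.abs(d) > self.k * np.sqrt(self.var)
        bg = ~fg
        self.mean[bg] += self.alpha * d[bg]
        self.var[bg] += self.alpha * (d[bg] ** 2 - self.var[bg])
        return fg

model = RunningGaussianBackground((4, 4))
for _ in range(50):                        # learn a static grey background
    model.apply(np.full((4, 4), 0.5))
frame = np.full((4, 4), 0.5)
frame[1, 1] = 5.0                          # a bright "vehicle" pixel appears
mask = model.apply(frame)                  # foreground mask for this frame
```

A mixture model extends this by keeping several (mean, variance, weight) triples per pixel, which handles repetitive background motion such as waving vegetation.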

  2. Improved MODIS aerosol retrieval in urban areas using a land classification approach and empirical orthogonal functions

    Science.gov (United States)

    Levitan, Nathaniel; Gross, Barry

    2016-10-01

    New, high-resolution aerosol products are required in urban areas to improve the spatial coverage of the products, in terms of both resolution and retrieval frequency. These new products will improve our understanding of the spatial variability of aerosols in urban areas and will be useful in the detection of localized aerosol emissions. Urban aerosol retrieval is challenging for existing algorithms because of the high spatial variability of the surface reflectance, indicating the need for improved urban surface reflectance models. This problem can be stated in the language of novelty detection as the problem of selecting aerosol parameters whose effective surface reflectance spectrum is not an outlier in some space. In this paper, empirical orthogonal functions, a reconstruction-based novelty detection technique, are used to perform single-pixel aerosol retrieval using the single angular and temporal sample provided by the MODIS sensor. The empirical orthogonal basis functions are trained for different land classes using the MODIS BRDF MCD43 product. Existing land classification products are used in training and aerosol retrieval. The retrieval is compared against the existing operational MODIS 3 KM Dark Target (DT) aerosol product and co-located AERONET data. Based on the comparison, our method allows for a significant increase in retrieval frequency and a moderate decrease in the known biases of MODIS urban aerosol retrievals.
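
Reconstruction-based novelty detection with empirical orthogonal functions works by projecting a candidate spectrum onto a low-dimensional EOF subspace learned from training spectra and scoring it by the reconstruction residual. A NumPy sketch with synthetic six-band "spectra" (the training set is deliberately constructed to live in a two-dimensional subspace; none of this is the paper's actual data):

```python
import numpy as np

def fit_eofs(spectra, k):
    """Learn the top-k empirical orthogonal functions (principal axes)
    of a set of training surface-reflectance spectra."""
    mu = spectra.mean(axis=0)
    _, _, Vt = np.linalg.svd(spectra - mu, full_matrices=False)
    return mu, Vt[:k]

def novelty_score(spectrum, mu, eofs):
    """Reconstruction error after projecting onto the EOF subspace; a
    large residual marks the spectrum as an outlier, i.e. a poor choice
    of aerosol parameters."""
    c = (spectrum - mu) @ eofs.T
    recon = mu + c @ eofs
    return np.linalg.norm(spectrum - recon)

rng = np.random.default_rng(2)
basis = rng.normal(size=(2, 6))
train = rng.normal(size=(100, 2)) @ basis        # spectra living in a plane
mu, eofs = fit_eofs(train, k=2)
inlier = np.array([1.0, -0.5]) @ basis           # consistent with training
outlier = inlier + np.array([3, 0, 0, 0, 0, 0])  # perturbed off the plane
```

In the retrieval, the aerosol parameters are varied and the candidate whose atmospherically corrected spectrum yields the smallest novelty score is selected.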

  3. A multi-modal approach for hand motion classification using surface EMG and accelerometers.

    Science.gov (United States)

    Fougner, A; Scheme, E; Chan, A D C; Englehart, K; Stavdahl, Ø

    2011-01-01

    For decades, electromyography (EMG) has been used for diagnostics, upper-limb prosthesis control, and recently even for more general human-machine interfaces. Current commercial upper-limb prostheses usually have only two electrode sites due to cost and space limitations, while researchers often experiment with multiple sites. Micro-machined inertial sensors are gaining popularity in many commercial and research applications where knowledge of the postures and movements of the body is desired. In the present study, we have investigated whether accelerometers, which are relatively cheap, small, robust to noise, and easily integrated into a prosthetic socket, can reduce the need for adding more electrode sites to the prosthesis control system. This was done by adding accelerometers to a multifunction system and also to a simplified system more similar to current commercially available prosthesis controllers, and assessing the resulting changes in classification accuracy. The accelerometer does not provide information on muscle force like EMG electrodes do, but the results show that it provides useful supplementary information. Specifically, if one wants to improve a two-site EMG system, one should add an accelerometer affixed to the forearm rather than a third electrode.
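
A multi-modal classifier of this kind typically concatenates time-domain EMG features with accelerometer statistics into one feature vector. A stdlib-only sketch using two common EMG features (mean absolute value and zero-crossing count; the specific feature set and the sample signals are illustrative assumptions, not the study's exact choices):

```python
def mean_absolute_value(x):
    return sum(abs(v) for v in x) / len(x)

def zero_crossings(x):
    return sum(1 for a, b in zip(x, x[1:]) if (a >= 0) != (b >= 0))

def feature_vector(emg_channels, accel_axes):
    """Concatenate time-domain EMG features with per-axis mean acceleration.
    The accelerometer contributes posture information (the gravity
    component) that EMG amplitude alone does not carry."""
    feats = []
    for ch in emg_channels:
        feats += [mean_absolute_value(ch), zero_crossings(ch)]
    for axis in accel_axes:
        feats.append(sum(axis) / len(axis))   # gravity component ~ posture
    return feats

# Two hypothetical EMG channels and a three-axis accelerometer window:
emg = [[0.1, -0.2, 0.15, -0.1], [0.05, 0.04, -0.03, 0.02]]
acc = [[0.0, 0.1, 0.0], [9.7, 9.8, 9.8], [0.2, 0.1, 0.1]]
fv = feature_vector(emg, acc)
print(len(fv))   # → 7 (2 features x 2 EMG channels + 3 accel axes)
```

The combined vector then feeds whatever classifier the control system uses, so adding an accelerometer grows the feature space without adding electrode sites.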

  4. Snow white and the seven dwarfs: a multivariate approach to classification of cold tolerance.

    Science.gov (United States)

    Nedved, O

    2000-01-01

    Two main cold hardiness strategies of insects - freeze tolerance in some species, and overwintering in a supercooled state without tolerance of freezing in many others - were recently reclassified. However, I present several problems with the current systems. My suggested classification is based on clearer definitions of the causes of cold injury. I recognize three main mortality factors: freezing of body liquids, cold shock, and cumulative chill injury. Presence or absence of each of these factors produce eight combinations. I have named the eight classes after Snow White and the Seven Dwarfs to avoid nomenclatural confusion. Some of these classes are probably not used as tactics against cold injury by any insect species. Other classes contain so many species that they might be reclassified in more detail, using values of supercooling point and other quantitative parameters. However, widely comparable parameters, like the upper limit of cold injury zone and the sum of injurious temperatures are still rarely published, thus we still lack comprehensive data for multivariate analyses. Every cold hardiness strategy should be characterized by a meaningful class or subclass together with the physiological, biochemical, and behavioural mechanisms employed by the insects. I also point out the existence of strategies that combine two tactics - either a switching strategy (during preparation for winter, population "chooses" which tactic will be used), or a dual strategy (individuals are ready to use one of the tactics depending on the prevailing environmental conditions).
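
The combinatorics behind the "Snow White and the Seven Dwarfs" naming is simply the enumeration of presence/absence of the three mortality factors, which yields 2³ = 8 classes:

```python
from itertools import product

factors = ("freezing of body liquids", "cold shock", "cumulative chill injury")

# Each combination of susceptible (True) / not susceptible (False) to the
# three mortality factors defines one of the 2**3 = 8 classes.
classes = [
    dict(zip(factors, susceptibility))
    for susceptibility in product([True, False], repeat=3)
]
print(len(classes))  # → 8
```

Which of the eight combinations maps to which character name is the author's nomenclature and is not reproduced here.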

  5. Contemporary approach to diagnosis and classification of renal cell carcinoma with mixed histologic features

    Institute of Scientific and Technical Information of China (English)

    Kanishka Sircar; Priya Rao; Eric Jonasch; Federico A.Monzon; Pheroze Tamboli

    2013-01-01

    Renal cell carcinoma (RCC) is an important contributor to cancer-specific mortality worldwide. Targeted agents that inhibit key subtype-specific signaling pathways have improved survival times and have recently become part of the standard of care for this disease. Accurately diagnosing and classifying RCC on the basis of tumor histology is thus critical. RCC has been traditionally divided into clear-cell and non-clear-cell categories, with papillary RCC forming the most common subtype of non-clear-cell RCC. Renal neoplasms with overlapping histologies, such as tumors with mixed clear-cell and papillary features and hybrid renal oncocytic tumors, are increasingly seen in contemporary practice and present a diagnostic challenge with important therapeutic implications. In this review, we discuss the histologic, immunohistochemical, cytogenetic, and clinicopathologic aspects of these differential diagnoses and illustrate how the classification of RCC has evolved to integrate both the tumor's microscopic appearance and its molecular fingerprint.

  6. A physical approach to the classification of indecomposable Virasoro representations from the blob algebra

    Energy Technology Data Exchange (ETDEWEB)

    Gainutdinov, Azat M. [Institut de Physique Théorique, CEA Saclay, 91191 Gif Sur Yvette (France); Jacobsen, Jesper Lykke [LPTENS, 24 rue Lhomond, 75231 Paris (France); Université Pierre et Marie Curie, 4 place Jussieu, 75252 Paris (France); Saleur, Hubert [Institut de Physique Théorique, CEA Saclay, 91191 Gif Sur Yvette (France); Department of Physics, University of Southern California, Los Angeles, CA 90089-0484 (United States); Vasseur, Romain, E-mail: romain.vasseur@cea.fr [Institut de Physique Théorique, CEA Saclay, 91191 Gif Sur Yvette (France); LPTENS, 24 rue Lhomond, 75231 Paris (France)

    2013-08-21

    In the context of Conformal Field Theory (CFT), many results can be obtained from the representation theory of the Virasoro algebra. While the interest in Logarithmic CFTs has been growing recently, the Virasoro representations corresponding to these quantum field theories remain dauntingly complicated, thus hindering our understanding of various critical phenomena. We extend in this paper the construction of Read and Saleur (2007) [1,2], and uncover a deep relationship between the Virasoro algebra and a finite-dimensional algebra characterizing the properties of two-dimensional statistical models, the so-called blob algebra (a proper extension of the Temperley–Lieb algebra). This allows us to explore vast classes of Virasoro representations (projective, tilting, generalized staggered modules, etc.), and to conjecture a classification of all possible indecomposable Virasoro modules (with, in particular, L₀ Jordan cells of arbitrary rank) that may appear in a consistent physical Logarithmic CFT where Virasoro is the maximal local chiral algebra. As by-products, we solve and analyze algebraically quantum-group symmetric XXZ spin chains and sl(2|1) supersymmetric spin chains with extra spins at the boundary, together with the "mirror" spin chain introduced by Martin and Woodcock (2003) [3].

  7. A supervised learning approach for taxonomic classification of core-photosystem-II genes and transcripts in the marine environment

    Directory of Open Access Journals (Sweden)

    Polz Martin F

    2009-05-01

    Full Text Available Background: Cyanobacteria of the genera Synechococcus and Prochlorococcus play a key role in marine photosynthesis, which contributes to the global carbon cycle and to the world oxygen supply. Recently, genes encoding the photosystem II reaction center (psbA and psbD were found in cyanophage genomes. This phenomenon suggested that the horizontal transfer of these genes may be involved in increasing phage fitness. To date, a very small percentage of marine bacteria and phages has been cultured. Thus, mapping genomic data extracted directly from the environment to its taxonomic origin is necessary for a better understanding of phage-host relationships and dynamics. Results: To achieve an accurate and rapid taxonomic classification, we employed a computational approach combining a multi-class Support Vector Machine (SVM with a codon usage position specific scoring matrix (cuPSSM. Our method has been applied successfully to classify core-photosystem-II gene fragments, including partial sequences coming directly from the ocean, to seven different taxonomic classes. Applying the method on a large set of DNA and RNA psbA clones from the Mediterranean Sea, we studied the distribution of cyanobacterial psbA genes and transcripts in their natural environment. Using our approach, we were able to simultaneously examine taxonomic and ecological distributions in the marine environment. Conclusion: The ability to accurately classify the origin of individual genes and transcripts coming directly from the environment is of great importance in studying marine ecology. The classification method presented in this paper could be applied further to classify other genes amplified from the environment, for which training data is available.

  8. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    Science.gov (United States)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks the conventional practice is to decode motor intentions by using scalp EEG. However, scalp EEG only reveals certain limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG to an unsupervised factor analysis to make it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, that is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis on the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus it may lead to new strategies for BCI-based neurorehabilitation.

  9. A learning-based similarity fusion and filtering approach for biomedical image retrieval using SVM classification and relevance feedback.

    Science.gov (United States)

    Rahman, Md Mahmudur; Antani, Sameer K; Thoma, George R

    2011-07-01

    This paper presents a classification-driven biomedical image retrieval framework based on image filtering and similarity fusion, employing supervised learning techniques. In this framework, the probabilistic outputs of a multiclass support vector machine (SVM) classifier, as category predictions for the query and database images, are first exploited to filter out irrelevant images, thereby reducing the search space for similarity matching. Images are classified at a global level according to their modalities based on different low-level, concept, and keypoint-based features. It is difficult to find a unique feature that compares images effectively for all types of queries. Hence, a query-specific adaptive linear combination of similarity matching is proposed, relying on the image classification and feedback information from users. Based on the predicted category of a query image, individual precomputed weights of the different features are adjusted online. The prediction of the classifier may be inaccurate in some cases, and a user might have a different semantic interpretation of the retrieved images. Hence, the weights are finally determined by considering both the precision and the rank-order information of each individual feature representation, taking the top retrieved relevant images as judged by the users. As a result, the system can adapt itself to individual searches to produce query-specific results. Experiments were performed on a diverse collection of 5,000 biomedical images of different modalities, body parts, and orientations. They demonstrate the efficiency (about half the computation time compared to searching the entire collection) and effectiveness (about 10%-15% improvement in precision at each recall level) of the retrieval approach.
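
The adaptive linear fusion step amounts to a weighted sum of per-feature similarity scores, with the weights tuned per query. A stdlib-only sketch (the two feature channels, their similarity functions, and the weights are hypothetical stand-ins for the paper's low-level, concept, and keypoint features):

```python
def fused_similarity(query_feats, db_feats, weights, sims):
    """Adaptive linear fusion: combine per-feature similarities with
    query-specific weights (assumed here to be already adjusted from the
    classifier output and user relevance feedback)."""
    total = sum(weights)
    return sum(
        w * sim(q, d)
        for w, sim, q, d in zip(weights, sims, query_feats, db_feats)
    ) / total

# Two hypothetical feature channels with trivial similarity functions:
overlap = lambda a, b: len(set(a) & set(b)) / len(set(a) | set(b))   # Jaccard
closeness = lambda a, b: 1.0 / (1.0 + abs(a - b))

q = (["ct", "chest"], 0.3)           # (keyword set, global intensity)
d = (["ct", "abdomen"], 0.5)
score = fused_similarity(q, d, weights=[0.7, 0.3], sims=[overlap, closeness])
```

In the full framework, the weights would be re-estimated from the precision and rank order each feature achieves on the user-judged top results, making the fusion query-specific.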

  10. Classification of Suncus murinus species complex (Soricidae: Crocidurinae) in Peninsular Malaysia using image analysis and machine learning approaches.

    Science.gov (United States)

    Abu, Arpah; Leow, Lee Kien; Ramli, Rosli; Omar, Hasmahzaiti

    2016-12-22

    Taxonomists frequently identify specimens from various populations based on morphological characteristics and molecular data. This study looks into another process for identification of the house shrew (Suncus murinus), using image analysis and machine learning approaches. To this end, an automated identification system was developed to assist with and simplify this task. In this study, seven shape-based descriptors, namely area, convex area, major axis length, minor axis length, perimeter, equivalent diameter and extent, are used as features to represent digital images of skulls, consisting of dorsal, lateral and jaw views for each specimen. An Artificial Neural Network (ANN) is used as the classifier to classify the skulls of S. murinus by region (northern and southern populations of Peninsular Malaysia) and sex (adult male and female). Specimen classification using the training data set and identification using the testing data set were performed through two stages of ANNs. At present, the classifier has achieved an accuracy of 100% based on skull views. Classification and identification by region and sex have also attained accuracies of 72.5%, 87.5% and 80.0% for the dorsal, lateral, and jaw views, respectively. These results show that the shape characteristic features used are substantial, because they can differentiate the specimens by region and sex up to an accuracy of 80% and above. Finally, an application was developed that can be used by the scientific community. This automated system demonstrates the practicability of using computer-assisted systems to provide a quick and easy alternative approach for the identification of unknown species.

  11. Seasonal Separation of African Savanna Components Using Worldview-2 Imagery: A Comparison of Pixel- and Object-Based Approaches and Selected Classification Algorithms

    Directory of Open Access Journals (Sweden)

    Żaneta Kaszta

    2016-09-01

    Full Text Available Separation of savanna land cover components is challenging due to the high heterogeneity of this landscape and the spectral similarity of compositionally different vegetation types. In this study, we tested the usability of very high spatial and spectral resolution WorldView-2 (WV-2) imagery to classify land cover components of African savanna in the wet and dry seasons. We compared the performance of Object-Based Image Analysis (OBIA) and the pixel-based approach with several algorithms: k-nearest neighbor (k-NN), maximum likelihood (ML), random forests (RF), classification and regression trees (CART) and support vector machines (SVM). Results showed that classifications of WV-2 imagery produce high-accuracy results (>77%) regardless of the applied classification approach. However, OBIA had a significantly higher accuracy for almost every classifier, with the highest overall accuracy score of 93%. Amongst the tested classifiers, SVM and RF provided the highest accuracies. Overall, classifications of the wet season image provided better results, with 93% for RF. However, considering woody leaf-off conditions, the dry season classification also performed well, with an overall accuracy of 83% (SVM) and a high producer accuracy for tree cover (91%). Our findings demonstrate the potential of imagery like WorldView-2 combined with OBIA and advanced supervised machine-learning algorithms for seasonal fine-scale land cover classification of African savanna.

  12. Approach for classification and taxonomy within family Rickettsiaceae based on the Formal Order Analysis.

    Science.gov (United States)

    Shpynov, S; Pozdnichenko, N; Gumenuk, A

    2015-01-01

    Genome sequences of 36 Rickettsia and Orientia were analyzed using Formal Order Analysis (FOA). This approach takes into account the arrangement of nucleotides in each sequence. A numerical characteristic, the average distance (remoteness) "g", was used to compare genomes. Our results corroborated the previous separation of three groups within the genus Rickettsia, including the typhus group, the classic spotted fever group, and the ancestral group, with Orientia as a separate genus. Rickettsia felis URRWXCal2 and R. akari Hartford were not in the same group based on FOA; therefore, the designation of a so-called transitional Rickettsia group could not be confirmed with this approach.

  13. A visual data-mining approach using 3D thoracic CT images for classification between benign and malignant pulmonary nodules

    Science.gov (United States)

    Kawata, Yoshiki; Niki, Noboru; Ohamatsu, Hironobu; Kusumoto, Masahiko; Kakinuma, Ryutaro; Mori, Kiyoshi; Yamada, K.; Nishiyama, Hiroyuki; Eguchi, Kenji; Kaneko, Masahiro; Moriyama, Noriyuki

    2003-05-01

    This paper presents a visual data-mining approach to assist physicians in classification between benign and malignant pulmonary nodules. This approach retrieves and displays nodules which exhibit morphological and internal profiles consistent with the nodule in question. It uses a three-dimensional (3-D) CT image database of pulmonary nodules for which the diagnosis is known. The central module in this approach enables analysis of the query nodule image and extraction of the features of interest: shape, surrounding structure, and internal structure of the nodules. The nodule shape is characterized by principal axes, while the surrounding and internal structure is represented by the distribution pattern of CT density and 3-D curvature indexes. The nodule representation is then applied to a similarity measure such as a correlation coefficient. For each query case, we sort all the nodules of the database from most to least similar. By applying the retrieval method to our database, we demonstrate its feasibility for searching similar 3-D nodule images.
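
    The retrieval step described above (ranking database nodules by a correlation-coefficient similarity to the query) can be sketched with synthetic feature vectors; the 32-dimensional features here are hypothetical stand-ins for the CT density and curvature descriptors:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical feature vectors, one row per database nodule.
database = rng.normal(size=(20, 32))
query = database[7] + rng.normal(scale=0.1, size=32)  # near-duplicate of nodule 7

# Similarity measure: Pearson correlation coefficient with each nodule.
def corr(a, b):
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = np.array([corr(query, row) for row in database])
order = np.argsort(-scores)  # sorted from most to least similar
print(order[:3])
```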

  14. Investigating the Predictive Value of Functional MRI to Appetitive and Aversive Stimuli: A Pattern Classification Approach

    Science.gov (United States)

    McCabe, Ciara; Rocha-Rego, Vanessa

    2016-01-01

    Background: Dysfunctional neural responses to appetitive and aversive stimuli have been investigated as possible biomarkers for psychiatric disorders. However, it is not clear to what degree these are separate processes across the brain or in fact overlapping systems. To help clarify this issue we used Gaussian process classifier (GPC) analysis to examine appetitive and aversive processing in the brain. Method: 25 healthy controls underwent functional MRI whilst seeing pictures and receiving tastes of pleasant and unpleasant food. We applied GPCs to discriminate between the appetitive and aversive sights and tastes using functional activity patterns. Results: The accuracy of the GPC in discriminating the appetitive taste from the neutral condition was 86.5% (specificity = 81%, sensitivity = 92%, p = 0.001). If a participant experienced neutral taste stimuli, the probability of correct classification was 92%. The accuracy in discriminating aversive from neutral taste stimuli was 82.5% (specificity = 73%, sensitivity = 92%, p = 0.001) and appetitive from aversive taste stimuli 73% (specificity = 77%, sensitivity = 69%, p = 0.001). In the sight modality, the accuracy in discriminating the appetitive from the neutral condition was 88.5% (specificity = 85%, sensitivity = 92%, p = 0.001), in discriminating aversive from neutral sight stimuli 92% (specificity = 92%, sensitivity = 92%, p = 0.001), and in discriminating aversive from appetitive sight stimuli 63.5% (specificity = 73%, sensitivity = 54%, p = 0.009). Conclusions: Our results demonstrate the predictive value of neurofunctional data in discriminating emotional and neutral networks of activity in the healthy human brain. It would be of interest to use pattern recognition techniques and fMRI to examine network dysfunction in the processing of appetitive, aversive and neutral stimuli in psychiatric disorders, especially where problems with reward and punishment processing have been implicated.
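
    The specificity/sensitivity/accuracy figures quoted above are related by simple counts of true and false classifications. A minimal sketch with made-up leave-one-out predictions (the numbers below are illustrative and do not reproduce the paper's exact confusion matrix):

```python
import numpy as np

# Hypothetical per-subject predictions (1 = appetitive, 0 = neutral).
y_true = np.array([1] * 13 + [0] * 13)
y_pred = np.array([1] * 12 + [0] + [0] * 11 + [1, 1])

tp = np.sum((y_true == 1) & (y_pred == 1))
tn = np.sum((y_true == 0) & (y_pred == 0))
sensitivity = tp / np.sum(y_true == 1)  # fraction of appetitive trials caught
specificity = tn / np.sum(y_true == 0)  # fraction of neutral trials caught
accuracy = (tp + tn) / len(y_true)      # overall correct-classification rate
print(round(sensitivity, 3), round(specificity, 3), round(accuracy, 3))
```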

  15. Geomorphons — a pattern recognition approach to classification and mapping of landforms

    Science.gov (United States)

    Jasiewicz, Jarosław; Stepinski, Tomasz F.

    2013-01-01

    We introduce a novel method for classification and mapping of landform elements from a DEM based on the principle of pattern recognition rather than differential geometry. At the core of the method is the concept of geomorphon (geomorphologic phonotypes) — a simple ternary pattern that serves as an archetype of a particular terrain morphology. A finite number of 498 geomorphons constitutes a comprehensive and exhaustive set of all possible morphological terrain types, including standard elements of landscape as well as unfamiliar forms rarely found in natural terrestrial surfaces. A single scan of a DEM assigns an appropriate geomorphon to every cell in the raster using a procedure that self-adapts to identify the most suitable spatial scale at each location. As a result, the method classifies landform elements at a range of different spatial scales with unprecedented computational efficiency. A general-purpose geomorphometric map — an interpreted map of topography — is obtained by generalizing all geomorphons to a small number of the most common landform elements. Due to the robustness and high computational efficiency of the method, high-resolution geomorphometric maps with continental and even global extents can be generated from giga-cell DEMs. Such maps are a valuable new resource for both manual and automated geomorphometric analyses. In order to demonstrate a practical application of this new method, a geomorphometric map of the entire country of Poland at 30 m per cell is generated, and the features and potential usage of this map are briefly discussed. The computer implementation of the method is outlined. The code is available in the public domain.
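
    The core of a geomorphon is an 8-neighbour ternary pattern around each DEM cell. A minimal sketch with a fixed one-cell lookup distance (the real method self-adapts the lookup scale and uses line-of-sight angles rather than raw height differences):

```python
import numpy as np

# Each of the 8 neighbours is coded -1/0/+1 depending on whether it lies
# below, level with, or above the centre cell (simplified geomorphon idea).
def geomorphon_pattern(dem, r, c, flat=1e-6):
    center = dem[r, c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    pattern = []
    for dr, dc in offsets:
        diff = dem[r + dr, c + dc] - center
        pattern.append(0 if abs(diff) <= flat else (1 if diff > 0 else -1))
    return tuple(pattern)

dem = np.array([[2.0, 2.0, 2.0],
                [2.0, 1.0, 2.0],
                [2.0, 2.0, 2.0]])
# A local pit: every neighbour is higher than the centre cell.
print(geomorphon_pattern(dem, 1, 1))
```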

  16. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme, as an effective approach to medical diagnosis. Getting to know the advantages and disadvantages of each computational intelligence technique in recent years, the time has come for p...

  17. Multimodal fusion framework: a multiresolution approach for emotion classification and recognition from physiological signals.

    Science.gov (United States)

    Verma, Gyanendra K; Tiwary, Uma Shanker

    2014-11-15

    The purpose of this paper is twofold: (i) to investigate emotion representation models and find out the possibility of a model with a minimum number of continuous dimensions, and (ii) to recognize and predict emotion from measured physiological signals using a multiresolution approach. The multimodal physiological signals are: electroencephalogram (EEG) (32 channels) and peripheral signals (8 channels: galvanic skin response (GSR), blood volume pressure, respiration pattern, skin temperature, electromyogram (EMG) and electrooculogram (EOG)), as given in the DEAP database. We discuss theories of emotion modeling based on (i) basic emotions, (ii) the cognitive appraisal and physiological response approach and (iii) the dimensional approach, and propose a three-continuous-dimensional representation model for emotions. A clustering experiment on the given valence, arousal and dominance values of various emotions was done to validate the proposed model. A novel approach for multimodal fusion of information from a large number of channels to classify and predict emotions is also proposed. The Discrete Wavelet Transform, a classical transform for multiresolution analysis of signals, is used in this study. Experiments are performed to classify different emotions with four classifiers. The average accuracies are 81.45%, 74.37%, 57.74% and 75.94% for the SVM, MLP, KNN and MMC classifiers, respectively. The best accuracy is for 'Depressing', with 85.46% using SVM. The 32 EEG channels are considered as independent modes and features from each channel are treated with equal importance. Some of the channel data may be correlated, but they may also contain supplementary information. In comparison with results reported by others, the high accuracy of 85% with 13 emotions and 32 subjects clearly proves the potential of our multimodal fusion approach.
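
    The multiresolution step can be illustrated with a hand-rolled one-level Haar wavelet transform; the test signal and the per-band energy features below are illustrative assumptions, not the authors' exact feature set:

```python
import numpy as np

# One level of a Haar wavelet decomposition: low-pass (approximation) and
# high-pass (detail) coefficients of a 1-D signal.
def haar_dwt(signal):
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:
        x = x[:-1]                                # truncate odd-length signals
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

# Simple per-band statistics over several decomposition levels.
def band_features(signal, levels=3):
    feats = []
    a = np.asarray(signal, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats += [float(np.mean(d ** 2)), float(np.std(d))]
    return feats

t = np.linspace(0, 1, 128)
eeg_like = np.sin(2 * np.pi * 10 * t) + 0.1 * np.sin(2 * np.pi * 40 * t)
features = band_features(eeg_like)
print(len(features))  # two features per decomposition level
```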

  18. A Gene Selection Approach based on Clustering for Classification Tasks in Colon Cancer

    Directory of Open Access Journals (Sweden)

    José Antonio CASTELLANOS GARZÓN

    2016-06-01

    Full Text Available Gene selection (GS) is an important research area in the analysis of DNA-microarray data, since it involves discovering genes that are meaningful for a particular target annotation or able to discriminate expression profiles of samples coming from different populations. In this context, a large number of filter methods have been proposed in the literature to identify subsets of relevant genes in accordance with prefixed targets. Despite the large number of proposals, the complexity imposed by this problem (GS) remains a challenge. Hence, this paper proposes a novel approach to gene selection that applies clustering techniques and then filter methods on the resulting groupings to achieve informative gene subsets. As a result of applying our methodology to colon cancer data, we have identified the most informative gene subset among several candidate subsets. Accordingly, the results obtained demonstrate the reliability of the approach given in this paper.
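
    A minimal sketch of the cluster-then-filter idea on synthetic expression data, assuming k-means for the gene groupings and an ANOVA F-score as the filter method (the abstract does not prescribe these exact choices):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import f_classif

rng = np.random.default_rng(2)

# Toy expression matrix: 40 samples x 30 "genes"; genes 0-4 discriminate classes.
y = np.array([0] * 20 + [1] * 20)
X = rng.normal(size=(40, 30))
X[y == 1, :5] += 2.0

# Step 1: group genes with similar expression profiles (cluster the columns).
km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(X.T)

# Step 2: within each cluster, keep the gene with the best filter score,
# yielding one informative, non-redundant representative per group.
scores, _ = f_classif(X, y)
selected = [int(np.arange(30)[km.labels_ == k][np.argmax(scores[km.labels_ == k])])
            for k in range(6)]
print(sorted(selected))
```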

  19. MULTILEVEL APPROACH OF CBIR TECHNIQUES FOR VEGETABLE CLASSIFICATION USING HYBRID IMAGE FEATURES

    Directory of Open Access Journals (Sweden)

    D. Latha

    2016-02-01

    Full Text Available CBIR is a technique to retrieve images semantically relevant to a query image from an image database. The challenge in CBIR is to develop a method that increases retrieval accuracy and reduces retrieval time. In order to improve retrieval accuracy and runtime, a multilevel CBIR approach is proposed in this paper. In the first level, color attributes such as the mean and standard deviation are computed in HSV color space to retrieve the images with minimum disparity distance from the database. To narrow the search area, in the second level the Local Ternary Pattern is computed on the images selected in the first level. Experimental results and comparisons demonstrate the superiority of the proposed approach.
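
    The first-level features (mean and standard deviation per HSV channel) can be sketched as follows; the synthetic images and the Euclidean disparity distance are illustrative assumptions:

```python
import colorsys
import numpy as np

# Mean and standard deviation of the H, S and V channels -> 6-d descriptor.
def hsv_moments(rgb):  # rgb: H x W x 3 floats in [0, 1]
    hsv = np.array([colorsys.rgb_to_hsv(*px) for px in rgb.reshape(-1, 3)])
    return np.concatenate([hsv.mean(axis=0), hsv.std(axis=0)])

rng = np.random.default_rng(3)
img_a = rng.uniform(0.0, 0.3, (8, 8, 3))  # dark image
img_b = rng.uniform(0.7, 1.0, (8, 8, 3))  # bright image

fa, fb = hsv_moments(img_a), hsv_moments(img_b)
# First-level ranking key: disparity distance between the two descriptors.
print(round(float(np.linalg.norm(fa - fb)), 3))
```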

  20. EMAIL SPAM CLASSIFICATION USING HYBRID APPROACH OF RBF NEURAL NETWORK AND PARTICLE SWARM OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    Mohammed Awad

    2016-07-01

    Full Text Available Email is one of the most popular communication media in the current century; it has become an effective and fast method to share information and exchange data all over the world. In recent years, email users have been facing the problem of spam emails. Spam emails are unsolicited bulk emails sent by spammers. They consume storage on mail servers, waste users' time and consume network bandwidth. Many methods are used in spam filtering to classify email messages into two groups, spam and non-spam. In general, one of the most powerful tools used for data classification is Artificial Neural Networks (ANNs); they have the capability of dealing with huge amounts of high-dimensional data with good accuracy. One important type of ANN is the Radial Basis Function Neural Network (RBFNN), which is used in this work to classify spam messages. In this paper, we present a new spam filtering technique which combines an RBFNN and the Particle Swarm Optimization (PSO) algorithm (HC-RBFPSO). The proposed approach uses the PSO algorithm to optimize the RBFNN parameters, depending on the evolutionary heuristic search process of PSO. PSO is used to optimize the best positions of the RBFNN centers c. The radii r are optimized using the K-Nearest Neighbors algorithm, and the weights w are optimized using the Singular Value Decomposition algorithm within each iterative process of PSO, depending on the fitness (error) function. The experiments are conducted on a spam dataset, namely SPAMBASE, downloaded from the UCI Machine Learning Repository. The experimental results show that our approach performs well in accuracy compared with other approaches that use the same dataset.
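
    With the centers and radii fixed, the RBFNN output weights have the closed-form least-squares/SVD solution the abstract mentions. A minimal sketch on toy two-class data, where hand-picked centers stand in for the PSO-optimized ones and the k-NN radius step is replaced by a constant:

```python
import numpy as np

rng = np.random.default_rng(4)

# Gaussian RBF design matrix: one basis function per center.
def rbf_design(X, centers, radius):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * radius ** 2))

# Toy two-class "spam/ham" data in a 2-D feature space.
X = np.vstack([rng.normal(0, 0.5, (40, 2)), rng.normal(3, 0.5, (40, 2))])
y = np.array([0.0] * 40 + [1.0] * 40)

centers = np.array([[0.0, 0.0], [3.0, 3.0]])   # stand-ins for PSO output
Phi = rbf_design(X, centers, radius=1.0)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)    # SVD-based weight solve

pred = (rbf_design(X, centers, 1.0) @ w > 0.5).astype(float)
accuracy = float((pred == y).mean())
print(accuracy)
```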

  1. PCA-based ANN approach to leak classification in the main pipes of VVER-1000

    Energy Technology Data Exchange (ETDEWEB)

    Hadad, Kamal; Jabbari, Masoud; Tabadar, Z. [Shiraz Univ. (Iran, Islamic Republic of). School of Mechanical Engineering; Hashemi-Tilehnoee, Mehdi [Islamic Azad Univ., Aliabad Katoul (Iran, Islamic Republic of). Dept. of Engineering

    2012-11-15

    This paper presents a neural network based fault diagnosing approach which allows dynamic crack and leak fault identification. The method utilizes the Principal Component Analysis (PCA) technique to reduce the problem dimension. Such a dimension reduction approach leads to faster diagnosing and allows a better graphic presentation of the results. To show the effectiveness of the proposed approach, two methodologies are used to train the neural network (NN). At first, a training matrix composed of 14 variables is used to train a Multilayer Perceptron (MLP) neural network with the Resilient Backpropagation (RBP) algorithm. Employing the proposed method, a more accurate and simpler network is then designed, where the input size is reduced from 14 to 6 variables for training the NN. In short, the application of PCA greatly reduces the network topology and allows employing more efficient training algorithms. The accuracy, generalization ability, and reliability of the designed networks are verified using data from 10 simulated events from a VVER-1000 simulation with the DINAMIKA-97 code. Noise is added to the data to evaluate the robustness of the method, which again proves to be effective and powerful. (orig.)
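
    The PCA step (reducing 14 correlated input variables to 6 components before NN training) can be sketched on synthetic data; the latent-factor construction below is an illustrative assumption, not the simulated VVER-1000 signals:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)

# 14 correlated variables generated from 6 latent factors plus small noise,
# mimicking a situation where 6 principal components suffice.
latent = rng.normal(size=(200, 6))
mixing = rng.normal(size=(6, 14))
X = latent @ mixing + 0.01 * rng.normal(size=(200, 14))

pca = PCA(n_components=6).fit(X)
X_reduced = pca.transform(X)  # this 6-column matrix would feed the NN

print(X_reduced.shape, round(float(pca.explained_variance_ratio_.sum()), 4))
```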

  2. Sustainable urban regime adjustments

    DEFF Research Database (Denmark)

    Quitzau, Maj-Britt; Jensen, Jens Stissing; Elle, Morten

    2013-01-01

    The endogenous agency that urban governments increasingly portray by making conscious and planned efforts to adjust the regimes they operate within is currently not well captured in transition studies. There is a need to acknowledge the ambiguity of regime enactment at the urban scale. This directs...... attention to the transformative implications of conscious strategic maneuvering by incumbent regime actors, when confronting regime structurations. This article provides insight to processes of regime enactment performed by local governments by applying a flow-oriented perspective on regime dynamics...

  3. Machine-learning approach for local classification of crystalline structures in multiphase systems

    Science.gov (United States)

    Dietz, C.; Kretz, T.; Thoma, M. H.

    2017-07-01

    Machine learning is one of the most popular fields in computer science and has a vast number of applications. In this work we propose a method that uses a neural network to locally identify crystal structures in a mixed-phase Yukawa system consisting of fcc, hcp, and bcc clusters and disordered particles, similar to plasma crystals. We compare our approach to previously used methods and show that the quality of identification increases significantly. The technique works very well for highly disturbed lattices and offers a flexible and robust way to classify crystalline structures using only particle positions. This leads to insights into highly disturbed crystalline structures.

  4. Human movement activity classification approaches that use wearable sensors and mobile devices

    Science.gov (United States)

    Kaghyan, Sahak; Sarukhanyan, Hakob; Akopian, David

    2013-03-01

    Cell phones and other mobile devices have become part of human culture and change activity and lifestyle patterns. Mobile phone technology continuously evolves and incorporates more and more sensors for enabling advanced applications. The latest generations of smartphones incorporate GPS and WLAN location-finding modules, vision cameras, microphones, accelerometers, temperature sensors, etc. The availability of these sensors in mass-market communication devices creates exciting new opportunities for data mining applications. Healthcare applications exploiting built-in sensors are particularly promising. This paper reviews different approaches to human activity recognition.

  5. Urban land use and land cover classification using the neural-fuzzy inference approach with Formosat-2 data

    Science.gov (United States)

    Chen, Ho-Wen; Chang, Ni-Bin; Yu, Ruey-Fang; Huang, Yi-Wen

    2009-10-01

    This paper presents a neural-fuzzy inference approach to identify land use and land cover (LULC) patterns in large urban areas with the 8-meter resolution multi-spectral images collected by the Formosat-2 satellite. Texture and feature analyses support the retrieval of fuzzy rules in the context of data mining to discern the embedded LULC patterns via a neural-fuzzy inference approach. The case study for Taichung City in central Taiwan shows the application potential based on five LULC classes. With the aid of integrated fuzzy rules and a neural network model, the optimal weights associated with these achievable rules can be determined with phenomenological and theoretical implications. Through appropriate model training and validation stages with respect to a ground-truth data set, research findings clearly indicate that the proposed remote sensing technique can structure an improved screening and sequencing procedure when selecting rules for LULC classification. The method is not subject to the limitations of using broad spectral bands for category separation, such as the ability to reliably separate only a few (4-5) classes. This normalized difference vegetation index (NDVI)-based data mining technique has shown potential for LULC pattern recognition in different regions, and is not restricted to this sensor, location or date.
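
    The NDVI underlying the data-mining features is the per-pixel ratio (NIR - Red) / (NIR + Red); a minimal sketch with made-up reflectance values:

```python
import numpy as np

# Hypothetical near-infrared and red reflectance for a 2 x 2 pixel patch.
nir = np.array([[0.60, 0.50], [0.30, 0.05]])
red = np.array([[0.10, 0.10], [0.20, 0.04]])

# NDVI is bounded in [-1, 1]; dense vegetation pushes it toward +1.
ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 3))
```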

  6. Object-Based Classification as an Alternative Approach to the Traditional Pixel-Based Classification to Identify Potential Habitat of the Grasshopper Sparrow

    Science.gov (United States)

    Jobin, Benoît; Labrecque, Sandra; Grenier, Marcelle; Falardeau, Gilles

    2008-01-01

    The traditional method of identifying wildlife habitat distribution over large regions consists of pixel-based classification of satellite images into a suite of habitat classes used to select suitable habitat patches. Object-based classification is a newer method that can achieve the same objective, based on the segmentation of the spectral bands of the image into homogeneous polygons with regard to spatial or spectral characteristics. The segmentation algorithm does not rely solely on the single pixel value, but also on shape, texture, and pixel spatial continuity. Object-based classification is a knowledge-based process in which an interpretation key is developed using ground control points, and objects are assigned to specific classes according to threshold values of determined spectral and/or spatial attributes. We developed a model using the eCognition software to identify suitable habitats for the Grasshopper Sparrow, a rare and declining species found in southwestern Québec. The model was developed in a region with known breeding sites and applied to other images covering adjacent regions where potential breeding habitats may be present. We were successful in locating potential habitats in areas where dairy farming prevailed, but failed in an adjacent region covered by a distinct Landsat scene and dominated by annual crops. We discuss the added value of this method, such as the possibility of using the contextual information associated with objects and the ability to eliminate unsuitable areas in the segmentation and land cover classification processes, as well as technical and logistical constraints. A series of recommendations on the use of this method and on conservation issues of Grasshopper Sparrow habitat is also provided.

  7. Supervised pre-processing approaches in multiple class variables classification for fish recruitment forecasting

    KAUST Repository

    Fernandes, José Antonio

    2013-02-01

    A multi-species approach to fisheries management requires taking into account the interactions between species in order to improve recruitment forecasting of the fish species. Recent advances in Bayesian networks allow the learning of models with several interrelated variables to be forecasted simultaneously. These models are known as multi-dimensional Bayesian network classifiers (MDBNs). Pre-processing steps are critical for the posterior learning of the model in these kinds of domains. Therefore, in the present study, a set of 'state-of-the-art' uni-dimensional pre-processing methods, within the categories of missing data imputation, feature discretization and feature subset selection, are adapted to be used with MDBNs. A framework that includes the proposed multi-dimensional supervised pre-processing methods, coupled with an MDBN classifier, is tested with synthetic datasets and the real domain of fish recruitment forecasting. The rate of correctly forecasting three fish species (anchovy, sardine and hake) simultaneously is doubled (from 17.3% to 29.5%) using the multi-dimensional approach in comparison to mono-species models. The probability assessments also show a large improvement, reducing the average error (estimated by means of the Brier score) from 0.35 to 0.27. Finally, these differences are superior to those obtained when forecasting the species in pairs. © 2012 Elsevier Ltd.
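
    The Brier score used above to assess the probability forecasts is the mean squared difference between the predicted class probabilities and the one-hot outcomes. A minimal sketch with made-up forecasts for three recruitment classes:

```python
import numpy as np

# Multiclass Brier score: mean over cases of the squared probability error.
def brier_score(probs, outcomes):
    probs = np.asarray(probs, dtype=float)
    onehot = np.eye(probs.shape[1])[outcomes]
    return float(((probs - onehot) ** 2).sum(axis=1).mean())

# Four forecasts over three hypothetical recruitment classes (low/medium/high).
probs = [[0.7, 0.2, 0.1],
         [0.1, 0.8, 0.1],
         [0.3, 0.4, 0.3],
         [0.2, 0.2, 0.6]]
outcomes = [0, 1, 2, 2]
print(round(brier_score(probs, outcomes), 3))
```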

  8. Does the bathing water classification depend on sampling strategy? A bootstrap approach for bathing water quality assessment, according to Directive 2006/7/EC requirements.

    Science.gov (United States)

    López, Iago; Alvarez, César; Gil, José L; Revilla, José A

    2012-11-30

    Data on the 95th and 90th percentiles of bacteriological quality indicators are used to classify bathing waters in Europe, according to the requirements of Directive 2006/7/EC. However, percentile values and consequently, classification of bathing waters depend both on sampling effort and sample-size, which may undermine an appropriate assessment of bathing water classification. To analyse the influence of sampling effort and sample size on water classification, a bootstrap approach was applied to 55 bacteriological quality datasets of several beaches in the Balearic Islands (Spain). Our results show that the probability of failing the regulatory standards of the Directive is high when sample size is low, due to a higher variability in percentile values. In this way, 49% of the bathing waters reaching an "Excellent" classification (95th percentile of Escherichia coli under 250 cfu/100 ml) can fail the "Excellent" regulatory standard due to sampling strategy, when 23 samples per season are considered. This percentage increases to 81% when 4 samples per season are considered. "Good" regulatory standards can also be failed in bathing waters with an "Excellent" classification as a result of these sampling strategies. The variability in percentile values may affect bathing water classification and is critical for the appropriate design and implementation of bathing water Quality Monitoring and Assessment Programs. Hence, variability of percentile values should be taken into account by authorities if an adequate management of these areas is to be achieved. Copyright © 2012 Elsevier Ltd. All rights reserved.
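
    The core finding above (smaller seasonal sample sizes inflate the variability of the percentile estimate and hence the chance of spuriously failing a standard) can be sketched with a bootstrap over simulated counts. The lognormal parameters are illustrative assumptions; the log10 mean + 1.65 sd percentile formula follows Directive 2006/7/EC:

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated seasonal E. coli counts (cfu/100 ml) for a beach whose "true"
# 95th percentile sits below the 250 cfu/100 ml "Excellent" threshold.
season = rng.lognormal(mean=3.5, sigma=1.0, size=2000)

def p95(sample):
    logs = np.log10(sample)
    return 10 ** (logs.mean() + 1.65 * logs.std(ddof=1))  # Directive 2006/7/EC

def failure_rate(n_samples, limit=250.0, n_boot=2000):
    draws = rng.choice(season, size=(n_boot, n_samples))
    return float(np.mean([p95(d) > limit for d in draws]))

# Fewer samples per season -> more variable percentile estimate -> more
# spurious failures of the "Excellent" standard.
print(failure_rate(4), failure_rate(23))
```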

  9. Modeling Personalized Email Prioritization: Classification-based and Regression-based Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Yoo S.; Yang, Y.; Carbonell, J.

    2011-10-24

    Email overload, even after spam filtering, presents a serious productivity challenge for busy professionals and executives. One solution is automated prioritization of incoming emails to ensure the most important are read and processed quickly, while others are processed later as/if time permits, in declining priority levels. This paper presents a study of machine learning approaches to email prioritization into discrete levels, comparing ordinal regression versus classifier cascades. Given the ordinal nature of discrete email priority levels, SVM ordinal regression would be expected to perform well, but surprisingly a cascade of SVM classifiers significantly outperforms ordinal regression for email prioritization. In contrast, SVM regression performs well -- better than classifiers -- on selected UCI data sets. This unexpected performance inversion is analyzed and results are presented, providing core functionality for email prioritization systems.

  10. An ensemble-based approach for breast mass classification in mammography images

    Science.gov (United States)

    Ribeiro, Patricia B.; Papa, João. P.; Romero, Roseli A. F.

    2017-03-01

    Mammography analysis is an important tool that helps detect breast cancer at the very early stages of the disease, thus increasing the quality of life of hundreds of thousands of patients worldwide. In Computer-Aided Detection systems, the identification of mammograms with and without masses (without clinical findings) is highly needed to reduce the false positive rates in the automatic selection of regions of interest that may contain suspicious content. In this work, we introduce a variant of the Optimum-Path Forest (OPF) classifier for breast mass identification, and we employ an ensemble-based approach that can enhance the effectiveness of the individual classifiers for this purpose. The experimental results also comprise the naïve OPF and a traditional neural network, with the most accurate results obtained through the ensemble of classifiers, with an accuracy of nearly 86%.
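
    The OPF classifier is not available in common libraries, so the ensemble mechanism can be illustrated with stand-in classifiers combined by majority vote; the data and member classifiers below are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Binary labels stand in for "mass" vs. "no clinical findings".
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# Hard (majority) voting over three heterogeneous member classifiers.
ensemble = VotingClassifier([
    ("lr", LogisticRegression(max_iter=1000)),
    ("knn", KNeighborsClassifier()),
    ("tree", DecisionTreeClassifier(random_state=0)),
], voting="hard").fit(Xtr, ytr)

print(round(float(ensemble.score(Xte, yte)), 3))
```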

  11. Supervised Learning Approach for Spam Classification Analysis using Data Mining Tools

    Directory of Open Access Journals (Sweden)

    R.Deepa Lakshmi

    2010-12-01

    Full Text Available E-mail is one of the most popular and frequently used ways of communication due to its worldwide accessibility, relatively fast message transfer, and low sending cost. The flaws in the e-mail protocols and the increasing amount of electronic business and financial transactions directly contribute to the increase in e-mail-based threats. Email spam is one of the major problems of today's Internet, bringing financial damage to companies and annoying individual users. Among the approaches developed to stop spam, filtering is one of the most important techniques. Much research in spam filtering has centered on the more sophisticated classifier-related issues. In recent years, machine learning for spam classification has become an important research issue. This paper explores and identifies the use of different learning algorithms for classifying spam messages from e-mail. A comparative analysis among the algorithms has also been presented.

  13. Detection of Dispersed Radio Pulses: A machine learning approach to candidate identification and classification

    CERN Document Server

    Devine, Thomas; McLaughlin, Maura

    2016-01-01

    Searching for extraterrestrial, transient signals in astronomical data sets is an active area of current research. However, machine learning techniques are lacking in the literature concerning single-pulse detection. This paper presents a new, two-stage approach for identifying and classifying dispersed pulse groups (DPGs) in single-pulse search output. The first stage identified DPGs and extracted features to characterize them using a new peak identification algorithm which tracks sloping tendencies around local maxima in plots of signal-to-noise ratio vs. dispersion measure. The second stage used supervised machine learning to classify DPGs. We created four benchmark data sets: one unbalanced and three balanced versions using three different imbalance treatments. We empirically evaluated 48 classifiers by training and testing binary and multiclass versions of six machine learning algorithms on each of the four benchmark versions. While each classifier had advantages and disadvantages, all classifiers with im...
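The first-stage peak identification the abstract sketches — tracking sloping tendencies around local maxima in a signal-to-noise vs. dispersion-measure curve — can be approximated in a few lines. The curve and threshold below are invented toy values; the published algorithm is more elaborate:

```python
def find_peaks(snr, min_snr=5.0):
    """Locate local maxima in an S/N-vs-DM curve and grow each peak
    outward while the curve keeps sloping downward, a rough analogue
    of tracking 'sloping tendencies' around a maximum.
    Returns (left edge, peak index, right edge) triples."""
    peaks = []
    for i in range(1, len(snr) - 1):
        if snr[i] >= min_snr and snr[i - 1] < snr[i] >= snr[i + 1]:
            lo = i
            while lo > 0 and snr[lo - 1] < snr[lo]:   # walk down the left slope
                lo -= 1
            hi = i
            while hi < len(snr) - 1 and snr[hi + 1] < snr[hi]:  # right slope
                hi += 1
            peaks.append((lo, i, hi))
    return peaks

snr = [1, 2, 4, 8, 6, 3, 2, 5, 9, 7, 4, 1]   # hypothetical S/N per DM trial
print(find_peaks(snr))                       # two dispersed pulse groups
```

Each returned triple delimits one candidate DPG whose extent and shape would then feed the second-stage classifier.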

  14. Pattern Recognition and Classification of Fatal Traffic Accidents in Israel: A Neural Network Approach

    DEFF Research Database (Denmark)

    Prato, Carlo Giacomo; Gitelman, Victoria; Bekhor, Shlomo

    2011-01-01

    on 1,793 fatal traffic accidents that occurred between 2003 and 2006 and applies Kohonen and feed-forward back-propagation neural networks with the objective of extracting typical patterns and relevant factors from the data. Kohonen neural networks reveal five compelling accident patterns.... Feed-forward back-propagation neural networks indicate that sociodemographic characteristics of drivers and victims, accident location, and period of the day are extremely relevant factors. Accident patterns suggest that countermeasures are necessary for identified problems concerning mainly vulnerable...... road users such as pedestrians, cyclists, motorcyclists and young drivers. A “safe-system” integrating a system approach for the design of countermeasures and a monitoring process of performance indicators might address the priorities highlighted by the neural networks....

  15. Classification of Sunflower Oil Blends Stabilized by Oleoresin Rosemary (Rosmarinus officinalis L.) Using Multivariate Kinetic Approach.

    Science.gov (United States)

    Upadhyay, Rohit; Mishra, Hari Niwas

    2015-08-01

    The sunflower oil-oleoresin rosemary (Rosmarinus officinalis L.) blends (SORB) at 9 different concentrations (200 to 2000 mg/kg), sunflower oil-tertiary butyl hydroquinone (SOTBHQ) at 200 mg/kg, and a control without preservatives (SOcontrol) were oxidized using a Rancimat (temperature: 100 to 130 °C; airflow rate: 20 L/h). The oxidative stability of the blends was expressed using the induction period (IP), oil stability index, and a photochemiluminescence assay. Linear regression models were generated by plotting ln IP against temperature to estimate the shelf life at 20 °C (SL20; R² > 0.90). Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were used to classify the oil blends according to their oxidative stability and kinetic parameters. The Arrhenius equation adequately described the temperature-dependent kinetics (R² > 0.90, P < 0.05), and kinetic parameters, viz. activation energies, activation enthalpies, and entropies, were calculated in the ranges of 92.07 to 100.50 kJ/mol, 88.85 to 97.28 kJ/mol, and -33.33 to -1.13 J/mol K, respectively. Using PCA, a satisfactory discrimination was noted among the SORB, SOTBHQ, and SOcontrol samples. HCA classified the oil blends into 3 different clusters (I, II, and III), where SORB1200 and SORB1500 were grouped in close proximity with SOTBHQ, indicating comparable oxidative stability. The SL20 was estimated to be 3790, 6974, and 4179 h for SOcontrol, SOTBHQ, and SORB1500, respectively. The multivariate kinetic approach effectively screened SORB1500 as the best blend, conferring the highest oxidative stability to sunflower oil. This approach can be adopted for quick and reliable estimation of the oxidative stability of oil samples.
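The shelf-life extrapolation step (linear regression of ln IP on temperature, then evaluation at 20 °C) can be reproduced with ordinary least squares. The induction periods below are hypothetical illustrative values, not the paper's measurements:

```python
import math

def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# hypothetical Rancimat induction periods (hours) at four temperatures (deg C)
temps = [100, 110, 120, 130]
ip = [12.0, 6.1, 3.0, 1.5]

a, b = fit_line(temps, [math.log(v) for v in ip])
sl20 = math.exp(a + b * 20)   # extrapolated shelf life at 20 deg C, in hours
print(round(sl20))
```

The slope `b` is negative (IP shortens as temperature rises), and the intercept-plus-slope evaluation at 20 °C yields the SL20 estimate the abstract reports in hours.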

  16. Logical-Rule Models of Classification Response Times: A Synthesis of Mental-Architecture, Random-Walk, and Decision-Bound Approaches

    Science.gov (United States)

    Fific, Mario; Little, Daniel R.; Nosofsky, Robert M.

    2010-01-01

    We formalize and provide tests of a set of logical-rule models for predicting perceptual classification response times (RTs) and choice probabilities. The models are developed by synthesizing mental-architecture, random-walk, and decision-bound approaches. According to the models, people make independent decisions about the locations of stimuli…

  17. Influence of intra-event-based flood regime on sediment flow behavior from a typical agro-catchment of the Chinese Loess Plateau

    Science.gov (United States)

    Zhang, Le-Tao; Li, Zhan-Bin; Wang, He; Xiao, Jun-Bo

    2016-07-01

    The pluvial erosion process is significantly affected by spatio-temporal patterns of flood flows. However, despite their importance, only a few studies have investigated the sediment flow behavior driven by different flood regimes. This study aims to investigate the effect of intra-event-based flood regimes on the dynamics of sediment exports at Tuanshangou catchment, a typical unmanaged agricultural catchment in the hilly loess region of the Chinese Loess Plateau. Measurements of 193 flood events and 158 sediment-producing events were collected at Tuanshangou station between 1961 and 1969. A combination of hierarchical clustering, discriminant analysis, and one-way ANOVA was used to classify the flood events in terms of their event-based characteristics, including flood duration, peak discharge, and event flood runoff depth. The 193 flood events were classified into five regimes, and the mean statistical features of each regime differed significantly. Regime A includes flood events with the shortest duration (76 min), minimum flood crest (0.045 m³ s⁻¹), least runoff depth (0.2 mm), and highest frequency. Regime B includes flood events with a medium duration (274 min), medium flood crest (0.206 m³ s⁻¹), and minor runoff depth (0.7 mm). Regime C includes flood events with the longest duration (822 min), medium flood crest (0.236 m³ s⁻¹), and medium runoff depth (1.7 mm). Regime D includes flood events with a medium duration (239 min), large flood crest (4.21 m³ s⁻¹), and large runoff depth (10 mm). Regime E includes flood events with a medium duration (304 min), maximum flood crest (8.62 m³ s⁻¹), and largest runoff depth (25.9 mm). The sediment yield of the different flood regimes is ranked as follows: Regime E > Regime D > Regime B > Regime C > Regime A. In terms of event-based average and maximum suspended sediment concentration, the regimes are ordered as follows: Regime E > Regime D > Regime C > Regime B > Regime A. Regimes D and E
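The event-classification step can be sketched with standardized features and single-linkage agglomerative clustering (one common form of the hierarchical clustering the study names; the abstract does not state the exact linkage used). The six events below are invented to mimic three distinct regimes:

```python
def standardize(rows):
    """Z-score each column so duration, discharge, and depth are comparable."""
    out = []
    for c in zip(*rows):
        m = sum(c) / len(c)
        s = (sum((v - m) ** 2 for v in c) / len(c)) ** 0.5
        out.append([(v - m) / s for v in c])
    return [list(r) for r in zip(*out)]

def agglomerate(points, k):
    """Single-linkage agglomerative clustering down to k clusters."""
    clusters = [[i] for i in range(len(points))]
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(points[a], points[b]))
    while len(clusters) > k:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: min(dist(a, b)
                                      for a in clusters[ij[0]]
                                      for b in clusters[ij[1]]))
        clusters[i] += clusters.pop(j)   # merge the closest pair
    return clusters

# hypothetical events: (duration min, peak discharge m3/s, runoff depth mm)
events = [(80, 0.05, 0.2), (90, 0.06, 0.3),     # short, small floods
          (280, 0.20, 0.8), (300, 0.25, 0.9),   # medium floods
          (290, 8.50, 25.0), (310, 8.70, 26.5)] # extreme floods
clusters = agglomerate(standardize(events), 3)
print(clusters)
```

The study then used discriminant analysis and one-way ANOVA to validate that the resulting groups differ significantly, a step omitted here.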

  18. Improving Situational Awareness for Precursory Data Classification using Attribute Rough Set Reduction Approach

    Directory of Open Access Journals (Sweden)

    Pushan Kumar Dutta

    2013-11-01

    Full Text Available The task of modeling the distribution of a large number of earthquake events, with frequent tremors detected prior to a main shock, presents unique challenges: a robust classifier tool is needed so that rapid responses can reach victims. We designed a relational database application for geophysical modeling that connects records of all foreshock event clusters (1998-2010) from a complete seismicity catalog for the Himalayan basin (Nath et al., 2010). This paper develops a reduced rough set analysis method and implements this novel structure and reasoning process for foreshock cluster forecasting. In this study, we developed a reusable information technology infrastructure, called Efficient Machine Readable for Emergency Text Selection (EMRETS). First, the association and importance of precursory information in reference to earthquake rupture analysis is found through attribute reduction based on rough set analysis. Second, finding the importance of attributes through information entropy is a novel approach for the high-dimensional complex polynomial problems predominant in geophysical research and prospecting. Third, we discuss the reducible indiscernibility matrix and decision rule generation for a particular set of geographical coordinates, leading to the spatial discovery of future earthquakes having prior foreshocks. This paper proposes a framework for extracting, classifying, analyzing, and presenting semi-structured catalog data sources through feature representation and selection.
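The entropy-based attribute-importance idea mentioned in the abstract amounts to computing the information gain of each attribute of a decision table. The four-row table below is an invented toy example, not the EMRETS catalog:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of decision labels, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy drop from splitting the decision table on one attribute."""
    groups = {}
    for r, y in zip(rows, labels):
        groups.setdefault(r[attr], []).append(y)
    return entropy(labels) - sum(len(g) / len(labels) * entropy(g)
                                 for g in groups.values())

# toy decision table: (foreshock activity, depth class) -> main shock followed?
rows = [("many", "shallow"), ("many", "deep"), ("few", "shallow"), ("few", "deep")]
labels = ["yes", "yes", "no", "no"]
print(information_gain(rows, labels, 0))  # foreshock activity: fully informative
print(information_gain(rows, labels, 1))  # depth class: uninformative
```

Attributes with near-zero gain are candidates for removal in a rough-set-style attribute reduction.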

  19. Novel approach to predict the azeotropy at any pressure using classification by subgroups

    Directory of Open Access Journals (Sweden)

    Taehyung Kim

    2012-11-01

    Full Text Available Distillation is one of the dominating separation processes, but in some cases inseparable mixtures are formed. This phenomenon is called azeotropy. It is essential to understand azeotropy in any distillation process, since azeotropes, i.e. inseparable mixtures, cannot be separated by ordinary distillation. In this study, to construct a model which predicts azeotrope formation at any pressure, a novel approach using a support vector machine (SVM) is presented. The SVM method is used to classify data into two classes, azeotropes and non-azeotropes. Thirteen variables, including pressure, were used as explanatory variables in this model. Comparing the SVM models constructed with data measured at 1 atm and with data measured at all pressures, the 1 atm model showed a higher prediction performance on the data measured at 1 atm than the all-pressure model. Thus, to improve the performance of the all-pressure model, we focused on the intermolecular forces of solvents: SVM models were constructed using only the data of solvents having the same subgroups. The accuracy of the model increased, and it is expected that the proposed method can be used to predict azeotrope formation at any pressure with high accuracy.
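The two-class separation the SVM performs can be illustrated in miniature with the simplest linear classifier — a perceptron standing in for the soft-margin SVM — on two invented descriptors per solvent pair (both features and labels are hypothetical):

```python
def train_perceptron(xs, ys, epochs=20):
    """Rosenblatt perceptron: mistake-driven updates to a linear boundary.
    Labels must be +1 / -1; converges on linearly separable data."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b

# hypothetical descriptors: (boiling-point similarity, polarity difference)
xs = [(0.1, 0.9), (0.2, 0.8), (0.15, 0.85),   # azeotrope-forming pairs
      (0.9, 0.1), (0.8, 0.2), (0.85, 0.15)]   # non-azeotropic pairs
ys = [1, 1, 1, -1, -1, -1]                    # +1 azeotrope, -1 non-azeotrope
w, b = train_perceptron(xs, ys)
pred = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
print(pred((0.12, 0.88)), pred((0.88, 0.12)))
```

A real SVM additionally maximizes the margin and, with kernels, handles the non-linear boundaries the 13-variable model likely needs.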

  20. An Approach to the Classification of Cutting Vibration on Machine Tools

    Directory of Open Access Journals (Sweden)

    Jeng-Fung Chen

    2016-02-01

    Full Text Available Predictions of cutting vibrations are necessary for improving the operational efficiency, product quality, and safety of the machining process, since vibration is the main factor leading to machine faults. Cutting vibration may be caused by setting incorrect parameters before machining is commenced and may affect the precision of the machined work piece. This raises the need for an effective model that can be used to predict cutting vibrations. In this study, an artificial neural network (ANN) model to forecast and classify the cutting vibration of an intelligent machine tool is presented. The factors that may cause cutting vibrations are first identified and a dataset for the research purpose is constructed. Then, the applicability of the model is illustrated. Based on the results of the comparative analysis, the artificial neural network approach performed better than the others. Because the vibration can be forecasted and classified, product quality can be managed. This work may help new workers avoid operating machine tools incorrectly, and hence can decrease manufacturing costs. It is expected that this study can enhance the performance of machine tools in metalworking sectors.
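A single sigmoid neuron, the building block of the ANN the paper presents, already shows the forecast-and-classify mechanics; the features, labels, and learning rate below are all hypothetical toy values:

```python
import math

def train_neuron(xs, ys, lr=0.5, epochs=2000):
    """One sigmoid neuron trained by gradient descent on cross-entropy loss
    (equivalent to logistic regression; a full ANN stacks many of these)."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1 / (1 + math.exp(-z))       # predicted vibration probability
            g = p - y                        # gradient of the loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

# hypothetical normalized features: (spindle speed, depth of cut) -> 1 = vibration
xs = [(0.9, 0.8), (0.8, 0.9), (0.85, 0.95),
      (0.1, 0.2), (0.2, 0.1), (0.15, 0.05)]
ys = [1, 1, 1, 0, 0, 0]
w, b = train_neuron(xs, ys)
prob = lambda x: 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
print(prob((0.9, 0.9)) > 0.5, prob((0.1, 0.1)) > 0.5)
```

Thresholding the output probability at 0.5 turns the forecast into the vibration / no-vibration classification the paper describes.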

  1. A connectionist-geostatistical approach for classification of deformation types in ice surfaces

    Science.gov (United States)

    Goetz-Weiss, L. R.; Herzfeld, U. C.; Hale, R. G.; Hunke, E. C.; Bobeck, J.

    2014-12-01

    Deformation is a class of highly non-linear geophysical processes from which one can infer other geophysical variables in a dynamical system. For example, in an ice-dynamic model, deformation is related to velocity, basal sliding, surface elevation changes, and the stress field at the surface as well as internal to a glacier. While many of these variables cannot be observed, deformation state can be an observable variable, because deformation in glaciers (once a viscosity threshold is exceeded) manifests itself in crevasses. Given the amount of information that can be inferred from observing surface deformation, an automated method for classifying surface imagery becomes increasingly desirable. In this paper a neural network is used to recognize classes of crevasse types over the Bering Bagley Glacier System (BBGS) during a surge (2011-2013-?). A surge is a spatially and temporally highly variable and rapid acceleration of the glacier. Therefore, many different crevasse types occur in a short time frame and in close proximity, and these crevasse fields hold information on the geophysical processes of the surge. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the neural network can recognize. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect their appearance in imagery, we have developed semi-automated pre-training software to adapt the neural net to changing conditions. The method is applied to airborne and satellite imagery to classify surge crevasses from the BBGS surge. This method works well for classifying spatially repetitive images such as the crevasses over Bering Glacier. We expand the network for less repetitive images in order to analyze imagery collected over the Arctic sea ice, to assess the percentage of deformed ice for model calibration.
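The directional experimental variogram used to parameterize the imagery can be computed directly from a pixel patch: the semivariance γ(h) is half the mean squared difference between pixels separated by lag h along a chosen direction. The striped toy patch below (crevasse-like banding, invented values) shows how the statistic distinguishes directions:

```python
def directional_variogram(grid, lag, direction):
    """Experimental semivariogram of a 2-D image for one lag along one
    (row, col) direction: gamma(h) = mean squared difference / 2."""
    dr, dc = direction
    diffs = []
    for r in range(len(grid)):
        for c in range(len(grid[0])):
            r2, c2 = r + dr * lag, c + dc * lag
            if 0 <= r2 < len(grid) and 0 <= c2 < len(grid[0]):
                diffs.append((grid[r2][c2] - grid[r][c]) ** 2)
    return sum(diffs) / (2 * len(diffs))

# hypothetical brightness patch with vertical banding (crevasse-like stripes)
patch = [[0, 1, 0, 1],
         [0, 1, 0, 1],
         [0, 1, 0, 1],
         [0, 1, 0, 1]]
print(directional_variogram(patch, 1, (0, 1)))  # across the stripes: high
print(directional_variogram(patch, 1, (1, 0)))  # along the stripes: zero
```

Evaluating γ over several lags and directions yields the feature vector that the neural network classifies in place of raw pixels.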

  2. Identifying systemic regime shifts in economic processes: a procedure based on the system dynamics approach

    Directory of Open Access Journals (Sweden)

    Newton Paulo Bueno

    2013-04-01

    Full Text Available This paper argues that the sophisticated techniques presently used by economists to forecast macroeconomic variables (non-structural forecasting methods in general, and regime-switching models in particular) do not seem very effective at anticipating radical regime shifts such as the one that recently hit the world economy. Thus, in order to improve their accuracy, these techniques should be complemented by more holistic approaches. The general purpose of this paper is to show that the system dynamics methodology, which allows identifying the critical feedback loops that drive a complex system's dynamics, seems especially well fitted to be one of those complementary approaches. Specifically, we present a systemic algorithm for identifying regime shift processes such as the ones that take place when an economy, after years of continued expansion, suffers the effects of a financial bubble burst.

  3. Identification of area-level influences on regions of high cancer incidence in Queensland, Australia: a classification tree approach

    Directory of Open Access Journals (Sweden)

    Mengersen Kerrie L

    2011-07-01

    Full Text Available Abstract Background Strategies for cancer reduction and management are targeted at both individual and area levels. Area-level strategies require careful understanding of geographic differences in cancer incidence, in particular the association with factors such as socioeconomic status, ethnicity and accessibility. This study aimed to identify the complex interplay of area-level factors associated with high area-specific incidence of Australian priority cancers using a classification and regression tree (CART) approach. Methods Area-specific smoothed standardised incidence ratios were estimated for priority-area cancers across 478 statistical local areas in Queensland, Australia (1998-2007, n = 186,075). For those cancers with significant spatial variation, CART models were used to identify whether area-level accessibility, socioeconomic status and ethnicity were associated with high area-specific incidence. Results The accessibility of a person's residence had the most consistent association with the risk of cancer diagnosis across the specific cancers. Many cancers were likely to have high incidence in more urban areas, although male lung cancer and cervical cancer tended to have high incidence in more remote areas. The impact of socioeconomic status and ethnicity on these associations differed by type of cancer. Conclusions These results highlight the complex interactions between accessibility, socioeconomic status and ethnicity in determining cancer incidence risk.
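The core of a CART model is the recursive choice of the split that minimizes weighted Gini impurity; a single split step on invented area-level data might look like this (features, thresholds, and labels are all hypothetical):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(rows, labels):
    """One CART step: the (feature, threshold) pair that minimizes the
    weighted Gini impurity of the two resulting child nodes."""
    best = (None, None, float("inf"))
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            left = [y for r, y in zip(rows, labels) if r[f] <= t]
            right = [y for r, y in zip(rows, labels) if r[f] > t]
            if not left or not right:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
            if score < best[2]:
                best = (f, t, score)
    return best

# hypothetical areas: (accessibility index, socioeconomic index) -> incidence class
rows = [(0.9, 0.3), (0.8, 0.7), (0.7, 0.5), (0.2, 0.6), (0.3, 0.2), (0.1, 0.8)]
labels = ["high", "high", "high", "low", "low", "low"]
print(best_split(rows, labels))   # accessibility splits the classes perfectly
```

A full CART recurses on each child node until a purity or size criterion is met, producing the interpretable rule tree the study relies on.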

  4. Estimation of source location and ground impedance using a hybrid multiple signal classification and Levenberg-Marquardt approach

    Science.gov (United States)

    Tam, Kai-Chung; Lau, Siu-Kit; Tang, Shiu-Keung

    2016-07-01

    A microphone array signal processing method for locating a stationary point source over a locally reactive ground and for estimating ground impedance is examined in detail in the present study. A non-linear least squares approach using the Levenberg-Marquardt method is proposed to overcome the problem of unknown ground impedance. The multiple signal classification method (MUSIC) is used to give the initial estimate of the source location, while the technique of forward-backward spatial smoothing is adopted as a pre-processor for the source localization to minimize the effects of source coherence. The accuracy and robustness of the proposed signal processing method are examined. Results show that source localization in the horizontal direction by MUSIC is satisfactory. However, source coherence drastically reduces the accuracy of estimating the source height. The further application of the Levenberg-Marquardt method, with the results from MUSIC as the initial inputs, significantly improves the accuracy of source height estimation. The proposed method provides effective and robust estimation of the ground surface impedance.
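The MUSIC step can be sketched for the simplest case — one narrowband source on a uniform linear array — by projecting steering vectors onto the noise subspace of the array covariance matrix. The array geometry, noise level, and source angle below are assumptions for illustration:

```python
import numpy as np

def music_spectrum(R, n_sources, angles_deg, d=0.5):
    """MUSIC pseudospectrum for an m-sensor uniform linear array
    (element spacing d in wavelengths). Peaks mark source directions."""
    m = R.shape[0]
    _, vecs = np.linalg.eigh(R)             # eigenvalues in ascending order
    En = vecs[:, : m - n_sources]           # noise-subspace eigenvectors
    spec = []
    for th in angles_deg:
        a = np.exp(2j * np.pi * d * np.arange(m) * np.sin(np.radians(th)))
        spec.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(spec)

m, true_angle = 8, 20
a = np.exp(2j * np.pi * 0.5 * np.arange(m) * np.sin(np.radians(true_angle)))
R = np.outer(a, a.conj()) + 0.01 * np.eye(m)   # one source plus white noise
angles = np.arange(-90, 91)
spec = music_spectrum(R, 1, angles)
print(angles[int(np.argmax(spec))])            # recovered source direction
```

In the paper's setting the MUSIC estimate seeds the Levenberg-Marquardt refinement, which jointly fits source height and the unknown ground impedance.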

  5. A GHS-consistent approach to health hazard classification of petroleum substances, a class of UVCB substances.

    Science.gov (United States)

    Clark, Charles R; McKee, Richard H; Freeman, James J; Swick, Derek; Mahagaokar, Suneeta; Pigram, Glenda; Roberts, Linda G; Smulders, Chantal J; Beatty, Patrick W

    2013-12-01

    The process streams refined from petroleum crude oil for use in petroleum products are among those designated by USEPA as UVCB substances (unknown or variable composition, complex reaction products and biological materials). They are identified on global chemical inventories with unique Chemical Abstract Services (CAS) numbers and names. The chemical complexity of most petroleum substances presents challenges when evaluating their hazards and can result in differing evaluations due to the varying level of hazardous constituents and differences in national chemical control regulations. Global efforts to harmonize the identification of chemical hazards are aimed at promoting the use of consistent hazard evaluation criteria. This paper discusses a systematic approach for the health hazard evaluation of petroleum substances using chemical categories and the United Nations (UN) Globally Harmonized System (GHS) of classification and labeling. Also described are historical efforts to characterize the hazard of these substances and how they led to the development of categories, the identification of potentially hazardous constituents which should be considered, and a summary of the toxicology of the major petroleum product groups. The use of these categories can increase the utility of existing data, provide better informed hazard evaluations, and reduce the amount of animal testing required.

  6. A Bag of Concepts Approach for Biomedical Document Classification Using Wikipedia Knowledge. Spanish-English Cross-language Case Study.

    Science.gov (United States)

    Mouriño-García, Marcos A; Pérez-Rodríguez, Roberto; Anido-Rifón, Luis E

    2017-08-16

    The ability to efficiently review the existing literature is essential for the rapid progress of research. This paper describes a classifier of text documents, represented as vectors in spaces of Wikipedia concepts, and analyses its suitability for classification of Spanish biomedical documents when only English documents are available for training. We propose the cross-language concept matching (CLCM) technique, which relies on Wikipedia interlanguage links to convert concept vectors from the Spanish to the English space. The performance of the classifier is compared to several baselines: a classifier based on machine translation, a classifier that represents documents after performing Explicit Semantic Analysis (ESA), and a classifier that uses a domain-specific semantic annotator (MetaMap). The corpus used for the experiments (Cross-Language UVigoMED) was purpose-built for this study, and it is composed of 12,832 English and 2,184 Spanish MEDLINE abstracts. The performance of our approach is superior to any other state-of-the-art classifier in the benchmark, with performance increases of up to 124% over classical machine translation, 332% over MetaMap, and 60 times over the classifier based on ESA. The results have statistical significance, showing p-values < 0.0001. Using knowledge mined from Wikipedia to represent documents as vectors in a space of Wikipedia concepts and translating vectors between language-specific concept spaces, a cross-language classifier can be built, and it performs better than several state-of-the-art classifiers.
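The CLCM idea — re-indexing a concept vector from the Spanish concept space into the English one through interlanguage links — reduces to a dictionary remap; the links, concept names, and weights below are invented for illustration:

```python
def clcm(vector_es, interlanguage):
    """Map a bag-of-concepts vector from the Spanish concept space to the
    English one via (hypothetical) Wikipedia interlanguage links; concepts
    without a link are dropped, as no English counterpart exists."""
    return {interlanguage[c]: w for c, w in vector_es.items() if c in interlanguage}

# hypothetical interlanguage link table and Spanish document vector
links = {"Cáncer": "Cancer", "Ovario": "Ovary", "Biopsia": "Biopsy"}
doc_es = {"Cáncer": 0.6, "Ovario": 0.3, "Quimioterapia": 0.1}
print(clcm(doc_es, links))
```

After the remap, the Spanish document lives in the same English concept space as the training corpus, so a classifier trained only on English documents can score it directly.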

  7. An Integrated Spatial and Spectral Approach to the Classification of Mediterranean Land Cover Types: the SSC Method.

    NARCIS (Netherlands)

    Jong, de S.M.; Hornstra, T.; Maas, H.G.

    2001-01-01

    Classification of remotely sensed images is often based on assigning classes on a pixel-by-pixel basis. Such a classification often ignores useful reflectance information in neighbouring pixels. Open types of natural land cover such as maquis and garrigue ecosystems as found in the Mediterranean reg

  8. Structured classification for ED presenting complaints – from free text field-based approach to ICPC-2 ED application

    Directory of Open Access Journals (Sweden)

    Malmström Tomi

    2012-11-01

    Full Text Available Abstract Background Although there is a major need to record and analyse presenting complaints in emergency departments (EDs), no international standard exists. The aim of the present study was to produce a structured complaint classification suitable for ED use and to implement it in practice. The structured classification evolved from a study of free-text fields and the ICPC-2 classification. Methods Presenting complaints in a free-text field of ED admissions during a one-year period (n=40,610) were analyzed and summarized into 70 presenting complaint groups. The results were compared to ICPC-2-based complaints collected in another ED. An expert panel reviewed the results and produced an ED application of the ICPC-2 classification. This study implemented the new classification in an ED. Results The presenting complaints summarized from free-text fields and those from ICPC-2 categories were remarkably similar. However, the ICPC-2 classification was too broad for the ED; an adapted version was needed. The newly developed classification includes 89 presenting complaints, and ED staff found it easy to use. Conclusions The ICPC-2 classification can be adapted for ED use. The authors suggest a list of 89 presenting complaints for use with adult patients in EDs.

  9. Compulsivity and Impulsivity in Pathological Gambling: Does a Dimensional-Transdiagnostic Approach Add Clinical Utility to DSM-5 Classification?

    Science.gov (United States)

    Bottesi, Gioia; Ghisi, Marta; Ouimet, Allison J; Tira, Michael D; Sanavio, Ezio

    2015-09-01

    Although the phenomenology of Pathological Gambling (PG) is clearly characterized by impulsive features, some of the Diagnostic and Statistical Manual of Mental Disorder (DSM-5) criteria for PG are similar to those of Obsessive Compulsive Disorder (OCD). Therefore, the compulsive-impulsive spectrum model may be a better (or complementary) fit with PG phenomenology. The present exploratory research was designed to further investigate the compulsive and impulsive features characterizing PG, by comparing PG individuals, alcohol dependents (ADs), OCD patients, and healthy controls (HCs) on both self-report and cognitive measures of compulsivity and impulsivity. A better understanding of the shared psychological and cognitive mechanisms underlying differently categorized compulsive and impulsive disorders may significantly impact on both clinical assessment and treatment strategies for PG patients. With respect to self-report measures, PG individuals reported more compulsive and impulsive features than did HCs. As regards motor inhibition ability indices, PG individuals and HCs performed similarly on the Go/No-go task and better than AD individuals and OCD patients. Results from the Iowa Gambling Task highlighted that PG, AD, and OCD participants performed worse than did HCs. An in-depth analysis of each group's learning profile revealed similar patterns of impairment between PG and AD individuals in decision-making processes. Current findings support the utility of adopting a dimensional-transdiagnostic approach to complement the DSM-5 classification when working with PG individuals in clinical practice. Indeed, clinicians are encouraged to assess both compulsivity and impulsivity to provide individualized case conceptualizations and treatment plans focusing on the specific phenomenological features characterizing each PG patient.

  10. Land cover classification with an expert system approach using Landsat ETM imagery: a case study of Trabzon.

    Science.gov (United States)

    Kahya, Oguzhan; Bayram, Bulent; Reis, Selcuk

    2010-01-01

    The main objective of this study is to generate a knowledge base which is composed of user-defined variables and included raster imagery, vector coverage, spatial models, external programs, and simple scalars and to develop an expert classification using Landsat 7 (ETM+) imagery for land cover classification in a part of Trabzon city. Expert systems allow for the integration of remote-sensed data with other sources of geo-referenced information such as land use data, spatial texture, and digital elevation model to obtain greater classification accuracy. Logical decision rules are used with the various datasets to assign class values for each pixel. Expert system is very suitable for the work of image interpretation as a powerful means of information integration. Landsat ETM data acquired in the year 2000 were initially classified into seven classes for land cover using a maximum likelihood decision rule. An expert system was constructed to perform post-classification sorting of the initial land cover classification using additional spatial datasets such as land use data. The overall accuracy of expert classification was 95.80%. Individual class accuracy ranged from 75% to 100% for each class.

  11. Description and Evaluation Approach for Uncertainty of RS Image Classification

    Institute of Scientific and Technical Information of China (English)

    兰泽英; 刘艳芳; 唐祥云; 刘洋

    2009-01-01

    With the development of research on the classification quality of remote sensing images, researchers have come to regard uncertainty as the main factor that influences classification quality. This study puts forward an approach to uncertainty representation, developed from two aspects: formalized description and comprehensive evaluation. First, we complete the classification using a fuzzy supervised approach, taking it as a formalized description of classification uncertainty. Then we introduce a hybrid entropy model for classification uncertainty evaluation, which can meet the requirement of comprehensively reflecting several uncertainties, while constructing the evaluation index at the pixel scale with full consideration of each pixel's different contribution to the error rate. Finally, an application example is studied to examine the new method. The result shows that the evaluation results fully reflect the classification quality, compared with the conventional evaluation method, which constructs models from unitary uncertainty at the category scale.
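A pixel-scale uncertainty index of the kind the hybrid entropy model builds on can be illustrated with the Shannon entropy of a pixel's fuzzy class-membership vector (the hybrid model itself combines further terms; the membership values below are invented):

```python
import math

def pixel_entropy(memberships):
    """Shannon entropy (bits) of one pixel's fuzzy class-membership vector:
    0 for a certain assignment, log2(k) when all k classes are equally likely."""
    return -sum(p * math.log2(p) for p in memberships if p > 0)

# hypothetical membership vectors from a fuzzy classification of two pixels
confident = [0.97, 0.02, 0.01]   # clearly one land-cover class
ambiguous = [0.40, 0.35, 0.25]   # three classes nearly tied
print(round(pixel_entropy(confident), 3), round(pixel_entropy(ambiguous), 3))
```

Aggregating such per-pixel entropies over an image yields an uncertainty map, which is the pixel-scale evaluation index the abstract contrasts with category-scale methods.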

  12. Characterization of post-traumatic stress disorder using resting-state fMRI with a multi-level parametric classification approach.

    Science.gov (United States)

    Liu, Feng; Xie, Bing; Wang, Yifeng; Guo, Wenbin; Fouche, Jean-Paul; Long, Zhiliang; Wang, Wenqin; Chen, Heng; Li, Meiling; Duan, Xujun; Zhang, Jiang; Qiu, Mingguo; Chen, Huafu

    2015-03-01

    Functional neuroimaging studies have found intra-regional activity and inter-regional connectivity alterations in patients with post-traumatic stress disorder (PTSD). However, the results of these studies are based on group-level statistics and therefore it is unclear whether PTSD can be discriminated at the single-subject level, for instance using a machine learning approach. Here, we proposed a novel framework to identify PTSD using multi-level measures derived from resting-state functional MRI (fMRI). Specifically, three levels of measures were extracted as classification features: (1) regional amplitude of low-frequency fluctuations (univariate feature), which represents local spontaneous synchronous neural activity; (2) temporal functional connectivity (bivariate feature), which represents the extent of similarity of local activity between two regions; and (3) spatial functional connectivity (multivariate feature), which represents the extent of similarity of temporal correlation maps between two regions. Our method was evaluated on 20 PTSD patients and 20 demographically matched healthy controls. The experimental results showed that the features of each level could successfully discriminate PTSD patients from healthy controls. Furthermore, the combination of multi-level features using multi-kernel learning can further improve the classification performance. Specifically, the classification accuracy obtained by the proposed framework was 92.5%, an increase of at least 5% and 17.5% over the two-level and single-level feature-based methods, respectively. Particularly, the limbic structure and prefrontal cortex provided the most discriminant features for classification, consistent with results reported in previous studies. Together, this study demonstrated for the first time that patients with PTSD can be identified at the individual level using resting-state fMRI data. The promising classification results indicated that this method may provide a

  13. Fourier-transform infrared spectroscopy coupled with a classification machine for the analysis of blood plasma or serum: a novel diagnostic approach for ovarian cancer.

    Science.gov (United States)

    Gajjar, Ketan; Trevisan, Júlio; Owens, Gemma; Keating, Patrick J; Wood, Nicholas J; Stringfellow, Helen F; Martin-Hirsch, Pierre L; Martin, Francis L

    2013-07-21

    Currently available screening tests do not deliver the required sensitivity and specificity for accurate diagnosis of ovarian or endometrial cancer. Infrared (IR) spectroscopy of blood plasma or serum is a rapid, versatile, and relatively non-invasive approach which could characterize biomolecular alterations due to cancer and has potential to be utilized as a screening or diagnostic tool. To date, no such approach has been investigated for its applicability to screening and/or diagnosis of gynaecological cancers. We set out to determine whether attenuated total reflection Fourier-transform IR (ATR-FTIR) spectroscopy coupled with a proposed classification machine could be applied to IR spectra obtained from plasma and serum for accurate class prediction (cancer vs. normal). Plasma and serum samples were obtained from ovarian cancer cases (n = 30), endometrial cancer cases (n = 30) and non-cancer controls (n = 30), and subjected to ATR-FTIR spectroscopy. Four derived datasets were processed to estimate the real-world diagnosis of ovarian and endometrial cancer. Classification results for ovarian cancer were remarkable (up to 96.7%), whereas endometrial cancer was classified with a relatively high accuracy (up to 81.7%). The results from different combinations of feature extraction and classification methods, and also classifier ensembles, were compared. No single classification system performed best for all different datasets. This demonstrates the need for a framework that can accommodate a diverse set of analytical methods in order to be adaptable to different datasets. This pilot study suggests that ATR-FTIR spectroscopy of blood is a robust tool for accurate diagnosis, and carries the potential to be utilized as a screening test for ovarian cancer in primary care settings. The proposed classification machine is a powerful tool which could be applied to classify the vibrational spectroscopy data of different biological systems (e.g., tissue, urine, saliva

  14. Semi-automatic extraction of supra-glacial features using fuzzy logic approach for object-oriented classification on WorldView-2 imagery

    Science.gov (United States)

    Jawak, Shridhar D.; Palanivel, Yogesh V.; Alvarinho, Luis J.

    2016-04-01

    High-resolution satellite data provide high spatial, spectral and contextual information. Spatial and contextual information about image objects is needed to extract information from high-resolution satellite data. The supraglacial environment includes several features that are present on the surface of the glacier. The extraction of features from the supraglacial environment is quite challenging using pixel-based image analysis. To overcome this, an object-oriented approach is implemented. This paper aims at the extraction of geo-information on the supraglacial environment from high-resolution satellite imagery by object-oriented image analysis using the fuzzy logic approach. The object-oriented image analysis involves multiresolution segmentation for the creation of objects, followed by the classification of objects using the fuzzy logic approach. The multiresolution segmentation is executed initially at the pixel level, merging pixels into objects so as to minimize their heterogeneity. This is followed by the development of rule sets for the classification of various features, such as blue ice, debris and snow, from the supraglacial environment in WorldView-2 data. The area of each extracted feature is compared with the reference data, and the misclassified area of each feature using various bands is determined. The present object-oriented classification achieved an overall accuracy of ≈ 92% for classifying supraglacial features. Finally, it is suggested that the Red band is quite effective in the extraction of blue ice and snow, while the NIR1 band is effective in debris extraction.
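
    A rule set of the kind described above can be sketched with triangular fuzzy membership functions. The Red-band reflectance ranges below are invented for illustration and are not the thresholds used in the study.

```python
def tri(x, a, b, c):
    # triangular fuzzy membership: rises from a to a peak at b, falls to c
    return float(max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0))

# hypothetical Red-band reflectance ranges per supraglacial class (illustrative)
rules = {
    "debris":   (0.00, 0.08, 0.15),
    "blue_ice": (0.10, 0.25, 0.40),
    "snow":     (0.35, 0.70, 1.00),
}

def classify(red):
    # assign the class with the highest membership degree
    scores = {cls: tri(red, *abc) for cls, abc in rules.items()}
    return max(scores, key=scores.get)

print(classify(0.05), classify(0.24), classify(0.80))
```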

  15. A Hidden Markov Models Approach for Crop Classification: Linking Crop Phenology to Time Series of Multi-Sensor Remote Sensing Data

    Directory of Open Access Journals (Sweden)

    Sofia Siachalou

    2015-03-01

    Full Text Available Vegetation monitoring and mapping based on multi-temporal imagery has recently received much attention due to the plethora of medium-high spatial resolution satellites and the improved classification accuracies attained compared to uni-temporal approaches. Efficient image processing strategies are needed to exploit the phenological information present in temporal image sequences and to limit data redundancy and computational complexity. Within this framework, we implement the theory of Hidden Markov Models in crop classification, based on the time-series analysis of phenological states, inferred by a sequence of remote sensing observations. More specifically, we model the dynamics of vegetation over an agricultural area of Greece, characterized by spatio-temporal heterogeneity and small-sized fields, using RapidEye and Landsat ETM+ imagery. In addition, the classification performance of image sequences with variable spatial and temporal characteristics is evaluated and compared. The classification model considering one RapidEye and four pan-sharpened Landsat ETM+ images was found superior, resulting in a conditional kappa from 0.77 to 0.94 per class and an overall accuracy of 89.7%. The results highlight the potential of the method for operational crop mapping in Euro-Mediterranean areas and provide some hints for optimal image acquisition windows regarding major crop types in Greece.
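
    The decoding step at the core of this approach, inferring the most likely sequence of phenological states from a series of observations, can be sketched with a toy Viterbi decoder. The states, transition matrix, and emission probabilities below are invented for illustration and are not the study's trained model.

```python
import numpy as np

# toy HMM: three phenological states observed through discretized
# greenness symbols 0 (low), 1 (medium), 2 (high)
states = ["emergence", "growth", "senescence"]
start = np.array([0.8, 0.15, 0.05])
trans = np.array([[0.60, 0.35, 0.05],   # phenology mostly advances forward
                  [0.01, 0.69, 0.30],
                  [0.01, 0.04, 0.95]])
emit = np.array([[0.70, 0.25, 0.05],    # emergence emits low greenness
                 [0.05, 0.25, 0.70],    # growth emits high greenness
                 [0.30, 0.60, 0.10]])   # senescence emits medium greenness

def viterbi(obs):
    # standard log-space Viterbi decoding with backtracking
    T, N = len(obs), len(states)
    delta = np.log(start) + np.log(emit[:, obs[0]])
    back = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + np.log(trans) + np.log(emit[:, obs[t]])[None, :]
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0)
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return [states[i] for i in reversed(path)]

print(viterbi([0, 0, 2, 2, 1]))  # low, low, high, high, medium greenness
```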

  16. An approach to the diagnosis of flat intraepithelial lesions of the urinary bladder using the World Health Organization/ International Society of Urological Pathology consensus classification system.

    Science.gov (United States)

    Amin, Mahul B; McKenney, Jesse K

    2002-07-01

    The classification of flat urothelial (transitional cell) lesions with atypia has historically varied in its application from institution to institution with no fewer than six major nomenclature systems proposed in the past 25 years. In 1998, the World Health Organization/ International Society of Urological Pathology (WHO/ISUP) published a consensus classification that included the following categories for flat urinary bladder lesions: reactive atypia, atypia of unknown significance, dysplasia (low-grade intraepithelial neoplasia), and carcinoma in situ (high-grade intraepithelial neoplasia). This classification expands the definition traditionally used for urothelial carcinoma in situ, basing its diagnosis primarily on the severity of cytologic changes. In proposing the classification system for flat lesions of the bladder with atypia, it was hoped that consistent use of uniform diagnostic terminology would ultimately aid in a better understanding of the biology of these lesions. In this review, the authors discuss the history of the concept of flat urothelial neoplasia, the rationale and histologic criteria for the WHO/ISUP diagnostic categories, an approach to the diagnosis of flat lesions, and problems and pitfalls associated with their recognition in routine surgical pathology specimens.

  17. Using the International Classification of Functioning, Disability and Health Children and Youth version in education systems: a new approach to eligibility.

    Science.gov (United States)

    Hollenweger, Judith; Moretti, Marta

    2012-02-01

    In developed countries, establishing eligibility for persons with disabilities is a requirement for accessing specialized services or benefits. The underlying conceptualizations of disability are often problematic because they concentrate on deficits but try to promote social participation and focus on dependence while trying to strengthen independence. In addition, such conceptualizations are unable to respond to the rights-based approach of the UN Convention on the Rights of Persons with Disabilities. The International Classification of Functioning, Disability and Health Version for Children and Youth provides a model and classification that allows relating disease- or impairment-specific information to participation in the life domains relevant for a specific policy area. Establishing eligibility in education systems needs to be compatible with the principles of inclusive education, participation, and social justice. In addition, the overall goals of education and individualized goals for a specific child with disabilities need to be taken into account. Using the International Classification of Functioning, Disability and Health Version for Children and Youth as a model and classification, the different factors influencing eligibility-related decisions (impairments, activity/participation, environment, personal factors) can be made transparent to provide the basis for a decision-making process to which parents and the child actively contribute.

  18. A discriminative model-constrained EM approach to 3D MRI brain tissue classification and intensity non-uniformity correction

    Energy Technology Data Exchange (ETDEWEB)

    Wels, Michael; Hornegger, Joachim [Pattern Recognition Lab, Department of Computer Science, Friedrich-Alexander University Erlangen-Nuremberg, Martensstr. 3, 91058 Erlangen (Germany); Zheng Yefeng; Comaniciu, Dorin [Corporate Research and Technologies, Siemens Corporate Technology, 755 College Road East, Princeton, NJ 08540 (United States); Huber, Martin, E-mail: michael.wels@informatik.uni-erlangen.de [Corporate Research and Technologies, Siemens Corporate Technology, Guenther-Scharowsky-Str. 1, 91058 Erlangen (Germany)

    2011-06-07

    We describe a fully automated method for tissue classification, which is the segmentation into cerebral gray matter (GM), cerebral white matter (WM), and cerebrospinal fluid (CSF), and intensity non-uniformity (INU) correction in brain magnetic resonance imaging (MRI) volumes. It combines supervised MRI modality-specific discriminative modeling and unsupervised statistical expectation maximization (EM) segmentation into an integrated Bayesian framework. While both the parametric observation models and the non-parametrically modeled INUs are estimated via EM during segmentation itself, a Markov random field (MRF) prior model regularizes segmentation and parameter estimation. Firstly, the regularization takes into account knowledge about spatial and appearance-related homogeneity of segments in terms of pairwise clique potentials of adjacent voxels. Secondly and more importantly, patient-specific knowledge about the global spatial distribution of brain tissue is incorporated into the segmentation process via unary clique potentials. They are based on a strong discriminative model provided by a probabilistic boosting tree (PBT) for classifying image voxels. It relies on the surrounding context and alignment-based features derived from a probabilistic anatomical atlas. The context considered is encoded by 3D Haar-like features of reduced INU sensitivity. Alignment is carried out fully automatically by means of an affine registration algorithm minimizing cross-correlation. Both types of features do not immediately use the observed intensities provided by the MRI modality but instead rely on specifically transformed features, which are less sensitive to MRI artifacts. Detailed quantitative evaluations on standard phantom scans and standard real-world data show the accuracy and robustness of the proposed method. They also demonstrate relative superiority in comparison to other state-of-the-art approaches to this kind of computational task: our method achieves average
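
    The unsupervised EM core of such a pipeline, stripped of the MRF prior, the PBT model, and the INU field, reduces to fitting a Gaussian mixture to voxel intensities. A 1-D three-class sketch on synthetic data:

```python
import numpy as np

# synthetic "intensities" for three tissue-like classes centred at 0, 3 and 6
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(m, 0.5, 300) for m in (0.0, 3.0, 6.0)])

mu = np.array([1.0, 2.0, 5.0])          # deliberately rough initialization
sig = np.ones(3)
pi = np.ones(3) / 3
for _ in range(50):
    # E-step: posterior responsibility of each class for each intensity
    dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixture weights, means and standard deviations
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sig = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(np.round(np.sort(mu), 1))  # means recover the three class centres
```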

  19. An unsupervised two-stage clustering approach for forest structure classification based on X-band InSAR data - A case study in complex temperate forest stands

    Science.gov (United States)

    Abdullahi, Sahra; Schardt, Mathias; Pretzsch, Hans

    2017-05-01

    Forest structure at stand level plays a key role in sustainable forest management, since the biodiversity, productivity, growth and stability of the forest can be positively influenced by managing its structural diversity. In contrast to field-based measurements, remote sensing techniques offer a cost-efficient opportunity to collect area-wide information about forest stand structure with high spatial and temporal resolution. Especially Interferometric Synthetic Aperture Radar (InSAR), which facilitates worldwide acquisition of 3D information independent of weather conditions and illumination, is well suited to capturing forest stand structure. This study proposes an unsupervised two-stage clustering approach for forest structure classification based on height information derived from interferometric X-band SAR data, applied in complex temperate forest stands of the Traunstein forest (South Germany). In particular, a four-dimensional input data set composed of first-order height statistics was non-linearly projected onto a two-dimensional Self-Organizing Map, spatially ordered according to similarity (based on the Euclidean distance) in the first stage, and classified using the k-means algorithm in the second stage. The study demonstrated that X-band InSAR data exhibits considerable capabilities for forest structure classification. Moreover, the unsupervised classification approach achieved meaningful and reasonable results by means of comparison to aerial imagery and LiDAR data.
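
    A compact sketch of the two-stage idea, here with a 1-D self-organizing map for brevity and synthetic first-order height statistics (e.g. mean, spread, low/high percentiles) standing in for the InSAR features:

```python
import numpy as np

rng = np.random.default_rng(2)
# per-stand features for two structure types (values are synthetic stand-ins)
low = rng.normal([8, 2, 5, 11], 0.5, size=(40, 4))     # young / low stands
high = rng.normal([28, 6, 18, 35], 0.5, size=(40, 4))  # tall / mature stands
X = np.vstack([low, high])

# stage 1: train a small 1-D SOM (10 units) on the 4-D feature vectors
W = X.mean(0) + rng.normal(scale=1.0, size=(10, 4))
for it in range(2000):
    x = X[rng.integers(len(X))]
    bmu = int(np.argmin(((W - x) ** 2).sum(1)))        # best-matching unit
    lr = 0.5 * (1 - it / 2000)                         # decaying learning rate
    h = np.exp(-((np.arange(10) - bmu) ** 2) / 2.0)    # neighborhood kernel
    W += lr * h[:, None] * (x - W)

# stage 2: k-means (k=2) on the SOM prototypes
c = W[[0, -1]].copy()
for _ in range(20):
    lab = np.argmin(((W[:, None] - c) ** 2).sum(-1), axis=1)
    for k in (0, 1):
        if (lab == k).any():
            c[k] = W[lab == k].mean(0)

# map each stand to the cluster of its best-matching SOM unit
stand_lab = lab[np.argmin(((X[:, None] - W) ** 2).sum(-1), axis=1)]
print(np.bincount(stand_lab))
```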

  20. Measuring autocratic regime stability

    Directory of Open Access Journals (Sweden)

    Joseph Wright

    2016-01-01

    Full Text Available Researchers measure regime stability in autocratic contexts using a variety of data sources that capture distinct concepts. Often this research uses concepts developed for the study of democratic politics, such as leadership change or institutionalized authority, to construct measures of regime breakdown in non-democratic contexts. This article assesses whether the measure a researcher chooses influences the results they obtain by examining data on executive leadership, political authority, and autocratic regimes. We illustrate the conceptual differences between these variables by extending recent studies in the literature on the political consequences of non-tax revenue and unearned foreign income.

  1. Exchange rate regime choice

    Directory of Open Access Journals (Sweden)

    Beker Emilija

    2006-01-01

    Full Text Available The choice of an adequate exchange rate regime proves to be a highly sensitive field within which the economic authorities present and confirm themselves. The advantages and disadvantages of fixed and flexible exchange rate regimes, which have been quite relativized from the conventional point of view, together with simultaneous, but not synchronized effects of structural and external factors, remain permanently questioned throughout a complex process of exchange rate regime decision making. The paper reflects an attempt at critical identification of the key exchange rate performances, with emphasis on continuous non-uniformity and the (un)certainty of the shelf life of a relevant choice.

  2. Regimes, Non-State Actors and the State System: A 'Structurational' Regime Model

    NARCIS (Netherlands)

    Arts, B.J.M.

    2000-01-01

    Regime analysis has become a popular approach in International Relations theory and in international policy studies. However, current regime models exhibit some shortcomings with regard to (1) addressing non-state actors, and in particular nongovernmental organizations (NGOs), (2) the balancing of a

  4. A Novel Approach to Developing a Supervised Spatial Decision Support System for Image Classification: A Study of Paddy Rice Investigation

    Directory of Open Access Journals (Sweden)

    Shih-Hsun Chang

    2014-01-01

    Full Text Available Paddy rice area estimation via remote sensing techniques has been well established in recent years. Texture information and vegetation indicators are widely used to improve the classification accuracy of satellite images. Accordingly, this study employs texture information and vegetation indicators as ancillary information for classifying paddy rice through remote sensing images. In the first stage, the images are acquired using a remote sensing technique and ancillary information is employed to increase the accuracy of classification. In the second stage, we construct an efficient supervised classifier, which is used to evaluate the ancillary information. In the third stage, linear discriminant analysis (LDA) is introduced. LDA is a well-known method for classifying images into various categories. Also, the particle swarm optimization (PSO) algorithm is employed to optimize the LDA classification outcomes and increase classification performance. In the fourth stage, we discuss the strategy of selecting different window sizes and analyze particle numbers and iteration numbers with the corresponding accuracy. Accordingly, a rational strategy for the combination of ancillary information is introduced. Afterwards, the PSO algorithm improves the accuracy rate from 82.26% to 89.31%. The improved accuracy results in a much lower salt-and-pepper effect in the thematic map.
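
    The LDA stage can be sketched as a plain Fisher discriminant on two synthetic pixel features; the PSO tuning of window sizes and parameters described above is omitted here.

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic two-feature pixels (e.g. a spectral band plus a texture measure)
rice = rng.normal([0.6, 0.3], 0.05, size=(100, 2))
other = rng.normal([0.4, 0.5], 0.05, size=(100, 2))
X = np.vstack([rice, other])
y = np.array([1] * 100 + [0] * 100)

m1, m0 = rice.mean(0), other.mean(0)
Sw = 99 * (np.cov(rice.T) + np.cov(other.T))   # pooled within-class scatter
w = np.linalg.solve(Sw, m1 - m0)               # Fisher discriminant direction
thr = w @ (m1 + m0) / 2                        # midpoint decision threshold
pred = (X @ w > thr).astype(int)
acc = (pred == y).mean()
print(acc)
```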

  5. Classification and evaluation strategies of auto-segmentation approaches for PET: Report of AAPM task group No. 211.

    Science.gov (United States)

    Hatt, Mathieu; Lee, John A; Schmidtlein, Charles R; Naqa, Issam El; Caldwell, Curtis; De Bernardi, Elisabetta; Lu, Wei; Das, Shiva; Geets, Xavier; Gregoire, Vincent; Jeraj, Robert; MacManus, Michael P; Mawlawi, Osama R; Nestle, Ursula; Pugachev, Andrei B; Schöder, Heiko; Shepherd, Tony; Spezi, Emiliano; Visvikis, Dimitris; Zaidi, Habib; Kirov, Assen S

    2017-06-01

    The purpose of this educational report is to provide an overview of the present state-of-the-art PET auto-segmentation (PET-AS) algorithms and their respective validation, with an emphasis on providing the user with help in understanding the challenges and pitfalls associated with selecting and implementing a PET-AS algorithm for a particular application. A brief description of the different types of PET-AS algorithms is provided using a classification based on method complexity and type. The advantages and the limitations of the current PET-AS algorithms are highlighted based on current publications and existing comparison studies. A review of the available image datasets and contour evaluation metrics in terms of their applicability for establishing a standardized evaluation of PET-AS algorithms is provided. The performance requirements for the algorithms and their dependence on the application, the radiotracer used and the evaluation criteria are described and discussed. Finally, a procedure for algorithm acceptance and implementation, as well as the complementary role of manual and auto-segmentation are addressed. A large number of PET-AS algorithms have been developed within the last 20 years. Many of the proposed algorithms are based on either fixed or adaptively selected thresholds. More recently, numerous papers have proposed the use of more advanced image analysis paradigms to perform semi-automated delineation of the PET images. However, the level of algorithm validation is variable and for most published algorithms is either insufficient or inconsistent which prevents recommending a single algorithm. This is compounded by the fact that realistic image configurations with low signal-to-noise ratios (SNR) and heterogeneous tracer distributions have rarely been used. Large variations in the evaluation methods used in the literature point to the need for a standardized evaluation protocol. Available comparison studies suggest that PET-AS algorithms relying
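
    The two simplest PET-AS families mentioned above, fixed and adaptively selected thresholds, can be sketched on a synthetic 2-D uptake image (values are arbitrary SUV-like units, not real PET data):

```python
import numpy as np

img = np.full((64, 64), 1.0)          # background uptake
img[20:40, 20:40] = 8.0               # hot 20x20 "lesion"
rng = np.random.default_rng(4)
img += rng.normal(0, 0.2, img.shape)  # acquisition noise

# fixed threshold: 40% of the maximum intensity
seg_fixed = img >= 0.4 * img.max()

# adaptive threshold: same percentage, but offset by the background level
bkg = np.median(img)
seg_adapt = img >= bkg + 0.4 * (img.max() - bkg)

print(seg_fixed.sum(), seg_adapt.sum())  # both recover the 400-pixel lesion
```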

  6. GA(M)E-QSAR: a novel, fully automatic genetic-algorithm-(meta)-ensembles approach for binary classification in ligand-based drug design.

    Science.gov (United States)

    Pérez-Castillo, Yunierkis; Lazar, Cosmin; Taminau, Jonatan; Froeyen, Mathy; Cabrera-Pérez, Miguel Ángel; Nowé, Ann

    2012-09-24

    Computer-aided drug design has become an important component of the drug discovery process. Despite the advances in this field, there is not a unique modeling approach that can be successfully applied to solve the whole range of problems faced during QSAR modeling. Feature selection and ensemble modeling are active areas of research in ligand-based drug design. Here we introduce the GA(M)E-QSAR algorithm that combines the search and optimization capabilities of Genetic Algorithms with the simplicity of the Adaboost ensemble-based classification algorithm to solve binary classification problems. We also explore the usefulness of Meta-Ensembles trained with Adaboost and Voting schemes to further improve the accuracy, generalization, and robustness of the optimal Adaboost Single Ensemble derived from the Genetic Algorithm optimization. We evaluated the performance of our algorithm using five data sets from the literature and found that it is capable of yielding classification results similar to or better than those reported for these data sets, with a higher enrichment of active compounds relative to the whole actives subset when only the most active chemicals are considered. More importantly, we compared our methodology with state-of-the-art feature selection and classification approaches and found that it can provide highly accurate, robust, and generalizable models. In the case of the Adaboost Ensembles derived from the Genetic Algorithm search, the final models are quite simple since they consist of a weighted sum of the output of single feature classifiers. Furthermore, the Adaboost scores can be used as a ranking criterion to prioritize chemicals for synthesis and biological evaluation after virtual screening experiments.
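
    The final model form described above, a weighted sum of the outputs of single-feature classifiers, is exactly what AdaBoost over decision stumps produces. A self-contained sketch on synthetic "activity" data; the Genetic Algorithm feature search is omitted.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 5))
y = np.where(X[:, 0] + 0.5 * X[:, 2] > 0, 1, -1)  # toy activity rule

w = np.ones(len(y)) / len(y)          # sample weights
stumps, alphas = [], []
for _ in range(20):
    best = None
    for f in range(X.shape[1]):       # each weak learner uses a single feature
        for thr in np.quantile(X[:, f], [0.25, 0.5, 0.75]):
            for sign in (1, -1):
                pred = np.where(sign * (X[:, f] - thr) > 0, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, f, thr, sign, pred)
    err, f, thr, sign, pred = best
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
    w *= np.exp(-alpha * y * pred)    # up-weight misclassified samples
    w /= w.sum()
    stumps.append((f, thr, sign))
    alphas.append(alpha)

# final model: weighted sum of single-feature classifier outputs
score = sum(a * np.where(s * (X[:, f2] - t) > 0, 1, -1)
            for a, (f2, t, s) in zip(alphas, stumps))
acc = (np.sign(score) == y).mean()
print(acc)
```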

  7. Statistical hydrodynamic models for mixing instability flows in turbulent regime: theoretical 0D evaluation criteria and comparison of one and two-fluid approaches; Modeles hydrodynamiques statistiques pour les ecoulements d'instabilites de melange en regime developpe: criteres theoriques d'evaluation ''0D'' et comparaison des approches mono et bifluides

    Energy Technology Data Exchange (ETDEWEB)

    Llor, A

    2001-07-01

    Theoretical criteria are defined to perform quick analytical evaluations of statistical hydro models for turbulent mixing flows induced by Kelvin-Helmholtz, Rayleigh-Taylor and Richtmyer-Meshkov instabilities. They are based on a global energy balance analysis of the mixing zone ('0D' projection) in the limit of zero Atwood number, for incompressible fluids, and in self-similar regime. It is then shown that single-fluid descriptions must be replaced by two-fluid descriptions, particularly for the Rayleigh-Taylor case with variable acceleration. The interaction between a shock and heterogeneities is also considered. Various approaches for the development of new models are finally given. (author)

  8. Classification, diagnosis, and approach to treatment for angioedema: consensus report from the Hereditary Angioedema International Working Group.

    Science.gov (United States)

    Cicardi, M; Aberer, W; Banerji, A; Bas, M; Bernstein, J A; Bork, K; Caballero, T; Farkas, H; Grumach, A; Kaplan, A P; Riedl, M A; Triggiani, M; Zanichelli, A; Zuraw, B

    2014-05-01

    Angioedema is defined as localized and self-limiting edema of the subcutaneous and submucosal tissue, due to a temporary increase in vascular permeability caused by the release of vasoactive mediator(s). When angioedema recurs without significant wheals, the patient should be diagnosed to have angioedema as a distinct disease. In the absence of accepted classification, different types of angioedema are not uniquely identified. For this reason, the European Academy of Allergy and Clinical Immunology gave its patronage to a consensus conference aimed at classifying angioedema. Four types of acquired and three types of hereditary angioedema were identified as separate forms from the analysis of the literature and were presented in detail at the meeting. Here, we summarize the analysis of the data and the resulting classification of angioedema.

  9. INCODE-DK 2014. Classification of cause of intrauterine fetal death – a new approach to perinatal audit

    DEFF Research Database (Denmark)

    Maroun, Lisa Leth; Ramsing, Mette; Olsen, Tina Elisabeth

    on a national level as described in the national guideline for IUFD. Multidisciplinary perinatal audit is an important tool in the evaluation of stillbirth, however, the establishment of the C-IUFD has until now been hampered by the lack of a recommended classification system. Material and methods...... With the intention of improving the evaluation process for IUFD a working group of fetal pathologists and obstetricians was established in 2013 by the Danish Society of Obstetricians and Gynaecology (DSOG) and the Danish Pathology Society (DPAS). Two selected modern international classification systems (CODAC......) was developed by translation and adaptation to Danish conditions on the basis of updated literature. The section on placental pathology was adapted to the recent Danish guideline for placental examination 2013. In addition a new perinatal audit scheme (INCODE perinatal audittabel 2014) was created based...

  11. Development of the approaches to the classification and groups of the tasks fulfilling by the internal auditors

    OpenAIRE

    Шалімова, Наталія Станіславівна; Андрощук, Ірина Іванівна

    2015-01-01

    The aim of the article is a critical rethinking and revision of the types of tasks that can be performed within internal audit, given the developments of internationally recognized professional organizations in this area, including the Institute of Internal Auditors. The article argues for a classification of the tasks performed within internal audit based on the international standards of professional practice of internal auditing developed by the Institute of Internal Auditors, using requirements of In...

  12. Causes of death and associated conditions (Codac) – a utilitarian approach to the classification of perinatal deaths

    Directory of Open Access Journals (Sweden)

    Harrison Catherine

    2009-06-01

    Full Text Available Abstract A carefully classified dataset of perinatal mortality will retain the most significant information on the causes of death. Such information is needed for health care policy development, surveillance and international comparisons, clinical services and research. For comparability purposes, we propose a classification system that could serve all these needs, and be applicable in both developing and developed countries. It is developed to adhere to basic concepts of underlying cause in the International Classification of Diseases (ICD), although gaps in ICD prevent classification of perinatal deaths solely on existing ICD codes. We tested the Causes of Death and Associated Conditions (Codac) classification for perinatal deaths in seven populations, including two developing country settings. We identified areas of potential improvements in the ability to retain existing information, ease of use and inter-rater agreement. After revisions to address these issues we propose Version II of Codac with detailed coding instructions. The ten main categories of Codac consist of three key contributors to global perinatal mortality (intrapartum events, infections and congenital anomalies), two crucial aspects of perinatal mortality (unknown causes of death and termination of pregnancy), a clear distinction of conditions relevant only to the neonatal period, and the remaining conditions arranged in the four anatomical compartments (fetal, cord, placental and maternal). For more detail there are 94 subcategories, further specified in 577 categories in the full version. Codac is designed to accommodate both the main cause of death as well as two associated conditions. We suggest reporting not only the main cause of death, but also the associated relevant conditions so that scenarios of combined conditions and events are captured. The appropriately applied Codac system promises to better manage information on causes of perinatal deaths, the conditions

  13. HEp-2 Cell Classification: The Role of Gaussian Scale Space Theory as A Pre-processing Approach

    OpenAIRE

    Qi, Xianbiao; Zhao, Guoying; Chen, Jie; Pietikäinen, Matti

    2015-01-01

    Indirect Immunofluorescence Imaging of Human Epithelial Type 2 (HEp-2) cells is an effective way to identify the presence of Anti-Nuclear Antibody (ANA). Most existing works on HEp-2 cell classification mainly focus on feature extraction, feature encoding and classifier design. Very few efforts have been devoted to studying the importance of the pre-processing techniques. In this paper, we analyze the importance of the pre-processing, and investigate the role of Gaussian Scale Space (GS...
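
    A Gaussian scale space, the pre-processing object studied above, is simply the same image smoothed at a series of increasing scales. A minimal sketch using a separable Gaussian filter on an impulse image:

```python
import numpy as np

def gaussian_kernel1d(sigma, radius=None):
    # normalized 1-D Gaussian kernel truncated at ~3 sigma
    radius = radius or int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def blur(img, sigma):
    # separable 2-D Gaussian: filter rows, then columns
    k = gaussian_kernel1d(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

img = np.zeros((32, 32))
img[16, 16] = 1.0                                  # impulse "cell" image
scale_space = [blur(img, s) for s in (1.0, 2.0, 4.0)]
print([round(float(s.max()), 4) for s in scale_space])  # peaks flatten with scale
```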

  14. Implementing the Biopharmaceutics Classification System in Drug Development: Reconciling Similarities, Differences, and Shared Challenges in the EMA and US-FDA-Recommended Approaches.

    Science.gov (United States)

    Cardot, J-M; Garcia Arieta, A; Paixao, P; Tasevska, I; Davit, B

    2016-07-01

    The US-FDA recently posted a draft guideline for industry recommending procedures necessary to obtain a biowaiver for immediate-release oral dosage forms based on the Biopharmaceutics Classification System (BCS). This review compares the present FDA BCS biowaiver approach, with the existing European Medicines Agency (EMA) approach, with an emphasis on similarities, difficulties, and shared challenges. Some specifics of the current EMA BCS guideline are compared with those in the recently published draft US-FDA BCS guideline. In particular, similarities and differences in the EMA versus US-FDA approaches to establishing drug solubility, permeability, dissolution, and formulation suitability for BCS biowaiver are critically reviewed. Several case studies are presented to illustrate the (i) challenges of applying for BCS biowaivers for global registration in the face of differences in the EMA and US-FDA BCS biowaiver criteria, as well as (ii) challenges inherent in applying for BCS class I or III designation and common to both jurisdictions.

  15. A Novel Extreme Learning Machine Classification Model for e-Nose Application Based on the Multiple Kernel Approach.

    Science.gov (United States)

    Jian, Yulin; Huang, Daoyu; Yan, Jia; Lu, Kun; Huang, Ying; Wen, Tailai; Zeng, Tanyue; Zhong, Shijie; Xie, Qilong

    2017-06-19

    A novel classification model, named the quantum-behaved particle swarm optimization (QPSO)-based weighted multiple kernel extreme learning machine (QWMK-ELM), is proposed in this paper. Experimental validation is carried out with two different electronic nose (e-nose) datasets. Being different from the existing multiple kernel extreme learning machine (MK-ELM) algorithms, the combination coefficients of base kernels are regarded as external parameters of single-hidden layer feedforward neural networks (SLFNs). The combination coefficients of base kernels, the model parameters of each base kernel, and the regularization parameter are optimized by QPSO simultaneously before implementing the kernel extreme learning machine (KELM) with the composite kernel function. Four types of common single kernel functions (Gaussian kernel, polynomial kernel, sigmoid kernel, and wavelet kernel) are utilized to constitute different composite kernel functions. Moreover, the method is also compared with other existing classification methods: extreme learning machine (ELM), kernel extreme learning machine (KELM), k-nearest neighbors (KNN), support vector machine (SVM), multi-layer perceptron (MLP), radial basis function neural network (RBFNN), and probabilistic neural network (PNN). The results have demonstrated that the proposed QWMK-ELM outperforms the aforementioned methods, not only in precision, but also in efficiency for gas classification.
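
    As a rough illustration of the weighted multiple kernel idea behind QWMK-ELM (record 15), the sketch below combines a Gaussian and a polynomial kernel with fixed weights and solves the standard closed-form KELM system. The data, kernel parameters, and weights are invented for illustration, and the QPSO optimization of the weights is omitted.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=0.5):
    # RBF kernel matrix from pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def poly_kernel(X, Y, degree=2, coef0=1.0):
    return (X @ Y.T + coef0) ** degree

def composite_kernel(X, Y, w):
    # Fixed-weight composite kernel (in the paper, QPSO tunes w and the
    # per-kernel parameters simultaneously)
    return w[0] * gaussian_kernel(X, Y) + w[1] * poly_kernel(X, Y)

def kelm_fit(X, T, w, C=100.0):
    # Closed-form kernel ELM solution: beta = (I/C + K)^-1 T
    K = composite_kernel(X, X, w)
    return np.linalg.solve(np.eye(len(X)) / C + K, T)

rng = np.random.default_rng(0)
# Two toy "gas classes" as well-separated 2-D feature clouds
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(2, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
T = np.eye(2)[y]                      # one-hot targets, ELM-style
w = np.array([0.7, 0.3])              # illustrative kernel weights
beta = kelm_fit(X, T, w)
pred = (composite_kernel(X, X, w) @ beta).argmax(axis=1)
print("training accuracy:", (pred == y).mean())
```

    In the full method, the weight vector, each kernel's parameters, and the regularization constant C would all be candidate dimensions of the QPSO search.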

  16. Segmentation Based Classification of 3D Urban Point Clouds: A Super-Voxel Based Approach with Evaluation

    Directory of Open Access Journals (Sweden)

    Laurent Trassoudaine

    2013-03-01

    Full Text Available Segmentation and classification of urban range data into different object classes have several challenges due to certain properties of the data, such as density variation, inconsistencies due to missing data and the large data size that require heavy computation and large memory. A method to classify urban scenes based on a super-voxel segmentation of sparse 3D data obtained from LiDAR sensors is presented. The 3D point cloud is first segmented into voxels, which are then characterized by several attributes transforming them into super-voxels. These are joined together by using a link-chain method rather than the usual region growing algorithm to create objects. These objects are then classified using geometrical models and local descriptors. In order to evaluate the results, a new metric that combines both segmentation and classification results simultaneously is presented. The effects of voxel size and incorporation of RGB color and laser reflectance intensity on the classification results are also discussed. The method is evaluated on standard data sets using different metrics to demonstrate its efficacy.

  17. DEFINING RELATIONAL PATHOLOGY IN EARLY CHILDHOOD: THE DIAGNOSTIC CLASSIFICATION OF MENTAL HEALTH AND DEVELOPMENTAL DISORDERS OF INFANCY AND EARLY CHILDHOOD DC:0-5 APPROACH.

    Science.gov (United States)

    Zeanah, Charles H; Lieberman, Alicia

    2016-09-01

    Infant mental health is explicitly relational in its focus, and therefore a diagnostic classification system for early childhood disorders should include attention not only to within-the-child psychopathology but also between child and caregiver psychopathology. In this article, we begin by providing a review of previous efforts to introduce this approach that date back more than 30 years. Next, we introduce changes proposed in the Diagnostic Classification of Mental Health and Developmental Disorders of Infancy and Early Childhood DC:0-5 (ZERO TO THREE, in press). In a major change from previous attempts, the DC:0-5 includes an Axis I "Relationship Specific Disorder of Early Childhood." This disorder is intended to capture disordered behavior that is limited to one caregiver relationship rather than occurring cross-contextually. An axial characterization is continued from the Diagnostic Classification of Mental Health and Developmental Disorders of Infancy and Early Childhood DC:0-3R (ZERO TO THREE, 2005), but two major changes are introduced. First, the DC:0-5 proposes to simplify ratings of relationship adaptation/maladaptation, and to expand what is rated so that in addition to characterizing the child's relationship with his or her primary caregiver, there also is a characterization of the network of family relationships in which the child develops. This includes coparenting relationships and the entire network of close relationships that impinge on the young child's development and adaptation. © 2016 Michigan Association for Infant Mental Health.

  18. Detecting spatial regimes in ecosystems

    Science.gov (United States)

    Sundstrom, Shana M.; Eason, Tarsha; Nelson, R. John; Angeler, David G.; Barichievy, Chris; Garmestani, Ahjond S.; Graham, Nicholas A.J.; Granholm, Dean; Gunderson, Lance; Knutson, Melinda; Nash, Kirsty L.; Spanbauer, Trisha; Stow, Craig A.; Allen, Craig R.

    2017-01-01

    Research on early warning indicators has generally focused on assessing temporal transitions with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory-based method, on both terrestrial and aquatic animal data (U.S. Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps and multivariate analyses such as nMDS and cluster analysis. We successfully detected spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.
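
    A minimal sketch of the discrete Fisher information often used in this line of regime-detection work, FI = 4 Σ (√p_i − √p_{i+1})² over binned system states in a moving window. The one-dimensional transect and bin choices below are invented for illustration and are far simpler than the multivariate community data the study uses.

```python
import numpy as np

def fisher_information(window, bins):
    # Discrete form common in this literature:
    # FI = 4 * sum_i (sqrt(p_i) - sqrt(p_{i+1}))^2, p = state probabilities
    counts, _ = np.histogram(window, bins=bins)
    p = counts / counts.sum()
    q = np.sqrt(p)
    return 4.0 * np.sum(np.diff(q) ** 2)

rng = np.random.default_rng(3)
# 1-D transect of a community index with an abrupt regime shift at site 100
transect = np.concatenate([rng.normal(0, 0.5, 100), rng.normal(5, 0.5, 100)])
bins = np.linspace(-2, 7, 10)

win = 20  # moving-window width (sites per window)
fi = np.array([fisher_information(transect[i:i + win], bins)
               for i in range(len(transect) - win)])
# Windows that straddle site 100 mix two regimes, so FI changes sharply
# there, flagging the spatial boundary
print(fi[:80].mean(), fi[80:100].mean(), fi[100:].mean())
```

    In practice, window placement, bin ("state") definitions, and the treatment of multivariate data all require care; this sketch only shows the core computation.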

  19. Hierarchical classification approach for mapping rubber tree growth using per-pixel and object-oriented classifiers with SPOT-5 imagery

    Directory of Open Access Journals (Sweden)

    Hayder Dibs

    2017-06-01

    Full Text Available There has been growing interest in Malaysia to increase the productivity of latex. This has made accurate knowledge of rubber tree growth and age distribution a helpful decision-making tool for the government, rubber plantation managers, and harvesters. Gathering this information using conventional methods is difficult, time consuming, and limited in spatial coverage. This paper presents a hierarchical classification approach to obtain an accurate map of rubber tree growth and age distribution using SPOT-5 satellite imagery. The objective of the study is to evaluate the performance of pixel-based and object-oriented classifiers for rubber growth classification. At the first level, the general land cover was classified into eight land cover classes (soil, water body, rubber, mature oil palm, young oil palm, forest, urban area, and other vegetation) using Mahalanobis distance (MD), k-nearest neighbor (k-NN), and Support Vector Machine (SVM) classifiers. Thereafter, the best classification map, the k-NN output, was used to select only pixels that belong to the rubber class from the SPOT-5 image. The extracted pixels served as input into the next classification hierarchy, where four classifiers, MD, k-NN, SVM, and decision tree (DT), were implemented to map rubber trees into three intra-classes (mature, middle-aged, and young rubber). The result produced overall accuracies of 97.48%, 96.90%, 96.25%, and 80.80% for k-NN, SVM, MD, and DT respectively. The result indicates that object-oriented classifiers are better than pixel-based methods for mapping rubber tree growth.
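
    The two-level scheme in record 19 can be sketched as a pair of chained classifiers: a first-stage land-cover model whose "rubber" pixels alone are passed to a second-stage age model. The sketch below uses synthetic four-band "spectra" and scikit-learn's k-NN in place of the SPOT-5 data and the paper's exact classifiers; all class centres are invented.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)

def sample(center, n):
    # Synthetic 4-band "spectra" scattered around a class centre
    return rng.normal(center, 0.2, (n, 4))

# Stage 1: general land cover (three of the paper's eight classes, for brevity)
X1 = np.vstack([sample([1, 1, 1, 1], 50),     # rubber
                sample([3, 1, 2, 0], 50),     # forest
                sample([0, 3, 1, 2], 50)])    # soil
y1 = np.repeat(["rubber", "forest", "soil"], 50)
stage1 = KNeighborsClassifier(n_neighbors=5).fit(X1, y1)

# Stage 2: trained only on rubber pixels, split into age classes
X2 = np.vstack([sample([1.0, 1, 1, 1], 40),   # young
                sample([1.8, 1, 1, 1], 40),   # middle-aged
                sample([2.6, 1, 1, 1], 40)])  # mature
y2 = np.repeat(["young", "middle-aged", "mature"], 40)
stage2 = KNeighborsClassifier(n_neighbors=5).fit(X2, y2)

# Hierarchical prediction: only pixels labelled "rubber" reach stage 2
px = np.array([[1.05, 0.95, 1.0, 1.1]])
if stage1.predict(px)[0] == "rubber":
    print("rubber age class:", stage2.predict(px)[0])
```

    The design point is that stage 2 never sees non-rubber pixels, so its decision boundaries only have to separate the rubber age classes from one another.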

  20. World Nonproliferation Regime

    Institute of Scientific and Technical Information of China (English)

    Ouyang Liping; Wu Xingzuo

    2007-01-01

    2006 witnessed an intense struggle between nuclear proliferation and nonproliferation. Iran's nuclear issue and North Korea's nuclear test have cast a deep shadow over the current international nonproliferation regime. The international contest for civil nuclear development became especially fierce as global energy prices went up. Such a situation, to some extent, accelerated the pace of nuclear proliferation. Furthermore, the existing international nonproliferation regime, based upon the Nuclear Nonproliferation Treaty (NPT), was affected by loopholes, and the U.S. failed in its ambition to unite other forces to mend fences. The international community needs to come up with a comprehensive and long-term strategy to meet the demand for an effective future nonproliferation regime in a healthy nuclear order.

  1. Seasonal precipitation forecasts for selected regions in West Africa using circulation type classifications in combination with further statistical approaches - Conceptual framework and first results

    Science.gov (United States)

    Bliefernicht, Jan; Laux, Patrik; Waongo, Moussa; Kunstmann, Harald

    2015-04-01

    Providing valuable forecasts of the seasonal precipitation amount for the upcoming rainy season is one of the big challenges for the national weather services in West Africa. Every year a harmonized forecast of the seasonal precipitation amount for the West African region is issued by the national weather services within the PRESAO framework. The PRESAO forecast is based on various statistical approaches, ranging from a simple subjective analog method based on the experience of a meteorological expert to objective regression-based approaches using various sources of input information such as predicted monsoon winds or observed sea surface temperature anomalies close to the West African coastline. The objective of this study is to evaluate these techniques for selected West African regions, to introduce classification techniques into the current operational practices, and to combine these approaches with further techniques for an additional refinement of the forecasting procedure. We use a fuzzy-rule-based technique for the classification of (sub-)monthly large-scale atmospheric and oceanic patterns, which is combined with further statistical approaches such as an analog method and a data depth approach for the prediction of (sub-)seasonal precipitation amounts and additional precipitation indices. The study regions stretch from the edges of the Sahel region in the north of Burkina Faso to the coastline of Ghana. A novel precipitation archive based on daily observations provided by the meteorological services of Burkina Faso and Ghana is the basis for the predictands and is used as reference for model evaluation. The performance of the approach is evaluated over a long period (e.g. 50 years) using cross-validation techniques and sophisticated verification measures for an evaluation of a probability forecast. The precipitation forecasts of the classification techniques are also compared to the techniques of the PRESAO community, the

  2. Ecosystem Service Valuation Assessments for Protected Area Management: A Case Study Comparing Methods Using Different Land Cover Classification and Valuation Approaches.

    Directory of Open Access Journals (Sweden)

    Charlotte E L Whitham

    Full Text Available Accurate and spatially-appropriate ecosystem service valuations are vital for decision-makers and land managers. Many approaches for estimating ecosystem service value (ESV) exist, but their appropriateness under specific conditions or logistical limitations is not uniform. The most accurate techniques are therefore not always adopted. Six different assessment approaches were used to estimate ESV for a National Nature Reserve in southwest China, across different management zones. These approaches incorporated two different land-use land cover (LULC) maps and development of three economic valuation techniques, using globally or locally-derived data. The differences in ESV across management zones for the six approaches were largely influenced by the classifications of forest and farmland and how they corresponded with valuation coefficients. With realistic limits on access to time, data, skills and resources, and using acquired estimates from globally-relevant sources, the Buffer zone was estimated as the most valuable (2.494 million ± 1.371 million CNY yr(-1) km(-2)) and the Non-protected zone as the least valuable (770,000 ± 4,600 CNY yr(-1) km(-2)). However, for both LULC maps, when using the locally-based and more time and skill-intensive valuation approaches, this pattern was generally reversed. This paper provides a detailed practical example of how ESV can differ widely depending on the availability and appropriateness of LULC maps and valuation approaches used, highlighting pitfalls for the managers of protected areas.

  3. Ecosystem Service Valuation Assessments for Protected Area Management: A Case Study Comparing Methods Using Different Land Cover Classification and Valuation Approaches.

    Science.gov (United States)

    Whitham, Charlotte E L; Shi, Kun; Riordan, Philip

    2015-01-01

    Accurate and spatially-appropriate ecosystem service valuations are vital for decision-makers and land managers. Many approaches for estimating ecosystem service value (ESV) exist, but their appropriateness under specific conditions or logistical limitations is not uniform. The most accurate techniques are therefore not always adopted. Six different assessment approaches were used to estimate ESV for a National Nature Reserve in southwest China, across different management zones. These approaches incorporated two different land-use land cover (LULC) maps and development of three economic valuation techniques, using globally or locally-derived data. The differences in ESV across management zones for the six approaches were largely influenced by the classifications of forest and farmland and how they corresponded with valuation coefficients. With realistic limits on access to time, data, skills and resources, and using acquired estimates from globally-relevant sources, the Buffer zone was estimated as the most valuable (2.494 million ± 1.371 million CNY yr(-1) km(-2)) and the Non-protected zone as the least valuable (770,000 ± 4,600 CNY yr(-1) km(-2)). However, for both LULC maps, when using the locally-based and more time and skill-intensive valuation approaches, this pattern was generally reversed. This paper provides a detailed practical example of how ESV can differ widely depending on the availability and appropriateness of LULC maps and valuation approaches used, highlighting pitfalls for the managers of protected areas.

  4. Integrated risk assessment for WFD ecological status classification applied to Llobregat river basin (Spain). Part I-Fuzzy approach to aggregate biological indicators.

    Science.gov (United States)

    Gottardo, S; Semenzin, E; Giove, S; Zabeo, A; Critto, A; de Zwart, D; Ginebreda, A; Marcomini, A

    2011-10-15

    Water Framework Directive (WFD) requirements and recommendations for Ecological Status (ES) classification of surface water bodies do not address all issues that Member States have to face in the implementation process, such as selection of appropriate stressor-specific environmental indicators, definition of class boundaries, aggregation of heterogeneous data and information and uncertainty evaluation. In this context the "One-Out, All-Out" (OOAO) principle is the suggested approach to lead the entire classification procedure and ensure conservative results. In order to support water managers in achieving a more comprehensive and realistic evaluation of ES, an Integrated Risk Assessment (IRA) methodology was developed. It is based on the Weight of Evidence approach and implements a Fuzzy Inference System in order to hierarchically aggregate a set of environmental indicators, which are grouped into five Lines of Evidence (i.e. Biology, Chemistry, Ecotoxicology, Physico-chemistry and Hydromorphology). The whole IRA methodology has been implemented as an individual module into a freeware GIS (Geographic Information System)-based Decision Support System (DSS), named MODELKEY DSS. The paper focuses on the conceptual and mathematical procedure underlying the evaluation of the most complex Line of Evidence, i.e. Biology, which identifies the biological communities that are potentially at risk and the stressors that are most likely responsible for the observed alterations. The results obtained from testing the procedure through application of the MODELKEY DSS to the Llobregat case study are reported and discussed.
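
    To illustrate why the paper contrasts the "One-Out, All-Out" principle with its fuzzy aggregation, the toy sketch below compares worst-case (OOAO) aggregation with a simple weighted aggregation standing in for the full Fuzzy Inference System. The scores, weights, and class boundaries are hypothetical, not values from the MODELKEY DSS.

```python
# Hypothetical status scores in [0, 1] for the five lines of evidence
scores = {"Biology": 0.75, "Chemistry": 0.60, "Ecotoxicology": 0.80,
          "Physico-chemistry": 0.55, "Hydromorphology": 0.70}

def wfd_class(x):
    # Map an aggregate score onto the five WFD status classes
    # (boundary values here are illustrative, not regulatory)
    bounds = [(0.8, "High"), (0.6, "Good"), (0.4, "Moderate"), (0.2, "Poor")]
    for b, name in bounds:
        if x >= b:
            return name
    return "Bad"

# One-Out, All-Out: the worst line of evidence drives the final class
ooao = min(scores.values())

# A simplistic stand-in for the fuzzy aggregation: a weighted combination
# that lets strong lines of evidence partially compensate weak ones
weights = {"Biology": 0.3, "Chemistry": 0.2, "Ecotoxicology": 0.2,
           "Physico-chemistry": 0.15, "Hydromorphology": 0.15}
fuzzy = sum(weights[k] * scores[k] for k in scores)

print(wfd_class(ooao), wfd_class(fuzzy))
```

    The contrast is the point: OOAO is deliberately conservative (the single worst indicator sets the class), while a graded aggregation can yield a higher, arguably more realistic, overall status for the same inputs.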

  5. The Ten-Group Robson Classification: A Single Centre Approach Identifying Strategies to Optimise Caesarean Section Rates

    Science.gov (United States)

    Tanaka, Keisuke

    2017-01-01

    Caesarean section (CS) rates have been increasing worldwide and have caused concerns. For meaningful comparisons to be made, the World Health Organization recommends the use of the Ten-Group Robson classification as the global standard for assessing CS rates. 2625 women who birthed over a 12-month period were analysed using this classification. Women with previous CS (group 5) comprised 10.9% of the overall 23.5% CS rate. Women with one previous CS who did not attempt VBAC contributed 5.3% of the overall 23.5% CS rate. The second largest contributor was singleton nulliparous women with cephalic presentation at term (5.1% of the total 23.5%). Induction of labour was associated with a higher CS rate (groups 1 and 3) (24.5% versus 11.9% and 6.2% versus 2.6%, resp.). For postdates IOL we recommend a gatekeeper booking system to minimise these being performed <41 weeks. We suggest setting up a dedicated VBAC clinic to support women with one previous CS. Furthermore, review of the definition of failure to progress in labour may not only lower CS rates in groups 1 and 2a but also reduce the size of group 5 in the future. PMID:28167965

  6. The Ten-Group Robson Classification: A Single Centre Approach Identifying Strategies to Optimise Caesarean Section Rates

    Directory of Open Access Journals (Sweden)

    Keisuke Tanaka

    2017-01-01

    Full Text Available Caesarean section (CS) rates have been increasing worldwide and have caused concerns. For meaningful comparisons to be made, the World Health Organization recommends the use of the Ten-Group Robson classification as the global standard for assessing CS rates. 2625 women who birthed over a 12-month period were analysed using this classification. Women with previous CS (group 5) comprised 10.9% of the overall 23.5% CS rate. Women with one previous CS who did not attempt VBAC contributed 5.3% of the overall 23.5% CS rate. The second largest contributor was singleton nulliparous women with cephalic presentation at term (5.1% of the total 23.5%). Induction of labour was associated with a higher CS rate (groups 1 and 3) (24.5% versus 11.9% and 6.2% versus 2.6%, resp.). For postdates IOL we recommend a gatekeeper booking system to minimise these being performed <41 weeks. We suggest setting up a dedicated VBAC clinic to support women with one previous CS. Furthermore, review of the definition of failure to progress in labour may not only lower CS rates in groups 1 and 2a but also reduce the size of group 5 in the future.

  7. Classification of glioblastoma and metastasis for neuropathology intraoperative diagnosis: a multi-resolution textural approach to model the background

    Science.gov (United States)

    Ahmad Fauzi, Mohammad Faizal; Gokozan, Hamza Numan; Elder, Brad; Puduvalli, Vinay K.; Otero, Jose J.; Gurcan, Metin N.

    2014-03-01

    Brain cancer surgery requires intraoperative consultation by neuropathology to guide surgical decisions regarding the extent to which the tumor undergoes gross total resection. In this context, the differential diagnosis between glioblastoma and metastatic cancer is challenging as the decision must be made during surgery in a short time-frame (typically 30 minutes). We propose a method to classify glioblastoma versus metastatic cancer based on extracting textural features from the non-nuclei region of cytologic preparations. For glioblastoma, these regions of interest are filled with glial processes between the nuclei, which appear as anisotropic thin linear structures. For metastasis, these regions correspond to a more homogeneous appearance, thus suitable texture features can be extracted from these regions to distinguish between the two tissue types. In our work, we use Discrete Wavelet Frames to characterize the underlying texture, due to their multi-resolution modeling capability. The textural characterization is carried out primarily in the non-nuclei regions, after nuclei regions are segmented by adapting our visually meaningful decomposition segmentation algorithm to this problem. A k-nearest neighbor method was then used to classify the features into the glioblastoma or metastasis class. An experiment on 53 images (29 glioblastomas and 24 metastases) resulted in average accuracies as high as 89.7% for glioblastoma, 87.5% for metastasis, and 88.7% overall. Further studies are underway to incorporate nuclei region features into classification on an expanded dataset, as well as expanding the classification to more types of cancers.

  8. Regimes of justification

    NARCIS (Netherlands)

    Arts, Irma; Buijs, A.E.; Verschoor, G.M.

    2017-01-01

    Legitimacy of environmental management and policies is an important topic in environmental research. Based on the notion of ‘regimes of justification’, we aim to analyse the dynamics in argumentations used to legitimize and de-legitimize Dutch nature conservation practices. Contrary to prior

  9. Agriculture classification using POLSAR data

    DEFF Research Database (Denmark)

    Skriver, Henning; Dall, Jørgen; Ferro-Famil, Laurent

    2005-01-01

    in the crop canopy, particularly between the response of the canopy itself and soil response. It is expected that PolInSAR data will add to the classification potential of POLSAR data by their sensitivity to the vertical distribution of scatterers. Different approaches have been used to classify SAR data...... content of the SAR data they attempt to generate robust, widely applicable methods, which are nonetheless capable of taking local conditions into account. In this paper a classification approach is presented, that uses a knowledge-based approach, where the crops are first classified into broad classes, i...... of the classification process is not as well established as the first part, and both a supervised approach and a knowledge-based approach have been evaluated. Both POLSAR and PolInSAR data may be included in the classification scheme. The classification approach has been evaluated using data from the Danish EMISAR...

  10. Transfer of the nationwide Czech soil survey data to a foreign soil classification - generating input parameters for a process-based soil erosion modelling approach

    Science.gov (United States)

    Beitlerová, Hana; Hieke, Falk; Žížala, Daniel; Kapička, Jiří; Keiser, Andreas; Schmidt, Jürgen; Schindewolf, Marcus

    2017-04-01

    Process-based erosion modelling is a developing and adequate tool to assess, simulate and understand the complex mechanisms of soil loss due to surface runoff. While currently available models include powerful approaches, a major drawback is their complex parametrization. A major input parameter for the physically based soil loss and deposition model EROSION 3D is soil texture. However, as the model was developed in Germany, it depends on the German soil classification. To exploit data generated during a massive nationwide soil survey campaign that took place in the 1960s across the entire Czech Republic, a transfer from the Czech to the German, or at least an international (e.g. WRB), system is mandatory. During the survey, the internal differentiation of grain sizes was realized with a two-fraction approach, separating texture solely into particles above and below 0.01 mm rather than into clayey, silty and sandy textures. Consequently, the Czech system applies a classification of seven different textures based on the respective percentages of large and small particles, while in Germany 31 groups are essential. The approach followed for matching Czech soil survey data to the German system focusses on semi-logarithmic interpolation of the cumulative soil texture curve and, additionally, on a regression equation based on a recent database of 128 soil pits. Furthermore, for each of the seven Czech texture classes a group of typically suitable classes of the German system was derived. A GIS-based spatial analysis to test approaches for interpolating soil texture was carried out. First results show promising matches and pave the way for a Czech application of EROSION 3D.
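
    The semi-logarithmic interpolation step can be sketched as follows: given a sparse cumulative grain-size curve (the Czech survey's two-fraction data plus boundary points), intermediate fractions such as clay (<0.002 mm) are read off by interpolating linearly in log diameter. All numbers below are hypothetical, and the 0.002/0.063 mm limits are the usual German-style class boundaries, not values from the paper.

```python
import numpy as np

# Hypothetical cumulative grain-size data for one survey profile:
# % of particles finer than each diameter (mm). The survey reports only
# two fractions, so the curve is very sparse.
diam = np.array([0.001, 0.01, 2.0])       # mm (assumed anchor diameters)
finer = np.array([10.0, 38.0, 100.0])     # cumulative % finer

# Semi-logarithmic interpolation: linear in log10(diameter)
def pct_finer(d):
    return np.interp(np.log10(d), np.log10(diam), finer)

# German-style texture boundaries: clay <0.002 mm, silt 0.002-0.063 mm
clay = pct_finer(0.002)
silt = pct_finer(0.063) - clay
sand = 100.0 - clay - silt
print(round(clay, 1), round(silt, 1), round(sand, 1))
```

    The estimated clay/silt/sand percentages can then be mapped onto one of the 31 German texture groups; the paper's regression equation from 128 soil pits would refine this naive curve-only estimate.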

  11. Using resistance and resilience concepts to reduce impacts of annual grasses and altered fire regimes on the sagebrush ecosystem and sage-grouse- A strategic multi-scale approach

    Science.gov (United States)

    Chambers, Jeanne C.; Pyke, David A.; Maestas, Jeremy D.; Boyd, Chad S.; Campbell, Steve; Espinosa, Shawn; Havlina, Doug; Mayer, Kenneth F.; Wuenschel, Amarina

    2014-01-01

    This Report provides a strategic approach for conservation of sagebrush ecosystems and Greater Sage- Grouse (sage-grouse) that focuses specifically on habitat threats caused by invasive annual grasses and altered fire regimes. It uses information on factors that influence (1) sagebrush ecosystem resilience to disturbance and resistance to invasive annual grasses and (2) distribution, relative abundance, and persistence of sage-grouse populations to develop management strategies at both landscape and site scales. A sage-grouse habitat matrix links relative resilience and resistance of sagebrush ecosystems with sage-grouse habitat requirements for landscape cover of sagebrush to help decision makers assess risks and determine appropriate management strategies at landscape scales. Focal areas for management are assessed by overlaying matrix components with sage-grouse Priority Areas for Conservation (PACs), breeding bird densities, and specific habitat threats. Decision tools are discussed for determining the suitability of focal areas for treatment and the most appropriate management treatments.

  12. Cluster analysis of multiple planetary flow regimes

    Science.gov (United States)

    Mo, Kingtse; Ghil, Michael

    1988-01-01

    A modified cluster analysis method developed for the classification of quasi-stationary events into a few planetary flow regimes and for the examination of transitions between these regimes is described. The method was applied first to a simple deterministic model and then to a 500-mbar data set for the Northern Hemisphere (NH), for which cluster analysis was carried out in the subspace of the first seven empirical orthogonal functions (EOFs). Stationary clusters were found in the low-frequency band of more than 10 days, while transient clusters were found in the band-pass frequency window between 2.5 and 6 days. In the low-frequency band, three pairs of clusters were determined by EOFs 1, 2, and 3, respectively; they exhibited well-known regional features, such as blocking, the Pacific/North American pattern, and wave trains. Both model and low-pass data exhibited strong bimodality.
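
    The procedure in record 12, cluster analysis in the subspace of the leading EOFs, can be sketched with PCA (whose components play the role of EOFs) followed by k-means standing in for the modified cluster-analysis method. The synthetic "flow maps" below are invented stand-ins for the 500-mbar height data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Synthetic stand-in for low-pass-filtered 500-mbar anomaly maps:
# 300 "days" on a 40-point grid, generated from 3 underlying regimes
centers = rng.normal(0, 1, (3, 40))
regime = rng.integers(0, 3, 300)
maps = centers[regime] + rng.normal(0, 0.3, (300, 40))

# PCA components play the role of the leading EOFs; keep the first 7
pcs = PCA(n_components=7).fit_transform(maps)

# Cluster the quasi-stationary states in the reduced EOF subspace
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pcs)
print("cluster sizes:", np.bincount(km.labels_))
```

    Working in the truncated EOF subspace discards small-scale noise before clustering, which is what makes regime structures detectable in the reduced space.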

  13. Decision making in double-pedicled DIEP and SIEA abdominal free flap breast reconstructions: An algorithmic approach and comprehensive classification.

    Directory of Open Access Journals (Sweden)

    Charles M Malata

    2015-10-01

    Full Text Available Introduction: The deep inferior epigastric artery perforator (DIEP) free flap is the gold standard for autologous breast reconstruction. However, using a single vascular pedicle may not yield sufficient tissue in patients with midline scars or insufficient lower abdominal pannus. Double-pedicled free flaps overcome this problem using different vascular arrangements to harvest the entire lower abdominal flap. The literature is, however, sparse regarding technique selection. We therefore reviewed our experience in order to formulate an algorithm and comprehensive classification for this purpose. Methods: All patients undergoing unilateral double-pedicled abdominal perforator free flap breast reconstruction (AFFBR) by a single surgeon (CMM) over 40 months were reviewed from a prospectively collected database. Results: Of the 112 consecutive breast free flaps performed, 25 (22%) utilised two vascular pedicles. The mean patient age was 45 years (range=27-54). All flaps but one (which used the thoracodorsal system) were anastomosed to the internal mammary vessels using the rib-preservation technique. The surgical duration was 656 minutes (range=468-690 mins). The median flap weight was 618g (range=432-1275g) and the mastectomy weight was 445g (range=220-896g). All flaps were successful and only three patients requested minor liposuction to reduce and reshape their reconstructed breasts. Conclusion: Bipedicled free abdominal perforator flaps, employed in a fifth of all our AFFBRs, are a reliable and safe option for unilateral breast reconstruction. They, however, necessitate clear indications to justify the additional technical complexity and surgical duration. Our algorithm and comprehensive classification facilitate technique selection for the anastomotic permutations and successful execution of these operations.

  14. A New Approach in Pressure Transient Analysis: Using Numerical Density Derivatives to Improve Diagnosis of Flow Regimes and Estimation of Reservoir Properties for Multiple Phase Flow

    Directory of Open Access Journals (Sweden)

    Victor Torkiowei Biu

    2015-01-01

    Full Text Available This paper presents the numerical density derivative approach (another phase of numerical well testing) in which each fluid's density around the wellbore is measured and used to generate a pressure equivalent for each phase using a simplified pressure-density correlation, as well as new statistical derivative methods to determine each fluid phase's permeability and the average effective permeability for the system with a new empirical model. Also, density-related radial flow equations for each fluid phase are derived, and a semilog specialised plot of density versus Horner time is used to estimate k relative to each phase. Results from two examples of oil and gas condensate reservoirs show that the derivatives of the fluid-phase pressure-density equivalents display the same wellbore and reservoir fingerprint as the conventional bottom-hole pressure (BPR) method. They also indicate that the average effective k_ave ranges between 43 and 57 mD for scenarios (a) to (d) in Example 1.0 and is 404 mD for scenarios (a) to (b) in Example 2.0 using the new fluid-phase empirical model for k estimation. This is within the k value used in the simulation model and likewise that estimated from the conventional BPR method. Results also showed that in all six scenarios investigated, the heavier fluid, such as water, and the weighted-average pressure-density equivalent of all fluids give the same effective k as the conventional BPR method. This approach provides an estimate of the possible fluid-phase permeabilities and the percentage of each phase's contribution to flow at a given point. Hence, at several dp' stabilisation points, the relative k can be generated.

  15. How many taxa can be recognized within the complex Tillandsia capillaris (Bromeliaceae, Tillandsioideae)? Analysis of the available classifications using a multivariate approach

    Directory of Open Access Journals (Sweden)

    Lucía Castello

    2013-05-01

    Full Text Available Tillandsia capillaris Ruiz & Pav., which belongs to the subgenus Diaphoranthema, is distributed in Ecuador, Peru, Bolivia, northern and central Argentina, and Chile, and includes forms that are difficult to circumscribe and are thus considered to form a complex. The entities of this complex are predominantly small-sized epiphytes adapted to xeric environments. The most widely used classification defines 5 forms for this complex based on a few morphological reproductive traits: T. capillaris Ruiz & Pav. f. capillaris, T. capillaris f. incana (Mez) L.B. Sm., T. capillaris f. cordobensis (Hieron.) L.B. Sm., T. capillaris f. hieronymi (Mez) L.B. Sm. and T. capillaris f. virescens (Ruiz & Pav.) L.B. Sm. In this study, 35 floral and vegetative characters were analyzed with a multivariate approach in order to assess and discuss different proposals for classification of the T. capillaris complex, which presents morphotypes that co-occur in central and northern Argentina. To accomplish this, data on quantitative and categorical morphological characters of flowers and leaves were collected from herbarium specimens and field collections and analyzed with statistical multivariate techniques. The results suggest that the most recent classification of the complex is the more comprehensive, and three taxa were delimited: T. capillaris (=T. capillaris f. incana-hieronymi), T. virescens s. str. (=T. capillaris f. cordobensis) and T. virescens s. l. (=T. capillaris f. virescens). While T. capillaris and T. virescens s. str. co-occur, T. virescens s. l. is restricted to altitudes above 2000 m in Argentina. Characters previously used for taxa delimitation showed continuous variation and were therefore not useful. New diagnostic characters are proposed and a key is provided for delimiting these three taxa within the complex.
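The record does not specify which multivariate techniques were applied to the 35 characters, but ordination of specimens-by-characters matrices is a typical first step. A minimal sketch of principal component scores via SVD (the function name and toy data are illustrative, not the paper's analysis):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project specimens (rows) x characters (cols) onto the first principal
    components after centering and unit-variance scaling each character."""
    X = np.asarray(X, float)
    Z = (X - X.mean(0)) / X.std(0, ddof=1)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:n_components].T
```

A scatter of the first two score columns would then show whether morphotypes separate into discrete clusters or vary continuously, as reported for the characters previously used in taxon delimitation.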

  16. Climate Regime Shifts and Streamflow Responses in the Merrimack Watershed, NH-MA

    Science.gov (United States)

    Berton, R.; Driscoll, C. T.; Chandler, D. G.

    2013-12-01

    Climate change has frequently been related to alterations of the hydrologic cycle, especially at sites with a winter snowpack and an annual snowmelt hydrograph. In the Northeast USA, changes in streamflow depend both on the advanced timing of melt, typical of sites with winter-dominated precipitation, and on increasing summer precipitation. In order to manage various demands for water, planners require robust metrics of change in flow quantity and timing for both wet and dry years. This study seeks appropriate metrics of hydrologic change at several sites with different stream orders and levels of development within the Merrimack Watershed in the Northeast USA. Here the term "regime" refers to variation in a parameter of interest; a regime change is expressed as a change in the statistical properties of the data, including the mean, standard deviation, and skewness. Examining long-term changes in a hydrological parameter without considering regime changes could result in over- or under-estimating trends, whereas evaluating trends within segments belonging to a single regime can be a more precise approach to studying changes in hydroclimatological parameters. The regime shift detection method developed by Rodionov (2004) is a sequential analysis that requires no prior assumptions about the timing of the shifts. The purpose of our research is to find regime shift points in hydroclimatological parameters at the study sites located within the Merrimack Watershed, NH-MA. Analysis of complete and partial annual streamflow records, by a combination of hydrologic flow classification (Genz and Luz, 2012) and regime shift point detection (Rodionov, 2004), provides insight into recent changes in streamflow regime. We seek to identify the correlation between regime shifts in climate indices and observed trends in hydrologic variables in the Merrimack Watershed. The Atlantic Multi-decadal Oscillation (AMO) and the North Atlantic Oscillation (NAO) are the two climate indices related to sea surface temperature
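Rodionov's (2004) sequential method tests whether incoming values depart significantly from the current regime mean. The sketch below is a much-simplified stand-in (a two-window mean-difference test, not the full STARS algorithm, which maintains regime means and a cumulative shift index); names and thresholds are illustrative:

```python
import numpy as np

def mean_shift_points(x, window=10, thresh=2.0):
    """Flag indices where the mean of the next `window` samples differs from the
    mean of the previous `window` by more than `thresh` pooled standard errors."""
    x = np.asarray(x, float)
    shifts = []
    for i in range(window, len(x) - window):
        a, b = x[i - window:i], x[i:i + window]
        se = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / window)
        if se > 0 and abs(b.mean() - a.mean()) / se > thresh:
            shifts.append(i)
    return shifts
```

Applied to an annual streamflow series, the flagged indices bound candidate regime segments within which trends can then be evaluated separately.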

  17. Mobile malware detection approach using ensemble classification

    Institute of Scientific and Technical Information of China (English)

    黄伟; 陈昊; 郭雅娟; 姜海涛

    2016-01-01

    Because it is difficult to judge how a single feature or a single data mining algorithm affects malware detection accuracy, this paper puts forward a mobile malware detection approach using ensemble techniques for the Android platform. The proposed approach uses static analysis to extract three kinds of features from a given mobile application: privilege features, component features and API call features. Several classification models are built for each kind of feature using several base classifiers, and a consensus function is designed for each feature to combine the decisions of the base classifiers into a single classification output for that feature. In the next step, another consensus function is applied to the outputs from all kinds of features to obtain the final classification. An empirical evaluation on mobile applications from real-world application markets shows that the multi-feature ensemble approach achieves better detection accuracy, in terms of F1 score, than a single data mining algorithm.
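The two-level consensus scheme this record describes (base classifiers combined per feature type, then fused across feature types) can be sketched with simple majority voting; the paper's actual consensus functions are not specified, so plain voting here is an illustrative assumption:

```python
import numpy as np

def consensus(votes):
    """Majority vote over base-classifier predictions (rows = classifiers,
    columns = samples); returns one label per sample."""
    votes = np.asarray(votes)
    return np.array([np.bincount(col).argmax() for col in votes.T])

def fuse_features(per_feature_preds):
    """Second-level vote across the per-feature consensus outputs."""
    return consensus(per_feature_preds)
```

For example, three base classifiers vote per feature type, then the three feature-level outputs (privilege, component, API call) are fused the same way.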

  18. Regimes Of Helium Burning

    CERN Document Server

    Timmes, F X

    2000-01-01

    The burning regimes encountered by laminar deflagrations and ZND detonations propagating through helium-rich compositions in the presence of buoyancy-driven turbulence are analyzed. Particular attention is given to models of X-ray bursts, which start with a thermonuclear runaway on the surface of a neutron star, and to the thin-shell helium instability of intermediate-mass stars. In the X-ray burst case, turbulent deflagrations propagating in the lateral or radial directions encounter a transition from the distributed regime to the flamelet regime at a density of 10^8 g cm^{-3}. In the radial direction, the purely laminar deflagration width is larger than the pressure scale height for densities smaller than 10^6 g cm^{-3}; self-sustained laminar deflagrations travelling in the radial direction cannot exist below this density. Similarly, the planar ZND detonation width becomes larger than the pressure scale height at 10^7 g cm^{-3}, suggesting that steady-state, self-sustained detonations cannot come into exista...

  19. Phase space and power spectral approaches for EEG-based automatic sleep-wake classification in humans: a comparative study using short and standard epoch lengths.

    Science.gov (United States)

    Brignol, Arnaud; Al-Ani, Tarik; Drouot, Xavier

    2013-03-01

    Sleep disorders in humans have become a public health issue in recent years. Sleep can be analysed by studying the electroencephalogram (EEG) recorded during a night's sleep. Alternation between sleep-wake stages gives information on sleep quality and quantity, since this alternating pattern is strongly affected by sleep disorders. The spectral composition of EEG signals varies across sleep stages, alternating phases of high energy associated with low frequencies (deep sleep) with periods of low energy associated with high frequencies (wake and light sleep). The analysis of sleep in humans is usually made on periods (epochs) of 30-s length according to the original Rechtschaffen and Kales sleep scoring manual. In this work, we propose a new phase-space algorithm (based mainly on the Poincaré plot) for automatic classification of sleep-wake states in humans using EEG data gathered over relatively short time periods. The effectiveness of our approach is demonstrated through a series of experiments involving EEG data from seven healthy adult female subjects, tested on epoch lengths ranging from 3-s to 30-s. The performance of our phase space approach was compared to a 2-dimensional state space approach using the power spectrum (PS) in two selected human-specific frequency bands. These powers were calculated by dividing integrated spectral amplitudes at selected human-specific frequency bands. The comparison demonstrated that the phase space approach gives better performance for short as well as standard 30-s epoch lengths.
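A Poincaré plot charts each sample of a signal against the next, (x[n], x[n+1]); its spread is commonly summarized by the descriptors SD1 (short-term variability, perpendicular to the identity line) and SD2 (long-term variability, along it). The record does not state which phase-space features were used, so the standard SD1/SD2 pair below is an illustrative assumption:

```python
import numpy as np

def poincare_descriptors(epoch):
    """SD1/SD2 of the lag-1 Poincaré plot (x[n], x[n+1]) for one EEG epoch."""
    x = np.asarray(epoch, float)
    d = np.diff(x)           # x[n+1] - x[n]
    s = x[1:] + x[:-1]       # x[n+1] + x[n]
    sd1 = np.std(d / np.sqrt(2), ddof=1)   # spread perpendicular to identity line
    sd2 = np.std(s / np.sqrt(2), ddof=1)   # spread along the identity line
    return sd1, sd2
```

Computed per epoch (3-s to 30-s windows, as in the study), such descriptors form a low-dimensional feature vector that a simple classifier can map to wake versus sleep.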

  20. Improve mask inspection capacity with Automatic Defect Classification (ADC)

    Science.gov (United States)

    Wang, Crystal; Ho, Steven; Guo, Eric; Wang, Kechang; Lakkapragada, Suresh; Yu, Jiao; Hu, Peter; Tolani, Vikram; Pang, Linyong

    2013-09-01

    As optical lithography continues to extend into the low-k1 regime, the resolution of mask patterns continues to diminish. The adoption of RET techniques such as aggressive OPC and sub-resolution assist features, combined with the requirement to detect ever smaller defects on masks due to increasing MEEF, poses considerable challenges for mask inspection operators and engineers. A comprehensive approach is therefore required for handling defects post-inspection: correctly identifying and classifying the real killer defects that impact printability on the wafer, while ignoring nuisance and false defects caused by the inspection systems. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at the SMIC mask shop for the 40nm technology node. Traditionally, each defect is manually examined and classified by the inspection operator based on a set of predefined rules and human judgment. At the SMIC mask shop, given the significant total number of detected defects, manual classification is not cost-effective: it increases inspection cycle time and thus constrains mask inspection capacity, since the review has to be performed while the mask stays on the inspection system. The Luminescent Technologies Automated Defect Classification (ADC) product offers a complete and systematic approach for defect disposition and classification offline, resulting in improved utilization of the current mask inspection capability. Based on results from the implementation of ADC in the SMIC mask production flow, there was around a 20% improvement in inspection capacity compared to the traditional flow. This approach of computationally reviewing defects post mask-inspection ensures no yield loss by qualifying reticles without the errors associated with operator mis-classification or human error. The ADC engine retrieves the high-resolution inspection images and uses a decision-tree flow to classify a given defect. Some identification mechanisms adopted by ADC to

  1. A reliable Raman-spectroscopy-based approach for diagnosis, classification and follow-up of B-cell acute lymphoblastic leukemia

    Science.gov (United States)

    Managò, Stefano; Valente, Carmen; Mirabelli, Peppino; Circolo, Diego; Basile, Filomena; Corda, Daniela; de Luca, Anna Chiara

    2016-04-01

    Acute lymphoblastic leukemia type B (B-ALL) is a neoplastic disorder that shows high mortality rates due to immature lymphocyte B-cell proliferation. B-ALL diagnosis requires identification and classification of the leukemia cells. Here, we demonstrate the use of Raman spectroscopy to discriminate normal lymphocytic B-cells from three different B-leukemia transformed cell lines (i.e., RS4;11, REH, MN60 cells) based on their biochemical features. In combination with immunofluorescence and Western blotting, we show that these Raman markers reflect the relative changes in the potential biological markers from cell surface antigens, cytoplasmic proteins, and DNA content and correlate with the lymphoblastic B-cell maturation/differentiation stages. Our study demonstrates the potential of this technique for classification of B-leukemia cells into the different differentiation/maturation stages, as well as for the identification of key biochemical changes under chemotherapeutic treatments. Finally, preliminary results from clinical samples indicate high consistency of, and potential applications for, this Raman spectroscopy approach.

  2. Classification Approach Based on Improved Belief Rule-Base Reasoning

    Institute of Scientific and Technical Information of China (English)

    叶青青; 杨隆浩; 傅仰耿; 陈晓聪

    2016-01-01

    This paper proposes a new classification approach based on improved belief rule-base reasoning by introducing a linear combination mode, setting the number of rules equal to the number of classes, and improving the calculation of individual matching degrees. Compared with the traditional belief rule-base inference methodology, the number of rules in the proposed method does not depend on the number of antecedent attributes or their candidate values; it is related only to the number of classes, which keeps the method applicable to complex problems. In the experiments, a differential evolution algorithm is used to learn the parameters, including rule weights, attribute weights, referential values of antecedent attributes and belief degrees, yielding an optimal parameter combination. Three commonly used public classification datasets were employed to validate the proposed method; the classification accuracy achieved is promising, showing that the new method is reasonable and effective.

  3. A review and classification of approaches for dealing with uncertainty in multi-criteria decision analysis for healthcare decisions.

    Science.gov (United States)

    Broekhuizen, Henk; Groothuis-Oudshoorn, Catharina G M; van Til, Janine A; Hummel, J Marjan; IJzerman, Maarten J

    2015-05-01

    Multi-criteria decision analysis (MCDA) is increasingly used to support decisions in healthcare involving multiple and conflicting criteria. Although uncertainty is usually carefully addressed in health economic evaluations, whether and how the different sources of uncertainty are dealt with and with what methods in MCDA is less known. The objective of this study is to review how uncertainty can be explicitly taken into account in MCDA and to discuss which approach may be appropriate for healthcare decision makers. A literature review was conducted in the Scopus and PubMed databases. Two reviewers independently categorized studies according to research areas, the type of MCDA used, and the approach used to quantify uncertainty. Selected full text articles were read for methodological details. The search strategy identified 569 studies. The five approaches most identified were fuzzy set theory (45% of studies), probabilistic sensitivity analysis (15%), deterministic sensitivity analysis (31%), Bayesian framework (6%), and grey theory (3%). A large number of papers considered the analytic hierarchy process in combination with fuzzy set theory (31%). Only 3% of studies were published in healthcare-related journals. In conclusion, our review identified five different approaches to take uncertainty into account in MCDA. The deterministic approach is most likely sufficient for most healthcare policy decisions because of its low complexity and straightforward implementation. However, more complex approaches may be needed when multiple sources of uncertainty must be considered simultaneously.
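The deterministic (one-way) sensitivity analysis the review favours can be sketched for a simple weighted-sum MCDA model: perturb one criterion weight, renormalize, and check whether the preferred alternative changes. The function names and example values below are illustrative, not drawn from any reviewed study:

```python
import numpy as np

def weighted_scores(perf, weights):
    """Weighted-sum MCDA score per alternative (rows) over criteria (cols);
    weights are normalized to sum to one."""
    w = np.asarray(weights, float)
    return np.asarray(perf, float) @ (w / w.sum())

def oneway_sensitivity(perf, weights, crit, factors=(0.5, 1.0, 1.5)):
    """Deterministic one-way sensitivity: scale one criterion weight and
    report the index of the best alternative under each scaling."""
    best = []
    for f in factors:
        w = np.array(weights, float)
        w[crit] *= f
        best.append(int(np.argmax(weighted_scores(perf, w))))
    return best
```

If the best alternative is stable across the tested scalings, the decision is insensitive to that weight; a flip identifies the threshold region worth closer scrutiny.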

  4. Enhancing China's Institutional Discourse Power: The Approach of International Regimes

    Institute of Scientific and Technical Information of China (English)

    杨庆龙

    2016-01-01

    Under the circumstances of the increasing institutionalization of the present international community, China should learn to enhance its institutional discourse power through international regimes. It should actively integrate into existing international mechanisms and, through gradual reform, change the uneven distribution of discourse power within them. At the same time, it should improve its ability to contribute ideas to these mechanisms and to set agendas within them. Moreover, as China's power and international status rise, it should, when appropriate, shift from the role of participant to that of founder or dominant player, actively initiating new international regimes so as to enhance its institutional discourse power.

  5. Concepts of Classification and Taxonomy. Phylogenetic Classification

    CERN Document Server

    Fraix-Burnet, Didier

    2016-01-01

    Phylogenetic approaches to classification have been heavily developed in biology by bioinformaticians. But these techniques have applications in other fields, in particular in linguistics. Their main characteristic is to search for relationships between the objects or species under study, instead of grouping them by similarity. They are thus rather well suited for any kind of evolutionary objects. For nearly fifteen years, astrocladistics has explored the use of Maximum Parsimony (or cladistics) for astronomical objects like galaxies or globular clusters. In this lesson we will learn how it works. 1 Why phylogenetic tools in astrophysics? 1.1 History of classification. The need for classifying living organisms is very ancient, and the first classification systems can be dated back to the Greeks. The goal was very practical, since it was intended to distinguish between edible and toxic aliments, or kind and dangerous animals. Simple resemblance was used, and has been used for centuries. Basically, until the XVIIIth...

  6. Cytotoxicity towards CCO cells of imidazolium ionic liquids with functionalized side chains: preliminary QSTR modeling using regression and classification based approaches.

    Science.gov (United States)

    Bubalo, Marina Cvjetko; Radošević, Kristina; Srček, Višnja Gaurina; Das, Rudra Narayan; Popelier, Paul; Roy, Kunal

    2015-02-01

    Within this work we evaluated the cytotoxicity towards the Channel Catfish Ovary (CCO) cell line of some imidazolium-based ionic liquids containing different functionalized and unsaturated side chains. The toxic effects were measured by the reduction of the WST-1 dye after 72 h exposure resulting in dose- and structure-dependent toxicities. The obtained data on cytotoxic effects of 14 different imidazolium ionic liquids in CCO cells, expressed as EC50 values, were used in a preliminary quantitative structure-toxicity relationship (QSTR) study employing regression- and classification-based approaches. The toxicity of ILs towards CCO was chiefly related to the shape and hydrophobicity parameters of cations. A significant influence of the quantum topological molecular similarity descriptor ellipticity (ε) of the imine bond was also observed.

  7. Extracting preseismic VLF-VHF electromagnetic signatures: A possible way in which the critical regime is reached as the earthquake approaches

    Science.gov (United States)

    Eftaxias, K.; Kapiris, P.; Karamanos, K.; Balasis, G.; Peratzakis, A.

    2005-12-01

    We view earthquakes (EQs) as large-scale fracture phenomena in the Earth's heterogeneous crust. Our main observational tool is the monitoring of the microfractures, which occur in the prefocal area before the final break-up, by recording their kHz-MHz electromagnetic (EM) emissions, with the MHz radiation appearing earlier than the kHz. Our model of the focal area consists of a backbone of strong and almost homogeneous large asperities that sustains the system, surrounded by a strongly heterogeneous medium. We distinguish two characteristic epochs in the evolution of precursory EM activity and identify them with the equivalent critical stages in the EQ preparation process. Our approach is framed in terms of critical phase transitions in statistical physics, drawing on recently published results. We obtain two major results. First, the initial MHz part of the preseismic EM emission, which has antipersistent behavior, is triggered by microfractures in the highly disordered system that surrounds the essentially homogeneous "backbone asperities" within the prefocal area, and could be described in analogy with a thermal continuous phase transition. However, the analysis reveals that the system is gradually driven out of equilibrium. Considerations of symmetry-breaking and the "intermittent dynamics of critical fluctuations" method estimate the time beyond which the process generating the preseismic EM emission could continue only as a nonequilibrium instability. Second, the abrupt emergence of strong kHz EM emission in the tail of the precursory radiation, showing strong persistent behavior, is thought to be due to the fracture of the high-strength "backbones" that sustain the system; the associated phase of EQ nucleation is a nonequilibrium process without any footprint of an equilibrium thermal phase transition. Physically, the appearance of persistent properties may indicate that the process acquires a self

  8. A least square support vector machine-based approach for contingency classification and ranking in a large power system

    Directory of Open Access Journals (Sweden)

    Bhanu Pratap Soni

    2016-12-01

    Full Text Available This paper proposes an effective supervised learning approach for static security assessment of a large power system. The approach employs a least squares support vector machine (LS-SVM) to rank contingencies and predict the system severity level. The severity of a contingency is measured by two scalar performance indices (PIs): the line MVA performance index (PIMVA) and the voltage-reactive power performance index (PIVQ). The method works in two steps: in Step I, both indices (PIMVA and PIVQ) are estimated under different operating scenarios; in Step II, contingency ranking is carried out based on the values of the PIs. The effectiveness of the proposed methodology is demonstrated on the IEEE 39-bus (New England) system. The approach can be a beneficial tool for fast and accurate security assessment and contingency analysis at the energy management center.
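Line-overload performance indices of the kind this record names commonly take the form PI_MVA = Σ_i (w_i/2n)·(S_i/S_i^max)^{2n}, where S_i is the post-contingency MVA flow on line i and S_i^max its rating; larger values flag more severe loading. A hedged numpy sketch (the exponent, weights, and exact index definition used by the paper are not given, so these are standard textbook defaults):

```python
import numpy as np

def pi_mva(s_flow, s_limit, n=2, w=None):
    """Line-overload performance index: sum_i (w_i / 2n) * (S_i / S_i^max)^(2n)."""
    s_flow, s_limit = np.asarray(s_flow, float), np.asarray(s_limit, float)
    w = np.ones_like(s_flow) if w is None else np.asarray(w, float)
    return float(np.sum(w / (2 * n) * (s_flow / s_limit) ** (2 * n)))
```

In the two-step scheme, an index like this (computed or LS-SVM-predicted per contingency) supplies the scalar on which contingencies are ranked.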

  9. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-09-07

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced as much as possible by knowing its classification response. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
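The quantity being maximized rests on the identity I(R;Y) = H(R) + H(Y) − H(R,Y). For discretized classifier responses it can be estimated with plug-in entropies, as sketched below; this illustrates the regularizer itself, not the paper's gradient-descent training loop or its particular entropy estimator:

```python
from collections import Counter
import numpy as np

def entropy(seq):
    """Plug-in entropy estimate (in nats) from empirical frequencies."""
    counts = np.array(list(Counter(seq).values()), float)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def mutual_information(responses, labels):
    """I(R;Y) = H(R) + H(Y) - H(R,Y) over discretized classifier responses."""
    pairs = list(zip(responses, labels))
    return entropy(responses) + entropy(labels) - entropy(pairs)
```

A classifier whose responses perfectly predict the labels attains I(R;Y) = H(Y); independent responses give zero, which is why the term acts as a useful regularizer alongside the classification loss.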

  10. An image classification approach to analyze the suppression of plant immunity by the human pathogen Salmonella Typhimurium

    Directory of Open Access Journals (Sweden)

    Schikora Marek

    2012-07-01

    Full Text Available Abstract Background The enteric pathogen Salmonella is the causative agent of the majority of food-borne bacterial poisonings. Recent research revealed that colonization of plants by Salmonella is an active infection process: Salmonella changes the metabolism of the plant host and suppresses its defense mechanisms. In this report we developed an automatic algorithm to quantify the symptoms caused by Salmonella infection on Arabidopsis. Results The algorithm is designed to attribute image pixels to one of two classes: healthy and unhealthy. The task is solved in three steps. First, we perform segmentation to divide the image into foreground and background. In the second step, a support vector machine (SVM) is applied to predict the class of each pixel belonging to the foreground. Finally, we refine the result with a neighborhood check in order to remove falsely classified pixels from the second step. The developed algorithm was tested on infection with the non-pathogenic E. coli and the plant pathogen Pseudomonas syringae, and used to study the interaction between plants and Salmonella wild type and T3SS mutants. We showed that T3SS mutants of Salmonella are unable to suppress the plant defenses. Results obtained through the automatic analyses were further verified at the biochemical and transcriptome levels. Conclusion This report presents an automatic pixel-based classification method for detecting "unhealthy" regions in leaf images. The proposed method was compared to an existing method and showed higher accuracy. We used this algorithm to study the impact of the human pathogenic bacterium Salmonella Typhimurium on the plant immune system. The comparison between wild-type bacteria and T3SS mutants showed similarity between the infection processes in animals and in plants. Plant epidemiology is only one possible application of the proposed algorithm; it can be easily extended to other detection tasks which also rely on color information, or
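The second and third steps of the pipeline (per-pixel classification of foreground, then a neighborhood check to suppress isolated misclassifications) can be sketched in numpy. Here a nearest-centroid rule in color space stands in for the paper's trained SVM, and the reference colors are invented for illustration:

```python
import numpy as np

def classify_pixels(img, healthy_ref, unhealthy_ref):
    """Step 2 stand-in: assign each pixel to the nearer class centroid in
    color space (the paper trains an SVM; this is a simplification)."""
    d_h = np.linalg.norm(img - healthy_ref, axis=-1)
    d_u = np.linalg.norm(img - unhealthy_ref, axis=-1)
    return (d_u < d_h).astype(int)          # 1 = unhealthy

def refine(labels):
    """Step 3: relabel each pixel to the majority class of its 3x3
    neighborhood, suppressing isolated misclassifications."""
    padded = np.pad(labels, 1, mode='edge')
    out = labels.copy()
    h, w = labels.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = 1 if padded[i:i+3, j:j+3].sum() > 4 else 0
    return out
```

A single stray "unhealthy" pixel surrounded by healthy neighbors is thus flipped back, which is the effect the neighborhood check is introduced to achieve.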

  11. A New Approach to Develop Computer-Aided Diagnosis Scheme of Breast Mass Classification Using Deep Learning Technology.

    Science.gov (United States)

    Qiu, Yuchen; Yan, Shiju; Gundreddy, Rohith Reddy; Wang, Yunzhi; Cheng, Samuel; Liu, Hong; Zheng, Bin

    2017-04-18

    To develop and test a deep-learning-based computer-aided diagnosis (CAD) scheme of mammograms for classifying between malignant and benign masses, an image dataset involving 560 regions of interest (ROIs) extracted from digital mammograms was used. After down-sampling each ROI from 512×512 to 64×64 pixels, we applied an 8-layer deep learning network that involves