WorldWideScience

Sample records for regime classification approach

  1. A climatology of low level wind regimes over Central America using a weather type classification approach.

    Directory of Open Access Journals (Sweden)

    Fernán Sáenz

    2015-04-01

    Full Text Available Based on the potential of the weather-type classification method to study synoptic features, this study proposes the application of such a methodology for the identification of the main large-scale patterns related to weather in Central America. Using ERA-Interim low-level winds in a domain that encompasses the Intra-Americas Sea, the eastern tropical Pacific, southern North America, Central America and northern South America, the K-means clustering algorithm was applied to find recurrent regimes of low-level winds. Eleven regimes were identified, and good coherency was found between the results and known features of the regional circulation. It was determined that the main large-scale patterns can be either locally forced or a response to tropical-extratropical interactions. Moreover, local forcing dominates the summer regimes whereas mid-latitude interactions lead the winter regimes. The study of the relationship between the large-scale patterns and regional precipitation shows that winter regimes are related to the Caribbean-Pacific precipitation seesaw. Summer regimes, on the other hand, enhance the contrasting Caribbean-Pacific precipitation distribution as a function of the dominant regime. A strong influence of ENSO on the frequency and duration of the regimes was found. It was determined that the specific effect of ENSO on the regimes depends on whether the circulation is locally forced or led by the interaction between the tropics and the mid-latitudes. The study of cold surges using the information from the identified regimes revealed that three regimes are linked with the occurrence of cold surges that affect Central America and its precipitation. As the winter regimes depend largely on the mid-latitude interaction with the tropics, the effect that ENSO has on the jet stream is reflected in the winter regimes. An automated analysis of large-scale conditions based on reanalysis and/or model data seems useful for both dynamical
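
    The record above names the method (K-means clustering of reanalysis low-level wind fields into recurrent regimes) but gives no implementation details. The sketch below shows the general shape of such a regime search; the array sizes, the pressure level, the standardisation step, the use of scikit-learn and the placeholder random data are assumptions for illustration, not the authors' code.

```python
# Minimal sketch: cluster daily low-level wind fields into recurrent regimes
# with K-means, as described in the abstract. Data loading is mocked here.
import numpy as np
from sklearn.cluster import KMeans

# Assumed shape: (n_days, n_lat, n_lon) for each wind component
# (e.g. reanalysis u/v at a single low level; the level is an assumed example)
n_days, n_lat, n_lon = 3000, 20, 30
u = np.random.randn(n_days, n_lat, n_lon)   # placeholder for reanalysis u-wind
v = np.random.randn(n_days, n_lat, n_lon)   # placeholder for reanalysis v-wind

# Flatten each daily field into one feature vector (u and v concatenated)
X = np.hstack([u.reshape(n_days, -1), v.reshape(n_days, -1)])

# Standardise gridpoint-wise so no single region dominates the distance metric
X = (X - X.mean(axis=0)) / X.std(axis=0)

# Cluster into 11 regimes (the number reported in the abstract)
km = KMeans(n_clusters=11, n_init=10, random_state=0).fit(X)
regime_of_day = km.labels_              # regime index for every day
regime_centroids = km.cluster_centers_  # composite wind patterns (flattened)

print(np.bincount(regime_of_day))       # how often each regime occurs
```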

  2. A Time Series Regime Classification Approach for Short-Term Forecasting; Identificacion de Mecanismos en Series Temporales para la Prediccion a Corto Plazo

    Energy Technology Data Exchange (ETDEWEB)

    Gallego, C. J.

    2010-03-08

    Abstract: This technical report focuses on the analysis of stochastic processes that switch between different dynamics (also called regimes or mechanisms) over time. The so-called switching-regime models consider several underlying functions instead of one. In this case, a classification problem arises, as the current regime has to be assessed at each time step. Identifying the regimes allows regime-switching models to be used for short-term forecasting purposes. Within this framework, identifying the different regimes shown by a time series is the aim of this work. The proposed approach is based on a statistical tool called the Gamma test. One of the main advantages of this methodology is that no mathematical definition of the different underlying functions is required. Applications with both simulated and real wind power data have been considered. Results on simulated time series show that regimes can be successfully identified under certain hypotheses. Nevertheless, this work highlights that further research has to be done when considering real wind power time series, which usually show different behaviours (e.g. fluctuations or ramps, followed by low-variance periods). A better understanding of these events will eventually improve wind power forecasting. (Author) 15 refs.
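
    The Gamma test mentioned in this abstract estimates the variance of the noise on an input-output relationship directly from nearest-neighbour statistics, without assuming a functional form for the underlying dynamics. A minimal sketch follows; the neighbour count p, the toy lag-embedded series and the use of scikit-learn's NearestNeighbors are illustrative assumptions, not the report's actual setup.

```python
# Minimal sketch of the Gamma test: the regression intercept estimates the
# variance of the noise on the input-output relationship model-free.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def gamma_test(X, y, p=10):
    """Return (gamma, slope): intercept ~ noise variance, slope ~ model complexity."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    nn = NearestNeighbors(n_neighbors=p + 1).fit(X)
    dist, idx = nn.kneighbors(X)            # idx[:, 0] is the point itself
    deltas, gammas = [], []
    for k in range(1, p + 1):
        deltas.append(np.mean(dist[:, k] ** 2))
        gammas.append(0.5 * np.mean((y[idx[:, k]] - y) ** 2))
    slope, intercept = np.polyfit(deltas, gammas, 1)
    return intercept, slope

# Toy usage: lagged values of a noisy series as inputs, next value as output
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 60, 2000)) + 0.1 * rng.standard_normal(2000)
X = np.column_stack([series[i:-(4 - i)] for i in range(4)])   # 4 lags
y = series[4:]
print(gamma_test(X, y))   # intercept close to the injected noise variance (~0.01)
```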

  3. Synergies between nonproliferation regimes: A pragmatic approach

    International Nuclear Information System (INIS)

    Findlay, Trevor; Meier, Oliver

    2001-01-01

    Full text: With the recent progress in establishing international nonproliferation regimes, the question of synergies between different verification and monitoring regimes is becoming more acute. Three multilateral and universal nonproliferation organisations covering safeguards on civil nuclear materials, nuclear testing, and chemical weapons are up and running. A regime on biological weapons is under negotiation. Several regional organisations concerned with monitoring nonproliferation commitments in the nuclear field are in place; others are being established. Past discussions on synergies between these regimes have suffered from being too far-reaching. These discussions often have not adequately reflected the political difficulties of cooperation between regimes with different membership, scope and institutional set-up. This paper takes a pragmatic look at exploiting synergies and identifies some potential and real overlaps in the work of different verification regimes. It argues for a bottom-up approach and identifies building blocks for collaboration between verification regimes. By realising such more limited potential for cooperation, the ground could be prepared for exploiting further synergies between these regimes. (author)

  4. Automatic Classification of Offshore Wind Regimes With Weather Radar Observations

    DEFF Research Database (Denmark)

    Trombe, Pierre-Julien; Pinson, Pierre; Madsen, Henrik

    2014-01-01

    Weather radar observations are called to play an important role in offshore wind energy. In particular, they can enable the monitoring of weather conditions in the vicinity of large-scale offshore wind farms and thereby notify the arrival of precipitation systems associated with severe wind...... and amplitude) using reflectivity observations from a single weather radar system. A categorical sequence of most likely wind regimes is estimated from a wind speed time series by combining a Markov-Switching model and a global decoding technique, the Viterbi algorithm. In parallel, attributes of precipitation...... systems are extracted from weather radar images. These attributes describe the global intensity, spatial continuity and motion of precipitation echoes on the images. Finally, a CART classification tree is used to find the broad relationships between precipitation attributes and wind regimes...
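
    The decoding step described above (estimating a categorical sequence of most likely wind regimes with a Markov-switching model and the Viterbi algorithm) can be sketched generically. The Gaussian emission model, the two-regime toy parameters and the example wind speeds below are assumptions; only the Viterbi recursion itself is standard.

```python
# Minimal sketch: global decoding of a regime sequence with the Viterbi algorithm,
# assuming a fitted Markov-switching model with Gaussian emissions per regime.
# The transition matrix and emission parameters below are illustrative placeholders.
import numpy as np
from scipy.stats import norm

def viterbi(obs, pi, A, means, stds):
    """Most likely regime path for 1-D observations under Gaussian emissions."""
    n, K = len(obs), len(pi)
    logB = norm.logpdf(obs[:, None], loc=means, scale=stds)  # (n, K) emission log-likelihoods
    logA = np.log(A)
    delta = np.log(pi) + logB[0]
    psi = np.zeros((n, K), dtype=int)
    for t in range(1, n):
        scores = delta[:, None] + logA                       # (K, K): from-state x to-state
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logB[t]
    path = np.empty(n, dtype=int)
    path[-1] = delta.argmax()
    for t in range(n - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path

# Toy wind-speed series and a 2-regime model (calm vs. windy); numbers are made up
obs = np.array([3.1, 2.8, 3.5, 9.2, 10.4, 9.8, 3.0, 2.9])
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
print(viterbi(obs, pi, A, means=np.array([3.0, 10.0]), stds=np.array([1.0, 1.5])))
```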

  5. A New Classification Approach Based on Multiple Classification Rules

    OpenAIRE

    Zhongmei Zhou

    2014-01-01

    A good classifier can correctly predict new data for which the class label is unknown, so it is important to construct a high-accuracy classifier. Hence, classification techniques are very useful in ubiquitous computing. Associative classification achieves higher classification accuracy than some traditional rule-based classification approaches. However, the approach also has two major deficiencies. First, it generates a very large number of association classification rules, especially when t...

  6. De Facto Exchange Rate Regime Classifications Are Better Than You Think

    OpenAIRE

    Michael Bleaney; Mo Tian; Lin Yin

    2015-01-01

    Several de facto exchange rate regime classifications have been widely used in empirical research, but they are known to disagree with one another to a disturbing extent. We dissect the algorithms employed and argue that they can be significantly improved. We implement the improvements, and show that there is a far higher agreement rate between the modified classifications. We conclude that the current pessimism about de facto exchange rate regime classification schemes is unwarranted.

  7. Flow Regime Identification of Co-Current Downward Two-Phase Flow With Neural Network Approach

    International Nuclear Information System (INIS)

    Hiroshi Goda; Seungjin Kim; Ye Mi; Finch, Joshua P.; Mamoru Ishii; Jennifer Uhle

    2002-01-01

    Flow regime identification for an adiabatic vertical co-current downward air-water two-phase flow in 25.4 mm ID and 50.8 mm ID round tubes was performed by employing an impedance void meter coupled with the neural network classification approach. This approach minimizes subjective judgment in determining the flow regimes. The signals obtained by the impedance void meter were used to train the self-organizing neural network to categorize the impedance signals into a certain number of groups. The characteristic parameters fed into the neural network classification were the mean, standard deviation and skewness of the impedance signals in the present experiment. The classification categories adopted in the present investigation were four widely accepted flow regimes, viz. bubbly, slug, churn-turbulent, and annular flows. These four flow regimes were recognized based upon the conventional flow visualization approach using a high-speed motion analyzer. The resulting flow regime maps classified by the neural network were compared with the results obtained through the flow visualization method, and consequently the efficiency of the neural network classification for flow regime identification was demonstrated. (authors)
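
    A rough sketch of the feature step described above (mean, standard deviation and skewness of impedance signals feeding an unsupervised grouping into four regimes) is given below. The paper uses a self-organizing neural network; K-means is used here only as a simple stand-in for that unsupervised step, and the signal shapes are placeholders.

```python
# Minimal sketch: statistical features of impedance void-meter signals
# (mean, standard deviation, skewness) grouped into candidate flow regimes.
# scikit-learn KMeans stands in for the self-organizing network of the paper.
import numpy as np
from scipy.stats import skew
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Placeholder: 200 impedance signal windows of 1024 samples each
signals = rng.standard_normal((200, 1024))

features = np.column_stack([
    signals.mean(axis=1),
    signals.std(axis=1),
    skew(signals, axis=1),
])

# Four groups, matching the four flow regimes in the abstract
# (bubbly, slug, churn-turbulent, annular)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(labels))
```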

  8. Buried penis: classification and surgical approach.

    Science.gov (United States)

    Hadidi, Ahmed T

    2014-02-01

    The purpose of this study was to describe a morphological classification of congenital buried penis (BP) and present a versatile surgical approach for correction. Sixty-one patients referred with BP were classified into 3 grades according to morphological findings: Grade I (29 patients) with a Longer Inner Prepuce (LIP) only; Grade II (20 patients) who presented with LIP associated with an indrawn penis that required division of the fundiform and suspensory ligaments; and Grade III (12 patients) who had, in addition to the above, excess supra-pubic fat. A ventral midline penile incision extending from the tip of the prepuce down to the penoscrotal junction was used in all patients. The operation was tailored according to the BP grade. All patients underwent circumcision. Mean follow-up was 3 years (range 1 to 10). All 61 patients had an abnormally long inner prepuce (LIP). Forty-seven patients had a short penile shaft. Early improvement was noted in all cases. Satisfactory results were achieved in all 29 patients in Grade I and in 27 patients in Grades II and III. Five children (Grades II and III) required further surgery (9%). Congenital buried penis is a spectrum characterized by LIP and may include, in addition, a short penile shaft, abnormal attachment of the fundiform and suspensory ligaments, and excess supra-pubic fat. Congenital Mega Prepuce (CMP) is a variant of Grade I BP, with LIP characterized by intermittent ballooning of the genital area. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Flow regime classification in air-magnetic fluid two-phase flow.

    Science.gov (United States)

    Kuwahara, T; De Vuyst, F; Yamaguchi, H

    2008-05-21

    A new experimental/numerical technique for the classification of flow regimes (flow patterns) in air-magnetic fluid two-phase flow is proposed in the present paper. The proposed technique utilizes electromagnetic induction to obtain time-series signals of the electromotive force, allowing us to make a non-contact measurement. Firstly, an experiment is carried out to obtain the time-series signals in a vertical upward air-magnetic fluid two-phase flow. The signals obtained are first treated using two kinds of wavelet transforms. The treated data sets are then used as input vectors for an artificial neural network (ANN) with supervised training. In the present study, flow regimes are classified into bubbly, slug, churn and annular flows, which are generally the main flow regimes. To validate the flow regimes, a visualization experiment is also performed with a glycerin solution that has roughly the same physical properties, i.e., kinematic viscosity and surface tension, as the magnetic fluid used in the present study. The flow regimes from the visualization are used as targets for the ANN and also in the estimation of the accuracy of the present method. As a result, ANNs using radial basis functions are shown to be the most appropriate for the present classification of flow regimes, leading to small classification errors.

  10. Waste classification: a management approach

    International Nuclear Information System (INIS)

    Wickham, L.E.

    1984-01-01

    A waste classification system designed to quantify the total hazard of a waste has been developed by the Low-Level Waste Management Program. As originally conceived, the system was designed to deal with mixed radioactive waste. The methodology has been developed and successfully applied to radiological and chemical wastes, both individually and mixed together. Management options to help evaluate the financial and safety trade-offs between waste segregation, waste treatment, container types, and site factors are described. Using the system provides a very simple and cost effective way of making quick assessments of a site's capabilities to contain waste materials. 3 references

  11. The decision tree approach to classification

    Science.gov (United States)

    Wu, C.; Landgrebe, D. A.; Swain, P. H.

    1975-01-01

    A class of multistage decision tree classifiers is proposed and studied relative to the classification of multispectral remotely sensed data. The decision tree classifiers are shown to have the potential for improving both the classification accuracy and the computation efficiency. Dimensionality in pattern recognition is discussed and two theorems on the lower bound of logic computation for multiclass classification are derived. The automatic or optimization approach is emphasized. Experimental results on real data are reported, which clearly demonstrate the usefulness of decision tree classifiers.
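
    A minimal illustration of a decision tree classifying multispectral pixels is sketched below. The paper studies multistage decision tree classifiers and the optimization of their structure; the single scikit-learn tree and the synthetic bands here are illustrative assumptions only.

```python
# Minimal sketch: a decision tree classifying multispectral pixels into land-cover
# classes. The multistage classifiers of the paper are more elaborate; this single
# sklearn tree only illustrates the basic idea on synthetic data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_pixels, n_bands = 3000, 4
X = rng.normal(size=(n_pixels, n_bands))
# Synthetic labels: class depends on two band combinations plus noise
y = (X[:, 0] + 0.5 * X[:, 1] + 0.2 * rng.standard_normal(n_pixels) > 0).astype(int) \
    + (X[:, 2] - X[:, 3] > 1).astype(int)      # yields classes 0, 1, 2

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, tree.predict(X_te)))
print("tree depth:", tree.get_depth())
```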

  12. An ordinal classification approach for CTG categorization.

    Science.gov (United States)

    Georgoulas, George; Karvelis, Petros; Gavrilis, Dimitris; Stylios, Chrysostomos D; Nikolakopoulos, George

    2017-07-01

    Evaluation of the cardiotocogram (CTG) is a standard approach employed during pregnancy and delivery. However, its interpretation requires high-level expertise to decide whether the recording is Normal, Suspicious or Pathological. Therefore, a number of attempts have been made over the past three decades to develop automated, sophisticated systems. These systems are usually (multiclass) classification systems that assign a category to the respective CTG. However, most of these systems do not take into consideration the natural ordering of the categories associated with CTG recordings. In this work, an algorithm that explicitly takes into consideration the ordering of CTG categories, based on a binary decomposition method, is investigated. The results achieved, using the C4.5 decision tree as the base classifier, show that the ordinal classification approach is marginally better than the traditional multiclass classification approach, which utilizes the standard C4.5 algorithm, for several performance criteria.
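
    The binary decomposition idea for ordinal CTG categories (Normal < Suspicious < Pathological) can be sketched as follows. The paper uses C4.5 as the base classifier; the scikit-learn CART tree, the synthetic features and the specific decomposition used here (one classifier per threshold, predicting P(y > k), Frank-and-Hall style) are assumptions for illustration.

```python
# Minimal sketch of ordinal classification by binary decomposition for the
# ordered CTG categories Normal < Suspicious < Pathological. A CART tree stands
# in for C4.5, and the synthetic features are purely illustrative.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

classes = ["Normal", "Suspicious", "Pathological"]   # natural ordering 0 < 1 < 2

def fit_ordinal(X, y, K):
    """Train K-1 binary classifiers, one for each threshold P(y > k)."""
    return [DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, (y > k).astype(int))
            for k in range(K - 1)]

def predict_ordinal(models, X, K):
    # probability that the label exceeds each threshold
    p_gt = np.column_stack([m.predict_proba(X)[:, list(m.classes_).index(1)]
                            if 1 in m.classes_ else np.zeros(len(X)) for m in models])
    probs = np.empty((len(X), K))
    probs[:, 0] = 1.0 - p_gt[:, 0]
    for k in range(1, K - 1):
        probs[:, k] = p_gt[:, k - 1] - p_gt[:, k]
    probs[:, K - 1] = p_gt[:, K - 2]
    return probs.argmax(axis=1)

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 8))                        # placeholder CTG features
y = np.clip((X[:, 0] + X[:, 1] > 0.5).astype(int) + (X[:, 0] > 1.5).astype(int), 0, 2)
models = fit_ordinal(X, y, K=3)
print([classes[c] for c in predict_ordinal(models, X[:5], K=3)])
```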

  13. Music Genre Classification Systems - A Computational Approach

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2006-01-01

    Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought...... that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast...... to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should...

  14. Multivariate Approaches to Classification in Extragalactic Astronomy

    Directory of Open Access Journals (Sweden)

    Didier Fraix-Burnet

    2015-08-01

    Full Text Available Clustering objects into synthetic groups is a natural activity of any science. Astrophysics is not an exception and is now facing a deluge of data. For galaxies, the century-old Hubble classification and the Hubble tuning fork are still largely in use, together with numerous mono- or bivariate classifications most often made by eye. However, a classification must be driven by the data, and sophisticated multivariate statistical tools are used more and more often. In this paper we review these different approaches in order to situate them in the general context of unsupervised and supervised learning. We emphasize the astrophysical outcomes of these studies to show that multivariate analyses provide an obvious path toward a renewal of our classification of galaxies and are invaluable tools to investigate the physics and evolution of galaxies.

  15. Biosensor approach to psychopathology classification.

    Directory of Open Access Journals (Sweden)

    Misha Koshelev

    2010-10-01

    Full Text Available We used a multi-round, two-party exchange game in which a healthy subject played a subject diagnosed with a DSM-IV (Diagnostic and Statistical Manual-IV) disorder, and applied a Bayesian clustering approach to the behavior exhibited by the healthy subject. The goal was to characterize quantitatively the style of play elicited in the healthy subject (the proposer) by their DSM-diagnosed partner (the responder). The approach exploits the dynamics of the behavior elicited in the healthy proposer as a biosensor for cognitive features that characterize the psychopathology group on the other side of the interaction. Using a large cohort of subjects (n = 574), we found statistically significant clustering of proposers' behavior overlapping with a range of DSM-IV disorders including autism spectrum disorder, borderline personality disorder, attention deficit hyperactivity disorder, and major depressive disorder. To further validate these results, we developed a computer agent to replace the human subject in the proposer role (the biosensor) and show that it can also detect these same four DSM-defined disorders. These results suggest that the highly developed social sensitivities that humans bring to a two-party social exchange can be exploited and automated to detect important psychopathologies, using an interpersonal behavioral probe not directly related to the defining diagnostic criteria.

  16. A statistical approach to root system classification.

    Directory of Open Access Journals (Sweden)

    Gernot Bodner

    2013-08-01

    Full Text Available Plant root systems have a key role in ecology and agronomy. In spite of the fast increase in root studies, there is still no classification that allows distinguishing among distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for plant functional type identification in ecology can be applied to the classification of root systems. We demonstrate that combining principal component and cluster analysis yields a meaningful classification of rooting types based on morphological traits. The classification method presented is based on a data-defined statistical procedure without a priori decisions on the classifiers. Biplot inspection is used to determine key traits and to ensure stability in cluster-based grouping. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture the most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. The adequacy of commonly available morphological traits for classification is supported by the field data. Three rooting types emerged from the measured data, distinguished by diameter/weight, density and spatial distribution, respectively. The similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We concluded that the data-defined classification is appropriate for the integration of knowledge obtained with different root measurement methods and at various scales. Currently, root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture details of root diversity, efforts in architectural measurement
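
    The statistical pipeline named in the abstract (principal component analysis of root morphological traits followed by cluster analysis into rooting types) is sketched below. The trait names, the number of components and clusters, and the synthetic data are assumptions; the study's exact procedure and its biplot-based trait selection are not reproduced.

```python
# Minimal sketch: PCA of root morphological traits followed by hierarchical
# (Ward) clustering into rooting types. Trait names and values are placeholders.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
traits = ["root_diameter", "root_weight", "root_length_density",
          "rooting_depth", "branching_ratio"]
X = rng.normal(size=(60, len(traits)))              # 60 plants x 5 traits (synthetic)

X_std = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(X_std)   # axes a biplot would be drawn on

Z = linkage(scores, method="ward")                  # cluster in PC space
rooting_type = fcluster(Z, t=3, criterion="maxclust")   # three rooting types, as in the field data
print(np.bincount(rooting_type)[1:])
```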

  17. Exploring different approaches for music genre classification

    Directory of Open Access Journals (Sweden)

    Antonio Jose Homsi Goulart

    2012-07-01

    Full Text Available In this letter, we present different approaches for music genre classification. The proposed techniques, which are composed of a feature extraction stage followed by a classification procedure, explore both the variations of parameters used as input and the classifier architecture. Tests were carried out with three styles of music, namely blues, classical, and lounge, which are considered informally by some musicians as being “big dividers” among music genres, showing the efficacy of the proposed algorithms and establishing a relationship between the relevance of each set of parameters for each music style and each classifier. In contrast to other works, entropies and fractal dimensions are the features adopted for the classifications.

  18. Decision Making under Ecological Regime Shift: An Experimental Economic Approach

    OpenAIRE

    Kawata, Yukichika

    2011-01-01

    Environmental economics postulates the assumption of homo economicus and presumes that externality occurs as a result of the rational economic activities of economic agents. This paper examines this assumption using an experimental economic approach in the context of regime shift, which has been receiving increasing attention. We observe that when externality does not exist, economic agents (subjects of the experiment) act economically rationally, but when externality exists, economic agents avoi...

  19. Flow Regime Classification and Hydrological Characterization: A Case Study of Ethiopian Rivers

    Directory of Open Access Journals (Sweden)

    Belete Berhanu

    2015-06-01

    Full Text Available The spatiotemporal variability of stream flow due to the complex interaction of catchment attributes and rainfall induces complexity in hydrology. Researchers have been trying to address this complexity with a number of approaches; the river flow regime is one of them. The flow regime can be quantified by means of hydrological indices characterizing five components: magnitude, frequency, duration, timing, and rate of change of flow. Accordingly, this study aimed to understand the flow variability of Ethiopian rivers using the observed daily flow data from 208 gauging stations in the country. In this process, the hierarchical Ward clustering method was implemented to group the streams into three flow regimes: (1) ephemeral, (2) intermittent, and (3) perennial. Principal component analysis (PCA) is also applied as a second multivariate analysis tool to identify the dominant hydrological indices that cause the variability in the streams. The mean flow per unit catchment area (QmAR) and the base flow index (BFI) show an increasing trend from ephemeral to intermittent to perennial streams, whereas the mean zero-flow days ratio (ZFI) and the coefficient of variation (CV) show a decreasing trend from ephemeral to perennial flow regimes. Finally, the streams in the three flow regimes were characterized by the mean and standard deviation of the hydrological variables and the shape, slope, and scale of the flow duration curve. The results of this study are the basis for further understanding of the ecohydrological processes of the river basins in Ethiopia.
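
    A compressed sketch of the workflow (hydrological indices computed from daily flows, then hierarchical Ward clustering into ephemeral, intermittent and perennial groups) follows. The simplified index formulas, especially the base-flow proxy, and the synthetic flow series are assumptions, not the study's definitions.

```python
# Minimal sketch: compute a few indices named in the abstract (zero-flow days
# ratio ZFI, coefficient of variation CV, a crude base-flow proxy) from daily
# flows and group stations with Ward clustering into three flow regimes.
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
n_stations, n_days = 50, 3650
flows = rng.gamma(shape=0.8, scale=5.0, size=(n_stations, n_days))
flows[rng.random((n_stations, n_days)) < 0.2] = 0.0      # inject zero-flow days

def indices(q):
    q = pd.Series(q)
    zfi = (q == 0).mean()                                 # zero-flow days ratio
    cv = q.std() / q.mean()                               # coefficient of variation
    bfi = q.rolling(7, min_periods=1).min().mean() / q.mean()  # crude base-flow proxy
    return [zfi, cv, bfi]

X = np.array([indices(q) for q in flows])
Z = linkage((X - X.mean(0)) / X.std(0), method="ward")
regime = fcluster(Z, t=3, criterion="maxclust")            # group labels 1..3 are arbitrary
print(np.bincount(regime)[1:])
```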

  20. AUTOMATIC APPROACH TO VHR SATELLITE IMAGE CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    P. Kupidura

    2016-06-01

    Full Text Available In this paper, we present a proposal for a fully automatic classification of VHR satellite images. Unlike the most widespread approaches (supervised classification, which requires prior definition of class signatures, or unsupervised classification, which must be followed by an interpretation of its results), the proposed method requires no human intervention except for the setting of the initial parameters. The presented approach is based on both spectral and textural analysis of the image and consists of 3 steps. The first step, the analysis of spectral data, relies on NDVI values. Its purpose is to distinguish between basic classes, such as water, vegetation and non-vegetation, which all differ significantly spectrally and thus can be easily extracted based on spectral analysis. The second step relies on granulometric maps. These are the product of local granulometric analysis of an image and present information on the texture of each pixel neighbourhood, depending on the texture grain. The purpose of texture analysis is to distinguish between classes that are spectrally similar but of different texture, e.g. bare soil from a built-up area, or low vegetation from a wooded area. Due to the use of granulometric analysis, based on mathematical morphology opening and closing, the results are resistant to the border effect (qualifying borders of objects in an image as spaces of high texture), which affects other methods of texture analysis like GLCM statistics or fractal analysis. Therefore, the effectiveness of the analysis is relatively high. Several indices based on the values of different granulometric maps have been developed to simplify the extraction of classes of different texture. The third and final step of the process relies on a vegetation index, based on the near-infrared and blue bands. Its purpose is to correct partially misclassified pixels. All the indices used in the classification model developed relate to reflectance values, so the
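
    The two ingredients described above, an NDVI threshold on spectral bands and granulometric texture measures built from morphological openings, can be sketched as follows. Band values, thresholds and structuring-element sizes are invented for illustration; the paper's granulometric map indices are not reproduced.

```python
# Minimal sketch: NDVI-based spectral separation plus a simple granulometric
# texture profile obtained from morphological openings at increasing sizes.
import numpy as np
from scipy.ndimage import grey_opening

rng = np.random.default_rng(4)
red = rng.random((128, 128))
nir = rng.random((128, 128))

ndvi = (nir - red) / (nir + red + 1e-9)
vegetation_mask = ndvi > 0.3                   # step 1: spectral separation (threshold assumed)

# Step 2: granulometry of a single band; the image "mass" removed by openings of
# growing size characterises the dominant texture grain of the neighbourhood.
band = nir
sizes = [3, 5, 9, 15]
removed = []
previous = band.sum()
for s in sizes:
    opened = grey_opening(band, size=(s, s))
    removed.append(previous - opened.sum())
    previous = opened.sum()

granulometric_profile = np.array(removed) / band.sum()
print(vegetation_mask.mean(), granulometric_profile)
```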

  1. Weather regimes over Senegal during the summer monsoon season using self-organizing maps and hierarchical ascendant classification. Part II: interannual time scale

    Energy Technology Data Exchange (ETDEWEB)

    Gueye, A.K. [ESP, UCAD, Dakar (Senegal); Janicot, Serge; Sultan, Benjamin [LOCEAN/IPSL, IRD, Universite Pierre et Marie Curie, Paris cedex 05 (France); Niang, A. [LTI, ESP/UCAD, Dakar (Senegal); Sawadogo, S. [LTI, EPT, Thies (Senegal); Diongue-Niang, A. [ANACIM, Dakar (Senegal); Thiria, S. [LOCEAN/IPSL, UPMC, Paris (France)

    2012-11-15

    The aim of this work is to define, over the period 1979-2002, the main synoptic weather regimes relevant for understanding the daily variability of rainfall during the summer monsoon season over Senegal. "Interannual" synoptic weather regimes are defined by removing the influence of the mean 1979-2002 seasonal cycle. This is different from Part I, where the seasonal evolution of each year was removed, thereby also removing the contribution of interannual variability. As in Part I, the self-organizing maps approach, a clustering methodology based on a non-linear artificial neural network, is combined with a hierarchical ascendant classification to compute these regimes. Nine weather regimes are identified using the mean sea level pressure and the 850 hPa wind field as variables. The composite circulation patterns of all nine weather regimes are very consistent with the associated anomaly patterns of precipitable water, mid-troposphere vertical velocity and rainfall. They are also consistent with the distribution of rainfall extremes. These regimes have then been gathered into different groups. A first group of four regimes is included in an inner circuit and is characterized by a modulation of the semi-permanent trough located along the western coast of West Africa and an opposite modulation to the east. This circuit is important because it associates the two wettest and highly persistent weather regimes over Senegal with the driest and most persistent one. One derivation of this circuit is highlighted, including the two driest regimes and the most persistent one, which can produce important occurrences of dry sequences. An exit from this circuit is characterised by a filling of the Saharan heat low. An entry into the main circuit includes a southward location of the Saharan heat low followed by its deepening. The last weather regime is isolated from the other ones and has no significant impact on Senegal. It is present in June and September, and

  2. Non-invasive classification of gas–liquid two-phase horizontal flow regimes using an ultrasonic Doppler sensor and a neural network

    International Nuclear Information System (INIS)

    Abbagoni, Baba Musa; Yeung, Hoi

    2016-01-01

    The identification of the flow pattern is a key issue in multiphase flow, which is encountered in the petrochemical industry. It is difficult to identify the gas–liquid flow regimes objectively in gas–liquid two-phase flow. This paper presents the feasibility of a clamp-on instrument for an objective flow regime classification of two-phase flow using an ultrasonic Doppler sensor and an artificial neural network, which records and processes the ultrasonic signals reflected from the two-phase flow. Experimental data are obtained on a horizontal test rig with a total pipe length of 21 m and a 5.08 cm internal diameter, carrying air-water two-phase flow under slug, elongated bubble, stratified-wavy and stratified flow regimes. Multilayer perceptron neural networks (MLPNNs) are used to develop the classification model. The classifier requires, as input, features that are representative of the signals. Ultrasound signal features are extracted by applying both power spectral density (PSD) and discrete wavelet transform (DWT) methods to the flow signals. A ‘1-of-C’ coding scheme was adopted to classify the extracted features into one of four flow regime categories. To improve the performance of the flow regime classifier network, a second-level neural network was incorporated, using the output of the first-level network as an input feature. The combination of the two network models provided a combined neural network model which achieved higher accuracy than the single neural network models. Classification accuracies are evaluated for both the PSD and the DWT features. The success rates of the two models are: (1) using PSD features, the classifier missed 3 of the 24 test datasets and scored 87.5% accuracy; (2) with the DWT features, the network misclassified only one data point and was able to classify the flow patterns with up to 95.8% accuracy. This approach has demonstrated the
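
    The feature pipeline described in this abstract (Welch power spectral density and discrete wavelet transform features extracted from the Doppler signals, fed to a multilayer perceptron) can be sketched as below. The wavelet choice, band splitting, network size and the random placeholder signals and labels are assumptions; the two-level network of the paper is not reproduced.

```python
# Minimal sketch: PSD (Welch) and DWT features from Doppler signals feeding a
# multilayer perceptron that outputs one of four flow regimes.
import numpy as np
from scipy.signal import welch
import pywt                                    # PyWavelets
from sklearn.neural_network import MLPClassifier

def psd_features(sig, fs=1000, n_bands=8):
    f, Pxx = welch(sig, fs=fs, nperseg=256)
    bands = np.array_split(Pxx, n_bands)
    return [np.log(b.mean() + 1e-12) for b in bands]     # log band powers

def dwt_features(sig, wavelet="db4", level=4):
    coeffs = pywt.wavedec(sig, wavelet, level=level)
    return [np.sqrt(np.mean(c ** 2)) for c in coeffs]     # RMS energy per level

rng = np.random.default_rng(5)
regimes = ["slug", "elongated bubble", "stratified-wavy", "stratified"]
signals = rng.standard_normal((120, 2048))                # placeholder Doppler records
y = rng.integers(0, 4, size=120)                          # placeholder regime labels

X = np.array([psd_features(s) + dwt_features(s) for s in signals])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print(regimes[clf.predict(X[:1])[0]])
```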

  3. Non-invasive classification of gas-liquid two-phase horizontal flow regimes using an ultrasonic Doppler sensor and a neural network

    Science.gov (United States)

    Musa Abbagoni, Baba; Yeung, Hoi

    2016-08-01

    The identification of the flow pattern is a key issue in multiphase flow, which is encountered in the petrochemical industry. It is difficult to identify the gas-liquid flow regimes objectively in gas-liquid two-phase flow. This paper presents the feasibility of a clamp-on instrument for an objective flow regime classification of two-phase flow using an ultrasonic Doppler sensor and an artificial neural network, which records and processes the ultrasonic signals reflected from the two-phase flow. Experimental data are obtained on a horizontal test rig with a total pipe length of 21 m and a 5.08 cm internal diameter, carrying air-water two-phase flow under slug, elongated bubble, stratified-wavy and stratified flow regimes. Multilayer perceptron neural networks (MLPNNs) are used to develop the classification model. The classifier requires, as input, features that are representative of the signals. Ultrasound signal features are extracted by applying both power spectral density (PSD) and discrete wavelet transform (DWT) methods to the flow signals. A ‘1-of-C’ coding scheme was adopted to classify the extracted features into one of four flow regime categories. To improve the performance of the flow regime classifier network, a second-level neural network was incorporated, using the output of the first-level network as an input feature. The combination of the two network models provided a combined neural network model which achieved higher accuracy than the single neural network models. Classification accuracies are evaluated for both the PSD and the DWT features. The success rates of the two models are: (1) using PSD features, the classifier missed 3 of the 24 test datasets and scored 87.5% accuracy; (2) with the DWT features, the network misclassified only one data point and was able to classify the flow patterns with up to 95.8% accuracy. This approach has demonstrated the

  4. Classification of innovations: approaches and consequences

    Directory of Open Access Journals (Sweden)

    Jakub Tabas

    2011-01-01

    Full Text Available Currently, innovations are perceived as the lifeblood of businesses. The first inevitable fact is that even though innovations have the potential to transform companies or entire industries, they are highly risky. The second fact is that, for companies' development and their survival on the markets, innovations have become a necessity. In theory, it is rather difficult to find a comprehensive definition of innovation, and settling on a general definition of innovation becomes more and more difficult with the growing number of domains where innovations, or possible innovations, start to appear in the form of added value to something that already exists. The definition of innovation has come through a long process of development: from the early definition of Schumpeter, who connected innovation especially with changes in products or production processes, to recent definitions based on the added value for society. One possible approach to defining the content of innovation is to base the definition on a classification of innovations. In the article, the authors provide an analysis of existing classifications of innovations in order to find, or rather to define, the general content of innovation that would confirm (or reject) their definition of innovation derived in the frame of their previous work, where they state that innovation is a change that leads to gaining profit for an individual, for a business entity, or for society, while the profit is not only the accounting one but the economic profit. The article is based especially on secondary research, while the authors employ the method of analysis with the aim of confronting various classification-based definitions of innovation; the methods used are especially comparison, analysis and synthesis.

  5. Bridging interest, classification and technology gaps in the climate change regime

    International Nuclear Information System (INIS)

    Gupta, J.; Van der Werff, P.; Gagnon-Lebrun, F.; Van Dijk, I.; Verspeek, F.; Arkesteijn, E.; Van der Meer, J.

    2002-01-01

    The climate change regime is affected by a major credibility gap; there is a gap between what countries have been stating that they are willing to do and what they actually do. This is visible not just in the inability of the developed countries to stabilise their emissions at 1990 levels by the year 2000 as provided for in the United Nations Framework Convention on Climate Change (FCCC), but by the general reluctance of all countries to ratify the Kyoto Protocol to the Convention (KPFCCC). This research postulates that this credibility gap is affected further by three other types of gaps: 1) the interest gap; 2) the classification gap; and 3) the technology gap. The purpose of this research is thus to identify ways and means to promote industrial transformation in developing countries as a method to address the climate change problem. The title of this project is: Bridging Gaps - Enhancing Domestic and International Technological Collaboration to Enable the Adoption of Climate Relevant Technologies and Practices (CT and Ps) and thereby Foster Participation and Implementation of the Climate Convention (FCCC) by Developing Countries (DCs). In order to enhance technology co-operation, we believe that graduation profiles are needed at the international level and stakeholder involvement at both the national and international levels. refs

  6. A practicable approach for periodontal classification

    Science.gov (United States)

    Mittal, Vishnu; Bhullar, Raman Preet K.; Bansal, Rachita; Singh, Karanprakash; Bhalodi, Anand; Khinda, Paramjit K.

    2013-01-01

    The diagnosis and classification of periodontal diseases have long remained a dilemma. Two distinct concepts have been used to define diseases: essentialism and nominalism. The essentialistic concept implies the real existence of disease, whereas the nominalistic concept states that the names of diseases are a convenient way of stating concisely the endpoint of a diagnostic process. It generally advances from the assessment of symptoms and signs toward knowledge of causation and gives a feasible option for naming a disease whose etiology is either unknown or too complex to assess in routine clinical practice. Various classifications have been proposed by the American Academy of Periodontology (AAP) in 1986, 1989 and 1999. The AAP 1999 classification is among the most widely used classifications. But this classification also has demerits which impede its use in day-to-day practice. Hence, a classification and diagnostic system is required which can help the clinician assess the patient's needs and provide a suitable treatment which is in harmony with the diagnosis for that particular case. Here is an attempt to propose a practicable classification and diagnostic system of periodontal diseases for better treatment outcomes. PMID:24379855

  7. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  8. Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.

    Science.gov (United States)

    Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel

    2017-06-01

    Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.

  9. Classification of Marital Relationships: An Empirical Approach.

    Science.gov (United States)

    Snyder, Douglas K.; Smith, Gregory T.

    1986-01-01

    Derives an empirically based classification system of marital relationships, employing a multidimensional self-report measure of marital interaction. Spouses' profiles on the Marital Satisfaction Inventory for samples of clinic and nonclinic couples were subjected to cluster analysis, resulting in separate five-group typologies for husbands and…

  10. A new classification of large-scale climate regimes around the Tibetan Plateau based on seasonal circulation patterns

    Directory of Open Access Journals (Sweden)

    Xin-Gang Dai

    2017-03-01

    Full Text Available This study aims to develop a large-scale climate classification for investigating the characteristics of the climate regimes around the Tibetan Plateau based on seasonal precipitation, moisture transport and moisture divergence, using in situ observations and ERA40 reanalysis data. The results indicate that the climate around the Plateau can be attributed to four regimes. They are situated in East Asia, South Asia, Central Asia and the semi-arid zone in northern Central Asia throughout the dryland of northwestern China, in addition to the Köppen climate classification. There are different collocations of seasonal temperature and precipitation: (1) in phase for the East and South Asia monsoon regimes, (2) anti-phase for the Central Asia regime, and (3) out-of-phase for the westerly regime. The seasonal precipitation concentrations are coupled with moisture divergence, i.e., moisture convergence coincides with the Asian monsoon zone and divergence appears over the Mediterranean-like arid climate region and the westerly-controlled area in the warm season, while it reverses course in the cold season. In addition, moisture divergence is associated with meridional moisture transport. The northward/southward moisture transport corresponds to moisture convergence/divergence, indicating that the wet and dry seasons are, to a great extent, dominated by meridional moisture transport in these regions. The climate mean southward transport results in the dry-cold season of the Asian monsoon zone and the dry-warm season, leading to desertification or land degradation in Central Asia and the westerly regime zone. The mean-wind moisture transport (MMT) is the major contributor to total moisture transport, while persistent northward transient eddy moisture transport (TEMT) plays a key role in dry season precipitation, especially in the Asian monsoon zone. The persistent TEMT divergence is an additional mechanism of the out-of-phase collocation in the westerly regime zone. In addition

  11. Toward a common classification approach for biorefinery systems

    NARCIS (Netherlands)

    Cherubini, F.; Jungmeier, G.; Wellisch, M.; Wilke, T.; Skiadas, I.; van Ree, R.; de Jong, E.

    2009-01-01

    This paper deals with a biorefinery classification approach developed within International Energy Agency (IEA) Bioenergy Task 42. Since production of transportation biofuels is seen as the driving force for future biorefinery developments, a selection of the most interesting transportation biofuels

  12. A Data Mining Classification Approach for Behavioral Malware Detection

    Directory of Open Access Journals (Sweden)

    Monire Norouzi

    2016-01-01

    Full Text Available Data mining techniques have numerous applications in malware detection. The classification method is one of the most popular data mining techniques. In this paper we present a data mining classification approach to detect malware behavior. We propose different classification methods in order to detect malware based on the features and behavior of each malware. A dynamic analysis method is presented for identifying the malware features. A program is presented for converting a malware behavior execution history XML file into a suitable input for the WEKA tool. To illustrate the performance efficiency as well as the training and testing of the data, we apply the proposed approaches to a real case-study data set using the WEKA tool. The evaluation results demonstrated the viability of the proposed data mining approach. Our proposed data mining approach is also more efficient for detecting malware, and behavioral classification of malware can be useful for detecting malware in a behavioral antivirus.

  13. Transparent electrodes in the terahertz regime – a new approach

    DEFF Research Database (Denmark)

    Malureanu, Radu; Song, Z.; Zalkovskij, Maksim

    We suggest a new possibility for obtaining a transparent metallic film, thus allowing for completely transparent electrodes. By placing a complementary composite layer on top of the electrode, we can cancel the back-scattering of the latter, thus obtaining a perfectly transparent structure. For ease of fabrication, we performed the first experiments in the THz regime, but the concept is applicable to the entire electromagnetic wave spectrum. We show that the experiments and theory match each other perfectly....

  14. Non-linear dynamical classification of short time series of the Rössler system in high noise regimes.

    Science.gov (United States)

    Lainscsek, Claudia; Weyhenmeyer, Jonathan; Hernandez, Manuel E; Poizner, Howard; Sejnowski, Terrence J

    2013-01-01

    Time series analysis with delay differential equations (DDEs) reveals non-linear properties of the underlying dynamical system and can serve as a non-linear time-domain classification tool. Here global DDE models were used to analyze short segments of simulated time series from a known dynamical system, the Rössler system, in high noise regimes. In a companion paper, we apply the DDE model developed here to classify short segments of electroencephalographic (EEG) data recorded from patients with Parkinson's disease and healthy subjects. Nine simulated subjects in each of two distinct classes were generated by varying the bifurcation parameter b and keeping the other two parameters (a and c) of the Rössler system fixed. All choices of b were in the chaotic parameter range. We diluted the simulated data using white noise ranging from 10 to -30 dB signal-to-noise ratio (SNR). Structure selection was supervised by selecting the number of terms, delays, and order of non-linearity of the DDE model that best linearly separated the two classes of data. The distance d from the linear dividing hyperplane was then used to assess the classification performance by computing the area A' under the ROC curve. The selected model was tested on untrained data using repeated random sub-sampling validation. DDEs were able to accurately distinguish the two dynamical conditions and, moreover, to quantify the changes in the dynamics. There was a significant correlation between the dynamical bifurcation parameter b of the simulated data and the classification parameter d from our analysis. This correlation still held for new simulated subjects with new dynamical parameters selected from each of the two dynamical regimes. Furthermore, the correlation was robust to added noise, being significant even when the noise was greater than the signal. We conclude that DDE models may be used as a generalizable and reliable classification tool for even small segments of noisy data.
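
    The classification and evaluation step described above (a linear separating hyperplane whose signed distance d is scored with the area A' under the ROC curve) is sketched below. The DDE structure selection of the paper is not reproduced; simple polynomial delay features and a linear SVM stand in for the fitted DDE terms, and the simulated two-class series are placeholders rather than the Rössler system.

```python
# Minimal sketch: delay-based features of short noisy segments, a linear
# hyperplane whose signed distance d acts as the classification score, and the
# ROC area A'. The paper's DDE model fitting is NOT reproduced here.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import roc_auc_score

def delay_features(x, tau=5):
    """A few polynomial combinations of x(t) and x(t - tau), averaged over the segment."""
    a, b = x[tau:], x[:-tau]
    return [np.mean(a * b), np.mean(a ** 2), np.mean(b ** 2), np.mean(a * b ** 2)]

rng = np.random.default_rng(6)
def simulate(freq, n=500, snr=1.0):
    t = np.linspace(0, 20, n)
    sig = np.sin(freq * t) + 0.4 * np.sin(2.3 * freq * t)
    return sig + rng.standard_normal(n) / snr            # heavy noise allowed

# Two "dynamical classes" differing in a single parameter (a crude analogue of b)
X = np.array([delay_features(simulate(f)) for f in ([1.0] * 50 + [1.3] * 50)])
y = np.array([0] * 50 + [1] * 50)

clf = LinearSVC(C=1.0, max_iter=10000).fit(X, y)
d = clf.decision_function(X)                              # signed distance to hyperplane
print("A' (ROC area):", roc_auc_score(y, d))
```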

  15. Hydrological classification of natural flow regimes to support environmental flow assessments in intensively regulated Mediterranean rivers, Segura River Basin (Spain).

    Science.gov (United States)

    Belmar, Oscar; Velasco, Josefa; Martinez-Capel, Francisco

    2011-05-01

    Hydrological classification constitutes the first step of a new holistic framework for developing regional environmental flow criteria: the "Ecological Limits of Hydrologic Alteration (ELOHA)". The aim of this study was to develop a classification for 390 stream sections of the Segura River Basin based on 73 hydrological indices that characterize their natural flow regimes. The hydrological indices were calculated with 25 years of natural monthly flows (1980/81-2005/06) derived from a rainfall-runoff model developed by the Spanish Ministry of Environment and Public Works. These indices included, on a monthly or annual basis, measures of drought duration and of the central tendency and dispersion of flow magnitude (average, low and high flow conditions). Principal Component Analysis (PCA) indicated high redundancy among most hydrological indices, as well as two gradients: flow magnitude for mainstream rivers and temporal variability for tributary streams. A classification with eight flow-regime classes was chosen as the most easily interpretable in the Segura River Basin, which was supported by ANOSIM analyses. These classes can be simplified into 4 broader groups with different seasonal discharge patterns: large rivers, perennial stable streams, perennial seasonal streams, and intermittent and ephemeral streams. They showed a high degree of spatial cohesion, following a gradient associated with climatic aridity from NW to SE, and were well defined in terms of the fundamental variables in Mediterranean streams: magnitude and temporal variability of flows. Therefore, this classification is a fundamental tool to support water management and planning in the Segura River Basin. Future research will allow us to study the flow alteration-ecological response relationship for each river type, and set the basis to design scientifically credible environmental flows following the ELOHA framework.

  16. Precipitation regime classification for the Mojave Desert: Implications for fire occurrence

    Science.gov (United States)

    Tagestad, Jerry; Brooks, Matthew L.; Cullinan, Valerie; Downs, Janelle; McKinley, Randy

    2016-01-01

    Long periods of drought or above-average precipitation affect Mojave Desert vegetation condition, biomass and susceptibility to fire. Changes in the seasonality of precipitation alter the likelihood of lightning, a key ignition source for fires. The objectives of this study were to characterize the relationship between recent, historic, and future precipitation patterns and fire. Classifying monthly precipitation data from 1971 to 2010 reveals four precipitation regimes: low winter/low summer, moderate winter/moderate summer, high winter/low summer and high winter/high summer. Two regimes with summer monsoonal precipitation covered only 40% of the Mojave Desert ecoregion but contain 88% of the area burned and 95% of the repeat burn area. Classifying historic precipitation for early-century (wet) and mid-century (drought) periods reveals distinct shifts in regime boundaries. Early-century results are similar to current, while the mid-century results show a sizeable reduction in area of regimes with a strong monsoonal component. Such a shift would suggest that fires during the mid-century period would be minimal and anecdotal records confirm this. Predicted precipitation patterns from downscaled global climate models indicate numerous epochs of high winter precipitation, inferring higher fire potential for many multi-decade periods during the next century.

  17. An objective and parsimonious approach for classifying natural flow regimes at a continental scale

    Science.gov (United States)

    Archfield, S. A.; Kennen, J.; Carlisle, D.; Wolock, D.

    2013-12-01

    Hydroecological stream classification--the process of grouping streams by similar hydrologic responses and, thereby, similar aquatic habitat--has been widely accepted and is often one of the first steps towards developing ecological flow targets. Despite its importance, the last national classification of streamgauges was completed about 20 years ago. A new classification of 1,534 streamgauges in the contiguous United States is presented using a novel and parsimonious approach to understand similarity in ecological streamflow response. This new classification approach uses seven fundamental daily streamflow statistics (FDSS) rather than winnowing down an uncorrelated subset from 200 or more ecologically relevant streamflow statistics (ERSS) commonly used in hydroecological classification studies. The results of this investigation demonstrate that the distributions of 33 tested ERSS are consistently different among the classes derived from the seven FDSS. It is further shown that classification based solely on the 33 ERSS generally does a poorer job in grouping similar streamgauges than the classification based on the seven FDSS. This new classification approach has the additional advantages of overcoming some of the subjectivity associated with the selection of the classification variables and provides a set of robust continental-scale classes of US streamgauges.
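
    The classification idea, grouping gauges by a handful of fundamental daily streamflow statistics rather than dozens of ecologically relevant indices, is sketched below. The abstract does not enumerate the seven FDSS, so the statistics used here (mean, CV, skewness, kurtosis, lag-1 autocorrelation, seasonal amplitude and phase) are an assumed illustrative set, as are the synthetic flows and the K-means grouping.

```python
# Minimal sketch: classify stream gauges from a small set of fundamental daily
# streamflow statistics. The seven statistics below are an assumed example set,
# not necessarily the exact FDSS of the paper.
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

def fdss(q):
    q = np.asarray(q, dtype=float)
    doy = np.arange(len(q)) % 365
    # Fit a single annual harmonic to get a seasonal amplitude and phase
    A = np.column_stack([np.sin(2 * np.pi * doy / 365), np.cos(2 * np.pi * doy / 365)])
    coef, *_ = np.linalg.lstsq(A, q - q.mean(), rcond=None)
    amplitude, phase = np.hypot(*coef), np.arctan2(coef[1], coef[0])
    r1 = np.corrcoef(q[:-1], q[1:])[0, 1]                 # lag-1 autocorrelation
    return [q.mean(), q.std() / q.mean(), skew(q), kurtosis(q), r1, amplitude, phase]

rng = np.random.default_rng(7)
gauges = rng.gamma(2.0, 3.0, size=(30, 3650))              # placeholder daily flows, 30 gauges
X = StandardScaler().fit_transform(np.array([fdss(q) for q in gauges]))
classes = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(classes))
```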

  18. Inguinal hernia recurrence: Classification and approach

    Directory of Open Access Journals (Sweden)

    Campanelli Giampiero

    2006-01-01

    Full Text Available The authors reviewed the records of 2,468 operations for groin hernia in 2,350 patients, including 277 recurrent hernias, updated to January 2005. The data obtained - evaluating technique, results and complications - were used to propose a simple anatomo-clinical classification into three types which could be used to plan the surgical strategy: Type R1: first recurrence, 'high', oblique external, reducible hernia with a small (< 2 cm) defect in non-obese patients, after pure tissue or mesh repair; Type R2: first recurrence, 'low', direct, reducible hernia with a small (< 2 cm) defect in non-obese patients, after pure tissue or mesh repair; Type R3: all the other recurrences, including femoral recurrences; recurrent groin hernia with a big defect (inguinal eventration); multirecurrent hernias; nonreducible hernias; hernias linked with a contralateral primitive or recurrent hernia; and situations compromised by aggravating factors (for example obesity) or otherwise not easily included in R1 or R2, after pure tissue or mesh repair.

  19. Approaches to Substance of Social Infrastructure and to Its Classification

    Directory of Open Access Journals (Sweden)

    Kyrychenko, Sergiy O.

    2016-03-01

    Full Text Available The article is concerned with studying and analyzing approaches to both the substance and the classification of social infrastructure objects as a specific constellation of subsystems and components. To address this purpose, the following tasks have been formulated: analysis of existing methods for determining the classification of social infrastructure; classification of the branches of social infrastructure using a functional-dedicated approach; and formulation of the author's own definition of the substance of social infrastructure. It has been determined that, to date, a social infrastructure classification is most often carried out according to its functional tasks, although there are other approaches to classification. The author's definition of the substance of social infrastructure has been formulated as follows: social infrastructure is a body of branches of the economy (public utilities, management, public safety and environment, socio-economic services), the purpose of which is to impact the reproductive potential and overall conditions of human activity in the spheres of work, everyday living, family, socio-political, spiritual and intellectual development as well as life activity.

  20. Classification of Arctic, Mid-Latitude and Tropical Clouds in the Mixed-Phase Temperature Regime

    Science.gov (United States)

    Costa, Anja; Afchine, Armin; Luebke, Anna; Meyer, Jessica; Dorsey, James R.; Gallagher, Martin W.; Ehrlich, André; Wendisch, Manfred; Krämer, Martina

    2016-04-01

    The degree of glaciation and the sizes and habits of ice particles formed in mixed-phase clouds are not yet fully understood. However, these properties define the mixed clouds' radiative impact on the Earth's climate, and thus a correct representation of this cloud type in global climate models is important for improving the certainty of climate predictions. This study focuses on the occurrence and characteristics of two types of clouds in the mixed-phase temperature regime (238-275 K): coexistence clouds (Coex), in which both liquid drops and ice crystals exist, and fully glaciated clouds that develop in the Wegener-Bergeron-Findeisen regime (WBF clouds). We present an extensive dataset obtained by the Cloud and Aerosol Particle Spectrometer NIXE-CAPS, covering Arctic, mid-latitude and tropical regions. In total, we spent 45.2 hours within clouds in the mixed-phase temperature regime during five field campaigns (Arctic: VERDI, 2012 and RACEPAC, 2014 - Northern Canada; mid-latitude: COALESC, 2011 - UK and ML-Cirrus, 2014 - central Europe; tropics: ACRIDICON, 2014 - Brazil). We show that WBF and Coex clouds can be identified via cloud particle size distributions. The classified datasets are used to analyse temperature dependences of both cloud types as well as the ranges and frequencies of cloud particle concentrations and sizes. One result is that Coex clouds containing supercooled liquid drops are found down to temperatures of -40 deg C only in tropical mixed clouds, while in the Arctic and mid-latitudes no liquid drops are observed below about -20 deg C. In addition, we show that the cloud particles' aspherical fractions - derived from polarization signatures of particles with diameters between 20 and 50 micrometers - differ significantly between WBF and Coex clouds. In Coex clouds, the aspherical fraction of cloud particles is generally very low, but increases with decreasing temperature. In WBF clouds, where all cloud particles are ice, about 20-40% of the cloud

  1. Classification of Arctic, midlatitude and tropical clouds in the mixed-phase temperature regime

    Directory of Open Access Journals (Sweden)

    A. Costa

    2017-10-01

    Full Text Available The degree of glaciation of mixed-phase clouds constitutes one of the largest uncertainties in climate prediction. In order to better understand cloud glaciation, cloud spectrometer observations are presented in this paper, which were made in the mixed-phase temperature regime between 0 and −38 °C (273 to 235 K), where cloud particles can either be frozen or liquid. The extensive data set covers four airborne field campaigns providing a total of 139 000 1 Hz data points (38.6 h within clouds) over Arctic, midlatitude and tropical regions. We develop algorithms, combining the information on number concentration, size and asphericity of the observed cloud particles, to classify four cloud types: liquid clouds, clouds in which liquid droplets and ice crystals coexist, fully glaciated clouds after the Wegener–Bergeron–Findeisen process and clouds where secondary ice formation occurred. We quantify the occurrence of these cloud groups depending on the geographical region and temperature and find that liquid clouds dominate our measurements during the Arctic spring, while clouds dominated by the Wegener–Bergeron–Findeisen process are most common in midlatitude spring. The coexistence of liquid water and ice crystals is found over the whole mixed-phase temperature range in tropical convective towers in the dry season. Secondary ice is found at midlatitudes at −5 to −10 °C (268 to 263 K) and at higher altitudes, i.e. lower temperatures, in the tropics. The distribution of the cloud types with decreasing temperature is shown to be consistent with the theory of evolution of mixed-phase clouds. With this study, we aim to contribute to a large statistical database on cloud types in the mixed-phase temperature regime.

  2. Classification of Arctic, midlatitude and tropical clouds in the mixed-phase temperature regime

    Science.gov (United States)

    Costa, Anja; Meyer, Jessica; Afchine, Armin; Luebke, Anna; Günther, Gebhard; Dorsey, James R.; Gallagher, Martin W.; Ehrlich, Andre; Wendisch, Manfred; Baumgardner, Darrel; Wex, Heike; Krämer, Martina

    2017-10-01

    The degree of glaciation of mixed-phase clouds constitutes one of the largest uncertainties in climate prediction. In order to better understand cloud glaciation, cloud spectrometer observations are presented in this paper, which were made in the mixed-phase temperature regime between 0 and -38 °C (273 to 235 K), where cloud particles can either be frozen or liquid. The extensive data set covers four airborne field campaigns providing a total of 139 000 1 Hz data points (38.6 h within clouds) over Arctic, midlatitude and tropical regions. We develop algorithms, combining the information on number concentration, size and asphericity of the observed cloud particles to classify four cloud types: liquid clouds, clouds in which liquid droplets and ice crystals coexist, fully glaciated clouds after the Wegener-Bergeron-Findeisen process and clouds where secondary ice formation occurred. We quantify the occurrence of these cloud groups depending on the geographical region and temperature and find that liquid clouds dominate our measurements during the Arctic spring, while clouds dominated by the Wegener-Bergeron-Findeisen process are most common in midlatitude spring. The coexistence of liquid water and ice crystals is found over the whole mixed-phase temperature range in tropical convective towers in the dry season. Secondary ice is found at midlatitudes at -5 to -10 °C (268 to 263 K) and at higher altitudes, i.e. lower temperatures in the tropics. The distribution of the cloud types with decreasing temperature is shown to be consistent with the theory of evolution of mixed-phase clouds. With this study, we aim to contribute to a large statistical database on cloud types in the mixed-phase temperature regime.
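
    A minimal sketch of the classification idea described above: a 1 Hz cloud-probe sample is assigned a coarse cloud type from its droplet concentration, large-particle concentration and aspherical fraction. This is not the authors' published algorithm; the threshold values and units are purely illustrative assumptions.

      def classify_cloud_sample(n_droplets, n_large, aspherical_fraction,
                                asph_threshold=0.2, droplet_threshold=0.1, large_threshold=0.01):
          """Hypothetical thresholds, for illustration only.

          n_droplets: droplet-sized particle concentration (cm^-3)
          n_large: concentration of particles in the 20-50 micrometer range (cm^-3)
          aspherical_fraction: fraction of those large particles with ice-like (aspherical) shape
          """
          if n_droplets < droplet_threshold and n_large < large_threshold:
              return "clear / out of cloud"
          if aspherical_fraction < asph_threshold:
              return "liquid"                     # only spherical (liquid-like) particles detected
          if n_droplets < droplet_threshold:
              return "WBF (fully glaciated)"      # ice present, droplet mode absent
          return "coexistence (Coex)"             # droplets and ice crystals together


      if __name__ == "__main__":
          print(classify_cloud_sample(n_droplets=50.0, n_large=0.001, aspherical_fraction=0.05))
          print(classify_cloud_sample(n_droplets=0.01, n_large=0.5, aspherical_fraction=0.4))
          print(classify_cloud_sample(n_droplets=20.0, n_large=0.5, aspherical_fraction=0.3))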

  3. The Influence of temporal sampling regime on the WFD classification of catchments within the Eden Demonstration Test Catchment Project

    Science.gov (United States)

    Jonczyk, Jennine; Haygarth, Phil; Quinn, Paul; Reaney, Sim

    2014-05-01

    A high temporal resolution data set from the Eden Demonstration Test Catchment (DTC) project is used to investigate the processes causing pollution and the influence of the temporal sampling regime on the WFD classification of three catchments. These data highlight that WFD standards may not be fit for purpose. The Eden DTC project is part of a UK government-funded project designed to provide robust evidence regarding how diffuse pollution can be cost-effectively controlled to improve and maintain water quality in rural river catchments. The impact of multiple water quality parameters on ecosystems and sustainable food production is being studied at the catchment scale. Three focus catchments, approximately 10 km2 each, have been selected to represent the different farming practices and geophysical characteristics across the Eden catchment, Northern England. A field experimental programme has been designed to monitor the dynamics of agricultural diffuse pollution at multiple scales using state-of-the-art sensors providing continuous real-time data. The data set, which includes Total Phosphorus and Total Reactive Phosphorus, Nitrate, Ammonium, pH, Conductivity, Turbidity and Chlorophyll a, reveals the frequency and duration of nutrient concentration target exceedance, which arises from the prevalence of storm events of increasing magnitude. This data set is sub-sampled at different time intervals to explore how different sampling regimes affect our understanding of nutrient dynamics and the ramifications of the different regimes for WFD chemical status. This presentation seeks to identify an optimum temporal resolution of data for effective catchment management and to question the usefulness of the WFD status metric for determining the health of a system. Criteria based on high-frequency, short-duration events need to be accounted for.
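
    A minimal sketch, on synthetic data and with a hypothetical concentration target, of how sub-sampling a high-frequency record at coarser intervals changes the apparent frequency of target exceedance; it is not the Eden DTC workflow itself.

      import numpy as np

      rng = np.random.default_rng(0)
      n_minutes = 60 * 24 * 30                                        # one month of 1-minute data
      baseline = rng.gamma(shape=2.0, scale=0.02, size=n_minutes)     # background phosphorus (mg/l)
      storm_idx = rng.choice(n_minutes, size=20, replace=False)
      for i in storm_idx:                                             # superimpose short storm pulses
          baseline[i:i + 120] += 0.4

      target = 0.1                                                    # hypothetical WFD-style target (mg/l)
      for step_minutes in (1, 60, 60 * 24, 60 * 24 * 7):              # 1-minute, hourly, daily, weekly
          sub = baseline[::step_minutes]
          exceedance = np.mean(sub > target)
          print(f"sampling every {step_minutes:>6} min: "
                f"{100 * exceedance:5.2f}% of samples exceed the target")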

  4. Toward a common classification approach for biorefinery systems

    DEFF Research Database (Denmark)

    Cherubini, Francesco; Jungmeier, Gerfried; Wellisch, Maria

    2009-01-01

    until 2020 is based on their characteristics to be mixed with gasoline, diesel and natural gas, reflecting the main advantage of using the already-existing infrastructure for easier market introduction. This classification approach relies on four main features: (1) platforms; (2) products; (3) feedstock...

  5. Diagnosing Unemployment: The 'Classification' Approach to Multiple Causation

    NARCIS (Netherlands)

    Rodenburg, P.

    2002-01-01

    The establishment of appropriate policy measures for fighting unemployment has always been difficult since causes of unemployment are hard to identify. This paper analyses an approach used mainly in the 1960s and 1970s in economics, in which classification is used as a way to deal with such a

  6. A structuralist approach in the study of evolution and classification

    NARCIS (Netherlands)

    Hammen, van der L.

    1985-01-01

    A survey is given of structuralism as a method that can be applied in the study of evolution and classification. The results of a structuralist approach are illustrated by examples from the laws underlying numerical changes, from the laws underlying changes in the chelicerate life-cycle, and from

  7. Change classification in SAR time series: a functional approach

    Science.gov (United States)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan

    2017-10-01

    Change detection represents a broad field of research in SAR remote sensing, consisting of many different approaches. Besides the simple recognition of change areas, the analysis of the type, category or class of the change areas is at least as important for creating a comprehensive result. Conventional strategies for change classification are based on supervised or unsupervised land-use / land-cover classifications. The main drawback of such approaches is that the quality of the classification result directly depends on the selection of training and reference data. Additionally, supervised processing methods require an experienced operator who capably selects the training samples. This training step is not necessary when using unsupervised strategies, but nevertheless meaningful reference data must be available for identifying the resulting classes. Consequently, an experienced operator is indispensable. In this study, an innovative concept for the classification of changes in SAR time series data is proposed. Regarding the drawbacks of traditional strategies given above, it copes without using any training data. Moreover, the method can be applied by an operator who does not yet have detailed knowledge about the available scenery. This knowledge is provided by the algorithm. The final step of the procedure, whose main aspect is the iterative optimization of an initial class scheme with respect to the categorized change objects, is the classification of these objects into the finally resulting classes. This assignment step is the subject of this paper.

  8. GROWTH REGIMES IN SUB-SAHARAN AFRICA: A MIXTURE MODEL APPROACH

    Directory of Open Access Journals (Sweden)

    Emmanuel Igbinoba

    2016-07-01

    Full Text Available This paper employs a generalized mixture model approach to empirically determine whether Sub-Saharan African countries, henceforth (SSA), follow a homogeneous growth pattern based on the conditional distribution of their growth rates. Latent effects are employed to determine the growth experience of SSA countries and to examine the structural characteristics of the clusters, if any exist. Affirmation of clusters might imply significant productivity divergence among Sub-Saharan economies, helping explain the structural imbalances in the region. Results strongly buttress the existence of clusters and little evidence of a common growth path, implying divergence among Sub-Saharan economies, and specific economic reforms are required in the identified clusters to guarantee sustainability and equality of growth in the SSA region. We also observed a positive and significant effect of investment even though the estimated long-run effects of investment on economic growth are smaller than expected
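
    A minimal sketch, on synthetic growth rates rather than the SSA panel used in the paper, of detecting growth "clusters" with a Gaussian mixture model and choosing the number of regimes by BIC; the two-regime data-generating process is an illustrative assumption.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(1)
      # two hypothetical regimes: low-growth and high-growth country-year observations
      growth = np.concatenate([rng.normal(0.5, 1.0, 300),
                               rng.normal(5.0, 1.5, 150)]).reshape(-1, 1)

      models = {k: GaussianMixture(n_components=k, random_state=0).fit(growth)
                for k in (1, 2, 3)}
      best_k = min(models, key=lambda k: models[k].bic(growth))
      best = models[best_k]

      print("components chosen by BIC:", best_k)
      print("regime means:", best.means_.ravel())
      print("regime weights:", best.weights_)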

  9. Assessing the Approaches to Classification of the State Financial Control

    OpenAIRE

    Baraniuk Yurii R.

    2017-01-01

    The article is aimed at assessing the approaches to classification of the State financial control, as well as disclosing the relationship and differences between its forms, types and methods. The results of comparative analysis of existing classifications of the State financial control have been covered. The substantiation of its identification by forms, types and methods of control was explored. Clarification of the interpretation of the concepts of «form of control», «type of control», «sub...

  10. Knowledge-based approach to video content classification

    Science.gov (United States)

    Chen, Yu; Wong, Edward K.

    2001-01-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.
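
    A minimal sketch of the rule-based idea in Python rather than CLIPS: hand-written rules fire on extracted video features and their certainty factors are combined MYCIN-style. The rules, feature names and certainty values are illustrative assumptions, not the authors' rule base.

      def combine_cf(cf1, cf2):
          """MYCIN's combination rule for two positive certainty factors."""
          return cf1 + cf2 * (1.0 - cf1)

      def classify_clip(features):
          """features: dict of low-/high-level cues; returns the class with the highest combined certainty."""
          rules = [
              (lambda f: f["text_overlay"] and f["talking_head"], "news", 0.7),
              (lambda f: f["motion"] > 0.8 and f["dominant_color"] == "green", "football", 0.6),
              (lambda f: f["motion"] > 0.8 and f["dominant_color"] == "brown", "basketball", 0.6),
              (lambda f: f["shot_length"] < 2.0, "commercial", 0.5),
              (lambda f: f["map_graphics"], "weather reporting", 0.8),
          ]
          evidence = {}
          for condition, label, cf in rules:
              if condition(features):
                  evidence[label] = combine_cf(evidence.get(label, 0.0), cf)
          return max(evidence, key=evidence.get) if evidence else "unknown"


      clip = {"text_overlay": True, "talking_head": True, "motion": 0.2,
              "dominant_color": "blue", "shot_length": 6.0, "map_graphics": False}
      print(classify_clip(clip))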

  11. An Ultrasonic Pattern Recognition Approach to Welding Defect Classification

    International Nuclear Information System (INIS)

    Song, Sung Jin

    1995-01-01

    Classification of flaws in weldments from their ultrasonic scattering signals is very important in quantitative nondestructive evaluation. This problem is ideally suited to a modern ultrasonic pattern recognition technique. Here, a brief discussion of the systematic approach to this methodology is presented, including ultrasonic feature extraction, feature selection and classification. A stronger emphasis is placed on probabilistic neural networks as efficient classifiers for many practical classification problems. In an example, probabilistic neural networks are applied to classify flaws in weldments into three classes: cracks, porosity and slag inclusions. Probabilistic nets are shown to match the high performance of other classifiers without any training-time overhead. In addition, a forward selection scheme for sensitive features is addressed to enhance network performance
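
    A minimal sketch of a probabilistic neural network (a Parzen-window classifier) on synthetic 2-D "flaw features"; the three class labels mirror the example in the abstract, but the data and the smoothing parameter are purely illustrative.

      import numpy as np

      def pnn_predict(X_train, y_train, X_test, sigma=0.5):
          """Assign each test vector to the class whose Gaussian kernel-density sum is largest."""
          classes = np.unique(y_train)
          preds = []
          for x in X_test:
              scores = []
              for c in classes:
                  Xc = X_train[y_train == c]
                  d2 = np.sum((Xc - x) ** 2, axis=1)
                  scores.append(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))
              preds.append(classes[int(np.argmax(scores))])
          return np.array(preds)


      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal([0, 0], 0.3, (30, 2)),    # "crack"-like features
                     rng.normal([2, 0], 0.3, (30, 2)),    # "porosity"-like features
                     rng.normal([1, 2], 0.3, (30, 2))])   # "slag inclusion"-like features
      y = np.repeat(["crack", "porosity", "slag"], 30)
      print(pnn_predict(X, y, np.array([[0.1, 0.1], [1.0, 1.9]])))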

  12. Sustainability in product development: a proposal for classification of approaches

    Directory of Open Access Journals (Sweden)

    Patrícia Flores Magnago

    2012-06-01

    Full Text Available Product development is a process that addresses sustainability issues inside companies. Many approaches concerning sustainability have been discussed in academia, such as Natural Capitalism, Design for Environment (DfE) and Life Cycle Analysis (LCA), but a question arises: which is indicated for what circumstance? The aim of this article is to propose a classification, based on a literature review, for 15 of these approaches. The criteria were: (i) approach nature, (ii) organization level, (iii) integration level in the Product Development Process (PDP), and (iv) approach relevance for sustainability dimensions. Common terms allowed the establishment of connections among the approaches. As a result, the researchers concluded that, despite coming from distinct knowledge areas, the approaches are not mutually exclusive; on the contrary, they may be used in a complementary way by managers. The combined use of complementary approaches is finally suggested in the paper.

  13. ADHD classification using bag of words approach on network features

    Science.gov (United States)

    Solmaz, Berkan; Dey, Soumyabrata; Rao, A. Ravishankar; Shah, Mubarak

    2012-02-01

    Attention Deficit Hyperactivity Disorder (ADHD) is receiving considerable attention nowadays, mainly because it is one of the common brain disorders among children and little is known about its cause. In this study, we propose a novel approach for the automatic classification of ADHD-conditioned subjects and control subjects using functional Magnetic Resonance Imaging (fMRI) data of resting-state brains. For this purpose, we compute the correlation between every possible voxel pair within a subject and over the time frame of the experimental protocol. A network of voxels is constructed by representing a high correlation value between any two voxels as an edge. A Bag-of-Words (BoW) approach is used to represent each subject as a histogram of network features, such as the number of degrees per voxel. The classification is done using a Support Vector Machine (SVM). We also investigate the use of raw intensity values in the time series for each voxel. Here, every subject is represented as a combined histogram of network and raw intensity features. Experimental results verified that the classification accuracy improves when the combined histogram is used. We tested our approach on a highly challenging dataset released by NITRC for the ADHD-200 competition and obtained promising results. The dataset not only has a large size but also includes subjects from different demographic and age groups. To the best of our knowledge, this is the first paper to propose a BoW approach for any functional brain disorder classification, and we believe that this approach will be useful in the analysis of many brain-related conditions.
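
    A minimal sketch of the network-feature idea on synthetic data: voxel-pair correlations are thresholded into a graph, the per-voxel degree distribution becomes a bag-of-words-style histogram, and an SVM separates two groups. The correlation threshold, histogram bins and "subject" generator are hypothetical, not the study's fMRI pipeline.

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)

      def degree_histogram(time_series, corr_threshold=0.3, bins=10):
          # time_series: (n_timepoints, n_voxels) -> normalised histogram of voxel degrees
          corr = np.corrcoef(time_series, rowvar=False)
          np.fill_diagonal(corr, 0.0)
          degrees = np.sum(corr > corr_threshold, axis=0)
          hist, _ = np.histogram(degrees, bins=bins, range=(0, time_series.shape[1]))
          return hist / hist.sum()

      def make_subject(shared_strength):
          # synthetic resting-state data: a shared signal makes voxels more correlated
          shared = rng.normal(size=(120, 1))
          return shared_strength * shared + rng.normal(size=(120, 50))

      X = np.array([degree_histogram(make_subject(s)) for s in [1.5] * 20 + [0.2] * 20])
      y = np.array([1] * 20 + [0] * 20)                 # 1 = "condition-like", 0 = "control-like"
      perm = rng.permutation(len(y))
      X, y = X[perm], y[perm]

      clf = SVC(kernel="linear").fit(X[:30], y[:30])    # train on 30 subjects, test on the rest
      print("held-out accuracy:", clf.score(X[30:], y[30:]))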

  14. STAR POLYMERS IN GOOD SOLVENTS FROM DILUTE TO CONCENTRATED REGIMES: CROSSOVER APPROACH

    Directory of Open Access Journals (Sweden)

    S.B.Kiselev

    2002-01-01

    Full Text Available An introduction is given to the crossover theory of the conformational and thermodynamic properties of star polymers in good solvents. The crossover theory is tested against Monte Carlo simulation data for the structure and thermodynamics of model star polymers. In good solvent conditions, star polymers approach a "universal" limit as N → ∞; however, there are two types of approach towards this limit. In the dilute regime, a critical degree of polymerization N* is found to play a similar role as the Ginzburg number in the crossover theory for critical phenomena in simple fluids. A rescaled penetration function is found to control the free energy of star polymer solutions in the dilute and semidilute regions. This equation of state captures the scaling behaviour of polymer solutions in the dilute/semidilute regimes and also performs well in the concentrated regimes, where the details of the monomer-monomer interactions become important.

  15. DETERMINANTS OF FOREIGN DIRECT INVESTMENT IN NIGERIA: A MARKOV REGIME-SWITCHING APPROACH

    Directory of Open Access Journals (Sweden)

    Akinlo A. Enisan

    2017-04-01

    Full Text Available Several studies have analyzed the movement of foreign direct investment in Nigeria using a linear approach. In contrast with all existing studies in Nigeria, this paper estimates several nonlinear FDI equations in which the main determinants of FDI are identified using Markov regime-switching models (MSMs). The approach enables us to observe structural changes, where they exist, in FDI equations through time. Besides, where the FDI regression equation is truly nonlinear, MSMs fit the data better than linear models. The paper adopts the maximum likelihood methodology of the Markov regime-switching model (MSM) to identify possible structural changes in level and/or trend and possible changes in the parameters of independent variables through the transition probabilities. The results show that the FDI process in Nigeria is governed by two different regimes and that a shift from one regime to another depends on the transition probabilities. The results show that the main determinants of FDI are GDP growth, macro instability, financial development, exchange rate, inflation and the discount rate. This implies that liberalization that stems inflation and enhances the value of the domestic currency will attract more FDI into the country.
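
    A minimal sketch of a two-regime Markov-switching regression on synthetic data, using statsmodels' MarkovRegression; the FDI determinants and the Nigerian series of the study are replaced here by a single illustrative explanatory variable.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      n = 300
      x = rng.normal(size=n)
      state = (np.cumsum(rng.normal(size=n)) > 0).astype(int)        # crude persistent regimes
      y = np.where(state == 0, 0.5 + 0.2 * x, 2.0 + 1.5 * x) + rng.normal(scale=0.3, size=n)

      model = sm.tsa.MarkovRegression(endog=y, k_regimes=2, exog=x, switching_variance=True)
      res = model.fit()
      print(res.summary())            # regime-specific intercepts, slopes and transition probabilities
      print(res.expected_durations)   # expected duration of each regime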

  16. AN EXTENDED SPECTRAL–SPATIAL CLASSIFICATION APPROACH FOR HYPERSPECTRAL DATA

    Directory of Open Access Journals (Sweden)

    D. Akbari

    2017-11-01

    Full Text Available In this paper an extended classification approach for hyperspectral imagery based on both spectral and spatial information is proposed. The spatial information is obtained by an enhanced marker-based minimum spanning forest (MSF) algorithm. Three different methods of dimension reduction are first used to obtain the subspace of hyperspectral data: (1) unsupervised feature extraction methods including principal component analysis (PCA), independent component analysis (ICA), and minimum noise fraction (MNF); (2) supervised feature extraction including decision boundary feature extraction (DBFE), discriminant analysis feature extraction (DAFE), and nonparametric weighted feature extraction (NWFE); (3) genetic algorithm (GA). The spectral features obtained are then fed into the enhanced marker-based MSF classification algorithm. In the enhanced MSF algorithm, the markers are extracted from the classification maps obtained by both the SVM and the watershed segmentation algorithm. To evaluate the proposed approach, the Pavia University hyperspectral data set is tested. Experimental results show that the proposed approach using GA achieves an overall accuracy approximately 8 % higher than the original MSF-based algorithm.

  17. An Extended Spectral-Spatial Classification Approach for Hyperspectral Data

    Science.gov (United States)

    Akbari, D.

    2017-11-01

    In this paper an extended classification approach for hyperspectral imagery based on both spectral and spatial information is proposed. The spatial information is obtained by an enhanced marker-based minimum spanning forest (MSF) algorithm. Three different methods of dimension reduction are first used to obtain the subspace of hyperspectral data: (1) unsupervised feature extraction methods including principal component analysis (PCA), independent component analysis (ICA), and minimum noise fraction (MNF); (2) supervised feature extraction including decision boundary feature extraction (DBFE), discriminant analysis feature extraction (DAFE), and nonparametric weighted feature extraction (NWFE); (3) genetic algorithm (GA). The spectral features obtained are then fed into the enhanced marker-based MSF classification algorithm. In the enhanced MSF algorithm, the markers are extracted from the classification maps obtained by both the SVM and the watershed segmentation algorithm. To evaluate the proposed approach, the Pavia University hyperspectral data set is tested. Experimental results show that the proposed approach using GA achieves an overall accuracy approximately 8 % higher than the original MSF-based algorithm.
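
    A minimal sketch of the spectral part of such a pipeline: PCA-based dimension reduction followed by a pixel-wise SVM on synthetic "hyperspectral" pixels. The marker-based minimum-spanning-forest spatial step and the other feature-extraction methods of the paper are omitted; the data and pipeline settings are illustrative.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      n_bands, n_pixels_per_class, n_classes = 100, 200, 3
      # synthetic spectra: each class is a mean-shifted noisy signature across all bands
      X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_pixels_per_class, n_bands))
                     for c in range(n_classes)])
      y = np.repeat(np.arange(n_classes), n_pixels_per_class)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
      clf.fit(X_tr, y_tr)
      print("pixel-wise overall accuracy:", clf.score(X_te, y_te))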

  18. Hydroclimatic regimes: a distributed water-balance framework for hydrologic assessment, classification, and management

    Science.gov (United States)

    Weiskel, Peter K.; Wolock, David M.; Zarriello, Phillip J.; Vogel, Richard M.; Levin, Sara B.; Lent, Robert M.

    2014-01-01

    Runoff-based indicators of terrestrial water availability are appropriate for humid regions, but have tended to limit our basic hydrologic understanding of drylands – the dry-subhumid, semiarid, and arid regions which presently cover nearly half of the global land surface. In response, we introduce an indicator framework that gives equal weight to humid and dryland regions, accounting fully for both vertical (precipitation + evapotranspiration) and horizontal (groundwater + surface-water) components of the hydrologic cycle in any given location – as well as fluxes into and out of landscape storage. We apply the framework to a diverse hydroclimatic region (the conterminous USA) using a distributed water-balance model consisting of 53 400 networked landscape hydrologic units. Our model simulations indicate that about 21% of the conterminous USA either generated no runoff or consumed runoff from upgradient sources on a mean-annual basis during the 20th century. Vertical fluxes exceeded horizontal fluxes across 76% of the conterminous area. Long-term-average total water availability (TWA) during the 20th century, defined here as the total influx to a landscape hydrologic unit from precipitation, groundwater, and surface water, varied spatially by about 400 000-fold, a range of variation ~100 times larger than that for mean-annual runoff across the same area. The framework includes but is not limited to classical, runoff-based approaches to water-resource assessment. It also incorporates and reinterprets the green- and blue-water perspective now gaining international acceptance. Implications of the new framework for several areas of contemporary hydrology are explored, and the data requirements of the approach are discussed in relation to the increasing availability of gridded global climate, land-surface, and hydrologic data sets.

  19. Domain Adaptation for Opinion Classification: A Self-Training Approach

    Directory of Open Access Journals (Sweden)

    Yu, Ning

    2013-03-01

    Full Text Available Domain transfer is a widely recognized problem for machine learning algorithms because models built upon one data domain generally do not perform well in another data domain. This is especially a challenge for tasks such as opinion classification, which often has to deal with insufficient quantities of labeled data. This study investigates the feasibility of self-training in dealing with the domain transfer problem in opinion classification via leveraging labeled data in non-target data domain(s) and unlabeled data in the target domain. Specifically, self-training is evaluated for effectiveness in sparse data situations and feasibility for domain adaptation in opinion classification. Three types of Web content are tested: edited news articles, semi-structured movie reviews, and the informal and unstructured content of the blogosphere. Findings of this study suggest that, when there are limited labeled data, self-training is a promising approach for opinion classification, although the contributions vary across data domains. Significant improvement was demonstrated for the most challenging data domain - the blogosphere - when a domain transfer-based self-training strategy was implemented.
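
    A minimal sketch of self-training for domain transfer: a classifier trained on labelled source-domain examples repeatedly pseudo-labels its most confident target-domain predictions and retrains. The data generator, confidence cut-off and number of rounds are synthetic assumptions, not the study's text-classification setup.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)

      def make_domain(shift, n=200):
          X = rng.normal(loc=shift, size=(n, 5))
          y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)   # same concept, shifted inputs
          return X, y

      X_src, y_src = make_domain(shift=0.0)
      X_tgt, y_tgt = make_domain(shift=0.7)                 # target labels used only for scoring

      X_lab, y_lab = X_src.copy(), y_src.copy()
      unlabeled = np.ones(len(X_tgt), dtype=bool)
      clf = LogisticRegression().fit(X_lab, y_lab)

      for _ in range(5):                                    # a few self-training rounds
          proba = clf.predict_proba(X_tgt)
          confident = unlabeled & (proba.max(axis=1) > 0.9) # hypothetical confidence cut-off
          if not confident.any():
              break
          X_lab = np.vstack([X_lab, X_tgt[confident]])
          y_lab = np.concatenate([y_lab, proba[confident].argmax(axis=1)])
          unlabeled &= ~confident
          clf = LogisticRegression().fit(X_lab, y_lab)

      print("target-domain accuracy after self-training:", clf.score(X_tgt, y_tgt))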

  20. A hybrid ensemble learning approach to star-galaxy classification

    Science.gov (United States)

    Kim, Edward J.; Brunner, Robert J.; Carrasco Kind, Matias

    2015-10-01

    There exist a variety of star-galaxy classification techniques, each with its own strengths and weaknesses. In this paper, we present a novel meta-classification framework that combines and fully exploits different techniques to produce a more robust star-galaxy classification. To demonstrate this hybrid, ensemble approach, we combine a purely morphological classifier, a supervised machine learning method based on random forest, an unsupervised machine learning method based on self-organizing maps, and a hierarchical Bayesian template-fitting method. Using data from the CFHTLenS survey (Canada-France-Hawaii Telescope Lensing Survey), we consider different scenarios: when a high-quality training set is available with spectroscopic labels from DEEP2 (Deep Extragalactic Evolutionary Probe Phase 2), SDSS (Sloan Digital Sky Survey), VIPERS (VIMOS Public Extragalactic Redshift Survey), and VVDS (VIMOS VLT Deep Survey), and when the demographics of sources in a low-quality training set do not match the demographics of objects in the test data set. We demonstrate that our Bayesian combination technique improves the overall performance over any individual classification method in these scenarios. Thus, strategies that combine the predictions of different classifiers may prove to be optimal in currently ongoing and forthcoming photometric surveys, such as the Dark Energy Survey and the Large Synoptic Survey Telescope.
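
    A minimal sketch of combining heterogeneous classifiers for a star/galaxy-style binary problem; a plain soft-voting average stands in for the paper's Bayesian combination technique, and the data and member classifiers are illustrative stand-ins.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier, VotingClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.naive_bayes import GaussianNB

      X, y = make_classification(n_samples=1000, n_features=8, n_informative=5, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      members = [("morph", LogisticRegression(max_iter=1000)),   # stand-in "morphological" classifier
                 ("rf", RandomForestClassifier(random_state=0)), # supervised learner
                 ("nb", GaussianNB())]                           # stand-in "template-fitting" classifier
      ensemble = VotingClassifier(estimators=members, voting="soft").fit(X_tr, y_tr)

      for name, est in members:
          print(name, est.fit(X_tr, y_tr).score(X_te, y_te))     # individual member performance
      print("combined:", ensemble.score(X_te, y_te))             # soft-vote combination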

  1. An intelligent software approach to ultrasonic flaw classification in weldments

    International Nuclear Information System (INIS)

    Song, Sung Jin; Kim, Hak Joon; Lee, Hyun

    1997-01-01

    Ultrasonic pattern recognition is the most effective approach to the problem of discriminating types of flaws in weldments based on ultrasonic flaw signals. In spite of significant progress on this methodology, it has not been widely used in practical ultrasonic inspection of weldments in industry. Hence, for the convenient application of this approach in many practical situations, we develop intelligent ultrasonic signature classification software that can discriminate types of flaws in weldments using various tools from artificial intelligence, such as neural networks. This software shows excellent performance in an experimental problem where flaws in weldments are classified into two categories: cracks and non-cracks.

  2. Data classification and MTBF prediction with a multivariate analysis approach

    International Nuclear Information System (INIS)

    Braglia, Marcello; Carmignani, Gionata; Frosolini, Marco; Zammori, Francesco

    2012-01-01

    The paper presents a multivariate statistical approach that supports the classification of mechanical components, subjected to specific operating conditions, in terms of the Mean Time Between Failure (MTBF). Assessing the influence of working conditions and/or environmental factors on the MTBF is a prerequisite for the development of an effective preventive maintenance plan. However, this task may be demanding and it is generally performed with ad-hoc experimental methods lacking statistical rigor. To solve this common problem, a step-by-step multivariate data classification technique is proposed. Specifically, a set of structured failure data are classified in a meaningful way by means of: (i) cluster analysis, (ii) multivariate analysis of variance, (iii) feature extraction and (iv) predictive discriminant analysis. This makes it possible not only to define the MTBF of the analyzed components, but also to identify the working parameters that explain most of the variability of the observed data. The approach is finally demonstrated on 126 centrifugal pumps installed in an oil refinery plant; the obtained results demonstrate the quality of the final discrimination, in terms of data classification and failure prediction.
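
    A minimal sketch of the step-wise idea: failure records are grouped by cluster analysis and a discriminant model then predicts group membership from operating conditions. The pump data are synthetic, and the MANOVA and feature-extraction steps of the paper are omitted.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      # hypothetical pump records: [temperature, flow rate, head, observed MTBF in hours]
      records = np.vstack([rng.normal([60, 100, 30, 8000], [5, 10, 3, 500], (40, 4)),
                           rng.normal([90, 140, 45, 3000], [5, 10, 3, 500], (40, 4))])

      X = StandardScaler().fit_transform(records)
      groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)   # cluster analysis

      # predictive discriminant analysis: operating conditions -> MTBF group
      lda = LinearDiscriminantAnalysis().fit(records[:, :3], groups)
      print("group MTBF means:", [records[groups == g, 3].mean() for g in (0, 1)])
      print("predicted group for a new pump:", lda.predict([[85, 135, 44]]))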

  3. Assessing the Approaches to Classification of the State Financial Control

    Directory of Open Access Journals (Sweden)

    Baraniuk Yurii R.

    2017-11-01

    Full Text Available The article is aimed at assessing the approaches to classification of the State financial control, as well as disclosing the relationship and differences between its forms, types and methods. The results of a comparative analysis of existing classifications of the State financial control have been covered. The substantiation of its identification by forms, types and methods of control was explored. Clarification of the interpretation of the concepts of «form of control», «type of control», «subtype of control», «method of control» and «methodical reception of control» has been provided. It has been determined that the form of the State financial control is a manifestation of the internal organization of control and the methods of its carrying out; a model of classification of the State financial control has been substantiated; attributes of the first and second order have been allocated; the substantiation of methods and techniques has been improved; their composition and structure have been identified. This approach allows dividing general questions of the State financial control into theoretical and practical ones and, taking into consideration the expansion of the list of objects of the State financial control, will help to improve its methodology.

  4. Automatic classification of hyperactive children: comparing multiple artificial intelligence approaches.

    Science.gov (United States)

    Delavarian, Mona; Towhidkhah, Farzad; Gharibzadeh, Shahriar; Dibajnia, Parvin

    2011-07-12

    Automatic classification of different behavioral disorders with many similarities (e.g. in symptoms) will help psychiatrists to concentrate on the correct disorder and its treatment as soon as possible, to avoid wasting time on diagnosis, and to increase the accuracy of diagnosis. In this study, we tried to differentiate and classify (diagnose) 306 children with many similar symptoms and different behavioral disorders such as ADHD, depression, anxiety, comorbid depression and anxiety, and conduct disorder with high accuracy. Classification was based on the symptoms and their severity. By examining 16 different available classifiers using "Prtools", we propose the nearest mean classifier as the most accurate classifier, with 96.92% accuracy, in this research. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
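
    A minimal sketch of a nearest mean (nearest centroid) classifier on synthetic symptom-severity vectors; the disorder labels and severity profiles are illustrative stand-ins, not the study's clinical data.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.neighbors import NearestCentroid

      rng = np.random.default_rng(0)
      disorders = ["ADHD", "depression", "anxiety", "conduct"]
      # each class: a characteristic mean symptom-severity profile plus noise
      X = np.vstack([rng.normal(loc=center, scale=1.0, size=(50, 10))
                     for center in ([3, 1, 1, 0, 2, 0, 1, 0, 0, 1],
                                    [0, 3, 2, 1, 0, 0, 0, 1, 2, 0],
                                    [0, 2, 3, 0, 0, 1, 0, 2, 1, 0],
                                    [1, 0, 0, 3, 1, 2, 2, 0, 0, 1])])
      y = np.repeat(disorders, 50)

      scores = cross_val_score(NearestCentroid(), X, y, cv=5)
      print("cross-validated accuracy:", scores.mean())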

  5. Non-invasive classification of gas–liquid two-phase horizontal flow regimes using an ultrasonic Doppler sensor and a neural network

    OpenAIRE

    Abbagoni, Baba Musa; Yeung, Hoi

    2016-01-01

    The identification of flow pattern is a key issue in multiphase flow, which is encountered in the petrochemical industry. It is difficult to identify gas–liquid flow regimes objectively in gas–liquid two-phase flow. This paper presents the feasibility of a clamp-on instrument for objective flow regime classification of two-phase flow using an ultrasonic Doppler sensor and an artificial neural network, which records and processes the ultrasonic signals reflected from the two-phase ...

  6. A Risk-Based Ecohydrological Approach to Assessing Environmental Flow Regimes

    Science.gov (United States)

    Mcgregor, Glenn B.; Marshall, Jonathan C.; Lobegeiger, Jaye S.; Holloway, Dean; Menke, Norbert; Coysh, Julie

    2018-03-01

    For several decades there has been recognition that water resource development alters river flow regimes and impacts ecosystem values. Determining strategies to protect or restore flow regimes to achieve ecological outcomes is a focus of water policy and legislation in many parts of the world. However, consideration of existing environmental flow assessment approaches for application in Queensland identified deficiencies precluding their adoption. Firstly, in managing flows and using ecosystem condition as an indicator of effectiveness, many approaches ignore the fact that river ecosystems are subjected to threatening processes other than flow regime alteration. Secondly, many focus on providing flows for responses without considering how often they are necessary to sustain ecological values in the long term. Finally, few consider requirements at spatial scales relevant to the desired outcomes, with frequent focus on individual places rather than the regions supporting sustainability. Consequently, we developed a risk-based ecohydrological approach that identifies ecosystem values linked to desired ecological outcomes, is sensitive to flow alteration and uses indicators of broader ecosystem requirements. Monitoring and research are undertaken to quantify flow dependencies, and ecological modelling is used to quantify flow-related ecological responses over an historical flow period. The relative risk from different flow management scenarios can be evaluated at relevant spatial scales. This overcomes the deficiencies identified above and provides a robust and useful foundation upon which to build the information needed to support water planning decisions. Application of the risk assessment approach is illustrated here by two case studies.

  7. Flow regimes

    International Nuclear Information System (INIS)

    Kh'yuitt, G.

    1980-01-01

    An introduction to the problem of two-phase flows is presented. Flow regimes arising in two-phase flows are described, and a classification of these regimes is given. Structures of vertical and horizontal two-phase flows and a method of their identification using regime maps are considered. The limits of this method's application are discussed. The flooding phenomenon, the phenomenon of flow-direction change (flow reversal), and the interrelation of these phenomena, as well as the transitions from the slug regime to the churn regime and from the churn regime to the annular regime in vertical flows, are described. Problems of phase transitions and equilibrium are discussed. Flow regimes in tubes in which an evaporating liquid flows are also described.

  8. Exploring the physical controls of regional patterns of flow duration curves – Part 3: A catchment classification system based on regime curve indicators

    Directory of Open Access Journals (Sweden)

    M. Sivapalan

    2012-11-01

    Full Text Available Predictions of hydrological responses in ungauged catchments can benefit from a classification scheme that can organize and pool together catchments that exhibit a level of hydrologic similarity, especially similarity in some key variable or signature of interest. Since catchments are complex systems with a level of self-organization arising from the co-evolution of climate and landscape properties, including vegetation, there is much to be gained from developing a classification system based on a comparative study of a population of catchments across climatic and landscape gradients. The focus of this paper is on climate seasonality and the seasonal runoff regime, as characterized by the ensemble mean of the within-year variation of climate and runoff. The work on regime behavior is part of an overall study of the physical controls on regional patterns of flow duration curves (FDCs), motivated by the fact that regime behavior leaves a major imprint upon the shape of FDCs, especially the slope of the FDCs. As an exercise in comparative hydrology, the paper seeks to assess the regime behavior of 428 catchments from the MOPEX database simultaneously, classifying and regionalizing them into homogeneous or hydrologically similar groups. A decision tree is developed on the basis of a metric chosen to characterize similarity of regime behavior, using a variant of the Iterative Dichotomiser 3 (ID3) algorithm to form a classification tree and associated catchment classes. In this way, several classes of catchments are distinguished, in which the connection between the catchments' regime behavior and climate and catchment properties becomes clearer. Only four similarity indices are entered into the algorithm, all of which are obtained from smoothed daily regime curves of climatic variables and runoff. Results demonstrate that climate seasonality plays the most significant role in the classification of US catchments, with rainfall timing and climatic aridity index
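
    A minimal sketch of classifying catchments from a few regime-curve indices with a decision tree; sklearn's CART implementation stands in for the ID3 variant used in the paper, and the indices, class labels and data are synthetic.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(0)
      n = 200
      # hypothetical similarity indices derived from smoothed daily regime curves
      aridity = rng.uniform(0.2, 3.0, n)            # climatic aridity index
      p_seasonality = rng.uniform(0.0, 1.0, n)      # precipitation seasonality
      runoff_timing = rng.uniform(0, 365, n)        # day of year of peak runoff
      snow_fraction = rng.uniform(0.0, 0.8, n)      # fraction of precipitation as snow

      X = np.column_stack([aridity, p_seasonality, runoff_timing, snow_fraction])
      # illustrative "true" classes driven mainly by aridity and seasonality
      y = np.where(aridity > 1.5, "dry",
                   np.where(p_seasonality > 0.5, "seasonal-humid", "uniform-humid"))

      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
      print(export_text(tree, feature_names=["aridity", "p_seasonality",
                                             "runoff_timing", "snow_fraction"]))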

  9. A dictionary learning approach for human sperm heads classification.

    Science.gov (United States)

    Shaker, Fariba; Monadjemi, S Amirhassan; Alirezaie, Javad; Naghsh-Nilchi, Ahmad Reza

    2017-12-01

    To diagnose infertility in men, semen analysis is conducted in which sperm morphology is one of the factors that are evaluated. Since manual assessment of sperm morphology is time-consuming and subjective, automatic classification methods are being developed. Automatic classification of sperm heads is a complicated task due to the intra-class differences and inter-class similarities of class objects. In this research, a Dictionary Learning (DL) technique is utilized to construct a dictionary of sperm head shapes. This dictionary is used to classify the sperm heads into four different classes. Square patches are extracted from the sperm head images. Columnized patches from each class of sperm are used to learn class-specific dictionaries. The patches from a test image are reconstructed using each class-specific dictionary and the overall reconstruction error for each class is used to select the best matching class. Average accuracy, precision, recall, and F-score are used to evaluate the classification method. The method is evaluated using two publicly available datasets of human sperm head shapes. The proposed DL based method achieved an average accuracy of 92.2% on the HuSHeM dataset, and an average recall of 62% on the SCIAN-MorphoSpermGS dataset. The results show a significant improvement compared to a previously published shape-feature-based method. We have achieved high-performance results. In addition, our proposed approach offers a more balanced classifier in which all four classes are recognized with high precision and recall. In this paper, we use a Dictionary Learning approach in classifying human sperm heads. It is shown that the Dictionary Learning method is far more effective in classifying human sperm heads than classifiers using shape-based features. Also, a dataset of human sperm head shapes is introduced to facilitate future research. Copyright © 2017 Elsevier Ltd. All rights reserved.
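
    A minimal sketch of classification by class-specific dictionaries: one dictionary is learned per class and a test vector is assigned to the class whose dictionary reconstructs it with the smallest error. Synthetic low-dimensional vectors stand in for the sperm-head image patches of the paper, and the dictionary sizes are illustrative.

      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning

      rng = np.random.default_rng(0)

      def make_class(basis, n=100):
          # samples lie approximately in the span of a small class-specific basis
          return rng.normal(size=(n, basis.shape[0])) @ basis + 0.05 * rng.normal(size=(n, basis.shape[1]))

      bases = [rng.normal(size=(3, 30)) for _ in range(2)]            # two hypothetical classes
      train = [make_class(b) for b in bases]
      dicts = [MiniBatchDictionaryLearning(n_components=5, alpha=0.1, random_state=0).fit(Xc)
               for Xc in train]

      def classify(x):
          # pick the class whose learned dictionary gives the smallest reconstruction error
          errors = []
          for d in dicts:
              code = d.transform(x.reshape(1, -1))
              recon = code @ d.components_
              errors.append(np.linalg.norm(x - recon.ravel()))
          return int(np.argmin(errors))

      test = np.vstack([make_class(b, n=5) for b in bases])
      print([classify(x) for x in test])                              # expect mostly 0s then 1s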

  10. A three-way approach for protein function classification.

    Directory of Open Access Journals (Sweden)

    Hafeez Ur Rehman

    Full Text Available The knowledge of protein functions plays an essential role in understanding biological cells and has a significant impact on human life in areas such as personalized medicine, better crops and improved therapeutic interventions. Due to the expense and inherent difficulty of biological experiments, intelligent methods are generally relied upon for automatic assignment of functions to proteins. The technological advancements in the field of biology are improving our understanding of biological processes and are regularly resulting in new features and characteristics that better describe the role of proteins. These anticipated features should not be neglected or overlooked in designing more effective classification techniques. A key issue in this context, which is not being sufficiently addressed, is how to build effective classification models and approaches for protein function prediction by incorporating and taking advantage of the ever-evolving biological information. In this article, we propose a three-way decision making approach which provides provisions for seeking and incorporating future information. We considered probabilistic rough set based models such as Game-Theoretic Rough Sets (GTRS) and Information-Theoretic Rough Sets (ITRS) for inducing three-way decisions. An architecture of protein function classification with probabilistic rough set based three-way decisions is proposed and explained. Experiments are carried out on a Saccharomyces cerevisiae species dataset obtained from the Uniprot database with the corresponding functional classes extracted from the Gene Ontology (GO) database. The results indicate that as the level of biological information increases, the number of deferred cases is reduced while maintaining a similar level of accuracy.
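
    A minimal sketch of three-way decision making on a predicted probability: accept the function label above alpha, reject below beta, and defer in between pending further biological evidence. The (alpha, beta) pair here is fixed by hand for illustration; in the paper GTRS/ITRS would be used to derive such thresholds.

      def three_way_decision(p_function, alpha=0.75, beta=0.35):
          """p_function: estimated probability that a protein carries the GO function (illustrative thresholds)."""
          if p_function >= alpha:
              return "accept (annotate with the function)"
          if p_function <= beta:
              return "reject (do not annotate)"
          return "defer (await more informative features)"


      if __name__ == "__main__":
          for p in (0.9, 0.5, 0.1):
              print(p, "->", three_way_decision(p))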

  11. Comparison of four approaches to a rock facies classification problem

    Science.gov (United States)

    Dubois, M.K.; Bohling, Geoffrey C.; Chakrabarti, S.

    2007-01-01

    In this study, seven classifiers based on four different approaches were tested in a rock facies classification problem: classical parametric methods using Bayes' rule, and non-parametric methods using fuzzy logic, k-nearest neighbor, and feed-forward back-propagating artificial neural networks. Determining the most effective classifier for geologic facies prediction in wells without cores in the Panoma gas field, in Southwest Kansas, was the objective. Study data include 3600 samples with known rock facies class (from core), with each sample having either four or five measured properties (wire-line log curves) and two derived geologic properties (geologic constraining variables). The sample set was divided into two subsets, one for training and one for testing the ability of the trained classifier to correctly assign classes. Artificial neural networks clearly outperformed all other classifiers and are effective tools for this particular classification problem. Classical parametric models were inadequate due to the nature of the predictor variables (high dimensional and not linearly correlated) and the feature space of the classes (overlapping). The other non-parametric methods tested, k-nearest neighbor and fuzzy logic, would need considerable improvement to match the neural network effectiveness, but further work, possibly combining certain aspects of the three non-parametric methods, may be justified. © 2006 Elsevier Ltd. All rights reserved.

  12. A neural network approach for radiographic image classification in NDT

    International Nuclear Information System (INIS)

    Lavayssiere, B.

    1993-05-01

    Radiography is used by EDF for pipe inspection in nuclear power plants in order to detect defects. The radiographs obtained are then digitized according to a well-defined protocol. The aim of EDF is to develop a non-destructive testing system for recognizing defects. In this note, we describe the procedure for recognizing areas with defects. We first present the digitization protocol, specify the poor quality of the images under study, and propose a procedure to enhance defects. We then examine the problem raised by the choice of good features for classification. After having proved that statistical or standard textural features such as homogeneity, entropy or contrast are not relevant, we develop a geometrical-statistical approach based on the cooperation between a study of signal correlations and an analysis of regional extrema. The principle consists of analysing and comparing, for areas with and without defects, the evolution of conditional probability matrices for increasing neighbourhood sizes, the shape of variograms and the location of regional minima. We demonstrate that the anisotropy and surface of the series of 'comet tails' associated with probability matrices, the variogram slope and statistical indices, and the location of regional extrema are features able to discriminate areas with defects from areas without any. The classification is then realized by a neural network, whose structure, properties and learning mechanisms are detailed. Finally we discuss the results. (author). 5 figs., 21 refs

  13. A revised 3-column classification approach for the surgical planning of extended lateral tibial plateau fractures.

    Science.gov (United States)

    Hoekstra, H; Kempenaers, K; Nijs, S

    2017-10-01

    Variable angle locking compression plates allow for lateral buttressing and support of the posterolateral joint surface of tibial plateau fractures. This gives room for improvement of the surgical 3-column classification approach. Our aim was to revise and validate the 3-column classification approach to better guide the surgical planning of tibial plateau fractures extending into the posterolateral corner. In contrast to the original 3-column classification approach, in the revised approach the posterior border of the lateral column lies posterior to, rather than anterior to, the fibula. According to the revised 3-column classification approach, extended lateral column fractures are defined as single lateral column fractures extending posteriorly into the posterolateral corner. CT images of 36 patients were reviewed and classified twice online according to the Schatzker and the revised 3-column classification approach by five observers. The intraobserver reliability was calculated using Cohen's kappa and the interobserver reliability was calculated using Fleiss' kappa. The intraobserver reliability showed substantial agreement according to Landis and Koch for both the Schatzker and the revised 3-column classification approach (0.746 vs. 0.782, p = 0.37, Schatzker vs. revised 3-column, respectively). However, the interobserver reliability of the revised 3-column classification approach was significantly higher than that of the Schatzker classification (0.531 vs. 0.669, Schatzker vs. revised 3-column, respectively). With the introduction of variable angle locking compression plates, the revised 3-column classification approach is a very helpful tool in the preoperative surgical planning of tibial plateau fractures, in particular lateral column fractures that extend into the posterolateral corner. The revised 3-column classification approach is rather a practical supplement to the Schatzker classification. It has a significantly higher interobserver reliability as compared to the Schatzker classification.

  14. Chorological classification approach for species and ecosystem conservation practice

    Science.gov (United States)

    Rogova, T. V.; Kozevnikova, M. V.; Prokhorov, V. E.; Timofeeva, N. O.

    2018-01-01

    The habitat type allocation approach based on the EUNIS Habitat Classification and the JUICE version 7 software is used for the conservation of species and ecosystem biodiversity. Using the vegetation plots of the Vegetation Database of Tatarstan, included in the EVA (European Vegetation Archive) and GIVD (Global Index of Vegetation-plots Databases), types of habitats of dry meadows and steppes are distinguished by the differing compositions of the leading families composing their flora - Asteraceae, Fabaceae, Poaceae and Rosaceae. Two types were identified: E12a - Semi-dry perennial calcareous grassland, and E12b - Perennial calcareous grassland and basic steppes. The selected group of relevés that do not correspond to any of the EUNIS types can be considered specific to the ecotone forest-steppe landscapes of the southeast of the Republic of Tatarstan. In all types of studied habitats, rare and protected plant species are noted, most of which are South-East-European-Asian species.

  15. PATTERN CLASSIFICATION APPROACHES TO MATCHING BUILDING POLYGONS AT MULTIPLE SCALES

    Directory of Open Access Journals (Sweden)

    X. Zhang

    2012-07-01

    Full Text Available Matching of building polygons with different levels of detail is crucial in the maintenance and quality assessment of multi-representation databases. Two general problems need to be addressed in the matching process: (1) Which criteria are suitable? (2) How to effectively combine different criteria to make decisions? This paper mainly focuses on the second issue and views data matching as a supervised pattern classification. Several classifiers (i.e. decision trees, Naive Bayes and support vector machines) are evaluated for the matching task. Four criteria (i.e. position, size, shape and orientation) are used to extract information for these classifiers. Evidence shows that these classifiers outperformed the weighted average approach.
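
    A minimal sketch of treating polygon matching as supervised classification: candidate pairs are described by four difference criteria (position, size, shape, orientation) and a decision tree, Naive Bayes and an SVM are compared. The pair features and labels are synthetic assumptions, not the paper's building data.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.naive_bayes import GaussianNB
      from sklearn.svm import SVC
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)

      def make_pairs(n, matching):
          scale = 0.2 if matching else 1.0
          position = np.abs(rng.normal(0, scale, n))        # centroid distance (normalised)
          size = np.abs(rng.normal(0, scale, n))            # area difference ratio
          shape = np.abs(rng.normal(0, scale, n))           # shape dissimilarity
          orientation = np.abs(rng.normal(0, scale, n))     # orientation difference
          return np.column_stack([position, size, shape, orientation])

      X = np.vstack([make_pairs(300, True), make_pairs(300, False)])
      y = np.array([1] * 300 + [0] * 300)                   # 1 = matching pair

      for name, clf in [("decision tree", DecisionTreeClassifier(random_state=0)),
                        ("naive Bayes", GaussianNB()),
                        ("SVM", SVC())]:
          print(name, cross_val_score(clf, X, y, cv=5).mean())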

  16. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive, parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  17. Current legal regime for environmental impact assessment in areas beyond national jurisdiction and its future approaches

    International Nuclear Information System (INIS)

    Ma, Deqiang; Fang, Qinhua; Guan, Song

    2016-01-01

    In 2004, the United Nations launched an Ad Hoc Open-ended Informal Working Group to study issues relating to the conservation and sustainable use of marine biological diversity in areas beyond national jurisdiction. Since then, the topic of governing marine areas beyond national jurisdiction (ABNJ) has been widely discussed by politicians, policy makers and scholars. As one of the management tools to protect marine biodiversity in ABNJ, environmental impact assessment (EIA) has been widely recognized and accepted by the international community; however, the biggest challenge is how to effectively implement the EIA regime in ABNJ. This paper explores the impacts of anthropogenic activities in ABNJ on marine ecosystems, reviews the existing legal regime for EIA in ABNJ and discusses possible measures to strengthen the implementation of EIA in ABNJ. - Highlights: • We identify human activities in ABNJ and their impacts on marine ecosystems. • We analyze the characteristics and gaps of the existing legal regime for EIA in ABNJ. • We analyze the pros and cons of alternative approaches to EIA in ABNJ.

  18. Current legal regime for environmental impact assessment in areas beyond national jurisdiction and its future approaches

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Deqiang [Fujian Provincial Key Laboratory for Coastal Ecology and Environmental Studies, Xiamen University, 361102 (China); Coastal and Ocean Management Institute, Xiamen University, 361102 (China); Fang, Qinhua, E-mail: qhfang@xmu.edu.cn [Fujian Provincial Key Laboratory for Coastal Ecology and Environmental Studies, Xiamen University, 361102 (China); Coastal and Ocean Management Institute, Xiamen University, 361102 (China); Guan, Song [Coastal and Ocean Management Institute, Xiamen University, 361102 (China)

    2016-01-15

    In 2004, the United Nations launched an Ad Hoc Open-ended Informal Working Group to study issues relating to the conservation and sustainable use of marine biological diversity in areas beyond national jurisdiction. Since then, the topic of governing marine areas beyond national jurisdiction (ABNJ) has been widely discussed by politicians, policy makers and scholars. As one of the management tools to protect marine biodiversity in ABNJ, environmental impact assessment (EIA) has been widely recognized and accepted by the international community; however, the biggest challenge is how to effectively implement the EIA regime in ABNJ. This paper explores the impacts of anthropogenic activities in ABNJ on marine ecosystems, reviews the existing legal regime for EIA in ABNJ and discusses possible measures to strengthen the implementation of EIA in ABNJ. - Highlights: • We identify human activities in ABNJ and their impacts on marine ecosystems. • We analyze the characteristics and gaps of the existing legal regime for EIA in ABNJ. • We analyze the pros and cons of alternative approaches to EIA in ABNJ.

  19. A simplified approach for the molecular classification of glioblastomas.

    Directory of Open Access Journals (Sweden)

    Marie Le Mercier

    Full Text Available Glioblastoma (GBM) is the most common malignant primary brain tumor in adults and exhibits striking aggressiveness. Although GBMs constitute a single histological entity, they exhibit considerable variability in biological behavior, resulting in significant differences in terms of prognosis and response to treatment. In an attempt to better understand the biology of GBM, many groups have performed high-scale profiling studies based on gene or protein expression. These studies have revealed the existence of several GBM subtypes. Although there remains to be a clear consensus, two to four major subtypes have been identified. Interestingly, these different subtypes are associated with both differential prognoses and responses to therapy. In the present study, we investigated an alternative immunohistochemistry (IHC)-based approach to achieve a molecular classification for GBM. For this purpose, a cohort of 100 surgical GBM samples was retrospectively evaluated by immunohistochemical analysis of EGFR, PDGFRA and p53. The quantitative analysis of these immunostainings allowed us to identify the following two GBM subtypes: the "Classical-like" (CL) subtype, characterized by EGFR-positive and p53- and PDGFRA-negative staining, and the "Proneural-like" (PNL) subtype, characterized by p53- and/or PDGFRA-positive staining. This classification represents an independent prognostic factor in terms of overall survival compared to age, extent of resection and adjuvant treatment, with a significantly longer survival associated with the PNL subtype. Moreover, these two GBM subtypes exhibited different responses to chemotherapy. The addition of temozolomide to conventional radiotherapy significantly improved the survival of patients belonging to the CL subtype, but it did not affect the survival of patients belonging to the PNL subtype. We have thus shown that it is possible to differentiate between different clinically relevant subtypes of GBM by using IHC
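
    A minimal sketch of the IHC decision rule described in the abstract: positive/negative staining status for EGFR, p53 and PDGFRA maps a sample to the "Classical-like" (CL) or "Proneural-like" (PNL) subtype. The conversion of raw immunostaining scores into positive/negative calls is assumed to have been done beforehand, and samples not covered by the two stated patterns are simply reported as unclassified here.

      def gbm_subtype(egfr_positive, p53_positive, pdgfra_positive):
          # PNL: p53- and/or PDGFRA-positive staining
          if p53_positive or pdgfra_positive:
              return "PNL (Proneural-like)"
          # CL: EGFR-positive with p53- and PDGFRA-negative staining
          if egfr_positive:
              return "CL (Classical-like)"
          return "unclassified by this scheme"


      print(gbm_subtype(egfr_positive=True, p53_positive=False, pdgfra_positive=False))   # CL
      print(gbm_subtype(egfr_positive=False, p53_positive=True, pdgfra_positive=False))   # PNL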

  20. An attempt of classification of theoretical approaches to national identity

    Directory of Open Access Journals (Sweden)

    Milošević-Đorđević Jasna S.

    2003-01-01

    Full Text Available Complex social concepts are necessarily defined in different ways and approached from the perspectives of different scientific disciplines. Therefore, it is difficult to define them precisely without overlapping in meaning with other similar concepts. This paper attempts a theoretical classification of national identity and differentiates that concept from other related concepts (race, ethnic group, nation, national background, authoritativeness, patriarchy). Theoretical assessments are classified into two groups: those dealing with the nature of national identity and those stating one or more dimensions of national identity crucial for its determination. In contrast to the primordialist concept of national identity, which describes it as a fundamental, deeply rooted human feature, there are numerous contemporary theoretical approaches (instrumentalist, constructivist, functionalist) emphasizing the changeable, fluid, instrumental function of national identity. Fundamental determinants of national identity are: language, culture (music, traditional myths), state symbols (territory, citizenship), self-categorization, religion, and a set of personal characteristics and values.

  1. A Modeling Approach for Evaluating the Coupled Riparian Vegetation-Geomorphic Response to Altered Flow Regimes

    Science.gov (United States)

    Manners, R.; Wilcox, A. C.; Merritt, D. M.

    2016-12-01

    The ecogeomorphic response of riparian ecosystems to a change in hydrologic properties is difficult to predict because of the interactions and feedbacks among plants, water, and sediment. Most riparian models of community dynamics assume a static channel, yet geomorphic processes strongly control the establishment and survival of riparian vegetation. Using a combination of approaches that includes empirical relationships and hydrodynamic models, we model the coupled vegetation-topographic response to a shift in the flow regime at three cross-sections on the Yampa and Green Rivers in Dinosaur National Monument. The locations represent the variable geomorphology and vegetation composition of these canyon-bound rivers. We compute the inundation and hydraulic properties of vegetation plots surveyed over three years within the International River Interface Cooperative (iRIC) Fastmech solver, equipped with a vegetation module that accounts for flexible stems and plant reconfiguration. The presence of functional groupings of plants (plants that respond similarly to environmental factors such as water availability and disturbance) is determined from flow response curves developed for the Yampa River. Using field measurements of vegetation morphology, distance from the channel centerline, and dominant particle size, together with modeled inundation properties, we develop an empirical relationship between these variables and topographic change. We evaluate vegetation and channel form changes over decadal timescales, allowing for the integration of processes over time. From our analyses, we identify thresholds in the flow regime that alter the distribution of plants and reduce geomorphic complexity, predominantly through side-channel and backwater infilling. Simplification of some processes (e.g., empirically derived sedimentation) and detailed treatment of others (e.g., plant-flow interactions) allows us to model the coupled dynamics of riparian ecosystems and evaluate the impact of

  2. Classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    This article presents and discusses definitions of the term “classification” and the related concepts “Concept/conceptualization,” “categorization,” “ordering,” “taxonomy” and “typology.” It further presents and discusses theories of classification including the influences of Aristotle...... and Wittgenstein. It presents different views on forming classes, including logical division, numerical taxonomy, historical classification, hermeneutical and pragmatic/critical views. Finally, issues related to artificial versus natural classification and taxonomic monism versus taxonomic pluralism are briefly...

  3. Evaluation of Current Approaches to Stream Classification and a Heuristic Guide to Developing Classifications of Integrated Aquatic Networks

    Science.gov (United States)

    Melles, S. J.; Jones, N. E.; Schmidt, B. J.

    2014-03-01

    Conservation and management of fresh flowing waters involves evaluating and managing the effects of cumulative impacts on the aquatic environment from disturbances such as land use change, point and nonpoint source pollution, the creation of dams and reservoirs, mining, and fishing. To assess the effects of these changes on associated biotic communities, it is necessary to monitor and report on the status of lotic ecosystems. A variety of stream classification methods are available to assist with these tasks, and such methods attempt to provide a systematic approach to modeling and understanding complex aquatic systems at various spatial and temporal scales. Of the vast number of approaches that exist, it is useful to group them into three main types. The first involves modeling longitudinal species turnover patterns within large drainage basins and relating these patterns to environmental predictors collected at reach and upstream catchment scales; the second uses regionalized hierarchical classification to create multi-scale, spatially homogenous aquatic ecoregions by grouping adjacent catchments together based on environmental similarities; and the third approach groups sites together on the basis of similarities in their environmental conditions both within and between catchments, independent of their geographic location. We review the literature with a focus on more recent classifications to examine the strengths and weaknesses of the different approaches. We identify gaps or problems with the current approaches, and we propose an eight-step heuristic process that may assist with the development of more flexible and integrated aquatic classifications based on current understanding, network thinking, and theoretical underpinnings.

  4. A discriminant function approach to ecological site classification in northern New England

    Science.gov (United States)

    James M. Fincher; Marie-Louise Smith

    1994-01-01

    Describes one approach to ecologically based classification of upland forest community types of the White and Green Mountain physiographic regions. The classification approach is based on an intensive statistical analysis of the relationship between the communities and soil-site factors. Discriminant functions useful in distinguishing between types based on soil-site...
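    As a rough illustration of the statistical machinery named in the title, the sketch below fits a linear discriminant analysis that separates community types from soil-site factors; the variables, data and scikit-learn implementation are invented placeholders, not the study's actual discriminant functions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical soil-site factors (e.g. drainage, depth to bedrock, slope) for plots
# belonging to three forest community types (labels 0, 1, 2).
X = rng.normal(size=(90, 3)) + np.repeat(np.arange(3), 30)[:, None]
y = np.repeat(np.arange(3), 30)

lda = LinearDiscriminantAnalysis().fit(X, y)

# The fitted discriminant functions assign new plots to a community type
# from their soil-site measurements alone.
new_plot = [[0.2, 1.1, 0.8]]
print(lda.predict(new_plot), lda.predict_proba(new_plot).round(2))
```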

  5. A Cointegrated Regime-Switching Model Approach with Jumps Applied to Natural Gas Futures Prices

    Directory of Open Access Journals (Sweden)

    Daniel Leonhardt

    2017-09-01

    Full Text Available Energy commodities and their futures naturally show cointegrated price movements. However, there is empirical evidence that the prices of futures with different maturities might have, e.g., different jump behaviours in different market situations. Observing commodity futures over time, there is also evidence for different states of the underlying volatility of the futures. In this paper, we therefore allow for cointegration of the term structure within a multi-factor model, which includes seasonality, as well as joint and individual jumps in the price processes of futures with different maturities. The seasonality in this model is realized via a deterministic function, and the jumps are represented with thinned-out compound Poisson processes. The model also includes a regime-switching approach that is modelled through a Markov chain and extends the class of geometric models. We show how the model can be calibrated to empirical data and give some practical applications.
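    To make the model ingredients concrete, the sketch below simulates a toy log-price path combining a two-state Markov regime for volatility, a deterministic seasonal component and a compound Poisson jump term; the parameter values are arbitrary and the sketch is far simpler than the cointegrated multi-factor term-structure model of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n_days = 500

# Two volatility regimes driven by a Markov chain (state 0 = calm, 1 = turbulent).
P = np.array([[0.98, 0.02],
              [0.05, 0.95]])        # regime transition matrix (assumed)
sigma = np.array([0.01, 0.03])      # regime-dependent daily volatility (assumed)

states = np.zeros(n_days, dtype=int)
for t in range(1, n_days):
    states[t] = rng.choice(2, p=P[states[t - 1]])

# Deterministic seasonal component and a compound Poisson jump term.
day = np.arange(n_days)
season = 0.1 * np.sin(2 * np.pi * day / 365.0)
jumps = (rng.random(n_days) < 0.01) * rng.normal(0.0, 0.05, n_days)

log_price = np.cumsum(rng.normal(0.0, sigma[states]) + jumps) + season
print(log_price[:5])
```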

  6. A probabilistic approach to quantifying spatial patterns of flow regimes and network-scale connectivity

    Science.gov (United States)

    Garbin, Silvia; Alessi Celegon, Elisa; Fanton, Pietro; Botter, Gianluca

    2017-04-01

    The temporal variability of river flow regime is a key feature structuring and controlling fluvial ecological communities and ecosystem processes. In particular, streamflow variability induced by climate/landscape heterogeneities or other anthropogenic factors significantly affects the connectivity between streams, with notable implications for river fragmentation. Hydrologic connectivity is a fundamental property that guarantees species persistence and ecosystem integrity in riverine systems. In riverine landscapes, most ecological transitions are flow-dependent and the structure of flow regimes may affect ecological functions of endemic biota (i.e., fish spawning or grazing of invertebrate species). Therefore, minimum flow thresholds must be guaranteed to support specific ecosystem services, like fish migration, aquatic biodiversity and habitat suitability. In this contribution, we present a probabilistic approach aiming at a spatially explicit, quantitative assessment of hydrologic connectivity at the network scale as derived from river flow variability. Dynamics of daily streamflows are estimated based on catchment-scale climatic and morphological features, integrating a stochastic, physically based approach that accounts for the stochasticity of rainfall with a water balance model and a geomorphic recession flow model. The non-exceedance probability of ecologically meaningful flow thresholds is used to evaluate the fragmentation of individual stream reaches, and the ensuing network-scale connectivity metrics. A multi-dimensional Poisson process for the stochastic generation of rainfall is used to evaluate the impact of climate signature on reach-scale and catchment-scale connectivity. The analysis shows that streamflow patterns and network-scale connectivity are influenced by the topology of the river network and the spatial variability of climatic properties (rainfall, evapotranspiration). The framework offers a robust basis for the prediction of the impact of
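    A minimal, hypothetical sketch of the core quantity used to flag reach fragmentation - the non-exceedance probability of an ecologically meaningful flow threshold - estimated empirically from a daily streamflow series; the threshold value and the synthetic flows are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily streamflow series (m^3/s) for one reach; in the framework above
# this would come from the stochastic water-balance / recession-flow model.
q = rng.lognormal(mean=0.0, sigma=0.8, size=3650)

q_min = 0.5  # ecologically meaningful minimum flow threshold (assumed)

# Empirical non-exceedance probability P(Q <= q_min): the fraction of days on
# which the reach falls below the threshold and is treated as fragmented.
p_non_exceed = np.mean(q <= q_min)
print(f"P(Q <= {q_min}) = {p_non_exceed:.3f}")
```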

  7. Decision tree approach for classification of remotely sensed satellite ...

    Indian Academy of Sciences (India)

    sensed satellite data using open source support. Richa Sharma .... Decision tree classification techniques have been .... the USGS Earth Resource Observation Systems (EROS) ... for shallow water, 11% were for sparse and dense built-up ...

  8. Decision tree approach for classification of remotely sensed satellite

    Indian Academy of Sciences (India)

    (DTC) algorithm for classification of remotely sensed satellite data (Landsat TM) using open source support. The decision tree is constructed by recursively partitioning the spectral distribution of the training dataset using WEKA, open source ...
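    A minimal sketch of decision tree classification of multispectral pixel data in the spirit of the record above, written with scikit-learn rather than WEKA; the band values, land-cover labels and train/test split are synthetic placeholders.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(7)

# Synthetic "pixels": six spectral band values per sample plus a land-cover label
# (0 = water, 1 = vegetation, 2 = built-up), standing in for Landsat TM training data.
X = rng.normal(size=(600, 6)) + np.repeat(np.arange(3), 200)[:, None]
y = np.repeat(np.arange(3), 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# The tree is built by recursively partitioning the spectral feature space.
clf = DecisionTreeClassifier(max_depth=5).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```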

  9. Nanomedical science and laser-driven particle acceleration: promising approaches in the prethermal regime

    Science.gov (United States)

    Gauduel, Y. A.

    2017-05-01

    A major challenge of spatio-temporal radiation biomedicine concerns the understanding of biophysical events triggered by an initial energy deposition inside confined ionization tracks. This contribution deals with an interdisciplinary approach that concerns cutting-edge advances in real-time radiation events, considering the potential of innovative strategies based on ultrafast laser science, from femtosecond photon sources to advanced techniques of the ultrafast TW laser-plasma accelerator. Recent advances in powerful TW laser sources (10^19 W cm^-2) and laser-plasma interactions providing ultra-short relativistic particle beams in the energy domain 5-200 MeV open promising opportunities for the development of high energy radiation femtochemistry (HERF) in the prethermal regime of secondary low-energy electrons and for the real-time imaging of radiation-induced biomolecular alterations at the nanoscopic scale. These new developments would make it possible to correlate early radiation events triggered by ultrashort radiation sources with a molecular approach to Relative Biological Effectiveness (RBE). These emerging research developments are crucial to understand simultaneously, at the sub-picosecond and nanometric scales, the early consequences of ultra-short-pulsed radiation on biomolecular environments or integrated biological entities. This innovative approach could be applied to biomedically relevant concepts such as the emerging domain of real-time nanodosimetry for targeted pro-drug activation and pulsed radio-chemotherapy of cancers.

  10. A Nuclear Third Party Liability Regime of a Multilateral Nuclear Approaches Framework in the Asian Region

    Directory of Open Access Journals (Sweden)

    Makiko Tazaki

    2014-01-01

    Full Text Available There are two primary challenges for establishing nuclear third party liability (TPL) regimes within multilateral nuclear approaches (MNA) to nuclear fuel cycle facilities in the Asian region. The first challenge is to ensure secure and prompt compensation, especially for transboundary damages, which is also a challenge for a nation-based facility. One possible solution is that, in order to share common nuclear TPL principles, all states in the region participate in the same international nuclear TPL convention, such as the Convention on Supplementary Compensation for Nuclear Damage (CSC), with a view to its entry into force in the future. One problem with this approach is that many states in the Asian region need to raise their amount of financial security in order to be able to participate in the CSC. The second challenge lies with the multiple MNA member states and encompasses the question of how decisions are to be made and how the responsibilities of an installation state are to be shared in the case of a nuclear incident. Principally, a host state of the MNA facility takes on this responsibility. However, in certain situations and in agreement with all MNA member states, such responsibilities can be indirectly shared among all MNA member states. This can be done through internal arrangements within the MNA framework, such as reimbursement to a host state based on pre-agreed shares in accordance with investment and/or making deposits on such reimbursements in case of an incident.

  11. State Traditions and Language Regimes: A Historical Institutionalism Approach to Language Policy

    Directory of Open Access Journals (Sweden)

    Sonntag Selma K.

    2015-12-01

    Full Text Available This paper is an elaboration of a theoretical framework we developed in the introductory chapter of our co-edited volume, State Traditions and Language Regimes (McGill-Queen’s University Press, 2015). Using a historical institutionalism approach derived from political science, we argue that language policies need to be understood in terms of their historical and institutional context. The concept of ‘state tradition’ focuses our attention on the relative autonomy of the state in terms of its normative and institutional traditions that lead to particular path dependencies of language policy choices, subject to change at critical junctures. ‘Language regime’ is the conceptual link between state traditions and language policy choices: it allows us to analytically conceptualize how and why these choices are made and how and why they change. We suggest that our framework offers a more robust analysis of language politics than other approaches found in sociolinguistics and normative theory. It also challenges political science to become more engaged with scholarly debate on language policy and linguistic diversity.

  12. Three naive Bayes approaches for discrimination-free classification

    NARCIS (Netherlands)

    Calders, T.G.K.; Verwer, S.E.

    2010-01-01

    In this paper, we investigate how to modify the naive Bayes classifier in order to perform classification that is restricted to be independent with respect to a given sensitive attribute. Such independency restrictions occur naturally when the decision process leading to the labels in the data-set
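    The record does not spell out the three modifications, so the sketch below only illustrates the constraint being targeted: it trains an ordinary Gaussian naive Bayes classifier and measures how far its positive-prediction rate differs across the values of a sensitive attribute; the data and attribute are invented, and driving this gap towards zero is what the paper's modified classifiers aim for.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(3)

n = 2000
sensitive = rng.integers(0, 2, size=n)              # a binary sensitive attribute (assumed)
X = rng.normal(size=(n, 3)) + sensitive[:, None]    # features correlated with it
y = (X[:, 0] + rng.normal(scale=0.5, size=n) > 0.5).astype(int)

clf = GaussianNB().fit(X, y)
pred = clf.predict(X)

# "Discrimination" here is the gap in positive-prediction rates between the two
# groups; a discrimination-free classifier would drive this gap towards zero.
gap = pred[sensitive == 1].mean() - pred[sensitive == 0].mean()
print(f"positive-rate gap between groups: {gap:.3f}")
```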

  13. A novel approach for classification of abnormalities in digitized ...

    Indian Academy of Sciences (India)

    Feature extraction is an important process for the overall system performance in classification. The objective of this article is to reveal the effectiveness of texture feature analysis for detecting the abnormalities in digitized mammograms using Self Adaptive Resource Allocation Network (SRAN) classifier. Thus, we proposed a ...

  14. A Hybrid Feature Selection Approach for Arabic Documents Classification

    NARCIS (Netherlands)

    Habib, Mena Badieh; Sarhan, Ahmed A. E.; Salem, Abdel-Badeeh M.; Fayed, Zaki T.; Gharib, Tarek F.

    Text Categorization (classification) is the process of classifying documents into a predefined set of categories based on their content. Text categorization algorithms usually represent documents as bags of words and consequently have to deal with a huge number of features. Feature selection tries to

  15. Dynamic Allocation or Diversification: A Regime-Based Approach to Multiple Assets

    DEFF Research Database (Denmark)

    Nystrup, Peter; Hansen, Bo William; Larsen, Henrik Olejasz

    2018-01-01

    ’ behavior and a new, more intuitive way of inferring the hidden market regimes. The empirical results show that regime-based asset allocation is profitable, even when compared to a diversified benchmark portfolio. The results are robust because they are based on available market data with no assumptions...... about forecasting skills....
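    The record above is truncated, but the underlying idea - inferring hidden market regimes and allocating accordingly - can be sketched with a two-state Gaussian hidden Markov model; the hmmlearn library, the synthetic return series and the allocation comment are illustrative assumptions, not the authors' estimation procedure.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(5)

# Synthetic daily returns alternating between a calm and a turbulent regime.
calm = rng.normal(0.0005, 0.005, size=500)
turbulent = rng.normal(-0.0005, 0.02, size=250)
returns = np.concatenate([calm, turbulent, calm]).reshape(-1, 1)

# Fit a two-state Gaussian HMM and decode the most likely regime sequence.
model = GaussianHMM(n_components=2, covariance_type="full", n_iter=200, random_state=0)
model.fit(returns)
regimes = model.predict(returns)

# A regime-based allocation rule could, for example, reduce risky-asset exposure
# whenever the decoded regime is the high-variance one.
variances = model.covars_.reshape(model.n_components, -1)[:, 0]
high_var_state = int(np.argmax(variances))
print("fraction of days in the high-variance regime:", np.mean(regimes == high_var_state))
```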

  16. MULTI-TEMPORAL REMOTE SENSING IMAGE CLASSIFICATION - A MULTI-VIEW APPROACH

    Data.gov (United States)

    National Aeronautics and Space Administration — Varun Chandola and Ranga Raju Vatsavai. Abstract: Multispectral remote sensing images have...

  17. Toward noncooperative iris recognition: a classification approach using multiple signatures.

    Science.gov (United States)

    Proença, Hugo; Alexandre, Luís A

    2007-04-01

    This paper focuses on noncooperative iris recognition, i.e., the capture of iris images at large distances, under less controlled lighting conditions, and without active participation of the subjects. This increases the probability of capturing very heterogeneous images (regarding focus, contrast, or brightness) and with several noise factors (iris obstructions and reflections). Current iris recognition systems are unable to deal with noisy data and substantially increase their error rates, especially the false rejections, in these conditions. We propose an iris classification method that divides the segmented and normalized iris image into six regions, makes an independent feature extraction and comparison for each region, and combines each of the dissimilarity values through a classification rule. Experiments show a substantial decrease, higher than 40 percent, of the false rejection rates in the recognition of noisy iris images.
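    A schematic illustration, not the authors' exact pipeline: the normalized iris image is split into six regions, a dissimilarity score is computed per region against an enrolled template, and the per-region scores are combined by a simple rule; the feature (mean absolute difference), the median fusion rule and the acceptance threshold are placeholders.

```python
import numpy as np

rng = np.random.default_rng(11)

def region_dissimilarities(probe: np.ndarray, template: np.ndarray, n_regions: int = 6):
    """Split the normalized iris images into vertical bands and compute one
    dissimilarity value per band (here: mean absolute intensity difference)."""
    probe_parts = np.array_split(probe, n_regions, axis=1)
    template_parts = np.array_split(template, n_regions, axis=1)
    return np.array([np.mean(np.abs(p - t)) for p, t in zip(probe_parts, template_parts)])

# Synthetic normalized iris images (rows = radius, columns = angle).
template = rng.random((64, 512))
probe = template + rng.normal(scale=0.05, size=template.shape)  # same eye, some noise

scores = region_dissimilarities(probe, template)

# Combining the six per-region scores with a robust rule (median) makes the decision
# less sensitive to regions corrupted by obstructions or reflections.
accept = np.median(scores) < 0.1
print(scores.round(3), "accept:", accept)
```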

  18. Classification, diagnosis, and approach to treatment for angioedema

    DEFF Research Database (Denmark)

    Cicardi, M; Aberer, W; Banerji, A

    2014-01-01

    Angioedema is defined as localized and self-limiting edema of the subcutaneous and submucosal tissue, due to a temporary increase in vascular permeability caused by the release of vasoactive mediator(s). When angioedema recurs without significant wheals, the patient should be diagnosed to have...... angioedema as a distinct disease. In the absence of accepted classification, different types of angioedema are not uniquely identified. For this reason, the European Academy of Allergy and Clinical Immunology gave its patronage to a consensus conference aimed at classifying angioedema. Four types of acquired...... and three types of hereditary angioedema were identified as separate forms from the analysis of the literature and were presented in detail at the meeting. Here, we summarize the analysis of the data and the resulting classification of angioedema....

  19. The effects of crude oil shocks on stock market shifts behaviour: A regime switching approach

    Energy Technology Data Exchange (ETDEWEB)

    Aloui, Chaker; Jammazi, Rania [International Finance Group-Tunisia, Faculty of Management and Economic Sciences of Tunis, Boulevard du 7 novembre, El Manar University, B.P. 248, C.P. 2092, Tunis Cedex (Tunisia)

    2009-09-15

    In this paper we develop the two-regime Markov-switching EGARCH model introduced by Henry [Henry, O., 2009. Regime switching in the relationship between equity returns and short-term interest rates. Journal of Banking and Finance 33, 405-414.] to examine the relationship between crude oil shocks and stock markets. An application to the stock markets of the UK, France and Japan over the sample period January 1989 to December 2007 yields plausible results. We detect two episodes of series behaviour: one corresponding to a low mean/high variance regime and the other to a high mean/low variance regime. Furthermore, there is evidence that common recessions coincide with the low mean/high variance regime. In addition, we allow both real stock returns and the probability of transitions from one regime to another to depend on the net oil price increase variable. The findings show that rises in oil price have a significant role in determining both the volatility of stock returns and the probability of transition across regimes. (author)

  20. The effects of crude oil shocks on stock market shifts behaviour: A regime switching approach

    International Nuclear Information System (INIS)

    Aloui, Chaker; Jammazi, Rania

    2009-01-01

    In this paper we develop the two-regime Markov-switching EGARCH model introduced by Henry [Henry, O., 2009. Regime switching in the relationship between equity returns and short-term interest rates. Journal of Banking and Finance 33, 405-414.] to examine the relationship between crude oil shocks and stock markets. An application to the stock markets of the UK, France and Japan over the sample period January 1989 to December 2007 yields plausible results. We detect two episodes of series behaviour: one corresponding to a low mean/high variance regime and the other to a high mean/low variance regime. Furthermore, there is evidence that common recessions coincide with the low mean/high variance regime. In addition, we allow both real stock returns and the probability of transitions from one regime to another to depend on the net oil price increase variable. The findings show that rises in oil price have a significant role in determining both the volatility of stock returns and the probability of transition across regimes. (author)

  1. A decision-theoretic approach for segmental classification

    OpenAIRE

    Yau, Christopher; Holmes, Christopher C.

    2013-01-01

    This paper is concerned with statistical methods for the segmental classification of linear sequence data where the task is to segment and classify the data according to an underlying hidden discrete state sequence. Such analysis is commonplace in the empirical sciences including genomics, finance and speech processing. In particular, we are interested in answering the following question: given data $y$ and a statistical model $\pi(x,y)$ of the hidden states $x$, what should we report as the ...

  2. Possession States: Approaches to Clinical Evaluation and Classification

    Directory of Open Access Journals (Sweden)

    S. McCormick

    1992-01-01

    Full Text Available The fields of anthropology and sociology have produced a large quantity of literature on possession states; physicians, however, rarely report on such phenomena. As a result, clinical description of possession states has suffered, even though these states may be more common and less deviant than supposed. Both ICD-10 and DSM-IV may include specific criteria for possession disorders. The authors briefly review Western notions about possession and kindred states and present guidelines for evaluation and classification.

  3. An Approach for Leukemia Classification Based on Cooperative Game Theory

    Directory of Open Access Journals (Sweden)

    Atefeh Torkaman

    2011-01-01

    Full Text Available Hematological malignancies are the types of cancer that affect blood, bone marrow and lymph nodes. As these tissues are naturally connected through the immune system, a disease affecting one of them will often affect the others as well. The hematological malignancies include leukemia, lymphoma and multiple myeloma. Among them, leukemia is a serious malignancy that starts in blood tissues, especially the bone marrow, where the blood is made. Research shows that leukemia is one of the most common cancers in the world, so an emphasis on diagnostic techniques and best treatments can provide better prognosis and survival for patients. In this paper, an automatic diagnosis recommender system for classifying leukemia based on cooperative game theory is presented. Throughout this research, we analyze flow cytometry data toward the classification of leukemia into eight classes. We work on a real data set from different types of leukemia that has been collected at the Iran Blood Transfusion Organization (IBTO). Generally, the data set contains 400 samples taken from human leukemic bone marrow. This study deals with a cooperative game used for classification according to different weights assigned to the markers. The proposed method is versatile, as there are no constraints on what the input or output represent; this means that it can be used to classify a population according to their contributions. In other words, it applies equally to other groups of data. The experimental results show an accuracy rate of 93.12% for classification, compared to 90.16% for a decision tree (C4.5). The result demonstrates that cooperative game theory is very promising for direct classification of leukemia as part of an active medical decision support system for the interpretation of flow cytometry readouts. This system could assist clinical hematologists to properly recognize different kinds of leukemia by preparing suggestions, and this could improve the treatment

  4. An approach for leukemia classification based on cooperative game theory.

    Science.gov (United States)

    Torkaman, Atefeh; Charkari, Nasrollah Moghaddam; Aghaeipour, Mahnaz

    2011-01-01

    Hematological malignancies are the types of cancer that affect blood, bone marrow and lymph nodes. As these tissues are naturally connected through the immune system, a disease affecting one of them will often affect the others as well. The hematological malignancies include leukemia, lymphoma and multiple myeloma. Among them, leukemia is a serious malignancy that starts in blood tissues, especially the bone marrow, where the blood is made. Research shows that leukemia is one of the most common cancers in the world, so an emphasis on diagnostic techniques and best treatments can provide better prognosis and survival for patients. In this paper, an automatic diagnosis recommender system for classifying leukemia based on cooperative game theory is presented. Throughout this research, we analyze flow cytometry data toward the classification of leukemia into eight classes. We work on a real data set from different types of leukemia that has been collected at the Iran Blood Transfusion Organization (IBTO). Generally, the data set contains 400 samples taken from human leukemic bone marrow. This study deals with a cooperative game used for classification according to different weights assigned to the markers. The proposed method is versatile, as there are no constraints on what the input or output represent; this means that it can be used to classify a population according to their contributions. In other words, it applies equally to other groups of data. The experimental results show an accuracy rate of 93.12% for classification, compared to 90.16% for a decision tree (C4.5). The result demonstrates that cooperative game theory is very promising for direct classification of leukemia as part of an active medical decision support system for the interpretation of flow cytometry readouts. This system could assist clinical hematologists to properly recognize different kinds of leukemia by preparing suggestions, and this could improve the treatment of leukemic

  5. CLASSIFICATION ALGORITHMS FOR BIG DATA ANALYSIS, A MAP REDUCE APPROACH

    Directory of Open Access Journals (Sweden)

    V. A. Ayma

    2015-03-01

    Full Text Available For many years, the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data generated every day by remote sensors raises more challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), which is an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA’s machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.

  6. A Novel Anti-classification Approach for Knowledge Protection.

    Science.gov (United States)

    Lin, Chen-Yi; Chen, Tung-Shou; Tsai, Hui-Fang; Lee, Wei-Bin; Hsu, Tien-Yu; Kao, Yuan-Hung

    2015-10-01

    Classification is the problem of identifying to which of a set of categories new data belong, on the basis of a set of training data whose category membership is known. Its applications are widespread, for example in the medical science domain. The protection of classification knowledge has received increasing attention in recent years because of the popularity of cloud environments. In this paper, we propose a Shaking Sorted-Sampling (triple-S) algorithm for protecting the classification knowledge of a dataset. The triple-S algorithm sorts the data of an original dataset according to the projection results of principal components analysis so that the features of adjacent data are similar. Then, we generate noise data with incorrect classes and add those data to the original dataset. In addition, we develop an effective positioning strategy, which determines the positions at which noise data are added in the original dataset, to ensure the restoration of the original dataset after removing those noise data. The experimental results show that the disturbance effect of the triple-S algorithm on the CLC, MySVM, and LibSVM classifiers increases as the noise data ratio increases. In addition, compared with existing methods, the disturbance effect of the triple-S algorithm is more significant on MySVM and LibSVM once a certain amount of noise data has been added to the original dataset.
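    A rough sketch of the two main steps described above - ordering the records by their first principal component so that neighbours are similar, then interleaving noise records with deliberately wrong labels - with the positioning strategy simplified to fixed intervals; the interval, the noise model and the label-flipping rule are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)

# Original labelled dataset (features X, binary classes y).
X = rng.normal(size=(200, 5))
y = rng.integers(0, 2, size=200)

# Step 1: sort records by their projection on the first principal component,
# so that adjacent records have similar features.
order = np.argsort(PCA(n_components=1).fit_transform(X).ravel())
X_sorted, y_sorted = X[order], y[order]

# Step 2: generate noise records with *incorrect* classes and insert one after every
# k-th original record (a simplified positioning strategy; the real algorithm keeps
# track of the positions so the noise can later be removed exactly).
k = 10
idx = np.arange(0, len(X_sorted), k)
noise_X = X_sorted[idx] + rng.normal(scale=0.1, size=(len(idx), X.shape[1]))
noise_y = 1 - y_sorted[idx]  # flip the class to mislead a classifier

X_protected = np.insert(X_sorted, idx, noise_X, axis=0)
y_protected = np.insert(y_sorted, idx, noise_y)
print(X_protected.shape, y_protected.shape)
```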

  7. Classification as clustering: a Pareto cooperative-competitive GP approach.

    Science.gov (United States)

    McIntyre, Andrew R; Heywood, Malcolm I

    2011-01-01

    Intuitively, population-based algorithms such as genetic programming provide a natural environment for supporting solutions that learn to decompose the overall task between multiple individuals, or a team. This work presents a framework for evolving teams without recourse to prespecifying the number of cooperating individuals. To do so, each individual evolves a mapping to a distribution of outcomes that, following clustering, establishes the parameterization of a (Gaussian) local membership function. This gives individuals the opportunity to represent subsets of tasks, where the overall task is that of classification under the supervised learning domain. Thus, rather than each team member representing an entire class, individuals are free to identify unique subsets of the overall classification task. The framework is supported by techniques from evolutionary multiobjective optimization (EMO) and Pareto competitive coevolution. EMO establishes the basis for encouraging individuals to provide accurate yet non-overlapping behaviors, whereas competitive coevolution provides the mechanism for scaling to potentially large unbalanced datasets. Benchmarking is performed against recent examples of nonlinear SVM classifiers over 12 UCI datasets with between 150 and 200,000 training instances. Solutions from the proposed coevolutionary multiobjective GP framework appear to provide a good balance between classification performance and model complexity, especially as the dataset instance count increases.

  8. INTRODUCTION OF A SECTORAL APPROACH TO TRANSPORT SECTOR FOR POST-2012 CLIMATE REGIME

    Directory of Open Access Journals (Sweden)

    Atit TIPPICHAI

    2009-01-01

    Full Text Available Recently, the concept of sectoral approaches has been discussed actively under the UNFCCC framework, as it could realize GHG mitigation for the Kyoto Protocol and beyond. However, most studies have not introduced this approach to the transport sector explicitly or analyzed its impacts quantitatively. In this paper, we introduce a sectoral approach which aims to set sector-specific emission reduction targets for the transport sector for the post-2012 climate regime. We suppose that developed countries will commit to the sectoral reduction target and key developing countries such as China and India will have sectoral no-lose targets — no penalties for failure to meet targets but the right to sell reductions exceeding the target — for the medium-term commitment, i.e. 2013–2020. Six scenarios of the total CO2 emission reduction target in the transport sector in 2020, varying from 5% to 30% reductions from the 2005 level, are established. The paper preliminarily analyzes shares of emission reductions and abatement costs to meet the targets for key developed countries including the USA, EU-15, Russia, Japan and Canada. To analyze the impacts of the proposed approach, we generate sectoral marginal abatement cost (MAC) curves by region through extending a top-down economic model, namely the AIM/CGE model. The total emission reduction targets are analyzed against the developed MAC curves for the transport sector in order to obtain an equal marginal abatement cost, which derives the optimal emission reduction for each country and minimizes total abatement cost. The results indicate that the USA will play a crucial role in GHG mitigation in the transport sector, as it is most responsible for emission reductions (accounting for more than 70%), while Japan will reduce the least (accounting for about 3%) for all scenarios. In the case of a 5% reduction, the total abatement is equal to 171.1 MtCO2 with a total cost of 1.61 billion USD; and in the case of a 30
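    A stylized numerical illustration of the allocation principle used above: given per-region marginal abatement cost (MAC) curves, find the common carbon price at which regional reductions sum to the sector-wide target, which minimizes total abatement cost. The linear MAC slopes below are fabricated for illustration and are not the AIM/CGE curves; only the 171.1 MtCO2 target is taken from the text.

```python
import numpy as np

# Hypothetical linear marginal abatement cost curves MAC_i(q) = a_i * q for five
# regions; the slopes are invented and carry only illustrative units.
regions = ["USA", "EU-15", "Russia", "Japan", "Canada"]
a = np.array([0.05, 0.12, 0.08, 0.60, 0.40])

Q_target = 171.1  # sector-wide reduction target in MtCO2 (the 5% scenario)

# Cost is minimized when all regions face the same marginal cost p:
#   q_i = p / a_i  and  sum_i q_i = Q_target  =>  p = Q_target / sum_i (1 / a_i)
p = Q_target / np.sum(1.0 / a)
q = p / a
total_cost = np.sum(0.5 * a * q**2)  # area under each linear MAC curve

for region, qi in zip(regions, q):
    print(f"{region:7s} abates {qi:6.1f} MtCO2 ({100 * qi / Q_target:4.1f}% of the target)")
print(f"common marginal cost: {p:.3f} (illustrative units), total cost: {total_cost:.1f}")
```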

  9. A Novel Imbalanced Data Classification Approach Based on Logistic Regression and Fisher Discriminant

    Directory of Open Access Journals (Sweden)

    Baofeng Shi

    2015-01-01

    Full Text Available We introduce an imbalanced data classification approach based on logistic regression significant discriminant and Fisher discriminant. First, a key-indicator extraction model based on logistic regression significant discriminant and correlation analysis is derived to extract features for customer classification. Second, a customer scoring model is established on the basis of linear weighting using the Fisher discriminant. Then, a customer rating model in which the number of customers in each rating follows a normal distribution is constructed. The performance of the proposed model and the classical SVM classification method are evaluated in terms of their ability to correctly classify consumers as default or non-default customers. Empirical results using data on 2157 customers in financial engineering suggest that the proposed approach performs better than the SVM model in dealing with imbalanced data classification. Moreover, our approach contributes to locating qualified customers for banks and bond investors.

  10. Dynamic species classification of microorganisms across time, abiotic and biotic environments-A sliding window approach.

    Directory of Open Access Journals (Sweden)

    Frank Pennekamp

    Full Text Available The development of video-based monitoring methods allows for rapid, dynamic and accurate monitoring of individuals or communities compared to slower traditional methods, with far-reaching ecological and evolutionary applications. Large amounts of data are generated using video-based methods, which can be effectively processed using machine learning (ML) algorithms into meaningful ecological information. ML uses user-defined classes (e.g. species), derived from a subset (i.e. training data) of video-observed quantitative features (e.g. phenotypic variation), to infer classes in subsequent observations. However, phenotypic variation often changes due to environmental conditions, which may lead to poor classification if environmentally induced variation in phenotypes is not accounted for. Here we describe a framework for classifying species under changing environmental conditions based on random forest classification. A sliding window approach was developed that restricts the temporal and environmental conditions used for training in order to improve the classification. We tested our approach by applying the classification framework to experimental data. The experiment used a set of six ciliate species to monitor changes in community structure and behavior over hundreds of generations, in dozens of species combinations and across a temperature gradient. Differences in biotic and abiotic conditions caused simplistic classification approaches to be unsuccessful. In contrast, the sliding window approach allowed classification to be highly successful, as phenotypic differences driven by environmental change could be captured by the classifier. Importantly, classification using the random forest algorithm showed comparable success when validated against traditional, slower, manual identification. Our framework allows for reliable classification in dynamic environments, and may help to improve strategies for long-term monitoring of species in changing environments. Our
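    A simplified sketch of the sliding-window idea: instead of training one random forest on all labelled frames, each new observation is classified with a forest trained only on observations from a neighbouring window of time (the full framework also restricts by environmental conditions such as temperature); the window width and the synthetic features are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)

n = 1200
time = np.sort(rng.uniform(0, 100, size=n))       # sampling time of each individual
drift = 0.02 * time                               # phenotypes drift with the environment
X = rng.normal(size=(n, 4)) + drift[:, None]      # morphology/movement features
y = rng.integers(0, 3, size=n)                    # species labels of the training data

def classify_with_window(x_new, t_new, half_width=10.0):
    """Train a random forest only on observations within +/- half_width time units
    of the new observation, then predict its species."""
    mask = np.abs(time - t_new) <= half_width
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[mask], y[mask])
    return clf.predict(x_new.reshape(1, -1))[0]

print(classify_with_window(X[600], time[600]))
```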

  11. A bayesian approach to classification criteria for spectacled eiders

    Science.gov (United States)

    Taylor, B.L.; Wade, P.R.; Stehn, R.A.; Cochrane, J.F.

    1996-01-01

    To facilitate decisions to classify species according to risk of extinction, we used Bayesian methods to analyze trend data for the Spectacled Eider, an arctic sea duck. Trend data from three independent surveys of the Yukon-Kuskokwim Delta were analyzed individually and in combination to yield posterior distributions for population growth rates. We used classification criteria developed by the recovery team for Spectacled Eiders that seek to equalize errors of under- or overprotecting the species. We conducted both a Bayesian decision analysis and a frequentist (classical statistical inference) decision analysis. Bayesian decision analyses are computationally easier, yield basically the same results, and yield results that are easier to explain to nonscientists. With the exception of the aerial survey analysis of the 10 most recent years, both Bayesian and frequentist methods indicated that an endangered classification is warranted. The discrepancy between surveys warrants further research. Although the trend data are abundance indices, we used a preliminary estimate of absolute abundance to demonstrate how to calculate extinction distributions using the joint probability distributions for population growth rate and variance in growth rate generated by the Bayesian analysis. Recent apparent increases in abundance highlight the need for models that apply to declining and then recovering species.

  12. National-Scale Hydrologic Classification & Agricultural Decision Support: A Multi-Scale Approach

    Science.gov (United States)

    Coopersmith, E. J.; Minsker, B.; Sivapalan, M.

    2012-12-01

    Classification frameworks can help organize catchments exhibiting similarity in hydrologic and climatic terms. Focusing this assessment of "similarity" upon specific hydrologic signatures, in this case the annual regime curve, can facilitate the prediction of hydrologic responses. Agricultural decision-support over a diverse set of catchments throughout the United States depends upon successful modeling of the wetting/drying process without necessitating separate model calibration at every site where such insights are required. To this end, a holistic classification framework is developed to describe both climatic variability (humid vs. arid, winter rainfall vs. summer rainfall) and the draining, storing, and filtering behavior of any catchment, including ungauged or minimally gauged basins. At the national scale, over 400 catchments from the MOPEX database are analyzed to construct the classification system, with over 77% of these catchments ultimately falling into only six clusters. At individual locations, soil moisture models, receiving only rainfall as input, produce correlation values in excess of 0.9 with respect to observed soil moisture measurements. By deploying physical models for predicting soil moisture exclusively from precipitation that are calibrated at gauged locations, overlaying machine learning techniques to improve these estimates, then generalizing the calibration parameters for catchments in a given class, agronomic decision-support becomes available where it is needed rather than only where sensing data are located.
    [Figure: Classifications of 428 U.S. catchments on the basis of hydrologic regime data, Coopersmith et al., 2012.]

  13. Developing a novel approach to analyse the regimes of temporary streams and their controls on aquatic biota

    Science.gov (United States)

    Gallart, F.; Prat, N.; García-Roger, E. M.; Latron, J.; Rieradevall, M.; Llorens, P.; Barberá, G. G.; Brito, D.; de Girolamo, A. M.; Lo Porto, A.; Neves, R.; Nikolaidis, N. P.; Perrin, J. L.; Querner, E. P.; Quiñonero, J. M.; Tournoud, M. G.; Tzoraki, O.; Froebrich, J.

    2011-10-01

    Temporary streams are those water courses that undergo the recurrent cessation of flow or the complete drying of their channel. The biological communities in temporary stream reaches are strongly dependent on the temporal changes of the aquatic habitats determined by the hydrological conditions. The use of the structural and functional characteristics of aquatic fauna to assess the ecological quality of a temporary stream reach cannot therefore be made without taking into account the controls imposed by the hydrological regime. This paper develops some methods for analysing temporary streams' aquatic regimes, based on the definition of six aquatic states that summarize the sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: flood, riffles, connected, pools, dry and arid. We used the water discharge records from gauging stations or simulations using rainfall-runoff models to infer the temporal patterns of occurrence of these states using the developed aquatic states frequency graph. The visual analysis of this graph is complemented by the development of two metrics based on the permanence of flow and the seasonal predictability of zero flow periods. Finally, a classification of the aquatic regimes of temporary streams in terms of their influence over the development of aquatic life is put forward, defining Permanent, Temporary-pools, Temporary-dry and Episodic regime types. All these methods were tested with data from eight temporary streams around the Mediterranean from the MIRAGE project, and their application was a precondition to assessing the ecological quality of these streams using the current methods prescribed in the European Water Framework Directive for macroinvertebrate communities.
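    The two metrics are only named in the record above, so the sketch below gives one plausible, simplified way to compute them from a daily discharge series: flow permanence as the fraction of days with non-zero flow, and seasonal predictability of zero-flow periods as the share of zero-flow days concentrated in the six contiguous months containing the most zero-flow days. Both formulas are assumptions, not the paper's exact definitions.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Synthetic daily discharge series with a mostly dry summer (zero-flow) period.
dates = pd.date_range("2000-01-01", "2009-12-31", freq="D")
q = rng.lognormal(0.0, 0.7, size=len(dates))
q[np.isin(dates.month, [7, 8, 9]) & (rng.random(len(dates)) < 0.8)] = 0.0
flow = pd.Series(q, index=dates)

# Flow permanence: fraction of days with water flowing in the channel.
permanence = (flow > 0).mean()

# Seasonal predictability of zero-flow periods: share of all zero-flow days that fall
# in the six contiguous months containing the most zero-flow days.
zero_by_month = (flow == 0).groupby(flow.index.month).sum()
six_month_totals = [zero_by_month.reindex(((np.arange(m, m + 6) - 1) % 12) + 1).sum()
                    for m in range(1, 13)]
predictability = max(six_month_totals) / max((flow == 0).sum(), 1)

print(f"flow permanence: {permanence:.2f}, zero-flow predictability: {predictability:.2f}")
```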

  14. Computer and Internet Addiction: Analysis and Classification of Approaches

    Directory of Open Access Journals (Sweden)

    Zaretskaya O.V.

    2017-08-01

    Full Text Available A theoretical analysis of modern research on the problem of computer and Internet addiction is carried out. The main features of different approaches are outlined. An attempt is made to systematize the research conducted and to classify the scientific approaches to the problem of Internet addiction. The author distinguishes nosological, cognitive-behavioral, socio-psychological and dialectical approaches. She justifies the need for an approach that corresponds to the essence, goals and tasks of social psychology when researching problems such as Internet addiction and dependent behavior in general. In the opinion of the author, the dialectical approach integrates the experience of research conducted within the socio-psychological approach and focuses on the observed inconsistencies in the phenomenon of Internet addiction – the compensatory nature of Internet activity, when people who are interested in the Internet are in a dysfunctional life situation.

  15. Automated Diatom Classification (Part A): Handcrafted Feature Approaches

    Directory of Open Access Journals (Sweden)

    Gloria Bueno

    2017-07-01

    Full Text Available This paper deals with automatic taxa identification based on machine learning methods. The aim is therefore to automatically classify diatoms, in terms of pattern recognition terminology. Diatoms are a kind of algae microorganism with high biodiversity at the species level, which are useful for water quality assessment. The most relevant features for diatom description and classification have been selected using an extensive dataset of 80 taxa with a minimum of 100 samples/taxon augmented to 300 samples/taxon. In addition to published morphological, statistical and textural descriptors, a textural descriptor new to this task, Local Binary Patterns (LBP), to characterize the diatom’s valves, and a log Gabor implementation not tested before for this purpose, are introduced in this paper. Results show an overall accuracy of 98.11% using bagging decision trees and combinations of descriptors. Finally, some phycological features of diatoms that are still difficult to integrate in computer systems are discussed for future work.
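    A small sketch of the LBP descriptor mentioned above, using scikit-image to turn a (here synthetic) grey-level valve image into a uniform-LBP histogram that feeds a bagging-of-decision-trees classifier; the images, radius and classifier settings are illustrative assumptions rather than the paper's configuration.

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import BaggingClassifier

rng = np.random.default_rng(6)

def lbp_histogram(image: np.ndarray, P: int = 8, R: float = 1.0) -> np.ndarray:
    """Uniform LBP codes summarized as a normalized histogram (P + 2 bins)."""
    codes = local_binary_pattern(image, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=np.arange(P + 3), density=True)
    return hist

# Synthetic grey-level "valve" images for two taxa; the real input would be the
# segmented diatom images from the 80-taxon dataset.
images = [rng.random((128, 128)) for _ in range(40)]
labels = [0] * 20 + [1] * 20
X = np.vstack([lbp_histogram(img) for img in images])

# Bagging of decision trees, in the spirit of the classifier reported above.
clf = BaggingClassifier(n_estimators=50, random_state=0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```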

  16. A CNN Based Approach for Garments Texture Design Classification

    Directory of Open Access Journals (Sweden)

    S.M. Sofiqul Islam

    2017-05-01

    Full Text Available Automatically identifying garment texture designs in order to recommend fashion trends is important nowadays because of the rapid growth of online shopping. By learning the properties of images efficiently, a machine can achieve better classification accuracy. Several hand-engineered feature coding schemes exist for identifying garment design classes. Recently, Deep Convolutional Neural Networks (CNNs) have shown better performance on different object recognition tasks. A deep CNN uses multiple levels of representation and abstraction that help a machine understand the types of data more accurately. In this paper, a CNN model for identifying garment design classes is proposed. Experimental results on two different datasets show better results than two existing well-known CNN models (AlexNet and VGGNet) and some state-of-the-art hand-engineered feature extraction methods.
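    A minimal convolutional network in the spirit of the record above, written with the Keras API; the input size, the number of design classes and the layer configuration are placeholders, not the architecture actually proposed in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 8            # number of garment texture design classes (assumed)
INPUT_SHAPE = (64, 64, 3)  # input image size (assumed)

# Stacked convolution + pooling blocks learn texture representations; a dense
# softmax head assigns one of the design classes.
model = models.Sequential([
    layers.Input(shape=INPUT_SHAPE),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```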

  17. Syndromic classification of rickettsioses: an approach for clinical practice

    Directory of Open Access Journals (Sweden)

    Álvaro A. Faccini-Martínez

    2014-11-01

    Full Text Available Rickettsioses share common clinical manifestations, such as fever, malaise, exanthema, the presence or absence of an inoculation eschar, and lymphadenopathy. Some of these manifestations can be suggestive of certain species of Rickettsia infection. Nevertheless none of these manifestations are pathognomonic, and direct diagnostic methods to confirm the involved species are always required. A syndrome is a set of signs and symptoms that characterizes a disease with many etiologies or causes. This situation is applicable to rickettsioses, where different species can cause similar clinical presentations. We propose a syndromic classification for these diseases: exanthematic rickettsiosis syndrome with a low probability of inoculation eschar and rickettsiosis syndrome with a probability of inoculation eschar and their variants. In doing so, we take into account the clinical manifestations, the geographic origin, and the possible vector involved, in order to provide a guide for physicians of the most probable etiological agent.

  18. A probabilistic approach to emission-line galaxy classification

    Science.gov (United States)

    de Souza, R. S.; Dantas, M. L. L.; Costa-Duarte, M. V.; Feigelson, E. D.; Killedar, M.; Lablanche, P.-Y.; Vilalta, R.; Krone-Martins, A.; Beck, R.; Gieseke, F.

    2017-12-01

    We invoke a Gaussian mixture model (GMM) to jointly analyse two traditional emission-line classification schemes of galaxy ionization sources: the Baldwin-Phillips-Terlevich (BPT) and WH α versus [N II]/H α (WHAN) diagrams, using spectroscopic data from the Sloan Digital Sky Survey Data Release 7 and SEAGal/STARLIGHT data sets. We apply a GMM to empirically define classes of galaxies in a three-dimensional space spanned by the log [O III]/H β, log [N II]/H α and log EW(H α) optical parameters. The best-fitting GMM based on several statistical criteria suggests a solution around four Gaussian components (GCs), which are capable of explaining up to 97 per cent of the data variance. Using elements of information theory, we compare each GC to its respective astronomical counterpart. GC1 and GC4 are associated with star-forming galaxies, suggesting the need to define a new starburst subgroup. GC2 is associated with BPT's active galactic nuclei (AGN) class and WHAN's weak AGN class. GC3 is associated with BPT's composite class and WHAN's strong AGN class. Conversely, there is no statistical evidence - based on four GCs - for the existence of a Seyfert/low-ionization nuclear emission-line region (LINER) dichotomy in our sample. Notwithstanding, the inclusion of an additional GC5 unravels it. The GC5 appears associated with the LINER and passive galaxies on the BPT and WHAN diagrams, respectively. This indicates that if the Seyfert/LINER dichotomy is there, it does not contribute significantly to the global data variance and may be overlooked by standard metrics of goodness of fit. Subtleties aside, we demonstrate the potential of our methodology to recover/unravel different objects inside the wilderness of astronomical data sets, without losing the ability to convey physically interpretable results. The probabilistic classifications from the GMM analysis are publicly available within the COINtoolbox at https://cointoolbox.github.io/GMM_Catalogue/.
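    A compact sketch of the procedure using scikit-learn: fit Gaussian mixture models with different numbers of components to the three optical parameters and keep the component count favoured by an information criterion (BIC here); the synthetic data stand in for the SDSS/STARLIGHT measurements, and the paper's own model selection combined several criteria.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(9)

# Synthetic stand-in for the three features used above:
# log [O III]/H-beta, log [N II]/H-alpha and log EW(H-alpha).
centres = ([0.5, -0.5, 1.5], [0.0, 0.0, 0.5], [-0.3, 0.4, -0.5])
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(500, 3)) for c in centres])

# Fit GMMs with 1..6 components and select the model with the lowest BIC.
models = [GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 7)]
bic = [m.bic(X) for m in models]
best = models[int(np.argmin(bic))]

print("BIC-preferred number of Gaussian components:", best.n_components)
labels = best.predict(X)   # probabilistic class assignment for each synthetic galaxy
```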

  19. Evaluation Methodology between Globalization and Localization Features Approaches for Skin Cancer Lesions Classification

    Science.gov (United States)

    Ahmed, H. M.; Al-azawi, R. J.; Abdulhameed, A. A.

    2018-05-01

    Huge efforts have been put into developing diagnostic methods for skin cancer. In this paper, two different approaches are addressed for detecting skin cancer in dermoscopy images. The first approach uses a global method that uses global features for classifying skin lesions, whereas the second approach uses a local method that uses local features for classifying skin lesions. The aim of this paper is to select the best approach for skin lesion classification. The dataset used in this paper consists of 200 dermoscopy images from Pedro Hispano Hospital (PH2). The achieved results are: sensitivity of about 96%, specificity of about 100%, precision of about 100%, and accuracy of about 97% for the globalization approach, while sensitivity of about 100%, specificity of about 100%, precision of about 100%, and accuracy of about 100% were obtained for the localization approach. These results show that the localization approach achieved acceptable accuracy and performed better than the globalization approach for skin cancer lesion classification.

  20. A Two-Step Classification Approach to Distinguishing Similar Objects in Mobile LIDAR Point Clouds

    Science.gov (United States)

    He, H.; Khoshelham, K.; Fraser, C.

    2017-09-01

    Nowadays, lidar is widely used in cultural heritage documentation, urban modeling, and driverless car technology for its fast and accurate 3D scanning ability. However, full exploitation of the potential of point cloud data for efficient and automatic object recognition remains elusive. Recently, feature-based methods have become very popular in object recognition on account of their good performance in capturing object details. Compared with global features describing the whole shape of the object, local features recording the fractional details are more discriminative and are applicable to object classes with considerable similarity. In this paper, we propose a two-step classification approach based on point feature histograms and the bag-of-features method for automatic recognition of similar objects in mobile lidar point clouds. Lamp posts, street lights and traffic signs are grouped as one category in the first-step classification because of their mutual similarity compared with trees and vehicles. A finer classification of the lamp posts, street lights and traffic signs, based on the result of the first-step classification, is implemented in the second step. The proposed two-step classification approach is shown to yield a considerable improvement over the conventional one-step classification approach.

  1. A TWO-STEP CLASSIFICATION APPROACH TO DISTINGUISHING SIMILAR OBJECTS IN MOBILE LIDAR POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    H. He

    2017-09-01

    Full Text Available Nowadays, lidar is widely used in cultural heritage documentation, urban modeling, and driverless car technology for its fast and accurate 3D scanning ability. However, full exploitation of the potential of point cloud data for efficient and automatic object recognition remains elusive. Recently, feature-based methods have become very popular in object recognition on account of their good performance in capturing object details. Compared with global features describing the whole shape of the object, local features recording the fractional details are more discriminative and are applicable to object classes with considerable similarity. In this paper, we propose a two-step classification approach based on point feature histograms and the bag-of-features method for automatic recognition of similar objects in mobile lidar point clouds. Lamp posts, street lights and traffic signs are grouped as one category in the first-step classification because of their mutual similarity compared with trees and vehicles. A finer classification of the lamp posts, street lights and traffic signs, based on the result of the first-step classification, is implemented in the second step. The proposed two-step classification approach is shown to yield a considerable improvement over the conventional one-step classification approach.
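    A simplified sketch of the bag-of-features step described above: local descriptors (stand-ins for point feature histograms) from each object are quantized against a k-means codebook and pooled into one normalized histogram per object, which a standard classifier then separates; the descriptor dimensionality, codebook size and SVM are illustrative choices, not the paper's exact settings.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(10)

# Each "object" (e.g. a segmented pole-like point cluster) contributes a set of local
# descriptors; here 33-D vectors standing in for point feature histograms.
labels = rng.integers(0, 3, size=60)   # e.g. lamp post / street light / traffic sign
objects = [rng.normal(loc=lbl, scale=1.0, size=(rng.integers(50, 150), 33))
           for lbl in labels]

# Build a codebook from all descriptors, then encode each object as a normalized
# histogram of codeword occurrences (the "bag of features").
codebook = KMeans(n_clusters=32, n_init=10, random_state=0).fit(np.vstack(objects))

def encode(descriptors: np.ndarray) -> np.ndarray:
    words = codebook.predict(descriptors)
    hist = np.bincount(words, minlength=32).astype(float)
    return hist / hist.sum()

X = np.vstack([encode(d) for d in objects])
clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```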

  2. The development of a classification schema for arts-based approaches to knowledge translation.

    Science.gov (United States)

    Archibald, Mandy M; Caine, Vera; Scott, Shannon D

    2014-10-01

    Arts-based approaches to knowledge translation are emerging as powerful interprofessional strategies with potential to facilitate evidence uptake, communication, knowledge, attitude, and behavior change across healthcare provider and consumer groups. These strategies are in the early stages of development. To date, no classification system for arts-based knowledge translation exists, which limits development and understandings of effectiveness in evidence syntheses. We developed a classification schema of arts-based knowledge translation strategies based on two mechanisms by which these approaches function: (a) the degree of precision in key message delivery, and (b) the degree of end-user participation. We demonstrate how this classification is necessary to explore how context, time, and location shape arts-based knowledge translation strategies. Classifying arts-based knowledge translation strategies according to their core attributes extends understandings of the appropriateness of these approaches for various healthcare settings and provider groups. The classification schema developed may enhance understanding of how, where, and for whom arts-based knowledge translation approaches are effective, and enable theorizing of essential knowledge translation constructs, such as the influence of context, time, and location on utilization strategies. The classification schema developed may encourage systematic inquiry into the effectiveness of these approaches in diverse interprofessional contexts. © 2014 Sigma Theta Tau International.

  3. Understanding energy-related regimes: A participatory approach from central Australia

    International Nuclear Information System (INIS)

    Foran, Tira; Fleming, David; Spandonide, Bruno; Williams, Rachel; Race, Digby

    2016-01-01

    For a particular community, what energy-related innovations constitute no-regrets strategies? We present a methodology to understand how alternative energy consuming activities and policy regimes impact on current and future liveability of socio-culturally diverse communities facing climate change. Our methodology augments the energy policy literature by harnessing three concepts (collaborative governance, innovation and political economic regime of provisioning) to support dialogue around changing energy-related activities. We convened workshops in Alice Springs, Australia to build capability to identify no-regrets energy-related housing or transport activities and strategies. In preparation, we interviewed policy actors and constructed three new housing-related future scenarios. After discussing the scenarios, policy and research actors prioritised five socio-technical activities or strategies. Evaluations indicate participants enjoyed opportunities given by the methodology to have focussed discussions about activities and innovation, while requesting more socially nuanced scenario storylines. We discuss implications for theory and technique development. - Highlights: •Energy-related activities and regimes frustrate pro-sustainability action. •Participatory workshops increased understanding of activities and regimes. •Workshops used a novel combination of governance and social theories. •Results justify inclusive dialogue around building energy standards and transport options.

  4. A law & economics approach to the study of integrated management regimes of estuaries

    NARCIS (Netherlands)

    van de Griendt, W.E.

    2004-01-01

    In this paper it is proposed to analyse legal regimes for integrated management of estuaries with the help of institutional legal theory and the Schlager & Ostrom framework for types of ownership. Estuaries are highly valued and valuable and therefore need protection. The problem is that they

  5. Human Rights Promotion through Transnational Investment Regimes: An International Political Economy Approach

    Directory of Open Access Journals (Sweden)

    Claire Cutler

    2013-05-01

    Full Text Available International investment agreements are foundational instruments in a transnational investment regime that governs how states regulate the foreign-owned assets and the foreign investment activities of private actors. Over 3,000 investment agreements between states govern key governmental powers and form the basis for an emerging transnational investment regime. This transnational regime significantly decentralizes, denationalizes, and privatizes decision-making and policy choices over foreign investment. Investment agreements set limits to state action in a number of areas of vital public concern, including the protection of human and labour rights, the environment, and sustainable development. They determine the distribution of power between foreign investors and host states and their societies. However, the societies in which they operate seldom have any input into the terms or operation of these agreements, raising crucial questions of their democratic legitimacy as mechanisms of governance. This paper draws on political science and law to explore the political economy of international investment agreements and asks whether these agreements are potential vehicles for promoting international human rights. The analysis provides an historical account of the investment regime, while a review of the political economy of international investment agreements identifies what appears to be a paradox at the core of their operation. It then examines contract theory for insight into this apparent paradox and considers whether investment agreements are suitable mechanisms for advancing international human rights.

  6. An approach for classification of malignant and benign ...

    Indian Academy of Sciences (India)

    Birmohan Singh

    2018-03-16

    Mar 16, 2018 ... To investigate the performance of the proposed approach, mammogram ... women from entering into further stages of breast cancer. Amongst all ..... have been extracted in this work are Haralick texture, Laws texture, fractal ...

  7. A Cognitive Computing Approach for Classification of Complaints in the Insurance Industry

    Science.gov (United States)

    Forster, J.; Entrup, B.

    2017-10-01

    In this paper we present and evaluate a cognitive computing approach for the classification of dissatisfaction and four complaint-specific classes in correspondence documents between insurance clients and an insurance company. A cognitive computing approach combines classical natural language processing methods, machine learning algorithms and the evaluation of hypotheses. The approach combines a MaxEnt machine learning algorithm with language modelling, tf-idf and sentiment analytics to create a multi-label text classification model. The model is trained and tested with a set of 2500 original insurance communication documents written in German, which have been manually annotated by the partnering insurance company. With an F1-score of 0.9, a reliable text classification component has been implemented and evaluated. A final outlook towards a cognitive computing insurance assistant is given at the end.
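
    The combination of tf-idf features with a MaxEnt classifier (equivalent to logistic regression) can be sketched with scikit-learn as below; the German snippets, label names and the one-vs-rest multi-label wrapping are assumptions for illustration, and the language-model and sentiment features of the described pipeline are omitted.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.multiclass import OneVsRestClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import MultiLabelBinarizer

        # Toy complaint snippets and multi-label annotations (placeholders).
        docs = [
            "Ich bin mit der Bearbeitung meines Schadens sehr unzufrieden.",
            "Vielen Dank fuer die schnelle Rueckmeldung zu meinem Vertrag.",
        ]
        labels = [["dissatisfaction", "claims_handling"], ["no_complaint"]]
        Y = MultiLabelBinarizer().fit_transform(labels)

        model = make_pipeline(
            TfidfVectorizer(ngram_range=(1, 2), min_df=1),
            OneVsRestClassifier(LogisticRegression(max_iter=1000)),  # MaxEnt = logistic regression
        )
        model.fit(docs, Y)
        print(model.predict(["Die Schadenregulierung dauert viel zu lange."]))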

  8. Intelligence system based classification approach for medical disease diagnosis

    Science.gov (United States)

    Sagir, Abdu Masanawa; Sathasivam, Saratha

    2017-08-01

    The prediction of breast cancer in women who have no signs or symptoms of the disease, as well as of survivability after undergoing certain surgery, has been a challenging problem for medical researchers. The decision about the presence or absence of disease often depends more on the physician's intuition, experience and skill in comparing current indicators with previous ones than on the knowledge-rich data hidden in a database. This is a crucial and challenging task. The goal is to predict patient condition by using an adaptive neuro-fuzzy inference system (ANFIS) pre-processed by grid partitioning. To achieve an accurate diagnosis at this complex stage of symptom analysis, the physician may need an efficient diagnosis system. A framework is described for designing and evaluating the classification performance of two discrete ANFIS systems with hybrid learning algorithms (least-squares estimates combined with Modified Levenberg-Marquardt and with gradient descent) that can be used by physicians to accelerate the diagnosis process. The proposed method's performance was evaluated on training and test datasets from the mammographic mass and Haberman's survival datasets obtained from the benchmarked datasets of the University of California at Irvine (UCI) machine learning repository. The robustness of the performance, measured by total accuracy, sensitivity and specificity, is examined. In comparison, the proposed method achieves superior performance relative to a conventional ANFIS based on the gradient descent algorithm and to some related existing methods. The software used for the implementation is MATLAB R2014a (version 8.3), executed on a PC with an Intel Pentium IV E7400 processor at 2.80 GHz and 2.0 GB of RAM.

  9. Classification of real farm conditions Iberian pigs according to the feeding regime with multivariate models developed by using fatty acids composition or NIR spectral data

    Directory of Open Access Journals (Sweden)

    De Pedro, Emiliano

    2009-07-01

    Full Text Available Multivariate classification models to classify real farm conditions Iberian pigs according to the feeding regime were developed by using fatty acid composition or NIR spectral data of liquid fat samples. A total of 121 subcutaneous fat samples were taken from Iberian pig carcasses belonging to 5 batches reared under different feeding systems. Once the liquid sample was extracted from each subcutaneous fat sample, the percentage of 11 fatty acids (C14:0, C16:0, C16:1, C17:0, C17:1, C18:0, C18:1, C18:2, C18:3, C20:0 and C20:1) was determined. At the same time, the Near Infrared (NIR) spectrum of each liquid sample was obtained. Linear Discriminant Analysis (LDA) was used as the pattern recognition method to develop the multivariate models. Classification errors of the LDA models were 0.0% for the model generated by using NIR spectral data and 1.7% for the model generated by using fatty acid composition. Results confirm the possibility of discriminating Iberian pig liquid fat samples from animals reared under different feeding regimes on real farm conditions by using NIR spectral data or fatty acid composition. Classification errors obtained using models generated from NIR spectral data were lower than those obtained with models based on fatty acid composition.
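
    A hedged sketch of the pattern-recognition step named above: linear discriminant analysis on the 11 fatty-acid percentages to separate feeding regimes. The arrays below are random placeholders standing in for the measured compositions and the five batches.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(121, 11))     # placeholder: 11 fatty-acid percentages per sample
        y = rng.integers(0, 5, size=121)   # placeholder: 5 feeding-regime batches

        lda = LinearDiscriminantAnalysis()
        print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())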

  10. Modern classification of neoplasms: reconciling differences between morphologic and molecular approaches

    International Nuclear Information System (INIS)

    Berman, Jules

    2005-01-01

    For over 150 years, pathologists have relied on histomorphology to classify and diagnose neoplasms. Their success has been stunning, permitting the accurate diagnosis of thousands of different types of neoplasms using only a microscope and a trained eye. In the past two decades, cancer genomics has challenged the supremacy of histomorphology by identifying genetic alterations shared by morphologically diverse tumors and by finding genetic features that distinguish subgroups of morphologically homogeneous tumors. The Developmental Lineage Classification and Taxonomy of Neoplasms groups neoplasms by their embryologic origin. The putative value of this classification is based on the expectation that tumors of a common developmental lineage will share common metabolic pathways and common responses to drugs that target these pathways. The purpose of this manuscript is to show that grouping tumors according to their developmental lineage can reconcile certain fundamental discrepancies resulting from morphologic and molecular approaches to neoplasm classification. In this study, six issues in tumor classification are described that exemplify the growing rift between morphologic and molecular approaches to tumor classification: 1) the morphologic separation between epithelial and non-epithelial tumors; 2) the grouping of tumors based on shared cellular functions; 3) the distinction between germ cell tumors and pluripotent tumors of non-germ cell origin; 4) the distinction between tumors that have lost their differentiation and tumors that arise from uncommitted stem cells; 5) the molecular properties shared by morphologically disparate tumors that have a common developmental lineage, and 6) the problem of re-classifying morphologically identical but clinically distinct subsets of tumors. The discussion of these issues in the context of describing different methods of tumor classification is intended to underscore the clinical value of a robust tumor classification. A

  11. Neurological gait disorders in elderly people: clinical approach and classification.

    NARCIS (Netherlands)

    Snijders, A.H.; Warrenburg, B.P.C. van de; Giladi, N.; Bloem, B.R.

    2007-01-01

    Gait disorders are common and often devastating companions of ageing, leading to reductions in quality of life and increased mortality. Here, we present a clinically oriented approach to neurological gait disorders in the elderly population. We also draw attention to several exciting scientific

  12. A graduated food addiction classification approach significantly differentiates obesity among people with type 2 diabetes.

    Science.gov (United States)

    Raymond, Karren-Lee; Kannis-Dymand, Lee; Lovell, Geoff P

    2016-10-01

    This study examined a graduated severity level approach to food addiction classification against associations with World Health Organization obesity classifications (body mass index, kg/m²) among 408 people with type 2 diabetes. A survey including the Yale Food Addiction Scale and several demographic questions demonstrated four distinct Yale Food Addiction Scale symptom severity groups (in line with Diagnostic and Statistical Manual of Mental Disorders (5th ed.) severity indicators): non-food addiction, mild food addiction, moderate food addiction and severe food addiction. Analysis of variance with post hoc tests demonstrated each severity classification group was significantly different in body mass index, with each grouping being associated with increased World Health Organization obesity classifications. These findings have implications for diagnosing food addiction and implementing treatment and prevention methodologies of obesity among people with type 2 diabetes.
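
    The reported analysis can be illustrated with a one-way ANOVA of body mass index across the four severity groups; the group data below are simulated placeholders, and a post hoc procedure (e.g. Tukey's HSD from statsmodels) would follow in practice.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        bmi = {
            "non-food addiction":      rng.normal(29, 4, 150),
            "mild food addiction":     rng.normal(32, 4, 100),
            "moderate food addiction": rng.normal(35, 4, 90),
            "severe food addiction":   rng.normal(38, 4, 68),
        }
        F, p = stats.f_oneway(*bmi.values())
        print(f"F = {F:.2f}, p = {p:.3g}")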

  13. A Graph Cut Approach to Artery/Vein Classification in Ultra-Widefield Scanning Laser Ophthalmoscopy.

    Science.gov (United States)

    Pellegrini, Enrico; Robertson, Gavin; MacGillivray, Tom; van Hemert, Jano; Houston, Graeme; Trucco, Emanuele

    2018-02-01

    The classification of blood vessels into arterioles and venules is a fundamental step in the automatic investigation of retinal biomarkers for systemic diseases. In this paper, we present a novel technique for vessel classification on ultra-wide-field-of-view images of the retinal fundus acquired with a scanning laser ophthalmoscope. To the best of our knowledge, this is the first time that a fully automated artery/vein classification technique for this type of retinal imaging with no manual intervention has been presented. The proposed method exploits hand-crafted features based on local vessel intensity and vascular morphology to formulate a graph representation from which a globally optimal separation between the arterial and venular networks is computed by a graph-cut approach. The technique was tested on three different data sets (one publicly available and two local) and achieved an average classification accuracy of 0.883 in the largest data set.

  14. Numerical approach to optimal portfolio in a power utility regime-switching model

    Science.gov (United States)

    Gyulov, Tihomir B.; Koleva, Miglena N.; Vulkov, Lubin G.

    2017-12-01

    We consider a system of weakly coupled degenerate semi-linear parabolic equations of optimal portfolio in a regime-switching with power utility function, derived by A.R. Valdez and T. Vargiolu [14]. First, we discuss some basic properties of the solution of this system. Then, we develop and analyze implicit-explicit, flux limited finite difference schemes for the differential problem. Numerical experiments are discussed.

  15. A texton-based approach for the classification of lung parenchyma in CT images

    DEFF Research Database (Denmark)

    Gangeh, Mehrdad J.; Sørensen, Lauge; Shaker, Saher B.

    2010-01-01

    In this paper, a texton-based classification system based on raw pixel representation along with a support vector machine with radial basis function kernel is proposed for the classification of emphysema in computed tomography images of the lung. The proposed approach is tested on 168 annotated...... regions of interest consisting of normal tissue, centrilobular emphysema, and paraseptal emphysema. The results show the superiority of the proposed approach to common techniques in the literature including moments of the histogram of filter responses based on Gaussian derivatives. The performance...

  16. Regime change?

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, K.W.

    2004-01-01

    Following the 1998 nuclear tests in South Asia and later reinforced by revelations about North Korean and Iraqi nuclear activities, there has been growing concern about increasing proliferation dangers. At the same time, the prospects of radiological/nuclear terrorism are seen to be rising - since 9/11, concern over a proliferation/terrorism nexus has never been higher. In the face of this growing danger, there are urgent calls for stronger measures to strengthen the current international nuclear nonproliferation regime, including recommendations to place civilian processing of weapon-useable material under multinational control. As well, there are calls for entirely new tools, including military options. As proliferation and terrorism concerns grow, the regime is under pressure and there is a temptation to consider fundamental changes to the regime. In this context, this paper will address the following: Do we need to change the regime centered on the Treaty on the Nonproliferation of Nuclear Weapons (NPT) and the International Atomic Energy Agency (IAEA)? What improvements could ensure it will be the foundation for the proliferation resistance and physical protection needed if nuclear power grows? What will make it a viable centerpiece of future nonproliferation and counterterrorism approaches?

  17. New approaches for the financial distress classification in agribusiness

    Directory of Open Access Journals (Sweden)

    Jan Vavřina

    2013-01-01

    Full Text Available After the recent financial crisis, the need for sound tools for evaluating the financial health of enterprises has only increased. Apart from well-known techniques such as Z-score and logit models, new approaches have been suggested, namely a data envelopment analysis (DEA) reformulation for bankruptcy prediction and production function-based economic performance evaluation (PFEP). Being recently suggested, these techniques have not yet been validated for common use in the financial sector, although for the DEA approach some introductory studies are available for the manufacturing and IT industries. In this contribution we focus on thorough validation calculations that evaluate these techniques for the specific agribusiness industry. To keep the data as homogeneous as possible, we limit the choice of agribusiness companies to the countries of the Visegrad Group. An extensive data set covering several hundred enterprises was collected using the Amadeus database of Bureau van Dijk. We present the validation results for each of the four mentioned methods, outline the strengths and weaknesses of each approach and discuss suggestions for the effective detection of financial problems in the specific branch of agribusiness.

  18. Analysis of approaches to classification of forms of non-standard employment

    Directory of Open Access Journals (Sweden)

    N. V. Dorokhova

    2017-01-01

    Full Text Available Non-standard forms of employment are currently becoming more widespread, yet there is no clear approach to the definition and delimitation of non-standard employment. The article analyses diverse interpretations of the concept and, on that basis, the author concludes that precarious employment is a complex and contradictory economic category. Different approaches to the classification of forms of precarious employment are examined. The main forms of precarious employment considered are the flexible working year, flexible working week, flexible working hours, remote work, on-call employment, shift forwarding, agency employment, self-employment, negotiator work, underemployment, over-employment, employment on the basis of fixed-term contracts, employment based on contracts of a civil-law nature, one-time employment, casual employment, temporary employment, secondary employment and part-time employment. The author's approach to the classification of non-standard forms of employment is based on identifying the impact of atypical employment on the development of human potential. For the purpose of classifying non-standard employment forms from the standpoint of their impact on human development, the following classification criteria are proposed: working conditions, wages and social guarantees, the possibility of workers' participation in management, personal development and employment stability. Depending on the value each of these criteria takes, a given form of non-standard employment can be characterised as progressive or regressive. The classification of non-standard forms of employment should be the basis of state employment management policy.

  19. A machine learning approach for the classification of metallic glasses

    Science.gov (United States)

    Gossett, Eric; Perim, Eric; Toher, Cormac; Lee, Dongwoo; Zhang, Haitao; Liu, Jingbei; Zhao, Shaofan; Schroers, Jan; Vlassak, Joost; Curtarolo, Stefano

    Metallic glasses possess an extensive set of mechanical properties along with plastic-like processability. As a result, they are a promising material in many industrial applications. However, the successful synthesis of novel metallic glasses requires trial and error, costing both time and resources. Therefore, we propose a high-throughput approach that combines an extensive set of experimental measurements with advanced machine learning techniques. This allows us to classify metallic glasses and predict the full phase diagrams for a given alloy system. Thus this method provides a means to identify potential glass-formers and opens up the possibility for accelerating and reducing the cost of the design of new metallic glasses.

  20. Optimization of a Non-traditional Unsupervised Classification Approach for Land Cover Analysis

    Science.gov (United States)

    Boyd, R. K.; Brumfield, J. O.; Campbell, W. J.

    1982-01-01

    The conditions under which a hybrid of clustering and canonical analysis for image classification produces optimum results were analyzed. The approach involves generation of classes by clustering for input to canonical analysis. The importance of the number of clusters input and the effect of other parameters of the clustering algorithm (ISOCLS) were examined. The approach derives its final result by clustering the canonically transformed data. Therefore the importance of the number of clusters requested in this final stage was also examined. The effects of these variables were studied in terms of the average separability (as measured by transformed divergence) of the final clusters, the transformation matrices resulting from different numbers of input classes, and the accuracy of the final classifications. The research was performed with LANDSAT MSS data over the Hazleton/Berwick Pennsylvania area. Final classifications were compared pixel by pixel with an existing geographic information system to provide an indication of their accuracy.
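
    A rough sketch of the hybrid, under the assumption that the canonical analysis step can be approximated by a linear discriminant transformation fitted to the initial cluster labels; the pixel values are random stand-ins for LANDSAT MSS bands.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(3)
        pixels = rng.normal(size=(5000, 4))          # placeholder: 4 MSS band values per pixel

        # Stage 1: clustering generates the classes input to canonical analysis.
        initial = KMeans(n_clusters=20, n_init=10).fit_predict(pixels)
        canon = LinearDiscriminantAnalysis(n_components=3).fit(pixels, initial)

        # Stage 2: the canonically transformed data are clustered again for the final result.
        final = KMeans(n_clusters=10, n_init=10).fit_predict(canon.transform(pixels))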

  1. Sows’ activity classification device using acceleration data – A resource constrained approach

    DEFF Research Database (Denmark)

    Marchioro, Gilberto Fernandes; Cornou, Cécile; Kristensen, Anders Ringgaard

    2011-01-01

    This paper discusses the main architectural alternatives and design decisions in order to implement a sows’ activity classification model on electronic devices. The different possibilities are analyzed in practical and technical aspects, focusing on the implementation metrics, like cost......, performance, complexity and reliability. The target architectures are divided into: server based, where the main processing element is a central computer; and embedded based, where the processing is distributed on devices attached to the animals. The initial classification model identifies the activities...... of a heuristic classification approach, focusing on the resource constrained characteristics of embedded systems. The new approach classifies the activities performed by the sows with accuracy close to 90%. It was implemented as a hardware module that can easily be instantiated to provide preprocessed...

  2. Head Pose Estimation on Eyeglasses Using Line Detection and Classification Approach

    Science.gov (United States)

    Setthawong, Pisal; Vannija, Vajirasak

    This paper proposes a unique approach for head pose estimation of subjects with eyeglasses by using a combination of line detection and classification approaches. Head pose estimation is considered as an important non-verbal form of communication and could also be used in the area of Human-Computer Interface. A major improvement of the proposed approach is that it allows estimation of head poses at a high yaw/pitch angle when compared with existing geometric approaches, does not require expensive data preparation and training, and is generally fast when compared with other approaches.

  3. Review on Electrodynamic Energy Harvesters—A Classification Approach

    Directory of Open Access Journals (Sweden)

    Roland Lausecker

    2013-04-01

    Full Text Available Beginning with a short historical sketch, electrodynamic energy harvesters with focus on vibration generators and volumes below 1 dm³ are reviewed. The current challenges to generate up to several milliwatts of power from practically relevant flows and vibrations are addressed, and the variety of available solutions is sketched. Sixty-seven different harvester concepts from more than 130 publications are classified with respect to excitation, additional boundary conditions, design and fabrication. A chronological list of the harvester concepts with corresponding references provides an impression about the developments. Besides resonant harvester concepts, the review includes broadband approaches and mechanisms to harvest from flow. Finally, a short overview of harvesters in applications and first market ready concepts is given.

  4. A Biologically Inspired Approach to Frequency Domain Feature Extraction for EEG Classification

    Directory of Open Access Journals (Sweden)

    Nurhan Gursel Ozmen

    2018-01-01

    Full Text Available Classification of the electroencephalogram (EEG) signal is important in mental decoding for brain-computer interfaces (BCI). We introduced a feature extraction approach based on frequency domain analysis to improve the classification performance on different mental tasks using single-channel EEG. This biologically inspired method extracts the most discriminative spectral features from power spectral densities (PSDs) of the EEG signals. We applied our method on a dataset of six subjects who performed five different imagination tasks: (i) resting state, (ii) mental arithmetic, (iii) imagination of left hand movement, (iv) imagination of right hand movement, and (v) imagination of the letter "A". Pairwise and multiclass classifications were performed in a single EEG channel using Linear Discriminant Analysis and Support Vector Machines. Our method produced results (mean classification accuracy of 83.06% for binary classification and 91.85% for multiclassification) that are on par with the state-of-the-art methods, using single-channel EEG with low computational cost. Among all task pairs, mental arithmetic versus letter imagination yielded the best result (mean classification accuracy of 90.29%), indicating that this task pair could be the most suitable pair for a binary class BCI. This study contributes to the development of single-channel BCI, as well as finding the best task pair for user defined applications.
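
    The frequency-domain feature idea can be sketched as Welch power spectral densities of single-channel epochs reduced to band powers and fed to LDA or an SVM; the sampling rate, band edges and the epoch array are assumptions for illustration.

        import numpy as np
        from scipy.signal import welch
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        fs = 256                                       # assumed sampling rate (Hz)
        bands = [(4, 8), (8, 13), (13, 30), (30, 45)]  # theta, alpha, beta, gamma

        def band_powers(epoch):
            f, pxx = welch(epoch, fs=fs, nperseg=fs)
            return [pxx[(f >= lo) & (f < hi)].mean() for lo, hi in bands]

        rng = np.random.default_rng(4)
        epochs = rng.normal(size=(200, fs * 4))        # placeholder single-channel epochs
        y = rng.integers(0, 2, size=200)               # placeholder task labels

        X = np.array([band_powers(e) for e in epochs])
        for clf in (LinearDiscriminantAnalysis(), SVC(kernel="rbf")):
            print(type(clf).__name__, cross_val_score(clf, X, y, cv=5).mean())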

  5. Methodical approaches to development of classification state methods of regulation business activity in fishery

    OpenAIRE

    She Son Gun

    2014-01-01

    Approaches to developing a classification of state methods of economic regulation are considered. On the basis of the review provided, a comprehensive method of state regulation of business activity is substantiated. The proposed principles allow public administration to be improved and can be used in industry concepts and state programs supporting small business in fishery.

  6. Wittgenstein's philosophy and a dimensional approach to the classification of mental disorders -- a preliminary scheme.

    Science.gov (United States)

    Mackinejad, Kioumars; Sharifi, Vandad

    2006-01-01

    In this paper the importance of Wittgenstein's philosophical ideas for the justification of a dimensional approach to the classification of mental disorders is discussed. Some of his basic concepts in his Philosophical Investigations, such as 'family resemblances', 'grammar' and 'language-game' and their relations to the concept of mental disorder are explored.

  7. A novel deep learning approach for classification of EEG motor imagery signals.

    Science.gov (United States)

    Tabar, Yousef Rezaei; Halici, Ugur

    2017-02-01

    Signal classification is an important issue in brain computer interface (BCI) systems. Deep learning approaches have been used successfully in many recent studies to learn features and classify different types of data. However, the number of studies that employ these approaches on BCI applications is very limited. In this study we aim to use deep learning methods to improve classification performance of EEG motor imagery signals. We investigate convolutional neural networks (CNN) and stacked autoencoders (SAE) to classify EEG motor imagery signals. A new form of input is introduced to combine time, frequency and location information extracted from the EEG signal, and it is used in a CNN having one 1D convolutional layer and one max-pooling layer. We also propose a new deep network combining CNN and SAE. In this network, the features extracted by the CNN are classified through the deep SAE network. The classification performance obtained by the proposed method on BCI competition IV dataset 2b in terms of kappa value is 0.547. Our approach yields 9% improvement over the winner algorithm of the competition. Our results show that deep learning methods provide better classification performance compared to other state-of-the-art approaches. These methods can be applied successfully to BCI systems where the amount of data is large due to daily recording.
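
    A minimal PyTorch sketch of a "one 1D convolution plus one max-pooling layer" network of the kind described; the input shape (channels encoding time/frequency/location), filter sizes and class count are assumptions, and the SAE stage is omitted.

        import torch
        import torch.nn as nn

        class ShallowMICNN(nn.Module):
            def __init__(self, in_channels=3, n_samples=256, n_classes=2):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv1d(in_channels, 16, kernel_size=11, padding=5),
                    nn.ReLU(),
                    nn.MaxPool1d(kernel_size=4),
                )
                self.classifier = nn.Linear(16 * (n_samples // 4), n_classes)

            def forward(self, x):                      # x: (batch, channels, samples)
                z = self.features(x)
                return self.classifier(z.flatten(1))

        model = ShallowMICNN()
        scores = model(torch.randn(8, 3, 256))         # -> (8, 2) class scores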

  8. Bottom-up and Top-down: An alternate classification of LD authoring approaches

    NARCIS (Netherlands)

    Sodhi, Tim; Miao, Yongwu; Brouns, Francis; Koper, Rob

    2007-01-01

    Sodhi, T., Miao, Y., Brouns, F., & Koper, E. J. R. (2007). Bottom-up and Top-down: An alternate classification of LD authoring approaches. Paper presented at the TENCompetence Open Workshop on Current research on IMS Learning Design and Lifelong Competence Development Infrastructures. June, 21-22,

  9. A Systematic Approach to Food Variety Classification as a Tool in ...

    African Journals Online (AJOL)

    A Systematic Approach to Food Variety Classification as a Tool in Dietary ... and food variety (count of all dietary items consumed during the recall period up to the ... This paper presents a pilot study carried out with an aim of demonstrating the ...

  10. On the dynamics of liquids in their viscous regime approaching the glass transition.

    Science.gov (United States)

    Chen, Z; Angell, C A; Richert, R

    2012-07-01

    Recently, Mallamace et al. (Eur. Phys. J. E 34, 94 (2011)) proposed a crossover temperature, T(×), and claimed that the dynamics of many supercooled liquids follow an Arrhenius-type temperature dependence between T(×) and the glass transition temperature T(g). The opposite, namely super-Arrhenius behavior in this viscous regime, has been demonstrated repeatedly for molecular glass-formers, for polymers, and for the majority of the exhaustively studied inorganic glasses of technological interest. Therefore, we subject the molecular systems of the Mallamace et al. study to a "residuals" analysis and include not only viscosity data but also the more precise data available from dielectric relaxation experiments over the same temperature range. Although many viscosity data sets are inconclusive due to their noise level, we find that Arrhenius behavior is not a general feature of viscosity in the T(g) to T(×) range. Moreover, the residuals of dielectric relaxation times with respect to an Arrhenius law clearly reveal systematic curvature consistent with super-Arrhenius behavior being an endemic feature of transport properties in this viscous regime. We also observe a common pattern of how dielectric relaxation times decouple slightly from viscosity.
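
    The "residuals" analysis described amounts to fitting an Arrhenius law, log10 tau = A + B/T, over the Tg-to-Tx window and inspecting the residuals for systematic curvature; the data below are synthetic VFT-like values used purely for illustration.

        import numpy as np

        T = np.linspace(200.0, 260.0, 30)              # K, assumed Tg..Tx window
        log_tau = -14.0 + 600.0 / (T - 150.0)          # synthetic super-Arrhenius (VFT-type) data

        coeffs = np.polyfit(1.0 / T, log_tau, 1)       # Arrhenius fit, linear in 1/T
        residuals = log_tau - np.polyval(coeffs, 1.0 / T)

        # A systematic (non-random) sign pattern of the residuals versus 1/T indicates
        # curvature, i.e. deviation from a simple Arrhenius temperature dependence.
        print(np.sign(residuals))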

  11. Evaluation of toroidal torque by non-resonant magnetic perturbations in tokamaks for resonant transport regimes using a Hamiltonian approach

    Energy Technology Data Exchange (ETDEWEB)

    Albert, Christopher G.; Heyn, Martin F.; Kapper, Gernot; Kernbichler, Winfried; Martitsch, Andreas F. [Fusion@ÖAW, Institut für Theoretische Physik - Computational Physics, Technische Universität Graz, Petersgasse 16, 8010 Graz (Austria); Kasilov, Sergei V. [Fusion@ÖAW, Institut für Theoretische Physik - Computational Physics, Technische Universität Graz, Petersgasse 16, 8010 Graz (Austria); Institute of Plasma Physics, National Science Center “Kharkov Institute of Physics and Technology,” ul. Akademicheskaya 1, 61108 Kharkov (Ukraine)

    2016-08-15

    Toroidal torque generated by neoclassical viscosity caused by external non-resonant, non-axisymmetric perturbations has a significant influence on toroidal plasma rotation in tokamaks. In this article, a derivation for the expressions of toroidal torque and radial transport in resonant regimes is provided within quasilinear theory in canonical action-angle variables. The proposed approach treats all low-collisional quasilinear resonant neoclassical toroidal viscosity regimes including superbanana-plateau and drift-orbit resonances in a unified way and allows for magnetic drift in all regimes. It is valid for perturbations on toroidally symmetric flux surfaces of the unperturbed equilibrium without specific assumptions on geometry or aspect ratio. The resulting expressions are shown to match the existing analytical results in the large aspect ratio limit. Numerical results from the newly developed code NEO-RT are compared to calculations by the quasilinear version of the code NEO-2 at low collisionalities. The importance of the magnetic shear term in the magnetic drift frequency and a significant effect of the magnetic drift on drift-orbit resonances are demonstrated.

  12. Classification of cancerous cells based on the one-class problem approach

    Science.gov (United States)

    Murshed, Nabeel A.; Bortolozzi, Flavio; Sabourin, Robert

    1996-03-01

    One of the most important factors in reducing the effect of cancerous diseases is early diagnosis, which requires a good and robust method. With the advancement of computer technologies and digital image processing, the development of a computer-based system has become feasible. In this paper, we introduce a new approach for the detection of cancerous cells. This approach is based on the one-class problem, through which the classification system need only be trained with patterns of cancerous cells. This reduces the burden of the training task by about 50%. Based on this approach, a computer-based classification system is developed using Fuzzy ARTMAP neural networks. Experiments were performed using a set of 542 patterns taken from a breast cancer sample. Results of the experiment show 98% correct identification of cancerous cells and 95% correct identification of non-cancerous cells.
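
    Fuzzy ARTMAP is not available in common Python libraries, so the one-class idea (training only on patterns of the target class) is illustrated here with scikit-learn's OneClassSVM as a stand-in; the feature vectors are random placeholders.

        import numpy as np
        from sklearn.svm import OneClassSVM

        rng = np.random.default_rng(5)
        cancer_train = rng.normal(0.0, 1.0, size=(300, 16))       # training: cancerous patterns only
        test = np.vstack([rng.normal(0.0, 1.0, size=(50, 16)),    # cancerous-like
                          rng.normal(4.0, 1.0, size=(50, 16))])   # non-cancerous-like

        clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(cancer_train)
        pred = clf.predict(test)                                   # +1 = resembles the trained class
        print((pred[:50] == 1).mean(), (pred[50:] == -1).mean())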

  13. An ensemble classification approach for improved Land use/cover change detection

    Science.gov (United States)

    Chellasamy, M.; Ferré, T. P. A.; Humlekrog Greve, M.; Larsen, R.; Chinnasamy, U.

    2014-11-01

    Change Detection (CD) methods based on post-classification comparison approaches are claimed to provide potentially reliable results. They are considered the most obvious quantitative method in the analysis of Land Use Land Cover (LULC) changes, as they provide "from-to" change information. However, the performance of post-classification comparison approaches depends strongly on the accuracy of the classification of the individual images used for comparison. Hence, we present a classification approach that produces accurate classification results, which in turn helps to obtain improved change detection results. Machine learning is part of a broader framework in change detection, in which neural networks have drawn much attention. Neural network algorithms adaptively estimate continuous functions from input data without a mathematical representation of the dependence of the output on the input. A common practice for classification is to use a Multi-Layer Perceptron (MLP) neural network with the backpropagation learning algorithm for prediction. To increase the ability of learning and prediction, multiple inputs (spectral, texture, topography, and multi-temporal information) are generally stacked to incorporate diversity of information. On the other hand, the literature suggests that the backpropagation algorithm exhibits weak and unstable learning when multiple inputs are used on complex datasets characterized by mixed uncertainty levels. To address the problem of learning complex information, we propose an ensemble classification technique that incorporates multiple inputs for classification, unlike the traditional stacking of multiple input data. In this paper, we present an Endorsement Theory based ensemble classification that integrates multiple sources of information, in terms of prediction probabilities, to produce the final classification results. Three different input datasets are used in this study: spectral, texture and indices, from SPOT-4 multispectral imagery captured in 1998 and 2003. Each SPOT image is classified

  14. Modeling and forecasting of wind power generation - Regime-switching approaches

    DEFF Research Database (Denmark)

    Trombe, Pierre-Julien

    The present thesis addresses a number of challenges emerging from the increasing penetration of renewable energy sources into power systems. Focus is placed on wind energy and large-scale offshore wind farms. Indeed, offshore wind power variability is becoming a serious obstacle to the integration...... of more renewable energy into power systems, since these systems are required to maintain a strict balance between electricity consumption and production at all times. For this purpose, wind power forecasts offer essential support to power system operators. In particular, there is a growing demand...... case study is the Horns Rev wind farm located in the North Sea. Regime-switching aspects of offshore wind power fluctuations are investigated. Several formulations of Markov-Switching models are proposed in order to better characterize the stochastic behavior of the underlying process and improve its
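
    A two-regime Markov-switching autoregression of the kind discussed can be fitted with statsmodels as sketched below; the series is simulated, not Horns Rev data, and the regime count and order are illustrative choices.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        calm = rng.normal(0.0, 0.02, 500)               # low-variance fluctuations
        storm = rng.normal(0.0, 0.15, 500)              # high-variance fluctuations
        y = np.concatenate([calm, storm, calm])

        model = sm.tsa.MarkovAutoregression(y, k_regimes=2, order=1, switching_variance=True)
        res = model.fit()
        print(res.smoothed_marginal_probabilities[:5])  # regime probabilities per time step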

  15. A regime-switching copula approach to modeling day-ahead prices in coupled electricity markets

    DEFF Research Database (Denmark)

    Pircalabu, Anca; Benth, Fred Espen

    2017-01-01

    The recent price coupling of many European electricity markets has triggered a fundamental change in the interaction of day-ahead prices, challenging additionally the modeling of the joint behavior of prices in interconnected markets. In this paper we propose a regime-switching AR–GARCH copula...... to model pairs of day-ahead electricity prices in coupled European markets. While capturing key stylized facts empirically substantiated in the literature, this model easily allows us to 1) deviate from the assumption of normal margins and 2) include a more detailed description of the dependence between... We find significant evidence of tail dependence in all pairs of interconnected areas we consider. As a first application of the proposed model, we consider the pricing of financial transmission rights, and highlight how the choice of marginal distributions and copula impacts prices. As a second application we......

  16. Integrative Chemical-Biological Read-Across Approach for Chemical Hazard Classification

    Science.gov (United States)

    Low, Yen; Sedykh, Alexander; Fourches, Denis; Golbraikh, Alexander; Whelan, Maurice; Rusyn, Ivan; Tropsha, Alexander

    2013-01-01

    Traditional read-across approaches typically rely on the chemical similarity principle to predict chemical toxicity; however, the accuracy of such predictions is often inadequate due to the underlying complex mechanisms of toxicity. Here we report on the development of a hazard classification and visualization method that draws upon both chemical structural similarity and comparisons of biological responses to chemicals measured in multiple short-term assays ("biological" similarity). The Chemical-Biological Read-Across (CBRA) approach infers each compound's toxicity from those of both chemical and biological analogs whose similarities are determined by the Tanimoto coefficient. Classification accuracy of CBRA was compared to that of classical RA and other methods using chemical descriptors alone, or in combination with biological data. Different types of adverse effects (hepatotoxicity, hepatocarcinogenicity, mutagenicity, and acute lethality) were classified using several biological data types (gene expression profiling and cytotoxicity screening). CBRA-based hazard classification exhibited consistently high external classification accuracy and applicability to diverse chemicals. Transparency of the CBRA approach is aided by the use of radial plots that show the relative contribution of analogous chemical and biological neighbors. Identification of both chemical and biological features that give rise to the high accuracy of CBRA-based toxicity prediction facilitates mechanistic interpretation of the models. PMID:23848138
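
    The read-across idea can be sketched as a similarity-weighted vote over the nearest neighbours under the Tanimoto coefficient, computed on binary chemical fingerprints and, in the full CBRA setting, on binarised biological response profiles as well; fingerprints and labels below are random placeholders.

        import numpy as np

        def tanimoto(a, b):
            # Tanimoto coefficient between two binary vectors.
            union = np.sum(a | b)
            return np.sum(a & b) / union if union else 0.0

        def read_across(query_fp, train_fps, train_labels, k=5):
            sims = np.array([tanimoto(query_fp, fp) for fp in train_fps])
            nearest = np.argsort(sims)[::-1][:k]
            # Similarity-weighted vote of the k most similar analogues.
            return float(np.average(train_labels[nearest], weights=sims[nearest]) > 0.5)

        rng = np.random.default_rng(7)
        train_fps = rng.integers(0, 2, size=(100, 166)).astype(bool)  # e.g. MACCS-like keys
        train_labels = rng.integers(0, 2, size=100)
        print(read_across(train_fps[0], train_fps, train_labels))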

  17. Spectral-spatial classification of hyperspectral data with mutual information based segmented stacked autoencoder approach

    Science.gov (United States)

    Paul, Subir; Nagesh Kumar, D.

    2018-04-01

    Hyperspectral (HS) data comprise continuous spectral responses of hundreds of narrow spectral bands with very fine spectral resolution or bandwidth, which offer feature identification and classification with high accuracy. In the present study, a Mutual Information (MI) based Segmented Stacked Autoencoder (S-SAE) approach for spectral-spatial classification of HS data is proposed to reduce the complexity and computational time compared to Stacked Autoencoder (SAE) based feature extraction. A non-parametric dependency measure (MI) based spectral segmentation is proposed instead of linear and parametric dependency measures in order to capture both linear and nonlinear inter-band dependency when segmenting the HS bands. Morphological profiles are then created from the segmented spectral features to assimilate the spatial information in the spectral-spatial classification approach. Two non-parametric classifiers, a Support Vector Machine (SVM) with Gaussian kernel and a Random Forest (RF), are used for classification of the three most popularly used HS datasets. Results of the numerical experiments carried out in this study show that the SVM with a Gaussian kernel provides better results for the Pavia University and Botswana datasets, whereas RF performs better for the Indian Pines dataset. The experiments performed with the proposed methodology provide encouraging results compared to numerous existing approaches.
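
    The mutual-information-based segmentation step can be sketched as estimating MI between adjacent (discretised) bands and cutting the spectrum where the dependency drops below a threshold, so that each segment groups strongly dependent bands and feeds its own autoencoder; the data cube and threshold are assumptions.

        import numpy as np
        from sklearn.metrics import mutual_info_score

        def mi_segments(cube, n_bins=32, threshold=0.5):
            # cube: (pixels, bands). Returns a list of band-index segments.
            disc = np.array([np.digitize(b, np.histogram_bin_edges(b, n_bins)) for b in cube.T])
            cuts = [0]
            for i in range(cube.shape[1] - 1):
                if mutual_info_score(disc[i], disc[i + 1]) < threshold:  # weak dependency -> cut
                    cuts.append(i + 1)
            cuts.append(cube.shape[1])
            return [list(range(a, b)) for a, b in zip(cuts[:-1], cuts[1:])]

        rng = np.random.default_rng(8)
        cube = rng.normal(size=(1000, 50))             # placeholder: pixels x spectral bands
        print([len(s) for s in mi_segments(cube)])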

  18. Using blocking approach to preserve privacy in classification rules by inserting dummy Transaction

    Directory of Open Access Journals (Sweden)

    Doryaneh Hossien Afshari

    2017-03-01

    Full Text Available The increasing rate of data sharing among organizations could maximize the risk of leaking sensitive knowledge. Trying to solve this problem increases the importance of privacy preservation within the process of data sharing. This study focuses on privacy preservation in classification rule mining as a data mining technique. We propose a blocking algorithm for hiding sensitive classification rules. In this solution, rule hiding is achieved by editing the set of transactions that satisfy sensitive classification rules. The proposed approach tries to deceive and block adversaries by inserting dummy transactions. Finally, the solution has been evaluated and compared with other available solutions. Results show that limiting the number of attributes in each sensitive rule leads to a decrease both in the number of lost rules and in the production rate of ghost rules.

  19. An approach for classification of hydrogeological systems at the regional scale based on groundwater hydrographs

    Science.gov (United States)

    Haaf, Ezra; Barthel, Roland

    2016-04-01

    When assessing hydrogeological conditions at the regional scale, the analyst is often confronted with uncertainty of structures, inputs and processes while having to base inference on scarce and patchy data. Haaf and Barthel (2015) proposed a concept for handling this predicament by developing a groundwater systems classification framework, where information is transferred from similar, but well-explored and better understood to poorly described systems. The concept is based on the central hypothesis that similar systems react similarly to the same inputs and vice versa. It is conceptually related to PUB (Prediction in ungauged basins) where organization of systems and processes by quantitative methods is intended and used to improve understanding and prediction. Furthermore, using the framework it is expected that regional conceptual and numerical models can be checked or enriched by ensemble generated data from neighborhood-based estimators. In a first step, groundwater hydrographs from a large dataset in Southern Germany are compared in an effort to identify structural similarity in groundwater dynamics. A number of approaches to group hydrographs, mostly based on a similarity measure - which have previously only been used in local-scale studies, can be found in the literature. These are tested alongside different global feature extraction techniques. The resulting classifications are then compared to a visual "expert assessment"-based classification which serves as a reference. A ranking of the classification methods is carried out and differences shown. Selected groups from the classifications are related to geological descriptors. Here we present the most promising results from a comparison of classifications based on series correlation, different series distances and series features, such as the coefficients of the discrete Fourier transform and the intrinsic mode functions of empirical mode decomposition. Additionally, we show examples of classes

  20. Evaluating an ensemble classification approach for crop diversityverification in Danish greening subsidy control

    DEFF Research Database (Denmark)

    Chellasamy, Menaka; Ferre, Ty; Greve, Mogens Humlekrog

    2016-01-01

    Beginning in 2015, Danish farmers are obliged to meet specific crop diversification rules based on total land area and number of crops cultivated to be eligible for new greening subsidies. Hence, there is a need for the Danish government to extend their subsidy control system to verify farmers' declarations to warrant greening payments under the new crop diversification rules. Remote Sensing (RS) technology has been used since 1992 to control farmers' subsidies in Denmark. However, a proper RS-based approach is yet to be finalised to validate new crop diversity requirements designed for assessing...... compliance under the recent subsidy scheme (2014–2020). This study uses an ensemble classification approach (proposed by the authors in previous studies) for validating the crop diversity requirements of the new rules. The approach uses a neural network ensemble classification system with bi-temporal (spring...

  1. Brake fault diagnosis using Clonal Selection Classification Algorithm (CSCA) – A statistical learning approach

    Directory of Open Access Journals (Sweden)

    R. Jegadeeshwaran

    2015-03-01

    Full Text Available In an automobile, the brake system is an essential part responsible for the control of the vehicle. Any failure in the brake system affects the vehicle's motion and can have catastrophic effects on vehicle and passenger safety. Thus the brake system plays a vital role in an automobile, and hence condition monitoring of the brake system is essential. Vibration-based condition monitoring using machine learning techniques is gaining momentum. This study is one such attempt to perform condition monitoring of a hydraulic brake system through vibration analysis. In this research, the performance of a Clonal Selection Classification Algorithm (CSCA) for brake fault diagnosis is reported. A hydraulic brake system test rig was fabricated. Under good and faulty conditions of the brake system, vibration signals were acquired using a piezoelectric transducer. Statistical parameters were extracted from the vibration signal. The best feature set was identified for classification using an attribute evaluator. The selected features were then classified using the CSCA. The classification accuracy of this artificial intelligence technique has been compared with other machine learning approaches and discussed. The Clonal Selection Classification Algorithm performs better and gives the maximum classification accuracy (96%) for the fault diagnosis of a hydraulic brake system.

  2. A simple semi-automatic approach for land cover classification from multispectral remote sensing imagery.

    Directory of Open Access Journals (Sweden)

    Dong Jiang

    Full Text Available Land cover data represent a fundamental data source for various types of scientific research. The classification of land cover based on satellite data is a challenging task, and an efficient classification method is needed. In this study, an automatic scheme is proposed for the classification of land use using multispectral remote sensing images based on change detection and a semi-supervised classifier. The satellite image can be automatically classified using only the prior land cover map and existing images; therefore human involvement is reduced to a minimum, ensuring the operability of the method. The method was tested in the Qingpu District of Shanghai, China. Using Environment Satellite 1 (HJ-1) images of 2009 with 30 m spatial resolution, the areas were classified into five main types of land cover based on previous land cover data and spectral features. The results agreed well with the validation land cover maps, with a Kappa value of 0.79 and statistical area biases of less than 6%. This study proposes a simple semi-automatic approach for land cover classification using prior maps with satisfactory accuracy, which integrates the accuracy of visual interpretation and the performance of automatic classification methods. The method can be used conveniently for land cover mapping in areas lacking ground reference information or for identifying rapid variation of land cover regions (such as rapid urbanization).

  3. A hybrid clustering and classification approach for predicting crash injury severity on rural roads.

    Science.gov (United States)

    Hasheminejad, Seyed Hessam-Allah; Zahedi, Mohsen; Hasheminejad, Seyed Mohammad Hossein

    2018-03-01

    As a threat to the transportation system, traffic crashes have a wide range of social consequences for governments. Traffic crashes are increasing in developing countries, and Iran, as a developing country, is not immune from this risk. There are several studies in the literature that predict traffic crash severity based on artificial neural networks (ANNs), support vector machines and decision trees. This paper investigates the crash injury severity of rural roads by using a hybrid clustering and classification approach to compare the performance of classification algorithms before and after applying the clustering. In this paper, a novel rule-based genetic algorithm (GA) is proposed to predict crash injury severity, which is evaluated by performance criteria in comparison with classification algorithms like ANN. The results obtained from the analysis of 13,673 crashes (5600 property damage, 778 fatal crashes, 4690 slight injuries and 2605 severe injuries) on rural roads in Tehran Province of Iran during 2011-2013 revealed that the proposed GA method outperforms other classification algorithms based on classification metrics like precision (86%), recall (88%) and accuracy (87%). Moreover, the proposed GA method has the highest level of interpretability, is easy to understand and provides feedback to analysts.
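
    The "cluster first, then classify within clusters" idea can be sketched as below, using k-means for the clustering stage and an off-the-shelf decision tree as a stand-in for the paper's rule-based genetic algorithm; the crash attributes and severity labels are random placeholders.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(9)
        X = rng.normal(size=(1000, 8))                 # placeholder crash attributes
        y = rng.integers(0, 4, size=1000)              # 4 severity classes (placeholder)

        km = KMeans(n_clusters=3, n_init=10).fit(X)
        models = {c: DecisionTreeClassifier().fit(X[km.labels_ == c], y[km.labels_ == c])
                  for c in range(km.n_clusters)}

        def predict_severity(x_new):
            c = km.predict(x_new.reshape(1, -1))[0]    # route the record to its cluster's classifier
            return models[c].predict(x_new.reshape(1, -1))[0]

        print(predict_severity(X[0]))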

  4. A Linear Dynamical Systems Approach to Streamflow Reconstruction Reveals History of Regime Shifts in Northern Thailand

    Science.gov (United States)

    Nguyen, Hung T. T.; Galelli, Stefano

    2018-03-01

    Catchment dynamics is not often modeled in streamflow reconstruction studies; yet, the streamflow generation process depends on both catchment state and climatic inputs. To explicitly account for this interaction, we contribute a linear dynamic model, in which streamflow is a function of both catchment state (i.e., wet/dry) and paleoclimatic proxies. The model is learned using a novel variant of the Expectation-Maximization algorithm, and it is used with a paleo drought record—the Monsoon Asia Drought Atlas—to reconstruct 406 years of streamflow for the Ping River (northern Thailand). Results for the instrumental period show that the dynamic model has higher accuracy than conventional linear regression; all performance scores improve by 45-497%. Furthermore, the reconstructed trajectory of the state variable provides valuable insights about the catchment history—e.g., regime-like behavior—thereby complementing the information contained in the reconstructed streamflow time series. The proposed technique can replace linear regression, since it only requires information on streamflow and climatic proxies (e.g., tree-rings, drought indices); furthermore, it is capable of readily generating stochastic streamflow replicates. With a marginal increase in computational requirements, the dynamic model brings more desirable features and value to streamflow reconstructions.

  5. A study of earthquake-induced building detection by object oriented classification approach

    Science.gov (United States)

    Sabuncu, Asli; Damla Uca Avci, Zehra; Sunar, Filiz

    2017-04-01

    Among natural hazards, earthquakes are the most destructive disasters and cause huge loss of life, heavy infrastructure damage and great financial losses every year all around the world. According to earthquake statistics, more than a million earthquakes occur each year, which is equal to about two earthquakes per minute worldwide. Natural disasters have caused more than 780,000 deaths since 2001, and approximately 60% of this mortality is due to earthquakes. A great earthquake took place at 38.75 N 43.36 E in the eastern part of Turkey, in Van Province, on October 23rd, 2011. 604 people died and about 4000 buildings were seriously damaged or collapsed after this earthquake. In recent years, the use of the object-oriented classification approach based on different object features, such as spectral, textural, shape and spatial information, has gained importance and become widespread for the classification of high-resolution satellite images and orthophotos. The motivation of this study is to detect the collapsed buildings and debris areas after the earthquake by using very high-resolution satellite images and orthophotos with object-oriented classification, and also to see how well remote sensing technology performed in determining the collapsed buildings. In this study, two different land surfaces were selected as homogeneous and heterogeneous case study areas. In the first step of the application, multi-resolution segmentation was applied and optimum parameters were selected to obtain the objects in each area after testing different color/shape and compactness/smoothness values. In the next step, two different classification approaches, namely "supervised" and "unsupervised", were applied and their classification performances were compared. Object-based Image Analysis (OBIA) was performed using the e-Cognition software.

  6. Non-canonical distribution and non-equilibrium transport beyond weak system-bath coupling regime: A polaron transformation approach

    Science.gov (United States)

    Xu, Dazhi; Cao, Jianshu

    2016-08-01

    The concept of the polaron, which emerged from condensed matter physics, describes the dynamical interaction of a moving particle with its surrounding bosonic modes. This concept has been developed into a useful method to treat open quantum systems across the complete range of system-bath coupling strengths. In particular, the polaron transformation approach is valid in the intermediate coupling regime, in which the Redfield equation or Fermi's golden rule fails. In the polaron frame, the equilibrium distribution obtained by perturbative expansion deviates from the canonical distribution, which goes beyond the usual weak coupling assumption in thermodynamics. A polaron transformed Redfield equation (PTRE) not only reproduces the dissipative quantum dynamics but also provides an accurate and efficient way to calculate the non-equilibrium steady states. Applications of the PTRE approach to problems such as exciton diffusion, heat transport and light-harvesting energy transfer are presented.

  7. Risk-informed Analytical Approaches to Concentration Averaging for the Purpose of Waste Classification

    International Nuclear Information System (INIS)

    Esh, D.W.; Pinkston, K.E.; Barr, C.S.; Bradford, A.H.; Ridge, A.Ch.

    2009-01-01

    Nuclear Regulatory Commission (NRC) staff has developed a concentration averaging approach and guidance for the review of Department of Energy (DOE) non-HLW determinations. Although the approach was focused on this specific application, concentration averaging is generally applicable to waste classification and thus has implications for waste management decisions as discussed in more detail in this paper. In the United States, radioactive waste has historically been classified into various categories for the purpose of ensuring that the disposal system selected is commensurate with the hazard of the waste such that public health and safety will be protected. However, the risk from the near-surface disposal of radioactive waste is not solely a function of waste concentration but is also a function of the volume (quantity) of waste and its accessibility. A risk-informed approach to waste classification for near-surface disposal of low-level waste would consider the specific characteristics of the waste, the quantity of material, and the disposal system features that limit accessibility to the waste. NRC staff has developed example analytical approaches to estimate waste concentration, and therefore waste classification, for waste disposed in facilities or with configurations that were not anticipated when the regulation for the disposal of commercial low-level waste (i.e. 10 CFR Part 61) was developed. (authors)

  8. Improving Wishart Classification of Polarimetric SAR Data Using the Hopfield Neural Network Optimization Approach

    Directory of Open Access Journals (Sweden)

    Íñigo Molina

    2012-11-01

    Full Text Available This paper proposes the optimization relaxation approach based on the analogue Hopfield Neural Network (HNN) for cluster refinement of pre-classified Polarimetric Synthetic Aperture Radar (PolSAR) image data. We consider the initial classification provided by the maximum-likelihood classifier based on the complex Wishart distribution, which is then supplied to the HNN optimization approach. The goal is to improve the classification results obtained by the Wishart approach. The classification improvement is verified by computing a cluster separability coefficient and a measure of homogeneity within the clusters. During the HNN optimization process, for each iteration and for each pixel, two consistency coefficients are computed, taking into account two types of relations between the pixel under consideration and its corresponding neighbors. Based on these coefficients and on the information coming from the pixel itself, the pixel under study is re-classified. Different experiments are carried out to verify that the proposed approach outperforms other strategies, achieving the best results in terms of separability while maintaining a trade-off with homogeneity that preserves relevant structures in the image. The performance is also measured in terms of computational central processing unit (CPU) times.

  9. Regime identification in ASDEX Upgrade

    International Nuclear Information System (INIS)

    Giannone, L; Sips, A C C; Kardaun, O; Spreitler, F; Suttrop, W

    2004-01-01

    The ability to recognize the transition from the L-mode to the H-mode or from the H-mode to the improved H-mode reliably from a conveniently small number of measurements in real time is of increasing importance for machine control. Discriminant analysis has been applied to regime identification of plasma discharges in the ASDEX Upgrade tokamak. An observation consists of a set of plasma parameters averaged over a time slice in a discharge. The data set consists of all observations over different discharges and time slices. Discriminant analysis yields coefficients allowing the classification of a new observation. The results of a frequentist and a formal Bayesian approach to discriminant analysis are compared. With five plasma variables, a failure rate of 1.3% for predicting the L-mode and the H-mode confinement regime was achieved. With five plasma variables, a failure rate of 5.3% for predicting the H-mode and the improved H-mode confinement regime was achieved. The coefficients derived by discriminant analysis have been applied subsequently to discharges to illustrate the operation of regime identification in a real-time control system.
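
    As a rough illustration of the discriminant-analysis idea described above, the sketch below fits a linear discriminant classifier to labelled observations of plasma parameters and reports a cross-validated failure rate. The five variable values and the synthetic data are assumptions for illustration only, not the actual ASDEX Upgrade measurements or the frequentist/Bayesian variants compared in the paper.

        # Hedged sketch: regime identification via discriminant analysis (illustrative only).
        # The feature values are invented stand-ins for five plasma variables.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 500
        # Five plasma variables per observation (hypothetical stand-ins).
        X_lmode = rng.normal(loc=[1.0, 0.5, 2.0, 0.3, 1.5], scale=0.3, size=(n, 5))
        X_hmode = rng.normal(loc=[1.6, 0.9, 2.8, 0.5, 2.2], scale=0.3, size=(n, 5))
        X = np.vstack([X_lmode, X_hmode])
        y = np.array(["L-mode"] * n + ["H-mode"] * n)

        clf = LinearDiscriminantAnalysis()
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"cross-validated accuracy: {acc:.3f}  (failure rate: {1 - acc:.3f})")

        # The fitted coefficients could then be applied in real time to classify new observations.
        clf.fit(X, y)
        print("discriminant coefficients:", clf.coef_)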

  10. Insights from a Regime Decomposition Approach on CERES and CloudSat-inferred Cloud Radiative Effects

    Science.gov (United States)

    Oreopoulos, L.; Cho, N.; Lee, D.

    2015-12-01

    Our knowledge of the Cloud Radiative Effect (CRE) not only at the Top-of-the-Atmosphere (TOA), but also (with the help of some modeling) at the surface (SFC) and within the atmospheric column (ATM) has been steadily growing in recent years. Not only do we have global values for these CREs, but we can now also plot global maps of their geographical distribution. The next step in our effort to advance our knowledge of CRE is to systematically assess the contributions of prevailing cloud systems to the global values. The presentation addresses this issue directly. We identify the world's prevailing cloud systems, which we call "Cloud Regimes" (CRs), via clustering analysis of MODIS (Aqua-Terra) daily joint histograms of Cloud Top Pressure and Cloud Optical Thickness (TAU) at 1 degree scales. We then composite CERES diurnal values of CRE (TOA, SFC, ATM) separately for each CR by averaging these values for each CR occurrence, and thus find the contribution of each CR to the global value of CRE. But we can do more. We can actually decompose vertical profiles of inferred instantaneous CRE from CloudSat/CALIPSO (2B-FLXHR-LIDAR product) by averaging over Aqua CR occurrences (since A-Train formation flying allows collocation). Such an analysis greatly enhances our understanding of the radiative importance of prevailing cloud mixtures at different atmospheric levels. For example, in addition to examining whether the CERES findings on which CRs contribute to radiative cooling and warming of the atmospheric column are consistent with CloudSat, we can also gain insight into why and where exactly this happens from the shape of the full instantaneous CRE vertical profiles.
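
    A minimal sketch of the regime-decomposition workflow described above: flattened joint histograms are clustered with k-means into a handful of cloud regimes, and a per-gridbox CRE value is then composited by regime. Array shapes, the number of regimes and the synthetic data are assumptions for illustration; the actual analysis uses MODIS histograms and CERES/CloudSat products.

        # Hedged sketch: cloud-regime clustering and CRE compositing (illustrative only).
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        n_gridboxes = 10_000
        # Each sample is a flattened 7x6 joint histogram of cloud top pressure vs optical thickness
        # (bin layout is an assumption; values are synthetic cloud-fraction histograms).
        histograms = rng.dirichlet(np.ones(42), size=n_gridboxes)
        cre_toa = rng.normal(-20.0, 15.0, size=n_gridboxes)   # synthetic CRE values (W m^-2)

        n_regimes = 8                                          # number of cloud regimes (assumed)
        labels = KMeans(n_clusters=n_regimes, n_init=10, random_state=0).fit_predict(histograms)

        # Composite CRE by regime and weight by how often each regime occurs.
        for k in range(n_regimes):
            mask = labels == k
            rfo = mask.mean()                                  # relative frequency of occurrence
            print(f"regime {k}: RFO={rfo:.2%}, mean CRE={cre_toa[mask].mean():6.1f}, "
                  f"contribution={rfo * cre_toa[mask].mean():6.2f} W m^-2")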

  11. Heuristic approach to the classification of postpartum endometritis and its forms

    Directory of Open Access Journals (Sweden)

    E. A. Balashova

    2017-01-01

    Full Text Available The work is dedicated to the development of a method of automated medical diagnosis based on the description of biomedical systems using two parameters: energy, reflecting the interaction of the system's elements, and entropy, characterizing the organization of the system. Violations of the energy-entropy cycle of biomedical systems are reflected in the symptoms of the disease. The statistical link between the symptoms describing the condition of the body and the nature of excitation of its elements is best expressed in a heuristic description of the system state. High classification accuracy of the patient's condition is achieved by using heuristic detection methods. The proposed approach makes it possible to estimate the probability of correct diagnosis, increases the accuracy of the classification, and estimates the minimum size of the training sample and the number of its constituent signs. The classification technique consists of averaging the characteristic values in the selected classes, composing a symptom complex of the most important signs of the disease, carrying out a "rough" diagnosis with threshold rules that distinguish severe forms of the disease, and then performing a differential diagnosis of the severity of the disease. The proposed method was tested on the classification of the forms of puerperal endometritis (mild, moderate, severe). The training sample contained 70 case histories. The symptom complex used to classify the patient's condition was composed of 17 characteristics. Threshold diagnosis made it possible to establish the presence of disease and to separate severe forms. Differential diagnosis was used for classification of mild and moderate severity of postpartum endometritis. The accuracy of the classification of forms of postpartum endometritis amounted to 97.1%.

  12. TWO-STAGE CHARACTER CLASSIFICATION : A COMBINED APPROACH OF CLUSTERING AND SUPPORT VECTOR CLASSIFIERS

    NARCIS (Netherlands)

    Vuurpijl, L.; Schomaker, L.

    2000-01-01

    This paper describes a two-stage classification method for (1) classification of isolated characters and (2) verification of the classification result. Character prototypes are generated using hierarchical clustering. For those prototypes known to sometimes produce wrong classification results, a

  13. A novel underwater dam crack detection and classification approach based on sonar images.

    Science.gov (United States)

    Shi, Pengfei; Fan, Xinnan; Ni, Jianjun; Khan, Zubair; Li, Min

    2017-01-01

    Underwater dam crack detection and classification based on sonar images is a challenging task because underwater environments are complex and because cracks are quite random and diverse in nature. Furthermore, obtainable sonar images are of low resolution. To address these problems, a novel underwater dam crack detection and classification approach based on sonar imagery is proposed. First, the sonar images are divided into image blocks. Second, a clustering analysis of a 3-D feature space is used to obtain the crack fragments. Third, the crack fragments are connected using an improved tensor voting method. Fourth, a minimum spanning tree is used to obtain the crack curve. Finally, an improved evidence theory combined with fuzzy rule reasoning is proposed to classify the cracks. Experimental results show that the proposed approach is able to detect underwater dam cracks and classify them accurately and effectively under complex underwater environments.

  14. A novel underwater dam crack detection and classification approach based on sonar images.

    Directory of Open Access Journals (Sweden)

    Pengfei Shi

    Full Text Available Underwater dam crack detection and classification based on sonar images is a challenging task because underwater environments are complex and because cracks are quite random and diverse in nature. Furthermore, obtainable sonar images are of low resolution. To address these problems, a novel underwater dam crack detection and classification approach based on sonar imagery is proposed. First, the sonar images are divided into image blocks. Second, a clustering analysis of a 3-D feature space is used to obtain the crack fragments. Third, the crack fragments are connected using an improved tensor voting method. Fourth, a minimum spanning tree is used to obtain the crack curve. Finally, an improved evidence theory combined with fuzzy rule reasoning is proposed to classify the cracks. Experimental results show that the proposed approach is able to detect underwater dam cracks and classify them accurately and effectively under complex underwater environments.

  15. Childhood leukodystrophies: A literature review of updates on new definitions, classification, diagnostic approach and management.

    Science.gov (United States)

    Ashrafi, Mahmoud Reza; Tavasoli, Ali Reza

    2017-05-01

    Childhood leukodystrophies are a growing category of neurological disorders in pediatric neurology practice. With the help of new advanced genetic studies such as whole exome sequencing (WES) and whole genome sequencing (WGS), the list of childhood heritable white matter disorders has grown to more than one hundred disorders. During the last three decades, the basic concepts and definitions, classification, diagnostic approach and medical management of these disorders have changed greatly. Pattern recognition based on brain magnetic resonance imaging (MRI) has played an important role in this process. We reviewed the latest Global Leukodystrophy Initiative (GLIA) expert opinions on the definition, new classification, diagnostic approach and medical management, including emerging treatments, for pediatric leukodystrophies. Copyright © 2017 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  16. A Similarity-Based Approach for Audiovisual Document Classification Using Temporal Relation Analysis

    Directory of Open Access Journals (Sweden)

    Ferrane Isabelle

    2011-01-01

    Full Text Available Abstract We propose a novel approach for video classification that is based on the analysis of the temporal relationships between the basic events in audiovisual documents. Starting from basic segmentation results, we define a new representation method called the Temporal Relation Matrix (TRM). Each document is then described by a set of TRMs, the analysis of which makes higher-level events stand out. This representation was first designed to analyze any audiovisual document in order to find events that may well characterize its content and its structure. The aim of this work is to use this representation to compute a similarity measure between two documents. Approaches for audiovisual document classification are presented and discussed. Experiments are conducted on a set of 242 video documents and the results show the efficiency of our proposals.

  17. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    Science.gov (United States)

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

    The increasing technology of high-resolution image airborne sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is oriented towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used for the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989

  18. An efficient approach for video action classification based on 3d Zernike moments

    OpenAIRE

    Lassoued , Imen; Zagrouba , Ezzedine; Chahir , Youssef

    2011-01-01

    International audience; Action recognition in video and still images is one of the most challenging research topics in pattern recognition and computer vision. This paper proposes a new method for video action classification based on 3D Zernike moments, which aim to capture both structural and temporal information of a time-varying sequence. The originality of this approach consists in representing actions in video sequences by a three-dimensional shape obtained from different silhouett...

  19. Parallel exploitation of a spatial-spectral classification approach for hyperspectral images on RVC-CAL

    Science.gov (United States)

    Lazcano, R.; Madroñal, D.; Fabelo, H.; Ortega, S.; Salvador, R.; Callicó, G. M.; Juárez, E.; Sanz, C.

    2017-10-01

    Hyperspectral Imaging (HI) assembles high resolution spectral information from hundreds of narrow bands across the electromagnetic spectrum, thus generating 3D data cubes in which each spatial pixel gathers the spectral reflectance information across all bands. As a result, each image is composed of large volumes of data, which turns its processing into a challenge, as performance requirements have been continuously tightened. For instance, new HI applications demand real-time responses. Hence, parallel processing becomes a necessity to achieve this requirement, so the intrinsic parallelism of the algorithms must be exploited. In this paper, a spatial-spectral classification approach has been implemented using a dataflow language known as RVC-CAL. This language represents a system as a set of functional units, and its main advantage is that it simplifies the parallelization process by mapping the different blocks over different processing units. The spatial-spectral classification approach aims at refining the classification results previously obtained by using a K-Nearest Neighbors (KNN) filtering process, in which both the pixel spectral value and the spatial coordinates are considered. To do so, KNN needs two inputs: a one-band representation of the hyperspectral image and the classification results provided by a pixel-wise classifier. Thus, the spatial-spectral classification algorithm is divided into three different stages: a Principal Component Analysis (PCA) algorithm for computing the one-band representation of the image, a Support Vector Machine (SVM) classifier, and the KNN-based filtering algorithm. The parallelization of these algorithms shows promising results in terms of computational time, as mapping them over different cores yields a speedup of 2.69x when using 3 cores. Consequently, experimental results demonstrate that real-time processing of hyperspectral images is achievable.
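
    To make the three-stage pipeline concrete, here is a minimal serial sketch: PCA produces a one-band representation, an SVM gives a pixel-wise classification, and a KNN filter over spatial coordinates plus the one-band value refines the labels. All array shapes and parameters are assumptions for illustration; the paper's implementation is a parallel RVC-CAL dataflow, which this sketch does not reproduce.

        # Hedged sketch of the PCA -> SVM -> KNN spatial-spectral refinement pipeline (serial, illustrative).
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(2)
        rows, cols, bands = 50, 50, 120                 # synthetic hyperspectral cube (assumed size)
        cube = rng.normal(size=(rows, cols, bands))
        labels = rng.integers(0, 4, size=(rows, cols))  # synthetic ground truth
        train_mask = rng.random((rows, cols)) < 0.05    # sparse training pixels

        pixels = cube.reshape(-1, bands)

        # Stage 1: one-band representation via the first principal component.
        one_band = PCA(n_components=1).fit_transform(pixels).ravel()

        # Stage 2: pixel-wise SVM classification from the full spectra.
        svm = SVC(kernel="rbf", gamma="scale").fit(pixels[train_mask.ravel()], labels[train_mask])
        svm_labels = svm.predict(pixels)

        # Stage 3: KNN filtering in (row, col, one-band value) space to smooth the SVM map.
        r, c = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
        features = np.column_stack([r.ravel(), c.ravel(), one_band])
        refined = KNeighborsClassifier(n_neighbors=9).fit(features, svm_labels).predict(features)
        print("pixels relabelled by the spatial-spectral filter:", int((refined != svm_labels).sum()))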

  20. Improving oil classification quality from oil spill fingerprint beyond six sigma approach.

    Science.gov (United States)

    Juahir, Hafizan; Ismail, Azimah; Mohamed, Saiful Bahri; Toriman, Mohd Ekhwan; Kassim, Azlina Md; Zain, Sharifuddin Md; Ahmad, Wan Kamaruzaman Wan; Wah, Wong Kok; Zali, Munirah Abdul; Retnam, Ananthy; Taib, Mohd Zaki Mohd; Mokhtar, Mazlin

    2017-07-15

    This study applies quality engineering to oil spill classification based on oil spill fingerprinting from GC-FID and GC-MS, employing the six-sigma approach. The oil spills were recovered from various water areas of Peninsular Malaysia and Sabah (East Malaysia). The six sigma methodologies effectively serve as a problem-solving framework for classifying oils from the complex mixtures in the oil spill dataset. Linking the six sigma analysis with quality engineering improved organizational performance in achieving the objectives of environmental forensics. The study reveals that oil spills are discriminated into four groups, viz. diesel, hydrocarbon fuel oil (HFO), mixture oil lubricant and fuel oil (MOLFO) and waste oil (WO), according to the similarity of their intrinsic chemical properties. Validation confirmed that four discriminant components, diesel, HFO, MOLFO and WO, dominate the oil types with a total variance of 99.51%, with ANOVA giving F stat > F critical at the 95% confidence level and a Chi Square goodness-of-fit test of 74.87. The results obtained from this study reveal that by employing a six-sigma approach in a data-driven problem such as oil spill classification, good decision making can be expedited. Copyright © 2017. Published by Elsevier Ltd.

  1. Emulating natural disturbance regimes: an emerging approach for sustainable forest management

    Science.gov (United States)

    M. North; W Keeton

    2008-01-01

    Sustainable forest management integrates ecological, social, and economic objectives. To achieve the former, researchers and practitioners are modifying silvicultural practices based on concepts from successional and landscape ecology to provide a broader array of ecosystem functions than is associated with conventional approaches. One...

  2. Automated classification of tropical shrub species: a hybrid of leaf shape and machine learning approach.

    Science.gov (United States)

    Murat, Miraemiliana; Chang, Siow-Wee; Abu, Arpah; Yap, Hwa Jen; Yong, Kien-Thai

    2017-01-01

    Plants play a crucial role in foodstuff, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, it is a difficult task to identify plant species because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable since it can help specialists as well as the public in identifying plant species easily. Shape descriptors were applied on the myDAUN dataset that contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on a literature review, this is the first study on the development of a tropical shrub species image dataset and classification using a hybrid of leaf shape and machine learning approach. Four types of shape descriptors were used in this study, namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptors, as well as combinations of hybrid descriptors, were tested and compared. The tropical shrub species are classified using six different classifiers, which are artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three types of feature selection methods were tested on the myDAUN dataset, namely Relief, Correlation-based feature selection (CFS) and Pearson's coefficient correlation (PCC). The well-known Flavia dataset and Swedish Leaf dataset were used as the validation datasets for the proposed methods. The results showed that the hybrid of all descriptors with ANN outperformed the other classifiers with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia dataset and 99
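
    As a loose illustration of the descriptor-plus-classifier idea, the sketch below extracts HOG features from synthetic leaf-like images and trains a small neural network. Image sizes, HOG parameters and the classifier settings are assumptions and do not reproduce the hybrid descriptor set, feature selection or tuning reported in the paper.

        # Hedged sketch: HOG shape descriptors + ANN classifier for leaf images (illustrative only).
        import numpy as np
        from skimage.feature import hog
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        n_images, size, n_species = 200, 64, 5
        images = rng.random((n_images, size, size))          # stand-ins for binarised leaf images
        species = rng.integers(0, n_species, size=n_images)  # stand-in species labels

        # Histogram of Oriented Gradients as the shape descriptor.
        features = np.array([hog(img, orientations=9, pixels_per_cell=(8, 8),
                                 cells_per_block=(2, 2)) for img in images])

        X_tr, X_te, y_tr, y_te = train_test_split(features, species, test_size=0.3, random_state=0)
        ann = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X_tr, y_tr)
        print(f"test accuracy on synthetic data: {ann.score(X_te, y_te):.2f}")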

  3. A coarse-to-fine approach for medical hyperspectral image classification with sparse representation

    Science.gov (United States)

    Chang, Lan; Zhang, Mengmeng; Li, Wei

    2017-10-01

    A coarse-to-fine approach with sparse representation is proposed for medical hyperspectral image classification in this work. A segmentation technique with different scales is employed to exploit the edges of the input image, where coarse super-pixel patches provide global classification information while fine ones further provide detail. Unlike a common RGB image, a hyperspectral image has multiple bands, which allows the cluster centers to be adjusted with higher precision. After segmentation, each super-pixel is classified by the recently-developed sparse representation-based classification (SRC), which assigns a label to the testing samples in one local patch by means of a sparse linear combination of all the training samples. Furthermore, segmentation with multiple scales is employed because a single scale is not suitable for the complicated distribution of medical hyperspectral imagery. Finally, classification results for different sizes of super-pixel are fused by a fusion strategy, offering at least two benefits: (1) the final result is clearly superior to that of segmentation with a single scale, and (2) the fusion process significantly simplifies the choice of scales. Experimental results using real medical hyperspectral images demonstrate that the proposed method outperforms the state-of-the-art SRC.
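
    The sketch below illustrates only the core SRC step: each test spectrum is coded as a sparse combination of all training spectra (here via orthogonal matching pursuit) and assigned to the class whose training atoms give the smallest reconstruction residual. The data, sparsity level and shapes are assumptions; the multi-scale super-pixel segmentation and fusion stages of the paper are not shown.

        # Hedged sketch of sparse representation-based classification (SRC), illustrative only.
        import numpy as np
        from sklearn.linear_model import OrthogonalMatchingPursuit

        rng = np.random.default_rng(4)
        n_bands, n_train_per_class, n_classes = 60, 30, 3
        # Dictionary: columns are (synthetic) training spectra, grouped by class.
        D = np.hstack([rng.normal(loc=c, scale=0.5, size=(n_bands, n_train_per_class))
                       for c in range(n_classes)])
        class_of_atom = np.repeat(np.arange(n_classes), n_train_per_class)

        def src_label(x, n_nonzero=10):
            """Assign the class with the smallest class-wise reconstruction residual."""
            omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero, fit_intercept=False).fit(D, x)
            alpha = omp.coef_
            residuals = []
            for c in range(n_classes):
                alpha_c = np.where(class_of_atom == c, alpha, 0.0)   # keep only class-c coefficients
                residuals.append(np.linalg.norm(x - D @ alpha_c))
            return int(np.argmin(residuals))

        test_spectrum = rng.normal(loc=1, scale=0.5, size=n_bands)    # a synthetic class-1 spectrum
        print("predicted class:", src_label(test_spectrum))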

  4. Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use

    Science.gov (United States)

    Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil

    2013-01-01

    The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures exploiting characteristic signatures of such histograms. Two histogram-matching classifiers were evaluated and compared to the standard nearest neighbor to mean classifier. An ADS40 airborne multispectral image of San Diego, California was used for assessing the utility of curve matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram-matching classifiers consistently performed better than the one based on the standard nearest neighbor to mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
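
    A minimal sketch of the histogram curve matching idea: each object is summarised by a normalised digital-number histogram, and a test object is assigned to the class whose mean training histogram it matches most closely (here with a simple histogram-intersection score). The bin count, the matching measure and the synthetic data are assumptions, not the classifiers evaluated in the paper.

        # Hedged sketch: classify image objects by matching their DN histograms to class templates.
        import numpy as np

        rng = np.random.default_rng(5)
        n_bins = 32

        def object_histogram(pixel_values):
            """Normalised digital-number histogram of one image object."""
            h, _ = np.histogram(pixel_values, bins=n_bins, range=(0, 255))
            return h / h.sum()

        # Synthetic training objects for two land-cover classes with different DN distributions.
        train = {
            "vegetation": [object_histogram(rng.normal(80, 15, 400)) for _ in range(50)],
            "built-up":   [object_histogram(rng.normal(170, 25, 400)) for _ in range(50)],
        }
        templates = {cls: np.mean(hists, axis=0) for cls, hists in train.items()}

        def classify(pixel_values):
            h = object_histogram(pixel_values)
            # Histogram intersection: higher means the curves match more closely.
            scores = {cls: np.minimum(h, t).sum() for cls, t in templates.items()}
            return max(scores, key=scores.get)

        print(classify(rng.normal(85, 15, 400)))   # expected: vegetation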

  5. A Neuro-Fuzzy Approach in the Classification of Students’ Academic Performance

    Directory of Open Access Journals (Sweden)

    Quang Hung Do

    2013-01-01

    Full Text Available Classifying the student academic performance with high accuracy facilitates admission decisions and enhances educational services at educational institutions. The purpose of this paper is to present a neuro-fuzzy approach for classifying students into different groups. The neuro-fuzzy classifier used previous exam results and other related factors as input variables and labeled students based on their expected academic performance. The results showed that the proposed approach achieved a high accuracy. The results were also compared with those obtained from other well-known classification approaches, including support vector machine, Naive Bayes, neural network, and decision tree approaches. The comparative analysis indicated that the neuro-fuzzy approach performed better than the others. It is expected that this work may be used to support student admission procedures and to strengthen the services of educational institutions.

  6. Martingale approach in pricing and hedging European options under regime-switching

    OpenAIRE

    Grigori N. Milstein; Vladimir Spokoiny

    2011-01-01

    The paper focuses on the problem of pricing and hedging a European contingent claim for an incomplete market model, in which evolution of price processes for a saving account and stocks depends on an observable Markov chain. The pricing function is evaluated using the martingale approach. The equivalent martingale measure is introduced in a way that the Markov chain remains the historical one, and the pricing function satisfies the Cauchy problem for a system of linear parabolic equations. It...

  7. A regime-switching cointegration approach for removing environmental and operational variations in structural health monitoring

    Science.gov (United States)

    Shi, Haichen; Worden, Keith; Cross, Elizabeth J.

    2018-03-01

    Cointegration is now extensively used to model the long-term common trends among economic variables in the field of econometrics. Recently, cointegration has been successfully implemented in the context of structural health monitoring (SHM), where it has been used to remove the confounding influences of environmental and operational variations (EOVs) that can often mask the signature of structural damage. However, restrained by its linear nature, the conventional cointegration approach has limited power in modelling systems where measurands are nonlinearly related; this occurs, for example, in the benchmark study of the Z24 Bridge, where nonlinear relationships between natural frequencies were induced during a period of very cold temperatures. To allow the removal of EOVs from SHM data with nonlinear relationships like this, this paper extends the well-established cointegration method to a nonlinear context by allowing a breakpoint in the cointegrating vector. In a novel approach, the augmented Dickey-Fuller (ADF) statistic is used to find the most appropriate position for inserting a breakpoint, and the Johansen procedure is then utilised for the estimation of the cointegrating vectors. The proposed approach is examined with a simulated case and real SHM data from the Z24 Bridge, demonstrating that the EOVs can be neatly eliminated.
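
    A rough sketch of the breakpoint-selection idea under simplifying assumptions: two series are cointegrated with a coefficient that switches at an unknown sample; for each candidate breakpoint the cointegrating relation is fitted separately on the two segments and the stationarity of the pooled residual is scored with the ADF statistic, keeping the most negative value. The simple regression used here stands in for the Johansen procedure of the paper, and the data are synthetic.

        # Hedged sketch: choosing a cointegration breakpoint by minimising the ADF statistic of the residual.
        import numpy as np
        from statsmodels.tsa.stattools import adfuller

        rng = np.random.default_rng(6)
        n, true_break = 600, 350
        x = np.cumsum(rng.normal(size=n))                      # common stochastic trend
        beta = np.where(np.arange(n) < true_break, 1.0, 2.5)   # cointegrating coefficient switches
        y = beta * x + rng.normal(scale=0.5, size=n)

        def pooled_residual(bp):
            """Residual from fitting y = beta * x separately before and after the breakpoint."""
            res = np.empty(n)
            for seg in (slice(0, bp), slice(bp, n)):
                b = np.dot(x[seg], y[seg]) / np.dot(x[seg], x[seg])   # least-squares slope
                res[seg] = y[seg] - b * x[seg]
            return res

        candidates = range(50, n - 50, 10)
        adf_stats = [adfuller(pooled_residual(bp))[0] for bp in candidates]
        best = list(candidates)[int(np.argmin(adf_stats))]
        print(f"estimated breakpoint: {best} (true: {true_break})")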

  8. Comparison of Standard and Novel Signal Analysis Approaches to Obstructive Sleep Apnoea Classification

    Directory of Open Access Journals (Sweden)

    Aoife eRoebuck

    2015-08-01

    Full Text Available Obstructive sleep apnoea (OSA) is a disorder characterised by repeated pauses in breathing during sleep, which lead to deoxygenation and voiced chokes at the end of each episode. OSA is associated with daytime sleepiness and an increased risk of serious conditions such as cardiovascular disease, diabetes and stroke. Between 2% and 7% of the adult population globally has OSA, but it is estimated that up to 90% of those are undiagnosed and untreated. Diagnosis of OSA requires expensive and cumbersome screening. Audio offers a potential non-contact alternative, particularly with the ubiquity of excellent signal processing on every phone. Previous studies have focused on the classification of snoring and apnoeic chokes. However, such approaches require accurate identification of events. This leads to limited accuracy and small study populations. In this work we propose an alternative approach which uses multiscale entropy (MSE) coefficients presented to a classifier to identify disorder in vocal patterns indicative of sleep apnoea. A database of 858 patients was used, the largest reported in this domain. Apnoeic choke, snore, and noise events encoded with speech analysis features were input into a linear classifier. Coefficients of MSE derived from the first 4 hours of each recording were used to train and test a random forest to classify patients as apnoeic or not. Standard speech analysis approaches for event classification achieved an out-of-sample accuracy (Ac) of 76.9% with a sensitivity (Se) of 29.2% and a specificity (Sp) of 88.7%, but with high variance. For OSA severity classification, MSE provided an out-of-sample Ac of 79.9%, Se of 66.0% and Sp of 88.8%. Including demographic information improved the MSE-based classification performance to Ac = 80.5%, Se = 69.2%, Sp = 87.9%. These results indicate that audio recordings could be used in screening for OSA, but are generally under-sensitive.
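
    For concreteness, the sketch below computes multiscale entropy in the usual way: the signal is coarse-grained at increasing scale factors and sample entropy is computed at each scale. The tolerance r, embedding dimension m and the synthetic signal are assumptions; the paper derives MSE coefficients from long audio recordings rather than this toy series.

        # Hedged sketch: multiscale (sample) entropy of a 1-D signal, illustrative only.
        import numpy as np

        def sample_entropy(x, m=2, r=0.2):
            """Sample entropy with embedding dimension m and tolerance r * std(x)."""
            x = np.asarray(x, dtype=float)
            tol = r * x.std()

            def count_matches(dim):
                templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
                # Chebyshev distance between all template pairs (excluding self-matches).
                d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
                n = len(templates)
                return ((d <= tol).sum() - n) / 2

            B, A = count_matches(m), count_matches(m + 1)
            return -np.log(A / B) if A > 0 and B > 0 else np.inf

        def multiscale_entropy(x, max_scale=5, m=2, r=0.2):
            mse = []
            for scale in range(1, max_scale + 1):
                # Coarse-grain: average consecutive non-overlapping windows of length `scale`.
                n = len(x) // scale
                coarse = np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)
                mse.append(sample_entropy(coarse, m, r))
            return mse

        rng = np.random.default_rng(7)
        signal = rng.normal(size=600)              # stand-in for an audio-derived series
        print(multiscale_entropy(signal, max_scale=4))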

  9. A Comparison of Computer-Based Classification Testing Approaches Using Mixed-Format Tests with the Generalized Partial Credit Model

    Science.gov (United States)

    Kim, Jiseon

    2010-01-01

    Classification testing has been widely used to make categorical decisions by determining whether an examinee has a certain degree of ability required by established standards. As computer technologies have developed, classification testing has become more computerized. Several approaches have been proposed and investigated in the context of…

  10. A Framework and Classification for Fault Detection Approaches in Wireless Sensor Networks with an Energy Efficiency Perspective

    DEFF Research Database (Denmark)

    Zhang, Yue; Dragoni, Nicola; Wang, Jiangtao

    2015-01-01

    efficiency to facilitate the design of fault detection methods and the evaluation of their energy efficiency. Following the same design principle of the fault detection framework, the paper proposes a classification for fault detection approaches. The classification is applied to a number of fault detection...

  11. Developing a regional scale approach for modelling the impacts of fertiliser regime on N2O emissions in Ireland

    Science.gov (United States)

    Zimmermann, Jesko; Jones, Michael

    2016-04-01

    error (RMSE < RMSE95) or bias (RE < RE95). A general trend observed was that model performance declined with increased fertilisation rates. Overall, DayCent showed the best performance; however, it does not provide the possibility of modelling the addition of urease inhibitors. The results suggest that modelling changes in fertiliser regime on a large scale may require a multi-model approach to assure best performance. Ultimately, the research aims to develop a GIS-based platform to apply such an approach on a regional scale.

  12. Classification of follicular lymphoma images: a holistic approach with symbol-based machine learning methods.

    Science.gov (United States)

    Zorman, Milan; Sánchez de la Rosa, José Luis; Dinevski, Dejan

    2011-12-01

    It is not very often that a symbol-based machine learning approach is used for the purpose of image classification and recognition. In this paper we present such an approach, which we first used on follicular lymphoma images. Lymphoma is a broad term encompassing a variety of cancers of the lymphatic system. Lymphoma is differentiated by the type of cell that multiplies and how the cancer presents itself. It is very important to get an exact diagnosis regarding lymphoma and to determine the treatments that will be most effective for the patient's condition. Our work focused on the identification of lymphomas by finding follicles in microscopy images provided by the Laboratory of Pathology in the University Hospital of Tenerife, Spain. We divided our work into two stages: in the first stage we performed image pre-processing and feature extraction, and in the second stage we used different symbolic machine learning approaches for pixel classification. Symbolic machine learning approaches are often neglected when looking for image analysis tools; although they are known for very appropriate knowledge representation, they are also claimed to lack computational power. The results we obtained are very promising and show that symbolic approaches can be successful in image analysis applications.

  13. Automated lung nodule classification following automated nodule detection on CT: A serial approach

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Altman, Michael B.; Wilkie, Joel; Sone, Shusuke; Li, Feng; Doi, Kunio; Roy, Arunabha S.

    2003-01-01

    We have evaluated the performance of an automated classifier applied to the task of differentiating malignant and benign lung nodules in low-dose helical computed tomography (CT) scans acquired as part of a lung cancer screening program. The nodules classified in this manner were initially identified by our automated lung nodule detection method, so that the output of automated lung nodule detection was used as input to automated lung nodule classification. This study begins to narrow the distinction between the 'detection task' and the 'classification task'. Automated lung nodule detection is based on two- and three-dimensional analyses of the CT image data. Gray-level-thresholding techniques are used to identify initial lung nodule candidates, for which morphological and gray-level features are computed. A rule-based approach is applied to reduce the number of nodule candidates that correspond to non-nodules, and the features of remaining candidates are merged through linear discriminant analysis to obtain final detection results. Automated lung nodule classification merges the features of the lung nodule candidates identified by the detection algorithm that correspond to actual nodules through another linear discriminant classifier to distinguish between malignant and benign nodules. The automated classification method was applied to the computerized detection results obtained from a database of 393 low-dose thoracic CT scans containing 470 confirmed lung nodules (69 malignant and 401 benign nodules). Receiver operating characteristic (ROC) analysis was used to evaluate the ability of the classifier to differentiate between nodule candidates that correspond to malignant nodules and nodule candidates that correspond to benign lesions. The area under the ROC curve for this classification task attained a value of 0.79 during a leave-one-out evaluation.

  14. Fuzzy Continuous Review Inventory Model using ABC Multi-Criteria Classification Approach: A Single Case Study

    Directory of Open Access Journals (Sweden)

    Meriastuti - Ginting

    2015-07-01

    Full Text Available Abstract. Inventory is considered the most expensive, yet important, asset for many companies. It represents approximately 50% of the total investment. Inventory cost has become one of the major contributors to inefficiency, therefore it should be managed effectively. This study aims to propose an alternative inventory model by using an ABC multi-criteria classification approach to minimize total cost. By combining FANP (Fuzzy Analytical Network Process) and TOPSIS (Technique of Order Preferences by Similarity to the Ideal Solution), the ABC multi-criteria classification approach identified 12 items of 69 inventory items as the “outstanding important class” that contributed 80% of total inventory cost. This finding is then used as the basis to determine the proposed continuous review inventory model. This study found that by using fuzzy trapezoidal cost, the inventory turnover ratio can be increased, and inventory cost can be decreased by 78% for each item in “class A” inventory. Keywords: ABC multi-criteria classification, FANP-TOPSIS, continuous review inventory model, lead-time demand distribution, trapezoidal fuzzy number
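
    As a simplified illustration of multi-criteria ABC classification, the sketch below ranks inventory items with a plain (non-fuzzy) TOPSIS score over a few criteria and then assigns A/B/C classes by cumulative score share. The criteria, weights and cut-offs are assumptions, and the fuzzy ANP weighting used in the paper is not reproduced.

        # Hedged sketch: multi-criteria ABC classification via a plain TOPSIS ranking (illustrative only).
        import numpy as np

        rng = np.random.default_rng(8)
        n_items = 69
        # Criteria per item: annual usage value, lead time, criticality (all synthetic).
        X = np.column_stack([rng.lognormal(3, 1, n_items),
                             rng.uniform(1, 30, n_items),
                             rng.uniform(0, 1, n_items)])
        weights = np.array([0.5, 0.3, 0.2])            # assumed criteria weights (FANP would supply these)

        # TOPSIS: weighted-normalised matrix, distances to ideal and anti-ideal solutions.
        V = weights * X / np.linalg.norm(X, axis=0)
        ideal, anti = V.max(axis=0), V.min(axis=0)
        d_plus = np.linalg.norm(V - ideal, axis=1)
        d_minus = np.linalg.norm(V - anti, axis=1)
        score = d_minus / (d_plus + d_minus)

        # ABC split by cumulative score share (80% / 15% / 5% cut-offs assumed).
        order = np.argsort(score)[::-1]
        cum_share = np.cumsum(score[order]) / score.sum()
        classes = np.empty(n_items, dtype="<U1")
        classes[order] = np.where(cum_share <= 0.80, "A", np.where(cum_share <= 0.95, "B", "C"))
        print({c: int((classes == c).sum()) for c in "ABC"})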

  15. Transient regimes during high-temperature deformation of a bulk metallic glass: A free volume approach

    International Nuclear Information System (INIS)

    Bletry, M.; Guyot, P.; Brechet, Y.; Blandin, J.J.; Soubeyroux, J.L.

    2007-01-01

    The homogeneous deformation of a zirconium-based bulk metallic glass is investigated in the glass transition range. Compression and stress-relaxation tests have been conducted. The stress-strain curves are modeled in the framework of the free volume theory, including transient phenomena (overshoot and undershoot). This approach allows several physical parameters (activation volume, flow defect creation and relaxation coefficient) to be determined from a mechanical experiment. This model is able to rationalize the dependency of stress overshoot on relaxation time. It is shown that, due to the relationship between the flow defect concentration and the free volume in the model, it is impossible to determine the equilibrium flow defect concentration. However, the relative variation of the flow defect concentration is always the same, and all the model parameters depend on the equilibrium flow defect concentration. The methodology presented in this paper should, in the future, allow the consistency of the free volume model to be assessed.

  16. Classification of gene expression data: A hubness-aware semi-supervised approach.

    Science.gov (United States)

    Buza, Krisztian

    2016-04-01

    Classification of gene expression data is the common denominator of various biomedical recognition tasks. However, obtaining class labels for large training samples may be difficult or even impossible in many cases. Therefore, semi-supervised classification techniques are required as semi-supervised classifiers take advantage of unlabeled data. Gene expression data is high-dimensional which gives rise to the phenomena known under the umbrella of the curse of dimensionality, one of its recently explored aspects being the presence of hubs or hubness for short. Therefore, hubness-aware classifiers have been developed recently, such as Naive Hubness-Bayesian k-Nearest Neighbor (NHBNN). In this paper, we propose a semi-supervised extension of NHBNN which follows the self-training schema. As one of the core components of self-training is the certainty score, we propose a new hubness-aware certainty score. We performed experiments on publicly available gene expression data. These experiments show that the proposed classifier outperforms its competitors. We investigated the impact of each of the components (classification algorithm, semi-supervised technique, hubness-aware certainty score) separately and showed that each of these components are relevant to the performance of the proposed approach. Our results imply that our approach may increase classification accuracy and reduce computational costs (i.e., runtime). Based on the promising results presented in the paper, we envision that hubness-aware techniques will be used in various other biomedical machine learning tasks. In order to accelerate this process, we made an implementation of hubness-aware machine learning techniques publicly available in the PyHubs software package (http://www.biointelligence.hu/pyhubs) implemented in Python, one of the most popular programming languages of data science. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
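
    A very reduced sketch of the self-training schema described above: a k-NN classifier is trained on the labelled portion, the most confidently predicted unlabelled samples are pseudo-labelled and added to the training set, and the process repeats. Here a simple vote-fraction certainty score stands in for the hubness-aware score of the paper, and the data are synthetic rather than gene expression profiles.

        # Hedged sketch: self-training semi-supervised k-NN with a simple certainty score (illustrative only).
        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(9)
        n, n_genes = 300, 50
        X = np.vstack([rng.normal(0, 1, (n // 2, n_genes)), rng.normal(1, 1, (n // 2, n_genes))])
        y = np.repeat([0, 1], n // 2)
        labelled = rng.random(n) < 0.1                      # only 10% of samples carry labels

        X_lab, y_lab = X[labelled], y[labelled]
        X_unlab = X[~labelled]

        for _ in range(5):                                   # a few self-training rounds
            if len(X_unlab) == 0:
                break
            knn = KNeighborsClassifier(n_neighbors=5).fit(X_lab, y_lab)
            proba = knn.predict_proba(X_unlab)
            certainty = proba.max(axis=1)                    # vote fraction as the certainty score
            take = certainty >= 0.8                          # accept only confident pseudo-labels
            if not take.any():
                break
            X_lab = np.vstack([X_lab, X_unlab[take]])
            y_lab = np.concatenate([y_lab, proba[take].argmax(axis=1)])
            X_unlab = X_unlab[~take]

        print("final training-set size:", len(y_lab))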

  17. A Transform-Based Feature Extraction Approach for Motor Imagery Tasks Classification

    Science.gov (United States)

    Khorshidtalab, Aida; Mesbah, Mostefa; Salami, Momoh J. E.

    2015-01-01

    In this paper, we present a new motor imagery classification method in the context of electroencephalography (EEG)-based brain–computer interface (BCI). This method uses a signal-dependent orthogonal transform, referred to as linear prediction singular value decomposition (LP-SVD), for feature extraction. The transform defines the mapping as the left singular vectors of the LP coefficient filter impulse response matrix. Using a logistic tree-based model classifier, the extracted features are classified into one of four motor imagery movements. The proposed approach was first benchmarked against two related state-of-the-art feature extraction approaches, namely, discrete cosine transform (DCT) and adaptive autoregressive (AAR)-based methods. By achieving an accuracy of 67.35%, the LP-SVD approach outperformed the other approaches by large margins (25% compared with DCT and 6% compared with AAR-based methods). To further improve the discriminatory capability of the extracted features and reduce the computational complexity, we enlarged the extracted feature subset by incorporating two extra features, namely, the Q- and Hotelling's $T^2$ statistics of the transformed EEG, and introduced a new EEG channel selection method. The performance of the EEG classification based on the expanded feature set and channel selection method was compared with that of a number of the state-of-the-art classification methods previously reported with the BCI IIIa competition data set. Our method came second with an average accuracy of 81.38%. PMID:27170898

  18. Cluster Validity Classification Approaches Based on Geometric Probability and Application in the Classification of Remotely Sensed Images

    Directory of Open Access Journals (Sweden)

    LI Jian-Wei

    2014-08-01

    Full Text Available On the basis of the cluster validity function based on geometric probability in literature [1, 2], we propose a cluster analysis method based on geometric probability to process large amounts of data in a rectangular area. The basic idea is top-down stepwise refinement, first into categories and then into subcategories. On all clustering levels, the cluster validity function based on geometric probability is first used to determine the clusters and the gathering direction, and then the clustering centers and the borders of the clusters are determined. Through TM remote sensing image classification examples, the method is compared with the supervised and unsupervised classification in ERDAS and with the cluster analysis method based on geometric probability in a two-dimensional square proposed in literature 2. Results show that the proposed method can significantly improve the classification accuracy.

  19. A Novel Approach to ECG Classification Based upon Two-Layered HMMs in Body Sensor Networks

    Directory of Open Access Journals (Sweden)

    Wei Liang

    2014-03-01

    Full Text Available This paper presents a novel approach to ECG signal filtering and classification. Unlike the traditional techniques which aim at collecting and processing the ECG signals with the patient being still, lying in bed in hospitals, our proposed algorithm is intentionally designed for monitoring and classifying the patient’s ECG signals in the free-living environment. The patients are equipped with wearable ambulatory devices the whole day, which facilitates the real-time heart attack detection. In ECG preprocessing, an integral-coefficient-band-stop (ICBS) filter is applied, which omits time-consuming floating-point computations. In addition, two-layered Hidden Markov Models (HMMs) are applied to achieve ECG feature extraction and classification. The periodic ECG waveforms are segmented into ISO intervals, P subwave, QRS complex and T subwave respectively in the first HMM layer where expert-annotation assisted Baum-Welch algorithm is utilized in HMM modeling. Then the corresponding interval features are selected and applied to categorize the ECG into normal type or abnormal type (PVC, APC) in the second HMM layer. For verifying the effectiveness of our algorithm on abnormal signal detection, we have developed an ECG body sensor network (BSN) platform, whereby real-time ECG signals are collected, transmitted, displayed and the corresponding classification outcomes are deduced and shown on the BSN screen.

  20. A Novel Approach to ECG Classification Based upon Two-Layered HMMs in Body Sensor Networks

    Science.gov (United States)

    Liang, Wei; Zhang, Yinlong; Tan, Jindong; Li, Yang

    2014-01-01

    This paper presents a novel approach to ECG signal filtering and classification. Unlike the traditional techniques which aim at collecting and processing the ECG signals with the patient being still, lying in bed in hospitals, our proposed algorithm is intentionally designed for monitoring and classifying the patient's ECG signals in the free-living environment. The patients are equipped with wearable ambulatory devices the whole day, which facilitates the real-time heart attack detection. In ECG preprocessing, an integral-coefficient-band-stop (ICBS) filter is applied, which omits time-consuming floating-point computations. In addition, two-layered Hidden Markov Models (HMMs) are applied to achieve ECG feature extraction and classification. The periodic ECG waveforms are segmented into ISO intervals, P subwave, QRS complex and T subwave respectively in the first HMM layer where expert-annotation assisted Baum-Welch algorithm is utilized in HMM modeling. Then the corresponding interval features are selected and applied to categorize the ECG into normal type or abnormal type (PVC, APC) in the second HMM layer. For verifying the effectiveness of our algorithm on abnormal signal detection, we have developed an ECG body sensor network (BSN) platform, whereby real-time ECG signals are collected, transmitted, displayed and the corresponding classification outcomes are deduced and shown on the BSN screen. PMID:24681668
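
    A very loose sketch of the two-layer idea on synthetic data: a 4-state Gaussian HMM (fitted with Baum-Welch via hmmlearn) segments a repeating waveform into sub-wave states, and per-beat state-occupancy features then feed a second-stage classifier. Here the second layer is a plain logistic regression rather than the interval-feature HMM of the paper, and all signals and labels are synthetic assumptions.

        # Hedged sketch: HMM waveform segmentation followed by a simple beat classifier (illustrative only).
        import numpy as np
        from hmmlearn import hmm
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(10)
        # Synthetic "ECG-like" sequence: repeated segments standing in for ISO interval,
        # P wave, QRS complex and T wave.
        segment_means = [0.0, 0.2, 1.5, 0.4]
        beat = np.concatenate([rng.normal(m, 0.05, 25) for m in segment_means])
        signal = np.tile(beat, 40).reshape(-1, 1)

        # Layer 1: a 4-state Gaussian HMM segments the waveform (Baum-Welch via .fit()).
        segmenter = hmm.GaussianHMM(n_components=4, covariance_type="diag", n_iter=50, random_state=0)
        segmenter.fit(signal)
        states = segmenter.predict(signal)

        # Layer 2 (stand-in): per-beat state-occupancy features -> normal/abnormal classifier.
        beats = states.reshape(40, len(beat))
        features = np.array([[np.mean(b == s) for s in range(4)] for b in beats])
        labels = rng.integers(0, 2, size=40)        # synthetic normal/abnormal labels
        clf = LogisticRegression().fit(features, labels)
        print("training accuracy on synthetic beats:", clf.score(features, labels))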

  1. Cloud field classification based upon high spatial resolution textural features. II - Simplified vector approaches

    Science.gov (United States)

    Chen, D. W.; Sengupta, S. K.; Welch, R. M.

    1989-01-01

    This paper compares the results of cloud-field classification derived from two simplified vector approaches, the Sum and Difference Histogram (SADH) and the Gray Level Difference Vector (GLDV), with the results produced by the Gray Level Cooccurrence Matrix (GLCM) approach described by Welch et al. (1988). It is shown that the SADH method produces accuracies equivalent to those obtained using the GLCM method, while the GLDV method fails to resolve error clusters. Compared to the GLCM method, the SADH method leads to a 31 percent saving in run time and a 50 percent saving in storage requirements, while the GLDV approach leads to a 40 percent saving in run time and an 87 percent saving in storage requirements.

  2. Application of information retrieval approaches to case classification in the vaccine adverse event reporting system.

    Science.gov (United States)

    Botsis, Taxiarchis; Woo, Emily Jane; Ball, Robert

    2013-07-01

    Automating the classification of adverse event reports is an important step to improve the efficiency of vaccine safety surveillance. Previously we showed it was possible to classify reports using features extracted from the text of the reports. The aim of this study was to use the information encoded in the Medical Dictionary for Regulatory Activities (MedDRA(®)) in the US Vaccine Adverse Event Reporting System (VAERS) to support and evaluate two classification approaches: a multiple information retrieval strategy and a rule-based approach. To evaluate the performance of these approaches, we selected the conditions of anaphylaxis and Guillain-Barré syndrome (GBS). We used MedDRA(®) Preferred Terms stored in the VAERS, and two standardized medical terminologies: the Brighton Collaboration (BC) case definitions and Standardized MedDRA(®) Queries (SMQ) to classify two sets of reports for GBS and anaphylaxis. Two approaches were used: (i) the rule-based instruments that are available by the two terminologies (the Automatic Brighton Classification [ABC] tool and the SMQ algorithms); and (ii) the vector space model. We found that the rule-based instruments, particularly the SMQ algorithms, achieved a high degree of specificity; however, there was a cost in terms of sensitivity in all but the narrow GBS SMQ algorithm that outperformed the remaining approaches (sensitivity in the testing set was equal to 99.06 % for this algorithm vs. 93.40 % for the vector space model). In the case of anaphylaxis, the vector space model achieved higher sensitivity compared with the best values of both the ABC tool and the SMQ algorithms in the testing set (86.44 % vs. 64.11 % and 52.54 %, respectively). Our results showed the superiority of the vector space model over the existing rule-based approaches irrespective of the standardized medical knowledge represented by either the SMQ or the BC case definition. The vector space model might make automation of case definitions for

  3. Automated classification of tropical shrub species: a hybrid of leaf shape and machine learning approach

    Directory of Open Access Journals (Sweden)

    Miraemiliana Murat

    2017-09-01

    Full Text Available Plants play a crucial role in foodstuff, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, it is a difficult task to identify plant species because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable since it can help specialists as well as the public in identifying plant species easily. Shape descriptors were applied on the myDAUN dataset that contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on a literature review, this is the first study on the development of a tropical shrub species image dataset and classification using a hybrid of leaf shape and machine learning approach. Four types of shape descriptors were used in this study, namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptors, as well as combinations of hybrid descriptors, were tested and compared. The tropical shrub species are classified using six different classifiers, which are artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three types of feature selection methods were tested on the myDAUN dataset, namely Relief, Correlation-based feature selection (CFS) and Pearson’s coefficient correlation (PCC). The well-known Flavia dataset and Swedish Leaf dataset were used as the validation datasets for the proposed methods. The results showed that the hybrid of all descriptors with ANN outperformed the other classifiers with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia

  4. Neuropsychological assessment of individuals with brain tumor: Comparison of approaches used in the classification of impairment

    Directory of Open Access Journals (Sweden)

    Toni Maree Dwan

    2015-03-01

    Full Text Available Approaches to classifying neuropsychological impairment after brain tumor vary according to testing level (individual tests, domains or global index) and source of reference (i.e., norms, controls and premorbid functioning). This study aimed to compare rates of impairment according to different classification approaches. Participants were 44 individuals (57% female) with a primary brain tumor diagnosis (mean age = 45.6 years) and 44 matched control participants (59% female, mean age = 44.5 years). All participants completed a test battery that assesses premorbid IQ (Wechsler Adult Reading Test), attention/processing speed (Digit Span, Trail Making Test A), memory (Hopkins Verbal Learning Test – Revised, Rey-Osterrieth Complex Figure-recall) and executive function (Trail Making Test B, Rey-Osterrieth Complex Figure copy, Controlled Oral Word Association Test). Results indicated that across the different sources of reference, 86-93% of participants were classified as impaired at a test-specific level, 61-73% were classified as impaired at a domain-specific level, and 32-50% were classified as impaired at a global level. Rates of impairment did not significantly differ according to source of reference (p > .05); however, at the individual participant level, classification based on estimated premorbid IQ was often inconsistent with classification based on the norms or controls. Participants with brain tumor performed significantly poorer than matched controls on tests of neuropsychological functioning, including executive function (p = .001) and memory (p < .05). These results highlight the need to examine individuals’ performance across a multi-faceted neuropsychological test battery to avoid over- or under-estimation of impairment.

  5. Improved Wetland Classification Using Eight-Band High Resolution Satellite Imagery and a Hybrid Approach

    Directory of Open Access Journals (Sweden)

    Charles R. Lane

    2014-12-01

    Full Text Available Although remote sensing technology has long been used in wetland inventory and monitoring, the accuracy and detail level of wetland maps derived with moderate resolution imagery and traditional techniques have been limited and often unsatisfactory. We explored and evaluated the utility of a newly launched high-resolution, eight-band satellite system (Worldview-2; WV2) for identifying and classifying freshwater deltaic wetland vegetation and aquatic habitats in the Selenga River Delta of Lake Baikal, Russia, using a hybrid approach and a novel application of Indicator Species Analysis (ISA). We achieved an overall classification accuracy of 86.5% (Kappa coefficient: 0.85) for 22 classes of aquatic and wetland habitats and found that additional metrics, such as the Normalized Difference Vegetation Index and image texture, were valuable for improving the overall classification accuracy and particularly for discriminating among certain habitat classes. Our analysis demonstrated that including WV2’s four spectral bands from parts of the spectrum less commonly used in remote sensing analyses, along with the more traditional bandwidths, contributed to the increase in the overall classification accuracy by ~4% overall, but with considerable increases in our ability to discriminate certain communities. The coastal band improved differentiating open water and aquatic (i.e., vegetated) habitats, and the yellow, red-edge, and near-infrared 2 bands improved discrimination among different vegetated aquatic and terrestrial habitats. The use of ISA provided statistical rigor in developing associations between spectral classes and field-based data. Our analyses demonstrated the utility of a hybrid approach and the benefit of additional bands and metrics in providing the first spatially explicit mapping of a large and heterogeneous wetland system.

  6. Detection and classification of interstitial lung diseases and emphysema using a joint morphological-fuzzy approach

    Science.gov (United States)

    Chang Chien, Kuang-Che; Fetita, Catalin; Brillet, Pierre-Yves; Prêteux, Françoise; Chang, Ruey-Feng

    2009-02-01

    Multi-detector computed tomography (MDCT) has high accuracy and specificity for volumetrically capturing serial images of the lung. It increases the capability of computerized classification for lung tissue in medical research. This paper proposes a three-dimensional (3D) automated approach based on mathematical morphology and fuzzy logic for quantifying and classifying interstitial lung diseases (ILDs) and emphysema. The proposed methodology is composed of several stages: (1) an image multi-resolution decomposition scheme based on a 3D morphological filter is used to detect and analyze the different density patterns of the lung texture. Then, (2) for each pattern in the multi-resolution decomposition, six features are computed, for which fuzzy membership functions define a probability of association with a pathology class. Finally, (3) for each pathology class, the probabilities are combined according to the weight assigned to each membership function, and two threshold values are used to decide the final class of the pattern. The proposed approach was tested on 10 MDCT cases and the classification accuracy was: emphysema: 95%, fibrosis/honeycombing: 84% and ground glass: 97%.

  7. A long-memory model of motor learning in the saccadic system: a regime-switching approach.

    Science.gov (United States)

    Wong, Aaron L; Shelhamer, Mark

    2013-08-01

    Maintenance of movement accuracy relies on motor learning, by which prior errors guide future behavior. One aspect of this learning process involves the accurate generation of predictions of movement outcome. These predictions can, for example, drive anticipatory movements during a predictive-saccade task. Predictive saccades are rapid eye movements made to anticipated future targets based on error information from prior movements. This predictive process exhibits long-memory (fractal) behavior, as suggested by inter-trial fluctuations. Here, we model this learning process using a regime-switching approach, which avoids the computational complexities associated with true long-memory processes. The resulting model demonstrates two fundamental characteristics. First, long-memory behavior can be mimicked by a system possessing no true long-term memory, producing model outputs consistent with human-subjects performance. In contrast, the popular two-state model, which is frequently used in motor learning, cannot replicate these findings. Second, our model suggests that apparent long-term memory arises from the trade-off between correcting for the most recent movement error and maintaining consistent long-term behavior. Thus, the model surprisingly predicts that stronger long-memory behavior correlates to faster learning during adaptation (in which systematic errors drive large behavioral changes); greater apparent long-term memory indicates more effective incorporation of error from the cumulative history across trials.

  8. Land cover classification of Landsat 8 satellite data based on Fuzzy Logic approach

    Science.gov (United States)

    Taufik, Afirah; Sakinah Syed Ahmad, Sharifah

    2016-06-01

    The aim of this paper is to propose a method to classify the land covers of a satellite image based on a fuzzy rule-based system approach. The study uses bands in Landsat 8 and other indices, such as the Normalized Difference Water Index (NDWI), Normalized Difference Built-up Index (NDBI) and Normalized Difference Vegetation Index (NDVI), as input for the fuzzy inference system. The selected three indices represent our main three classes called water, built-up land, and vegetation. The combination of the original multispectral bands and selected indices provides more information about the image. The parameter selection of the fuzzy memberships is performed by using a supervised method known as ANFIS (Adaptive Neuro-Fuzzy Inference System) training. The fuzzy system is tested for the classification on the land cover image that covers the Klang Valley area. The results showed that the fuzzy system approach is effective and can be explored and implemented for other areas with Landsat data.
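
    As a rough illustration of the kind of inputs used above, the sketch below computes NDVI, NDWI and NDBI from synthetic reflectance bands and combines them with simple triangular fuzzy memberships. The membership breakpoints and the use of fixed triangles (instead of the ANFIS-trained memberships the paper describes) are assumptions for illustration only.

```python
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-10)

def ndwi(green, nir):
    return (green - nir) / (green + nir + 1e-10)

def ndbi(swir, nir):
    return (swir - nir) / (swir + nir + 1e-10)

def tri_membership(x, a, b, c):
    """Triangular fuzzy membership rising from a to b and falling to c."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Synthetic reflectance bands standing in for Landsat 8 surface reflectance.
rng = np.random.default_rng(1)
green, red, nir, swir = (rng.uniform(0.05, 0.5, (100, 100)) for _ in range(4))

veg_score = tri_membership(ndvi(nir, red), 0.2, 0.6, 1.0)      # vegetation
water_score = tri_membership(ndwi(green, nir), 0.0, 0.5, 1.0)  # water
built_score = tri_membership(ndbi(swir, nir), 0.0, 0.3, 0.8)   # built-up

# Assign each pixel to the class with the highest membership.
label = np.argmax(np.stack([water_score, built_score, veg_score]), axis=0)
```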

  9. Exchange rate regimes and monetary arrangements

    Directory of Open Access Journals (Sweden)

    Ivan Ribnikar

    2005-06-01

    Full Text Available There is a close relationship between a country’s exchange rate regime and monetary arrangement and if we are to examine monetary arrangements then exchange rate regimes must first be analysed. Within the conventional and most widely used classification of exchange rate regimes into rigid and flexible, or into polar regimes (hard peg and float) on one side and intermediate regimes on the other, there is a much greater variety among intermediate regimes. A more precise and, as will be seen, more useful classification of exchange rate regimes is the first topic of the paper. The second topic is how exchange rate regimes influence or determine monetary arrangements and monetary policy or monetary policy regimes: monetary autonomy versus monetary nonautonomy and discretion in monetary policy versus commitment in monetary policy. Both topics are important for countries on their path to the EU and the euro area.

  10. a Point Cloud Classification Approach Based on Vertical Structures of Ground Objects

    Science.gov (United States)

    Zhao, Y.; Hu, Q.; Hu, W.

    2018-04-01

    This paper proposes a novel method for point cloud classification using vertical structural characteristics of ground objects. Since urbanization develops rapidly nowadays, urban ground objects also change frequently. Conventional photogrammetric methods cannot satisfy the requirements of updating the ground objects' information efficiently, so LiDAR (Light Detection and Ranging) technology is employed to accomplish this task. LiDAR data, namely point cloud data, provide detailed three-dimensional coordinates of ground objects, but this kind of data is discrete and unorganized. To accomplish ground object classification with a point cloud, we first construct horizontal grids and vertical layers to organize the point cloud data, and then calculate vertical characteristics, including density and measures of dispersion, and form characteristic curves for each grid. With the help of PCA processing and the K-means algorithm, we analyze the similarities and differences of the characteristic curves. Curves that have similar features are classified into the same class, and the points corresponding to these curves are classified as well. The whole process is simple but effective, and this approach does not need the assistance of other data sources. In this study, the point cloud data are classified into three classes: vegetation, buildings, and roads. When the horizontal grid spacing and vertical layer spacing are 3 m and 1 m respectively, the vertical characteristic is set as density, and the number of dimensions after PCA processing is 11, the overall precision of the classification result is about 86.31%.
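
    A minimal sketch of the pipeline described above, under simplifying assumptions: points are binned into 3 m horizontal cells and 1 m vertical layers, the per-cell density curves are reduced with PCA and grouped with K-means. The synthetic point cloud and all function names are illustrative, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def vertical_density_curves(points, grid=3.0, layer=1.0, n_layers=30):
    """Bin points into horizontal grid cells and count returns per vertical
    layer, giving one characteristic curve per occupied cell."""
    keys = np.floor(points[:, :2] / grid).astype(int)
    layers = np.clip((points[:, 2] / layer).astype(int), 0, n_layers - 1)
    curves = {}
    for (kx, ky), lz in zip(map(tuple, keys), layers):
        curves.setdefault((kx, ky), np.zeros(n_layers))[lz] += 1
    cells = list(curves)
    mat = np.vstack([curves[c] / curves[c].sum() for c in cells])
    return cells, mat

# Synthetic point cloud: x, y positions with gamma-distributed heights.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(0, 90, 5000),
                       rng.uniform(0, 90, 5000),
                       rng.gamma(2.0, 3.0, 5000)])

cells, curves = vertical_density_curves(pts)
reduced = PCA(n_components=min(11, curves.shape[1])).fit_transform(curves)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(reduced)
```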

  11. Crown-level tree species classification from AISA hyperspectral imagery using an innovative pixel-weighting approach

    Science.gov (United States)

    Liu, Haijian; Wu, Changshan

    2018-06-01

    Crown-level tree species classification is a challenging task due to the spectral similarity among different tree species. Shadow, underlying objects, and other materials within a crown may decrease the purity of extracted crown spectra and further reduce classification accuracy. To address this problem, an innovative pixel-weighting approach was developed for tree species classification at the crown level. The method utilized high density discrete LiDAR data for individual tree delineation and Airborne Imaging Spectrometer for Applications (AISA) hyperspectral imagery for pure crown-scale spectra extraction. Specifically, three steps were included: 1) individual tree identification using LiDAR data, 2) pixel-weighted representative crown spectra calculation using hyperspectral imagery, in which pixel-based illuminated-leaf fractions estimated using a linear spectral mixture analysis (LSMA) were employed as weighting factors, and 3) representative-spectra-based tree species classification performed by applying a support vector machine (SVM) approach. Analysis of the results suggests that the developed pixel-weighting approach (OA = 82.12%, Kc = 0.74) performed better than the treetop-based (OA = 70.86%, Kc = 0.58) and pixel-majority methods (OA = 72.26%, Kc = 0.62) in terms of classification accuracy. McNemar tests indicated that the differences in accuracy between the pixel-weighting and treetop-based approaches, as well as between the pixel-weighting and pixel-majority approaches, were statistically significant.
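
    The core of the pixel-weighting step can be sketched as a weighted average of pixel spectra, with illuminated-leaf fractions (assumed here to be already estimated, e.g. by LSMA) as weights, followed by an SVM on the crown-level spectra. The data, parameters and names below are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

def crown_spectrum(pixel_spectra, leaf_fractions):
    """Pixel-weighted representative spectrum for one crown: pixels with a
    higher illuminated-leaf fraction contribute more to the crown spectrum."""
    w = np.asarray(leaf_fractions, dtype=float)
    w = w / w.sum()
    return w @ np.asarray(pixel_spectra, dtype=float)

# Toy example: 50 crowns, each with 20 pixels of a 100-band spectrum.
rng = np.random.default_rng(0)
X = np.vstack([crown_spectrum(rng.random((20, 100)), rng.random(20))
               for _ in range(50)])
y = rng.integers(0, 3, 50)          # hypothetical species labels

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, y)
print(clf.predict(X[:5]))
```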

  12. Classification of Noisy Data: An Approach Based on Genetic Algorithms and Voronoi Tessellation

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Knudsen, Torben

    Classification is one of the major constituents of the data-mining toolkit. The well-known methods for classification are built on either the principle of logic or statistical/mathematical reasoning for classification. In this article we propose: (1) a different strategy, which is based on the po...

  13. A novel approach to analysing the regimes of temporary streams in relation to their controls on the composition and structure of aquatic biota

    Science.gov (United States)

    Gallart, F.; Prat, N.; García-Roger, E. M.; Latron, J.; Rieradevall, M.; Llorens, P.; Barberá, G. G.; Brito, D.; De Girolamo, A. M.; Lo Porto, A.; Buffagni, A.; Erba, S.; Neves, R.; Nikolaidis, N. P.; Perrin, J. L.; Querner, E. P.; Quiñonero, J. M.; Tournoud, M. G.; Tzoraki, O.; Skoulikidis, N.; Gómez, R.; Sánchez-Montoya, M. M.; Froebrich, J.

    2012-09-01

    Temporary streams are those water courses that undergo the recurrent cessation of flow or the complete drying of their channel. The structure and composition of biological communities in temporary stream reaches are strongly dependent on the temporal changes of the aquatic habitats determined by the hydrological conditions. Therefore, the structural and functional characteristics of aquatic fauna to assess the ecological quality of a temporary stream reach cannot be used without taking into account the controls imposed by the hydrological regime. This paper develops methods for analysing temporary streams' aquatic regimes, based on the definition of six aquatic states that summarize the transient sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: Hyperrheic, Eurheic, Oligorheic, Arheic, Hyporheic and Edaphic. When the hydrological conditions lead to a change in the aquatic state, the structure and composition of the aquatic community changes according to the new set of available habitats. We used the water discharge records from gauging stations or simulations with rainfall-runoff models to infer the temporal patterns of occurrence of these states in the Aquatic States Frequency Graph we developed. The visual analysis of this graph is complemented by the development of two metrics which describe the permanence of flow and the seasonal predictability of zero flow periods. Finally, a classification of temporary streams in four aquatic regimes in terms of their influence over the development of aquatic life is updated from the existing classifications, with stream aquatic regimes defined as Permanent, Temporary-pools, Temporary-dry and Episodic. While aquatic regimes describe the long-term overall variability of the hydrological conditions of the river section and have been used for many years by hydrologists and ecologists, aquatic states describe the availability of mesohabitats in given periods that
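
    As a rough, simplified stand-in for the two metrics mentioned above (permanence of flow and seasonal predictability of zero-flow periods), the sketch below computes the fraction of records with flow and the per-month frequency of zero-flow records from a synthetic discharge series; the exact metric definitions used in the paper may differ.

```python
import numpy as np

def flow_permanence(discharge):
    """Fraction of records with flowing water (discharge above zero)."""
    q = np.asarray(discharge, dtype=float)
    return float(np.mean(q > 0))

def monthly_zero_flow_frequency(discharge, months):
    """Frequency of zero-flow records per calendar month; a strong seasonal
    contrast suggests predictable dry periods."""
    q = np.asarray(discharge, dtype=float)
    m = np.asarray(months)
    return np.array([np.mean(q[m == k] == 0) for k in range(1, 13)])

# Synthetic daily discharge with a recurring dry summer (illustrative only).
rng = np.random.default_rng(0)
months = np.tile(np.repeat(np.arange(1, 13), 30), 5)
q = rng.gamma(2.0, 1.0, months.size)
q[np.isin(months, [7, 8, 9]) & (rng.random(months.size) < 0.7)] = 0.0

print(flow_permanence(q))
print(monthly_zero_flow_frequency(q, months).round(2))
```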

  14. A High Throughput Ambient Mass Spectrometric Approach to Species Identification and Classification from Chemical Fingerprint Signatures

    Science.gov (United States)

    Musah, Rabi A.; Espinoza, Edgard O.; Cody, Robert B.; Lesiak, Ashton D.; Christensen, Earl D.; Moore, Hannah E.; Maleknia, Simin; Drijfhout, Falko P.

    2015-01-01

    A high throughput method for species identification and classification through chemometric processing of direct analysis in real time (DART) mass spectrometry-derived fingerprint signatures has been developed. The method entails introduction of samples to the open air space between the DART ion source and the mass spectrometer inlet, with the entire observed mass spectral fingerprint subjected to unsupervised hierarchical clustering processing. A range of both polar and non-polar chemotypes are instantaneously detected. The result is identification and species-level classification based on the entire DART-MS spectrum. Here, we illustrate how the method can be used to: (1) distinguish between endangered woods regulated by the Convention for the International Trade of Endangered Flora and Fauna (CITES) treaty; (2) assess the origin and by extension the properties of biodiesel feedstocks; (3) determine insect species from analysis of puparial casings; (4) distinguish between psychoactive plant products; and (5) differentiate between Eucalyptus species. An advantage of the hierarchical clustering approach to processing of the DART-MS derived fingerprint is that it shows both similarities and differences between species based on their chemotypes. Furthermore, full knowledge of the identities of the constituents contained within the small molecule profile of analyzed samples is not required. PMID:26156000
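
    A minimal sketch of the unsupervised hierarchical clustering step applied to mass-spectral fingerprints, using synthetic intensity vectors as stand-ins for DART-MS spectra; the distance metric, linkage method and data are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical fingerprints: rows are samples, columns are binned m/z
# intensities (values and bin layout are made up for illustration).
rng = np.random.default_rng(0)
species_a = rng.random((5, 200)) + np.linspace(0, 1, 200)
species_b = rng.random((5, 200)) + np.linspace(1, 0, 200)
fingerprints = np.vstack([species_a, species_b])
fingerprints /= fingerprints.sum(axis=1, keepdims=True)   # normalize spectra

# Unsupervised hierarchical clustering of the full fingerprints.
tree = linkage(pdist(fingerprints, metric="correlation"), method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)   # samples from the two synthetic "species" should separate
```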

  15. Machine Learning Based Classification of Microsatellite Variation: An Effective Approach for Phylogeographic Characterization of Olive Populations.

    Science.gov (United States)

    Torkzaban, Bahareh; Kayvanjoo, Amir Hossein; Ardalan, Arman; Mousavi, Soraya; Mariotti, Roberto; Baldoni, Luciana; Ebrahimie, Esmaeil; Ebrahimi, Mansour; Hosseini-Mazinani, Mehdi

    2015-01-01

    Finding efficient analytical techniques is overwhelmingly turning into a bottleneck for the effectiveness of large biological data. Machine learning offers a novel and powerful tool to advance classification and modeling solutions in molecular biology. However, these methods have been less frequently used with empirical population genetics data. In this study, we developed a new combined approach of data analysis using microsatellite marker data from our previous studies of olive populations using machine learning algorithms. Herein, 267 olive accessions of various origins, including 21 reference cultivars, 132 local ecotypes, and 37 wild olive specimens from the Iranian plateau, together with 77 of the most represented Mediterranean varieties, were investigated using a finely selected panel of 11 microsatellite markers. We organized the data in two '4-targeted' and '16-targeted' experiments. A strategy of assaying different machine-based analyses (i.e. data cleaning, feature selection, and machine learning classification) was devised to identify the most informative loci and the most diagnostic alleles to represent the population and the geography of each olive accession. These analyses revealed the microsatellite markers with the highest differentiating capacity and demonstrated the efficiency of our method for clustering olive accessions according to their regions of origin. A highlight of this study was the discovery of the best combination of markers for better differentiating populations via machine learning models, which can be exploited to distinguish among other biological populations.

  16. An improved discriminative filter bank selection approach for motor imagery EEG signal classification using mutual information.

    Science.gov (United States)

    Kumar, Shiu; Sharma, Alok; Tsunoda, Tatsuhiko

    2017-12-28

    Common spatial pattern (CSP) has been an effective technique for feature extraction in electroencephalography (EEG) based brain computer interfaces (BCIs). However, motor imagery EEG signal feature extraction using CSP generally depends on the selection of the frequency bands to a great extent. In this study, we propose a mutual information based frequency band selection approach. The idea of the proposed method is to utilize the information from all the available channels for effectively selecting the most discriminative filter banks. CSP features are extracted from multiple overlapping sub-bands. An additional sub-band has been introduced that covers the wide frequency band (7-30 Hz), and two different types of features are extracted using CSP and common spatio-spectral pattern techniques, respectively. Mutual information is then computed from the extracted features of each of these bands and the top filter banks are selected for further processing. Linear discriminant analysis is applied to the features extracted from each of the filter banks. The scores are fused together, and classification is done using a support vector machine. The proposed method is evaluated using BCI Competition III dataset IVa, BCI Competition IV dataset I and BCI Competition IV dataset IIb, and it outperformed all other competing methods, achieving the lowest misclassification rate and the highest kappa coefficient on all three datasets. By introducing a wide sub-band and using mutual information for selecting the most discriminative sub-bands, the proposed method shows improvement in motor imagery EEG signal classification.
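
    The band-selection idea can be sketched by scoring each overlapping sub-band with mutual information between its features and the class labels, then keeping the top-ranked bands. For brevity the sketch uses log band-power per channel instead of CSP features, and all data and parameters are synthetic assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.feature_selection import mutual_info_classif

def bandpower_features(trials, fs, band):
    """Log band-power per channel after band-pass filtering (a simple
    stand-in for the CSP features the paper actually uses)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, trials, axis=-1)
    return np.log(np.var(filtered, axis=-1) + 1e-12)

# Synthetic EEG: 80 trials x 22 channels x 500 samples, binary labels.
rng = np.random.default_rng(0)
fs, trials, labels = 250, rng.standard_normal((80, 22, 500)), rng.integers(0, 2, 80)

sub_bands = [(4, 8), (8, 12), (12, 16), (16, 20), (20, 24), (24, 28), (7, 30)]
scores = []
for band in sub_bands:
    feats = bandpower_features(trials, fs, band)
    scores.append(mutual_info_classif(feats, labels, random_state=0).sum())

top = [sub_bands[i] for i in np.argsort(scores)[::-1][:4]]   # keep best bands
print(top)
```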

  17. Classification by a neural network approach applied to non destructive testing

    International Nuclear Information System (INIS)

    Lefevre, M.; Preteux, F.; Lavayssiere, B.

    1995-01-01

    Radiography is used by EDF for pipe inspection in nuclear power plants in order to detect defects. The radiographs obtained are then digitized in a well-defined protocol. The aim of EDF is to develop a non destructive testing system for recognizing defects. In this paper, we describe the procedure for recognizing areas with defects. We first present the digitization protocol, discuss the poor quality of the images under study and propose a procedure to enhance defects. We then examine the problem raised by the choice of good features for classification. After having proved that statistical or standard textural features such as homogeneity, entropy or contrast are not relevant, we develop a geometrical-statistical approach based on the cooperation between a study of signal correlations and an analysis of regional extrema. The principle consists of analysing and comparing, for areas with defects and areas without any defect, the evolution of conditional probability matrices for increasing neighborhood sizes, the shape of variograms and the location of regional minima. We demonstrate that the anisotropy and surface of the series of 'comet tails' associated with the probability matrices, the variogram slopes and statistical indices, and the location of regional extrema are features able to discriminate areas with defects from areas without any. The classification is then realized by a neural network, whose structure, properties and learning mechanisms are detailed. Finally we discuss the results. (authors). 21 refs., 5 figs

  18. Multi-Objective Particle Swarm Optimization Approach for Cost-Based Feature Selection in Classification.

    Science.gov (United States)

    Zhang, Yong; Gong, Dun-Wei; Cheng, Jian

    2017-01-01

    Feature selection is an important data-preprocessing technique in classification problems such as bioinformatics and signal processing. Generally, there are some situations where a user is interested in not only maximizing the classification performance but also minimizing the cost that may be associated with features. This kind of problem is called cost-based feature selection. However, most existing feature selection approaches treat this task as a single-objective optimization problem. This paper presents the first study of multi-objective particle swarm optimization (PSO) for cost-based feature selection problems. The task of this paper is to generate a Pareto front of nondominated solutions, that is, feature subsets, to meet different requirements of decision-makers in real-world applications. In order to enhance the search capability of the proposed algorithm, a probability-based encoding technology and an effective hybrid operator, together with the ideas of the crowding distance, the external archive, and the Pareto domination relationship, are applied to PSO. The proposed PSO-based multi-objective feature selection algorithm is compared with several multi-objective feature selection algorithms on five benchmark datasets. Experimental results show that the proposed algorithm can automatically evolve a set of nondominated solutions, and it is a highly competitive feature selection method for solving cost-based feature selection problems.
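
    Central to any multi-objective feature selection method is the Pareto-dominance test used to keep nondominated (error, cost) trade-offs. The sketch below shows that test on hypothetical candidate subsets; it is not the paper's PSO algorithm, only the dominance filter it relies on.

```python
import numpy as np

def nondominated(points):
    """Return a boolean mask of Pareto-optimal rows, assuming every objective
    (here classification error and feature cost) is to be minimized."""
    pts = np.asarray(points, dtype=float)
    mask = np.ones(len(pts), dtype=bool)
    for i, p in enumerate(pts):
        dominated = np.all(pts <= p, axis=1) & np.any(pts < p, axis=1)
        if dominated.any():
            mask[i] = False
    return mask

# Hypothetical candidate feature subsets evaluated as (error rate, total cost).
candidates = np.array([[0.10, 8.0], [0.12, 5.0], [0.20, 2.0],
                       [0.15, 6.0], [0.11, 9.0]])
front = candidates[nondominated(candidates)]
print(front)   # the nondominated (error, cost) trade-offs
```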

  19. Systemic classification for a new diagnostic approach to acute abdominal pain in children.

    Science.gov (United States)

    Kim, Ji Hoi; Kang, Hyun Sik; Han, Kyung Hee; Kim, Seung Hyo; Shin, Kyung-Sue; Lee, Mu Suk; Jeong, In Ho; Kim, Young Sil; Kang, Ki-Soo

    2014-12-01

    With previous methods based on only age and location, there are many difficulties in identifying the etiology of acute abdominal pain in children. We sought to develop a new systematic classification of acute abdominal pain and to give some help to physicians encountering difficulties in diagnoses. From March 2005 to May 2010, clinical data were collected retrospectively from 442 children hospitalized due to acute abdominal pain with no apparent underlying disease. According to the final diagnoses, diseases that caused acute abdominal pain were classified into nine groups. The nine groups were group I "catastrophic surgical abdomen" (7 patients, 1.6%), group II "acute appendicitis and mesenteric lymphadenitis" (56 patients, 12.7%), group III "intestinal obstruction" (57 patients, 12.9%), group IV "viral and bacterial acute gastroenteritis" (90 patients, 20.4%), group V "peptic ulcer and gastroduodenitis" (66 patients, 14.9%), group VI "hepatobiliary and pancreatic disease" (14 patients, 3.2%), group VII "febrile viral illness and extraintestinal infection" (69 patients, 15.6%), group VIII "functional gastrointestinal disorder (acute manifestation)" (20 patients, 4.5%), and group IX "unclassified acute abdominal pain" (63 patients, 14.3%). Four patients were enrolled in two disease groups each. Patients were distributed unevenly across the nine groups of acute abdominal pain. In particular, the "unclassified acute abdominal pain" group alone was not uncommon. Considering a systemic classification for acute abdominal pain may be helpful in the diagnostic approach in children.

  20. Flow regimes

    International Nuclear Information System (INIS)

    Liles, D.R.

    1982-01-01

    Internal boundaries in multiphase flow greatly complicate fluid-dynamic and heat-transfer descriptions. Different flow regimes or topological configurations can have radically dissimilar interfacial and wall mass, momentum, and energy exchanges. To model the flow dynamics properly requires estimates of these rates. In this paper the common flow regimes for gas-liquid systems are defined and the techniques used to estimate the extent of a particular regime are described. Also, the current computer-code procedures are delineated and a potentially better method is introduced.

  1. A NOVEL APPROACH TO ARRHYTHMIA CLASSIFICATION USING RR INTERVAL AND TEAGER ENERGY

    Directory of Open Access Journals (Sweden)

    CHANDRAKAR KAMATH

    2012-12-01

    Full Text Available It is hypothesized that a key characteristic of the electrocardiogram (ECG) signal is its nonlinear dynamic behaviour and that the nonlinear component changes more significantly between normal and arrhythmia conditions than the linear component. The usual statistical descriptors used in RR (R to R) interval analysis do not capture the nonlinear disposition of RR interval variability. In this paper we explore a novel approach to extract the features from the nonlinear component of the RR interval signal using the Teager energy operator (TEO). The key feature of Teager energy is that it models the energy of the source that generated the signal rather than the energy of the signal itself. Hence any deviations in the regular rhythmic activity of the heart get reflected in the Teager energy function. The classification, evaluated on the MIT-BIH database with the RR interval and the mean Teager energy computed over the RR interval as features, exhibits an average accuracy that exceeds 99.79%.
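
    The Teager energy operator itself is simple to compute; the sketch below applies the standard discrete form to a made-up RR-interval series and derives the two features mentioned above (RR interval and mean Teager energy). Values are illustrative only.

```python
import numpy as np

def teager_energy(x):
    """Discrete Teager energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
    x = np.asarray(x, dtype=float)
    return x[1:-1] ** 2 - x[:-2] * x[2:]

# Hypothetical RR-interval series in seconds (values are made up).
rr = np.array([0.80, 0.82, 0.79, 0.81, 0.60, 1.10, 0.78, 0.80])
psi = teager_energy(rr)
features = [rr.mean(), psi.mean()]   # the two features used for classification
print(features)
```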

  2. Use of a Novel Grammatical Inference Approach in Classification of Amyloidogenic Hexapeptides

    Directory of Open Access Journals (Sweden)

    Wojciech Wieczorek

    2016-01-01

    Full Text Available The present paper is a novel contribution to the field of bioinformatics by using grammatical inference in the analysis of data. We developed an algorithm for generating star-free regular expressions which turned out to be good recommendation tools, as they are characterized by a relatively high correlation coefficient between the observed and predicted binary classifications. The experiments have been performed for three datasets of amyloidogenic hexapeptides, and our results are compared with those obtained using the graph approaches, the current state-of-the-art methods in heuristic automata induction, and the support vector machine. The results showed the superior performance of the new grammatical inference algorithm on fixed-length amyloid datasets.

  3. Automatic detection of photoresist residual layer in lithography using a neural classification approach

    KAUST Repository

    Gereige, Issam

    2012-09-01

    Photolithography is a fundamental process in the semiconductor industry and it is considered as the key element towards extreme nanoscale integration. In this technique, a polymer photo sensitive mask with the desired patterns is created on the substrate to be etched. Roughly speaking, the areas to be etched are not covered with polymer. Thus, no residual layer should remain on these areas in order to ensure an optimal transfer of the patterns on the substrate. In this paper, we propose a nondestructive method based on a classification approach achieved by an artificial neural network for automatic residual layer detection from an ellipsometric signature. Only the case of a regular defect, i.e. a homogeneous residual layer, will be considered. The limitations of the method will be discussed. Then, an experimental result on a 400 nm period grating manufactured with nanoimprint lithography is analyzed with our method. © 2012 Elsevier B.V. All rights reserved.

  4. Classification of lung sounds using higher-order statistics: A divide-and-conquer approach.

    Science.gov (United States)

    Naves, Raphael; Barbosa, Bruno H G; Ferreira, Danton D

    2016-06-01

    Lung sound auscultation is one of the most commonly used methods to evaluate respiratory diseases. However, the effectiveness of this method depends on the physician's training. If the physician does not have the proper training, he/she will be unable to distinguish between normal and abnormal sounds generated by the human body. Thus, the aim of this study was to implement a pattern recognition system to classify lung sounds. We used a dataset composed of five types of lung sounds: normal, coarse crackle, fine crackle, monophonic and polyphonic wheezes. We used higher-order statistics (HOS) to extract features (second-, third- and fourth-order cumulants), Genetic Algorithms (GA) and Fisher's Discriminant Ratio (FDR) to reduce dimensionality, and k-Nearest Neighbors and Naive Bayes classifiers to recognize the lung sound events in a tree-based system. We used the cross-validation procedure to analyze the classifiers' performance and Tukey's Honestly Significant Difference criterion to compare the results. Our results showed that the Genetic Algorithms outperformed Fisher's Discriminant Ratio for feature selection. Moreover, each lung class had a different signature pattern according to their cumulants, showing that HOS is a promising feature extraction tool for lung sounds. Besides, the proposed divide-and-conquer approach can accurately classify different types of lung sounds. The best tree-based classifier achieved a classification accuracy of 98.1% on training data and 94.6% on validation data. The proposed approach achieved good results even using only one feature extraction tool (higher-order statistics). Additionally, the implementation of the proposed classifier in an embedded system is feasible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
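
    A minimal sketch of the HOS feature extraction step: second-, third- and fourth-order cumulant estimates of a sound segment, computed here with SciPy's k-statistics on synthetic signals standing in for normal and crackle-like sounds. The segmentation, feature selection and classifiers described above are omitted.

```python
import numpy as np
from scipy.stats import kstat

def hos_features(signal):
    """Second-, third- and fourth-order cumulant estimates of a lung-sound
    segment (k-statistics as unbiased cumulant estimators)."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    return np.array([kstat(x, n) for n in (2, 3, 4)])

# Synthetic stand-ins for a normal breath sound and a crackle-like burst.
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, 4000)
crackle = normal.copy()
crackle[1000:1020] += 8.0        # short transient raises higher-order cumulants

print(hos_features(normal))
print(hos_features(crackle))
```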

  5. Web Approach for Ontology-Based Classification, Integration, and Interdisciplinary Usage of Geoscience Metadata

    Directory of Open Access Journals (Sweden)

    B Ritschel

    2012-10-01

    Full Text Available The Semantic Web is a W3C approach that integrates the different sources of semantics within documents and services using ontology-based techniques. The main objective of this approach in the geoscience domain is the improvement of understanding, integration, and usage of Earth and space science related web content in terms of data, information, and knowledge for machines and people. The modeling and representation of semantic attributes and relations within and among documents can be realized by human readable concept maps and machine readable OWL documents. The objectives for the usage of the Semantic Web approach in the GFZ data center ISDC project are the design of an extended classification of metadata documents for product types related to instruments, platforms, and projects as well as the integration of different types of metadata related to data product providers, users, and data centers. Sources of content and semantics for the description of Earth and space science product types and related classes are standardized metadata documents (e.g., DIF documents), publications, grey literature, and Web pages. Other sources are information provided by users, such as tagging data and social navigation information. The integration of controlled vocabularies as well as folksonomies plays an important role in the design of well-formed ontologies.

  6. Classification of boreal forest by satellite and inventory data using neural network approach

    Science.gov (United States)

    Romanov, A. A.

    2012-12-01

    The main objective of this research was to develop a methodology for boreal (Siberian taiga) land cover classification with a high level of accuracy. The study area covers several parts of Central Siberia along the Yenisei River (60-62 degrees north latitude): the right bank includes mixed forest and dark taiga, the left bank pine forests; these were taken as highly heterogeneous but statistically comparable surfaces in terms of spectral characteristics. Two main types of data were used: time series of middle spatial resolution satellite images (Landsat 5, 7 and SPOT4) and inventory datasets from field work (used for the preparation of training sample sets). The field data collection method included a short botanical description (type/species of vegetation, density, crown compactness, representative individual height and max/min diameters for each type, and surface altitude of the plot); at the same time, the geometric extent of each training sample unit corresponded to the spatial resolution of the satellite images and was geo-referenced (datasets were prepared both for preliminary processing and for verification). The network of test plots was planned as irregular and determined by a landscape-oriented approach. The thematic data processing focused mainly on the use of neural networks (including fuzzy logic); the field study results were therefore converted into input parameters describing the type/species of vegetation cover of each unit and its degree of variability. The proposed approach processes the time series separately for each image, mainly for verification: acquisition parameters (time, albedo) are taken into consideration and are thus expected to help assess the quality of the mapping. The input variables for the networks were the sensor bands, surface altitude, solar angles and land surface temperature (for a few experiments); attention was also given to forming the class formula on the basis of statistical pre-processing of results of

  7. Totalitäre Regimes

    OpenAIRE

    Merkel, Wolfgang

    2004-01-01

    "The development of the term and the analytical concept of totalitarianism have gone through several stages since the 1920s. However, even in its most sophisticated form, the version seen in Friedrich/ Brzezinski, the concept exhibits substantial systematic classification problems and analytical weaknesses. This article attempts to frame the type of totalitarian regime within a general typology of political regimes. Special attention is dedicated to the problem of distinguishing autocra...

  8. Chronic Total Occlusion Crossing Approach Based on Plaque Cap Morphology: The CTOP Classification.

    Science.gov (United States)

    Saab, Fadi; Jaff, Michael R; Diaz-Sandoval, Larry J; Engen, Gwennan D; McGoff, Theresa N; Adams, George; Al-Dadah, Ashraf; Goodney, Philip P; Khawaja, Farhan; Mustapha, Jihad A

    2018-02-01

    To present the chronic total occlusion (CTO) crossing approach based on plaque cap morphology (CTOP) classification system and assess its ability to predict successful lesion crossing. A retrospective analysis was conducted of imaging and procedure data from 114 consecutive symptomatic patients (mean age 69±11 years; 84 men) with claudication (Rutherford category 3) or critical limb ischemia (Rutherford category 4-6) who underwent endovascular interventions for 142 CTOs. CTO cap morphology was determined from a review of angiography and duplex ultrasonography and classified into 4 types (I, II, III, or IV) based on the concave or convex shape of the proximal and distal caps. Statistically significant differences among groups were found in patients with rest pain, lesion length, and severe calcification. CTOP type II CTOs were most common and type III lesions the least common. Type I CTOs were most likely to be crossed antegrade and had a lower incidence of severe calcification. Type IV lesions were more likely to be crossed retrograde from a tibiopedal approach. CTOP type IV was least likely to be crossed in an antegrade fashion. Access conversion, or the need for an alternate access, was commonly seen in types II, III, and IV lesions. Distinctive predictors of access conversion were CTO types II and III, lesion length, and severe calcification. CTOP type I lesions were easiest to cross in antegrade fashion and type IV the most difficult. Lesion length >10 cm, severe calcification, and CTO types II, III, and IV benefited from the addition of retrograde tibiopedal access.

  9. An Effective Big Data Supervised Imbalanced Classification Approach for Ortholog Detection in Related Yeast Species

    Directory of Open Access Journals (Sweden)

    Deborah Galpert

    2015-01-01

    Full Text Available Orthology detection requires more effective scaling algorithms. In this paper, a set of gene pair features based on similarity measures (alignment scores, sequence length, gene membership to conserved regions, and physicochemical profiles) are combined in a supervised pairwise ortholog detection approach to improve effectiveness considering low ortholog ratios in relation to the possible pairwise comparison between two genomes. In this scenario, big data supervised classifiers managing imbalance between ortholog and nonortholog pair classes allow for an effective scaling solution built from two genomes and extended to other genome pairs. The supervised approach was compared with RBH, RSD, and OMA algorithms by using the following yeast genome pairs: Saccharomyces cerevisiae-Kluyveromyces lactis, Saccharomyces cerevisiae-Candida glabrata, and Saccharomyces cerevisiae-Schizosaccharomyces pombe as benchmark datasets. Because of the large amount of imbalanced data, the building and testing of the supervised model were only possible by using big data supervised classifiers managing imbalance. Evaluation metrics taking low ortholog ratios into account were applied. From the effectiveness perspective, MapReduce Random Oversampling combined with Spark SVM outperformed RBH, RSD, and OMA, probably because of the consideration of gene pair features beyond alignment similarities combined with the advances in big data supervised classification.

  10. Assessment of Sampling Approaches for Remote Sensing Image Classification in the Iranian Playa Margins

    Science.gov (United States)

    Kazem Alavipanah, Seyed

    There are some problems in soil salinity studies based upon remotely sensed data: (1) the spectral world is full of ambiguity and therefore soil reflectance cannot be attributed to a single soil property such as salinity; (2) soil surface conditions, as a function of time and space, are a complex phenomenon; (3) vegetation, with a dynamic biological nature, may create some problems in the study of soil salinity. Due to these problems, the first question which may arise is how to overcome or minimise them. In this study we hypothesised that different sources of data, a well-established sampling plan and an optimum approach could be useful. In order to choose representative training sites in the Iranian playa margins, to define the spectral and informational classes and to overcome some problems encountered in the variation within the field, the following attempts were made: 1) Principal Component Analysis (PCA) in order a) to determine the most important variables and b) to understand the Landsat satellite images and the most informative components; 2) photomorphic unit (PMU) consideration and interpretation; 3) study of salt accumulation and salt distribution in the soil profile; 4) use of several forms of field data, such as geologic, geomorphologic and soil information; 6) confirmation of field data and land cover types with farmers and the members of the team. The results led us to suitable approaches with a high and acceptable image classification accuracy and image interpretation. KEY WORDS: Photomorphic Unit, Principal Component Analysis, Soil Salinity, Field Work, Remote Sensing

  11. In silico prediction of ROCK II inhibitors by different classification approaches.

    Science.gov (United States)

    Cai, Chuipu; Wu, Qihui; Luo, Yunxia; Ma, Huili; Shen, Jiangang; Zhang, Yongbin; Yang, Lei; Chen, Yunbo; Wen, Zehuai; Wang, Qi

    2017-11-01

    ROCK II is an important pharmacological target linked to central nervous system disorders such as Alzheimer's disease. The purpose of this research is to generate ROCK II inhibitor prediction models by machine learning approaches. Firstly, four sets of descriptors were calculated with MOE 2010 and PaDEL-Descriptor, and optimized by F-score and linear forward selection methods. In addition, four classification algorithms were used to initially build 16 classifiers with k-nearest neighbors (k-NN), naïve Bayes, Random forest, and support vector machine. Furthermore, three sets of structural fingerprint descriptors were introduced to enhance the predictive capacity of the classifiers, which were assessed with fivefold cross-validation, test set validation and external test set validation. The best two models, MFK + MACCS and MLR + SubFP, both have MCC values of 0.925 on the external test set. After that, a privileged substructure analysis was performed to reveal common chemical features of ROCK II inhibitors. Finally, binding modes were analyzed to identify relationships between molecular descriptors and activity, while the main interactions were revealed by comparing the docking interactions of the most potent and the weakest ROCK II inhibitors. To the best of our knowledge, this is the first report on ROCK II inhibitors utilizing machine learning approaches, and it provides a new method for discovering novel ROCK II inhibitors.

  12. Pattern Recognition Approaches for Breast Cancer DCE-MRI Classification: A Systematic Review.

    Science.gov (United States)

    Fusco, Roberta; Sansone, Mario; Filice, Salvatore; Carone, Guglielmo; Amato, Daniela Maria; Sansone, Carlo; Petrillo, Antonella

    2016-01-01

    We performed a systematic review of several pattern analysis approaches for classifying breast lesions using dynamic, morphological, and textural features in dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). Several machine learning approaches, namely artificial neural networks (ANN), support vector machines (SVM), linear discriminant analysis (LDA), tree-based classifiers (TC), and Bayesian classifiers (BC), and features used for classification are described. The findings of a systematic review of 26 studies are presented. The sensitivity and specificity are respectively 91 and 83 % for ANN, 85 and 82 % for SVM, 96 and 85 % for LDA, 92 and 87 % for TC, and 82 and 85 % for BC. The sensitivity and specificity are respectively 82 and 74 % for dynamic features, 93 and 60 % for morphological features, 88 and 81 % for textural features, 95 and 86 % for a combination of dynamic and morphological features, and 88 and 84 % for a combination of dynamic, morphological, and other features. LDA and TC have the best performance. A combination of dynamic and morphological features gives the best performance.

  13. Uncovering the benefits of fluctuating thermal regimes on cold tolerance of drosophila flies by combined metabolomic and lipidomic approach

    Czech Academy of Sciences Publication Activity Database

    Colinet, H.; Renault, D.; Javal, M.; Berková, Petra; Šimek, Petr; Košťál, Vladimír

    2016-01-01

    Roč. 1861, č. 11 (2016), s. 1736-1745 ISSN 1388-1981 R&D Projects: GA ČR GA13-18509S Institutional support: RVO:60077344 Keywords : cold stress * fluctuating thermal regimes * recovery Subject RIV: ED - Physiology Impact factor: 5.547, year: 2016 http://www.sciencedirect.com/science/article/pii/S1388198116302281

  14. Buildings classification from airborne LiDAR point clouds through OBIA and ontology driven approach

    Science.gov (United States)

    Tomljenovic, Ivan; Belgiu, Mariana; Lampoltshammer, Thomas J.

    2013-04-01

    In the last years, airborne Light Detection and Ranging (LiDAR) data proved to be a valuable information resource for a vast number of applications ranging from land cover mapping to individual surface feature extraction from complex urban environments. To extract information from LiDAR data, users apply prior knowledge. Unfortunately, there is no consistent initiative for structuring this knowledge into data models that can be shared and reused across different applications and domains. The absence of such models poses great challenges to data interpretation, data fusion and integration as well as information transferability. The intention of this work is to describe the design, development and deployment of an ontology-based system to classify buildings from airborne LiDAR data. The novelty of this approach consists of the development of a domain ontology that specifies explicitly the knowledge used to extract features from airborne LiDAR data. The overall goal of this approach is to investigate the possibility for classification of features of interest from LiDAR data by means of domain ontology. The proposed workflow is applied to the building extraction process for the region of "Biberach an der Riss" in South Germany. Strip-adjusted and georeferenced airborne LiDAR data is processed based on geometrical and radiometric signatures stored within the point cloud. Region-growing segmentation algorithms are applied and segmented regions are exported to the GeoJSON format. Subsequently, the data is imported into the ontology-based reasoning process used to automatically classify exported features of interest. Based on the ontology it becomes possible to define domain concepts, associated properties and relations. As a consequence, the resulting specific body of knowledge restricts possible interpretation variants. Moreover, ontologies are machinable and thus it is possible to run reasoning on top of them. Available reasoners (FACT++, JESS, Pellet) are used to check

  15. Classification of malignant and benign liver tumors using a radiomics approach

    Science.gov (United States)

    Starmans, Martijn P. A.; Miclea, Razvan L.; van der Voort, Sebastian R.; Niessen, Wiro J.; Thomeer, Maarten G.; Klein, Stefan

    2018-03-01

    Correct diagnosis of the liver tumor phenotype is crucial for treatment planning, especially the distinction between malignant and benign lesions. Clinical practice includes manual scoring of the tumors on Magnetic Resonance (MR) images by a radiologist. As this is challenging and subjective, it is often followed by a biopsy. In this study, we propose a radiomics approach as an objective and non-invasive alternative for distinguishing between malignant and benign phenotypes. T2-weighted (T2w) MR sequences of 119 patients from multiple centers were collected. We developed an efficient semi-automatic segmentation method, which was used by a radiologist to delineate the tumors. Within these regions, features quantifying tumor shape, intensity, texture, heterogeneity and orientation were extracted. Patient characteristics and semantic features were added for a total of 424 features. Classification was performed using Support Vector Machines (SVMs). The performance was evaluated using internal random-split cross-validation. On the training set within each iteration, feature selection and hyperparameter optimization were performed. To this end, another cross-validation was performed by splitting the training sets into training and validation parts. The optimal settings were evaluated on the independent test sets. Manual scoring by a radiologist was also performed. The radiomics approach resulted in 95% confidence intervals of the AUC of [0.75, 0.92], specificity [0.76, 0.96] and sensitivity [0.52, 0.82]. These approach the performance of the radiologist, who achieved an AUC of 0.93, a specificity of 0.70 and a sensitivity of 0.93. Hence, radiomics has the potential to predict liver tumor benignity in an objective and non-invasive manner.
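
    The evaluation scheme described above (feature selection and hyperparameter tuning inside the training split, performance estimated on held-out splits) corresponds to a nested cross-validation; a generic sketch with synthetic data is given below. Feature counts, grids and scorer are assumptions, not the authors' exact configuration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import GridSearchCV, cross_val_score

# Hypothetical radiomics table: 119 lesions x 424 features, binary label.
rng = np.random.default_rng(0)
X, y = rng.standard_normal((119, 424)), rng.integers(0, 2, 119)

pipe = make_pipeline(StandardScaler(),
                     SelectKBest(f_classif, k=20),
                     SVC(kernel="rbf"))
grid = GridSearchCV(pipe, {"svc__C": [0.1, 1, 10],
                           "svc__gamma": ["scale", 0.01]},
                    cv=5, scoring="roc_auc")

# Outer loop estimates performance; inner grid search tunes hyperparameters.
auc = cross_val_score(grid, X, y, cv=5, scoring="roc_auc")
print(auc.mean())
```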

  16. An Examination of the Nature of Global MODIS Cloud Regimes

    Science.gov (United States)

    Oreopoulos, Lazaros; Cho, Nayeong; Lee, Dongmin; Kato, Seiji; Huffman, George J.

    2014-01-01

    We introduce global cloud regimes (previously also referred to as "weather states") derived from cloud retrievals that use measurements by the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard the Aqua and Terra satellites. The regimes are obtained by applying clustering analysis on joint histograms of retrieved cloud top pressure and cloud optical thickness. By employing a compositing approach on data sets from satellites and other sources, we examine regime structural and thermodynamical characteristics. We establish that the MODIS cloud regimes tend to form in distinct dynamical and thermodynamical environments and have diverse profiles of cloud fraction and water content. When compositing radiative fluxes from the Clouds and the Earth's Radiant Energy System instrument and surface precipitation from the Global Precipitation Climatology Project, we find that regimes with a radiative warming effect on the atmosphere also produce the largest implied latent heat. Taken as a whole, the results of the study corroborate the usefulness of the cloud regime concept, reaffirm the fundamental nature of the regimes as appropriate building blocks for cloud system classification, clarify their association with standard cloud types, and underscore their distinct radiative and hydrological signatures.
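
    The regime-derivation step can be sketched as k-means clustering of flattened joint histograms of cloud-top pressure and optical thickness; the histogram binning, number of clusters and synthetic data below are illustrative assumptions rather than the MODIS processing actually used.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic joint histograms of cloud-top pressure (7 bins) x optical
# thickness (6 bins); each grid cell contributes one 42-element histogram.
rng = np.random.default_rng(0)
histograms = rng.random((5000, 7, 6))
histograms /= histograms.sum(axis=(1, 2), keepdims=True)    # cloud fractions

X = histograms.reshape(len(histograms), -1)                  # flatten to vectors
km = KMeans(n_clusters=8, n_init=20, random_state=0).fit(X)

regimes = km.labels_                                 # regime of each grid cell
centroids = km.cluster_centers_.reshape(-1, 7, 6)    # mean histogram per regime
```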

  17. An alternative approach to the determination of scaling law expressions for the L–H transition in Tokamaks utilizing classification tools instead of regression

    International Nuclear Information System (INIS)

    Gaudio, P; Gelfusa, M; Lupelli, I; Murari, A; Vega, J

    2014-01-01

    A new approach to determine the power law expressions for the threshold between the H and L mode of confinement is presented. The method is based on two powerful machine learning tools for classification: neural networks and support vector machines. Using as inputs clear examples of the systems on either side of the transition, the machine learning tools learn the input–output mapping corresponding to the equations of the boundary separating the confinement regimes. Systematic tests with synthetic data show that the machine learning tools provide results competitive with traditional statistical regression and more robust against random noise and systematic errors. The developed tools have then been applied to the multi-machine International Tokamak Physics Activity International Global Threshold Database of validated ITER-like Tokamak discharges. The machine learning tools converge on the same scaling law parameters obtained with non-linear regression. On the other hand, the developed tools allow a reduction of 50% of the uncertainty in the extrapolations to ITER. Therefore the proposed approach can effectively complement traditional regression since its application poses much less stringent requirements on the experimental data, to be used to determine the scaling laws, because they do not require examples exactly at the moment of the transition. (paper)
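
    The key idea, that a linear decision boundary fitted to log-transformed plasma parameters is equivalent to a power-law threshold, can be sketched on synthetic data: the boundary w·log x + b = 0 rearranges to a product of powers, so the fitted coefficients give the scaling-law exponents. The variables, the assumed "true" scaling and the classifier choice below are illustrative only.

```python
import numpy as np
from sklearn.svm import LinearSVC

# Synthetic discharges: density, magnetic field, surface area (arbitrary
# units); the assumed "true" threshold is P = 0.05 * n^0.7 * B^0.8 * S.
rng = np.random.default_rng(0)
n, B, S = rng.uniform(1, 10, (3, 2000))
P = rng.uniform(0.01, 5, 2000)
h_mode = P > 0.05 * n**0.7 * B**0.8 * S

X = np.log(np.column_stack([P, n, B, S]))
clf = LinearSVC(C=10.0, max_iter=20000).fit(X, h_mode)

w = clf.coef_[0]
exponents = -w[1:] / w[0]    # exponents of n, B, S in the threshold scaling law
print(exponents)             # should be roughly [0.7, 0.8, 1.0]
```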

  18. A hierarchical approach of hybrid image classification for land use and land cover mapping

    Directory of Open Access Journals (Sweden)

    Rahdari Vahid

    2018-01-01

    Full Text Available Remote sensing data analysis can provide thematic maps describing land-use and land-cover (LULC) in a short period. Using a proper image classification method in an area is important to overcome the possible limitations of satellite imagery for producing land-use and land-cover maps. In the present study, a hierarchical hybrid image classification method was used to produce LULC maps using Landsat Thematic Mapper (TM) imagery for the year 1998 and Operational Land Imager (OLI) imagery for the year 2016. Images were classified using the proposed hybrid image classification method, a vegetation crown cover percentage map from the normalized difference vegetation index, Fisher supervised classification and object-based image classification methods. Accuracy assessment results showed that the hybrid classification method produced maps with a total accuracy of up to 84 percent and a kappa statistic of 0.81. Results of this study showed that the proposed classification method worked better with the OLI sensor than with TM. Although OLI has a higher radiometric resolution than TM, the LULC map produced using TM is almost as accurate as the OLI map, because of the LULC definitions and image classification methods used.

  19. Building and Solving Odd-One-Out Classification Problems: A Systematic Approach

    Science.gov (United States)

    Ruiz, Philippe E.

    2011-01-01

    Classification problems ("find the odd-one-out") are frequently used as tests of inductive reasoning to evaluate human or animal intelligence. This paper introduces a systematic method for building the set of all possible classification problems, followed by a simple algorithm for solving the problems of the R-ASCM, a psychometric test derived…

  20. Effective Exchange Rate Classifications and Growth

    OpenAIRE

    Justin M. Dubas; Byung-Joo Lee; Nelson C. Mark

    2005-01-01

    We propose an econometric procedure for obtaining de facto exchange rate regime classifications which we apply to study the relationship between exchange rate regimes and economic growth. Our classification method models the de jure regimes as outcomes of a multinomial logit choice problem conditional on the volatility of a country's effective exchange rate, a bilateral exchange rate and international reserves. An `effective' de facto exchange rate regime classification is then obtained by as...

  1. A machine learning approach to galaxy-LSS classification - I. Imprints on halo merger trees

    Science.gov (United States)

    Hui, Jianan; Aragon, Miguel; Cui, Xinping; Flegal, James M.

    2018-04-01

    The cosmic web plays a major role in the formation and evolution of galaxies and defines, to a large extent, their properties. However, the relation between galaxies and environment is still not well understood. Here, we present a machine learning approach to study imprints of environmental effects on the mass assembly of haloes. We present a galaxy-LSS machine learning classifier based on galaxy properties sensitive to the environment. We then use the classifier to assess the relevance of each property. Correlations between galaxy properties and their cosmic environment can be used to predict galaxy membership to void/wall or filament/cluster with an accuracy of 93 per cent. Our study unveils environmental information encoded in properties of haloes not normally considered directly dependent on the cosmic environment such as merger history and complexity. Understanding the physical mechanism by which the cosmic web is imprinted in a halo can lead to significant improvements in galaxy formation models. This is accomplished by extracting features from galaxy properties and merger trees, computing feature scores for each feature and then applying support vector machine (SVM) to different feature sets. To this end, we have discovered that the shape and depth of the merger tree, formation time, and density of the galaxy are strongly associated with the cosmic environment. We describe a significant improvement in the original classification algorithm by performing LU decomposition of the distance matrix computed by the feature vectors and then using the output of the decomposition as input vectors for SVM.

  2. STRUCTURE AND DYNAMICS OF BOREAL ECOSYSTEMS: ANOTHER APPROACH TO LANDSAT IMAGERY CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    P. Litinsky

    2017-01-01

    Full Text Available An alternative approach to information extraction from Landsat TM/ETM+ imagery is proposed. It involves transforming the image space into a visible 3D form and comparing the location in this space of segments of ecosystem types with a graphically expressed typology of forest and mire cover (a biogeocenotic scheme). The model is built in the LC1-LC2-MSI axes (the two first principal components of the image matrix in logarithmic form and the moisture stress index). Compared to Tasseled Cap, this transformation is more suitable for the study area (the north taiga zone of Eastern Fennoscandia). The spectral segments of mature and old-growth forests line up from the ecological optimum (moraine hills) along two main environmental gradients: (i) lack of water and nutrition (fluvioglacial sand bedrock) and (ii) degree of paludification (lacustrine plains). Thus, the biogeocenotic complexes are identified. The succession trajectories of forest regeneration through the spectral space are also associated with the type of Quaternary deposits. For mire ecosystems, the spectral classes accurately reflect the type of water and mineral nutrition (ombrotrophic or mesotrophic). A spectral space model created using physical ecosystem characteristics measured by the scanner can be the basis for developing an objective classification of boreal ecosystems, where one of the most significant clustering criteria is the position in the spectral space.

  3. Classifying Human Activity Patterns from Smartphone Collected GPS data: a Fuzzy Classification and Aggregation Approach.

    Science.gov (United States)

    Wan, Neng; Lin, Ge

    2016-12-01

    Smartphones have emerged as a promising type of equipment for monitoring human activities in environmental health studies. However, degraded location accuracy and inconsistency of smartphone-measured GPS data have limited its effectiveness for classifying human activity patterns. This study proposes a fuzzy classification scheme for differentiating human activity patterns from smartphone-collected GPS data. Specifically, fuzzy logic reasoning was adopted to overcome the influence of location uncertainty by estimating the probability of different activity types for single GPS points. Based on that approach, a segment aggregation method was developed to infer activity patterns while adjusting for uncertainties of point attributes. Validations of the proposed methods were carried out based on a convenience sample of three subjects with different types of smartphones. The results indicate desirable accuracy (e.g., up to 96% in activity identification) with use of this method. Two examples were provided in the appendix to illustrate how the proposed methods could be applied in environmental health studies. Researchers could tailor this scheme to fit a variety of research topics.
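
    A minimal sketch of the per-point fuzzy step described above: trapezoidal memberships over point speed give the probability of each activity type for a single GPS fix. The breakpoints and activity set are illustrative assumptions, not the paper's calibrated rules.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: rises a->b, flat b->c, falls c->d."""
    x = np.asarray(x, dtype=float)
    up = np.clip((x - a) / (b - a + 1e-12), 0, 1)
    down = np.clip((d - x) / (d - c + 1e-12), 0, 1)
    return np.minimum(up, down)

def activity_memberships(speed_mps):
    """Per-point membership in three activity types; breakpoints are
    illustrative assumptions, not calibrated values."""
    return {
        "stationary": trapezoid(speed_mps, -1.0, 0.0, 0.3, 0.8),
        "walking":    trapezoid(speed_mps, 0.3, 0.8, 2.0, 3.0),
        "vehicle":    trapezoid(speed_mps, 2.0, 4.0, 40.0, 60.0),
    }

speeds = np.array([0.1, 1.2, 2.5, 15.0])        # m/s from consecutive GPS fixes
memberships = activity_memberships(speeds)
print({k: np.round(v, 2) for k, v in memberships.items()})
```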

  4. An Approach to Orbital Image Classification for the Assessment of Potato Plantation Areas

    Directory of Open Access Journals (Sweden)

    Vassiliki Terezinha Galvão Boulomytis

    2013-12-01

    Full Text Available In the city of Bueno Brandão, in the south of Minas Gerais State, Brazil, the watershed of Rio das Antas lies upstream of the public water supply and is susceptible to hydro-degradation due to the intensive agricultural activities developed in the area. Potato plantation is the most significant cropping activity in the city. Because of the possibility of interference with the preservation areas, mainly those surrounding water courses and springs, it is very important to assess the plantation sites in order to avoid the risk of water contamination. The procedures adopted by the farmers generally present the following features: intensive use of agro-chemicals and cropping on slopes steeper than 20%, close to or within permanent preservation areas. The scope of this study was to develop a suitable methodology for the assessment of the plantation areas within a short processing time, as the period between planting and harvest is at most six months. These areas vary from year to year, as the plantation sites often change due to land degradation. Because of that, geotechnologies are recommended to detect the plantation areas through the use of satellite images and accurate data processing. Considering the availability of LANDSAT medium resolution images, methods for their appropriate classification were explored to provide effective target detection.

  5. Idiopathic interstitial pneumonias and emphysema: detection and classification using a texture-discriminative approach

    Science.gov (United States)

    Fetita, C.; Chang-Chien, K. C.; Brillet, P. Y.; Prêteux, F.; Chang, R. F.

    2012-03-01

    Our study aims at developing a computer-aided diagnosis (CAD) system for fully automatic detection and classification of pathological lung parenchyma patterns in idiopathic interstitial pneumonias (IIP) and emphysema using multi-detector computed tomography (MDCT). The proposed CAD system is based on three-dimensional (3-D) mathematical morphology, texture and fuzzy logic analysis, and can be divided into four stages: (1) a multi-resolution decomposition scheme based on a 3-D morphological filter was exploited to discriminate the lung region patterns at different analysis scales. (2) An additional spatial lung partitioning based on the lung tissue texture was introduced to reinforce the spatial separation between patterns extracted at the same resolution level in the decomposition pyramid. Then, (3) a hierarchic tree structure was exploited to describe the relationship between patterns at different resolution levels, and for each pattern, six fuzzy membership functions were established for assigning a probability of association with a normal tissue or a pathological target. Finally, (4) a decision step exploiting the fuzzy-logic assignments selects the target class of each lung pattern among the following categories: normal (N), emphysema (EM), fibrosis/honeycombing (FHC), and ground glass (GDG). According to a preliminary evaluation on an extended database, the proposed method can overcome the drawbacks of a previously developed approach and achieve higher sensitivity and specificity.

  6. Comparison of Electrocardiogram Signals in Men and Women during Creativity with Classification Approaches

    Directory of Open Access Journals (Sweden)

    Sahar ZAKERI

    2016-07-01

    Full Text Available Electrocardiogram (ECG) analysis is often used as a valuable tool in the evaluation of cognitive tasks. By taking and analyzing measurements in vast quantities, researchers are working toward a better understanding of how human physiological systems work. For the first time, this study investigated the function of the cardiovascular system during creative thinking. In addition, the differences between male/female and normal/creativity states in ECG signals were investigated. Overall, the purpose of this paper was to characterize how the heart works during creativity and to identify creative male and female subjects. For these goals, six nonlinear features of the ECG signal were extracted to detect creativity states. During the three tasks of the Torrance Tests of Creative Thinking (TTCT, Figural Form B), ECG signals were recorded from 52 participants (26 men and 26 women). Then, the performance of two classification approaches was evaluated: Artificial Neural Network (ANN) and Support Vector Machine (SVM). The results indicated high discrimination accuracy between male/female (96.09%) and normal/creativity states (95.84%) using the ANN classifier. Therefore, the proposed method can be useful for detecting creativity states.
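    The record does not include implementation details; as a rough illustration of the comparison step, the Python sketch below cross-validates an ANN (scikit-learn MLPClassifier) and an SVM on a placeholder feature matrix standing in for the six nonlinear ECG features. The data and parameter choices are assumptions for demonstration only.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Placeholder feature matrix: 52 subjects x 6 nonlinear ECG features,
# with binary labels (e.g. normal vs. creativity state).
X = rng.normal(size=(52, 6))
y = rng.integers(0, 2, size=52)

ann = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

for name, clf in [("ANN", ann), ("SVM", svm)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()   # 5-fold cross-validated accuracy
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```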

  7. Comparison of two approaches for the classification of 16S rRNA gene sequences.

    Science.gov (United States)

    Chatellier, Sonia; Mugnier, Nathalie; Allard, Françoise; Bonnaud, Bertrand; Collin, Valérie; van Belkum, Alex; Veyrieras, Jean-Baptiste; Emler, Stefan

    2014-10-01

    The use of 16S rRNA gene sequences for microbial identification in clinical microbiology is accepted widely, and requires databases and algorithms. We compared a new research database containing curated 16S rRNA gene sequences in combination with the lca (lowest common ancestor) algorithm (RDB-LCA) to a commercially available 16S rDNA Centroid approach. We used 1025 bacterial isolates characterized by biochemistry, matrix-assisted laser desorption/ionization time-of-flight MS and 16S rDNA sequencing. Nearly 80 % of isolates were identified unambiguously at the species level by both classification platforms used. The remaining isolates were mostly identified correctly at the genus level due to the limited resolution of 16S rDNA sequencing. Discrepancies between both 16S rDNA platforms were due to differences in database content and the algorithm used, and could amount to up to 10.5 %. Up to 1.4 % of the analyses were found to be inconclusive. It is important to realize that despite the overall good performance of the pipelines for analysis, some inconclusive results remain that require additional in-depth analysis performed using supplementary methods. © 2014 The Authors.

  8. Competition Regime

    Directory of Open Access Journals (Sweden)

    Danilo Icaza Ortiz

    2013-01-01

    Full Text Available This paper reviews works on the competition regime by various authors, published under the auspices of the University of the Hemispheres and the Corporation for Studies and Publications. It analyzes their structure, general concepts, and the case law used in their development. It also includes comments on the usefulness of these works for the study of competition law and on their contribution to lawyers who want to practice in this branch of economic law.

  9. A hybrid gene selection approach for microarray data classification using cellular learning automata and ant colony optimization.

    Science.gov (United States)

    Vafaee Sharbaf, Fatemeh; Mosafer, Sara; Moattar, Mohammad Hossein

    2016-06-01

    This paper proposes an approach for gene selection in microarray data. The proposed approach consists of a primary filter approach using the Fisher criterion, which reduces the initial genes and hence the search space and time complexity. Then, a wrapper approach based on cellular learning automata (CLA) optimized with the ant colony method (ACO) is used to find the set of features which improve the classification accuracy. CLA is applied due to its capability to learn and model complicated relationships. The selected features from the last phase are evaluated using the ROC curve, and the smallest and most effective feature subset is determined. The classifiers evaluated in the proposed framework are K-nearest neighbor, support vector machine and naïve Bayes. The proposed approach is evaluated on 4 microarray datasets. The evaluations confirm that the proposed approach can find the smallest subset of genes while approaching the maximum accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
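    The sketch below illustrates only the primary filter stage described above: computing a Fisher criterion per gene on a toy two-class matrix and keeping the top-scoring genes. The CLA/ACO wrapper stage is not reproduced; the synthetic data and the cut-off of 20 genes are arbitrary assumptions.

```python
import numpy as np

def fisher_score(X, y):
    """Fisher criterion per feature for a binary labelling: squared difference
    of class means divided by the sum of class variances."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0) + 1e-12
    return num / den

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 500))        # toy "microarray": 60 samples x 500 genes
y = rng.integers(0, 2, size=60)
X[y == 1, :5] += 2.0                  # make the first 5 genes informative

scores = fisher_score(X, y)
top = np.argsort(scores)[::-1][:20]   # keep the 20 highest-scoring genes
print("highest-scoring genes include:", sorted(top[:5]))
```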

  10. Feature Selection as a Time and Cost-Saving Approach for Land Suitability Classification (Case Study of Shavur Plain, Iran

    Directory of Open Access Journals (Sweden)

    Saeid Hamzeh

    2016-10-01

    Full Text Available Land suitability classification is important in planning and managing sustainable land use. Most approaches to land suitability analysis combine a large number of land and soil parameters, and are time-consuming and costly. In this study, a potentially useful technique (combined feature selection and fuzzy-AHP method) to increase the efficiency of land suitability analysis was presented. To this end, three different feature selection algorithms—random search, best search and genetic methods—were used to determine the most effective parameters for land suitability classification for the cultivation of barley in the Shavur Plain, southwest Iran. Next, land suitability classes were calculated for all methods by using the fuzzy-AHP approach. Salinity (electrical conductivity, EC), alkalinity (exchangeable sodium percentage, ESP), wetness and soil texture were selected using the random search method. Gypsum, EC, ESP, and soil texture were selected using both the best search and genetic methods. The results show a strong agreement between the standard fuzzy-AHP method and the methods presented in this study. The values of the Kappa coefficients were 0.82, 0.79 and 0.79 for the random search, best search and genetic methods, respectively, compared with the standard fuzzy-AHP method. Our results indicate that EC, ESP, soil texture and wetness are the most effective features for evaluating land suitability for the cultivation of barley in the study area, and use of these parameters, together with their appropriate weights as obtained from fuzzy-AHP, can produce good results for land suitability classification. Thus, the presented combination of feature selection and the fuzzy-AHP approach has the potential to save time and money in land suitability classification.

  11. Poster abstract: A machine learning approach for vehicle classification using passive infrared and ultrasonic sensors

    KAUST Repository

    Warriach, Ehsan Ullah; Claudel, Christian G.

    2013-01-01

    This article describes the implementation of four different machine learning techniques for vehicle classification in a dual ultrasonic/passive infrared traffic flow sensors. Using k-NN, Naive Bayes, SVM and KNN-SVM algorithms, we show that KNN-SVM significantly outperforms other algorithms in terms of classification accuracy. We also show that some of these algorithms could run in real time on the prototype system. Copyright © 2013 ACM.

  12. Post engineered nanomaterials lifespan: nanowastes classification, legislative development/implementation challenges, and proactive approaches

    CSIR Research Space (South Africa)

    Musee, N

    2012-05-01

    Full Text Available -1 NANOLCA Symposium, "Safety issues and regulatory challenges of nanomaterials", San Sebastian, Spain, 3-4 May 2012 Post engineered nanomaterials lifespan: nanowastes classification, legislative development/implementation challenges, and proactive...

  13. An Integrated Approach to Battery Health Monitoring using Bayesian Regression, Classification and State Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — The application of the Bayesian theory of managing uncertainty and complexity to regression and classification in the form of Relevance Vector Machine (RVM), and to...

  14. Poster abstract: A machine learning approach for vehicle classification using passive infrared and ultrasonic sensors

    KAUST Repository

    Warriach, Ehsan Ullah

    2013-01-01

    This article describes the implementation of four different machine learning techniques for vehicle classification in a dual ultrasonic/passive infrared traffic flow sensors. Using k-NN, Naive Bayes, SVM and KNN-SVM algorithms, we show that KNN-SVM significantly outperforms other algorithms in terms of classification accuracy. We also show that some of these algorithms could run in real time on the prototype system. Copyright © 2013 ACM.
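    A minimal Python sketch of the comparison described in this record is given below, cross-validating k-NN, naive Bayes and SVM classifiers on placeholder sensor features; the hybrid KNN-SVM variant and the real ultrasonic/PIR features are not reproduced here.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

rng = np.random.default_rng(2)
# Placeholder features standing in for ultrasonic height-profile statistics and
# PIR pulse widths, with three hypothetical vehicle classes.
X = rng.normal(size=(300, 8))
y = rng.integers(0, 3, size=300)

for name, clf in [("k-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("Naive Bayes", GaussianNB()),
                  ("SVM", SVC(kernel="rbf"))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()   # 5-fold cross-validated accuracy
    print(f"{name}: {acc:.3f}")
```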

  15. Approaches to the classification of brands of professional football clubs in the system of sportive marketing

    OpenAIRE

    Romat, E.; Ostroverh, S.

    2014-01-01

    This article discusses methods of classifying the brands of professional football clubs within the system of sports marketing, under the overall commercial conditions of the most popular sport, football. It also discusses the importance of the brand of a professional football club as an active means of increasing the trade value of a multi-functional enterprise in the sports sector that operates on a business model. Criteria are proposed, and the essence of the classification is described, which is used to...

  16. Chronic subdural hematoma: Surgical management and outcome in 986 cases: A classification and regression tree approach

    Science.gov (United States)

    Rovlias, Aristedis; Theodoropoulos, Spyridon; Papoutsakis, Dimitrios

    2015-01-01

    Background: Chronic subdural hematoma (CSDH) is one of the most common clinical entities in daily neurosurgical practice which carries a most favorable prognosis. However, because of the advanced age and medical problems of patients, surgical therapy is frequently associated with various complications. This study evaluated the clinical features, radiological findings, and neurological outcome in a large series of patients with CSDH. Methods: A classification and regression tree (CART) technique was employed in the analysis of data from 986 patients who were operated on at Asclepeion General Hospital of Athens from January 1986 to December 2011. Burr hole evacuation with closed-system drainage has been the operative technique of first choice at our institution for 29 consecutive years. A total of 27 prognostic factors were examined to predict the outcome at 3 months postoperatively. Results: Our results indicated that neurological status on admission was the best predictor of outcome. With regard to the other data, age, brain atrophy, thickness and density of hematoma, subdural accumulation of air, and antiplatelet and anticoagulant therapy were found to correlate significantly with prognosis. The overall cross-validated predictive accuracy of the CART model was 85.34%, with a cross-validated relative error of 0.326. Conclusions: Methodologically, the CART technique is quite different from the more commonly used methods, with the primary benefit of illustrating the important prognostic variables as related to outcome. Since the ideal therapy for the treatment of CSDH is still under debate, this technique may prove useful in developing new therapeutic strategies and approaches for patients with CSDH. PMID:26257985
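    The CART workflow itself can be sketched in a few lines; the example below fits a depth-limited decision tree (scikit-learn's CART implementation) to a synthetic stand-in for the 27 prognostic variables and reports a cross-validated accuracy, purely to illustrate the workflow rather than the clinical model.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)
# Placeholder data: 27 admission variables (age, neurological status, hematoma
# thickness, ...) and a binary 3-month outcome; real clinical data are required.
X = rng.normal(size=(986, 27))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=986) > 0).astype(int)

cart = DecisionTreeClassifier(max_depth=4, min_samples_leaf=30, random_state=0)
print("cross-validated accuracy:", cross_val_score(cart, X, y, cv=10).mean().round(3))
# Print the first few splits of the fitted tree to inspect the prognostic variables.
print(export_text(cart.fit(X, y), feature_names=[f"var{i}" for i in range(27)])[:300])
```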

  17. Automated, high accuracy classification of Parkinsonian disorders: a pattern recognition approach.

    Directory of Open Access Journals (Sweden)

    Andre F Marquand

    Full Text Available Progressive supranuclear palsy (PSP), multiple system atrophy (MSA) and idiopathic Parkinson's disease (IPD) can be clinically indistinguishable, especially in the early stages, despite distinct patterns of molecular pathology. Structural neuroimaging holds promise for providing objective biomarkers for discriminating these diseases at the single subject level but all studies to date have reported incomplete separation of disease groups. In this study, we employed multi-class pattern recognition to assess the value of anatomical patterns derived from a widely available structural neuroimaging sequence for automated classification of these disorders. To achieve this, 17 patients with PSP, 14 with IPD and 19 with MSA were scanned using structural MRI along with 19 healthy controls (HCs). An advanced probabilistic pattern recognition approach was employed to evaluate the diagnostic value of several pre-defined anatomical patterns for discriminating the disorders, including: (i) a subcortical motor network; (ii) each of its component regions and (iii) the whole brain. All disease groups could be discriminated simultaneously with high accuracy using the subcortical motor network. The region providing the most accurate predictions overall was the midbrain/brainstem, which discriminated all disease groups from one another and from HCs. The subcortical network also produced more accurate predictions than the whole brain and all of its constituent regions. PSP was accurately predicted from the midbrain/brainstem, cerebellum and all basal ganglia compartments; MSA from the midbrain/brainstem and cerebellum and IPD from the midbrain/brainstem only. This study demonstrates that automated analysis of structural MRI can accurately predict diagnosis in individual patients with Parkinsonian disorders, and identifies distinct patterns of regional atrophy particularly useful for this process.

  18. EEG sensorimotor rhythms' variation and functional connectivity measures during motor imagery: linear relations and classification approaches.

    Science.gov (United States)

    Stefano Filho, Carlos A; Attux, Romis; Castellano, Gabriela

    2017-01-01

    Hands motor imagery (MI) has been reported to alter synchronization patterns amongst neurons, yielding variations in the mu and beta bands' power spectral density (PSD) of the electroencephalography (EEG) signal. These alterations have been used in the field of brain-computer interfaces (BCI), in an attempt to assign distinct MI tasks to commands of such a system. Recent studies have highlighted that information may be missing if knowledge about brain functional connectivity is not considered. In this work, we modeled the brain as a graph in which each EEG electrode represents a node. Our goal was to understand if there exists any linear correlation between variations in the synchronization patterns-that is, variations in the PSD of mu and beta bands-induced by MI and alterations in the corresponding functional networks. Moreover, we (1) explored the feasibility of using functional connectivity parameters as features for a classifier in the context of an MI-BCI; (2) investigated three different types of feature selection (FS) techniques; and (3) compared our approach to a more traditional method using the signal PSD as classifier inputs. Ten healthy subjects participated in this study. We observed significant correlations ( p  < 0.05) with values ranging from 0.4 to 0.9 between PSD variations and functional network alterations for some electrodes, prominently in the beta band. The PSD method performed better for data classification, with mean accuracies of (90 ± 8)% and (87 ± 7)% for the mu and beta band, respectively, versus (83 ± 8)% and (83 ± 7)% for the same bands for the graph method. Moreover, the number of features for the graph method was considerably larger. However, results for both methods were relatively close, and even overlapped when the uncertainties of the accuracy rates were considered. Further investigation regarding a careful exploration of other graph metrics may provide better alternatives.

  19. BClass: A Bayesian Approach Based on Mixture Models for Clustering and Classification of Heterogeneous Biological Data

    Directory of Open Access Journals (Sweden)

    Arturo Medrano-Soto

    2004-12-01

    Full Text Available Based on mixture models, we present a Bayesian method (called BClass) to classify biological entities (e.g. genes) when variables of quite heterogeneous nature are analyzed. Various statistical distributions are used to model the continuous/categorical data commonly produced by genetic experiments and large-scale genomic projects. We calculate the posterior probability of each entry to belong to each element (group) in the mixture. In this way, an original set of heterogeneous variables is transformed into a set of purely homogeneous characteristics represented by the probabilities of each entry to belong to the groups. The number of groups in the analysis is controlled dynamically by rendering the groups as 'alive' and 'dormant' depending upon the number of entities classified within them. Using standard Metropolis-Hastings and Gibbs sampling algorithms, we constructed a sampler to approximate posterior moments and grouping probabilities. Since this method does not require the definition of similarity measures, it is especially suitable for data mining and knowledge discovery in biological databases. We applied BClass to classify genes in RegulonDB, a database specialized in information about the transcriptional regulation of gene expression in the bacterium Escherichia coli. The classification obtained is consistent with current knowledge and allowed prediction of missing values for a number of genes. BClass is object-oriented and fully programmed in Lisp-Stat. The output grouping probabilities are analyzed and interpreted using graphical (dynamically linked) plots and query-based approaches. We discuss the advantages of using Lisp-Stat as a programming language as well as the problems we faced when the data volume increased exponentially due to the ever-growing number of genomic projects.

  20. International Food Regime

    Directory of Open Access Journals (Sweden)

    A. V. Malov

    2018-01-01

    Full Text Available The review article reveals the content of the concept of Food Regime, which is little known in Russian academic literature. The author monitored and codified the semantic dynamics of the term from its original interpretations to modern formulations based on a retrospective analysis. The rehabilitation of the academic merits of D. Puchala and R. Hopkins — authors who used the concept of Food Regime several years before its universally recognized origin and official scientific debut, was accomplished with the help of historical and comparative methods. The author implemented the method of ascension from the abstract to the concrete to demonstrate the classification of Food Regimes compiled on the basis of geopolitical interests in the sphere of international production, consumption, and distribution of foodstuffs. The characteristic features of historically formed Food Regimes were described in chronological order, and modern tendencies possessing reformist potential were identified. In particular, it has been established that the idea of Food Sovereignty (which is an alternative to the modern Corporate Food Regime) is the subject of acute academic disputes. The discussion between P. McMichael and H. Bernstein devoted to the "peasant question", the mobilization frame of the Food Sovereignty strategy, was analyzed using the secondary data processing method. Based on this critical analysis, the author concludes that it is necessary to follow the principles of the Food Sovereignty strategy in order to prevent the catastrophic prospects associated with ecosystem degradation, accelerated soil erosion, the complete disappearance of biodiversity and corporate autocracy. The author is convinced that the idea of Food Sovereignty can ward off energetic liberalization of nature, intensive privatization of life and rapid monetization of unconditioned human reflexes.

  1. Two-Stage Classification Approach for Human Detection in Camera Video in Bulk Ports

    Directory of Open Access Journals (Sweden)

    Mi Chao

    2015-09-01

    Full Text Available With the development of automation in ports, video surveillance systems with automated human detection have begun to be applied in open-air handling operation areas for safety and security. The accuracy of traditional human detection based on video cameras is not high enough to meet the requirements of operation surveillance. One of the key reasons is that the Histograms of Oriented Gradients (HOG) features of the human body differ greatly between front-and-back standing (F&B) and side standing (Side) postures. Therefore, when HOG features are extracted directly from samples of different postures, the trained classifier gains only a few useful posture-specific features that contribute to classification, which are insufficient to support effective classification. This paper proposes a two-stage classification method to improve the accuracy of human detection. In the first stage, a preprocessing classification mainly divides images into possible F&B human bodies and non-F&B bodies, which are then passed to a second-stage classification that distinguishes side human bodies from non-humans. The experimental results in Tianjin port show that the two-stage classifier clearly improves the classification accuracy of human detection.

  2. Classification of semiurban landscapes from very high-resolution satellite images using a regionalized multiscale segmentation approach

    Science.gov (United States)

    Kavzoglu, Taskin; Erdemir, Merve Yildiz; Tonbul, Hasan

    2017-07-01

    In object-based image analysis, obtaining representative image objects is an important prerequisite for a successful image classification. The major threat is the issue of scale selection due to the complex spatial structure of landscapes portrayed as an image. This study proposes a two-stage approach to conduct regionalized multiscale segmentation. In the first stage, an initial high-level segmentation is applied through a "broadscale," and a set of image objects characterizing natural borders of the landscape features are extracted. Contiguous objects are then merged to create regions by considering their normalized difference vegetation index resemblance. In the second stage, optimal scale values are estimated for the extracted regions, and multiresolution segmentation is applied with these settings. Two satellite images with different spatial and spectral resolutions were utilized to test the effectiveness of the proposed approach and its transferability to different geographical sites. Results were compared to those of image-based single-scale segmentation and it was found that the proposed approach outperformed the single-scale segmentations. Using the proposed methodology, significant improvement in terms of segmentation quality and classification accuracy (up to 5%) was achieved. In addition, the highest classification accuracies were produced using fine-scale values.

  3. Physiotherapy movement based classification approaches to low back pain: comparison of subgroups through review and developer/expert survey

    Directory of Open Access Journals (Sweden)

    Karayannis Nicholas V

    2012-02-01

    Full Text Available Abstract Background Several classification schemes, each with its own philosophy and categorizing method, subgroup low back pain (LBP) patients with the intent to guide treatment. Physiotherapy derived schemes usually have a movement impairment focus, but the extent to which other biological, psychological, and social factors of pain are encompassed requires exploration. Furthermore, within the prevailing 'biological' domain, the overlap of subgrouping strategies within the orthopaedic examination remains unexplored. The aim of this study was "to review and clarify through developer/expert survey, the theoretical basis and content of physical movement classification schemes, determine their relative reliability and similarities/differences, and to consider the extent of incorporation of the bio-psycho-social framework within the schemes". Methods A database search for relevant articles related to LBP and subgrouping or classification was conducted. Five dominant movement-based schemes were identified: Mechanical Diagnosis and Treatment (MDT), Treatment Based Classification (TBC), Pathoanatomic Based Classification (PBC), Movement System Impairment Classification (MSI), and O'Sullivan Classification System (OCS) schemes. Data were extracted and a survey sent to the classification scheme developers/experts to clarify operational criteria, reliability, decision-making, and converging/diverging elements between schemes. Survey results were integrated into the review and approval obtained for accuracy. Results Considerable diversity exists between schemes in how movement informs subgrouping and in the consideration of broader neurosensory, cognitive, emotional, and behavioural dimensions of LBP. Despite differences in assessment philosophy, a common element lies in their objective to identify a movement pattern related to a pain reduction strategy. Two dominant movement paradigms emerge: (i) loading strategies (MDT, TBC, PBC) aimed at eliciting a phenomenon

  4. A new approach to very short term wind speed prediction using k-nearest neighbor classification

    International Nuclear Information System (INIS)

    Yesilbudak, Mehmet; Sagiroglu, Seref; Colak, Ilhami

    2013-01-01

    Highlights: ► The wind speed parameter was predicted from n-tupled inputs using k-NN classification. ► The effects of input parameters, nearest neighbors and distance metrics were analyzed. ► Many useful and reasonable inferences were uncovered using the developed model. - Abstract: Wind energy is an inexhaustible energy source and wind power production has been growing rapidly in recent years. However, wind power has a non-schedulable nature due to wind speed variations. Hence, wind speed prediction is an indispensable requirement for power system operators. This paper predicts the wind speed parameter from n-tupled inputs using k-nearest neighbor (k-NN) classification and analyzes the effects of input parameters, nearest neighbors and distance metrics on wind speed prediction. The k-NN classification model was developed using object-oriented programming techniques and, unlike much of the literature, includes the Manhattan and Minkowski distance metrics in addition to the Euclidean distance metric. The k-NN classification model which uses wind direction, air temperature, atmospheric pressure and relative humidity parameters in a 4-tupled input space achieved the best wind speed prediction for k = 5 with the Manhattan distance metric. In contrast, the k-NN classification model which uses wind direction, air temperature and atmospheric pressure parameters in a 3-tupled input space gave the worst wind speed prediction for k = 1 with the Minkowski distance metric
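    As a sketch of how the choice of k and distance metric can be explored, the Python example below runs k-NN on synthetic 4-tupled inputs with Euclidean, Manhattan and Minkowski (p = 3) metrics; k-NN regression stands in for the paper's k-NN classification, and all data and parameter values are assumptions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(4)
# Synthetic 4-tupled inputs: wind direction, air temperature, pressure, humidity.
X = rng.normal(size=(1000, 4))
y = 2.0 + 0.8 * X[:, 0] - 0.3 * X[:, 2] + rng.normal(scale=0.3, size=1000)  # "wind speed"

for metric, kwargs in [("euclidean", {}), ("manhattan", {}), ("minkowski", {"p": 3})]:
    for k in (1, 5, 11):
        knn = KNeighborsRegressor(n_neighbors=k, metric=metric, **kwargs)
        mae = -cross_val_score(knn, X, y, cv=5,
                               scoring="neg_mean_absolute_error").mean()
        print(f"metric={metric:9s} k={k:2d} MAE={mae:.3f}")
```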

  5. DEFLATE Compression Algorithm Corrects for Overestimation of Phylogenetic Diversity by Grantham Approach to Single-Nucleotide Polymorphism Classification

    Directory of Open Access Journals (Sweden)

    Arran Schlosberg

    2014-05-01

    Full Text Available Improvements in speed and cost of genome sequencing are resulting in increasing numbers of novel non-synonymous single nucleotide polymorphisms (nsSNPs) in genes known to be associated with disease. The large number of nsSNPs makes laboratory-based classification infeasible and familial co-segregation with disease is not always possible. In-silico methods for classification or triage are thus utilised. A popular tool based on multiple-species sequence alignments (MSAs) and work by Grantham, Align-GVGD, has been shown to underestimate deleterious effects, particularly as sequence numbers increase. We utilised the DEFLATE compression algorithm to account for expected variation across a number of species. With the adjusted Grantham measure we derived a means of quantitatively clustering known neutral and deleterious nsSNPs from the same gene; this was then used to assign novel variants to the most appropriate cluster as a means of binary classification. Scaling of clusters allows for inter-gene comparison of variants through a single pathogenicity score. The approach improves upon the classification accuracy of Align-GVGD while correcting for sensitivity to large MSAs. Open-source code and a web server are made available at https://github.com/aschlosberg/CompressGV.
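    The core idea of using DEFLATE to account for expected variation can be illustrated with Python's standard zlib module: the compressed length of an alignment column is lower when the column is conserved. This sketch is only a rough stand-in for the published adjustment of the Grantham measure.

```python
import zlib

def compressed_variation(column_residues: str) -> float:
    """Crude measure of expected variation in one MSA column: the DEFLATE (zlib)
    compressed length relative to the raw length. Highly conserved columns
    compress well (low ratio); variable columns do not."""
    raw = column_residues.encode("ascii")
    return len(zlib.compress(raw, 9)) / len(raw)

conserved = "L" * 40                                        # same residue in 40 species
variable = "LIVMFWYACDEGHKNPQRSTLIVMFWYACDEGHKNPQRST"       # 40 varied residues
print("conserved column ratio:", round(compressed_variation(conserved), 2))
print("variable column ratio: ", round(compressed_variation(variable), 2))
```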

  6. Automated Classification of Radiology Reports for Acute Lung Injury: Comparison of Keyword and Machine Learning Based Natural Language Processing Approaches.

    Science.gov (United States)

    Solti, Imre; Cooke, Colin R; Xia, Fei; Wurfel, Mark M

    2009-11-01

    This paper compares the performance of keyword and machine learning-based chest x-ray report classification for Acute Lung Injury (ALI). ALI mortality is approximately 30 percent. High mortality is, in part, a consequence of delayed manual chest x-ray classification. An automated system could reduce the time to recognize ALI and lead to reductions in mortality. For our study, 96 and 857 chest x-ray reports in two corpora were labeled by domain experts for ALI. We developed a keyword and a Maximum Entropy-based classification system. Word unigram and character n-grams provided the features for the machine learning system. The Maximum Entropy algorithm with character 6-gram achieved the highest performance (Recall=0.91, Precision=0.90 and F-measure=0.91) on the 857-report corpus. This study has shown that for the classification of ALI chest x-ray reports, the machine learning approach is superior to the keyword based system and achieves comparable results to highest performing physician annotators.
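    A maximum entropy classifier over character 6-grams can be approximated with multinomial logistic regression and a character n-gram vectorizer; the toy snippets and labels below are invented placeholders, not the study's report corpora.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy report snippets with hypothetical ALI labels (1 = ALI, 0 = not ALI).
reports = ["bilateral diffuse infiltrates consistent with ali",
           "clear lungs no acute cardiopulmonary process",
           "patchy bilateral airspace opacities",
           "no focal consolidation or effusion"]
labels = [1, 0, 1, 0]

maxent = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(6, 6)),  # character 6-gram features
    LogisticRegression(max_iter=1000),                     # maximum entropy model
)
maxent.fit(reports, labels)
print(maxent.predict(["diffuse bilateral infiltrates noted"]))
```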

  7. An Object-Based Classification of Mangroves Using a Hybrid Decision Tree—Support Vector Machine Approach

    Directory of Open Access Journals (Sweden)

    Benjamin W. Heumann

    2011-11-01

    Full Text Available Mangroves provide valuable ecosystem goods and services such as carbon sequestration, habitat for terrestrial and marine fauna, and coastal hazard mitigation. The use of satellite remote sensing to map mangroves has become widespread as it can provide accurate, efficient, and repeatable assessments. Traditional remote sensing approaches have failed to accurately map fringe mangroves and true mangrove species due to relatively coarse spatial resolution and/or spectral confusion with landward vegetation. This study demonstrates the use of the new Worldview-2 sensor, object-based image analysis (OBIA), and support vector machine (SVM) classification to overcome both of these limitations. An exploratory spectral separability analysis showed that individual mangrove species could not be spectrally separated, but a distinction between true and associate mangrove species could be made. An OBIA classification was used that combined a decision-tree classification with the machine-learning SVM classification. Results showed an overall accuracy greater than 94% (kappa = 0.863) for classifying true mangrove species and other dense coastal vegetation at the object level. There remain serious challenges to accurately mapping fringe mangroves using remote sensing data due to spectral similarity of mangrove and associate species, lack of clear zonation between species, and mixed pixel effects, especially when vegetation is sparse or degraded.

  8. An algebraic approach towards the classification of 2 dimensional conformal field theories

    International Nuclear Information System (INIS)

    Bouwknegt, P.G.

    1988-01-01

    This thesis treats an algebraic method for the construction of 2-dimensional conformal field theories. The method consists of the study of the representation theory of the Virasoro algebra and suitable extensions of it. The classification of 2-dimensional conformal field theories is translated into the classification of combinations of representations which satisfy certain consistency conditions (unitarity and modular invariance). For a certain class of 2-dimensional field theories, namely the one with central charge c = 1 from the theory of Kac-Moody algebras, there exist indications, but as yet mainly hope, that this construction will finally lead to a classification of 2-dimensional conformal field theories. 182 refs.; 2 figs.; 26 tabs

  9. An approach toward a combined scheme for the petrographic classification of fly ash: Revision and clarification

    Science.gov (United States)

    Hower, J.C.; Suarez-Ruiz, I.; Mastalerz, Maria

    2005-01-01

    Hower and Mastalerz's classification scheme for fly ash is modified to make it more widely applicable. First, proper consideration is given to the potential role of anthracite in the development of isotropic and anisotropic chars. Second, the role of low-reflectance inertinite in producing vesicular chars is noted. It is shown that non-coal components in the fuel can produce chars that stretch the limits of the classification. With care, it is possible to classify certain biomass chars as being distinct from coal-derived chars.

  10. Multi-fluid Approach to High-frequency Waves in Plasmas. III. Nonlinear Regime and Plasma Heating

    Science.gov (United States)

    Martínez-Gómez, David; Soler, Roberto; Terradas, Jaume

    2018-03-01

    The multi-fluid modeling of high-frequency waves in partially ionized plasmas has shown that the behavior of magnetohydrodynamic waves in the linear regime is heavily influenced by the collisional interaction between the different species that form the plasma. Here, we go beyond linear theory and study large-amplitude waves in partially ionized plasmas using a nonlinear multi-fluid code. It is known that in fully ionized plasmas, nonlinear Alfvén waves generate density and pressure perturbations. Those nonlinear effects are more pronounced for standing oscillations than for propagating waves. By means of numerical simulations and analytical approximations, we examine how the collisional interaction between ions and neutrals affects the nonlinear evolution. The friction due to collisions dissipates a fraction of the wave energy, which is transformed into heat and consequently raises the temperature of the plasma. As an application, we investigate frictional heating in a plasma with physical conditions akin to those in a quiescent solar prominence.

  11. Groundwater Recharge and Flow Regime revealed by multi-tracers approach in a headwater, North China Plain

    Science.gov (United States)

    Sakakibara, Koichi; Tsujimura, Maki; Song, Xianfang; Zhang, Jie

    2014-05-01

    Groundwater recharge is a crucial hydrological process for effective water management, especially in arid/semi-arid regions. However, relatively little specific research on the groundwater recharge process has been reported previously. Intensive field surveys were conducted during the rainy season, the middle of the dry season, and the end of the dry season, in order to clarify the comprehensive groundwater recharge and flow regime of the Wangkuai watershed, a headwater area which is a main recharge zone of the North China Plain. Groundwater, spring water, stream water and lake water were sampled, and inorganic solute constituents and the stable isotopes oxygen-18 and deuterium were determined for all water samples. The stream flow rate was also observed. The solute ion concentrations and stable isotopic compositions show that most water in this region can be characterized as Ca-HCO3 type and that the main water source is precipitation, which is affected by the altitude effect on stable isotopes. In addition, the river and reservoir of the area seem to recharge the groundwater during the rainy season, whereas the interaction between surface water and groundwater gradually becomes less dominant after the rainy season. An inversion analysis applied to the Wangkuai watershed using a simple mixing model reveals multiple flow systems, each showing a distinctive tracer signal and flow rate. In summary, groundwater recharged at different locations upstream of the Wangkuai reservoir flows down to the alluvial fan with a certain amount of mixing, and surface water recharges the groundwater in the alluvial plain during the rainy season.

  12. Food intake monitoring: an acoustical approach to automated food intake activity detection and classification of consumed food

    International Nuclear Information System (INIS)

    Päßler, Sebastian; Fischer, Wolf-Joachim; Wolff, Matthias

    2012-01-01

    Obesity and nutrition-related diseases are currently growing challenges for medicine. A precise and timesaving method for food intake monitoring is needed. For this purpose, an approach based on the classification of sounds produced during food intake is presented. Sounds are recorded non-invasively by miniature microphones in the outer ear canal. A database of 51 participants eating seven types of food and consuming one drink has been developed for algorithm development and model training. The database is labeled manually using a protocol with instructions for annotation. The annotation procedure is evaluated using Cohen's kappa coefficient. The food intake activity is detected by comparing the signal energy of in-ear sounds to environmental sounds recorded by a reference microphone. Hidden Markov models are used for the recognition of single chew or swallowing events. Intake cycles are modeled as event sequences in finite-state grammars. Classification of consumed food is realized by a finite-state grammar decoder based on the Viterbi algorithm. We achieved a detection accuracy of 83% and a food classification accuracy of 79% on a test set of 10% of all records. Our approach addresses the need to monitor the timing and occurrence of eating. With differentiation of consumed food, a first step toward the goal of meal weight estimation is taken. (paper)

  13. Probabilistic Gait Classification in Children with Cerebral Palsy: A Bayesian Approach

    Science.gov (United States)

    Van Gestel, Leen; De Laet, Tinne; Di Lello, Enrico; Bruyninckx, Herman; Molenaers, Guy; Van Campenhout, Anja; Aertbelien, Erwin; Schwartz, Mike; Wambacq, Hans; De Cock, Paul; Desloovere, Kaat

    2011-01-01

    Three-dimensional gait analysis (3DGA) generates a wealth of highly variable data. Gait classifications help to reduce, simplify and interpret this vast amount of 3DGA data and thereby assist and facilitate clinical decision making in the treatment of CP. CP gait is often a mix of several clinically accepted distinct gait patterns. Therefore,…

  14. Fingerprint pattern classification approach based on the coordinate geometry of singularities

    CSIR Research Space (South Africa)

    Msiza, IS

    2009-10-01

    Full Text Available of fingerprint matching, it serves to reduce the duration of the query. The fingerprint classes discussed in this document are the Central Twins (CT), Tented Arch (TA), Left Loop (LL), Right Loop (RL) and the Plain Arch (PA). The classification rules employed...

  15. Comparing and optimizing land use classification in a Himalayan area using parametric and non parametric approaches

    NARCIS (Netherlands)

    Sterk, G.; Sameer Saran,; Raju, P.L.N.; Amit, Bharti

    2007-01-01

    Supervised classification is one of important tasks in remote sensing image interpretation, in which the image pixels are classified to various predefined land use/land cover classes based on the spectral reflectance values in different bands. In reality some classes may have very close spectral

  16. On Internet Traffic Classification: A Two-Phased Machine Learning Approach

    Directory of Open Access Journals (Sweden)

    Taimur Bakhshi

    2016-01-01

    Full Text Available Traffic classification utilizing flow measurement enables operators to perform essential network management. Flow accounting methods such as NetFlow are, however, considered inadequate for classification requiring additional packet-level information, host behaviour analysis, and specialized hardware, limiting their practical adoption. This paper aims to overcome these challenges by proposing a two-phased machine learning classification mechanism with NetFlow as input. The individual flow classes are derived per application through k-means and are further used to train a C5.0 decision tree classifier. As part of validation, the initial unsupervised phase used flow records of fifteen popular Internet applications that were collected and independently subjected to k-means clustering to determine unique flow classes generated per application. The derived flow classes were afterwards used to train and test a supervised C5.0 based decision tree. The resulting classifier reported an average accuracy of 92.37% on approximately 3.4 million test cases, increasing to 96.67% with adaptive boosting. The classifier specificity factor, which accounted for differentiating content-specific from supplementary flows, ranged between 98.37% and 99.57%. Furthermore, the computational performance and accuracy of the proposed methodology in comparison with similar machine learning techniques leads us to recommend its extension to other applications in achieving highly granular real-time traffic classification.
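    The two-phased mechanism can be sketched as follows: an unsupervised pass derives per-application flow classes with k-means, and a supervised decision tree is then trained on those derived classes. Scikit-learn's CART tree stands in for C5.0 here, and the flow features are synthetic placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
# Placeholder NetFlow-style features (duration, bytes, packets, mean packet size)
# for two hypothetical applications; real flow records are required in practice.
X_app = {app: rng.normal(loc=app * 2.0, size=(500, 4)) for app in (0, 1)}

# Phase 1: unsupervised - derive flow classes per application with k-means.
X_list, y_list = [], []
for app, X in X_app.items():
    clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    X_list.append(X)
    y_list.append(app * 3 + clusters)          # unique class id per (app, cluster)
X_all, y_all = np.vstack(X_list), np.concatenate(y_list)

# Phase 2: supervised - train a decision tree on the derived flow classes.
X_tr, X_te, y_tr, y_te = train_test_split(X_all, y_all, random_state=0)
tree = DecisionTreeClassifier(max_depth=6, random_state=0).fit(X_tr, y_tr)
print("holdout accuracy:", round(tree.score(X_te, y_te), 3))
```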

  17. CLASSIFICATION AND COMPLEX STATE VALUE OF SHOPPING CENTERS: PROJECT-ORIENTED APPROACH

    Directory of Open Access Journals (Sweden)

    Юрій Павлович РАК

    2016-02-01

    Full Text Available The projects of trade and entertainment centers (TECs) were analyzed from the perspective of improving life safety, and definitions of "trade and entertainment center" and "complex value of a trade and entertainment center" are proposed. A classification of trade and entertainment centers based on classification criteria was developed; the criteria characterize the security status and the attractiveness of their operation. In the near future, the most secure and modern TECs will be those buildings with unique qualities such as safety systems and excellent customer service, and thus a high level of client trust in the mall. An important role will be played by those TECs that have clearly formed value-oriented project management, including the communication of values using innovative methods and models. As organizations, trade and entertainment centers are involved in a complex process of interaction management: they are both an enterprise that serves the public and satisfies a wide range of its interests, and an architectural site that is leased and increases the business attractiveness of the district where the TEC is located. This duality of the TEC makes it difficult to assess the effectiveness of its security.

  18. An automated Pearson's correlation change classification (APC3) approach for GC/MS metabonomic data using total ion chromatograms (TICs).

    Science.gov (United States)

    Prakash, Bhaskaran David; Esuvaranathan, Kesavan; Ho, Paul C; Pasikanti, Kishore Kumar; Chan, Eric Chun Yong; Yap, Chun Wei

    2013-05-21

    A fully automated and computationally efficient Pearson's correlation change classification (APC3) approach is proposed and shown to have overall comparable performance with both an average accuracy and an average AUC of 0.89 ± 0.08 but is 3.9 to 7 times faster, easier to use and have low outlier susceptibility in contrast to other dimensional reduction and classification combinations using only the total ion chromatogram (TIC) intensities of GC/MS data. The use of only the TIC permits the possible application of APC3 to other metabonomic data such as LC/MS TICs or NMR spectra. A RapidMiner implementation is available for download at http://padel.nus.edu.sg/software/padelapc3.
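    Details of APC3 are in the paper; as a rough illustration of classifying samples by Pearson correlation of their total ion chromatograms, the sketch below assigns a test TIC to the class whose mean training TIC correlates with it most strongly. The data are synthetic and the nearest-correlation rule is a simplification, not the published algorithm.

```python
import numpy as np

def correlation_classifier(train_tics, train_labels, test_tic):
    """Assign a test TIC to the class whose mean training TIC it is most strongly
    Pearson-correlated with (a simplified nearest-correlation stand-in)."""
    best_cls, best_r = None, -np.inf
    for cls in sorted(set(train_labels)):
        mean_tic = np.mean([t for t, l in zip(train_tics, train_labels) if l == cls], axis=0)
        r = np.corrcoef(mean_tic, test_tic)[0, 1]
        if r > best_r:
            best_cls, best_r = cls, r
    return best_cls, best_r

rng = np.random.default_rng(6)
base_a, base_b = np.abs(rng.normal(size=200)), np.abs(rng.normal(size=200))
tics = [base_a + rng.normal(scale=0.1, size=200) for _ in range(5)] + \
       [base_b + rng.normal(scale=0.1, size=200) for _ in range(5)]
labels = ["control"] * 5 + ["case"] * 5
print(correlation_classifier(tics, labels, base_b + rng.normal(scale=0.1, size=200)))
```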

  19. Experimental Investigation for Fault Diagnosis Based on a Hybrid Approach Using Wavelet Packet and Support Vector Classification

    Directory of Open Access Journals (Sweden)

    Pengfei Li

    2014-01-01

    Full Text Available To deal with the difficulty of obtaining a large number of fault samples under practical conditions in mechanical fault diagnosis, a hybrid method that combines wavelet packet decomposition and support vector classification (SVC) is proposed. The wavelet packet is employed to decompose the vibration signal to obtain the energy ratio in each frequency band. Taking energy ratios as feature vectors, the pattern recognition results are obtained by the SVC. The rolling bearing and gear fault diagnostic results on the typical experimental platform show that the present approach is robust to noise and has higher classification accuracy and, thus, provides a better way to diagnose mechanical faults under the condition of limited fault samples.
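    The feature-extraction step described above (wavelet packet decomposition followed by band energy ratios) and the SVC step can be sketched with PyWavelets and scikit-learn as below; the simulated vibration signals and the db4/level-3 settings are illustrative assumptions.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def energy_ratio_features(signal, wavelet="db4", level=3):
    """Wavelet packet decomposition; return the energy ratio of each
    terminal frequency band as the feature vector."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    energies = np.array([np.sum(np.square(node.data))
                         for node in wp.get_level(level, order="natural")])
    return energies / energies.sum()

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 1024)
healthy = [np.sin(2 * np.pi * 50 * t) + 0.2 * rng.normal(size=t.size) for _ in range(20)]
faulty = [np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 180 * t)
          + 0.2 * rng.normal(size=t.size) for _ in range(20)]

X = np.array([energy_ratio_features(s) for s in healthy + faulty])
y = np.array([0] * 20 + [1] * 20)
clf = SVC(kernel="rbf").fit(X[::2], y[::2])            # train on every other sample
print("test accuracy:", clf.score(X[1::2], y[1::2]))   # test on the rest
```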

  20. Application of a new genetic classification and semi-automated geomorphic mapping approach in the Perth submarine canyon, Australia

    Science.gov (United States)

    Picard, K.; Nanson, R.; Huang, Z.; Nichol, S.; McCulloch, M.

    2017-12-01

    The acquisition of high resolution marine geophysical data has intensified in recent years (e.g. multibeam echo-sounding, sub-bottom profiling). This progress provides the opportunity to classify and map the seafloor in greater detail, using new methods that preserve the links between processes and morphology. Geoscience Australia has developed a new genetic classification approach, nested within the Harris et al (2014) global seafloor mapping framework. The approach divides parent units into sub-features based on established classification schemes and feature descriptors defined by Bradwell et al. (2016: http://nora.nerc.ac.uk/), the International Hydrographic Organization (https://www.iho.int) and the Coastal Marine and Ecological Classification Standard (https://www.cmecscatalog.org). Owing to the ecological significance of submarine canyon systems in particular, much recent attention has focused on defining their variation in form and process, whereby they can be classified using a range of topographic metrics, fluvial dis/connection and shelf-incising status. The Perth Canyon is incised into the continental slope and shelf of southwest Australia, covering an area of >1500 km2 and extending from 4700 m water depth to the shelf break in 170 m. The canyon sits within a Marine Protected Area, incorporating a Marine National Park and Habitat Protection Zone in recognition of its benthic and pelagic biodiversity values. However, detailed information on the spatial patterns of the seabed habitats that influence this biodiversity is lacking. Here we use 20 m resolution bathymetry and acoustic backscatter data acquired in 2015 by the Schmidt Ocean Institute plus sub-bottom datasets and sediment samples collected by Geoscience Australia in 2005 to apply the new geomorphic classification system to the Perth Canyon. This presentation will show the results of the geomorphic feature mapping of the canyon and its application to better defining potential benthic habitats.

  1. An Object-Based Classification Approach for Mapping Migrant Housing in the Mega-Urban Area of the Pearl River Delta (China

    Directory of Open Access Journals (Sweden)

    Sebastian D’Oleire-Oltmanns

    2011-08-01

    Full Text Available Urban areas develop on formal and informal levels. Informal development is often highly dynamic, leading to a lag of spatial information about urban structure types. In this work, an object-based remote sensing approach will be presented to map the migrant housing urban structure type in the Pearl River Delta, China. SPOT5 data were utilized for the classification (auxiliary data, particularly up-to-date cadastral data, were not available). A hierarchically structured classification process was used to create (spectral) independence from single satellite scenes and to arrive at a transferrable classification process. Using the presented classification approach, an overall classification accuracy of migrant housing of 68.0% is attained.

  2. Comparison of drift-velocity and drag coefficient approaches for one-dimensional two-fluid models in bubbly flow regime and validation with experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Gómez-Zarzuela, C.; Miró, R.; Verdú, G. [Institute for Industrial Safety, Radiology and Environmental (ISIRYM), Universitat Politècnica de València (Spain); Peña-Monferrer, C.; Chiva, S. [Department of Mechanical Engineering and Construction, Universitat Jaume I, Castellón de la Plana (Spain); Muñoz-Cobo, J.L., E-mail: congoque@iqn.upv.es, E-mail: cpena@uji.es [Institute for Energy Engineering, Universitat Politècnica de València (Spain)

    2017-07-01

    Two-phase flow simulation has been an extended research topic over the years due to the importance of predicting with accuracy the flow behavior within different installations, including nuclear power plants. Some of them are low pressure events, like low water pressure injection, nuclear refueling or natural circulation. This work is devoted to investigating the level of accuracy of the results when a two-phase flow experiment, which has been carried out at low pressure, is performed in a one-dimensional simulation code. In particular, the codes that have been selected to represent the experiment are the best-estimate system codes RELAP5/MOD3 and TRACE v5.0 patch4. The experiment consists of a long vertical pipe along which an air-water fluid in bubbly regime moves upwards in adiabatic conditions and atmospheric pressure. The simulations have been first performed in both codes with their original correlations, which are based on the drift flux model for the case of bubbly regime in vertical pipes. Then, a different implementation for the drag force has been undertaken, in order to perform a simulation with a bubble diameter equivalent to the experiment. Results show that the calculations obtained from the codes are within the range of validity of the experiment, with some discrepancies, which leads to the conclusion that the use of a drag correlation approach is more realistic than the drift flux model. (author)

  3. Comparison of drift-velocity and drag coefficient approaches for one-dimensional two-fluid models in bubbly flow regime and validation with experimental data

    International Nuclear Information System (INIS)

    Gómez-Zarzuela, C.; Miró, R.; Verdú, G.; Peña-Monferrer, C.; Chiva, S.; Muñoz-Cobo, J.L.

    2017-01-01

    Two-phase flow simulation has been an extended research topic over the years due to the importance of predicting with accuracy the flow behavior within different installations, including nuclear power plants. Some of them are low pressure events, like low water pressure injection, nuclear refueling or natural circulation. This work is devoted to investigating the level of accuracy of the results when a two-phase flow experiment, which has been carried out at low pressure, is performed in a one-dimensional simulation code. In particular, the codes that have been selected to represent the experiment are the best-estimate system codes RELAP5/MOD3 and TRACE v5.0 patch4. The experiment consists of a long vertical pipe along which an air-water fluid in bubbly regime moves upwards in adiabatic conditions and atmospheric pressure. The simulations have been first performed in both codes with their original correlations, which are based on the drift flux model for the case of bubbly regime in vertical pipes. Then, a different implementation for the drag force has been undertaken, in order to perform a simulation with a bubble diameter equivalent to the experiment. Results show that the calculations obtained from the codes are within the range of validity of the experiment, with some discrepancies, which leads to the conclusion that the use of a drag correlation approach is more realistic than the drift flux model. (author)

  4. Comparison of rule induction, decision trees and formal concept analysis approaches for classification

    Science.gov (United States)

    Kotelnikov, E. V.; Milov, V. R.

    2018-05-01

    Rule-based learning algorithms have higher transparency and easiness to interpret in comparison with neural networks and deep learning algorithms. These properties make it possible to effectively use such algorithms to solve descriptive tasks of data mining. The choice of an algorithm depends also on its ability to solve predictive tasks. The article compares the quality of the solution of the problems with binary and multiclass classification based on the experiments with six datasets from the UCI Machine Learning Repository. The authors investigate three algorithms: Ripper (rule induction), C4.5 (decision trees), In-Close (formal concept analysis). The results of the experiments show that In-Close demonstrates the best quality of classification in comparison with Ripper and C4.5, however the latter two generate more compact rule sets.

  5. DIAGNOSTICS OF DISORDERS AND DISEASES OF MUSCULOSKELETAL SYSTEM IN SCHOOLCHILDREN: APPROACHES, TERMINOLOGY, CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    N.B. Mirskaya

    2009-01-01

    Full Text Available This article describes an information system for physicians working in general education institutes, named «Detection, correction and prophylaxis of musculoskeletal system disorders in students of general education institutes». This system was created for the purpose of improving the diagnostics of initial stages of musculoskeletal system disorders in schoolchildren, detecting risk factors, and providing timely prophylaxis during school education. The system was based on a classification of functional disorders and initial stages of diseases of the musculoskeletal system in schoolchildren, developed by the authors of the present article, and on methods of medical examination and the organization of this work. Key words: schoolchildren, musculoskeletal system, diagnostics, classification, prophylaxis. (Voprosy sovremennoi pediatrii — Current Pediatrics. 2009;8(3):10-13

  6. Three approaches to the classification of inland wetlands. [Dismal Swamp, Tennessee, and Florida

    Science.gov (United States)

    Gammon, P. T.; Malone, D.; Brooks, P. D.; Carter, V.

    1977-01-01

    In the Dismal Swamp project, seasonal, color-infrared aerial photographs and LANDSAT digital data were interpreted for a detailed analysis of the vegetative communities in a large, highly altered wetland. In Western Tennessee, seasonal high altitude color-infrared aerial photographs provided the hydrologic and vegetative information needed to map inland wetlands, using a classification system developed for the Tennessee Valley Region. In Florida, color-infrared aerial photographs were analyzed to produce wetland maps using three existing classification systems to evaluate the information content and mappability of each system. The methods used in each of the three projects can be extended or modified for use in the mapping of inland wetlands in other parts of the United States.

  7. A Novel Segment-Based Approach for Improving Classification Performance of Transport Mode Detection.

    Science.gov (United States)

    Guvensan, M Amac; Dusun, Burak; Can, Baris; Turkmen, H Irem

    2017-12-30

    Transportation planning and solutions have an enormous impact on city life. To minimize transport duration, urban planners should understand and elaborate the mobility of a city. Thus, researchers look toward monitoring people's daily activities, including transportation types and duration, by taking advantage of individuals' smartphones. This paper introduces a novel segment-based transport mode detection architecture in order to improve the results of traditional classification algorithms in the literature. The proposed post-processing algorithm, namely the Healing algorithm, aims to correct the misclassification results of machine learning-based solutions. Our real-life test results show that the Healing algorithm could achieve up to a 40% improvement in the classification results. As a result, the implemented mobile application could predict eight classes including stationary, walking, car, bus, tram, train, metro and ferry with a success rate of 95% thanks to the proposed multi-tier architecture and Healing algorithm.
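    The Healing algorithm itself is described in the paper; the sketch below shows a generic segment-style post-processing pass in the same spirit, replacing each point-wise prediction with the majority label inside a sliding window so that isolated misclassifications are suppressed. The window size and labels are illustrative assumptions.

```python
from collections import Counter

def smooth_predictions(labels, window=5):
    """Replace each point-wise prediction with the majority label inside a
    sliding window; a simplified stand-in for segment-based correction,
    not the published Healing algorithm."""
    half = window // 2
    smoothed = []
    for i in range(len(labels)):
        window_labels = labels[max(0, i - half): i + half + 1]
        smoothed.append(Counter(window_labels).most_common(1)[0][0])
    return smoothed

raw = ["bus", "bus", "car", "bus", "bus", "bus", "walk", "bus", "bus"]
print(smooth_predictions(raw))   # the isolated "car" and "walk" votes are corrected
```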

  8. Classification of PolSAR Images Using Multilayer Autoencoders and a Self-Paced Learning Approach

    Directory of Open Access Journals (Sweden)

    Wenshuai Chen

    2018-01-01

    Full Text Available In this paper, a novel polarimetric synthetic aperture radar (PolSAR image classification method based on multilayer autoencoders and self-paced learning (SPL is proposed. The multilayer autoencoders network is used to learn the features, which convert raw data into more abstract expressions. Then, softmax regression is applied to produce the predicted probability distributions over all the classes of each pixel. When we optimize the multilayer autoencoders network, self-paced learning is used to accelerate the learning convergence and achieve a stronger generalization capability. Under this learning paradigm, the network learns the easier samples first and gradually involves more difficult samples in the training process. The proposed method achieves the overall classification accuracies of 94.73%, 94.82% and 78.12% on the Flevoland dataset from AIRSAR, Flevoland dataset from RADARSAT-2 and Yellow River delta dataset, respectively. Such results are comparable with other state-of-the-art methods.

  9. MYOCLONUS IN CHILDREN: DEFINITIONS AND CLASSIFICATIONS, DIFFERENTIAL DIAGNOSIS, APPROACHES TO THERAPY (A LECTURE)

    Directory of Open Access Journals (Sweden)

    M. Yu. Bobylova

    2014-01-01

    Full Text Available Myoclonus is a manifestation of many neurological diseases that differ in etiology and pathogenesis. The high prevalence of myoclonus in children, with markedly different prognoses for diseases of the nervous system as well as of other organs and systems, calls for renewed investigation of myoclonus as a syndrome, in order to refine its terminology and classification, improve diagnostic criteria, and optimize additional diagnostic schemes.

  10. Neuropsychological Test Selection for Cognitive Impairment Classification: A Machine Learning Approach

    Science.gov (United States)

    Williams, Jennifer A.; Schmitter-Edgecombe, Maureen; Cook, Diane J.

    2016-01-01

    Introduction Reducing the amount of testing required to accurately detect cognitive impairment is clinically relevant. The aim of this research was to determine the fewest number of clinical measures required to accurately classify participants as healthy older adult, mild cognitive impairment (MCI) or dementia using a suite of classification techniques. Methods Two variable selection machine learning models (i.e., naive Bayes, decision tree), a logistic regression, and two participant datasets (i.e., clinical diagnosis, clinical dementia rating; CDR) were explored. Participants classified using clinical diagnosis criteria included 52 individuals with dementia, 97 with MCI, and 161 cognitively healthy older adults. Participants classified using CDR included 154 individuals with CDR = 0, 93 individuals with CDR = 0.5, and 25 individuals with CDR = 1.0+. Twenty-seven demographic, psychological, and neuropsychological variables were available for variable selection. Results No significant difference was observed between naive Bayes, decision tree, and logistic regression models for classification of both clinical diagnosis and CDR datasets. Participant classification (70.0 – 99.1%), geometric mean (60.9 – 98.1%), sensitivity (44.2 – 100%), and specificity (52.7 – 100%) were generally satisfactory. Unsurprisingly, the MCI/CDR = 0.5 participant group was the most challenging to classify. Through variable selection, only 2–9 variables were required for classification and varied between datasets in a clinically meaningful way. Conclusions The current study results reveal that machine learning techniques can accurately classify cognitive impairment and reduce the number of measures required for diagnosis. PMID:26332171
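
    A minimal sketch of the kind of workflow described above: compare a decision tree, naive Bayes and logistic regression, then use the tree's feature importances to shrink the measure set. The data, the number of retained measures and the depth limit are placeholders, not the study's variables or settings.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X = np.random.rand(310, 27)          # placeholder for the 27 candidate measures
y = np.random.randint(0, 3, 310)     # placeholder labels: healthy / MCI / dementia

for name, clf in [("decision tree", DecisionTreeClassifier(max_depth=4)),
                  ("naive Bayes", GaussianNB()),
                  ("logistic regression", LogisticRegression(max_iter=1000))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2f}")

# keep only the most informative measures according to the fitted tree
tree = DecisionTreeClassifier(max_depth=4).fit(X, y)
selected = np.argsort(tree.feature_importances_)[::-1][:5]
print("top measures (indices):", selected)
```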

  11. A High Throughput Ambient Mass Spectrometric Approach to Species Identification and Classification from Chemical Fingerprint Signatures

    OpenAIRE

    Musah, Rabi A.; Espinoza, Edgard O.; Cody, Robert B.; Lesiak, Ashton D.; Christensen, Earl D.; Moore, Hannah E.; Maleknia, Simin; Drijfhout, Falko P.

    2015-01-01

    A high throughput method for species identification and classification through chemometric processing of direct analysis in real time (DART) mass spectrometry-derived fingerprint signatures has been developed. The method entails introduction of samples to the open air space between the DART ion source and the mass spectrometer inlet, with the entire observed mass spectral fingerprint subjected to unsupervised hierarchical clustering processing. A range of both polar and non-polar chemotypes a...
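
    A minimal sketch of unsupervised hierarchical clustering of mass-spectral fingerprints in the spirit described above (not the authors' chemometric pipeline). The binned intensity matrix, the number of bins and the Ward linkage are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

fingerprints = np.random.rand(30, 500)           # 30 samples x 500 m/z bins (placeholder)
Z = linkage(fingerprints, method="ward")         # agglomerative clustering
labels = fcluster(Z, t=5, criterion="maxclust")  # cut the dendrogram into 5 groups
print(labels)
```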

  12. Probability Density Components Analysis: A New Approach to Treatment and Classification of SAR Images

    Directory of Open Access Journals (Sweden)

    Osmar Abílio de Carvalho Júnior

    2014-04-01

    Full Text Available Speckle noise (salt and pepper) is inherent to synthetic aperture radar (SAR), which causes a usual noise-like granular aspect and complicates the image classification. In SAR image analysis, the spatial information might be a particular benefit for denoising and mapping classes characterized by a statistical distribution of the pixel intensities from a complex and heterogeneous spectral response. This paper proposes the Probability Density Components Analysis (PDCA), a new alternative that combines filtering and frequency histogram to improve the classification procedure for single-channel synthetic aperture radar (SAR) images. This method was tested on L-band SAR data from the Advanced Land Observation System (ALOS) Phased-Array Synthetic-Aperture Radar (PALSAR) sensor. The study area is located in the Brazilian Amazon rainforest, northern Rondônia State (municipality of Candeias do Jamari), containing forest and land use patterns. The proposed algorithm uses a moving window over the image, estimating the probability density curve in different image components. Therefore, a single input image generates an output with multi-components. Initially the multi-components should be treated by noise-reduction methods, such as maximum noise fraction (MNF) or noise-adjusted principal components (NAPCs). Both methods enable reducing noise as well as the ordering of multi-component data in terms of the image quality. In this paper, the NAPC applied to multi-components provided large reductions in the noise levels, and the color composites considering the first NAPC enhance the classification of different surface features. In the spectral classification, the Spectral Correlation Mapper and Minimum Distance were used. The results obtained were similar to the visual interpretation of optical images from TM-Landsat and Google Maps.
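
    A minimal sketch of the moving-window idea behind PDCA as described above, not the published algorithm: for every pixel a local histogram is estimated and each histogram bin becomes one band of a multi-component output. Window size and bin count are illustrative; the subsequent MNF/NAPC and spectral classification steps are omitted.

```python
import numpy as np

def pdca_components(img, win=7, bins=16):
    """Return a (h, w, bins) multi-component image of local probability densities."""
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    h, w = img.shape
    out = np.zeros((h, w, bins))
    edges = np.linspace(img.min(), img.max(), bins + 1)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + win, j:j + win]
            hist, _ = np.histogram(window, bins=edges, density=True)
            out[i, j] = hist              # one probability-density vector per pixel
    return out

components = pdca_components(np.random.rand(64, 64))
print(components.shape)
```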

  13. Ionic classification of Xe laser lines: A new approach through time resolved spectroscopy

    International Nuclear Information System (INIS)

    Schinca, D.; Duchowicz, R.; Gallardo, M.

    1992-01-01

    Visible and UV laser emission from a highly ionized pulsed Xe plasma was studied in relation to the ionic assignment of the laser lines. Time-resolved spectroscopy was used to determine the ionic origin of the studied lines. The results are in agreement with an intensity versus pressure analysis performed over the same wavelength range. From the temporal behaviour of the spontaneous emission, a probable classification can be obtained. (author). 7 refs, 7 figs, 1 tab

  14. Classification and Compression of Multi-Resolution Vectors: A Tree Structured Vector Quantizer Approach

    Science.gov (United States)

    2002-01-01

    their expression profile and for classification of cells into tumorous and non-tumorous classes. Then we will present a parallel tree method for... cancerous cells. We will use the same dataset and use tree structured classifiers with multi-resolution analysis for classifying cancerous from non-cancerous cells. We have the expressions of 4096 genes from 98 different cell types. Of these 98, 72 are cancerous while 26 are non-cancerous. We are interested

  15. Causes of death and associated conditions (Codac): a utilitarian approach to the classification of perinatal deaths.

    Science.gov (United States)

    Frøen, J Frederik; Pinar, Halit; Flenady, Vicki; Bahrin, Safiah; Charles, Adrian; Chauke, Lawrence; Day, Katie; Duke, Charles W; Facchinetti, Fabio; Fretts, Ruth C; Gardener, Glenn; Gilshenan, Kristen; Gordijn, Sanne J; Gordon, Adrienne; Guyon, Grace; Harrison, Catherine; Koshy, Rachel; Pattinson, Robert C; Petersson, Karin; Russell, Laurie; Saastad, Eli; Smith, Gordon C S; Torabi, Rozbeh

    2009-06-10

    A carefully classified dataset of perinatal mortality will retain the most significant information on the causes of death. Such information is needed for health care policy development, surveillance and international comparisons, clinical services and research. For comparability purposes, we propose a classification system that could serve all these needs, and be applicable in both developing and developed countries. It is developed to adhere to basic concepts of underlying cause in the International Classification of Diseases (ICD), although gaps in ICD prevent classification of perinatal deaths solely on existing ICD codes. We tested the Causes of Death and Associated Conditions (Codac) classification for perinatal deaths in seven populations, including two developing country settings. We identified areas of potential improvements in the ability to retain existing information, ease of use and inter-rater agreement. After revisions to address these issues we propose Version II of Codac with detailed coding instructions. The ten main categories of Codac consist of three key contributors to global perinatal mortality (intrapartum events, infections and congenital anomalies), two crucial aspects of perinatal mortality (unknown causes of death and termination of pregnancy), a clear distinction of conditions relevant only to the neonatal period and the remaining conditions are arranged in the four anatomical compartments (fetal, cord, placental and maternal). For more detail there are 94 subcategories, further specified in 577 categories in the full version. Codac is designed to accommodate both the main cause of death as well as two associated conditions. We suggest reporting not only the main cause of death, but also the associated relevant conditions so that scenarios of combined conditions and events are captured. The appropriately applied Codac system promises to better manage information on causes of perinatal deaths, the conditions associated with them, and the most

  16. Classification of Hypertrophy of Labia Minora: Consideration of a Multiple Component Approach.

    Science.gov (United States)

    González, Pablo I

    2015-11-01

    Labia minora hypertrophy of unknown and under-reported incidence in the general population is considered a variant of normal anatomy. Its origin is multi-factorial including genetic, hormonal, and infectious factors, and voluntary elongation of the labiae minorae in some cultures. Consultations with patients bothered by this condition have been increasing, with patients complaining of poor aesthetics and symptoms such as difficulty with vaginal secretions, vulvovaginitis, chronic irritation, and superficial dyspareunia, all of which can have a negative effect on these patients' sexuality and self-esteem. Surgical management of labial hypertrophy is an option for women with these physical complaints or aesthetic issues. Labia minora hypertrophy can consist of multiple components, including the clitoral hood, lateral prepuce, frenulum, and the body of the labia minora. To date, there is not a consensus in the literature with respect to the classification and definition of varying grades of hypertrophy, aside from measurement of the length in centimeters. In order to offer patients the most appropriate surgical technique, an objective and understandable classification that can be used as part of the preoperative evaluation is necessary. Such a classification should have the aim of offering patients the best cosmetic and functional results with the fewest complications.

  17. Wall-corner classification using sonar: a new approach based on geometric features.

    Science.gov (United States)

    Martínez, Milagros; Benet, Ginés

    2010-01-01

    Ultrasonic signals coming from rotary sonar sensors in a robot give us several features about the environment. This enables us to locate and classify the objects in the scenario of the robot. Each object and reflector produces a series of peaks in the amplitude of the signal. The radial and angular position of the sonar sensor gives information about location and their amplitudes offer information about the nature of the surface. Early works showed that the amplitude can be modeled and used to classify objects with very good results at short distances (80% average success in classifying both walls and corners at distances of less than 1.5 m). In this paper, a new set of geometric features derived from the amplitude analysis of the echo is presented. These features constitute a set of characteristics that can be used to improve the results of classification at distances from 1.5 m to 4 m. Also, a comparative study on classification algorithms widely used in pattern recognition techniques has been carried out for sensor distances ranging from 0.5 to 4 m and incidence angles ranging from 20° to 70°. Experimental results show an enhancement of the success in classification rates when these geometric features are considered.

  18. A novel approach to internal crown characterization for coniferous tree species classification

    Science.gov (United States)

    Harikumar, A.; Bovolo, F.; Bruzzone, L.

    2016-10-01

    The knowledge about individual trees in forest is highly beneficial in forest management. High density small footprint multi-return airborne Light Detection and Ranging (LiDAR) data can provide very accurate information about the structural properties of individual trees in forests. Every tree species has a unique set of crown structural characteristics that can be used for tree species classification. In this paper, we use both the internal and external crown structural information of a conifer tree crown, derived from a high density small footprint multi-return LiDAR data acquisition, for species classification. Considering the fact that branches are the major building blocks of a conifer tree crown, we obtain the internal crown structural information using a branch level analysis. The structure of each conifer branch is represented using clusters in the LiDAR point cloud. We propose the joint use of the k-means clustering and geometric shape fitting, on the LiDAR data projected onto a novel 3-dimensional space, to identify branch clusters. After mapping the identified clusters back to the original space, six internal geometric features are estimated using a branch-level analysis. The external crown characteristics are modeled by using six least correlated features based on cone fitting and convex hull. Species classification is performed using a sparse Support Vector Machines (sparse SVM) classifier.
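
    A minimal sketch of the branch-level clustering idea described above; the paper's novel 3-D projection, geometric shape fitting, cone/convex-hull features and sparse-SVM stages are not reproduced. The point cloud, number of clusters and descriptors below are placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

crown_points = np.random.rand(2000, 3)          # x, y, z returns of one crown (placeholder)

km = KMeans(n_clusters=12, n_init=10).fit(crown_points)
for k in range(km.n_clusters):
    branch = crown_points[km.labels_ == k]
    extent = branch.max(axis=0) - branch.min(axis=0)   # bounding-box size
    elongation = extent.max() / (extent.min() + 1e-9)  # crude per-cluster shape descriptor
    print(f"cluster {k}: {len(branch)} points, elongation {elongation:.1f}")
```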

  19. A Neural-Network-Based Approach to White Blood Cell Classification

    Directory of Open Access Journals (Sweden)

    Mu-Chun Su

    2014-01-01

    Full Text Available This paper presents a new white blood cell classification system for the recognition of five types of white blood cells. We propose a new segmentation algorithm for the segmentation of white blood cells from smear images. The core idea of the proposed segmentation algorithm is to find a discriminating region of white blood cells in the HSI color space. Pixels with color lying in the discriminating region described by an ellipsoidal region will be regarded as the nucleus and granule of cytoplasm of a white blood cell. Then, through a further morphological process, we can segment a white blood cell from a smear image. Three kinds of features (i.e., geometrical features, color features, and LDP-based texture features) are extracted from the segmented cell. These features are fed into three different kinds of neural networks to recognize the types of the white blood cells. To test the effectiveness of the proposed white blood cell classification system, a total of 450 white blood cell images were used. The highest overall correct recognition rate reached 99.11%. Simulation results showed that the proposed white blood cell classification system was very competitive with some existing systems.
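
    A minimal sketch of the ellipsoidal-region test in a hue-saturation-intensity-like space described above. The centre and radii of the ellipsoid are invented values, and scikit-image's HSV conversion is used here only as a stand-in for the paper's HSI space.

```python
import numpy as np
from skimage.color import rgb2hsv   # HSV used as a stand-in for HSI

def cell_mask(rgb_image, center=(0.75, 0.5, 0.4), radii=(0.15, 0.3, 0.3)):
    """Flag pixels whose colour falls inside an ellipsoid in HSV space."""
    hsv = rgb2hsv(rgb_image)                        # (h, w, 3) in [0, 1]
    d = (hsv - np.array(center)) / np.array(radii)  # normalised offsets from the centre
    return (d ** 2).sum(axis=-1) <= 1.0             # True inside the ellipsoid

mask = cell_mask(np.random.rand(128, 128, 3))
print("pixels flagged as nucleus/granule:", int(mask.sum()))
```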

  20. Temporal Data Fusion Approaches to Remote Sensing-Based Wetland Classification

    Science.gov (United States)

    Montgomery, Joshua S. M.

    This thesis investigates the ecology of wetlands and associated classification in prairie and boreal environments of Alberta, Canada, using remote sensing technology to enhance classification of wetlands in the province. Objectives of the thesis are divided into two case studies: 1) examining how satellite-borne Synthetic Aperture Radar (SAR) and optical (RapidEye & SPOT) imagery can be used to evaluate surface water trends in a prairie pothole environment (Shepard Slough); and 2) investigating a data fusion methodology combining SAR, optical and Lidar data to characterize wetland vegetation and surface water attributes in a boreal environment (Utikuma Regional Study Area (URSA)). Surface water extent and hydroperiod products were derived from SAR data, and validated using optical imagery with high accuracies (76-97% overall) for both case studies. High resolution Lidar Digital Elevation Models (DEM), Digital Surface Models (DSM), and Canopy Height Model (CHM) products provided the means for data fusion to extract riparian vegetation communities and surface water, producing model accuracies of R² = 0.90 for URSA, and RMSE of 0.2 m to 0.7 m at Shepard Slough when compared to field and optical validation data. Integration of the Alberta and Canadian wetland classification systems, used to classify wetlands and determine their economic value, into the methodology produced thematic maps relevant to policy and decision makers for potential wetland monitoring and policy development.

  1. Multi-phase classification by a least-squares support vector machine approach in tomography images of geological samples

    Science.gov (United States)

    Khan, Faisal; Enzmann, Frieder; Kersten, Michael

    2016-03-01

    Image processing of X-ray-computed polychromatic cone-beam micro-tomography (μXCT) data of geological samples mainly involves artefact reduction and phase segmentation. For the former, the main beam-hardening (BH) artefact is removed by applying a best-fit quadratic surface algorithm to a given image data set (reconstructed slice), which minimizes the BH offsets of the attenuation data points from that surface. A Matlab code for this approach is provided in the Appendix. The final BH-corrected image is extracted from the residual data or from the difference between the surface elevation values and the original grey-scale values. For the segmentation, we propose a novel least-squares support vector machine (LS-SVM, an algorithm for pixel-based multi-phase classification) approach. A receiver operating characteristic (ROC) analysis was performed on BH-corrected and uncorrected samples to show that BH correction is in fact an important prerequisite for accurate multi-phase classification. The combination of the two approaches was thus used to classify successfully three different more or less complex multi-phase rock core samples.
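    The record mentions that Matlab code for the beam-hardening correction is given in the paper's Appendix; the snippet below is only a rough Python sketch of the same idea (least-squares fit of a quadratic surface z = a + bx + cy + dx² + exy + fy² to a reconstructed slice, with the residual taken as the corrected image), not the published implementation.

```python
import numpy as np

def bh_correct(slice_img):
    """Fit a best-fit quadratic surface to a slice and return the residual."""
    h, w = slice_img.shape
    y, x = np.mgrid[0:h, 0:w]
    x, y, z = x.ravel(), y.ravel(), slice_img.ravel()
    A = np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)   # best-fit quadratic surface
    surface = (A @ coeffs).reshape(h, w)
    return slice_img - surface                       # residual = BH-corrected image

corrected = bh_correct(np.random.rand(256, 256))     # placeholder slice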

  2. Predicting allergic contact dermatitis: a hierarchical structure activity relationship (SAR) approach to chemical classification using topological and quantum chemical descriptors

    Science.gov (United States)

    Basak, Subhash C.; Mills, Denise; Hawkins, Douglas M.

    2008-06-01

    A hierarchical classification study was carried out based on a set of 70 chemicals—35 which produce allergic contact dermatitis (ACD) and 35 which do not. This approach was implemented using a regular ridge regression computer code, followed by conversion of regression output to binary data values. The hierarchical descriptor classes used in the modeling include topostructural (TS), topochemical (TC), and quantum chemical (QC), all of which are based solely on chemical structure. The concordance, sensitivity, and specificity are reported. The model based on the TC descriptors was found to be the best, while the TS model was extremely poor.
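    A minimal sketch of the ridge-regression-then-threshold scheme described above: a ridge model is fit to 0/1 activity values computed from structure-based descriptors and its continuous output is converted back to a binary ACD / non-ACD call. The descriptor matrix, the regularization strength and the 0.5 cut-off are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import Ridge

X = np.random.rand(70, 40)               # placeholder TS/TC/QC descriptors
y = np.array([1] * 35 + [0] * 35)        # 35 sensitizers, 35 non-sensitizers

model = Ridge(alpha=1.0).fit(X, y)
pred_binary = (model.predict(X) >= 0.5).astype(int)   # convert regression output to 0/1

tp = ((pred_binary == 1) & (y == 1)).sum()
tn = ((pred_binary == 0) & (y == 0)).sum()
print("sensitivity:", tp / 35, "specificity:", tn / 35,
      "concordance:", (pred_binary == y).mean())
```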

  3. Making the ecosystem approach operational-Can regime shifts in ecological- and governance systems facilitate the transition?

    DEFF Research Database (Denmark)

    Österblom, H.; Gårdmark, A.; Bergström, L.

    2010-01-01

    Effectively reducing cumulative impacts on marine ecosystems requires co-evolution between science, policy and practice. Here, long-term social–ecological changes in the Baltic Sea are described, illustrating how the process of making the ecosystem approach operational in a large marine ecosystem...... stimulating innovations and re-organizing governance structures at drainage basin level to the Baltic Sea catchment as a whole. Experimentation and innovation at local to the regional levels is critical for a transition to ecosystem-based management. Establishing science-based learning platforms at sub...

  4. Multiple classifier systems in texton-based approach for the classification of CT images of Lung

    DEFF Research Database (Denmark)

    Gangeh, Mehrdad J.; Sørensen, Lauge; Shaker, Saher B.

    2010-01-01

    In this paper, we propose using texton signatures based on raw pixel representation along with a parallel multiple classifier system for the classification of emphysema in computed tomography images of the lung. The multiple classifier system is composed of support vector machines on the texton.......e., texton size and k value in k-means. Our results show that while aggregation of single decisions by SVMs over various k values using multiple classifier systems helps to improve the results compared to single SVMs, combining over different texton sizes is not beneficial. The performance of the proposed...

  5. A Model-Based Approach to Infer Shifts in Regional Fire Regimes Over Time Using Sediment Charcoal Records

    Science.gov (United States)

    Itter, M.; Finley, A. O.; Hooten, M.; Higuera, P. E.; Marlon, J. R.; McLachlan, J. S.; Kelly, R.

    2016-12-01

    Sediment charcoal records are used in paleoecological analyses to identify individual local fire events and to estimate fire frequency and regional biomass burned at centennial to millennial time scales. Methods to identify local fire events based on sediment charcoal records have been well developed over the past 30 years; however, an integrated statistical framework for fire identification is still lacking. We build upon existing paleoecological methods to develop a hierarchical Bayesian point process model for local fire identification and estimation of fire return intervals. The model is unique in that it combines sediment charcoal records from multiple lakes across a region in a spatially-explicit fashion leading to estimation of a joint, regional fire return interval in addition to lake-specific local fire frequencies. Further, the model estimates a joint regional charcoal deposition rate free from the effects of local fires that can be used as a measure of regional biomass burned over time. Finally, the hierarchical Bayesian approach allows for tractable error propagation such that estimates of fire return intervals reflect the full range of uncertainty in sediment charcoal records. Specific sources of uncertainty addressed include sediment age models, the separation of local versus regional charcoal sources, and generation of a composite charcoal record. The model is applied to sediment charcoal records from a dense network of lakes in the Yukon Flats region of Alaska. The multivariate joint modeling approach results in improved estimates of regional charcoal deposition with reduced uncertainty in the identification of individual fire events and local fire return intervals compared to individual lake approaches. Modeled individual-lake fire return intervals range from 100 to 500 years with a regional interval of roughly 200 years. Regional charcoal deposition to the network of lakes is correlated up to 50 kilometers. Finally, the joint regional charcoal
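
    The full hierarchical point-process model is beyond a short snippet; the toy sketch below illustrates only the basic Bayesian step of turning a handful of inter-fire intervals into a posterior for a fire return interval, assuming exponentially distributed intervals and a conjugate Gamma prior on the fire rate. Interval values and prior parameters are placeholders.

```python
import numpy as np

intervals = np.array([180., 240., 150., 310., 220.])   # years between local fires (placeholder)
a0, b0 = 1.0, 100.0                                     # weak Gamma(a0, b0) prior on the fire rate
a_post = a0 + len(intervals)                            # conjugate update: Gamma(a0 + n, b0 + sum)
b_post = b0 + intervals.sum()

ri = 1.0 / np.random.gamma(a_post, 1.0 / b_post, 10000)  # Monte Carlo return-interval samples
print("posterior mean return interval (yr):", ri.mean())
print("95% credible interval:", np.percentile(ri, [2.5, 97.5]))
```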

  6. Electron acceleration by an obliquely propagating electromagnetic wave in the regime of validity of the Fokker-Planck-Kolmogorov approach

    Science.gov (United States)

    Hizanidis, Kyriakos; Vlahos, L.; Polymilis, C.

    1989-01-01

    The relativistic motion of an ensemble of electrons in an intense monochromatic electromagnetic wave propagating obliquely in a uniform external magnetic field is studied. The problem is formulated from the viewpoint of Hamiltonian theory and the Fokker-Planck-Kolmogorov approach analyzed by Hizanidis (1989), leading to a one-dimensional diffusive acceleration along paths of constant zeroth-order generalized Hamiltonian. For values of the wave amplitude and the propagation angle inside the analytically predicted stochastic region, the numerical results suggest that the diffusion process proceeds in stages. In the first stage, the electrons are accelerated to relatively high energies by sampling the first few overlapping resonances one by one. During that stage, the ensemble-averaged square deviations of the variables involved scale quadratically with time. During the second stage, they scale linearly with time. For much longer times, deviation from linear scaling slowly sets in.

  7. Mastectomy or breast conserving surgery? Factors affecting type of surgical treatment for breast cancer – a classification tree approach

    International Nuclear Information System (INIS)

    Martin, Michael A; Meyricke, Ramona; O'Neill, Terry; Roberts, Steven

    2006-01-01

    A critical choice facing breast cancer patients is which surgical treatment – mastectomy or breast conserving surgery (BCS) – is most appropriate. Several studies have investigated factors that impact the type of surgery chosen, identifying features such as place of residence, age at diagnosis, tumor size, socio-economic and racial/ethnic elements as relevant. Such assessment of 'propensity' is important in understanding issues such as a reported under-utilisation of BCS among women for whom such treatment was not contraindicated. Using Western Australian (WA) data, we further examine the factors associated with the type of surgical treatment for breast cancer using a classification tree approach. This approach deals naturally with complicated interactions between factors, and so allows flexible and interpretable models for treatment choice to be built that add to the current understanding of this complex decision process. Data was extracted from the WA Cancer Registry on women diagnosed with breast cancer in WA from 1990 to 2000. Subjects' treatment preferences were predicted from covariates using both classification trees and logistic regression. Tumor size was the primary determinant of patient choice, subjects with tumors smaller than 20 mm in diameter preferring BCS. For subjects with tumors greater than 20 mm in diameter factors such as patient age, nodal status, and tumor histology become relevant as predictors of patient choice. Classification trees perform as well as logistic regression for predicting patient choice, but are much easier to interpret for clinical use. The selected tree can inform clinicians' advice to patients
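
    A minimal sketch of the kind of classification tree used above; the covariates and the toy records are placeholders, not the Western Australian Cancer Registry data, and the depth limit is an illustrative choice.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

df = pd.DataFrame({
    "tumor_size_mm": [12, 35, 18, 50, 9, 28],
    "age":           [45, 62, 51, 70, 38, 66],
    "node_positive": [0, 1, 0, 1, 0, 1],
    "surgery":       ["BCS", "mastectomy", "BCS", "mastectomy", "BCS", "mastectomy"],
})

tree = DecisionTreeClassifier(max_depth=3).fit(
    df[["tumor_size_mm", "age", "node_positive"]], df["surgery"])
# print the fitted splits in a human-readable form for clinical interpretation
print(export_text(tree, feature_names=["tumor_size_mm", "age", "node_positive"]))
```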

  8. MRI-based treatment plan simulation and adaptation for ion radiotherapy using a classification-based approach

    International Nuclear Information System (INIS)

    Rank, Christopher M; Tremmel, Christoph; Hünemohr, Nora; Nagel, Armin M; Jäkel, Oliver; Greilich, Steffen

    2013-01-01

    In order to benefit from the highly conformal irradiation of tumors in ion radiotherapy, sophisticated treatment planning and simulation are required. The purpose of this study was to investigate the potential of MRI for ion radiotherapy treatment plan simulation and adaptation using a classification-based approach. Firstly, a voxelwise tissue classification was applied to derive pseudo CT numbers from MR images using up to 8 contrasts. Appropriate MR sequences and parameters were evaluated in cross-validation studies of three phantoms. Secondly, ion radiotherapy treatment plans were optimized using both MRI-based pseudo CT and reference CT and recalculated on reference CT. Finally, a target shift was simulated and a treatment plan adapted to the shift was optimized on a pseudo CT and compared to reference CT optimizations without plan adaptation. The derivation of pseudo CT values led to mean absolute errors in the range of 81-95 HU. Most significant deviations appeared at borders between air and different tissue classes and originated from partial volume effects. Simulations of ion radiotherapy treatment plans using pseudo CT for optimization revealed only small underdosages in distal regions of a target volume, with deviations of the mean PTV dose between 1.4% and 3.1% compared to reference CT optimizations. A plan adapted to the target volume shift and optimized on the pseudo CT exhibited target dose coverage comparable to a non-adapted plan optimized on a reference CT. We were able to show that an MRI-based derivation of pseudo CT values using a purely statistical classification approach is feasible although no physical relationship exists. Large errors appeared at compact bone classes and came from an imperfect distinction of bones and other tissue types in MRI. In simulations of treatment plans, it was demonstrated that these deviations are comparable to uncertainties of a target volume shift of 2 mm in two directions indicating that especially

  9. Evolutionary Feature Selection for Big Data Classification: A MapReduce Approach

    Directory of Open Access Journals (Sweden)

    Daniel Peralta

    2015-01-01

    Full Text Available Nowadays, many disciplines have to deal with big datasets that additionally involve a high number of features. Feature selection methods aim at eliminating noisy, redundant, or irrelevant features that may deteriorate the classification performance. However, traditional methods lack enough scalability to cope with datasets of millions of instances and extract successful results in a delimited time. This paper presents a feature selection algorithm based on evolutionary computation that uses the MapReduce paradigm to obtain subsets of features from big datasets. The algorithm decomposes the original dataset into blocks of instances to learn from them in the map phase; then, the reduce phase merges the obtained partial results into a final vector of feature weights, which allows a flexible application of the feature selection procedure using a threshold to determine the selected subset of features. The feature selection method is evaluated by using three well-known classifiers (SVM, Logistic Regression, and Naive Bayes) implemented within the Spark framework to address big data problems. In the experiments, datasets of up to 67 million instances and up to 2000 attributes have been managed, showing that this is a suitable framework to perform evolutionary feature selection, improving both the classification accuracy and its runtime when dealing with big data problems.
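
    A minimal local sketch of the map/reduce structure described above. The paper runs an evolutionary search over each block inside Spark; here a plain correlation score stands in for the per-block search, the blocks are split locally rather than distributed, and the median threshold is an illustrative choice.

```python
import numpy as np

def map_phase(X_block, y_block):
    # placeholder per-block feature scoring; the paper uses an evolutionary
    # algorithm here instead of a simple correlation score
    return np.abs(np.corrcoef(X_block.T, y_block)[:-1, -1])

def reduce_phase(partial_weights):
    return np.mean(partial_weights, axis=0)           # merged feature-weight vector

X = np.random.rand(10000, 50)                         # placeholder dataset
y = np.random.randint(0, 2, 10000)
blocks = np.array_split(np.arange(len(y)), 8)         # 8 blocks of instances

weights = reduce_phase([map_phase(X[b], y[b]) for b in blocks])
selected = np.where(weights > np.median(weights))[0]  # threshold-based selection
print(len(selected), "features kept")
```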

  10. Deep Learning Approach for Automatic Classification of Ocular and Cardiac Artifacts in MEG Data

    Directory of Open Access Journals (Sweden)

    Ahmad Hasasneh

    2018-01-01

    Full Text Available We propose an artifact classification scheme based on a combined deep and convolutional neural network (DCNN) model, to automatically identify cardiac and ocular artifacts from neuromagnetic data, without the need for additional electrocardiogram (ECG) and electrooculogram (EOG) recordings. From independent components, the model uses both the spatial and temporal information of the decomposed magnetoencephalography (MEG) data. In total, 7122 samples were used after data augmentation, in which task and nontask related MEG recordings from 48 subjects served as the database for this study. Artifact rejection was applied using the combined model, which achieved a sensitivity and specificity of 91.8% and 97.4%, respectively. The overall accuracy of the model was validated using a cross-validation test and revealed a median accuracy of 94.4%, indicating high reliability of the DCNN-based artifact removal in task and nontask related MEG experiments. The major advantages of the proposed method are as follows: (1) it is a fully automated and user independent workflow of artifact classification in MEG data; (2) once the model is trained there is no need for auxiliary signal recordings; (3) the flexibility in the model design and training allows for various modalities (MEG/EEG) and various sensor types.

  11. Approach for Text Classification Based on the Similarity Measurement between Normal Cloud Models

    Directory of Open Access Journals (Sweden)

    Jin Dai

    2014-01-01

    Full Text Available The similarity between objects is a core research area of data mining. In order to reduce the interference of the uncertainty of natural language, a similarity measurement between normal cloud models is adopted for text classification research. On this basis, a novel text classifier based on cloud concept jumping up (CCJU-TC) is proposed. It can efficiently accomplish the conversion between qualitative concepts and quantitative data. Through the conversion from a text set to a text information table based on the VSM model, the qualitative concepts extracted from texts of the same category are jumped up into a whole category concept. According to the cloud similarity between the test text and each category concept, the test text is assigned to the most similar category. Comparison with different text classifiers over different feature selection sets shows that CCJU-TC not only has a strong ability to adapt to different text features, but also achieves better classification performance than the traditional classifiers.

  12. Audio Classification in Speech and Music: A Comparison between a Statistical and a Neural Approach

    Directory of Open Access Journals (Sweden)

    Alessandro Bugatti

    2002-04-01

    Full Text Available We focus attention on the problem of audio classification into speech and music for multimedia applications. In particular, we present a comparison between two different techniques for speech/music discrimination. The first method is based on the zero crossing rate and Bayesian classification. It is very simple from a computational point of view, and gives good results in the case of pure music or speech. The simulation results show that some performance degradation arises when the music segment also contains some speech superimposed on music, or strong rhythmic components. To overcome these problems, we propose a second method that uses more features and is based on neural networks (specifically a multi-layer Perceptron). In this case we obtain better performance, at the expense of a limited growth in computational complexity. In practice, the proposed neural network is simple to implement if a suitable polynomial is used as the activation function, and a real-time implementation is possible even on low-cost embedded systems.
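
    A minimal sketch of the first, zero-crossing-rate-based discriminator described above. The paper applies a Bayesian decision rule to ZCR statistics; the fixed threshold, frame length and toy signal below are simplifying assumptions for illustration only.

```python
import numpy as np

def zero_crossing_rate(frame):
    """Fraction of adjacent samples whose sign differs."""
    signs = np.signbit(frame).astype(np.int8)
    return np.mean(np.abs(np.diff(signs)))

def classify(signal, frame_len=1024, zcr_threshold=0.12):
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, frame_len)]
    zcr = np.array([zero_crossing_rate(f) for f in frames])
    # speech tends to show a higher and more variable ZCR than music
    return "speech" if zcr.mean() > zcr_threshold else "music"

print(classify(np.random.randn(16000)))   # toy signal, not real audio
```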

  13. A Cluster-then-label Semi-supervised Learning Approach for Pathology Image Classification.

    Science.gov (United States)

    Peikari, Mohammad; Salama, Sherine; Nofech-Mozes, Sharon; Martel, Anne L

    2018-05-08

    Completely labeled pathology datasets are often challenging and time-consuming to obtain. Semi-supervised learning (SSL) methods are able to learn from fewer labeled data points with the help of a large number of unlabeled data points. In this paper, we investigated the possibility of using clustering analysis to identify the underlying structure of the data space for SSL. A cluster-then-label method was proposed to identify high-density regions in the data space which were then used to help a supervised SVM in finding the decision boundary. We have compared our method with other supervised and semi-supervised state-of-the-art techniques using two different classification tasks applied to breast pathology datasets. We found that compared with other state-of-the-art supervised and semi-supervised methods, our SSL method is able to improve classification performance when a limited number of labeled data instances are made available. We also showed that it is important to examine the underlying distribution of the data space before applying SSL techniques to ensure semi-supervised learning assumptions are not violated by the data.
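
    A minimal sketch of a cluster-then-label scheme in the spirit described above, not the authors' implementation: unlabeled points are clustered together with the labeled ones, each cluster inherits the majority label of the labeled points it contains, and the enlarged labeled set then trains an SVM. Cluster count and data are placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def cluster_then_label(X_lab, y_lab, X_unlab, n_clusters=10):
    X_all = np.vstack([X_lab, X_unlab])
    clusters = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X_all)
    lab_clusters = clusters[: len(X_lab)]
    pseudo = np.empty(len(X_unlab), dtype=y_lab.dtype)
    for c in range(n_clusters):
        members = lab_clusters == c
        majority = (np.bincount(y_lab[members]).argmax()
                    if members.any() else np.bincount(y_lab).argmax())
        pseudo[clusters[len(X_lab):] == c] = majority   # label unlabeled points by cluster
    return SVC().fit(X_all, np.concatenate([y_lab, pseudo]))

clf = cluster_then_label(np.random.rand(30, 5), np.random.randint(0, 2, 30),
                         np.random.rand(300, 5))
```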

  14. Prediction of pediatric unipolar depression using multiple neuromorphometric measurements: a pattern classification approach.

    Science.gov (United States)

    Wu, Mon-Ju; Wu, Hanjing Emily; Mwangi, Benson; Sanches, Marsal; Selvaraj, Sudhakar; Zunta-Soares, Giovana B; Soares, Jair C

    2015-03-01

    Diagnosis of pediatric neuropsychiatric disorders such as unipolar depression is largely based on clinical judgment, without objective biomarkers to guide the diagnostic process and subsequent therapeutic interventions. Neuroimaging studies have previously reported average group-level neuroanatomical differences between patients with pediatric unipolar depression and healthy controls. In the present study, we investigated the utility of multiple neuromorphometric indices in distinguishing pediatric unipolar depression patients from healthy controls at an individual subject level. We acquired structural T1-weighted scans from 25 pediatric unipolar depression patients and 26 demographically matched healthy controls. Multiple neuromorphometric indices such as cortical thickness, volume, and cortical folding patterns were obtained. A support vector machine pattern classification model was 'trained' to distinguish individual subjects with pediatric unipolar depression from healthy controls based on multiple neuromorphometric indices, and the model's predictive validity (sensitivity and specificity) was calculated. The model correctly identified 40 out of 51 subjects, translating to 78.4% accuracy, 76.0% sensitivity and 80.8% specificity, chi-square p-value = 0.000049. Volumetric and cortical folding abnormalities in the right thalamus and right temporal pole, respectively, were most central in distinguishing individual patients with pediatric unipolar depression from healthy controls. These findings provide evidence that a support vector machine pattern classification model using multiple neuromorphometric indices may qualify as a diagnostic marker for pediatric unipolar depression. In addition, our results identified the most relevant neuromorphometric features in distinguishing PUD patients from healthy controls. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. An improved strategy for skin lesion detection and classification using uniform segmentation and feature selection based approach.

    Science.gov (United States)

    Nasir, Muhammad; Attique Khan, Muhammad; Sharif, Muhammad; Lali, Ikram Ullah; Saba, Tanzila; Iqbal, Tassawar

    2018-02-21

    Melanoma is the deadliest type of skin cancer, with the highest mortality rate. However, eradication at an early stage implies a high survival rate; therefore, it demands early diagnosis. The customary diagnostic methods are costly and cumbersome due to the involvement of experienced experts as well as the requirement for a highly equipped environment. The recent advancements in computerized solutions for these diagnoses are highly promising, with improved accuracy and efficiency. In this article, we propose a method for the classification of melanoma and benign skin lesions. Our approach integrates preprocessing, lesion segmentation, feature extraction, feature selection, and classification. Preprocessing is executed in the context of hair removal by DullRazor, whereas lesion texture and color information are utilized to enhance the lesion contrast. In lesion segmentation, a hybrid technique has been implemented and the results are fused using the additive law of probability. A serial-based method is applied subsequently that extracts and fuses traits such as color, texture, and HOG (shape). The fused features are selected afterwards by implementing a novel Boltzmann Entropy method. Finally, the selected features are classified by a Support Vector Machine. The proposed method is evaluated on the publicly available PH2 data set. Our approach has provided promising results of sensitivity 97.7%, specificity 96.7%, accuracy 97.5%, and F-score 97.5%, which are significantly better than the results of existing methods available on the same data set. The proposed method detects and classifies melanoma significantly better than existing methods. © 2018 Wiley Periodicals, Inc.

  16. Evolving towards a critical point: A possible electromagnetic way in which the critical regime is reached as the rupture approaches

    Directory of Open Access Journals (Sweden)

    P. G. Kapiris

    2003-01-01

    Full Text Available In analogy to the study of critical phase transitions in statistical physics, it has been argued recently that the fracture of heterogeneous materials could be viewed as a critical phenomenon, either at laboratory or at geophysical scales. If the picture of the development of the fracture is correct, one may guess that the precursors may reveal the critical approach of the main-shock. When a heterogeneous material is stretched, its evolution towards breaking is characterized by the appearance of microcracks before the final break-up. Microcracks produce both acoustic and electromagnetic (EM) emission in the frequency range from VLF to VHF. The microcracks and the associated acoustic and EM activities constitute the so-called precursors of general fracture. These precursors are detectable not only at laboratory but also at geophysical scales. VLF and VHF acoustic and EM emissions have been reported resulting from volcanic and seismic activities in various geologically distinct regions of the world. In the present work we attempt to establish the hypothesis that the evolution of the Earth's crust towards the critical point takes place not only in a mechanical but also in an electromagnetic sense. In other words, we focus on the possible electromagnetic criticality, which is reached while the catastrophic rupture in the Earth's crust approaches. Our main tool is the monitoring of micro-fractures that occur before the final breakup, by recording their radio-electromagnetic emissions. We show that the spectral power law analysis of the electromagnetic precursors reveals distinguishing signatures of underlying critical dynamics, such as: (i) the emergence of memory effects; (ii) the decrease with time of the anti-persistence behaviour; (iii) the presence of persistence properties in the tail of the sequence of the precursors; and (iv) the acceleration of the precursory electromagnetic energy release. Moreover, the statistical analysis of the amplitudes of

  17. Multi-fluid Approach to High-frequency Waves in Plasmas. II. Small-amplitude Regime in Partially Ionized Media

    Energy Technology Data Exchange (ETDEWEB)

    Martínez-Gómez, David; Soler, Roberto; Terradas, Jaume, E-mail: david.martinez@uib.es [Departament de Física, Universitat de les Illes Balears, E-07122, Palma de Mallorca (Spain)

    2017-03-01

    The presence of neutral species in a plasma has been shown to greatly affect the properties of magnetohydrodynamic waves. For instance, the interaction between ions and neutrals through momentum transfer collisions causes the damping of Alfvén waves and alters their oscillation frequency and phase speed. When the collision frequencies are larger than the frequency of the waves, single-fluid magnetohydrodynamic approximations can accurately describe the effects of partial ionization, since there is a strong coupling between the various species. However, at higher frequencies, the single-fluid models are not applicable and more complex approaches are required. Here, we use a five-fluid model with three ionized and two neutral components, which takes into consideration Hall’s current and Ohm’s diffusion in addition to the friction due to collisions between different species. We apply our model to plasmas composed of hydrogen and helium, and allow the ionization degree to be arbitrary. By analyzing the corresponding dispersion relation and numerical simulations, we study the properties of small-amplitude perturbations. We discuss the effect of momentum transfer collisions on the ion-cyclotron resonances and compare the importance of magnetic resistivity, and ion–neutral and ion–ion collisions on the wave damping at various frequency ranges. Applications to partially ionized plasmas of the solar atmosphere are performed.

  18. MULTI-FLUID APPROACH TO HIGH-FREQUENCY WAVES IN PLASMAS. I. SMALL-AMPLITUDE REGIME IN FULLY IONIZED MEDIUM

    Energy Technology Data Exchange (ETDEWEB)

    Martínez-Gómez, David; Soler, Roberto; Terradas, Jaume, E-mail: david.martinez@uib.es [Departament de Física, Universitat de les Illes Balears, E-07122, Palma de Mallorca (Spain)

    2016-12-01

    Ideal magnetohydrodynamics (MHD) provides an accurate description of low-frequency Alfvén waves in fully ionized plasmas. However, higher-frequency waves in many plasmas of the solar atmosphere cannot be correctly described by ideal MHD and a more accurate model is required. Here, we study the properties of small-amplitude incompressible perturbations in both the low- and the high-frequency ranges in plasmas composed of several ionized species. We use a multi-fluid approach and take into account the effects of collisions between ions and the inclusion of Hall’s term in the induction equation. Through the analysis of the corresponding dispersion relations and numerical simulations, we check that at high frequencies ions of different species are not as strongly coupled as in the low-frequency limit. Hence, they cannot be treated as a single fluid. In addition, elastic collisions between the distinct ionized species are not negligible for high-frequency waves, since an appreciable damping is obtained. Furthermore, Coulomb collisions between ions remove the cyclotron resonances and the strict cutoff regions, which are present when collisions are not taken into account. The implications of these results for the modeling of high-frequency waves in solar plasmas are discussed.

  19. Global and local approaches of fracture in the ductile to brittle regime of a low alloy steel

    International Nuclear Information System (INIS)

    Renevey, S.

    1998-01-01

    The study is a contribution to the prediction of the fracture toughness of low alloy steel and to a better knowledge of fracture behavior in the ductile to brittle transition region. Experiments were performed on a nozzle cut off from a pressurized water reactor vessel made of A508 Cl3 type steel. Axisymmetrical notched specimens were tested to study the fracture onset in a volume element, while pre-cracked specimens were used to investigate cleavage fracture after stable crack growth. Systematic observations of fracture surfaces showed manganese sulfide inclusions (MnS) at cleavage sites or in their vicinity. The experimental results were used for modelling by the local approach to fracture. In a volume element the fracture is described by an original probabilistic model. This model is based on volume fraction distributions of MnS inclusions gathered in clusters and on the assumption of a competition without interaction between ductile and cleavage fracture modes. This model was applied to pre-cracked specimens (CT specimens). It is able to describe the scatter in the toughness after small stable crack growth if a temperature effect on the cleavage stress is assumed. So, the modelling is able to give a lower bound of fracture toughness as a function of temperature. (author)

  20. Classification of breast cancer patients using somatic mutation profiles and machine learning approaches.

    Science.gov (United States)

    Vural, Suleyman; Wang, Xiaosheng; Guda, Chittibabu

    2016-08-26

    The high degree of heterogeneity observed in breast cancers makes it very difficult to classify the cancer patients into distinct clinical subgroups and consequently limits the ability to devise effective therapeutic strategies. Several classification strategies based on ER/PR/HER2 expression or the expression profiles of a panel of genes have helped, but such methods often produce misleading results due to their dynamic nature. In contrast, somatic DNA mutations are relatively stable and lead to initiation and progression of many sporadic cancers. Hence in this study, we explore the use of gene mutation profiles to classify, characterize and predict the subgroups of breast cancers. We analyzed the whole exome sequencing data from 358 ethnically similar breast cancer patients in The Cancer Genome Atlas (TCGA) project. Somatic and non-synonymous single nucleotide variants identified from each patient were assigned a quantitative score (C-score) that represents the extent of negative impact on the gene function. Using these scores with the non-negative matrix factorization method, we clustered the patients into three subgroups. By comparing the clinical stage of patients, we identified an early-stage-enriched and a late-stage-enriched subgroup. Comparison of the mutation scores of early and late-stage-enriched subgroups identified 358 genes that carry significantly higher mutation rates in the late-stage subgroup. Functional characterization of these genes revealed important functional gene families that carry a heavy mutational load in the late-stage-enriched subgroup of patients. Finally, using the identified subgroups, we also developed a supervised classification model to predict the stage of the patients. This study demonstrates that gene mutation profiles can be effectively used with unsupervised machine-learning methods to identify clinically distinguishable breast cancer subgroups. The classification model developed in this method could provide a reasonable
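
    A minimal sketch of the NMF-based subgrouping described above: each patient is assigned to the component on which they load most strongly. The score matrix below is a random placeholder for the patient-by-gene C-score matrix, and the initialization and iteration settings are illustrative.

```python
import numpy as np
from sklearn.decomposition import NMF

scores = np.random.rand(358, 5000)        # placeholder C-score matrix (patients x genes)
nmf = NMF(n_components=3, init="nndsvda", max_iter=500)
W = nmf.fit_transform(scores)             # patient loadings on 3 components
subgroup = W.argmax(axis=1)               # subgroup = strongest component per patient
print(np.bincount(subgroup))              # patients per subgroup
```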

  1. An approach to quality classification of deep groundwaters in Sweden and Finland

    International Nuclear Information System (INIS)

    Laaksoharju, M.; Smellie, J.; Ruotsalainen, P.; Snellman, M.

    1993-11-01

    In Sweden and Finland high quality groundwater samples are required in the site characterization programmes relating to safe disposal of spent nuclear fuel. SKB (Swedish Nuclear Fuel and Waste Management Co.) and TVO (Teollisuuden Voima Oy, Finland) initiated a cooperative task to critically evaluate the quality of the earlier sampling programmes and to further develop the understanding of the quality or representativeness of the groundwater samples. The major aim of this report has therefore been to attempt to classify groundwaters from site investigations in Sweden and Finland based on quality. Different classification systems have been tested and developed. These can be divided into two main groups: manual methods and computer-based mathematical methods. Manual, statistical, mixing ratio and scoring systems have all been used to illustrate the difficulty in judging groundwater quality. (28 refs., 19 figs., 11 tabs.)

  2. From "tactical discussion" in collaboration game to "behaviors": A classification approach in stages.

    Directory of Open Access Journals (Sweden)

    Francisco Serrano

    2018-03-01

    Full Text Available The analysis of group dynamics is extremely useful for understanding and predicting the performance of teams, since in this context collaboration problems can naturally arise. Artificial intelligence, and especially machine learning techniques, enable automating the observation process and the analysis of groups of users who use an online collaborative platform. Among the online collaborative platforms available, games are an attractive alternative for all audiences, enabling the capture of players' behavior by observing their social interactions while engaging them in a pleasant activity. In this paper, we present experimental results of classifying observed conversations in an online game into collaborative behaviors, guided by the Interaction Process Analysis, a theory for categorizing social interactions. The proposed automation of the classification process can be used to assist teachers or team leaders to detect alterations in the balance of group reactions and to improve their performance by suggesting actions to restore the balance.

  3. Current Approaches to Diagnosis and Classification Features of Neuroosteoarthropathy Charcot (literature review)

    Directory of Open Access Journals (Sweden)

    Balatiuk Irina

    2016-12-01

    Full Text Available The article analyzes publications by domestic and foreign authors on diabetic Charcot osteoarthropathy, a complication of diabetes. It presents the current domestic classification of diabetic foot syndrome. Diabetic foot syndrome is a serious medical and social problem; owing to the high level of disability among patients, it causes significant social and economic losses to society. The pathogenetic basis for the development of diabetic osteoarthropathy is a combination of uncontrolled bone resorption and a lack of protective sensitivity, which leads to the destruction of joints. The gold standard for diagnosis of Charcot osteoarthropathy is X-ray densitometry, which allows objective assessment of bone mineral density.

  4. Different approaches for the texture classification of a remote sensing image bank

    Science.gov (United States)

    Durand, Philippe; Brunet, Gerard; Ghorbanzadeh, Dariush; Jaupi, Luan

    2018-04-01

    In this paper, we summarize and compare two different approaches used by the authors to classify different natural textures. The first approach, which is simple and inexpensive in computing time, uses an image data bank and an expert system able to classify different textures from a number of rules established by discipline specialists. The second method uses the same database and a neural network approach.

  5. Classification of suicide attempters in schizophrenia using sociocultural and clinical features: A machine learning approach.

    Science.gov (United States)

    Hettige, Nuwan C; Nguyen, Thai Binh; Yuan, Chen; Rajakulendran, Thanara; Baddour, Jermeen; Bhagwat, Nikhil; Bani-Fatemi, Ali; Voineskos, Aristotle N; Mallar Chakravarty, M; De Luca, Vincenzo

    2017-07-01

    Suicide is a major concern for those afflicted by schizophrenia. Identifying patients at the highest risk for future suicide attempts remains a complex problem for psychiatric interventions. Machine learning models allow for the integration of many risk factors in order to build an algorithm that predicts which patients are likely to attempt suicide. Currently it is unclear how to integrate previously identified risk factors into a clinically relevant predictive tool to estimate the probability of a patient with schizophrenia for attempting suicide. We conducted a cross-sectional assessment on a sample of 345 participants diagnosed with schizophrenia spectrum disorders. Suicide attempters and non-attempters were clearly identified using the Columbia Suicide Severity Rating Scale (C-SSRS) and the Beck Suicide Ideation Scale (BSS). We developed four classification algorithms using a regularized regression, random forest, elastic net and support vector machine models with sociocultural and clinical variables as features to train the models. All classification models performed similarly in identifying suicide attempters and non-attempters. Our regularized logistic regression model demonstrated an accuracy of 67% and an area under the curve (AUC) of 0.71, while the random forest model demonstrated 66% accuracy and an AUC of 0.67. Support vector classifier (SVC) model demonstrated an accuracy of 67% and an AUC of 0.70, and the elastic net model demonstrated an accuracy of 65% and an AUC of 0.71. Machine learning algorithms offer a relatively successful method for incorporating many clinical features to predict individuals at risk for future suicide attempts. Increased performance of these models using clinically relevant variables offers the potential to facilitate early treatment and intervention to prevent future suicide attempts. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. A Probabilistic Model for Diagnosing Misconceptions by a Pattern Classification Approach.

    Science.gov (United States)

    Tatsuoka, Kikumi K.

    A probabilistic approach is introduced to classify and diagnose erroneous rules of operation resulting from a variety of misconceptions ("bugs") in a procedural domain of arithmetic. The model is contrasted with the deterministic approach which has commonly been used in the field of artificial intelligence, and the advantage of treating the…

  7. Post-classification approaches to estimating change in forest area using remotely sensed auxiliary data.

    Science.gov (United States)

    Ronald E. McRoberts

    2014-01-01

    Multiple remote sensing-based approaches to estimating gross afforestation, gross deforestation, and net deforestation are possible. However, many of these approaches have severe data requirements in the form of long time series of remotely sensed data and/or large numbers of observations of land cover change to train classifiers and assess the accuracy of...

  8. A Quantum Hybrid PSO Combined with Fuzzy k-NN Approach to Feature Selection and Cell Classification in Cervical Cancer Detection

    Directory of Open Access Journals (Sweden)

    Abdullah M. Iliyasu

    2017-12-01

    Full Text Available A quantum hybrid (QH) intelligent approach that blends the adaptive search capability of the quantum-behaved particle swarm optimisation (QPSO) method with the intuitionistic rationality of the traditional fuzzy k-nearest neighbours (Fuzzy k-NN) algorithm (known simply as the Q-Fuzzy approach) is proposed for efficient feature selection and classification of cells in cervical smear (CS) images. From an initial multitude of 17 features describing the geometry, colour, and texture of the CS images, the QPSO stage of our proposed technique is used to select the best feature subset (i.e., global best particles), which represents a pruned-down collection of seven features. Using a dataset of almost 1000 images, performance evaluation of our proposed Q-Fuzzy approach assesses the impact of our feature selection on classification accuracy by way of three experimental scenarios that are compared alongside two other approaches: the All-features approach (i.e., classification without prior feature selection) and another hybrid technique combining the standard PSO algorithm with the Fuzzy k-NN technique (the P-Fuzzy approach). In the first and second scenarios, we further divided the assessment criteria into classification accuracy based on the choice of best features and classification accuracy for the different categories of cervical cells. In the third scenario, we introduced new QH hybrid techniques, i.e., QPSO combined with other supervised learning methods, and compared their classification accuracy alongside our proposed Q-Fuzzy approach. Furthermore, we employed statistical approaches to establish qualitative agreement with regard to the feature selection in experimental scenarios 1 and 3. The synergy between the QPSO and Fuzzy k-NN in the proposed Q-Fuzzy approach improves classification accuracy, as manifested in the reduction in the number of cell features, which is crucial for effective cervical cancer detection and diagnosis.
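
    As a rough illustration of the pipeline (a population-style search over feature subsets, scored by a k-NN classifier), the sketch below substitutes a plain random search over binary masks for the quantum-behaved PSO and scikit-learn's crisp k-NN for the fuzzy k-NN; the 17 cell descriptors and the cervical-smear dataset of the paper are replaced by synthetic data.

```python
# Sketch: population-style search over binary feature masks, scored by k-NN accuracy.
# Random search stands in for QPSO; crisp k-NN stands in for fuzzy k-NN.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=17, n_informative=7, random_state=0)

def score(mask):
    """Cross-validated k-NN accuracy using only the features selected by the mask."""
    if not mask.any():
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(knn, X[:, mask], y, cv=5).mean()

best_mask, best_score = None, -1.0
for _ in range(200):                        # candidate "particles"
    mask = rng.random(X.shape[1]) < 0.5     # random binary feature mask
    s = score(mask)
    if s > best_score:
        best_mask, best_score = mask, s

print("selected features:", np.flatnonzero(best_mask), "cv accuracy:", round(best_score, 3))
```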

  9. Hybrid three-dimensional and support vector machine approach for automatic vehicle tracking and classification using a single camera

    Science.gov (United States)

    Kachach, Redouane; Cañas, José María

    2016-05-01

    Using video in traffic monitoring is one of the most active research domains in the computer vision community. TrafficMonitor, a system that employs a hybrid approach for automatic vehicle tracking and classification on highways using a simple stationary calibrated camera, is presented. The proposed system consists of three modules: vehicle detection, vehicle tracking, and vehicle classification. Moving vehicles are detected by an enhanced Gaussian mixture model background estimation algorithm. The design includes a technique to resolve the occlusion problem by combining a two-dimensional proximity tracking algorithm with the Kanade-Lucas-Tomasi (KLT) feature tracking algorithm. The last module classifies the identified shapes into five vehicle categories (motorcycle, car, van, bus, and truck) by using three-dimensional templates and an algorithm based on the histogram of oriented gradients and a support vector machine classifier. Several experiments have been performed using both real and simulated traffic in order to validate the system. The experiments were conducted on the GRAM-RTM dataset and on a dedicated real video dataset that is made publicly available as part of this work.
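
    A minimal sketch of the detection and classification stages follows, assuming OpenCV 4 and scikit-image: a Gaussian-mixture background subtractor (MOG2) stands in for the enhanced GMM of TrafficMonitor, and HOG features with a linear SVM stand in for the 3-D template plus HOG/SVM module. The video path and training crops are placeholders, not the GRAM-RTM data, and the tracking/occlusion module is omitted.

```python
# Sketch: foreground detection with a Gaussian mixture background model,
# followed by HOG + linear SVM classification of the detected blobs.
# "video.avi" and the training crops are placeholders, not the GRAM-RTM data.
import cv2
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import LinearSVC

def hog_features(gray_crop):
    """HOG descriptor of a blob, resized to a fixed patch size."""
    patch = resize(gray_crop, (64, 64))
    return hog(patch, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

# Assume train_crops / train_labels were built beforehand from labelled vehicle images:
# svm = LinearSVC().fit([hog_features(c) for c in train_crops], train_labels)

subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
cap = cv2.VideoCapture("video.avi")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 900:                      # ignore small blobs
            continue
        gray = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        feats = hog_features(gray)
        # label = svm.predict([feats])[0]    # e.g. motorcycle / car / van / bus / truck
cap.release()
```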

  10. Supervised pre-processing approaches in multiple class variables classification for fish recruitment forecasting

    KAUST Repository

    Fernandes, José Antonio; Lozano, Jose A.; Inza, Iñaki; Irigoien, Xabier; Pérez, Aritz; Rodríguez, Juan Diego

    2013-01-01

    A multi-species approach to fisheries management requires taking into account the interactions between species in order to improve recruitment forecasting of the fish species. Recent advances in Bayesian networks direct the learning of models

  11. City housing atmospheric pollutant impact on emergency visit for asthma: A classification and regression tree approach.

    Science.gov (United States)

    Mazenq, Julie; Dubus, Jean-Christophe; Gaudart, Jean; Charpin, Denis; Viudes, Gilles; Noel, Guilhem

    2017-11-01

    Particulate matter, nitrogen dioxide (NO2) and ozone are recognized as the three pollutants that most significantly affect human health. Asthma is a multifactorial disease. However, the place of residence has rarely been investigated. We compared the impact of air pollution, measured near patients' homes, on emergency department (ED) visits for asthma or trauma (controls) within the Provence-Alpes-Côte-d'Azur region. Variables were selected using classification and regression trees on the asthmatic and control populations, aged 3-99 years, visiting the ED from January 1 to December 31, 2013. Then, in a nested case-control study, randomization was based on the day of the ED visit and on defined age groups. Pollution, meteorological, pollen and viral data measured that day were linked to the patient's ZIP code. A total of 794,884 visits were reported, including 6250 for asthma and 278,192 for trauma. Factors associated with an excess risk of an emergency visit for asthma included short-term exposure to NO2, female gender, high viral load and a combination of low temperature and high humidity. Short-term exposures to high NO2 concentrations, as assessed close to the homes of the patients, were significantly associated with asthma-related ED visits in children and adults. Copyright © 2017 Elsevier Ltd. All rights reserved.
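
    The variable-selection step can be sketched with scikit-learn's CART implementation. The feature names and data below are synthetic placeholders, not the study's exposure, weather or viral measurements.

```python
# Sketch: CART used to flag factors associated with asthma vs. trauma ED visits.
# The features and data here are synthetic placeholders for the study variables.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "no2": rng.normal(25, 10, n),            # short-term NO2 exposure (placeholder units)
    "temperature": rng.normal(15, 8, n),
    "humidity": rng.uniform(30, 95, n),
    "viral_load": rng.poisson(3, n),
    "female": rng.integers(0, 2, n),
})
# Synthetic outcome: asthma visit more likely with high NO2, cold humid weather, viruses.
logit = 0.05 * df.no2 - 0.04 * df.temperature + 0.02 * df.humidity + 0.2 * df.viral_load + 0.3 * df.female - 3
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(df, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X_train, y_train)
print(export_text(tree, feature_names=list(df.columns)))
print("test accuracy:", tree.score(X_test, y_test))
```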

  12. Review of Cuttability Indices and A New Rockmass Classification Approach for Selection of Surface Miners

    Science.gov (United States)

    Dey, Kaushik; Ghose, A. K.

    2011-09-01

    Rock excavation is carried out either by drilling and blasting or by using rock-cutting machines like rippers, bucket wheel excavators, surface miners, road headers etc. The economics of mechanised rock excavation by rock-cutting machines largely depends on the achieved production rates. Thus, assessment of performance (productivity) is important prior to deploying a rock-cutting machine. In doing so, several researchers have classified rockmass in different ways and have developed cuttability indices to correlate with machine performance directly. However, most of these indices were developed to assess the performance of road headers/tunnel-boring machines, apart from a few that were developed in the earlier days when the ripper was a popular piece of excavating equipment. Presently, around 400 surface miners are in operation around the world, amongst which 105 are in India. Until now, no rockmass classification system has been available to assess the performance of surface miners. Surface miners are being deployed largely on a trial-and-error basis or based on the performance charts provided by the manufacturer. In this context, it is logical to establish a suitable cuttability index to predict the performance of surface miners. In the present paper, the existing cuttability indices are reviewed and a new cuttability index is proposed. A new relationship is also developed to predict the output of surface miners using the proposed cuttability index.

  13. Characterization of Schizophrenia Adverse Drug Interactions through a Network Approach and Drug Classification

    Directory of Open Access Journals (Sweden)

    Jingchun Sun

    2013-01-01

    Full Text Available Antipsychotic drugs are medications commonly used for schizophrenia (SCZ) treatment, which include two groups: typical and atypical. SCZ patients have multiple comorbidities, and the coadministration of drugs is quite common. This may result in adverse drug-drug interactions, which are events that occur when the effect of a drug is altered by the coadministration of another drug. Therefore, it is important to provide a comprehensive view of these interactions for further coadministration improvement. Here, we extracted SCZ drugs and their adverse drug interactions from DrugBank and compiled a SCZ-specific adverse drug interaction network. This network included 28 SCZ drugs, 241 non-SCZ drugs, and 991 interactions. By integrating the Anatomical Therapeutic Chemical (ATC) classification with the network analysis, we characterized those interactions. Our results indicated that SCZ drugs tended to have more adverse drug interactions than other drugs. Furthermore, SCZ typical drugs had significant interactions with drugs of the “alimentary tract and metabolism” category, while SCZ atypical drugs had significant interactions with drugs of the “nervous system” and “anti-infectives for systemic use” categories. This study is the first to characterize the adverse drug interactions in the course of SCZ treatment and might provide useful information for future SCZ treatment.

  14. Two-Stage Approach to Image Classification by Deep Neural Networks

    Directory of Open Access Journals (Sweden)

    Ososkov Gennady

    2018-01-01

    Full Text Available The paper demonstrates the advantages of deep learning networks over ordinary neural networks through their comparative application to image classification. An autoassociative neural network is used as a standalone autoencoder for prior extraction of the most informative features of the input data for the neural networks that are then compared as classifiers. The main effort in working with deep learning networks is spent on the painstaking work of optimizing the structures of those networks and their components, such as activation functions and weights, as well as the procedures for minimizing their loss function, in order to improve their performance and speed up their learning time. It is also shown that deep autoencoders develop a remarkable ability to denoise images after being specially trained. Convolutional neural networks are also used to solve a topical problem of protein genetics, using the example of durum wheat classification. The results of our comparative study demonstrate the clear advantage of the deep networks, as well as the denoising power of the autoencoders. In our work we use both GPU and cloud services to speed up the calculations.
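
    The two-stage idea (train an autoencoder first, then classify on the encoded features) can be sketched in Keras as below; MNIST stands in for the wheat-grain images, and the layer sizes are illustrative, not the architecture used by the authors.

```python
# Sketch: stage 1 trains an autoencoder; stage 2 classifies the encoded features.
# MNIST is used as a stand-in dataset; layer sizes are illustrative only.
from tensorflow import keras
from tensorflow.keras import layers

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

# Stage 1: autoassociative network (autoencoder) for unsupervised feature extraction.
inputs = keras.Input(shape=(784,))
code = layers.Dense(64, activation="relu")(layers.Dense(256, activation="relu")(inputs))
decoded = layers.Dense(784, activation="sigmoid")(layers.Dense(256, activation="relu")(code))
autoencoder = keras.Model(inputs, decoded)
encoder = keras.Model(inputs, code)
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(x_train, x_train, epochs=5, batch_size=256, verbose=0)

# Stage 2: a small classifier trained on the encoded features.
clf = keras.Sequential([layers.Dense(64, activation="relu"), layers.Dense(10, activation="softmax")])
clf.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
clf.fit(encoder.predict(x_train, verbose=0), y_train, epochs=5, batch_size=256, verbose=0)
print("test accuracy:", clf.evaluate(encoder.predict(x_test, verbose=0), y_test, verbose=0)[1])
```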

  15. An approach to quality classification of deep groundwaters in Sweden and Finland

    International Nuclear Information System (INIS)

    Laaksoharju, M.; Smellie, J.; Ruotsalainen, P.; Snellman, M.

    1993-11-01

    The quality and representativeness of groundwaters sampled in the Swedish SKB and in the Finnish TVO nuclear waste disposal site investigations have been evaluated. By definition a high quality sample is considered to be the one which best reflects the undisturbed hydrological and geochemical in situ conditions for the sampled section. Manual (expert judgement), statistical multivariate, mixing models and quality scoring system have been used to classify the waters regarding representativity. The constructed scoring system is best suited for quality classification, although the expert judgement is always needed as a complement. The observations are scored on a continuous scale based on the response of selected quality indicating parameters. Less representative samples are not rejected but given a value indicating the confidence of the observation. Finnish data obtained 45% of the possible scores compared to 55% for the Swedish data. The quality is generally 10% higher in the Swedish samples compared to the Finnish samples. The difference in sampling procedure is the probable reason for this

  16. Two-Stage Approach to Image Classification by Deep Neural Networks

    Science.gov (United States)

    Ososkov, Gennady; Goncharov, Pavel

    2018-02-01

    The paper demonstrates the advantages of deep learning networks over ordinary neural networks through their comparative application to image classification. An autoassociative neural network is used as a standalone autoencoder for prior extraction of the most informative features of the input data for the neural networks that are then compared as classifiers. The main effort in working with deep learning networks is spent on the painstaking work of optimizing the structures of those networks and their components, such as activation functions and weights, as well as the procedures for minimizing their loss function, in order to improve their performance and speed up their learning time. It is also shown that deep autoencoders develop a remarkable ability to denoise images after being specially trained. Convolutional neural networks are also used to solve a topical problem of protein genetics, using the example of durum wheat classification. The results of our comparative study demonstrate the clear advantage of the deep networks, as well as the denoising power of the autoencoders. In our work we use both GPU and cloud services to speed up the calculations.

  17. A practical approach to the classification of IRAS sources using infrared colors alone

    International Nuclear Information System (INIS)

    Walker, H.J.; Volk, K.; Wainscoat, R.J.; Schwartz, D.E.; Cohen, M.

    1989-01-01

    Zones of the IRAS color-color planes in which a variety of different types of known sources occur have been defined for the purpose of obtaining representative IRAS colors for them. There is considerable overlap between many of these zones, rendering a unique classification difficult on the basis of IRAS colors alone, although galactic latitude can resolve ambiguities between galactic and extragalactic populations. The dependence of the colors of these zones on the presence of spectral emission/absorption features and on the spatial extent of the sources has been investigated. It is found that silicate emission features do not significantly influence the IRAS colors. Planetary nebulae may show a dependence of color on the presence of atomic or molecular features in emission, although the dominant cause of this effect may be the underlying red continua of nebulae with strong atomic lines. Only small shifts are detected in the colors of individual spatially extended sources when total flux measurements are substituted for point-source measurements. 36 refs

  18. Improved MODIS aerosol retrieval in urban areas using a land classification approach and empirical orthogonal functions

    Science.gov (United States)

    Levitan, Nathaniel; Gross, Barry

    2016-10-01

    New, high-resolution aerosol products are required in urban areas to improve the spatial coverage of the products, in terms of both resolution and retrieval frequency. These new products will improve our understanding of the spatial variability of aerosols in urban areas and will be useful in the detection of localized aerosol emissions. Urban aerosol retrieval is challenging for existing algorithms because of the high spatial variability of the surface reflectance, indicating the need for improved urban surface reflectance models. This problem can be stated in the language of novelty detection as the problem of selecting aerosol parameters whose effective surface reflectance spectrum is not an outlier in some space. In this paper, empirical orthogonal functions, a reconstruction-based novelty detection technique, are used to perform single-pixel aerosol retrieval using the single angular and temporal sample provided by the MODIS sensor. The empirical orthogonal basis functions are trained for different land classes using the MODIS BRDF MCD43 product. Existing land classification products are used in training and aerosol retrieval. The retrieval is compared against the existing operational MODIS 3 km Dark Target (DT) aerosol product and co-located AERONET data. Based on the comparison, our method allows for a significant increase in retrieval frequency and a moderate decrease in the known biases of MODIS urban aerosol retrievals.
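
    The reconstruction-based novelty idea can be sketched with PCA: fit empirical orthogonal functions on training reflectance spectra for one land class and score a candidate spectrum by its reconstruction error. The band count and spectra below are placeholders, not the MCD43-trained basis of the paper.

```python
# Sketch: reconstruction-based novelty scoring with empirical orthogonal functions (PCA).
# A candidate surface-reflectance spectrum that reconstructs poorly from the class basis
# is treated as an outlier (i.e., the trial aerosol correction is unlikely to be right).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_bands = 7                                    # placeholder number of spectral bands
training_spectra = rng.random((500, n_bands))  # stand-in for reflectances of one land class

eof = PCA(n_components=3).fit(training_spectra)

def novelty(spectrum):
    """Reconstruction error of a candidate spectrum under the class EOF basis."""
    recon = eof.inverse_transform(eof.transform(spectrum[None, :]))[0]
    return float(np.linalg.norm(spectrum - recon))

candidates = rng.random((10, n_bands))         # spectra implied by different trial aerosol loadings
best = min(range(len(candidates)), key=lambda i: novelty(candidates[i]))
print("least novel candidate:", best, "error:", round(novelty(candidates[best]), 4))
```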

  19. A scale space approach for unsupervised feature selection in mass spectra classification for ovarian cancer detection.

    Science.gov (United States)

    Ceccarelli, Michele; d'Acierno, Antonio; Facchiano, Angelo

    2009-10-15

    Mass spectrometry spectra, widely used in proteomics studies as a screening tool for protein profiling and to detect discriminatory signals, are high-dimensional data. A large number of local maxima (a.k.a. peaks) have to be analyzed as part of computational pipelines aimed at the realization of efficient predictive and screening protocols. With data of this dimensionality and sample size, the risk of over-fitting and selection bias is pervasive. Therefore the development of bioinformatics methods based on unsupervised feature extraction can lead to general tools which can be applied to several fields of predictive proteomics. We propose a method for feature selection and extraction grounded on the theory of multi-scale spaces for high-resolution spectra derived from the analysis of serum. We then use support vector machines for classification. In particular, we use a database containing 216 sample spectra divided into 115 cancer and 91 control samples. The overall accuracy averaged over a large cross-validation study is 98.18%. The area under the ROC curve of the best selected model is 0.9962. We improved previously known results on this problem with the same data, with the advantage that the proposed method has an unsupervised feature selection phase. All the developed code, as MATLAB scripts, can be downloaded from http://medeaserver.isa.cnr.it/dacierno/spectracode.htm.
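
    A simplified stand-in for the multi-scale idea is shown below: a spectrum is smoothed with Gaussians of increasing width, peaks that persist across scales are kept, and their intensities feed an SVM. This is not the authors' exact scale-space algorithm, and the spectra and labels are synthetic.

```python
# Sketch: keep spectral peaks that survive Gaussian smoothing at several scales,
# then use their intensities as features for an SVM. Data are synthetic.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import argrelmax
from sklearn.svm import SVC

def persistent_peaks(spectrum, scales=(1, 2, 4, 8)):
    """Indices of local maxima present (within +/-2 bins) at every smoothing scale."""
    peak_sets = []
    for s in scales:
        smooth = gaussian_filter1d(spectrum, sigma=s)
        peak_sets.append(set(argrelmax(smooth, order=3)[0]))
    kept = peak_sets[0]
    for peaks in peak_sets[1:]:
        kept = {p for p in kept if any(abs(p - q) <= 2 for q in peaks)}
    return sorted(kept)

rng = np.random.default_rng(0)
x = np.arange(1000)
bump_positions = [120, 340, 610, 880]
spectra = np.array([
    0.05 * rng.random(1000) + sum(np.exp(-0.5 * ((x - p) / 6.0) ** 2) for p in bump_positions)
    for _ in range(20)
])                                                    # stand-in mass spectra with real peaks

peaks = persistent_peaks(spectra.mean(axis=0))        # unsupervised: labels never used here
X = spectra[:, peaks]                                 # peak intensities as features
y = rng.integers(0, 2, 20)                            # placeholder cancer/control labels
clf = SVC().fit(X, y)
print(f"{len(peaks)} persistent peaks kept as features")
```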

  20. Predicting the disease of Alzheimer with SNP biomarkers and clinical data using data mining classification approach: decision tree.

    Science.gov (United States)

    Erdoğan, Onur; Aydin Son, Yeşim

    2014-01-01

    Single Nucleotide Polymorphisms (SNPs) are the most common genomic variations, where only a single nucleotide differs between individuals. Individual SNPs and SNP profiles associated with diseases can be utilized as biological markers. However, there is a need to determine the SNP subsets and patients' clinical data that are informative for diagnosis. Data mining approaches have the highest potential for extracting knowledge from genomic datasets and selecting the representative SNPs as well as the most effective and informative clinical features for the clinical diagnosis of diseases. In this study, we have applied one of the most widely used data mining classification methodologies, the decision tree, for associating SNP biomarkers and significant clinical data with Alzheimer's disease (AD), the most common form of dementia. Different tree construction parameters have been compared for optimization, and the most accurate tree for predicting AD is presented.

  1. Classification of Suncus murinus species complex (Soricidae: Crocidurinae) in Peninsular Malaysia using image analysis and machine learning approaches.

    Science.gov (United States)

    Abu, Arpah; Leow, Lee Kien; Ramli, Rosli; Omar, Hasmahzaiti

    2016-12-22

    Taxonomists frequently identify specimens from various populations based on morphological characteristics and molecular data. This study looks into another invasive process for the identification of the house shrew (Suncus murinus) using image analysis and machine learning approaches. Thus, an automated identification system is developed to assist and simplify this task. In this study, seven shape-based descriptors, namely area, convex area, major axis length, minor axis length, perimeter, equivalent diameter and extent, are used as features to represent the digital images of skulls, consisting of dorsal, lateral and jaw views for each specimen. An Artificial Neural Network (ANN) is used as the classifier to classify the skulls of S. murinus based on region (northern and southern populations of Peninsular Malaysia) and sex (adult male and female). Specimen classification using a training data set and identification using a testing data set were performed through two stages of ANNs. At present, the classifier used has achieved an accuracy of 100% based on the skull views. Classification and identification by region and sex have also attained accuracies of 72.5%, 87.5% and 80.0% for the dorsal, lateral, and jaw views, respectively. These results show that the shape characteristic features used are substantial because they can differentiate the specimens by region and sex up to an accuracy of 80% and above. Finally, an application was developed that can be used by the scientific community. This automated system demonstrates the practicability of using computer-assisted systems to provide an interesting alternative approach for quick and easy identification of unknown species.

  2. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    Science.gov (United States)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks, the conventional practice is to decode motor intentions by using scalp EEG. However, scalp EEG only reveals certain limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to the Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG into an unsupervised factor analysis to turn it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, which is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis of the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and may thus lead to new strategies for BCI-based neurorehabilitation.

  3. Analysis of effects of manhole covers on motorcycle driver maneuvers: a nonparametric classification tree approach.

    Science.gov (United States)

    Chang, Li-Yen

    2014-01-01

    A manhole cover is a removable plate forming the lid over the opening of a manhole to allow traffic to pass over the manhole and to prevent people from falling in. Because most manhole covers are placed in roadway traffic lanes, if these manhole covers are not appropriately installed or maintained, they can represent unexpected hazards on the road, especially for motorcycle drivers. The objective of this study is to identify the effects of manhole cover characteristics as well as driver factors and traffic and roadway conditions on motorcycle driver maneuvers. A video camera was used to record motorcycle drivers' maneuvers when they encountered an inappropriately installed or maintained manhole cover. Information on 3059 drivers' maneuver decisions was recorded. Classification and regression tree (CART) models were applied to explore factors that can significantly affect motorcycle driver maneuvers when passing a manhole cover. Nearly 50 percent of the motorcycle drivers decelerated or changed their driving path to reduce the effects of the manhole cover. The manhole cover characteristics including the level difference between manhole cover and pavement, the pavement condition over the manhole cover, and the size of the manhole cover can significantly affect motorcycle driver maneuvers. Other factors, including traffic conditions, lane width, motorcycle speed, and loading conditions, also have significant effects on motorcycle driver maneuvers. To reduce the effects and potential risks from the manhole covers, highway authorities not only need to make sure that any newly installed manhole covers are as level as possible but also need to regularly maintain all the manhole covers to ensure that they are in good condition. In the long run, the size of manhole covers should be kept as small as possible so that the impact of manhole covers on motorcycle drivers can be effectively reduced. Supplemental materials are available for this article. Go to the publisher

  4. Pattern classification approach to characterizing solitary pulmonary nodules imaged on high-resolution computed tomography

    Science.gov (United States)

    McNitt-Gray, Michael F.; Hart, Eric M.; Goldin, Jonathan G.; Yao, Chih-Wei; Aberle, Denise R.

    1996-04-01

    The purpose of our study was to characterize solitary pulmonary nodules (SPN) as benign or malignant based on pattern classification techniques using size, shape, density and texture features extracted from HRCT images. HRCT images of patients with a SPN are acquired, routed through a PACS and displayed on a thoracic radiology workstation. Using the original data, the SPN is semiautomatically contoured using a nodule/background threshold. The contour is used to calculate size and several shape parameters, including compactness and bending energy. Pixels within the interior of the contour are used to calculate several features including: (1) nodule density-related features, such as representative Hounsfield number and moment of inertia, and (2) texture measures based on the spatial gray level dependence matrix and fractal dimension. The true diagnosis of the SPN is established by histology from biopsy or, in the case of some benign nodules, extended follow-up. Multi-dimensional analyses of the features are then performed to determine which features can discriminate between benign and malignant nodules. When a sufficient number of cases are obtained two pattern classifiers, a linear discriminator and a neural network, are trained and tested using a select subset of features. Preliminary data from nine (9) nodule cases have been obtained and several features extracted. While the representative CT number is a reasonably good indicator, it is an inconclusive predictor of SPN diagnosis when considered by itself. Separation between benign and malignant nodules improves when other features, such as the distribution of density as measured by moment of inertia, are included in the analysis. Software has been developed and preliminary results have been obtained which show that individual features may not be sufficient to discriminate between benign and malignant nodules. However, combinations of these features may be able to discriminate between these two classes. With

  5. An approach to understanding sleep and depressed mood in adolescents: person-centred sleep classification.

    Science.gov (United States)

    Shochat, Tamar; Barker, David H; Sharkey, Katherine M; Van Reen, Eliza; Roane, Brandy M; Carskadon, Mary A

    2017-12-01

    Depressive mood in youth has been associated with distinct sleep dimensions, such as timing, duration and quality. To identify discrete sleep phenotypes, we applied person-centred analysis (latent class mixture models) based on self-reported sleep patterns and quality, and examined associations between phenotypes and mood in high-school seniors. Students (n = 1451; mean age = 18.4 ± 0.3 years; 648 M) completed a survey near the end of high school. Indicators used for classification included school-night bed- and rise-times, differences between non-school-night and school-night bed- and rise-times, sleep-onset latency, number of awakenings, naps, and sleep quality and disturbance. Mood was measured using the total score on the Center for Epidemiologic Studies-Depression Scale. One-way ANOVA tested differences in mood between phenotypes. Fit indices were split between the 3-, 4- and 5-phenotype solutions. For all solutions, between-phenotype differences were shown for all indicators: bedtime showed the largest difference; thus, classes were labelled from earliest to latest bedtime as 'A' (n = 751), 'B' (n = 428) and 'C' (n = 272) in the 3-class solution. Class B showed the lowest sleep disturbance and remained stable, whereas classes C and A each split in the 4- and 5-class solutions, respectively. Associations with mood were consistent, albeit small, with class B showing the lowest scores. Person-centred analysis identified sleep phenotypes that differed in mood, such that those with the fewest depressive symptoms had moderate sleep timing, shorter sleep-onset latencies and fewer arousals. Sleep characteristics in these groups may add to our understanding of how sleep and depressed mood are associated in teens. © 2017 European Sleep Research Society.
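
    Person-centred classification of this kind (latent class/profile analysis) can be approximated with a Gaussian mixture over the sleep indicators, choosing the number of phenotypes by an information criterion. The indicator columns and data below are synthetic placeholders, and scikit-learn's Gaussian mixture stands in for the latent class mixture models used in the study.

```python
# Sketch: person-centred sleep phenotyping via a Gaussian mixture model,
# with the number of latent classes chosen by BIC. Data are synthetic placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Columns: school-night bedtime, rise time, weekend delay, sleep-onset latency, awakenings, quality
X = StandardScaler().fit_transform(rng.normal(size=(1451, 6)))

models = {k: GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X) for k in range(2, 7)}
bic = {k: m.bic(X) for k, m in models.items()}
best_k = min(bic, key=bic.get)
labels = models[best_k].predict(X)             # phenotype assignment per student
print("BIC by class count:", {k: round(v) for k, v in bic.items()}, "-> chosen:", best_k)
# Phenotype labels can then be related to depression scores, e.g. with a one-way ANOVA.
```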

  6. Automatic Approach to Morphological Classification of Galaxies With Analysis of Galaxy Populations in Clusters

    Science.gov (United States)

    Sultanova, Madina; Barkhouse, Wayne; Rude, Cody

    2018-01-01

    The classification of galaxies based on their morphology is a field in astrophysics that aims to understand galaxy formation and evolution based on their physical differences. Whether structural differences are due to internal factors or a result of the local environment, the dominant mechanism that determines galaxy type needs to be robustly quantified in order to have a thorough grasp of the origin of the different types of galaxies. The main subject of my Ph.D. dissertation is to explore the use of computers to automatically classify and analyze large numbers of galaxies according to their morphology, and to analyze sub-samples of galaxies selected by type to understand galaxy formation in various environments. I have developed a computer code to classify galaxies by measuring five parameters from their images in FITS format. The code was trained and tested using visually classified SDSS galaxies from Galaxy Zoo and the EFIGI data set. I apply my morphology software to numerous galaxies from diverse data sets. Among the data analyzed are the 15 Abell galaxy clusters (0.03 Frontier Field galaxy clusters. The high resolution of HST allows me to compare distant clusters with those nearby to look for evolutionary changes in the galaxy cluster population. I use the results from the software to examine the properties (e.g. luminosity functions, radial dependencies, star formation rates) of selected galaxies. Due to the large amount of data that will be available from wide-area surveys in the future, the use of computer software to classify and analyze the morphology of galaxies will be extremely important in terms of efficiency. This research aims to contribute to the solution of this problem.

  7. Investigating the Predictive Value of Functional MRI to Appetitive and Aversive Stimuli: A Pattern Classification Approach.

    Directory of Open Access Journals (Sweden)

    Ciara McCabe

    Full Text Available Dysfunctional neural responses to appetitive and aversive stimuli have been investigated as possible biomarkers for psychiatric disorders. However, it is not clear to what degree these are separate processes across the brain or in fact overlapping systems. To help clarify this issue we used Gaussian process classifier (GPC) analysis to examine appetitive and aversive processing in the brain. 25 healthy controls underwent functional MRI whilst seeing pictures and receiving tastes of pleasant and unpleasant food. We applied GPCs to discriminate between the appetitive and aversive sights and tastes using functional activity patterns. The accuracy of the GPC in discriminating appetitive taste from the neutral condition was 86.5% (specificity = 81%, sensitivity = 92%, p = 0.001). If a participant experienced neutral taste stimuli, the probability of correct classification was 92%. The accuracy in discriminating aversive from neutral taste stimuli was 82.5% (specificity = 73%, sensitivity = 92%, p = 0.001) and appetitive from aversive taste stimuli was 73% (specificity = 77%, sensitivity = 69%, p = 0.001). In the sight modality, the accuracy in discriminating the appetitive from the neutral condition was 88.5% (specificity = 85%, sensitivity = 92%, p = 0.001), in discriminating aversive from neutral sight stimuli was 92% (specificity = 92%, sensitivity = 92%, p = 0.001), and in discriminating aversive from appetitive sight stimuli was 63.5% (specificity = 73%, sensitivity = 54%, p = 0.009). Our results demonstrate the predictive value of neurofunctional data in discriminating emotional and neutral networks of activity in the healthy human brain. It would be of interest to use pattern recognition techniques and fMRI to examine network dysfunction in the processing of appetitive, aversive and neutral stimuli in psychiatric disorders, especially where problems with reward and punishment processing have been implicated in the pathophysiology of the disorder.
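
    The pattern-classification step can be sketched with scikit-learn's Gaussian process classifier on per-condition activity patterns; the voxel features, the injected effect size and the leave-one-out scheme below are assumptions for illustration, not the study's fMRI data or validation protocol.

```python
# Sketch: Gaussian process classification of appetitive vs. neutral activity patterns.
# Synthetic "voxel" features stand in for the fMRI data of the study.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_voxels = 25, 200
X = rng.normal(size=(2 * n_subjects, n_voxels))    # one pattern per subject per condition
y = np.repeat([0, 1], n_subjects)                  # 0 = neutral, 1 = appetitive
X[y == 1, :20] += 0.8                              # inject a weak condition effect

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=10.0))
acc = cross_val_score(gpc, X, y, cv=LeaveOneOut()).mean()
print("leave-one-out accuracy:", round(acc, 3))
```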

  8. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme, as an effective approach to medical diagnosis. Getting to know the advantages and disadvantages of each computational intelligence technique in the recent years, the time has come...

  9. Statistical modelling approach to derive quantitative nanowastes classification index; estimation of nanomaterials exposure

    CSIR Research Space (South Africa)

    Ntaka, L

    2013-08-01

    Full Text Available In this work, statistical inference approaches, specifically non-parametric bootstrapping and a linear model, were applied. Data used to develop the model were sourced from the literature. 104 data points with information on aggregation, natural organic matter...
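
    A minimal illustration of non-parametric bootstrapping of a linear-model coefficient is given below; the variable names (aggregation, natural organic matter) are borrowed from the abstract only, and the data are synthetic.

```python
# Sketch: non-parametric bootstrap confidence interval for a linear-model slope.
import numpy as np

rng = np.random.default_rng(0)
n = 104                                                # same order as the data points mentioned above
nom = rng.uniform(0, 10, n)                            # "natural organic matter" placeholder predictor
aggregation = 2.0 + 0.5 * nom + rng.normal(0, 1, n)    # placeholder response

def slope(x, y):
    return np.polyfit(x, y, 1)[0]

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                        # resample rows with replacement
    boot.append(slope(nom[idx], aggregation[idx]))
low, high = np.percentile(boot, [2.5, 97.5])
print(f"slope = {slope(nom, aggregation):.3f}, 95% bootstrap CI = [{low:.3f}, {high:.3f}]")
```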

  10. A multiple kernel classification approach based on a Quadratic Successive Geometric Segmentation methodology with a fault diagnosis case.

    Science.gov (United States)

    Honório, Leonardo M; Barbosa, Daniele A; Oliveira, Edimar J; Garcia, Paulo A Nepomuceno; Santos, Murillo F

    2018-03-01

    This work presents a new approach for solving classification and learning problems. The Successive Geometric Segmentation technique is applied to encapsulate large datasets by using a series of Oriented Bounding Hyper Boxes (OBHBs). Each OBHB is obtained through linear separation analysis and represents a specific region in a pattern's solution space. Also, each OBHB can be seen as a data abstraction layer and be considered as an individual kernel. Thus, it is possible, by applying a quadratic discriminant function, to assemble a set of nonlinear surfaces separating each desirable pattern. This approach allows working with large datasets using high-speed linear analysis tools while still providing a very accurate non-linear classifier as the final result. The methodology was tested using the UCI Machine Learning repository and a real power transformer fault diagnosis scenario. The results were compared with different approaches from the literature and, finally, the potential and further applications of the methodology were also discussed. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Multimodal fusion framework: a multiresolution approach for emotion classification and recognition from physiological signals.

    Science.gov (United States)

    Verma, Gyanendra K; Tiwary, Uma Shanker

    2014-11-15

    The purpose of this paper is twofold: (i) to investigate emotion representation models and find out the possibility of a model with a minimum number of continuous dimensions, and (ii) to recognize and predict emotion from measured physiological signals using a multiresolution approach. The multimodal physiological signals are: electroencephalogram (EEG) (32 channels) and peripheral (8 channels: galvanic skin response (GSR), blood volume pressure, respiration pattern, skin temperature, electromyogram (EMG) and electrooculogram (EOG)), as given in the DEAP database. We have discussed theories of emotion modeling based on (i) basic emotions, (ii) the cognitive appraisal and physiological response approach and (iii) the dimensional approach, and we propose a three-continuous-dimension representation model for emotions. A clustering experiment on the given valence, arousal and dominance values of various emotions has been done to validate the proposed model. A novel approach for the multimodal fusion of information from a large number of channels to classify and predict emotions has also been proposed. The Discrete Wavelet Transform, a classical transform for multiresolution analysis of signals, has been used in this study. Experiments are performed to classify different emotions using four classifiers. The average accuracies are 81.45%, 74.37%, 57.74% and 75.94% for the SVM, MLP, KNN and MMC classifiers, respectively. The best accuracy is for 'Depressing', with 85.46% using SVM. The 32 EEG channels are considered as independent modes, and features from each channel are considered with equal importance. Some of the channel data may be correlated, but they may also contain supplementary information. In comparison with results reported by others, the high accuracy of 85% with 13 emotions and 32 subjects obtained by our proposed method clearly proves the potential of our multimodal fusion approach. Copyright © 2013 Elsevier Inc. All rights reserved.
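
    The multiresolution feature extraction can be sketched with PyWavelets: decompose each channel with a discrete wavelet transform and use sub-band statistics as features for a classifier. The channel counts, signals and labels below are synthetic placeholders for the DEAP recordings, and a plain SVM stands in for the four classifiers compared in the paper.

```python
# Sketch: wavelet (DWT) sub-band statistics per channel as features for emotion classification.
# Synthetic signals stand in for the DEAP EEG/peripheral recordings.
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def dwt_features(signal, wavelet="db4", level=4):
    """Mean absolute value, standard deviation and energy of each DWT sub-band."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:
        feats.extend([np.mean(np.abs(c)), np.std(c), np.sum(c ** 2)])
    return feats

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 120, 32, 1024
trials = rng.normal(size=(n_trials, n_channels, n_samples))
y = rng.integers(0, 4, n_trials)                         # placeholder emotion labels

X = np.array([[f for ch in trial for f in dwt_features(ch)] for trial in trials])
print("feature matrix:", X.shape)
print("cv accuracy:", cross_val_score(SVC(), X, y, cv=5).mean())
```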

  12. A comparison of mandatory and voluntary approaches to the implementation of Globally Harmonized System of Classification and Labelling of Chemicals (GHS) in the management of hazardous chemicals.

    Science.gov (United States)

    Ta, Goh Choo; Mokhtar, Mazlin Bin; Peterson, Peter John; Yahaya, Nadzri Bin

    2011-01-01

    The European Union (EU) and the World Health Organization (WHO) have applied different approaches to facilitate the implementation of the UN Globally Harmonized System of Classification and Labelling of Chemicals (GHS). The EU applied the mandatory approach by gazetting the EU Regulation 1272/2008 incorporating GHS elements on classification, labelling and packaging of substances and mixtures in 2008; whereas the WHO utilized a voluntary approach by incorporating GHS elements in the WHO guidelines entitled 'WHO Recommended Classification of Pesticides by Hazard' in 2009. We report on an analysis of both the mandatory and voluntary approaches practised by the EU and the WHO respectively, with close reference to the GHS 'purple book'. Our findings indicate that the mandatory approach practiced by the EU covers all the GHS elements referred to in the second revised edition of the GHS 'purple book'. Hence we can conclude that the EU has implemented the GHS particularly for industrial chemicals. On the other hand, the WHO guidelines published in 2009 should be revised to address concerns raised in this paper. In addition, both mandatory and voluntary approaches should be carefully examined because the classification results may be different.

  13. Using resistance and resilience concepts to reduce impacts of invasive annual grasses and altered fire regimes on the sagebrush ecosystem and greater sage-grouse: A strategic multi-scale approach

    Science.gov (United States)

    Jeanne C. Chambers; David A. Pyke; Jeremy D. Maestas; Mike Pellant; Chad S. Boyd; Steven B. Campbell; Shawn Espinosa; Douglas W. Havlina; Kenneth E. Mayer; Amarina Wuenschel

    2014-01-01

    This Report provides a strategic approach for conservation of sagebrush ecosystems and Greater Sage- Grouse (sage-grouse) that focuses specifically on habitat threats caused by invasive annual grasses and altered fire regimes. It uses information on factors that influence (1) sagebrush ecosystem resilience to disturbance and resistance to invasive annual grasses and (2...

  14. A novel approach for SEMG signal classification with adaptive local binary patterns.

    Science.gov (United States)

    Ertuğrul, Ömer Faruk; Kaya, Yılmaz; Tekin, Ramazan

    2016-07-01

    Feature extraction plays a major role in the pattern recognition process, and this paper presents a novel feature extraction approach, the adaptive local binary pattern (aLBP). aLBP is built on the local binary pattern (LBP), which is an image processing method, and the one-dimensional local binary pattern (1D-LBP). In LBP, each pixel is compared with its neighbors. Similarly, in 1D-LBP, each data point in the raw signal is judged against its neighbors. 1D-LBP extracts features based on local changes in the signal. Therefore, it has a high potential to be employed for medical purposes, since each action or abnormality recorded in SEMG signals has its own pattern, and via the 1D-LBP these (hidden) patterns may be detected. However, the positions of the neighbors in 1D-LBP are constant, depending on the position of the data point in the raw signal. Also, both LBP and 1D-LBP are very sensitive to noise. Therefore, their capacity for detecting hidden patterns is limited. To overcome these drawbacks, aLBP is proposed. In aLBP, the positions of the neighbors and their values can be assigned adaptively via the down-sampling and smoothing coefficients. Therefore, the potential to detect (hidden) patterns, which may express an illness or an action, is greatly increased. To validate the proposed feature extraction approach, two different datasets were employed. The accuracies achieved by the proposed approach were higher than the results obtained with popular feature extraction approaches and those reported in the literature. The obtained accuracy results show that the proposed method can be employed to investigate SEMG signals. In summary, this work attempts to develop an adaptive feature extraction scheme that can be utilized for extracting features from local changes in different categories of time-varying signals.
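
    The base 1D-LBP transform (before the adaptive neighbour placement and smoothing of aLBP) can be written in a few lines: each sample is compared with its neighbours and the resulting bit patterns are histogrammed as the feature vector. The sketch below is a plain 1D-LBP with an assumed neighbourhood size, not the proposed aLBP.

```python
# Sketch: plain one-dimensional local binary pattern (1D-LBP) histogram features.
# The adaptive neighbour placement and smoothing of aLBP are not reproduced here.
import numpy as np

def lbp_1d_histogram(signal, p=4):
    """Compare each sample with p neighbours on each side and histogram the binary codes."""
    signal = np.asarray(signal, dtype=float)
    codes = []
    for i in range(p, len(signal) - p):
        neighbours = np.concatenate([signal[i - p:i], signal[i + 1:i + p + 1]])
        bits = (neighbours >= signal[i]).astype(int)
        codes.append(int("".join(map(str, bits)), 2))
    hist, _ = np.histogram(codes, bins=2 ** (2 * p), range=(0, 2 ** (2 * p)))
    return hist / hist.sum()                   # normalised pattern histogram

rng = np.random.default_rng(0)
semg = np.sin(np.linspace(0, 40, 2000)) + 0.3 * rng.normal(size=2000)   # stand-in SEMG signal
features = lbp_1d_histogram(semg)
print("feature vector length:", features.size)
```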

  15. Principal coordinate analysis assisted chromatographic analysis of bacterial cell wall collection: A robust classification approach.

    Science.gov (United States)

    Kumar, Keshav; Cava, Felipe

    2018-04-10

    In the present work, principal coordinate analysis (PCoA) is introduced to develop a robust model to classify chromatographic data sets of peptidoglycan samples. PCoA captures the heterogeneity present in the data sets by using the dissimilarity matrix as input. Thus, in principle, it can capture even the subtle differences in bacterial peptidoglycan composition and can provide a more robust and fast approach for classifying bacterial collections and identifying novel cell wall targets for further biological and clinical studies. The utility of the proposed approach is successfully demonstrated by analysing two different kinds of bacterial collections. The first set comprised peptidoglycan samples belonging to different subclasses of Alphaproteobacteria. The second set, which is relatively more intricate for chemometric analysis, consists of different wild-type Vibrio cholerae strains and mutants having subtle differences in their peptidoglycan composition. The present work clearly proposes a useful approach that can classify chromatographic data sets of peptidoglycan samples having subtle differences. Furthermore, the present work clearly suggests that PCoA can be a method of choice in any data analysis workflow. Copyright © 2018 Elsevier Inc. All rights reserved.
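
    Classical PCoA can be implemented directly from a dissimilarity matrix by double-centring and eigendecomposition, as sketched below; the random profiles and the Bray-Curtis metric are assumptions standing in for the chromatographic peptidoglycan data of the study.

```python
# Sketch: principal coordinate analysis (classical MDS) from a dissimilarity matrix.
# Random "chromatographic profiles" stand in for the peptidoglycan data.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)
profiles = rng.random((30, 50))                    # 30 samples x 50 chromatogram bins
D = squareform(pdist(profiles, metric="braycurtis"))

# Double-centre the squared dissimilarities: B = -1/2 * J D^2 J, with J = I - 11'/n.
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J

eigvals, eigvecs = np.linalg.eigh(B)
order = np.argsort(eigvals)[::-1]                  # largest eigenvalues first
coords = eigvecs[:, order[:2]] * np.sqrt(np.maximum(eigvals[order[:2]], 0))
print("first two principal coordinates:\n", coords[:5])
```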

  16. PCA-based ANN approach to leak classification in the main pipes of VVER-1000

    International Nuclear Information System (INIS)

    Hadad, Kamal; Jabbari, Masoud; Tabadar, Z.; Hashemi-Tilehnoee, Mehdi

    2012-01-01

    This paper presents a neural network-based fault diagnosis approach that allows dynamic identification of crack and leak faults. The method utilizes the Principal Component Analysis (PCA) technique to reduce the problem dimension. Such a dimension reduction approach leads to faster diagnosis and allows a better graphic presentation of the results. To show the effectiveness of the proposed approach, two methodologies are used to train the neural network (NN). At first, a training matrix composed of 14 variables is used to train a Multilayer Perceptron neural network (MLP) with the Resilient Backpropagation (RBP) algorithm. Employing the proposed method, a more accurate and simpler network is designed, where the input size is reduced from 14 to 6 variables for training the NN. In short, the application of PCA greatly reduces the network topology and allows employing more efficient training algorithms. The accuracy, generalization ability, and reliability of the designed networks are verified using data from 10 simulated events of a VVER-1000 simulation using the DINAMIKA-97 code. Noise is added to the data to evaluate the robustness of the method, and the method again shows itself to be effective and powerful. (orig.)
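
    The dimension-reduction-plus-MLP idea can be sketched with scikit-learn: project the 14 monitored variables onto 6 principal components and train an MLP on the reduced inputs. Synthetic data stand in for the DINAMIKA-97 simulated events, and scikit-learn's optimizers stand in for the resilient backpropagation used in the paper.

```python
# Sketch: PCA from 14 monitored variables down to 6 components, then an MLP classifier.
# Synthetic data stand in for the simulated plant events; scikit-learn's optimizers
# stand in for the resilient backpropagation (RBP) training used in the paper.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=600, n_features=14, n_informative=8,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=6),                     # reduce the 14 plant variables to 6 components
    MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
)
print("cv accuracy:", cross_val_score(model, X, y, cv=5).mean())
```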

  17. An obstructive sleep apnea detection approach using kernel density classification based on single-lead electrocardiogram.

    Science.gov (United States)

    Chen, Lili; Zhang, Xi; Wang, Hui

    2015-05-01

    Obstructive sleep apnea (OSA) is a common sleep disorder that often remains undiagnosed, leading to an increased risk of developing cardiovascular diseases. The polysomnogram (PSG) is currently used as the gold standard for screening OSA. However, because it is time-consuming, expensive and causes discomfort, alternative techniques based on a reduced set of physiological signals have been proposed to solve this problem. This study proposes a convenient non-parametric kernel density-based approach for the detection of OSA using single-lead electrocardiogram (ECG) recordings. Selected physiologically interpretable features are extracted from segmented RR intervals, which are obtained from the ECG signals. These features are fed into the kernel density classifier to detect apnea events, and bandwidths for the density of each class (normal or apnea) are automatically chosen through an iterative bandwidth selection algorithm. To validate the proposed approach, RR intervals were extracted from the ECG signals of 35 subjects obtained from a sleep apnea database ( http://physionet.org/cgi-bin/atm/ATM ). The results indicate that the kernel density classifier, with two features for apnea event detection, achieves a mean accuracy of 82.07%, with a mean sensitivity of 83.23% and a mean specificity of 80.24%. Compared with other existing methods, the proposed kernel density approach achieves comparably good performance using fewer features without significantly losing discriminant power, which indicates that it could be widely used for home-based screening or diagnosis of OSA.
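
    The per-class kernel density classifier can be sketched with scikit-learn's KernelDensity: fit one density per class on RR-interval features and assign a segment to the class with the higher prior-weighted log-likelihood. The two features, the fixed bandwidth and the data below are assumptions; the paper selects bandwidths iteratively.

```python
# Sketch: kernel density classification of RR-interval features (apnea vs. normal).
# One density per class; a segment is assigned to the class with the higher
# prior-weighted log-likelihood. Bandwidths are fixed here, not iteratively selected.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)
normal = rng.normal([0.9, 0.05], 0.05, size=(300, 2))   # [mean RR, RR variability] placeholders
apnea = rng.normal([1.0, 0.12], 0.06, size=(200, 2))

classes = {"normal": normal, "apnea": apnea}
priors = {k: len(v) / (len(normal) + len(apnea)) for k, v in classes.items()}
kdes = {k: KernelDensity(bandwidth=0.03).fit(v) for k, v in classes.items()}

def classify(segment_features):
    """Return the class whose density (times prior) best explains the segment."""
    scores = {k: kde.score_samples([segment_features])[0] + np.log(priors[k])
              for k, kde in kdes.items()}
    return max(scores, key=scores.get)

print(classify([0.88, 0.04]))   # expected "normal"
print(classify([1.02, 0.13]))   # expected "apnea"
```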

  18. Comparison of Pixel-Based and Object-Based Classification Using Parameters and Non-Parameters Approach for the Pattern Consistency of Multi Scale Landcover

    Science.gov (United States)

    Juniati, E.; Arrofiqoh, E. N.

    2017-09-01

    Land cover information can be extracted from remote sensing data by digital classification. In practice, some people are more comfortable using visual interpretation to retrieve land cover information; however, this is highly influenced by the subjectivity and knowledge of the interpreter, and it is also time-consuming. Digital classification can be done in several ways, depending on the defined mapping approach and the assumptions about the data distribution. This study compared several classification methods for several data types at the same location. The data used were Landsat 8 satellite imagery, SPOT 6 imagery and orthophotos. In practice, these data are used to produce land cover maps at 1:50,000 scale for Landsat, 1:25,000 scale for SPOT and 1:5,000 scale for orthophotos, but visual interpretation is used to retrieve the information. Maximum likelihood classification (MLC), a pixel-based parametric approach, was applied to these data, and an Artificial Neural Network classifier, a pixel-based non-parametric approach, was applied as well. Moreover, this study applied object-based classifiers to the data. The classification system implemented is the land cover classification of the Indonesian topographic map. The classification was applied to each data source and is expected to recognize the pattern and to assess the consistency of the land cover map produced from each data set. Furthermore, the study analyses the benefits and limitations of using these methods.

  19. Object-Based Classification as an Alternative Approach to the Traditional Pixel-Based Classification to Identify Potential Habitat of the Grasshopper Sparrow

    Science.gov (United States)

    Jobin, Benoît; Labrecque, Sandra; Grenier, Marcelle; Falardeau, Gilles

    2008-01-01

    The traditional method of identifying wildlife habitat distribution over large regions consists of pixel-based classification of satellite images into a suite of habitat classes used to select suitable habitat patches. Object-based classification is a newer method that can achieve the same objective based on the segmentation of the spectral bands of the image, creating polygons that are homogeneous with regard to spatial or spectral characteristics. The segmentation algorithm does not rely solely on the single pixel value, but also on shape, texture, and pixel spatial continuity. Object-based classification is a knowledge-based process in which an interpretation key is developed using ground control points and objects are assigned to specific classes according to threshold values of determined spectral and/or spatial attributes. We developed a model using the eCognition software to identify suitable habitats for the Grasshopper Sparrow, a rare and declining species found in southwestern Québec. The model was developed in a region with known breeding sites and applied to other images covering adjacent regions where potential breeding habitats may be present. We were successful in locating potential habitats in areas where dairy farming prevailed but failed in an adjacent region covered by a distinct Landsat scene and dominated by annual crops. We discuss the added value of this method, such as the possibility to use the contextual information associated with objects and the ability to eliminate unsuitable areas in the segmentation and land cover classification processes, as well as technical and logistical constraints. A series of recommendations on the use of this method and on conservation issues for Grasshopper Sparrow habitat is also provided.

  20. Human movement activity classification approaches that use wearable sensors and mobile devices

    Science.gov (United States)

    Kaghyan, Sahak; Sarukhanyan, Hakob; Akopian, David

    2013-03-01

    Cell phones and other mobile devices have become part of human culture and are changing activity and lifestyle patterns. Mobile phone technology continuously evolves and incorporates more and more sensors for enabling advanced applications. The latest generations of smart phones incorporate GPS and WLAN location-finding modules, vision cameras, microphones, accelerometers, temperature sensors, etc. The availability of these sensors in mass-market communication devices creates exciting new opportunities for data mining applications. Healthcare applications exploiting built-in sensors are particularly promising. This paper reviews different approaches to human activity recognition.

  1. Machine-learning approach for local classification of crystalline structures in multiphase systems

    Science.gov (United States)

    Dietz, C.; Kretz, T.; Thoma, M. H.

    2017-07-01

    Machine learning is one of the most popular fields in computer science and has a vast number of applications. In this work we propose a method that uses a neural network to locally identify crystal structures in a mixed-phase Yukawa system consisting of fcc, hcp, and bcc clusters and disordered particles, similar to plasma crystals. We compare our approach to previously used methods and show that the quality of identification increases significantly. The technique works very well for highly disturbed lattices and offers a flexible and robust way to classify crystalline structures from particle positions alone. This leads to insights into highly disturbed crystalline structures.

  2. Depression and suicidal behavior in adolescents: a multi-informant and multi-methods approach to diagnostic classification.

    Directory of Open Access Journals (Sweden)

    Andrew James Lewis

    2014-07-01

    Full Text Available Background: Informant discrepancies have been reported between parent and adolescent measures of depressive disorders and suicidality. We aimed to examine the concordance between adolescent and parent ratings of depressive disorder using both clinical interview and questionnaire measures, and to assess multi-informant and multi-method approaches to classification. Method: Within the context of assessment of eligibility for a randomized clinical trial, 50 parent–adolescent pairs (mean age of adolescents = 15.0 years) were interviewed separately with a structured diagnostic interview for depression, the KID-SCID. Adolescent self-report and parent-report versions of the Strengths and Difficulties Questionnaire, the Short Mood and Feelings Questionnaire and the Depressive Experiences Questionnaire were also administered. We examined the diagnostic concordance rates of the parent vs. adolescent structured interview methods and the prediction of adolescent diagnosis via questionnaire methods. Results: Parent proxy reporting of adolescent depression and suicidal thoughts and behavior is not strongly concordant with adolescent report. Adolescent self-reported symptoms on depression scales provide a more accurate report of diagnosable adolescent depression than parent proxy reports of adolescent depressive symptoms. Adolescent self-report measures can be combined to improve the accuracy of classification. Parents tend to over-report their adolescent's depressive symptoms while under-reporting their suicidal thoughts and behavior. Conclusion: Parent proxy report is clearly less reliable than the adolescent's own report of their symptoms and subjective experiences, and could be considered inaccurate for research purposes. While parent report would still be sought clinically where an adolescent refuses to provide information, our findings suggest that parent reporting of adolescent suicidality should be interpreted with caution.

  3. A vegetation height classification approach based on texture analysis of a single VHR image

    International Nuclear Information System (INIS)

    Petrou, Z I; Manakos, I; Stathaki, T; Tarantino, C; Adamo, M; Blonda, P

    2014-01-01

    Vegetation height is a crucial feature in various applications related to ecological mapping, enhancing the discrimination among different land cover or habitat categories and facilitating a series of environmental tasks, ranging from biodiversity monitoring and assessment to landscape characterization, disaster management and conservation planning. Primary sources of information on vegetation height include in situ measurements and data from active satellite or airborne sensors, which, however, may often be unaffordable or unavailable for certain regions. Alternative approaches to extracting height information from very high resolution (VHR) satellite imagery based on texture analysis have recently been presented, with promising results. Following the notion that multispectral image bands may often be highly correlated, data transformation and dimensionality reduction techniques are expected to reduce redundant information, and thus the computational cost of the approaches, without significantly compromising their accuracy. In this paper, dimensionality reduction is performed on a VHR image and textural characteristics are calculated on its reconstructed approximations, to show that their discriminatory capabilities are maintained to a large degree. Texture analysis is also performed on the projected data to investigate whether the different height categories can be distinguished in a similar way

  4. Supervised pre-processing approaches in multiple class variables classification for fish recruitment forecasting

    KAUST Repository

    Fernandes, José Antonio

    2013-02-01

    A multi-species approach to fisheries management requires taking into account the interactions between species in order to improve recruitment forecasting of the fish species. Recent advances in Bayesian networks allow the learning of models in which several interrelated variables are forecasted simultaneously. These models are known as multi-dimensional Bayesian network classifiers (MDBNs). Pre-processing steps are critical for the posterior learning of the model in these kinds of domains. Therefore, in the present study, a set of 'state-of-the-art' uni-dimensional pre-processing methods, within the categories of missing data imputation, feature discretization and feature subset selection, are adapted to be used with MDBNs. A framework that includes the proposed multi-dimensional supervised pre-processing methods, coupled with a MDBN classifier, is tested with synthetic datasets and the real domain of fish recruitment forecasting. The rate of correctly forecasting three fish species (anchovy, sardine and hake) simultaneously is doubled (from 17.3% to 29.5%) using the multi-dimensional approach in comparison to mono-species models. The probability assessments also show a large improvement, reducing the average error (estimated by means of the Brier score) from 0.35 to 0.27. Finally, these differences are superior to the forecasting of species by pairs. © 2012 Elsevier Ltd.
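
    The pre-processing chain described above (imputation, discretization) feeding a joint multi-target classifier can be sketched as follows. This is only an illustrative stand-in: the MDBN of the paper is replaced by a generic multi-output classifier, and all data, dimensions and parameter choices are hypothetical.

```python
# Sketch: impute, discretize, then jointly predict three interrelated targets
# (anchovy, sardine, hake recruitment classes). Synthetic data only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import KBinsDiscretizer
from sklearn.multioutput import MultiOutputClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))            # environmental predictors (synthetic)
X[rng.random(X.shape) < 0.05] = np.nan   # simulate missing survey values
Y = rng.integers(0, 2, size=(200, 3))    # low/high recruitment per species

model = make_pipeline(
    SimpleImputer(strategy="median"),                                   # missing-data imputation
    KBinsDiscretizer(n_bins=4, encode="ordinal", strategy="quantile"),  # feature discretization
    MultiOutputClassifier(RandomForestClassifier(n_estimators=100, random_state=0)),
)
model.fit(X, Y)
print(model.predict(X[:5]))              # joint class predictions for the 3 species
```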

  5. A new approach to children's footwear based on foot type classification.

    Science.gov (United States)

    Mauch, M; Grau, S; Krauss, I; Maiwald, C; Horstmann, T

    2009-08-01

    Current shoe designs do not allow for the comprehensive 3-D foot shape, which means they are unable to reproduce the wide variability in foot morphology. Therefore, the purpose of this study was to capture these variations of children's feet by classifying them into groups (types) and thereby provide a basis for their implementation in the design of children's shoes. The feet of 2867 German children were measured using a 3-D foot scanner. Cluster analysis was then applied to classify the feet into three different foot types. The characteristics of these foot types differ regarding their volume and forefoot shape both within and between shoe sizes. This new approach is in clear contrast to previous systems, since it captures the variability of foot morphology in a more comprehensive way by using a foot typing system and therefore paves the way for the unimpaired development of children's feet. Previous shoe systems do not allow for the wide variations in foot morphology. A new approach was developed regarding different morphological foot types based on 3-D measurements relevant in shoe construction. This can be directly applied to create specific designs for children's shoes.

  6. Optimization of classification and regression analysis of four monoclonal antibodies from Raman spectra using collaborative machine learning approach.

    Science.gov (United States)

    Le, Laetitia Minh Maï; Kégl, Balázs; Gramfort, Alexandre; Marini, Camille; Nguyen, David; Cherti, Mehdi; Tfaili, Sana; Tfayli, Ali; Baillet-Guffroy, Arlette; Prognon, Patrice; Chaminade, Pierre; Caudron, Eric

    2018-07-01

    The use of monoclonal antibodies (mAbs) constitutes one of the most important strategies to treat patients suffering from cancers such as hematological malignancies and solid tumors. These antibodies are prescribed by the physician and prepared by hospital pharmacists. An analytical control enables the quality of the preparations to be ensured. The aim of this study was to explore the development of a rapid analytical method for quality control. The method used four mAbs (Infliximab, Bevacizumab, Rituximab and Ramucirumab) at various concentrations and was based on recording Raman data and coupling them to a traditional chemometric and machine learning approach for data analysis. Compared to the conventional linear approach, prediction errors are reduced with a data-driven approach using statistical machine learning methods, in which preprocessing and predictive models are jointly optimized. An additional original aspect of the work involved submitting the problem to a collaborative data challenge platform called Rapid Analytics and Model Prototyping (RAMP), which allowed solutions from about 300 data scientists to be used in collaborative work. Using machine learning, the prediction of the four mAbs samples was considerably improved. The best predictive model showed a combined error of 2.4% versus 14.6% using the linear approach. The concentration and classification errors were 5.8% and 0.7%; only three spectra were misclassified over the 429 spectra of the test set. This large improvement obtained with machine learning techniques was uniform for all molecules but maximal for Bevacizumab, with an 88.3% reduction in combined errors (2.1% versus 17.9%). Copyright © 2018 Elsevier B.V. All rights reserved.
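
    The joint optimization of preprocessing and predictive model mentioned in the abstract can be illustrated with a cross-validated pipeline in which preprocessing settings and model hyperparameters are tuned together. The snippet below is a generic sketch on synthetic spectra, not the winning RAMP solution; all names and values are placeholders.

```python
# Sketch: preprocessing (scaling, PCA) and regressor hyperparameters tuned jointly
# by cross-validation, mimicking the "jointly optimized" data-driven approach.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 300))          # synthetic "Raman spectra": 150 samples x 300 wavenumbers
y = rng.uniform(1, 25, size=150)         # synthetic mAb concentration (mg/mL)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("pca", PCA()),
    ("reg", SVR()),
])
grid = GridSearchCV(
    pipe,
    {"pca__n_components": [5, 10, 20], "reg__C": [1.0, 10.0, 100.0]},
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```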

  7. Artificial Neural Network approach to develop unique Classification and Raga identification tools for Pattern Recognition in Carnatic Music

    Science.gov (United States)

    Srimani, P. K.; Parimala, Y. G.

    2011-12-01

    A unique approach has been developed to study patterns in ragas of Carnatic classical music based on artificial neural networks. Ragas in Carnatic music, which found their roots in the Vedic period, have grown on a scientific foundation over thousands of years. However, owing to its vastness and complexities, it has always been a challenge for scientists and musicologists to give an all-encompassing perspective, both qualitatively and quantitatively. Cognition, comprehension and perception of ragas in Indian classical music have always been the subject of intensive research, are highly intriguing, and many facets of these are hitherto not unravelled. This paper is an attempt to view the melakartha ragas from a cognitive perspective using an artificial neural network based approach, which has given rise to very interesting results. The 72 ragas of the melakartha system were defined through the combination of frequencies occurring in each of them. The data sets were trained using several neural networks. 100% accurate pattern recognition and classification was obtained using linear regression, TLRN, MLP and RBF networks. The performance of the different network topologies was compared by varying various network parameters. Linear regression was found to be the best performing network.
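
    As a toy illustration of the classification task, each melakartha raga can be encoded as a vector over the frequency positions it uses and fed to a neural network. The encoding below is randomly generated and the network is a simple scikit-learn MLP, not the TLRN/MLP/RBF setups of the paper.

```python
# Sketch: 72 raga classes, each defined by a (hypothetical) note-presence pattern,
# recognised from noisy presentations by a small multilayer perceptron.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(11)
ragas = rng.integers(0, 2, size=(72, 12))            # hypothetical note-presence patterns
X = np.repeat(ragas, 20, axis=0).astype(float)       # 20 noisy presentations per raga
X += rng.normal(0, 0.05, size=X.shape)
y = np.repeat(np.arange(72), 20)

mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", mlp.score(X, y))
```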

  8. Modeling Personalized Email Prioritization: Classification-based and Regression-based Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Yoo S.; Yang, Y.; Carbonell, J.

    2011-10-24

    Email overload, even after spam filtering, presents a serious productivity challenge for busy professionals and executives. One solution is automated prioritization of incoming emails to ensure the most important are read and processed quickly, while others are processed later as/if time permits in declining priority levels. This paper presents a study of machine learning approaches to email prioritization into discrete levels, comparing ordinal regression versus classifier cascades. Given the ordinal nature of discrete email priority levels, SVM ordinal regression would be expected to perform well, but surprisingly a cascade of SVM classifiers significantly outperforms ordinal regression for email prioritization. In contrast, SVM regression performs well -- better than classifiers -- on selected UCI data sets. This unexpected performance inversion is analyzed and results are presented, providing core functionality for email prioritization systems.
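
    One simple way to realize a cascade of binary SVMs for ordinal priority levels is to train one classifier per threshold ("priority greater than k") and sum their votes, in the spirit of the common Frank and Hall style reduction. The sketch below is illustrative; the features, levels and exact cascade construction used in the report may differ.

```python
# Sketch: ordinal email priorities predicted by a cascade of binary linear SVMs,
# one per threshold between consecutive levels. Synthetic features and labels.
import numpy as np
from sklearn.svm import LinearSVC

def fit_ordinal_cascade(X, y, levels):
    """Train one binary SVM per threshold 'priority > k'."""
    return [LinearSVC(C=1.0, max_iter=5000).fit(X, (y > k).astype(int))
            for k in levels[:-1]]

def predict_ordinal_cascade(models, X):
    """Predicted level = number of 'greater-than' classifiers that fire."""
    return np.sum([m.predict(X) for m in models], axis=0)

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 20))                 # synthetic email features
y = rng.integers(0, 5, size=300)               # priority levels 0..4 (synthetic)
models = fit_ordinal_cascade(X, y, levels=np.arange(5))
print(predict_ordinal_cascade(models, X[:10]))
```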

  9. Logical-Rule Models of Classification Response Times: A Synthesis of Mental-Architecture, Random-Walk, and Decision-Bound Approaches

    Science.gov (United States)

    Fific, Mario; Little, Daniel R.; Nosofsky, Robert M.

    2010-01-01

    We formalize and provide tests of a set of logical-rule models for predicting perceptual classification response times (RTs) and choice probabilities. The models are developed by synthesizing mental-architecture, random-walk, and decision-bound approaches. According to the models, people make independent decisions about the locations of stimuli…

  10. Classification of Breast Cancer Resistant Protein (BCRP) Inhibitors and Non-Inhibitors Using Machine Learning Approaches.

    Science.gov (United States)

    Belekar, Vilas; Lingineni, Karthik; Garg, Prabha

    2015-01-01

    The breast cancer resistant protein (BCRP) is an important transporter and its inhibitors play an important role in cancer treatment by improving the oral bioavailability as well as blood brain barrier (BBB) permeability of anticancer drugs. In this work, a computational model was developed to predict whether compounds are BCRP inhibitors or non-inhibitors. Various machine learning approaches like support vector machine (SVM), k-nearest neighbor (k-NN) and artificial neural network (ANN) were used to develop the models. The Matthews correlation coefficients (MCC) of the developed models using ANN, k-NN and SVM are 0.67, 0.71 and 0.77, and prediction accuracies are 85.2%, 88.3% and 90.8%, respectively. The developed models were tested with a test set of 99 compounds and further validated with an external set of 98 compounds. Distribution plot analysis and various machine learning models were also developed based on druglikeness descriptors. The applicability domain is used to check the prediction reliability of the new molecules.
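
    A minimal sketch of the model-building and evaluation protocol above, using an SVM and the Matthews correlation coefficient on a held-out set. Descriptors and labels are synthetic placeholders, not the BCRP dataset.

```python
# Sketch: SVM classifier on molecular descriptors, evaluated with MCC and accuracy.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import matthews_corrcoef, accuracy_score

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 50))           # druglikeness / molecular descriptors (synthetic)
y = rng.integers(0, 2, size=500)         # 1 = BCRP inhibitor, 0 = non-inhibitor

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("MCC:", matthews_corrcoef(y_te, pred))
print("Accuracy:", accuracy_score(y_te, pred))
```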

  11. Novel approach to predict the azeotropy at any pressure using classification by subgroups

    Directory of Open Access Journals (Sweden)

    Taehyung Kim

    2012-11-01

    Full Text Available Distillation is one of the dominating separation processes, but there are some problems, as inseparable mixtures are formed in some cases. This phenomenon is called azeotropy. It is essential to understand azeotropy in any distillation process since azeotropes, i.e. inseparable mixtures, cannot be separated by ordinary distillation. In this study, to construct a model which predicts azeotropic formation at any pressure, a novel approach using a support vector machine (SVM) is presented. The SVM method is used to classify data into two classes, that is, azeotropes and non-azeotropes. 13 variables, including pressure, were used as explanatory variables in this model. From the results of the SVM models which were constructed with data measured at 1 atm and data measured at all pressures, the 1 atm model showed a higher prediction performance on the data measured at 1 atm than the all-pressure model. Thus, to improve the performance of the all-pressure model, we focused on the intermolecular forces of solvents. The SVM models were constructed with only data of the solvents having the same subgroups. The accuracy of the model increased, and it is expected that this proposed method can be used to predict azeotropic formation at any pressure with high accuracy.

  12. A connectionist-geostatistical approach for classification of deformation types in ice surfaces

    Science.gov (United States)

    Goetz-Weiss, L. R.; Herzfeld, U. C.; Hale, R. G.; Hunke, E. C.; Bobeck, J.

    2014-12-01

    Deformation is a class of highly non-linear geophysical processes from which one can infer other geophysical variables in a dynamical system. For example, in an ice-dynamic model, deformation is related to velocity, basal sliding, surface elevation changes, and the stress field at the surface as well as internal to a glacier. While many of these variables cannot be observed, deformation state can be an observable variable, because deformation in glaciers (once a viscosity threshold is exceeded) manifests itself in crevasses. Given the amount of information that can be inferred from observing surface deformation, an automated method for classifying surface imagery becomes increasingly desirable. In this paper a Neural Network is used to recognize classes of crevasse types over the Bering Bagley Glacier System (BBGS) during a surge (2011-2013-?). A surge is a spatially and temporally highly variable and rapid acceleration of the glacier. Therefore, many different crevasse types occur in a short time frame and in close proximity, and these crevasse fields hold information on the geophysical processes of the surge. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the Neural Network can recognize. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect the appearance in imagery, we have developed semi-automated pre-training software to adapt the Neural Net to changing conditions. The method is applied to airborne and satellite imagery to classify surge crevasses from the BBGS surge. This method works well for classifying spatially repetitive images such as the crevasses over Bering Glacier. We expand the network for less repetitive images in order to analyze imagery collected over the Arctic sea ice, to assess the percentage of deformed ice for model calibration.
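
    The parameterization step of the connectionist-geostatistical approach can be illustrated with a directional experimental variogram: half the mean squared difference between pixel values separated by a given lag along a chosen direction. The sketch below computes it along one image axis for a synthetic image; the paper's actual pre-processing may differ.

```python
# Sketch: directional experimental variogram of an image, the kind of parameterisation
# that can be fed to a neural network instead of raw pixels.
import numpy as np

rng = np.random.default_rng(13)
img = rng.normal(size=(128, 128))        # synthetic stand-in for a crevasse image patch

def directional_variogram(image, max_lag=20, axis=1):
    """Experimental variogram gamma(h) for lags 1..max_lag along one image axis."""
    gammas = []
    for h in range(1, max_lag + 1):
        a = np.take(image, range(image.shape[axis] - h), axis=axis)
        b = np.take(image, range(h, image.shape[axis]), axis=axis)
        gammas.append(0.5 * np.mean((a - b) ** 2))
    return np.array(gammas)

print(directional_variogram(img)[:5])    # variogram values for lags 1..5
```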

  13. Medical subdomain classification of clinical notes using a machine learning-based natural language processing approach.

    Science.gov (United States)

    Weng, Wei-Hung; Wagholikar, Kavishwar B; McCray, Alexa T; Szolovits, Peter; Chueh, Henry C

    2017-12-01

    A machine learning-based NLP approach is useful to develop medical subdomain classifiers. The deep learning algorithm with distributed word representation yields better performance, yet shallow learning algorithms with word and concept representations achieve comparable performance with better clinical interpretability. Portable classifiers may also be used across datasets from different institutions.

  14. Spatial prediction of landslides using a hybrid machine learning approach based on Random Subspace and Classification and Regression Trees

    Science.gov (United States)

    Pham, Binh Thai; Prakash, Indra; Tien Bui, Dieu

    2018-02-01

    A hybrid machine learning approach of Random Subspace (RSS) and Classification And Regression Trees (CART) is proposed to develop a model named RSSCART for spatial prediction of landslides. This model is a combination of the RSS method, which is known as an efficient ensemble technique, and CART, which is a state-of-the-art classifier. The Luc Yen district of Yen Bai province, a prominent landslide prone area of Viet Nam, was selected for the model development. Performance of the RSSCART model was evaluated through the Receiver Operating Characteristic (ROC) curve, statistical analysis methods, and the Chi Square test. Results were compared with other benchmark landslide models, namely Support Vector Machines (SVM), single CART, Naïve Bayes Trees (NBT), and Logistic Regression (LR). In the development of the model, important landslide affecting factors related to geomorphology, geology and geo-environment were considered, namely slope angle, elevation, slope aspect, curvature, lithology, distance to faults, distance to rivers, distance to roads, and rainfall. Performance of the RSSCART model (AUC = 0.841) is the best compared with the other popular landslide models, namely SVM (0.835), single CART (0.822), NBT (0.821), and LR (0.723). These results indicate that RSSCART is a promising method for spatial landslide prediction.
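
    The Random Subspace plus CART combination can be approximated with a bagging ensemble of decision trees in which each tree sees a random subset of the conditioning factors. The sketch below uses scikit-learn's BaggingClassifier with feature subsampling as a stand-in for RSS; the data are synthetic placeholders, not the Luc Yen landslide inventory.

```python
# Sketch: Random-Subspace-style ensemble of CART trees evaluated by ROC AUC.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
X = rng.normal(size=(1000, 10))          # landslide conditioning factors (synthetic)
y = rng.integers(0, 2, size=1000)        # 1 = landslide, 0 = non-landslide

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rsscart = BaggingClassifier(
    DecisionTreeClassifier(),            # CART base classifier
    n_estimators=100,
    max_features=0.5,                    # random subspace: half of the factors per tree
    bootstrap=False,
    bootstrap_features=True,
    random_state=0,
).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, rsscart.predict_proba(X_te)[:, 1]))
```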

  15. A novel approach for detection and classification of mammographic microcalcifications using wavelet analysis and extreme learning machine.

    Science.gov (United States)

    Malar, E; Kandaswamy, A; Chakravarthy, D; Giri Dharan, A

    2012-09-01

    The objective of this paper is to reveal the effectiveness of wavelet-based tissue texture analysis for microcalcification detection in digitized mammograms using an Extreme Learning Machine (ELM). Microcalcifications are tiny deposits of calcium in the breast tissue which are potential indicators for early detection of breast cancer. The dense nature of the breast tissue and the poor contrast of the mammogram image hinder the effective identification of microcalcifications. Hence, a new approach to discriminate microcalcifications from normal tissue is proposed using wavelet features and is compared with different feature vectors extracted using Gray Level Spatial Dependence Matrix (GLSDM) and Gabor filter based techniques. A total of 120 Regions of Interest (ROIs) extracted from 55 mammogram images of the mini-MIAS database, including normal and microcalcification images, are used in the current research. The network is trained with the above-mentioned features and the results show that ELM produces relatively better classification accuracy (94%) with a significant reduction in training time than other artificial neural networks like the Bayesnet classifier, Naivebayes classifier, and Support Vector Machine. ELM also avoids problems like local minima, improper learning rate, and overfitting. Copyright © 2012 Elsevier Ltd. All rights reserved.
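
    The core of an Extreme Learning Machine is small enough to write out directly: a randomly weighted hidden layer followed by a single least-squares solve for the output weights. The sketch below works on synthetic placeholder feature vectors; the wavelet, GLSDM and Gabor feature extraction steps are not shown.

```python
# Sketch: bare-bones ELM classifier (random hidden layer + least-squares output weights).
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(120, 16))           # e.g., wavelet-based texture features per ROI (synthetic)
y = rng.integers(0, 2, size=120)         # 1 = microcalcification, 0 = normal tissue

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))      # random input-to-hidden weights
b = rng.normal(size=n_hidden)                    # random hidden biases
H = np.tanh(X @ W + b)                           # hidden-layer activations
T = np.eye(2)[y]                                 # one-hot targets
beta, *_ = np.linalg.lstsq(H, T, rcond=None)     # output weights by least squares

pred = np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
print("training accuracy:", (pred == y).mean())
```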

  16. An automated cirrus classification

    Science.gov (United States)

    Gryspeerdt, Edward; Quaas, Johannes; Goren, Tom; Klocke, Daniel; Brueck, Matthias

    2018-05-01

    Cirrus clouds play an important role in determining the radiation budget of the earth, but many of their properties remain uncertain, particularly their response to aerosol variations and to warming. Part of the reason for this uncertainty is the dependence of cirrus cloud properties on the cloud formation mechanism, which itself is strongly dependent on the local meteorological conditions. In this work, a classification system (Identification and Classification of Cirrus or IC-CIR) is introduced to identify cirrus clouds by the cloud formation mechanism. Using reanalysis and satellite data, cirrus clouds are separated into four main types: orographic, frontal, convective and synoptic. Through a comparison to convection-permitting model simulations and back-trajectory-based analysis, it is shown that these observation-based regimes can provide extra information on the cloud-scale updraughts and the frequency of occurrence of liquid-origin ice, with the convective regime having higher updraughts and a greater occurrence of liquid-origin ice compared to the synoptic regimes. Despite having different cloud formation mechanisms, the radiative properties of the regimes are not distinct, indicating that retrieved cloud properties alone are insufficient to completely describe them. This classification is designed to be easily implemented in GCMs, helping improve future model-observation comparisons and leading to improved parametrisations of cirrus cloud processes.

  17. Characteristics of regulatory regimes

    Directory of Open Access Journals (Sweden)

    Noralv Veggeland

    2013-03-01

    Full Text Available The overarching theme of this paper is institutional analysis of basic characteristics of regulatory regimes. The concepts of path dependence and administrative traditions are used throughout. Self-reinforcing or positive feedback processes in political systems represent a basic framework. The empirical point of departure is the EU public procurement directive linked to OECD data concerning use of outsourcing among member states. The question is asked: What has caused the Nordic countries, traditionally not belonging to the Anglo-Saxon market-centred administrative tradition, to be placed so high on the ranking as users of the Market-Type Mechanism (MTM) of outsourcing in the public sector vs. in-house provision of services? A thesis is that the reason may be complex, but might be found in an innovative Scandinavian regulatory approach rooted in the Nordic model.

  18. Classifying Classifications

    DEFF Research Database (Denmark)

    Debus, Michael S.

    2017-01-01

    This paper critically analyzes seventeen game classifications. The classifications were chosen on the basis of diversity, ranging from pre-digital classifications (e.g. Murray 1952), over game studies classifications (e.g. Elverdam & Aarseth 2007), to classifications of drinking games (e.g. LaBrie et al. 2013). The analysis aims at three goals: the classifications’ internal consistency, the abstraction of classification criteria, and the identification of differences in classification across fields and/or time. Especially the abstraction of classification criteria can be used in future endeavors into the topic of game classifications.

  19. Identificando mudanças de regimes sistêmicos em processos econômicos: um procedimento baseado na abordagem de dinâmica de sistemas Identifying systemic regime shifts in economic processes: a procedure based on the system dynamics approach

    Directory of Open Access Journals (Sweden)

    Newton Paulo Bueno

    2013-04-01

    Full Text Available A tese deste trabalho é que as técnicas mais sofisticadas atualmente utilizadas pelos economistas para fazer previsões (métodos não estruturais de previsão, em geral, e modelos de detecção de mudanças de regime, em particular) não parecem realmente muito eficazes em prever mudanças radicais de regime como a que ocorreu na economia mundial recentemente. Assim, para aumentar seu grau de acurácia, parece razoável imaginar que tais técnicas devam ser complementadas por abordagens mais holísticas. O objetivo geral deste trabalho é mostrar que a metodologia de dinâmica de sistemas (system dynamics), que permite identificar os ciclos de feedback que comandam a dinâmica de sistemas complexos, parece estar especialmente bem-equipada para se tornar uma dessas abordagens complementares. Pretende-se, especificamente, apresentar um algoritmo sistêmico para identificar processos de mudança de regime como os que ocorrem quando uma economia, após anos de expansão continuada, sofre os efeitos da explosão de uma bolha financeira, como ocorreu recentemente. This paper argues that the sophisticated techniques presently used by economists to forecast macroeconomic variables behavior (non-structural forecasting methods, in general, and regime-switching models, in particular) do not seem much effective for anticipating radical regime shifts as recently happened in the world economy. Thus, in order to improve their accuracy, it seems that they should be complemented by more holistic approaches. The general purpose of the paper is to show that the system dynamics methodology, which allows identifying the critical feedback loops that drive complex systems' dynamics, seems to be especially fitted to be one of those complementary approaches. To reach that goal, we present a systemic algorithm which allows identifying regime shift processes as the ones that take place when an economy is hit by the effects of a financial bubble burst.

  20. Effect of a standardized treatment regime for infection after osteosynthesis.

    Science.gov (United States)

    Hellebrekers, Pien; Leenen, Luke P H; Hoekstra, Meriam; Hietbrink, Falco

    2017-03-09

    Infection after osteosynthesis is an important complication with significant morbidity and even mortality. These infections are often caused by biofilm-producing bacteria. Treatment algorithms dictate an aggressive approach with surgical debridement and antibiotic treatment. The aim of this study is to analyze the effect of such an aggressive standardized treatment regime with implant retention for acute infection after osteosynthesis. The existing regime consisted of implant retention, thorough surgical debridement, and immediate antibiotic combination therapy with rifampicin. The primary outcome was success. Success was defined as consolidation of the fracture and resolved symptoms of infection. Culture and susceptibility testing were performed to identify bacteria and resistance patterns. Univariate analysis was conducted on patient-related factors in association with primary success and antibiotic resistance. Forty-nine patients were included for analysis. The primary success rate was 63% and the overall success rate 88%. Factors negatively associated with primary success were the following: Gustilo classification (P = 0.023), a higher number of debridements needed (P = 0.015), inability of primary closure (P = 0.017), and subsequent application of vacuum therapy (P = 0.030). Adherence to the treatment regime was positively related to primary success (P = 0.034). The described treatment protocol results in high success rates, comparable with success rates achieved in staged exchange in prosthetic joint infection treatment.

  1. Valuing a gas-fired power plant: A comparison of ordinary linear models, regime-switching approaches, and models with stochastic volatility

    International Nuclear Information System (INIS)

    Heydari, Somayeh; Siddiqui, Afzal

    2010-01-01

    Energy prices are often highly volatile with unexpected spikes. Capturing these sudden spikes may lead to more informed decision-making in energy investments, such as valuing gas-fired power plants, than ignoring them. In this paper, non-linear regime-switching models and models with mean-reverting stochastic volatility are compared with ordinary linear models. The study is performed using UK electricity and natural gas daily spot prices and suggests that with the aim of valuing a gas-fired power plant with and without operational flexibility, non-linear models with stochastic volatility, specifically for logarithms of electricity prices, provide better out-of-sample forecasts than both linear models and regime-switching models.
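
    For reference, a two-regime Markov-switching model of the kind compared above can be fitted to (simulated) log spot prices with statsmodels, assuming its MarkovRegression implementation with switching variance; this is only a sketch of one benchmark model family, not the paper's exact specification.

```python
# Sketch: two-regime Markov-switching model (switching mean and variance) on
# simulated log spot prices with occasional spikes.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 500
spike = rng.random(n) < 0.1                          # occasional spike regime
log_price = 3.5 + 0.1 * rng.normal(size=n) + spike * rng.normal(1.0, 0.5, size=n)

model = sm.tsa.MarkovRegression(log_price, k_regimes=2, trend="c", switching_variance=True)
res = model.fit()
print(res.summary())                                 # regime-specific means and variances
# res.smoothed_marginal_probabilities gives the probability of each regime over time
```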

  2. Eigenvalue-eigenvector decomposition (EED) analysis of dissimilarity and covariance matrix obtained from total synchronous fluorescence spectral (TSFS) data sets of herbal preparations: Optimizing the classification approach

    Science.gov (United States)

    Tarai, Madhumita; Kumar, Keshav; Divya, O.; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar

    2017-09-01

    The present work compares the dissimilarity and covariance based unsupervised chemometric classification approaches by taking the total synchronous fluorescence spectroscopy data sets acquired for the cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that can explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix.

  3. Eigenvalue-eigenvector decomposition (EED) analysis of dissimilarity and covariance matrix obtained from total synchronous fluorescence spectral (TSFS) data sets of herbal preparations: Optimizing the classification approach.

    Science.gov (United States)

    Tarai, Madhumita; Kumar, Keshav; Divya, O; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar

    2017-09-05

    The present work compares the dissimilarity and covariance based unsupervised chemometric classification approaches by taking the total synchronous fluorescence spectroscopy data sets acquired for the cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that can explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix. Copyright © 2017 Elsevier B.V. All rights reserved.
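
    The contrast drawn in these two records, eigen-decomposition of the covariance matrix versus decomposition of the pair-wise dissimilarity matrix, can be sketched with plain numpy: the former is a PCA-style decomposition, the latter the classical multidimensional scaling construction on a double-centred distance matrix. The spectra below are synthetic stand-ins for TSFS data.

```python
# Sketch: covariance-based vs dissimilarity-based eigenvalue-eigenvector decomposition.
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(40, 200))                      # 40 samples x 200 spectral variables (synthetic)

# Covariance-based decomposition (conventional, PCA-style)
Xc = X - X.mean(axis=0)
cov_vals, cov_vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
scores_cov = Xc @ cov_vecs[:, ::-1][:, :2]          # scores on the two leading components

# Dissimilarity-based decomposition (classical MDS on squared Euclidean distances)
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1) ** 2
J = np.eye(len(X)) - np.ones((len(X), len(X))) / len(X)
B = -0.5 * J @ D @ J                                # double-centred dissimilarity matrix
mds_vals, mds_vecs = np.linalg.eigh(B)
scores_dis = mds_vecs[:, ::-1][:, :2] * np.sqrt(np.maximum(mds_vals[::-1][:2], 0))

print(scores_cov.shape, scores_dis.shape)           # (40, 2) per decomposition
```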

  4. The morphological / settlement pattern classification of South African settlements based on a settlement catchment approach, to inform facility allocation and service delivery

    CSIR Research Space (South Africa)

    Sogoni, Z

    2016-07-01

    Full Text Available The morphological / settlement pattern classification of South African settlements based on a settlement catchment approach, to inform facility allocation and service delivery (Zukisa Sogoni, Planning Africa Conference 2016, 4 July). Project focus and background: CSIR... services. The purpose is to support application and planning for new investment and to prevent “unsustainable” investments / white elephants. Outputs: a national set of service delivery catchments, profile information per individual catchment, and a ranking...

  5. Colombia: Territorial classification

    International Nuclear Information System (INIS)

    Mendoza Morales, Alberto

    1998-01-01

    The article discusses the approaches to territorial classification, thematic axes, principles of management and territorial occupation, political and administrative units, and administrative regions, among other topics. Territorial classification is understood as the spatial distribution over the country's territory of the geographical configurations, the human communities, the political-administrative units and the uses of the soil, urban and rural, existing and proposed.

  6. A Hidden Markov Models Approach for Crop Classification: Linking Crop Phenology to Time Series of Multi-Sensor Remote Sensing Data

    Directory of Open Access Journals (Sweden)

    Sofia Siachalou

    2015-03-01

    Full Text Available Vegetation monitoring and mapping based on multi-temporal imagery has recently received much attention due to the plethora of medium-high spatial resolution satellites and the improved classification accuracies attained compared to uni-temporal approaches. Efficient image processing strategies are needed to exploit the phenological information present in temporal image sequences and to limit data redundancy and computational complexity. Within this framework, we implement the theory of Hidden Markov Models in crop classification, based on the time-series analysis of phenological states, inferred by a sequence of remote sensing observations. More specifically, we model the dynamics of vegetation over an agricultural area of Greece, characterized by spatio-temporal heterogeneity and small-sized fields, using RapidEye and Landsat ETM+ imagery. In addition, the classification performance of image sequences with variable spatial and temporal characteristics is evaluated and compared. The classification model considering one RapidEye and four pan-sharpened Landsat ETM+ images was found superior, resulting in a conditional kappa from 0.77 to 0.94 per class and an overall accuracy of 89.7%. The results highlight the potential of the method for operational crop mapping in Euro-Mediterranean areas and provide some hints for optimal image acquisition windows regarding major crop types in Greece.
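
    The core modelling idea, hidden phenological states inferred from a sequence of observations, can be illustrated with a Gaussian hidden Markov model: one model per crop type can be fit, and a new field sequence assigned to the crop whose model gives the highest likelihood. The sketch below uses the hmmlearn package on synthetic NDVI-like values and is not the authors' exact processing chain.

```python
# Sketch: Gaussian HMM over per-field vegetation time series; hidden states stand in
# for phenological stages. Requires the hmmlearn package.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(8)
# 30 fields x 5 acquisition dates x 1 feature (e.g., NDVI), flattened for hmmlearn
sequences = rng.uniform(0.1, 0.9, size=(30, 5, 1))
X = sequences.reshape(-1, 1)
lengths = [5] * 30                                   # one length entry per field

model = hmm.GaussianHMM(n_components=4, covariance_type="diag", n_iter=50, random_state=0)
model.fit(X, lengths)                                # learn state dynamics from all fields
states = model.predict(sequences[0])                 # most likely state sequence for one field
loglik = model.score(sequences[0])                   # likelihood used for crop assignment
print(states, loglik)
```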

  7. Classification and evaluation strategies of auto-segmentation approaches for PET: Report of AAPM task group No. 211

    Science.gov (United States)

    Hatt, Mathieu; Lee, John A.; Schmidtlein, Charles R.; Naqa, Issam El; Caldwell, Curtis; De Bernardi, Elisabetta; Lu, Wei; Das, Shiva; Geets, Xavier; Gregoire, Vincent; Jeraj, Robert; MacManus, Michael P.; Mawlawi, Osama R.; Nestle, Ursula; Pugachev, Andrei B.; Schöder, Heiko; Shepherd, Tony; Spezi, Emiliano; Visvikis, Dimitris; Zaidi, Habib; Kirov, Assen S.

    2017-01-01

    Purpose The purpose of this educational report is to provide an overview of the present state-of-the-art PET auto-segmentation (PET-AS) algorithms and their respective validation, with an emphasis on providing the user with help in understanding the challenges and pitfalls associated with selecting and implementing a PET-AS algorithm for a particular application. Approach A brief description of the different types of PET-AS algorithms is provided using a classification based on method complexity and type. The advantages and the limitations of the current PET-AS algorithms are highlighted based on current publications and existing comparison studies. A review of the available image datasets and contour evaluation metrics in terms of their applicability for establishing a standardized evaluation of PET-AS algorithms is provided. The performance requirements for the algorithms and their dependence on the application, the radiotracer used and the evaluation criteria are described and discussed. Finally, a procedure for algorithm acceptance and implementation, as well as the complementary role of manual and auto-segmentation are addressed. Findings A large number of PET-AS algorithms have been developed within the last 20 years. Many of the proposed algorithms are based on either fixed or adaptively selected thresholds. More recently, numerous papers have proposed the use of more advanced image analysis paradigms to perform semi-automated delineation of the PET images. However, the level of algorithm validation is variable and for most published algorithms is either insufficient or inconsistent which prevents recommending a single algorithm. This is compounded by the fact that realistic image configurations with low signal-to-noise ratios (SNR) and heterogeneous tracer distributions have rarely been used. Large variations in the evaluation methods used in the literature point to the need for a standardized evaluation protocol. Conclusions Available comparison studies

  8. A discriminative model-constrained EM approach to 3D MRI brain tissue classification and intensity non-uniformity correction

    International Nuclear Information System (INIS)

    Wels, Michael; Hornegger, Joachim; Zheng Yefeng; Comaniciu, Dorin; Huber, Martin

    2011-01-01

    We describe a fully automated method for tissue classification, which is the segmentation into cerebral gray matter (GM), cerebral white matter (WM), and cerebral spinal fluid (CSF), and intensity non-uniformity (INU) correction in brain magnetic resonance imaging (MRI) volumes. It combines supervised MRI modality-specific discriminative modeling and unsupervised statistical expectation maximization (EM) segmentation into an integrated Bayesian framework. While both the parametric observation models and the non-parametrically modeled INUs are estimated via EM during segmentation itself, a Markov random field (MRF) prior model regularizes segmentation and parameter estimation. Firstly, the regularization takes into account knowledge about spatial and appearance-related homogeneity of segments in terms of pairwise clique potentials of adjacent voxels. Secondly and more importantly, patient-specific knowledge about the global spatial distribution of brain tissue is incorporated into the segmentation process via unary clique potentials. They are based on a strong discriminative model provided by a probabilistic boosting tree (PBT) for classifying image voxels. It relies on the surrounding context and alignment-based features derived from a probabilistic anatomical atlas. The context considered is encoded by 3D Haar-like features of reduced INU sensitivity. Alignment is carried out fully automatically by means of an affine registration algorithm minimizing cross-correlation. Both types of features do not immediately use the observed intensities provided by the MRI modality but instead rely on specifically transformed features, which are less sensitive to MRI artifacts. Detailed quantitative evaluations on standard phantom scans and standard real-world data show the accuracy and robustness of the proposed method. They also demonstrate relative superiority in comparison to other state-of-the-art approaches to this kind of computational task: our method achieves average

  9. Sustainable urban regime adjustments

    DEFF Research Database (Denmark)

    Quitzau, Maj-Britt; Jensen, Jens Stissing; Elle, Morten

    2013-01-01

    The endogenous agency that urban governments increasingly portray by making conscious and planned efforts to adjust the regimes they operate within is currently not well captured in transition studies. There is a need to acknowledge the ambiguity of regime enactment at the urban scale. This direc...

  10. Flux scaling: Ultimate regime

    Indian Academy of Sciences (India)

    Flux scaling: Ultimate regime. With the Nusselt number and the mixing length scales, we get the Nusselt number and Reynolds number (w'd/ν) scalings [equations not reproduced in this record], a scaling expected to occur at extremely high Ra in Rayleigh-Benard convection.

  11. Role of medium heterogeneity and viscosity contrast in miscible flow regimes and mixing zone growth: A computational pore-scale approach

    Science.gov (United States)

    Afshari, Saied; Hejazi, S. Hossein; Kantzas, Apostolos

    2018-05-01

    Miscible displacement of fluids in porous media is often characterized by the scaling of the mixing zone length with displacement time. Depending on the viscosity contrast of the fluids, the scaling law varies between a square-root relationship, the signature of a dispersive transport regime during stable displacement, and a linear relationship, which represents the viscous fingering regime during an unstable displacement. The presence of heterogeneities in a porous medium significantly affects the scaling behavior of the mixing length as it interacts with the viscosity contrast to control the mixing of fluids in the pore space. In this study, the dynamics of the flow and transport during both unit and adverse viscosity ratio miscible displacements are investigated in heterogeneous packings of circular grains using pore-scale numerical simulations. The pore-scale heterogeneity level is characterized by the variations of the grain diameter and velocity field. The growth of the mixing length is employed to identify the nature of the miscible transport regime at different viscosity ratios and heterogeneity levels. It is shown that as the viscosity ratio increases to higher adverse values, the scaling law of the mixing length gradually shifts from dispersive to fingering behavior up to a certain viscosity ratio and remains almost the same afterwards. In heterogeneous media, the mixing length scaling law is observed to be generally governed by the variations of the velocity field rather than the grain size. Furthermore, the normalization of the mixing length temporal plots with respect to the governing parameters of viscosity ratio, heterogeneity, medium length, and medium aspect ratio is performed. The results indicate that the mixing length scales exponentially with log-viscosity ratio and grain size standard deviation, while the impact of aspect ratio is insignificant. For stable flows, the mixing length scales with the square root of medium length, whereas it changes linearly with length during

  12. Describing the brain in autism in five dimensions--magnetic resonance imaging-assisted diagnosis of autism spectrum disorder using a multiparameter classification approach.

    Science.gov (United States)

    Ecker, Christine; Marquand, Andre; Mourão-Miranda, Janaina; Johnston, Patrick; Daly, Eileen M; Brammer, Michael J; Maltezos, Stefanos; Murphy, Clodagh M; Robertson, Dene; Williams, Steven C; Murphy, Declan G M

    2010-08-11

    Autism spectrum disorder (ASD) is a neurodevelopmental condition with multiple causes, comorbid conditions, and a wide range in the type and severity of symptoms expressed by different individuals. This makes the neuroanatomy of autism inherently difficult to describe. Here, we demonstrate how a multiparameter classification approach can be used to characterize the complex and subtle structural pattern of gray matter anatomy implicated in adults with ASD, and to reveal spatially distributed patterns of discriminating regions for a variety of parameters describing brain anatomy. A set of five morphological parameters including volumetric and geometric features at each spatial location on the cortical surface was used to discriminate between people with ASD and controls using a support vector machine (SVM) analytic approach, and to find a spatially distributed pattern of regions with maximal classification weights. On the basis of these patterns, SVM was able to identify individuals with ASD at a sensitivity and specificity of up to 90% and 80%, respectively. However, the ability of individual cortical features to discriminate between groups was highly variable, and the discriminating patterns of regions varied across parameters. The classification was specific to ASD rather than neurodevelopmental conditions in general (e.g., attention deficit hyperactivity disorder). Our results confirm the hypothesis that the neuroanatomy of autism is truly multidimensional, and affects multiple and most likely independent cortical features. The spatial patterns detected using SVM may help further exploration of the specific genetic and neuropathological underpinnings of ASD, and provide new insights into the most likely multifactorial etiology of the condition.

  13. A Novel Approach to Developing a Supervised Spatial Decision Support System for Image Classification: A Study of Paddy Rice Investigation

    Directory of Open Access Journals (Sweden)

    Shih-Hsun Chang

    2014-01-01

    Full Text Available Paddy rice area estimation via remote sensing techniques has been well established in recent years. Texture information and vegetation indicators are widely used to improve the classification accuracy of satellite images. Accordingly, this study employs texture information and vegetation indicators as ancillary information for classifying paddy rice through remote sensing images. In the first stage, the images are attained using a remote sensing technique and ancillary information is employed to increase the accuracy of classification. In the second stage, we construct an efficient supervised classifier, which is used to evaluate the ancillary information. In the third stage, linear discriminant analysis (LDA) is introduced. LDA is a well-known method for classifying images into various categories. Also, the particle swarm optimization (PSO) algorithm is employed to optimize the LDA classification outcomes and increase classification performance. In the fourth stage, we discuss the strategy of selecting different window sizes and analyze particle numbers and iteration numbers with the corresponding accuracy. Accordingly, a rational strategy for the combination of ancillary information is introduced. Afterwards, the PSO algorithm improves the accuracy rate from 82.26% to 89.31%. The improved accuracy results in a much lower salt-and-pepper effect in the thematic map.
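
    The LDA classification stage, with ancillary texture and vegetation features stacked alongside the spectral bands, can be sketched as below; the PSO refinement described in the abstract is omitted, and all arrays are synthetic placeholders.

```python
# Sketch: pixel-wise LDA classification of paddy rice using spectral bands plus
# ancillary texture and vegetation-index features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)
spectral = rng.normal(size=(2000, 4))        # 4 spectral bands per pixel (synthetic)
texture = rng.normal(size=(2000, 3))         # e.g., texture measures from a moving window
ndvi = rng.uniform(-1, 1, size=(2000, 1))    # vegetation indicator
X = np.hstack([spectral, texture, ndvi])
y = rng.integers(0, 2, size=2000)            # 1 = paddy rice, 0 = other

lda = LinearDiscriminantAnalysis()
print("CV accuracy:", cross_val_score(lda, X, y, cv=5).mean())
```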

  14. Diagnostics of enterprise bankruptcy occurrence probability in an anti-crisis management: modern approaches and classification of models

    Directory of Open Access Journals (Sweden)

    I.V. Zhalinska

    2015-09-01

    Full Text Available Diagnostics of the probability of enterprise bankruptcy is defined as an important tool for ensuring the viability of an organization under conditions of an unpredictable, dynamic environment. The paper aims to define the basic features of models for diagnosing the probability of bankruptcy occurrence and to classify them. The article substantiates the objective increase of crisis probability in modern enterprises, an increase that leads to the need to improve the efficiency of anti-crisis activities. The system of anti-crisis management is based on the subsystem of diagnostics of bankruptcy occurrence probability. Such a subsystem is the main one for further measures to prevent and overcome the crisis. A classification of existing models of enterprise bankruptcy occurrence probability is suggested. The classification is based on the methodical and methodological principles of the models. The following main groups of models are determined: models using financial ratios, aggregates and scores; discriminant analysis models; methods of strategic analysis; informal models; artificial intelligence systems; and combinations of these models. The classification made it possible to identify the analytical capabilities of each of the suggested groups of models.

  15. Classification and evaluation strategies of auto-segmentation approaches for PET: Report of AAPM task group No. 211.

    Science.gov (United States)

    Hatt, Mathieu; Lee, John A; Schmidtlein, Charles R; Naqa, Issam El; Caldwell, Curtis; De Bernardi, Elisabetta; Lu, Wei; Das, Shiva; Geets, Xavier; Gregoire, Vincent; Jeraj, Robert; MacManus, Michael P; Mawlawi, Osama R; Nestle, Ursula; Pugachev, Andrei B; Schöder, Heiko; Shepherd, Tony; Spezi, Emiliano; Visvikis, Dimitris; Zaidi, Habib; Kirov, Assen S

    2017-06-01

    The purpose of this educational report is to provide an overview of the present state-of-the-art PET auto-segmentation (PET-AS) algorithms and their respective validation, with an emphasis on providing the user with help in understanding the challenges and pitfalls associated with selecting and implementing a PET-AS algorithm for a particular application. A brief description of the different types of PET-AS algorithms is provided using a classification based on method complexity and type. The advantages and the limitations of the current PET-AS algorithms are highlighted based on current publications and existing comparison studies. A review of the available image datasets and contour evaluation metrics in terms of their applicability for establishing a standardized evaluation of PET-AS algorithms is provided. The performance requirements for the algorithms and their dependence on the application, the radiotracer used and the evaluation criteria are described and discussed. Finally, a procedure for algorithm acceptance and implementation, as well as the complementary role of manual and auto-segmentation are addressed. A large number of PET-AS algorithms have been developed within the last 20 years. Many of the proposed algorithms are based on either fixed or adaptively selected thresholds. More recently, numerous papers have proposed the use of more advanced image analysis paradigms to perform semi-automated delineation of the PET images. However, the level of algorithm validation is variable and for most published algorithms is either insufficient or inconsistent which prevents recommending a single algorithm. This is compounded by the fact that realistic image configurations with low signal-to-noise ratios (SNR) and heterogeneous tracer distributions have rarely been used. Large variations in the evaluation methods used in the literature point to the need for a standardized evaluation protocol. Available comparison studies suggest that PET-AS algorithms relying
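
    The simplest PET-AS family mentioned in this report, threshold-based delineation, can be illustrated with a fixed-fraction rule and a background-adaptive rule. Both functions below are generic textbook-style illustrations on a synthetic uptake image, not any specific published algorithm.

```python
# Sketch: fixed vs background-adaptive threshold segmentation of a synthetic PET image.
import numpy as np

rng = np.random.default_rng(12)
pet = rng.gamma(2.0, 0.3, size=(64, 64))             # synthetic background activity
pet[24:40, 24:40] += 4.0                             # synthetic "lesion" uptake

def fixed_threshold(img, fraction=0.42):
    """Segment voxels above a fixed fraction of the maximum uptake."""
    return img >= fraction * img.max()

def adaptive_threshold(img, background, weight=0.5):
    """Threshold placed between mean background and maximum uptake."""
    level = background + weight * (img.max() - background)
    return img >= level

mask_fixed = fixed_threshold(pet)
mask_adapt = adaptive_threshold(pet, background=pet[:16, :16].mean())
print(mask_fixed.sum(), mask_adapt.sum())            # segmented voxel counts
```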

  16. Satellite Image Classification of Building Damages Using Airborne and Satellite Image Samples in a Deep Learning Approach

    Science.gov (United States)

    Duarte, D.; Nex, F.; Kerle, N.; Vosselman, G.

    2018-05-01

    The localization and detailed assessment of damaged buildings after a disastrous event is of utmost importance to guide response operations, recovery tasks or for insurance purposes. Several remote sensing platforms and sensors are currently used for the manual detection of building damages. However, there is an overall interest in the use of automated methods to perform this task, regardless of the platform used. Owing to its synoptic coverage and predictable availability, satellite imagery is currently used as input for the identification of building damages by the International Charter, as well as the Copernicus Emergency Management Service for the production of damage grading and reference maps. Recently proposed methods to perform image classification of building damages rely on convolutional neural networks (CNN). These are usually trained with only satellite image samples in a binary classification problem; however, the number of samples derived from these images is often limited, affecting the quality of the classification results. The use of up/down-sampled image samples during the training of a CNN has been demonstrated to improve several image recognition tasks in remote sensing. However, it is currently unclear if this multi-resolution information can also be captured from images with different spatial resolutions, such as satellite and airborne imagery (from both manned and unmanned platforms). In this paper, a CNN framework using residual connections and dilated convolutions is used, considering both manned and unmanned aerial image samples, to perform the satellite image classification of building damages. Three network configurations, trained with multi-resolution image samples, are compared against two benchmark networks where only satellite image samples are used. Combining feature maps generated from airborne and satellite image samples, and refining these using only the satellite image samples, improved the overall satellite image classification by nearly 4 %.
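
    The architectural ingredients named in the abstract, residual connections and dilated convolutions feeding a damaged/undamaged classifier, can be sketched in a few lines of PyTorch. Layer counts and sizes below are arbitrary; this is not the authors' network.

```python
# Sketch: tiny CNN with dilated convolutions and residual connections for a
# binary damaged/undamaged image classification.
import torch
import torch.nn as nn

class DilatedResidualBlock(nn.Module):
    def __init__(self, channels, dilation):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=dilation, dilation=dilation)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=dilation, dilation=dilation)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)            # residual connection

class DamageClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.stem = nn.Conv2d(3, 32, 3, padding=1)
        self.blocks = nn.Sequential(
            DilatedResidualBlock(32, dilation=1),
            DilatedResidualBlock(32, dilation=2),
            DilatedResidualBlock(32, dilation=4),
        )
        self.head = nn.Linear(32, 2)         # damaged vs undamaged

    def forward(self, x):
        x = self.blocks(self.stem(x))
        x = x.mean(dim=(2, 3))               # global average pooling
        return self.head(x)

logits = DamageClassifier()(torch.randn(4, 3, 64, 64))
print(logits.shape)                          # torch.Size([4, 2])
```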

  17. Echocardiographic Classification and Surgical Approaches to Double-Outlet Right Ventricle for Great Arteries Arising Almost Exclusively from the Right Ventricle.

    Science.gov (United States)

    Pang, Kun-Jing; Meng, Hong; Hu, Sheng-Shou; Wang, Hao; Hsi, David; Hua, Zhong-Dong; Pan, Xiang-Bin; Li, Shou-Jun

    2017-08-01

    Selecting an appropriate surgical approach for double-outlet right ventricle (DORV), a complex congenital cardiac malformation with many anatomic variations, is difficult. Therefore, we determined the feasibility of using an echocardiographic classification system, which describes the anatomic variations in more precise terms than the current system does, to determine whether it could help direct surgical plans. Our system includes 8 DORV subtypes, categorized according to 3 factors: the relative positions of the great arteries (normal or abnormal), the relationship between the great arteries and the ventricular septal defect (committed or noncommitted), and the presence or absence of right ventricular outflow tract obstruction (RVOTO). Surgical approaches in 407 patients were based on their DORV subtype, as determined by echocardiography. We found that the optimal surgical management of patients classified as normal/committed/no RVOTO, normal/committed/RVOTO, and abnormal/committed/no RVOTO was, respectively, like that for patients with large ventricular septal defects, tetralogy of Fallot, and transposition of the great arteries without RVOTO. Patients with abnormal/committed/RVOTO anatomy and those with abnormal/noncommitted/RVOTO anatomy underwent intraventricular repair and double-root translocation. For patients with other types of DORV, choosing the appropriate surgical approach and biventricular repair techniques was more complex. We think that our classification system accurately groups DORV patients and enables surgeons to select the best approach for each patient's cardiac anatomy.

  18. Investigating the Importance of the Pocket-estimation Method in Pocket-based Approaches: An Illustration Using Pocket-ligand Classification.

    Science.gov (United States)

    Caumes, Géraldine; Borrel, Alexandre; Abi Hussein, Hiba; Camproux, Anne-Claude; Regad, Leslie

    2017-09-01

    Small molecules interact with their protein target on surface cavities known as binding pockets. Pocket-based approaches are very useful in all of the phases of drug design. Their first step is estimating the binding pocket based on protein structure. The available pocket-estimation methods produce different pockets for the same target. The aim of this work is to investigate the effects of different pocket-estimation methods on the results of pocket-based approaches. We focused on the effect of three pocket-estimation methods on a pocket-ligand (PL) classification. This pocket-based approach is useful for understanding the correspondence between the pocket and ligand spaces and to develop pharmacological profiling models. We found pocket-estimation methods yield different binding pockets in terms of boundaries and properties. These differences are responsible for the variation in the PL classification results that can have an impact on the detected correspondence between pocket and ligand profiles. Thus, we highlighted the importance of the pocket-estimation method choice in pocket-based approaches. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. IAEA Classification of Uranium Deposits

    International Nuclear Information System (INIS)

    Bruneton, Patrice

    2014-01-01

    Classifications of uranium deposits follow two general approaches, focusing either on descriptive features such as the geotectonic position, the host rock type, the orebody morphology, … (« geologic classification »), or on genetic aspects (« genetic classification »).

  20. INCODE-DK 2014. Classification of cause of intrauterine fetal death – a new approach to perinatal audit

    DEFF Research Database (Denmark)

    Maroun, Lisa Leth; Ramsing, Mette; Olsen, Tina Elisabeth

    on a national level as described in the national guideline for IUFD. Multidisciplinary perinatal audit is an important tool in the evaluation of stillbirth; however, the establishment of the C-IUFD has until now been hampered by the lack of a recommended classification system. Material and methods… on the perinatal audit system in use as introduced by K. Vitting Andersen. The scheme is adapted to INCODE in main categories and allows grading and coding of C-IUFD. INCODE-DK and the INCODE perinatal audit table are available in an updated version of the IUFD guideline 2014, as well as in a separate Excel file… of the working group that the new audit scheme in combination with the new national classification system will improve the uniformity and quality of perinatal audits on a national level.

  1. Causes of death and associated conditions (Codac) – a utilitarian approach to the classification of perinatal deaths

    Directory of Open Access Journals (Sweden)

    Harrison Catherine

    2009-06-01

    Full Text Available A carefully classified dataset of perinatal mortality will retain the most significant information on the causes of death. Such information is needed for health care policy development, surveillance and international comparisons, clinical services and research. For comparability purposes, we propose a classification system that could serve all these needs, and be applicable in both developing and developed countries. It is developed to adhere to basic concepts of underlying cause in the International Classification of Diseases (ICD), although gaps in ICD prevent classification of perinatal deaths solely on existing ICD codes. We tested the Causes of Death and Associated Conditions (Codac) classification for perinatal deaths in seven populations, including two developing country settings. We identified areas of potential improvements in the ability to retain existing information, ease of use and inter-rater agreement. After revisions to address these issues we propose Version II of Codac with detailed coding instructions. The ten main categories of Codac consist of three key contributors to global perinatal mortality (intrapartum events, infections and congenital anomalies), two crucial aspects of perinatal mortality (unknown causes of death and termination of pregnancy), a clear distinction of conditions relevant only to the neonatal period and the remaining conditions are arranged in the four anatomical compartments (fetal, cord, placental and maternal). For more detail there are 94 subcategories, further specified in 577 categories in the full version. Codac is designed to accommodate both the main cause of death as well as two associated conditions. We suggest reporting not only the main cause of death, but also the associated relevant conditions so that scenarios of combined conditions and events are captured. The appropriately applied Codac system promises to better manage information on causes of perinatal deaths, the conditions

  2. An Approach to Structure Determination and Estimation of Hierarchical Archimedean Copulas and its Application to Bayesian Classification

    Czech Academy of Sciences Publication Activity Database

    Górecki, J.; Hofert, M.; Holeňa, Martin

    2016-01-01

    Roč. 46, č. 1 (2016), s. 21-59 ISSN 0925-9902 R&D Projects: GA ČR GA13-17187S Grant - others:Slezská univerzita v Opavě(CZ) SGS/21/2014 Institutional support: RVO:67985807 Keywords : Copula * Hierarchical archimedean copula * Copula estimation * Structure determination * Kendall’s tau * Bayesian classification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.294, year: 2016

  3. Causes of death and associated conditions (Codac) – a utilitarian approach to the classification of perinatal deaths

    Science.gov (United States)

    Frøen, J Frederik; Pinar, Halit; Flenady, Vicki; Bahrin, Safiah; Charles, Adrian; Chauke, Lawrence; Day, Katie; Duke, Charles W; Facchinetti, Fabio; Fretts, Ruth C; Gardener, Glenn; Gilshenan, Kristen; Gordijn, Sanne J; Gordon, Adrienne; Guyon, Grace; Harrison, Catherine; Koshy, Rachel; Pattinson, Robert C; Petersson, Karin; Russell, Laurie; Saastad, Eli; Smith, Gordon CS; Torabi, Rozbeh

    2009-01-01

    A carefully classified dataset of perinatal mortality will retain the most significant information on the causes of death. Such information is needed for health care policy development, surveillance and international comparisons, clinical services and research. For comparability purposes, we propose a classification system that could serve all these needs, and be applicable in both developing and developed countries. It is developed to adhere to basic concepts of underlying cause in the International Classification of Diseases (ICD), although gaps in ICD prevent classification of perinatal deaths solely on existing ICD codes. We tested the Causes of Death and Associated Conditions (Codac) classification for perinatal deaths in seven populations, including two developing country settings. We identified areas of potential improvements in the ability to retain existing information, ease of use and inter-rater agreement. After revisions to address these issues we propose Version II of Codac with detailed coding instructions. The ten main categories of Codac consist of three key contributors to global perinatal mortality (intrapartum events, infections and congenital anomalies), two crucial aspects of perinatal mortality (unknown causes of death and termination of pregnancy), a clear distinction of conditions relevant only to the neonatal period and the remaining conditions are arranged in the four anatomical compartments (fetal, cord, placental and maternal). For more detail there are 94 subcategories, further specified in 577 categories in the full version. Codac is designed to accommodate both the main cause of death as well as two associated conditions. We suggest reporting not only the main cause of death, but also the associated relevant conditions so that scenarios of combined conditions and events are captured. The appropriately applied Codac system promises to better manage information on causes of perinatal deaths, the conditions associated with them, and the

  4. Bagging Approach for Increasing Classification Accuracy of CART on Family Participation Prediction in Implementation of Elderly Family Development Program

    Directory of Open Access Journals (Sweden)

    Wisoedhanie Widi Anugrahanti

    2017-06-01

    Full Text Available Classification and Regression Tree (CART) is a machine-learning method in which data exploration is performed with a decision-tree technique. CART is a classification technique using a binary recursive partitioning algorithm, in which a group of data collected in a space called a node is split into two child nodes (Lewis, 2000). The aim of this study was to predict family participation in the Elderly Family Development program based on family behavior in providing physical, mental and social care for the elderly. The accuracy of the family-participation prediction using the Bagging CART method was calculated from the 1-APER value, sensitivity, specificity, and G-means. With the CART method, a classification accuracy of 97.41% was obtained, with an Apparent Error Rate (APER) of 2.59%. The most important splitting variables describing family behavior were society participation (100.00000), medical examination (98.95988), providing nutritious food (68.60476), establishing communication (67.19877) and worship (57.36587). To improve the stability and accuracy of the CART prediction, CART Bootstrap Aggregating (Bagging) was used, with a 100% accuracy result. Bagging CART classified a total of 590 families (84.77%) as appropriately assigned to the class of families implementing the Elderly Family Development program.
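
    As a rough illustration of the bagging-of-CART idea summarised above (not the authors' exact pipeline), the sketch below wraps a decision tree in a bootstrap-aggregating ensemble with scikit-learn; the feature matrix and labels are synthetic placeholders for the family-behaviour variables.

    ```python
    # Hedged sketch: Bagging of CART trees with scikit-learn (synthetic data).
    import numpy as np
    from sklearn.ensemble import BaggingClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X = rng.random((696, 5))                      # e.g. society participation, medical examination, ...
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)     # 1 = family participates (synthetic rule)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    # A single CART tree (binary recursive partitioning).
    cart = DecisionTreeClassifier(criterion="gini", random_state=0)

    # Bootstrap aggregating: train many CARTs on resampled data and vote.
    # (Use base_estimator= instead of estimator= on scikit-learn < 1.2.)
    bagged = BaggingClassifier(estimator=cart, n_estimators=100, bootstrap=True, random_state=0)
    bagged.fit(X_tr, y_tr)
    print("bagged CART accuracy:", accuracy_score(y_te, bagged.predict(X_te)))
    ```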

  5. Segmentation Based Classification of 3D Urban Point Clouds: A Super-Voxel Based Approach with Evaluation

    Directory of Open Access Journals (Sweden)

    Laurent Trassoudaine

    2013-03-01

    Full Text Available Segmentation and classification of urban range data into different object classes have several challenges due to certain properties of the data, such as density variation, inconsistencies due to missing data and the large data size that require heavy computation and large memory. A method to classify urban scenes based on a super-voxel segmentation of sparse 3D data obtained from LiDAR sensors is presented. The 3D point cloud is first segmented into voxels, which are then characterized by several attributes transforming them into super-voxels. These are joined together by using a link-chain method rather than the usual region growing algorithm to create objects. These objects are then classified using geometrical models and local descriptors. In order to evaluate the results, a new metric that combines both segmentation and classification results simultaneously is presented. The effects of voxel size and incorporation of RGB color and laser reflectance intensity on the classification results are also discussed. The method is evaluated on standard data sets using different metrics to demonstrate its efficacy.
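
    The first step of such a pipeline, binning points into a regular voxel grid before attributes are attached, can be sketched generically as below; the 1 m cell size and the synthetic point cloud are assumptions for illustration, and the link-chain grouping itself is not reproduced.

    ```python
    # Hedged sketch: bin a LiDAR point cloud into voxels and average per-voxel attributes.
    import numpy as np

    def voxelize(points, intensity, voxel_size=1.0):
        """points: (N, 3) xyz array; intensity: (N,) laser return intensity."""
        keys = np.floor(points / voxel_size).astype(np.int64)           # integer voxel indices
        uniq, inverse = np.unique(keys, axis=0, return_inverse=True)    # one row per occupied voxel
        counts = np.bincount(inverse)
        centroids = np.stack(
            [np.bincount(inverse, weights=points[:, d]) / counts for d in range(3)], axis=1)
        mean_intensity = np.bincount(inverse, weights=intensity) / counts
        return uniq, centroids, mean_intensity, counts

    pts = np.random.rand(10000, 3) * 50.0       # synthetic 50 m x 50 m x 50 m scene
    inten = np.random.rand(10000)
    voxels, cent, vint, n = voxelize(pts, inten)
    print(len(voxels), "occupied voxels")
    ```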

  6. Classification of nasolabial folds in Asians and the corresponding surgical approaches: By Shanghai 9th People's Hospital.

    Science.gov (United States)

    Zhang, Lu; Tang, Meng-Yao; Jin, Rong; Zhang, Ying; Shi, Yao-Ming; Sun, Bao-Shan; Zhang, Yu-Guang

    2015-07-01

    One of the earliest signs of aging appears in the nasolabial fold, which is a special anatomical region that requires many factors for comprehensive assessment. Hence, it is inadequate to rely on a single index to facilitate the classification of nasolabial folds. Through clinical observation, we have found that traditional filling treatments provide little improvement for some patients, which prompted us to seek a more specific and scientific classification standard and assessment system. A total of 900 patients who sought facial rejuvenation treatment at Shanghai 9th People's Hospital were enrolled in this study. We observed the different nasolabial fold traits for different age groups and in different states, and the results were compared with the Wrinkle Severity Rating Scale (WSRS). We summarized the data, presented a classification scheme, and proposed a selection of treatment options. Consideration of the anatomical and histological features of nasolabial folds allowed us to divide them into five types, namely the skin type, fat pad type, muscular type, bone retrusion type, and hybrid type. Because different types of nasolabial folds require different treatments, it is crucial to accurately assess and correctly classify the conditions. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  7. Classification and source determination of medium petroleum distillates by chemometric and artificial neural networks: a self organizing feature approach.

    Science.gov (United States)

    Mat-Desa, Wan N S; Ismail, Dzulkiflee; NicDaeid, Niamh

    2011-10-15

    Three different medium petroleum distillate (MPD) products (white spirit, paint brush cleaner, and lamp oil) were purchased from commercial stores in Glasgow, Scotland. Samples of 10, 25, 50, 75, 90, and 95% evaporated product were prepared, resulting in 56 samples in total, which were analyzed using gas chromatography-mass spectrometry. Data sets from the chromatographic patterns were examined and preprocessed for unsupervised multivariate analyses using principal component analysis (PCA), hierarchical cluster analysis (HCA), and a self organizing feature map (SOFM) artificial neural network. It was revealed that data sets comprised of higher boiling point hydrocarbon compounds provided a good means for the classification of the samples and successfully linked highly weathered samples back to their unevaporated counterpart in every case. The classification abilities of SOFM were further tested and validated for their predictive abilities, where one set of weathered data in each case was withdrawn from the sample set and used as a test set for the retrained network. This revealed SOFM to be an outstanding mechanism for sample discrimination and linkage over the more conventional PCA and HCA methods often suggested for such data analysis. SOFM also has the advantage of providing additional information through the evaluation of component planes, facilitating the investigation of the underlying variables that account for the classification. © 2011 American Chemical Society
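
    A self-organising feature map of the kind described can be sketched with the MiniSom library; the grid size, autoscaling and the synthetic peak-area matrix below are assumptions for illustration rather than the authors' settings.

    ```python
    # Hedged sketch: train a small SOFM on chromatographic peak-area vectors.
    import numpy as np
    from minisom import MiniSom   # pip install minisom

    rng = np.random.default_rng(1)
    X = rng.random((56, 40))                          # 56 samples x 40 hypothetical peak areas
    X = (X - X.mean(axis=0)) / X.std(axis=0)          # simple autoscaling

    som = MiniSom(6, 6, input_len=X.shape[1], sigma=1.0, learning_rate=0.5, random_seed=1)
    som.random_weights_init(X)
    som.train_random(X, num_iteration=5000)

    # Map each sample to its best-matching unit; weathered samples that share a node
    # with their unevaporated counterpart are "linked" to the same product.
    bmus = [som.winner(x) for x in X]
    print(bmus[:5])
    ```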

  8. In search of a consumer-focused food classification system. An experimental heuristic approach to differentiate degrees of quality.

    Science.gov (United States)

    Torres-Ruiz, Francisco J; Marano-Marcolini, Carla; Lopez-Zafra, Esther

    2018-06-01

    The present paper focuses on the problems that arise in food classification systems (FCSs), especially when the food product type has different levels or grades of quality. Despite the principal function of these systems being to assist the consumer (to inform, clarify and facilitate choice and purchase), they frequently have the opposite effect. Thus, the main aim of the present research involves providing orientations for the design of effective food classification systems. To address this objective, considering the context of food product consumption (related to heuristic processing), we conducted an experimental study with 720 participants. We analysed the usefulness of heuristic elements by a factorial 2 (category length: short and long) × 3 (visual signs: colours, numbers and images) design in relation to recall and recognition activities. The results showed that the elements used to make the classification more effective for consumers vary depending on whether the user seeks to prioritize the recall or the recognition of product categories. Thus, long categories with images significantly improve recognition, and short categories with colours improve recall. A series of recommendations are provided that can help to enhance FCSs and to make them more intuitive and easier to understand for consumers. Implications with regard to theory and practice are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. A Novel Extreme Learning Machine Classification Model for e-Nose Application Based on the Multiple Kernel Approach.

    Science.gov (United States)

    Jian, Yulin; Huang, Daoyu; Yan, Jia; Lu, Kun; Huang, Ying; Wen, Tailai; Zeng, Tanyue; Zhong, Shijie; Xie, Qilong

    2017-06-19

    A novel classification model, named the quantum-behaved particle swarm optimization (QPSO)-based weighted multiple kernel extreme learning machine (QWMK-ELM), is proposed in this paper. Experimental validation is carried out with two different electronic nose (e-nose) datasets. Being different from the existing multiple kernel extreme learning machine (MK-ELM) algorithms, the combination coefficients of base kernels are regarded as external parameters of single-hidden layer feedforward neural networks (SLFNs). The combination coefficients of base kernels, the model parameters of each base kernel, and the regularization parameter are optimized by QPSO simultaneously before implementing the kernel extreme learning machine (KELM) with the composite kernel function. Four types of common single kernel functions (Gaussian kernel, polynomial kernel, sigmoid kernel, and wavelet kernel) are utilized to constitute different composite kernel functions. Moreover, the method is also compared with other existing classification methods: extreme learning machine (ELM), kernel extreme learning machine (KELM), k-nearest neighbors (KNN), support vector machine (SVM), multi-layer perceptron (MLP), radial basis function neural network (RBFNN), and probabilistic neural network (PNN). The results have demonstrated that the proposed QWMK-ELM outperforms the aforementioned methods, not only in precision, but also in efficiency for gas classification.
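
    The kernel-ELM stage that the optimised combination coefficients feed into can be sketched as below; the two base kernels, their weights and the regularisation value are placeholders, and the QPSO search that would tune them is omitted.

    ```python
    # Hedged sketch: kernel ELM with a weighted combination of two base kernels.
    import numpy as np

    def rbf(A, B, gamma=0.5):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def poly(A, B, degree=2, c=1.0):
        return (A @ B.T + c) ** degree

    def kelm_fit(X, T, weights=(0.7, 0.3), C=10.0):
        """T: one-hot targets. Returns the dual solution used for prediction."""
        K = weights[0] * rbf(X, X) + weights[1] * poly(X, X)    # composite kernel
        alpha = np.linalg.solve(K + np.eye(len(X)) / C, T)      # (K + I/C)^-1 T
        return alpha

    def kelm_predict(Xtest, Xtrain, alpha, weights=(0.7, 0.3)):
        Kt = weights[0] * rbf(Xtest, Xtrain) + weights[1] * poly(Xtest, Xtrain)
        return Kt @ alpha                                       # class = argmax over columns

    rng = np.random.default_rng(0)
    X = rng.random((120, 8))                  # hypothetical e-nose feature vectors
    y = (X[:, 0] > 0.5).astype(int)
    T = np.eye(2)[y]
    alpha = kelm_fit(X, T)
    print("training accuracy:", (kelm_predict(X, X, alpha).argmax(axis=1) == y).mean())
    ```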

  10. DEFINING RELATIONAL PATHOLOGY IN EARLY CHILDHOOD: THE DIAGNOSTIC CLASSIFICATION OF MENTAL HEALTH AND DEVELOPMENTAL DISORDERS OF INFANCY AND EARLY CHILDHOOD DC:0-5 APPROACH.

    Science.gov (United States)

    Zeanah, Charles H; Lieberman, Alicia

    2016-09-01

    Infant mental health is explicitly relational in its focus, and therefore a diagnostic classification system for early childhood disorders should include attention not only to within-the-child psychopathology but also to psychopathology between child and caregiver. In this article, we begin by providing a review of previous efforts to introduce this approach that date back more than 30 years. Next, we introduce changes proposed in the Diagnostic Classification of Mental Health and Developmental Disorders of Infancy and Early Childhood DC:0-5 (ZERO TO THREE, in press). In a major change from previous attempts, the DC:0-5 includes an Axis I "Relationship Specific Disorder of Early Childhood." This disorder is intended to capture disordered behavior that is limited to one caregiver relationship rather than occurring across contexts. An axial characterization is continued from the Diagnostic Classification of Mental Health and Developmental Disorders of Infancy and Early Childhood DC:0-3R (ZERO TO THREE, 2005), but two major changes are introduced. First, the DC:0-5 proposes to simplify ratings of relationship adaptation/maladaptation, and to expand what is rated so that in addition to characterizing the child's relationship with his or her primary caregiver, there is also a characterization of the network of family relationships in which the child develops. This includes coparenting relationships and the entire network of close relationships that impinge on the young child's development and adaptation. © 2016 Michigan Association for Infant Mental Health.

  11. Defining active sacroiliitis on magnetic resonance imaging (MRI) for classification of axial spondyloarthritis: a consensual approach by the ASAS/OMERACT MRI group

    DEFF Research Database (Denmark)

    Rudwaleit, M; Jurik, A G; Hermann, K-G A

    2009-01-01

    BACKGROUND: Magnetic resonance imaging (MRI) of sacroiliac joints has evolved as the most relevant imaging modality for diagnosis and classification of early axial spondyloarthritis (SpA) including early ankylosing spondylitis. OBJECTIVES: To identify and describe MRI findings in sacroiliitis and...... conditions which may mimic SpA. Descriptions of the pathological findings and technical requirements for the appropriate acquisition were formulated. In a consensual approach MRI findings considered to be essential for sacroiliitis were defined. RESULTS: Active inflammatory lesions such as bone marrow oedema...... relevant for sacroiliitis have been defined by consensus by a group of rheumatologists and radiologists. These definitions should help in applying correctly the imaging feature "active sacroiliitis by MRI" in the new ASAS classification criteria for axial SpA.

  12. 'CANDLE' burnup regime after LWR regime

    International Nuclear Information System (INIS)

    Sekimoto, Hiroshi; Nagata, Akito

    2008-01-01

    CANDLE (Constant Axial shape of Neutron flux, nuclide densities and power shape During Life of Energy producing reactor) burnup strategy offers many merits. From the safety point of view, the change of excess reactivity along burnup is theoretically zero, and the core characteristics, such as power feedback coefficients and power peaking factor, do not change along burnup. Application of this burnup strategy to neutron-rich fast reactors gives excellent performance. Only natural or depleted uranium is required for the replacement fuels. About 40% of the natural or depleted uranium undergoes fission without conventional reprocessing and enrichment. If an LWR produced energy of X joules, the CANDLE reactor can produce about 50X joules from the depleted uranium left at the enrichment facility for the LWR fuel. If LWRs have produced energy sufficient for a full 20 years, we can produce the energy for 1000 years by using CANDLE reactors with depleted uranium. We need not mine any uranium ore, and do not need a reprocessing facility. The burnup of the spent fuel becomes 10 times higher; therefore, the spent fuel amount per unit of produced energy is reduced to one-tenth. The details of the scenario of the CANDLE burnup regime after the LWR regime will be presented at the symposium. (author)

  13. Detecting spatial regimes in ecosystems

    Science.gov (United States)

    Sundstrom, Shana M.; Eason, Tarsha; Nelson, R. John; Angeler, David G.; Barichievy, Chris; Garmestani, Ahjond S.; Graham, Nicholas A.J.; Granholm, Dean; Gunderson, Lance; Knutson, Melinda; Nash, Kirsty L.; Spanbauer, Trisha; Stow, Craig A.; Allen, Craig R.

    2017-01-01

    Research on early warning indicators has generally focused on assessing temporal transitions with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory-based method, on both terrestrial and aquatic animal data (U.S. Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps and multivariate analyses such as nMDS and cluster analysis. We successfully detected spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.
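
    One common discrete approximation of Fisher information used in this literature (sketched here in a generic form that may differ in detail from the authors' implementation) bins "system states" within a moving window and sums squared changes in the square-root probabilities; sharp changes in the resulting index flag candidate regime boundaries.

    ```python
    # Hedged sketch: windowed discrete Fisher information along a spatial transect.
    # Uses the common 4 * sum((sqrt(p_i) - sqrt(p_{i+1}))^2) approximation; the state
    # definition (distance to the window mean) is a simplification for illustration.
    import numpy as np

    def fisher_information(window, n_bins=10):
        """window: (n_samples, n_variables) block of community data."""
        states = np.linalg.norm(window - window.mean(axis=0), axis=1)
        p, _ = np.histogram(states, bins=n_bins)
        p = p / p.sum()
        amp = np.sqrt(p)
        return 4.0 * np.sum(np.diff(amp) ** 2)

    rng = np.random.default_rng(0)
    transect = np.vstack([rng.normal(0, 1, (50, 6)),     # regime A
                          rng.normal(3, 1, (50, 6))])    # regime B
    fi = [fisher_information(transect[i:i + 20]) for i in range(0, 80, 5)]
    print(np.round(fi, 3))   # abrupt changes mark the candidate spatial regime shift
    ```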

  14. Arctic circulation regimes.

    Science.gov (United States)

    Proshutinsky, Andrey; Dukhovskoy, Dmitry; Timmermans, Mary-Louise; Krishfield, Richard; Bamber, Jonathan L

    2015-10-13

    Between 1948 and 1996, mean annual environmental parameters in the Arctic experienced a well-pronounced decadal variability with two basic circulation patterns: cyclonic and anticyclonic, alternating at 5 to 7 year intervals. During cyclonic regimes, low sea-level atmospheric pressure (SLP) dominated over the Arctic Ocean, driving sea ice and the upper ocean counterclockwise; the Arctic atmosphere was relatively warm and humid, and freshwater flux from the Arctic Ocean towards the subarctic seas was intensified. By contrast, during anticyclonic circulation regimes, high SLP dominated, driving sea ice and the upper ocean clockwise. Meanwhile, the atmosphere was cold and dry and the freshwater flux from the Arctic to the subarctic seas was reduced. Since 1997, however, the Arctic system has been under the influence of an anticyclonic circulation regime (17 years) with a set of environmental parameters that are atypical for this regime. We discuss a hypothesis explaining the causes and mechanisms regulating the intensity and duration of Arctic circulation regimes, and speculate how changes in freshwater fluxes from the Arctic Ocean and Greenland impact environmental conditions and interrupt their decadal variability. © 2015 The Authors.

  15. THE INFLUENCED FLOW REGIMES

    Directory of Open Access Journals (Sweden)

    Gavril PANDI

    2011-03-01

    Full Text Available The influenced flow regimes. The presence and activities of humanity influence the whole environmental system and, in this context, the rivers' water resources. In concordance with this, the natural runoff regime suffers bigger and deeper changes, the nature of these changes depending on the type and degree of water use. The multitude of uses causes different types of influence, with different quantitative aspects. At the same time, the influences have qualitative connotations too, regarding the modifications of the yearly water volume runoff. So the natural runoff regime is modified. After analyzing the distribution laws of the monthly runoff, four types of influenced runoff regimes have been differentiated. In the excess type, the influenced runoff is bigger than the natural one, continuously throughout the whole year. The deficient type is characterized by the inverse ratio, throughout the whole year. In the sinusoidal type, the influenced runoff is smaller than the natural one in the period when water is retained in the reservoirs, and in the depletion period the situation inverts. In the irregular type, the ratio between influenced and natural runoff changes randomly from month to month. The recognition of the influenced regime and the degree of influence are necessary in the evaluation and analysis of the usable hydrological river resources, in flood defence activities, in the complex scheme of the hydrographic basins, in environmental design and so on.

  16. The Ten-Group Robson Classification: A Single Centre Approach Identifying Strategies to Optimise Caesarean Section Rates

    Directory of Open Access Journals (Sweden)

    Keisuke Tanaka

    2017-01-01

    Full Text Available Caesarean section (CS) rates have been increasing worldwide and have caused concern. For meaningful comparisons to be made, the World Health Organization recommends the use of the Ten-Group Robson classification as the global standard for assessing CS rates. 2625 women who gave birth over a 12-month period were analysed using this classification. Women with previous CS (group 5) comprised 10.9% of the overall 23.5% CS rate. Women with one previous CS who did not attempt VBAC contributed 5.3% of the overall 23.5% CS rate. The second largest contributor was singleton nulliparous women with cephalic presentation at term (5.1% of the total 23.5%). Induction of labour was associated with a higher CS rate in groups 1 and 3 (24.5% versus 11.9% and 6.2% versus 2.6%, respectively). For postdates IOL we recommend a gatekeeper booking system to minimise these being performed before 41 weeks. We suggest setting up a dedicated VBAC clinic to support women with one previous CS. Furthermore, a review of the definition of failure to progress in labour may not only lower CS rates in groups 1 and 2a but also reduce the size of group 5 in the future.

  17. Sex reversal of brook trout (Salvelinus fontinalis) by 17α-methyltestosterone exposure: A serial experimental approach to determine optimal timing and delivery regimes.

    Science.gov (United States)

    Fatima, Shafaq; Adams, Mark; Wilkinson, Ryan

    2016-12-01

    Commercial culture of brook trout (Salvelinus fontinalis) in Tasmania was partly abandoned due to sexual maturation of male fish early in the estuarine rearing phase. Maturation adversely affects body mass, flesh quality and immunocompetency. Sex reversal techniques such as the in-feed addition of a synthetic androgen have proven difficult to adapt to brook trout. An appropriate timing, duration and delivery vehicle for administration of 17α-methyltestosterone (MT) to produce phenotypic males (neomales) from genotypically female brook trout required further investigation. In this study, groups of brook trout eggs (n=1000) maintained at 9.5±0.15-10±0.14°C were immersed in MT (400 μg L-1) for four hours on two alternate days (two immersions/group), staggered over a two-week period surrounding the hatch of embryos (control groups excluded). The groups were then split: half received MT-supplemented feed for 60 days and the other half a standard diet. Following an 11-month on-growing period, sex phenotypes were determined by gross and histological gonad morphology. The highest proportion of male phenotypes (75%) was found in fish immersed six and four days pre-hatch and subsequently fed a normal diet. Fish fed a MT-supplemented diet and immersed in MT showed significantly higher proportions of sterile fish. These data indicate that a pre-hatch immersion-only regime (4-6 days pre-hatch at 9.5°C) should be pursued as a target for optimization studies to further refine the effective concentration and duration of exposure to MT for the successful production of neo-male brook trout. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Comparing two metabolic profiling approaches (liquid chromatography and gas chromatography coupled to mass spectrometry) for extra-virgin olive oil phenolic compounds analysis: A botanical classification perspective.

    Science.gov (United States)

    Bajoub, Aadil; Pacchiarotta, Tiziana; Hurtado-Fernández, Elena; Olmo-García, Lucía; García-Villalba, Rocío; Fernández-Gutiérrez, Alberto; Mayboroda, Oleg A; Carrasco-Pancorbo, Alegría

    2016-01-08

    Over the last decades, the phenolic compounds from virgin olive oil (VOO) have become the subject of intensive research because of their biological activities and their influence on some of the most relevant attributes of this interesting matrix. Developing metabolic profiling approaches to determine them in monovarietal virgin olive oils could help to gain a deeper insight into olive oil phenolic compound composition as well as to promote their use for botanical origin tracing purposes. To this end, two approaches were comparatively investigated (LC-ESI-TOF MS and GC-APCI-TOF MS) to evaluate their capacity to properly classify 25 olive oil samples belonging to five different varieties (Arbequina, Cornicabra, Hojiblanca, Frantoio and Picual), using the entire chromatographic phenolic profiles combined with chemometrics (principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA)). The application of PCA to the LC-MS and GC-MS data showed the natural clustering of the samples, with two varieties (Arbequina and Frantoio) dominating the models and suppressing any possible discrimination among the other cultivars. Afterwards, PLS-DA was used to build four different efficient predictive models for varietal classification of the samples under study. The varietal markers pointed out by each platform were compared. In general, with the exception of one GC-MS model, all exhibited proper quality parameters. The models constructed using the LC-MS data demonstrated superior classification ability. Copyright © 2015 Elsevier B.V. All rights reserved.
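
    A PLS-DA varietal classification of the kind described can be sketched by regressing one-hot variety labels on the chromatographic profiles with scikit-learn's PLS implementation; the matrix sizes and the number of latent variables below are illustrative assumptions.

    ```python
    # Hedged sketch: PLS-DA = PLS regression onto one-hot variety labels (synthetic profiles).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.preprocessing import OneHotEncoder

    rng = np.random.default_rng(0)
    X = rng.random((25, 300))    # 25 oils x 300 hypothetical points of the phenolic profile
    varieties = np.repeat(["Arbequina", "Cornicabra", "Hojiblanca", "Frantoio", "Picual"], 5)

    # (Use sparse=False instead of sparse_output=False on scikit-learn < 1.2.)
    Y = OneHotEncoder(sparse_output=False).fit_transform(varieties.reshape(-1, 1))

    pls = PLSRegression(n_components=4, scale=True)
    pls.fit(X, Y)

    pred = pls.predict(X).argmax(axis=1)     # predicted class = largest dummy response
    print("training accuracy:", (pred == Y.argmax(axis=1)).mean())
    ```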

  19. A 3D convolutional neural network approach to land cover classification using LiDAR and multi-temporal Landsat imagery

    Science.gov (United States)

    Xu, Z.; Guan, K.; Peng, B.; Casler, N. P.; Wang, S. W.

    2017-12-01

    Landscape has complex three-dimensional features. These 3D features are difficult to extract using conventional methods. Small-footprint LiDAR provides an ideal way of capturing them. Existing approaches, however, have been relegated to raster or metric-based (two-dimensional) feature extraction from the upper or bottom layer, and thus are not suitable for resolving morphological and intensity features that could be important to fine-scale land cover mapping. Therefore, this research combines airborne LiDAR and multi-temporal Landsat imagery to classify land cover types of Williamson County, Illinois, which has diverse and mixed landscape features. Specifically, we applied a 3D convolutional neural network (CNN) method to extract features from LiDAR point clouds by (1) creating occupancy and intensity grids at 1-meter resolution, and then (2) normalizing and feeding the data into a 3D CNN feature extractor for many epochs of learning. The learned features (e.g., morphological features, intensity features, etc.) were combined with multi-temporal spectral data to enhance the performance of land cover classification based on a Support Vector Machine classifier. We used photo interpretation for training and testing data generation. The classification results show that our approach outperforms traditional methods using LiDAR-derived feature maps, and promises to serve as an effective methodology for creating high-quality land cover maps through fusion of complementary types of remote sensing data.
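
    A 3D CNN feature extractor of the general shape described (occupancy and intensity grids passed through stacked 3D convolutions) can be sketched in PyTorch as below; the channel counts, patch size and pooling scheme are assumptions for illustration.

    ```python
    # Hedged sketch: a small 3D CNN turning a voxelized LiDAR patch into a feature vector.
    import torch
    import torch.nn as nn

    class Lidar3DCNN(nn.Module):
        def __init__(self, in_channels=2, feat_dim=64):
            # in_channels=2: one occupancy grid plus one intensity grid per patch (assumed layout).
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv3d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
                nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
                nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool3d(1),
            )
            self.fc = nn.Linear(64, feat_dim)

        def forward(self, x):                  # x: (batch, 2, D, H, W)
            z = self.features(x).flatten(1)
            return self.fc(z)                  # learned features, later fused with Landsat bands

    patch = torch.randn(4, 2, 16, 16, 16)      # four synthetic 16x16x16 voxel patches
    print(Lidar3DCNN()(patch).shape)           # torch.Size([4, 64])
    ```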

  20. Supply regimes in fisheries

    DEFF Research Database (Denmark)

    Nielsen, Max

    2006-01-01

    Supply in fisheries is traditionally known for its backward bending nature, owing to externalities in production. Such a supply regime, however, exists only for pure open access fisheries. Since most fisheries worldwide are neither pure open access nor optimally managed, but rather lie between the extremes......, the traditional understanding of supply regimes in fisheries needs modification. This paper identifies, through a case study of the East Baltic cod fishery, supply regimes in fisheries, taking alternative fisheries management schemes and mesh size limitations into account. An age-structured Beverton-Holt based bio-economic supply model with mesh sizes is developed. It is found that in the presence of realistic management schemes, the supply curves are close to vertical in the relevant range. Also, the supply curve under open access with mesh size limitations is almost vertical in the relevant range, owing to constant...

  1. Emerging approach for analytical characterization and geographical classification of Moroccan and French honeys by means of a voltammetric electronic tongue.

    Science.gov (United States)

    El Alami El Hassani, Nadia; Tahri, Khalid; Llobet, Eduard; Bouchikhi, Benachir; Errachid, Abdelhamid; Zine, Nadia; El Bari, Nezha

    2018-03-15

    Moroccan and French honeys from different geographical areas were classified and characterized by applying a voltammetric electronic tongue (VE-tongue) coupled to analytical methods. The studied parameters include color intensity, free, lactonic and total acidity, proteins, phenols, hydroxymethylfurfural (HMF) content, sucrose, and reducing and total sugars. The geographical classification of the different honeys was developed through three pattern recognition techniques: principal component analysis (PCA), support vector machines (SVMs) and hierarchical cluster analysis (HCA). Honey characterization was achieved by partial least squares (PLS) modeling. All the PLS models developed were able to accurately estimate the correct values of the parameters analyzed using the voltammetric experimental data as input (i.e. r>0.9). This confirms the potential ability of the VE-tongue to perform a rapid characterization of honeys via PLS, in which an uncomplicated, cost-effective sample preparation process that does not require the use of additional chemicals is implemented. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Decision making in double-pedicled DIEP and SIEA abdominal free flap breast reconstructions: An algorithmic approach and comprehensive classification.

    Directory of Open Access Journals (Sweden)

    Charles M Malata

    2015-10-01

    Full Text Available Introduction: The deep inferior epigastric artery perforator (DIEP) free flap is the gold standard for autologous breast reconstruction. However, using a single vascular pedicle may not yield sufficient tissue in patients with midline scars or an insufficient lower abdominal pannus. Double-pedicled free flaps overcome this problem using different vascular arrangements to harvest the entire lower abdominal flap. The literature is, however, sparse regarding technique selection. We therefore reviewed our experience in order to formulate an algorithm and comprehensive classification for this purpose. Methods: All patients undergoing unilateral double-pedicled abdominal perforator free flap breast reconstruction (AFFBR) by a single surgeon (CMM) over 40 months were reviewed from a prospectively collected database. Results: Of the 112 consecutive breast free flaps performed, 25 (22%) utilised two vascular pedicles. The mean patient age was 45 years (range=27-54). All flaps but one (which used the thoracodorsal system) were anastomosed to the internal mammary vessels using the rib-preservation technique. The surgical duration was 656 minutes (range=468-690 mins). The median flap weight was 618 g (range=432-1275 g) and the mastectomy weight was 445 g (range=220-896 g). All flaps were successful and only three patients requested minor liposuction to reduce and reshape their reconstructed breasts. Conclusion: Bipedicled free abdominal perforator flaps, employed in a fifth of all our AFFBRs, are a reliable and safe option for unilateral breast reconstruction. They, however, necessitate clear indications to justify the additional technical complexity and surgical duration. Our algorithm and comprehensive classification facilitate technique selection for the anastomotic permutations and successful execution of these operations.

  3. Cargo liability regimes

    Science.gov (United States)

    2001-01-01

    There are at present at least three international regimes of maritime cargo liability in force in different countries of the world - the original Hague rules (1924), the updated version known as the Hague-Visby rules (1968, further amended 1979), and...

  4. Trust in regulatory regimes

    NARCIS (Netherlands)

    Six, Frédérique; Verhoest, Koen

    2017-01-01

    Within political and administrative sciences generally, trust as a concept is contested, especially in the field of regulatory governance. This groundbreaking book is the first to systematically explore the role and dynamics of trust within regulatory regimes. Conceptualizing, mapping and analyzing

  5. East Asian welfare regime

    DEFF Research Database (Denmark)

    Abrahamson, Peter

    2017-01-01

    The paper asks whether East Asian welfare regimes are still productivist and Confucian, and whether they have developed public care policies. The literature is split on the first question but (mostly) confirmative on the second. Care has, to a large but insufficient extent, been rolled out in the region...

  6. Rice-planted area extraction by time series analysis of ENVISAT ASAR WS data using a phenology-based classification approach: A case study for Red River Delta, Vietnam

    Science.gov (United States)

    Nguyen, D.; Wagner, W.; Naeimi, V.; Cao, S.

    2015-04-01

    Recent studies have shown the potential of Synthetic Aperture Radar (SAR) for mapping rice fields and some other vegetation types. For rice field classification, conventional classification techniques have mostly been used, including manual threshold-based and supervised classification approaches. The challenge of the threshold-based approach is to find acceptable thresholds to be used for each individual SAR scene. Furthermore, the influence of the local incidence angle on backscatter hinders using a single threshold for the entire scene. Similarly, the supervised classification approach requires different training samples for different output classes. In the case of rice crops, supervised classification using temporal data requires different training datasets to perform the classification procedure, which might lead to inconsistent mapping results. In this study we present an automatic method to identify rice crop areas by extracting phenological parameters after performing an empirical regression-based normalization of the backscatter to a reference incidence angle. The method is evaluated in the Red River Delta (RRD), Vietnam, using the time series of ENVISAT Advanced SAR (ASAR) Wide Swath (WS) mode data. The results of the rice mapping algorithm compared to the reference data indicate Completeness (User accuracy), Correctness (Producer accuracy) and Quality (Overall accuracy) of 88.8%, 92.5% and 83.9%, respectively. The total area of the classified rice fields corresponds to the total rice cultivation area given by the official statistics in Vietnam (R² = 0.96). The results indicate that applying a phenology-based classification approach using incidence-angle-normalized backscatter time series can achieve high classification accuracies. In addition, the method is not only useful for large-scale early mapping of rice fields in the Red River Delta using current and future C-band Sentinel-1A&B backscatter data but also might be applied for other rice
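
    The empirical regression-based incidence-angle normalization mentioned above can be sketched as follows: fit a linear slope of backscatter (in dB) against the local incidence angle and shift every observation to a reference angle. The linear model and the 30° reference angle are assumptions for illustration, not the exact values used in the study.

    ```python
    # Hedged sketch: regression-based normalization of SAR backscatter to a reference incidence angle.
    import numpy as np

    def normalize_backscatter(sigma0_db, inc_angle_deg, ref_angle_deg=30.0):
        """sigma0_db, inc_angle_deg: 1-D arrays of backscatter (dB) and local incidence angle."""
        slope, _ = np.polyfit(inc_angle_deg, sigma0_db, deg=1)   # empirical dB-per-degree slope
        return sigma0_db - slope * (inc_angle_deg - ref_angle_deg)

    rng = np.random.default_rng(0)
    theta = rng.uniform(20, 45, 500)                        # local incidence angles (deg)
    sigma = -8.0 - 0.12 * theta + rng.normal(0, 0.5, 500)   # synthetic angle-dependent backscatter
    sigma_norm = normalize_backscatter(sigma, theta)
    print(np.polyfit(theta, sigma_norm, 1)[0])              # residual slope close to 0 after normalization
    ```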

  7. How many taxa can be recognized within the complex Tillandsia capillaris (Bromeliaceae, Tillandsioideae? Analysis of the available classifications using a multivariate approach

    Directory of Open Access Journals (Sweden)

    Lucía Castello

    2013-05-01

    Full Text Available Tillandsia capillaris Ruiz & Pav., which belongs to the subgenus Diaphoranthema, is distributed in Ecuador, Peru, Bolivia, northern and central Argentina, and Chile, and includes forms that are difficult to circumscribe, thus considered to form a complex. The entities of this complex are predominantly small-sized epiphytes, adapted to xeric environments. The most widely used classification defines 5 forms for this complex based on few morphological reproductive traits: T. capillaris Ruiz & Pav. f. capillaris, T. capillaris f. incana (Mez) L.B. Sm., T. capillaris f. cordobensis (Hieron.) L.B. Sm., T. capillaris f. hieronymi (Mez) L.B. Sm. and T. capillaris f. virescens (Ruiz & Pav.) L.B. Sm. In this study, 35 floral and vegetative characters were analyzed with a multivariate approach in order to assess and discuss different proposals for classification of the T. capillaris complex, which presents morphotypes that co-occur in central and northern Argentina. To accomplish this, data of quantitative and categorical morphological characters of flowers and leaves were collected from herbarium specimens and field collections and were analyzed with statistical multivariate techniques. The results suggest that the last classification for the complex seems more comprehensive and three taxa were delimited: T. capillaris (=T. capillaris f. incana-hieronymi), T. virescens s. str. (=T. capillaris f. cordobensis) and T. virescens s. l. (=T. capillaris f. virescens). While T. capillaris and T. virescens s. str. co-occur, T. virescens s. l. is restricted to altitudes above 2000 m in Argentina. Characters previously used for taxa delimitation showed continuous variation and therefore were not useful. New diagnostic characters are proposed and a key is provided for delimiting these three taxa within the complex.

  8. How many taxa can be recognized within the complex Tillandsia capillaris (Bromeliaceae, Tillandsioideae)? Analysis of the available classifications using a multivariate approach.

    Science.gov (United States)

    Castello, Lucía V; Galetto, Leonardo

    2013-01-01

    Tillandsia capillaris Ruiz & Pav., which belongs to the subgenus Diaphoranthema is distributed in Ecuador, Peru, Bolivia, northern and central Argentina, and Chile, and includes forms that are difficult to circumscribe, thus considered to form a complex. The entities of this complex are predominantly small-sized epiphytes, adapted to xeric environments. The most widely used classification defines 5 forms for this complex based on few morphological reproductive traits: Tillandsia capillaris Ruiz & Pav. f. capillaris, Tillandsia capillaris f. incana (Mez) L.B. Sm., Tillandsia capillaris f. cordobensis (Hieron.) L.B. Sm., Tillandsia capillaris f. hieronymi (Mez) L.B. Sm. and Tillandsia capillaris f. virescens (Ruiz & Pav.) L.B. Sm. In this study, 35 floral and vegetative characters were analyzed with a multivariate approach in order to assess and discuss different proposals for classification of the Tillandsia capillaris complex, which presents morphotypes that co-occur in central and northern Argentina. To accomplish this, data of quantitative and categorical morphological characters of flowers and leaves were collected from herbarium specimens and field collections and were analyzed with statistical multivariate techniques. The results suggest that the last classification for the complex seems more comprehensive and three taxa were delimited: Tillandsia capillaris (=Tillandsia capillaris f. incana-hieronymi), Tillandsia virescens s. str. (=Tillandsia capillaris f. cordobensis) and Tillandsia virescens s. l. (=Tillandsia capillaris f. virescens). While Tillandsia capillaris and Tillandsia virescens s. str. co-occur, Tillandsia virescens s. l. is restricted to altitudes above 2000 m in Argentina. Characters previously used for taxa delimitation showed continuous variation and therefore were not useful. New diagnostic characters are proposed and a key is provided for delimiting these three taxa within the complex.

  9. A re-assessment of gene-tag classification approaches for describing var gene expression patterns during human Plasmodium falciparum malaria parasite infections.

    Science.gov (United States)

    Githinji, George; Bull, Peter C

    2017-01-01

    PfEMP1 are variant parasite antigens that are inserted on the surface of Plasmodium falciparum infected erythrocytes (IE). Through interactions with various host molecules, PfEMP1 mediate IE sequestration in tissues and play a key role in the pathology of severe malaria. PfEMP1 is encoded by a diverse multi-gene family called var. Previous studies have shown that expression of specific subsets of var genes is associated with low levels of host immunity and severe malaria. However, in most clinical studies to date, full-length var gene sequences were unavailable and various approaches have been used to make comparisons between var gene expression profiles in different parasite isolates using limited information. Several studies have relied on the classification of a 300-500 base-pair "DBLα tag" region in the DBLα domain located at the 5' end of most var genes. We assessed the relationship between various DBLα tag classification methods and sequence features that are only fully assessable through full-length var gene sequences. We compared these different sequence features in full-length var genes from six fully sequenced laboratory isolates. These comparisons show that, despite a long history of recombination, DBLα sequence tag classification can provide functional information on important features of full-length var genes. Notably, a specific subset of DBLα tags previously defined as "group A-like" is associated with CIDRα1 domains proposed to bind to endothelial protein C receptor. This analysis helps to bring together different sources of data that have been used to assess var gene expression in clinical parasite isolates.

  10. Using resistance and resilience concepts to reduce impacts of annual grasses and altered fire regimes on the sagebrush ecosystem and sage-grouse- A strategic multi-scale approach

    Science.gov (United States)

    Chambers, Jeanne C.; Pyke, David A.; Maestas, Jeremy D.; Boyd, Chad S.; Campbell, Steve; Espinosa, Shawn; Havlina, Doug; Mayer, Kenneth F.; Wuenschel, Amarina

    2014-01-01

    This Report provides a strategic approach for conservation of sagebrush ecosystems and Greater Sage-Grouse (sage-grouse) that focuses specifically on habitat threats caused by invasive annual grasses and altered fire regimes. It uses information on factors that influence (1) sagebrush ecosystem resilience to disturbance and resistance to invasive annual grasses and (2) distribution, relative abundance, and persistence of sage-grouse populations to develop management strategies at both landscape and site scales. A sage-grouse habitat matrix links relative resilience and resistance of sagebrush ecosystems with sage-grouse habitat requirements for landscape cover of sagebrush to help decision makers assess risks and determine appropriate management strategies at landscape scales. Focal areas for management are assessed by overlaying matrix components with sage-grouse Priority Areas for Conservation (PACs), breeding bird densities, and specific habitat threats. Decision tools are discussed for determining the suitability of focal areas for treatment and the most appropriate management treatments.

  11. A reliable Raman-spectroscopy-based approach for diagnosis, classification and follow-up of B-cell acute lymphoblastic leukemia

    Science.gov (United States)

    Managò, Stefano; Valente, Carmen; Mirabelli, Peppino; Circolo, Diego; Basile, Filomena; Corda, Daniela; de Luca, Anna Chiara

    2016-04-01

    Acute lymphoblastic leukemia type B (B-ALL) is a neoplastic disorder that shows high mortality rates due to immature lymphocyte B-cell proliferation. B-ALL diagnosis requires identification and classification of the leukemia cells. Here, we demonstrate the use of Raman spectroscopy to discriminate normal lymphocytic B-cells from three different B-leukemia transformed cell lines (i.e., RS4;11, REH, MN60 cells) based on their biochemical features. In combination with immunofluorescence and Western blotting, we show that these Raman markers reflect the relative changes in the potential biological markers from cell surface antigens, cytoplasmic proteins, and DNA content and correlate with the lymphoblastic B-cell maturation/differentiation stages. Our study demonstrates the potential of this technique for classification of B-leukemia cells into the different differentiation/maturation stages, as well as for the identification of key biochemical changes under chemotherapeutic treatments. Finally, preliminary results from clinical samples indicate high consistency of, and potential applications for, this Raman spectroscopy approach.

  12. A novel approach for fault detection and classification of the thermocouple sensor in Nuclear Power Plant using Singular Value Decomposition and Symbolic Dynamic Filter

    International Nuclear Information System (INIS)

    Mandal, Shyamapada; Santhi, B.; Sridhar, S.; Vinolia, K.; Swaminathan, P.

    2017-01-01

    Highlights: • A novel approach to classify fault patterns using data-driven methods. • Application of a robust reconstruction method (SVD) to identify the faulty sensor. • Analysis of fault patterns for many sensors using SDF with low time complexity. • An efficient data-driven model designed to reduce false and missed alarms. - Abstract: A mathematical model with two layers is developed using data-driven methods for thermocouple sensor fault detection and classification in Nuclear Power Plants (NPP). At the first layer, the Singular Value Decomposition (SVD) based method is applied to detect the faulty sensor from a data set of all sensors. In the second layer, the Symbolic Dynamic Filter (SDF) is employed to classify the fault pattern. If SVD detects any false fault, it is also re-evaluated by the SDF, i.e., the model has two layers of checking to balance the false alarms. The proposed fault detection and classification method is compared with Principal Component Analysis. Two case studies are taken from the Fast Breeder Test Reactor (FBTR) to prove the efficiency of the proposed method.
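
    The first (SVD) layer can be sketched generically: project the multi-sensor record onto its leading singular vectors, reconstruct each channel from that low-rank model, and flag the channel with an abnormally large residual. The rank, the synthetic drift fault and the data below are assumptions, not the plant data or thresholds of the study.

    ```python
    # Hedged sketch: SVD-based detection of a faulty thermocouple channel.
    import numpy as np

    def faulty_sensor_scores(X, rank=1):
        """X: (n_samples, n_sensors). Rank should stay below the number of healthy common modes."""
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        X_lowrank = (U[:, :rank] * s[:rank]) @ Vt[:rank]     # reconstruction from leading modes
        return ((Xc - X_lowrank) ** 2).mean(axis=0)          # per-sensor residual energy

    rng = np.random.default_rng(0)
    common = rng.normal(size=(1000, 1))
    X = common + 0.05 * rng.normal(size=(1000, 8))           # 8 healthy, correlated thermocouples
    X[:, 3] += np.linspace(0.0, 2.0, 1000)                   # inject a slow drift fault in sensor 3
    print("suspected faulty sensor:", int(np.argmax(faulty_sensor_scores(X))))
    ```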

  13. Floating Exchange Rate Regime

    OpenAIRE

    Quader, Syed Manzur

    2004-01-01

    In recent years, many developing countries with a history of high inflation, an unfavorable balance of payments situation and a high level of foreign-currency-denominated debt have switched, or are in the process of switching, to a more flexible exchange rate regime. Therefore, the stability of the exchange rate and the dynamics of its volatility are more crucial than before to prevent financial crises and macroeconomic disturbances. This paper is designed to find out the reasons behind Bangla...

  14. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-09-07

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.

  15. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yi; Zhao, Shiguang; Gao, Xin

    2014-01-01

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
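
    The mutual-information term in the regularised objective can be estimated, as a rough plug-in illustration, from the joint counts of classification responses and true labels; the numpy sketch below shows only that entropy-based estimate, not the gradient-based training loop described in the abstract.

    ```python
    # Hedged sketch: plug-in estimate of I(response; label) from discrete counts.
    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def mutual_information(responses, labels):
        """responses, labels: integer-coded arrays of equal length."""
        joint = np.zeros((responses.max() + 1, labels.max() + 1))
        for r, y in zip(responses, labels):
            joint[r, y] += 1
        joint /= joint.sum()
        return entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint.ravel())

    y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])
    y_resp = np.array([0, 0, 1, 1, 0, 0, 1, 0])   # hypothetical classifier responses
    print(mutual_information(y_resp, y_true))      # larger values = responses more informative about labels
    ```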

  16. A least square support vector machine-based approach for contingency classification and ranking in a large power system

    Directory of Open Access Journals (Sweden)

    Bhanu Pratap Soni

    2016-12-01

    Full Text Available This paper proposes an effective supervised learning approach for static security assessment of a large power system. The supervised learning approach employs a least square support vector machine (LS-SVM) to rank the contingencies and predict the system severity level. The severity of a contingency is measured by two scalar performance indices (PIs): the line MVA performance index (PIMVA) and the voltage-reactive power performance index (PIVQ). The method works in two steps: in Step I, both standard indices (PIMVA and PIVQ) are estimated under different operating scenarios, and in Step II, contingency ranking is carried out based on the values of the PIs. The effectiveness of the proposed methodology is demonstrated on the IEEE 39-bus (New England) system. The approach can be a beneficial tool for less time-consuming and accurate security assessment and contingency analysis at the energy management center.
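
    A least-squares SVM of the kind used for the severity indices can be trained in closed form by solving a single linear system in the dual variables; the RBF kernel, the γ value and the synthetic operating-scenario data below are illustrative assumptions.

    ```python
    # Hedged sketch: closed-form LS-SVM regression for predicting a severity index.
    import numpy as np

    def rbf_kernel(A, B, sigma=1.0):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
        n = len(X)
        K = rbf_kernel(X, X, sigma)
        # LS-SVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[0], sol[1:]                     # bias b, dual weights alpha

    def lssvm_predict(Xtest, Xtrain, b, alpha, sigma=1.0):
        return rbf_kernel(Xtest, Xtrain, sigma) @ alpha + b

    rng = np.random.default_rng(0)
    X = rng.random((200, 10))                                # hypothetical operating-scenario features
    y = X @ rng.random(10) + 0.05 * rng.normal(size=200)     # synthetic performance index (e.g. PIMVA)
    b, alpha = lssvm_fit(X, y)
    print(np.corrcoef(lssvm_predict(X, X, b, alpha), y)[0, 1])
    ```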

  17. A new approach to develop computer-aided diagnosis scheme of breast mass classification using deep learning technology.

    Science.gov (United States)

    Qiu, Yuchen; Yan, Shiju; Gundreddy, Rohith Reddy; Wang, Yunzhi; Cheng, Samuel; Liu, Hong; Zheng, Bin

    2017-01-01

    To develop and test a deep learning based computer-aided diagnosis (CAD) scheme of mammograms for classifying between malignant and benign masses. An image dataset involving 560 regions of interest (ROIs) extracted from digital mammograms was used. After down-sampling each ROI from 512×512 to 64×64 pixel size, we applied an 8 layer deep learning network that involves 3 pairs of convolution-max-pooling layers for automatic feature extraction and a multiple layer perceptron (MLP) classifier for feature categorization to process ROIs. The 3 pairs of convolution layers contain 20, 10, and 5 feature maps, respectively. Each convolution layer is connected with a max-pooling layer to improve the feature robustness. The output of the sixth layer is fully connected with a MLP classifier, which is composed of one hidden layer and one logistic regression layer. The network then generates a classification score to predict the likelihood of ROI depicting a malignant mass. A four-fold cross validation method was applied to train and test this deep learning network. The results revealed that this CAD scheme yields an area under the receiver operation characteristic curve (AUC) of 0.696±0.044, 0.802±0.037, 0.836±0.036, and 0.822±0.035 for fold 1 to 4 testing datasets, respectively. The overall AUC of the entire dataset is 0.790±0.019. This study demonstrates the feasibility of applying a deep learning based CAD scheme to classify between malignant and benign breast masses without a lesion segmentation, image feature computation and selection process.

  18. A New Approach to Develop Computer-aided Diagnosis Scheme of Breast Mass Classification Using Deep Learning Technology

    Science.gov (United States)

    Qiu, Yuchen; Yan, Shiju; Gundreddy, Rohith Reddy; Wang, Yunzhi; Cheng, Samuel; Liu, Hong; Zheng, Bin

    2017-01-01

    PURPOSE To develop and test a deep learning based computer-aided diagnosis (CAD) scheme of mammograms for classifying between malignant and benign masses. METHODS An image dataset involving 560 regions of interest (ROIs) extracted from digital mammograms was used. After down-sampling each ROI from 512×512 to 64×64 pixel size, we applied an 8 layer deep learning network that involves 3 pairs of convolution-max-pooling layers for automatic feature extraction and a multiple layer perceptron (MLP) classifier for feature categorization to process ROIs. The 3 pairs of convolution layers contain 20, 10, and 5 feature maps, respectively. Each convolution layer is connected with a max-pooling layer to improve the feature robustness. The output of the sixth layer is fully connected with a MLP classifier, which is composed of one hidden layer and one logistic regression layer. The network then generates a classification score to predict the likelihood of ROI depicting a malignant mass. A four-fold cross validation method was applied to train and test this deep learning network. RESULTS The results revealed that this CAD scheme yields an area under the receiver operation characteristic curve (AUC) of 0.696±0.044, 0.802±0.037, 0.836±0.036, and 0.822±0.035 for fold 1 to 4 testing datasets, respectively. The overall AUC of the entire dataset is 0.790±0.019. CONCLUSIONS This study demonstrates the feasibility of applying a deep learning based CAD scheme to classify between malignant and benign breast masses without a lesion segmentation, image feature computation and selection process. PMID:28436410
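
    The architecture described in both records (three convolution/max-pooling pairs with 20, 10 and 5 feature maps feeding a one-hidden-layer MLP with a logistic output) can be sketched in PyTorch as below; the kernel sizes, hidden width and training details are assumptions, since the abstracts do not specify them.

    ```python
    # Hedged sketch: CNN for 64x64 mammogram ROIs (layer shape as described; details assumed).
    import torch
    import torch.nn as nn

    class MassCNN(nn.Module):
        def __init__(self, hidden=100):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv2d(1, 20, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),   # 20 feature maps
                nn.Conv2d(20, 10, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),  # 10 feature maps
                nn.Conv2d(10, 5, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),   # 5 feature maps
            )
            self.mlp = nn.Sequential(
                nn.Flatten(),
                nn.Linear(5 * 8 * 8, hidden), nn.ReLU(),   # one hidden layer
                nn.Linear(hidden, 1), nn.Sigmoid(),        # logistic output: likelihood of malignancy
            )

        def forward(self, x):              # x: (batch, 1, 64, 64) down-sampled ROI
            return self.mlp(self.conv(x))

    roi = torch.randn(8, 1, 64, 64)
    print(MassCNN()(roi).shape)            # torch.Size([8, 1]) classification scores
    ```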

  19. Fire Regime Characteristics along Environmental Gradients in Spain

    Directory of Open Access Journals (Sweden)

    María Vanesa Moreno

    2016-11-01

    Full Text Available Concern regarding global change has increased the need to understand the relationship between fire regime characteristics and the environment. Pyrogeographical theory suggests that fire regimes are constrained by climate, vegetation and fire ignition processes, but it is not obvious how fire regime characteristics are related to those factors. We used a three-matrix approach with a multivariate statistical methodology that combined an ordination method and fourth-corner analysis for hypothesis testing to investigate the relationship between fire regime characteristics and environmental gradients across Spain. Our results suggest that fire regime characteristics (i.e., density and seasonality of fire activity) are constrained primarily by direct gradients based on climate, population, and resource gradients based on forest potential productivity. Our results can be used to establish a predictive model for how fire regimes emerge in order to support fire management, particularly as global environmental changes impact fire regime characteristics.

  20. Using Stable Isotopes to Link Nutrient Sources in the Everglades and Biological Sinks in Florida Bay: A Biogeochemical Approach to Evaluate Ecosystem Response to Changing Nutrient Regimes

    Science.gov (United States)

    Hoare, A. M.; Hollander, D. J.; Heil, C.; Glibert, P.; Murasko, S.; Revilla, M.; Alexander, J.

    2005-05-01

    Anthropogenic influences in South Florida have led to deterioration of its two major ecosystems, the Everglades wetlands and the Florida Bay estuary. Consequently, the Comprehensive Everglades Restoration Plan has been proposed to restore the Everglades ecosystem; however, restoration efforts will likely exert new ecological changes in the Everglades and ultimately Florida Bay. The success of the Florida Everglades restoration depends on our understanding and ability to predict how regional changes in the distribution and composition of dissolved organic and inorganic nutrients will direct the downstream biogeochemical dynamics of Florida Bay. While the transport of freshwater and nutrients to Florida Bay has been studied, much work remains to directly link nutrient dynamics in Florida Bay to nutrient sources in the Everglades. Our study uses stable C and N isotopic measurements of chemical and biological materials from the Everglades and Florida Bay as part of a multi-proxy approach to link nutrient sources in the Everglades to biological sinks in Florida Bay. Isotopic analyses of dissolved and particulate species of water, aquatic vegetation and sedimentary organic matter show that the watersheds within the Everglades are chemically distinct and that these signatures are also reflected in the bay. A large east-west gradient in both carbon and nitrogen (as much as 10‰ for δ15N POM) reflects differing nutrient sources for each region of Florida Bay and is strongly correlated with upstream sources in the Everglades. Isotopic signatures also reflect seasonal relationships associated with wet and dry periods. High C and N values of DOM and POM suggest a significant influence from wastewater in Canal C-111 in eastern Florida Bay, particularly during the dry season. These observations show that nutrients from the Everglades watersheds enter Florida Bay and are important in controlling biogeochemical processes in the bay. This study proves that

  1. Active Learning for Text Classification

    OpenAIRE

    Hu, Rong

    2011-01-01

    Text classification approaches are used extensively to solve real-world challenges. The success or failure of text classification systems hangs on the datasets used to train them; without a good dataset it is impossible to build a quality system. This thesis examines the applicability of active learning in text classification for the rapid and economical creation of labelled training data. Four main contributions are made in this thesis. First, we present two novel selection strategies to cho...

  2. Orbitally shaken shallow fluid layers. I. Regime classification

    Science.gov (United States)

    Alpresa, Paola; Sherwin, Spencer; Weinberg, Peter; van Reeuwijk, Maarten

    2018-03-01

    Orbital shakers are simple devices that provide mixing, aeration, and shear stress at multiple scales and high throughput. For this reason, they are extensively used in a wide range of applications from protein production to bacterial biofilms and endothelial cell experiments. This study focuses on the behaviour of orbitally shaken shallow fluid layers in cylindrical containers. In order to investigate the behaviour over a wide range of different conditions, a significant number of numerical simulations are carried out under different configuration parameters. We demonstrate that potential theory—despite the relatively low Reynolds number of the system—describes the free-surface amplitude well and the velocity field reasonably well, except when the forcing frequency is close to a natural frequency and resonance occurs. By classifying the simulations into non-breaking, breaking, and breaking with part of the bottom uncovered, it is shown that the onset of wave breaking is well described by Δh/(2R) = 0.7Γ, where Δh is the free-surface amplitude, R is the container radius, and Γ is the container aspect ratio; Δh can be well approximated using the potential theory. This result is in agreement with standard wave breaking theories although the significant inertial forcing causes wave breaking at lower amplitudes.
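
    A small worked example of the breaking criterion quoted above, Δh/(2R) = 0.7Γ. The sketch assumes Γ is the fluid-depth-to-diameter aspect ratio and that Δh is already known (from potential theory or measurement); the numbers are illustrative only.

    ```python
    # Hedged sketch: evaluates the wave-breaking onset criterion dh/(2R) > 0.7 * Gamma.
    def breaking_expected(delta_h, R, gamma):
        """True if the non-dimensional free-surface amplitude exceeds 0.7 * aspect ratio."""
        return delta_h / (2.0 * R) > 0.7 * gamma

    # Illustrative container: R = 0.05 m, fluid depth 0.01 m -> Gamma = 0.01 / 0.10 = 0.1.
    print(breaking_expected(delta_h=0.012, R=0.05, gamma=0.1))  # 0.12 > 0.07 -> True
    ```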

  3. Early detection of ecosystem regime shifts

    DEFF Research Database (Denmark)

    Lindegren, Martin; Dakos, Vasilis; Groeger, Joachim P.

    2012-01-01

    methods may have limited utility in ecosystem-based management as they show no or weak potential for early-warning. We therefore propose a multiple method approach for early detection of ecosystem regime shifts in monitoring data that may be useful in informing timely management actions in the face...

  4. Multiple Feature Fusion Based on Co-Training Approach and Time Regularization for Place Classification in Wearable Video

    Directory of Open Access Journals (Sweden)

    Vladislavs Dovgalecs

    2013-01-01

    Full Text Available The analysis of video acquired with a wearable camera is a challenge that the multimedia community is facing with the proliferation of such sensors in various applications. In this paper, we focus on the problem of automatic visual place recognition in a weakly constrained environment, targeting the indexing of video streams by topological place recognition. We propose to combine several machine learning approaches in a time regularized framework for image-based place recognition indoors. The framework combines the power of multiple visual cues and integrates the temporal continuity information of video. We extend it with a computationally efficient semi-supervised method leveraging unlabeled video sequences for improved indexing performance. The proposed approach was applied to challenging video corpora. Experiments on a public and a real-world video sequence database show the gain brought by the different stages of the method.

  5. Hydrological Climate Classification: Can We Improve on Köppen-Geiger?

    Science.gov (United States)

    Knoben, W.; Woods, R. A.; Freer, J. E.

    2017-12-01

    Classification is essential in the study of complex natural systems, yet hydrology so far has no formal way to structure the climate forcing which underlies hydrologic response. Various climate classification systems can be borrowed from other disciplines but these are based on different organizing principles than a hydrological classification might use. From gridded global data we calculate a gridded aridity index, an aridity seasonality index and a rain-vs-snow index, which we use to cluster global locations into climate groups. We then define the membership degree of nearly 1100 catchments to each of our climate groups based on each catchment's climate and investigate the extent to which streamflow responses within each climate group are similar. We compare this climate classification approach with the often-used Köppen-Geiger classification, using statistical tests based on streamflow signature values. We find that three climate indices are sufficient to distinguish 18 different climate types world-wide. Climates tend to change gradually in space and catchments can thus belong to multiple climate groups, albeit with different degrees of membership. Streamflow responses within a climate group tend to be similar, regardless of the catchments' geographical proximity. A Wilcoxon two-sample test based on streamflow signature values for each climate group shows that the new classification can distinguish different flow regimes. The Köppen-Geiger approach uses 29 climate classes but is less able to differentiate streamflow regimes. Climate forcing exerts a strong control on typical hydrologic response and both change gradually in space. This makes arbitrary hard boundaries in any classification scheme difficult to defend. Any hydrological classification should thus acknowledge these gradual changes in forcing. Catchment characteristics (soil or vegetation type, land use, etc.) can vary more quickly in space than climate does, which
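
    The clustering-plus-membership idea can be sketched with standard tools. A minimal sketch assuming scikit-learn: a Gaussian mixture stands in for the soft clustering of the three climate indices into 18 groups, and the index values are synthetic placeholders rather than the gridded data used in the study.

    ```python
    # Hedged sketch: soft clustering of three climate indices into 18 climate groups.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Columns: aridity index, aridity seasonality, rain-vs-snow index (synthetic values).
    X = rng.random((1000, 3))

    gm = GaussianMixture(n_components=18, random_state=0).fit(X)
    labels = gm.predict(X)            # hard climate-group assignment per location
    membership = gm.predict_proba(X)  # degree of membership in each of the 18 groups
    print(labels[:5], membership[0].round(2))
    ```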

  6. Extracting preseismic VLF-VHF electromagnetic signatures: A possible way in which the critical regime is reached as the earthquake approaches

    Science.gov (United States)

    Eftaxias, K.; Kapiris, P.; Karamanos, K.; Balasis, G.; Peratzakis, A.

    2005-12-01

    We view earthquakes (EQs) as large-scale fracture phenomena in the Earth's heterogeneous crust. Our main observational tool is the monitoring of the microfractures, which occur in the prefocal area before the final break-up, by recording their kHz-MHz electromagnetic (EM) emissions, with the MHz radiation appearing earlier than the kHz. Our model of the focal area consists of a backbone of strong and almost homogeneous large asperities that sustains the system and a strongly heterogeneous medium that surrounds the family of strong asperities. We distinguish two characteristic epochs in the evolution of precursory EM activity and identify them with the equivalent critical stages in the EQ preparation process. Our approach will be in terms of critical phase transitions in statistical physics, drawing on recently published results. We obtain two major results. First, the initial MHz part of the preseismic EM emission, which has antipersistent behavior, is triggered by microfractures in the highly disordered system that surrounds the essentially homogeneous "backbone asperities" within the prefocal area and could be described in analogy with a thermal continuous phase transition. However, the analysis reveals that the system is gradually driven out of equilibrium. Considerations of the symmetry-breaking and "intermittent dynamics of critical fluctuations" method estimate the time beyond which the process generating the preseismic EM emission could continue only as a nonequilibrium instability. Second, the abrupt emergence of strong kHz EM emission in the tail of the precursory radiation, showing strong persistent behavior, is thought to be due to the fracture of the high-strength "backbones". The associated phase of the EQ nucleation is a nonequilibrium process without any footprint of an equilibrium thermal phase transition. The family of asperities sustains the system. Physically, the appearance of persistent properties may indicate that the process acquires a self

  7. An ecological economic assessment of flow regimes in a hydropower dominated river basin: the case of the lower Zambezi River, Mozambique.

    Science.gov (United States)

    Fanaian, Safa; Graas, Susan; Jiang, Yong; van der Zaag, Pieter

    2015-02-01

    The flow regime of rivers, being an integral part of aquatic ecosystems, provides many important services benefiting humans in catchments. Past water resource developments characterized by river embankments and dams, however, were often dominated by one (or few) economic use(s) of water. This results in a dramatically changed flow regime negatively affecting the provision of other ecosystem services sustained by the river flow. This study is intended to demonstrate the value of alternative flow regimes in a river that is highly modified by the presence of large hydropower dams and reservoirs, explicitly accounting for a broad range of flow-dependent ecosystem services. In this study, we propose a holistic approach for conducting an ecological economic assessment of a river's flow regime. This integrates recent advances in the conceptualization and classification of ecosystem services (UK NEA, 2011) with the flow regime evaluation technique developed by Korsgaard (2006). This integrated approach allows for a systematic comparison of the economic values of alternative flow regimes, including those that are considered beneficial for aquatic ecosystems. As an illustration, we applied this combined approach to the Lower Zambezi Basin, Mozambique. Empirical analysis shows that even though re-operating dams to create environmentally friendly flow regimes reduces hydropower benefits, the gains to goods derived from the aquatic ecosystem may offset the forgone hydropower benefits, thereby increasing the total economic value of river flow to society. The proposed integrated flow assessment approach can be a useful tool for welfare-improving decision-making in managing river basins. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. On Darboux's approach to R-separability of variables. Classification of conformally flat 4-dimensional binary metrics

    International Nuclear Information System (INIS)

    Szereszewski, A; Sym, A

    2015-01-01

    The standard method of separation of variables in PDEs called the Stäckel–Robertson–Eisenhart (SRE) approach originated in the papers by Robertson (1928 Math. Ann. 98 749–52) and Eisenhart (1934 Ann. Math. 35 284–305) on separability of variables in the Schrödinger equation defined on a pseudo-Riemannian space equipped with orthogonal coordinates, which in turn were based on the purely classical mechanics results by Paul Stäckel (1891, Habilitation Thesis, Halle). These still fundamental results have been further extended in diverse directions by e.g. Havas (1975 J. Math. Phys. 16 1461–8; J. Math. Phys. 16 2476–89) or Koornwinder (1980 Lecture Notes in Mathematics 810 (Berlin: Springer) pp 240–63). The involved separability is always ordinary (factor R = 1) and regular (maximum number of independent parameters in separation equations). A different approach to separation of variables was initiated by Gaston Darboux (1878 Ann. Sci. E.N.S. 7 275–348) which has been almost completely forgotten in today’s research on the subject. Darboux’s paper was devoted to the so-called R-separability of variables in the standard Laplace equation. At the outset he did not make any specific assumption about the separation equations (this is in sharp contrast to the SRE approach). After impressive calculations Darboux obtained a complete solution of the problem. He found not only eleven cases of ordinary separability Eisenhart (1934 Ann. Math. 35 284–305) but also Darboux–Moutard–cyclidic metrics (Bôcher 1894 Ueber die Reihenentwickelungen der Potentialtheorie (Leipzig: Teubner)) and non-regularly separable Dupin-cyclidic metrics as well. In our previous paper Darboux’s approach was extended to the case of the stationary Schrödinger equation on Riemannian spaces admitting orthogonal coordinates. In particular the class of isothermic metrics was defined (isothermicity of the metric is a necessary condition for its R-separability). An important sub

  9. Computing symmetrical strength of N-grams: a two pass filtering approach in automatic classification of text documents.

    Science.gov (United States)

    Agnihotri, Deepak; Verma, Kesari; Tripathi, Priyanka

    2016-01-01

    The contiguous sequences of the terms (N-grams) in the documents are symmetrically distributed among different classes. The symmetrical distribution of the N-grams raises uncertainty about which class an N-gram belongs to. In this paper, we focused on the selection of the most discriminating N-grams by reducing the effects of symmetrical distribution. In this context, a new text feature selection method named the symmetrical strength of the N-grams (SSNG) is proposed, using a two-pass filtering based feature selection (TPF) approach. Initially, in the first pass of the TPF, the SSNG method chooses various informative N-grams from the entire set of N-grams extracted from the corpus. Subsequently, in the second pass the well-known Chi Square (χ²) method is used to select the few most informative N-grams. Further, to classify the documents, the two standard classifiers Multinomial Naive Bayes and Linear Support Vector Machine were applied to ten standard text datasets. On most of the datasets, the experimental results show that the performance and success rate of the SSNG method with the TPF approach are superior to state-of-the-art methods, viz. Mutual Information, Information Gain, Odds Ratio, Discriminating Feature Selection and χ².
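
    The second-pass chi-square filtering and the Multinomial Naive Bayes classifier mentioned above map directly onto common libraries. A minimal sketch, assuming scikit-learn: the paper-specific SSNG first pass is not reproduced, so a plain document-frequency cut-off stands in for it, and the corpus is a public dataset rather than the ten used in the study.

    ```python
    # Hedged sketch: N-gram extraction, a chi-square second-pass filter, and Multinomial NB.
    from sklearn.datasets import fetch_20newsgroups
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.feature_selection import SelectKBest, chi2
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    data = fetch_20newsgroups(subset="train", categories=["sci.med", "sci.space"])
    pipe = make_pipeline(
        CountVectorizer(ngram_range=(1, 2), min_df=5),  # uni-/bi-grams; min_df stands in for the first pass
        SelectKBest(chi2, k=2000),                      # second pass: keep the most informative N-grams
        MultinomialNB(),
    )
    print(cross_val_score(pipe, data.data, data.target, cv=5).mean())
    ```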

  10. Hydrologic classification of rivers based on cluster analysis of dimensionless hydrologic signatures: Applications for environmental instream flows

    Science.gov (United States)

    Praskievicz, S. J.; Luo, C.

    2017-12-01

    Classification of rivers is useful for a variety of purposes, such as generating and testing hypotheses about watershed controls on hydrology, predicting hydrologic variables for ungaged rivers, and setting goals for river management. In this research, we present a bottom-up (based on machine learning) river classification designed to investigate the underlying physical processes governing rivers' hydrologic regimes. The classification was developed for the entire state of Alabama, based on 248 United States Geological Survey (USGS) stream gages that met criteria for length and completeness of records. Five dimensionless hydrologic signatures were derived for each gage: slope of the flow duration curve (indicator of flow variability), baseflow index (ratio of baseflow to average streamflow), rising limb density (number of rising limbs per unit time), runoff ratio (ratio of long-term average streamflow to long-term average precipitation), and streamflow elasticity (sensitivity of streamflow to precipitation). We used a Bayesian clustering algorithm to classify the gages, based on the five hydrologic signatures, into distinct hydrologic regimes. We then used classification and regression trees (CART) to predict each gaged river's membership in different hydrologic regimes based on climatic and watershed variables. Using existing geospatial data, we applied the CART analysis to classify ungaged streams in Alabama, with the National Hydrography Dataset Plus (NHDPlus) catchment (average area 3 km2) as the unit of classification. The results of the classification can be used for meeting management and conservation objectives in Alabama, such as developing statewide standards for environmental instream flows. Such hydrologic classification approaches are promising for contributing to process-based understanding of river systems.
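
    The signature-clustering-then-prediction workflow can be sketched as follows, assuming scikit-learn: a Gaussian mixture stands in for the Bayesian clustering used in the study, a single decision tree stands in for CART, and both the signatures and the climate/watershed predictors are synthetic placeholders.

    ```python
    # Hedged sketch: cluster gages on hydrologic signatures, then predict class membership
    # from climate/watershed attributes so the classes can be extended to ungaged catchments.
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    # 248 gages x 5 signatures: FDC slope, baseflow index, rising-limb density,
    # runoff ratio, streamflow elasticity (synthetic values).
    signatures = rng.random((248, 5))
    predictors = rng.random((248, 8))   # climatic and watershed variables (synthetic)

    regimes = GaussianMixture(n_components=4, random_state=0).fit_predict(signatures)
    cart = DecisionTreeClassifier(max_depth=4, random_state=0).fit(predictors, regimes)
    # cart.predict(new_predictors) would then classify ungaged NHDPlus catchments.
    print(cart.score(predictors, regimes))
    ```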

  11. Beam-hardening correction by a surface fitting and phase classification by a least square support vector machine approach for tomography images of geological samples

    Science.gov (United States)

    Khan, F.; Enzmann, F.; Kersten, M.

    2015-12-01

    In X-ray computed microtomography (μXCT), image processing is the most important operation prior to image analysis. Such processing mainly involves artefact reduction and image segmentation. We propose a new two-stage post-reconstruction procedure of an image of a geological rock core obtained by polychromatic cone-beam μXCT technology. In the first stage, the beam-hardening (BH) is removed by applying a best-fit quadratic surface algorithm to a given image data set (reconstructed slice), which minimizes the BH offsets of the attenuation data points from that surface. The final BH-corrected image is extracted from the residual data, or the difference between the surface elevation values and the original grey-scale values. For the second stage, we propose using a least squares support vector machine (a non-linear classifier algorithm) to segment the BH-corrected data as a pixel-based multi-classification task. A combination of the two approaches was used to classify a complex multi-mineral rock sample. The Matlab code for this approach is provided in the Appendix. A minor drawback is that the proposed segmentation algorithm may become computationally demanding in the case of a high dimensional training data set.
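
    The first stage, the best-fit quadratic surface, is essentially a least-squares problem. A minimal sketch assuming NumPy; the slice is a random placeholder and the paper's least-squares SVM segmentation stage is not reproduced here.

    ```python
    # Hedged sketch: fit z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2 to a reconstructed
    # slice and keep the residual as the beam-hardening-corrected image.
    import numpy as np

    def remove_beam_hardening(slice_2d):
        ny, nx = slice_2d.shape
        y, x = np.mgrid[0:ny, 0:nx]
        A = np.column_stack([np.ones(x.size), x.ravel(), y.ravel(),
                             x.ravel() ** 2, (x * y).ravel(), y.ravel() ** 2])
        coeffs, *_ = np.linalg.lstsq(A, slice_2d.ravel(), rcond=None)
        surface = (A @ coeffs).reshape(ny, nx)
        return slice_2d - surface  # residual = BH-corrected grey values

    corrected = remove_beam_hardening(np.random.rand(128, 128))
    print(corrected.shape)
    ```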

  12. Energy-efficiency based classification of the manufacturing workstation

    Science.gov (United States)

    Frumuşanu, G.; Afteni, C.; Badea, N.; Epureanu, A.

    2017-08-01

    EU Directive 92/75/EC established for the first time an energy consumption labelling scheme, further implemented by several other directives. As a consequence, many products (e.g. home appliances, tyres, light bulbs, houses) nowadays carry an EU Energy Label when offered for sale or rent. Several energy consumption models of manufacturing equipment have also been developed. This paper proposes an energy efficiency-based classification of the manufacturing workstation, aiming to characterize its energetic behaviour. The concept of energy efficiency of the manufacturing workstation is defined. On this basis, a classification methodology has been developed. It specifies the classification criteria and their evaluation modalities, together with the definition and delimitation of energy efficiency classes. The position of the energy class is defined by the amount of energy needed by the workstation at the middle point of its operating domain, while its extension is determined by the value of the first coefficient of the Taylor series that approximates the dependence between energy consumption and the chosen parameter of the working regime. The main domain of interest for this classification appears to be the optimization of manufacturing planning and programming. A case study on classifying an actual lathe from the energy efficiency point of view, based on two different approaches (analytical and numerical), is also included.
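
    The two quantities that define a class, the energy at the midpoint of the operating domain and the first Taylor coefficient of the energy-versus-parameter dependence, can be estimated from a handful of measurements. A minimal sketch assuming NumPy; the measurement values and any class boundaries are synthetic placeholders.

    ```python
    # Hedged sketch: class "position" = energy at the middle of the operating domain,
    # class "extension" = first Taylor (linear) coefficient of energy vs. regime parameter.
    import numpy as np

    param = np.linspace(50.0, 150.0, 11)              # chosen working-regime parameter
    energy = 2.0 + 0.015 * param + 1e-4 * param ** 2  # measured energy consumption (synthetic)

    mid = len(param) // 2
    position = energy[mid]                            # energy at the midpoint of the domain
    extension = np.gradient(energy, param)[mid]       # local slope ~ first Taylor coefficient

    print(f"class position ~ {position:.3f}, class extension ~ {extension:.4f}")
    ```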

  13. Quantitative diagnosis of breast tumors by morphometric classification of microenvironmental myoepithelial cells using a machine learning approach.

    Science.gov (United States)

    Yamamoto, Yoichiro; Saito, Akira; Tateishi, Ayako; Shimojo, Hisashi; Kanno, Hiroyuki; Tsuchiya, Shinichi; Ito, Ken-Ichi; Cosatto, Eric; Graf, Hans Peter; Moraleda, Rodrigo R; Eils, Roland; Grabe, Niels

    2017-04-25

    Machine learning systems have recently received increased attention for their broad applications in several fields. In this study, we show for the first time that histological types of breast tumors can be classified using subtle morphological differences of microenvironmental myoepithelial cell nuclei without any direct information about neoplastic tumor cells. We quantitatively measured 11661 nuclei on the four histological types: normal cases, usual ductal hyperplasia and low/high grade ductal carcinoma in situ (DCIS). Using a machine learning system, we succeeded in classifying the four histological types with 90.9% accuracy. Electron microscopy observations suggested that the activity of typical myoepithelial cells in DCIS was lowered. Through these observations as well as meta-analytic database analyses, we developed a paracrine cross-talk-based biological mechanism of DCIS progressing to invasive cancer. Our observations support novel approaches in clinical computational diagnostics as well as in therapy development against progression.

  14. A Non Linear Scoring Approach for Evaluating Balance: Classification of Elderly as Fallers and Non-Fallers.

    Science.gov (United States)

    Audiffren, Julien; Bargiotas, Ioannis; Vayatis, Nicolas; Vidal, Pierre-Paul; Ricard, Damien

    2016-01-01

    Almost one third of the population 65 years old and older faces at least one fall per year. An accurate evaluation of the risk of falling through simple and easy-to-use measurements is an important issue in current clinical practice. A common way to evaluate balance in posturography is through the recording of the centre-of-pressure (CoP) displacement (statokinesigram) with force platforms. A variety of indices have been proposed to differentiate fallers from non-fallers. However, no agreement has been reached on whether these analyses alone can sufficiently explain the complex synergies of postural control. In this work, we study the statokinesigrams of 84 elderly subjects (80.3 ± 6.4 years old) who had no impairment related to balance control. Each subject was recorded for 25 seconds with eyes open and 25 seconds with eyes closed, and information pertaining to the presence of balance problems, such as falls, in the last six months was collected. Five descriptors of the statokinesigrams were computed for each record, and a Ranking Forest algorithm was used to combine those features in order to evaluate each subject's balance with a score. A classical train-test split approach was used to evaluate the performance of the method through ROC analysis. ROC analysis showed that the performance of each descriptor separately was close to a random classifier (AUC between 0.49 and 0.54). On the other hand, the score obtained by our method reached an AUC of 0.75 on the test set, consistent over multiple train-test splits. This non-linear multi-dimensional approach seems appropriate for evaluating complex postural control.
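
    The evaluation protocol (combine five descriptors into one score, then compare single-descriptor and combined-score AUCs on a held-out split) can be sketched as below, assuming scikit-learn. A random forest stands in for the Ranking Forest algorithm, and the descriptor values are synthetic.

    ```python
    # Hedged sketch: per-descriptor AUC vs. AUC of a learned combined score on a test split.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    X = rng.random((84, 5))                 # 5 statokinesigram descriptors per subject
    y = rng.integers(0, 2, 84)              # 1 = reported a fall in the last six months

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
    score = RandomForestClassifier(random_state=0).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print("combined score AUC:", round(roc_auc_score(y_te, score), 3))
    for j in range(X.shape[1]):             # each descriptor used alone as a score
        print(f"descriptor {j} AUC:", round(roc_auc_score(y_te, X_te[:, j]), 3))
    ```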

  15. A Non Linear Scoring Approach for Evaluating Balance: Classification of Elderly as Fallers and Non-Fallers.

    Directory of Open Access Journals (Sweden)

    Julien Audiffren

    Full Text Available Almost one third of the population 65 years old and older faces at least one fall per year. An accurate evaluation of the risk of falling through simple and easy-to-use measurements is an important issue in current clinical practice. A common way to evaluate balance in posturography is through the recording of the centre-of-pressure (CoP) displacement (statokinesigram) with force platforms. A variety of indices have been proposed to differentiate fallers from non-fallers. However, no agreement has been reached on whether these analyses alone can sufficiently explain the complex synergies of postural control. In this work, we study the statokinesigrams of 84 elderly subjects (80.3 ± 6.4 years old) who had no impairment related to balance control. Each subject was recorded for 25 seconds with eyes open and 25 seconds with eyes closed, and information pertaining to the presence of balance problems, such as falls, in the last six months was collected. Five descriptors of the statokinesigrams were computed for each record, and a Ranking Forest algorithm was used to combine those features in order to evaluate each subject's balance with a score. A classical train-test split approach was used to evaluate the performance of the method through ROC analysis. ROC analysis showed that the performance of each descriptor separately was close to a random classifier (AUC between 0.49 and 0.54). On the other hand, the score obtained by our method reached an AUC of 0.75 on the test set, consistent over multiple train-test splits. This non-linear multi-dimensional approach seems appropriate for evaluating complex postural control.

  16. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

    This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners also find in the book a detailed presentation of practical data classifiers using MEE. These include multi-layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using an MEE-like concept is also presented. Examples, tests, evaluation experiments and comparisons with similar machines using classic approaches complement the descriptions.

  17. Classification of iconic images

    OpenAIRE

    Zrianina, Mariia; Kopf, Stephan

    2016-01-01

    Iconic images represent an abstract topic and use a presentation that is intuitively understood within a certain cultural context. For example, the abstract topic “global warming” may be represented by a polar bear standing alone on an ice floe. Such images are widely used in media and their automatic classification can help to identify high-level semantic concepts. This paper presents a system for the classification of iconic images. It uses a variation of the Bag of Visual Words approach wi...

  18. Classification of protein fold classes by knot theory and prediction of folds by neural networks: A combined theoretical and experimental approach

    DEFF Research Database (Denmark)

    Ramnarayan, K.; Bohr, Henrik; Jalkanen, Karl J.

    2008-01-01

    We present different means of classifying protein structure. One is made rigorous by mathematical knot invariants that coincide reasonably well with ordinary graphical fold classification and another classification is by packing analysis. Furthermore when constructing our mathematical fold...... classifications, we utilize standard neural network methods for predicting protein fold classes from amino acid sequences. We also make an analysis of the redundancy of the structural classifications in relation to function and ligand binding. Finally we advocate the use of combining the measurement of the VA...

  19. Multi-Cohort Stand Structural Classification: Ground- and LiDAR-based Approaches for Boreal Mixedwood and Black Spruce Forest Types of Northeastern Ontario

    Science.gov (United States)

    Kuttner, Benjamin George

    Natural fire return intervals are relatively long in eastern Canadian boreal forests and often allow for the development of stands with multiple, successive cohorts of trees. Multi-cohort forest management (MCM) provides a strategy to maintain such multi-cohort stands that focuses on three broad phases of increasingly complex, post-fire stand development, termed "cohorts", and recommends different silvicultural approaches be applied to emulate different cohort types. Previous research on structural cohort typing has relied upon primarily subjective classification methods; in this thesis, I develop more comprehensive and objective methods for three common boreal mixedwood and black spruce forest types in northeastern Ontario. Additionally, I examine relationships between cohort types and stand age, productivity, and disturbance history and the utility of airborne LiDAR to retrieve ground-based classifications and to extend structural cohort typing from plot- to stand-levels. In both mixedwood and black spruce forest types, stand age and age-related deadwood features varied systematically with cohort classes in support of an age-based interpretation of increasing cohort complexity. However, correlations of stand age with cohort classes were surprisingly weak. Differences in site productivity had a significant effect on the accrual of increasingly complex multi-cohort stand structure in both forest types, especially in black spruce stands. The effects of past harvesting in predictive models of class membership were only significant when considered in isolation of age. As an age-emulation strategy, the three cohort model appeared to be poorly suited to black spruce forests where the accrual of structural complexity appeared to be more a function of site productivity than age. Airborne LiDAR data appear to be particularly useful in recovering plot-based cohort types and extending them to the stand-level. The main gradients of structural variability detected using Li

  20. The Effect of Creative Tasks on Electrocardiogram: Using Linear and Nonlinear Features in Combination with Classification Approaches

    Directory of Open Access Journals (Sweden)

    Sahar Zakeri

    2017-02-01

    Full Text Available Objective: Interest in the subject of creativity and its impacts on human life is growing extensively. However, only a few surveys pay attention to the relation between creativity and physiological changes. This paper presents a novel approach to distinguish creativity states from electrocardiogram signals. Nineteen linear and nonlinear features of the cardiac signal were extracted to detect creativity states. Method: ECG signals of 52 participants were recorded while doing three tasks of the Torrance Tests of Creative Thinking (TTCT/figural B). To remove artifacts, a 50 Hz notch filter and a Chebyshev II filter were applied. According to TTCT scores, participants were categorized into the high and low creativity groups: Participants with scores higher than 70 were assigned to the high creativity group and those with scores less than 30 were considered the low creativity group. Some linear and nonlinear features were extracted from the ECGs. Then, Support Vector Machine (SVM) and Adaptive Neuro-Fuzzy Inference System (ANFIS) were used to classify the groups. Results: Applying the Wilcoxon test, significant differences were observed between rest and each of the three creativity tasks. However, better discrimination was achieved between rest and the first task. In addition, there were no statistical differences between the second and third task of the test. The results indicated that the SVM effectively distinguishes all three tasks from rest, particularly task 1, and reached a maximum accuracy of 99.63% in the linear analysis. In addition, the high creativity group was separated from the low creativity group with an accuracy of 98.41%. Conclusion: The combination of the SVM classifier with linear features can be useful to show the relation between creativity and physiological changes.

  1. Detecting epileptic seizure with different feature extracting strategies using robust machine learning classification techniques by applying advance parameter optimization approach.

    Science.gov (United States)

    Hussain, Lal

    2018-06-01

    Epilepsy is a neurological disorder produced by abnormal excitability of neurons in the brain. Research shows that brain activity can be monitored through the electroencephalogram (EEG) of patients suffering from seizures in order to detect epileptic seizures. The performance of EEG-based epilepsy detection depends on the feature extraction strategies used. In this research, we applied varying feature extraction strategies based on time and frequency domain characteristics, nonlinear features, wavelet-based entropy and a few statistical features. A deeper study was undertaken using novel machine learning classifiers by considering multiple factors. The support vector machine kernels were evaluated based on multiclass kernel and box constraint level. Likewise, for K-nearest neighbors (KNN), we compared different distance metrics, neighbor weights and numbers of neighbors. Similarly, for the decision trees we tuned the parameters based on maximum splits and split criteria, and the ensemble classifiers were evaluated based on different ensemble methods and learning rates. For training/testing, tenfold cross-validation was employed and performance was evaluated in terms of TPR, NPR, PPV, accuracy and AUC. In this research, a deeper analysis was performed using diverse feature extraction strategies and robust machine learning classifiers with more advanced optimization of their options. The Support Vector Machine with a linear kernel and KNN with the city block distance metric gave the overall highest accuracy of 99.5%, which was higher than using the default parameters for these classifiers. Moreover, the highest separation (AUC = 0.9991, 0.9990) was obtained at different kernel scales using SVM. Additionally, K-nearest neighbors with inverse squared distance weighting gave higher performance at different numbers of neighbors. Moreover, in distinguishing postictal heart rate oscillations from epileptic ictal subjects, the highest performance of 100% was obtained using different machine learning classifiers.
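
    The parameter-tuning idea, evaluating SVM kernels/box constraints and KNN distance metrics/weights/neighbour counts under tenfold cross-validation, is a standard grid search. A minimal sketch assuming scikit-learn and already-extracted EEG feature vectors; the data here are synthetic and the grids are illustrative.

    ```python
    # Hedged sketch: tenfold-CV grid search over SVM and KNN options for seizure-detection features.
    import numpy as np
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(2)
    X = rng.random((200, 30))               # extracted EEG features (synthetic)
    y = rng.integers(0, 2, 200)             # seizure vs. non-seizure labels (synthetic)

    svm_search = GridSearchCV(SVC(), {"kernel": ["linear", "rbf"],
                                      "C": [0.1, 1, 10]}, cv=10)
    knn_search = GridSearchCV(KNeighborsClassifier(),
                              {"metric": ["cityblock", "euclidean"],
                               "n_neighbors": [3, 5, 7],
                               "weights": ["uniform", "distance"]}, cv=10)
    print(svm_search.fit(X, y).best_params_)
    print(knn_search.fit(X, y).best_params_)
    ```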

  2. Machine learning approaches for integrating clinical and imaging features in late-life depression classification and response prediction.

    Science.gov (United States)

    Patel, Meenal J; Andreescu, Carmen; Price, Julie C; Edelman, Kathryn L; Reynolds, Charles F; Aizenstein, Howard J

    2015-10-01

    Currently, depression diagnosis relies primarily on behavioral symptoms and signs, and treatment is guided by trial and error instead of evaluating associated underlying brain characteristics. Unlike past studies, we attempted to estimate accurate prediction models for late-life depression diagnosis and treatment response using multiple machine learning methods with inputs of multi-modal imaging and non-imaging whole brain and network-based features. Late-life depression patients (medicated post-recruitment) (n = 33) and older non-depressed individuals (n = 35) were recruited. Their demographics and cognitive ability scores were recorded, and brain characteristics were acquired using multi-modal magnetic resonance imaging pretreatment. Linear and nonlinear learning methods were tested for estimating accurate prediction models. A learning method called alternating decision trees estimated the most accurate prediction models for late-life depression diagnosis (87.27% accuracy) and treatment response (89.47% accuracy). The diagnosis model included measures of age, Mini-Mental State Examination score, and structural imaging (e.g. whole brain atrophy and global white matter hyperintensity burden). The treatment response model included measures of structural and functional connectivity. Combinations of multi-modal imaging and/or non-imaging measures may help better predict late-life depression diagnosis and treatment response. As a preliminary observation, we speculate that the results may also suggest that different underlying brain characteristics defined by multi-modal imaging measures, rather than region-based differences, are associated with depression versus depression recovery, because to our knowledge this is the first depression study to accurately predict both using the same approach. These findings may help better understand late-life depression and identify preliminary steps toward personalized late-life depression treatment. Copyright © 2015 John Wiley

  3. Tissue Classification

    DEFF Research Database (Denmark)

    Van Leemput, Koen; Puonti, Oula

    2015-01-01

    Computational methods for automatically segmenting magnetic resonance images of the brain have seen tremendous advances in recent years. So-called tissue classification techniques, aimed at extracting the three main brain tissue classes (white matter, gray matter, and cerebrospinal fluid), are now...... well established. In their simplest form, these methods classify voxels independently based on their intensity alone, although much more sophisticated models are typically used in practice. This article aims to give an overview of often-used computational techniques for brain tissue classification...

  4. About the high flow regime of the rivers of Kosovo and Metohia

    Directory of Open Access Journals (Sweden)

    Živković Nenad

    2009-01-01

    Full Text Available The examples from Kosovo and Metohia are used to point to some problems in the domain of hydrogeographic regionalization. The river water regime, especially the phase of high flows which marks this regime, has been the topic of almost all research that treats the water resources of drainage basins. However, no unique solution has yet been found by which rivers could be classified according to this feature. This example shows that even some older methods, based on genetic analysis of hydrographs and of a global type, as well as some more recent ones, with a lot of quantitative input and regional approaches, cannot answer with certainty all the challenges that river regimes bring with them. This work shows that, apart from the climatic, orographic and physiognomic features of drainage basins, the periods of data processing and the analysis of individual intra-annual series of discharges are very important as well. Discretization into time periods shorter than one month, as well as elimination of the extreme discharge values from the long-term series, is recommended for future research.

  5. The global safety regime - Setting the stage

    International Nuclear Information System (INIS)

    Meserve, R.A.

    2005-01-01

    The existing global safety regime has arisen from the exercise of sovereign authority, with an overlay of voluntary international cooperation from a network of international and regional organizations and intergovernmental agreements. This system has, in the main, served us well. For several reasons, the time is ripe to consider the desired shape of a future global safety regime and to take steps to achieve it. First, every nation's reliance on nuclear power is hostage to some extent to safety performance elsewhere in the world because of the effects on public attitudes and hence there is an interest in ensuring achievement of common standards. Second, the world is increasingly interdependent and the vendors of nuclear power plants seek to market their products throughout the globe. Efficiency would arise from the avoidance of needless differences in approach that require custom modifications from country to country. Finally, we have much to learn from each other and a common effort would strengthen us all. Such an effort might also serve to enhance public confidence. Some possible characteristics of such a regime can be identified. The regime should reflect a global consensus on the level of safety that should be achieved. There should be sufficient standardization of approach so that expertise and equipment can be used everywhere without significant modification. There should be efforts to ensure a fundamental commitment to safety and the encouragement of a safety culture. And there should be efforts to adopt more widely the best regulatory practices, recognizing that some modifications in approach may be necessary to reflect each nation's legal and social culture. At the same time, the regime should have the characteristics of flexibility, transparency, stability, practicality, and encouragement of competence. (author)

  6. Typification of the thermal regime of the air in Nicaragua

    International Nuclear Information System (INIS)

    Lecha Estela, Luis; Hernandez Perez, Vidal; Prado Zambrana, Carmen

    1994-01-01

    In this work, the method of thermal regime classification is applied in order to evaluate the heat resources of the country, as a first step towards understanding and rationally employing the national climatic resources. The interaction between the spatio-temporal distribution of the thermal regime and the main climatic factors is analyzed, showing the differences found between the geographic zones of the country and, moreover, their vertical structure. The results have practical utility in several branches of the national economy and they were included in the work to prepare the Climatic Atlas of Nicaragua

  7. Weather types and the regime of wildfires in Portugal

    Science.gov (United States)

    Pereira, M. G.; Trigo, R. M.; Dacamara, C. C.

    2009-04-01

    An objective classification scheme, as developed by Trigo and DaCamara (2000), was applied to classify the daily atmospheric circulation affecting Portugal between 1980 and 2007 into a set of 10 basic weather types (WTs). The classification scheme relies on a set of atmospheric circulation indices, namely southerly flow (SF), westerly flow (WF), total flow (F), southerly shear vorticity (ZS), westerly shear vorticity (ZW) and total vorticity (Z). The weather-typing approach, together with surface meteorological variables (e.g. intensity and direction of geostrophic wind, maximum and minimum temperature and precipitation), was then associated with wildfire events as recorded in the official Portuguese fire database, consisting of information on each fire that occurred in the 18 districts of Continental Portugal within the same period (>450,000 events). The objective of this study is to explore the dependence of wildfire activity on weather and climate and then evaluate the potential of WTs to discriminate among recorded wildfires with respect to their occurrence and development. Results show that days characterised by surface flow with an eastern component (i.e. NE, E and SE) account for a high percentage of daily burnt area, as opposed to surface westerly flow (NW, W and SW), which represents about a quarter of the total number of days but only accounts for a very low percentage of active fires and of burnt area. Meteorological variables such as minimum and maximum temperatures, which are closely associated with surface wind intensity and direction, also show a good ability to discriminate between the different types of fire events. Trigo, R.M., DaCamara, C. (2000) "Circulation Weather Types and their impact on the precipitation regime in Portugal". Int. J. of Climatology, 20, 1559-1581.

  8. Effects of heat waves on daily excess mortality in 14 Korean cities during the past 20 years (1991-2010): an application of the spatial synoptic classification approach

    Science.gov (United States)

    Lee, Dae-Geun; Kim, Kyu Rang; Kim, Jiyoung; Kim, Baek-Jo; Cho, Chun-Ho; Sheridan, Scott C.; Kalkstein, Laurence S.; Kim, Ho; Yi, Seung-Muk

    2017-11-01

    The aims of this study are to explore the "offensive" summer weather types classified under the spatial synoptic classification (SSC) system and to evaluate their impacts on excess mortality in 14 Korean cities. All-cause deaths per day for the entire population were examined over the summer months (May-September) of 1991-2010. Daily deaths were standardized to account for long-term trends of subcycles (annual, seasonal, and weekly) at the mid-latitudes. In addition, a mortality prediction model was constructed through multiple stepwise regression to develop a heat-health warning system based on synoptic climatology. The results showed that dry tropical (DT) days during early summer caused excess mortality due to non-acclimatization by inhabitants, and moist tropical (MT) plus and double plus resulted in greater spikes of excess mortality due to extremely hot and humid conditions. Among the 14 Korean cities, high excess mortality for the elderly was observed in Incheon (23.2%, 95%CI 5.6), Seoul (15.8%, 95%CI 2.6), and Jeonju (15.8%, 95%CI 4.6). No time lag effect was observed, and excess mortality gradually increased with time and hot weather simultaneously. The model showed weak performance as its predictions were underestimated for the validation period (2011-2015). Nevertheless, the results clearly revealed that relative and multiple-variable approaches perform better than absolute and single-variable approaches. The results indicate the potential of the SSC as a suitable system for investigating heat vulnerability in South Korea, where hot summers could be a significant risk factor.

  9. COMBINED ANALYSIS OF SENTINEL-1 AND RAPIDEYE DATA FOR IMPROVED CROP TYPE CLASSIFICATION: AN EARLY SEASON APPROACH FOR RAPESEED AND CEREALS

    Directory of Open Access Journals (Sweden)

    U. Lussem

    2016-06-01

    Full Text Available Timely availability of crop acreage estimation is crucial for maintaining economic and ecological sustainability or modelling purposes. Remote sensing data has proven to be a reliable source for crop mapping and acreage estimation on parcel-level. However, when relying on a single source of remote sensing data, e.g. multispectral sensors like RapidEye or Landsat, several obstacles can hamper the desired outcome, for example cloud cover or haze. Another limitation may be a similarity in optical reflectance patterns of crops, especially in an early season approach by the end of March, early April. Usually, a reliable crop type map for winter-crops (winter wheat/rye, winter barley and rapeseed) in Central Europe can be obtained by using optical remote sensing data from late April to early May, given a full coverage of the study area and cloudless conditions. These prerequisites can often not be met. By integrating dual-polarimetric SAR-sensors with high temporal and spatial resolution, these limitations can be overcome. SAR-sensors are not influenced by clouds or haze and provide an additional source of information due to the signal-interaction with plant-architecture. The overall goal of this study is to investigate the contribution of Sentinel-1 SAR-data to regional crop type mapping for an early season map of disaggregated winter-crops for a subset of the Rur-Catchment in North Rhine-Westphalia (Germany). For this reason, RapidEye data and Sentinel-1 data are combined and the performance of Support Vector Machine and Maximum Likelihood classifiers are compared. Our results show that a combination of Sentinel-1 and RapidEye is a promising approach for most crops, but consideration of phenology for data selection can improve results. Thus the combination of optical and radar remote sensing data indicates advances for crop-type classification, especially when optical data availability is limited.

  10. Combined Analysis of SENTINEL-1 and Rapideye Data for Improved Crop Type Classification: AN Early Season Approach for Rapeseed and Cereals

    Science.gov (United States)

    Lussem, U.; Hütt, C.; Waldhoff, G.

    2016-06-01

    Timely availability of crop acreage estimation is crucial for maintaining economic and ecological sustainability or modelling purposes. Remote sensing data has proven to be a reliable source for crop mapping and acreage estimation on parcel-level. However, when relying on a single source of remote sensing data, e.g. multispectral sensors like RapidEye or Landsat, several obstacles can hamper the desired outcome, for example cloud cover or haze. Another limitation may be a similarity in optical reflectance patterns of crops, especially in an early season approach by the end of March, early April. Usually, a reliable crop type map for winter-crops (winter wheat/rye, winter barley and rapeseed) in Central Europe can be obtained by using optical remote sensing data from late April to early May, given a full coverage of the study area and cloudless conditions. These prerequisites can often not be met. By integrating dual-polarimetric SAR-sensors with high temporal and spatial resolution, these limitations can be overcome. SAR-sensors are not influenced by clouds or haze and provide an additional source of information due to the signal-interaction with plant-architecture. The overall goal of this study is to investigate the contribution of Sentinel-1 SAR-data to regional crop type mapping for an early season map of disaggregated winter-crops for a subset of the Rur-Catchment in North Rhine-Westphalia (Germany). For this reason, RapidEye data and Sentinel-1 data are combined and the performance of Support Vector Machine and Maximum Likelihood classifiers are compared. Our results show that a combination of Sentinel-1 and RapidEye is a promising approach for most crops, but consideration of phenology for data selection can improve results. Thus the combination of optical and radar remote sensing data indicates advances for crop-type classification, especially when optical data availability is limited.
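
    The feature-level combination and the classifier comparison can be sketched as follows, assuming scikit-learn: quadratic discriminant analysis stands in for the Maximum Likelihood classifier, and the per-pixel (or per-parcel) band values are synthetic placeholders.

    ```python
    # Hedged sketch: stack optical (RapidEye) and SAR (Sentinel-1) features, compare classifiers.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    optical = rng.random((500, 5))          # RapidEye band values (synthetic)
    sar = rng.random((500, 2))              # Sentinel-1 VV/VH backscatter (synthetic)
    X = np.hstack([optical, sar])
    y = rng.integers(0, 4, 500)             # crop classes, e.g. wheat/rye, barley, rapeseed, other

    for clf in (SVC(kernel="rbf"), QuadraticDiscriminantAnalysis()):
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(type(clf).__name__, round(acc, 3))
    ```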

  11. Transporter Classification Database (TCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Transporter Classification Database details a comprehensive classification system for membrane transport proteins known as the Transporter Classification (TC)...

  12. Recognition Using Classification and Segmentation Scoring

    National Research Council Canada - National Science Library

    Kimball, Owen; Ostendorf, Mari; Rohlicek, Robin

    1992-01-01

    .... We describe an approach to connected word recognition that allows the use of segmental information through an explicit decomposition of the recognition criterion into classification and segmentation scoring...

  13. Improvement of Classification of Enterprise Circulating Funds

    OpenAIRE

    Rohanova Hanna O.

    2014-01-01

    The goal of the article is to reveal possibilities for increasing the efficiency of managing enterprise circulating funds by improving their classification features. Having analysed the approaches of many economists to the classification of enterprise circulating funds, and having systematised and supplemented them, the article offers grouped classification features of enterprise circulating funds. As a result of the study, the article offers an expanded classification of circulating funds, ...

  14. The concept of regime values: Are revitalization and regime change possible?

    NARCIS (Netherlands)

    Overeem, P.

    2015-01-01

    Among the plethora of public values, one special class is that of “regime values.” This notion plays a central role in the constitutional approach to public administration mainly developed by the late John A. Rohr. In this article, an attempt is made to assess the viability of Rohr’s concept of

  15. Water regime of steam power plants

    International Nuclear Information System (INIS)

    Oesz, Janos

    2011-01-01

    The water regime of water-steam thermal power plants (secondary side of pressurized water reactors (PWR); fossil-fired thermal power plants - referred to as steam power plants) has changed in the past 30 years, due to a shift from a water chemistry to a water regime approach. The article summarizes measures (realised by the chemists of NPP Paks) by which the secondary side of NPP Paks has become a high-purity water-steam power plant and by which the water chemistry stress corrosion risk of heat transfer tubes in the VVER-440 steam generators was minimized. The measures can also be applied to the water regime of fossil-fired thermal power plants with super- and subcritical steam pressure. Based on the reliability analogue of PWR steam generators, the water regime can be defined as the harmony of construction, material(s) and water chemistry, which needs to be ensured not only in the steam generators (boiler) but in each heat exchanger of the steam power plant: - Construction determines the processes of flow, heat and mass transfer and their local inequalities; - Material(s) determines the minimal rate of general corrosion and the sensitivity to local corrosion damage; - Water chemistry influences the general corrosion of material(s) and the transport of corrosion products, as well as the formation of local corrosion environments. (orig.)

  16. CLASSIFICATION OF CRIMINAL GROUPS

    OpenAIRE

    Natalia Romanova

    2013-01-01

    New types of criminal groups are emerging in modern society. These types have their own special criminal subculture. The research objective is to develop new parameters of classification of modern criminal groups, create a new typology of criminal groups and identify some features of their subculture. The research methodology is based on the system approach, which includes the method of analysis of documentary sources (materials of a criminal case), the method of conversations with the members of the...

  17. Decimal Classification Editions

    Directory of Open Access Journals (Sweden)

    Zenovia Niculescu

    2009-01-01

    Full Text Available The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology, reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme from the point of view of improving the informational offer, through basic index terms, revised and developed, as well as valuing the auxiliary notations.

  18. Decimal Classification Editions

    OpenAIRE

    Zenovia Niculescu

    2009-01-01

    The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology, reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme from the point of view of improving the informational offer, through basic index terms, revised and developed, as well as valuing the auxiliary notations.

  19. Efficient Fingercode Classification

    Science.gov (United States)

    Sun, Hong-Wei; Law, Kwok-Yan; Gollmann, Dieter; Chung, Siu-Leung; Li, Jian-Bin; Sun, Jia-Guang

    In this paper, we present an efficient fingerprint classification algorithm which is an essential component in many critical security application systems, e.g. systems in the e-government and e-finance domains. Fingerprint identification is one of the most important security requirements in homeland security systems such as personnel screening and anti-money laundering. The problem of fingerprint identification involves searching (matching) the fingerprint of a person against each of the fingerprints of all registered persons. To enhance performance and reliability, a common approach is to reduce the search space by firstly classifying the fingerprints and then performing the search in the respective class. Jain et al. proposed a fingerprint classification algorithm based on a two-stage classifier, which uses a K-nearest neighbor classifier in its first stage. The fingerprint classification algorithm is based on the fingercode representation which is an encoding of fingerprints that has been demonstrated to be an effective fingerprint biometric scheme because of its ability to capture both local and global details in a fingerprint image. We enhance this approach by improving the efficiency of the K-nearest neighbor classifier for fingercode-based fingerprint classification. Our research firstly investigates the various fast search algorithms in vector quantization (VQ) and the potential application in fingerprint classification, and then proposes two efficient algorithms based on the pyramid-based search algorithms in VQ. Experimental results on DB1 of FVC 2004 demonstrate that our algorithms can outperform the full search algorithm and the original pyramid-based search algorithms in terms of computational efficiency without sacrificing accuracy.
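
    The first-stage K-nearest-neighbour step over fingercode vectors can be sketched as below, assuming scikit-learn. The pyramid-based fast search proposed in the paper is not reproduced (brute-force search is used instead), and the 640-dimensional fingercode size and five fingerprint classes are assumptions for illustration.

    ```python
    # Hedged sketch: KNN over fingercode vectors to pre-select a fingerprint class
    # before the (much more expensive) one-to-many matching stage.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(5)
    fingercodes = rng.random((400, 640))    # enrolled fingercode vectors (dimension assumed)
    classes = rng.integers(0, 5, 400)       # e.g. arch, tented arch, left loop, right loop, whorl

    knn = KNeighborsClassifier(n_neighbors=5, algorithm="brute").fit(fingercodes, classes)
    query = rng.random((1, 640))
    print(knn.predict(query))               # search only within the predicted class afterwards
    ```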

  20. Differential Classification of Dementia

    Directory of Open Access Journals (Sweden)

    E. Mohr

    1995-01-01

    Full Text Available In the absence of biological markers, dementia classification remains complex both in terms of characterization as well as early detection of the presence or absence of dementing symptoms, particularly in diseases with possible secondary dementia. An empirical, statistical approach using neuropsychological measures was therefore developed to distinguish demented from non-demented patients and to identify differential patterns of cognitive dysfunction in neurodegenerative disease. Age-scaled neurobehavioral test results (Wechsler Adult Intelligence Scale-Revised and Wechsler Memory Scale) from Alzheimer's (AD) and Huntington's (HD) patients, matched for intellectual disability, as well as normal controls were used to derive a classification formula. Stepwise discriminant analysis accurately (99% correct) distinguished controls from demented patients, and separated the two patient groups (79% correct). Variables discriminating between HD and AD patient groups consisted of complex psychomotor tasks, visuospatial function, attention and memory. The reliability of the classification formula was demonstrated with a new, independent sample of AD and HD patients which yielded virtually identical results (classification accuracy for dementia: 96%; AD versus HD: 78%). To validate the formula, the discriminant function was applied to Parkinson's (PD) patients, 38% of whom were classified as demented. The validity of the classification was demonstrated by significant PD subgroup differences on measures of dementia not included in the discriminant function. Moreover, a majority of demented PD patients (65%) were classified as having an HD-like pattern of cognitive deficits, in line with previous reports of the subcortical nature of PD dementia. This approach may thus be useful in classifying presence or absence of dementia and in discriminating between dementia subtypes in cases of secondary or coincidental dementia.
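
    The record above rests on a standard discriminant analysis; the sketch below is a minimal, hypothetical illustration of that idea (stepwise variable selection omitted), classifying two simulated diagnostic groups from age-scaled test scores with linear discriminant analysis. All data and group profiles are invented.

      # Hedged sketch: linear discriminant analysis separating two diagnostic
      # groups from simulated neuropsychological scores, with cross-validated
      # classification accuracy. Sample sizes and effect sizes are assumptions.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n_per_group, n_tests = 40, 8                                      # assumed group size and test count
      X = np.vstack([rng.normal(0.0, 1.0, (n_per_group, n_tests)),     # group 0, e.g. AD-like profile
                     rng.normal(0.8, 1.0, (n_per_group, n_tests))])    # group 1, e.g. HD-like profile
      y = np.array([0] * n_per_group + [1] * n_per_group)

      lda = LinearDiscriminantAnalysis()
      accuracy = cross_val_score(lda, X, y, cv=5).mean()                # out-of-sample accuracy
      print(f"cross-validated accuracy: {accuracy:.2f}")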

  1. The development of a classification system for inland aquatic ...

    African Journals Online (AJOL)

    A classification system is described that was developed for inland aquatic ecosystems in South Africa, including wetlands. The six-tiered classification system is based on a top-down, hierarchical classification of aquatic ecosystems, following the functionally-oriented hydrogeomorphic (HGM) approach to classification but ...

  2. Classification and description of world formation types

    Science.gov (United States)

    D. Faber-Langendoen; T. Keeler-Wolf; D. Meidinger; C. Josse; A. Weakley; D. Tart; G. Navarro; B. Hoagland; S. Ponomarenko; G. Fults; Eileen Helmer

    2016-01-01

    An ecological vegetation classification approach has been developed in which a combination of vegetation attributes (physiognomy, structure, and floristics) and their response to ecological and biogeographic factors are used as the basis for classifying vegetation types. This approach can help support international, national, and subnational classification efforts. The...

  3. Current US nuclear liability regime

    International Nuclear Information System (INIS)

    Brown, O.F.

    2000-01-01

    The Price-Anderson Act was adopted by the US Congress in 1957 as the world's first national nuclear liability regime. It is a comprehensive, complicated and unique system that stems from special features of the US legal regime and federal system of government. It differs from other systems by providing for 'economic', rather than legal, channeling of liability to the facility operator; it is not recommended as a model for other states, but most of its features have been adopted by other states and international conventions.

  4. Endogenous Monetary Policy Regime Change

    OpenAIRE

    Troy Davig; Eric M. Leeper

    2006-01-01

    This paper makes changes in monetary policy rules (or regimes) endogenous. Changes are triggered when certain endogenous variables cross specified thresholds. Rational expectations equilibria are examined in three models of threshold switching to illustrate that (i) expectations formation effects generated by the possibility of regime change can be quantitatively important; (ii) symmetric shocks can have asymmetric effects; (iii) endogenous switching is a natural way to formally model preempt...
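
    To make the threshold-switching mechanism concrete, here is a toy simulation (not the paper's model) in which the policy response to inflation switches regime whenever lagged inflation crosses a threshold; all parameter values are illustrative assumptions.

      # Toy endogenous regime switching: the regime is determined by the lagged
      # state of the economy rather than by an exogenous Markov chain.
      import numpy as np

      rng = np.random.default_rng(2)
      T, threshold = 200, 2.0
      alpha = {0: 1.5, 1: 2.5}          # assumed regime-specific policy responses
      rho, sigma = 0.8, 0.5             # assumed persistence and shock volatility

      inflation = np.zeros(T)
      regime = np.zeros(T, dtype=int)
      for t in range(1, T):
          regime[t] = 1 if inflation[t - 1] > threshold else 0        # endogenous trigger
          policy_drag = 0.1 * alpha[regime[t]] * inflation[t - 1]     # stronger rule dampens inflation more
          inflation[t] = rho * inflation[t - 1] - policy_drag + sigma * rng.normal()

      print(f"share of periods in the aggressive regime: {regime.mean():.2f}")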

  5. NEW CLASSIFICATION OF ECOPOLICES

    Directory of Open Access Journals (Sweden)

    VOROBYOV V. V.

    2016-09-01

    Full Text Available Problem statement. Ecopolises are the newest stage of urban planning. They have to be considered as material, energy and informational structures included in the dynamic-evolutionary matrix networks of exchange processes in ecosystems. However, no ecopolis classifications have been developed on the basis of such approaches, and this determines the topicality of the article. An analysis of publications on the theoretical and applied aspects of ecopolis formation showed that work on them is carried out mainly in the context of the latest scientific and technological achievements in various fields of knowledge. These settlements are technocratic. They are tied to the morphology of space and the network structures of regional and local natural ecosystems, lack independent stability, and cannot exist without continuous human support. In other words, they do not accord with the idea of an ecopolis. An objective, symbiotic search for an ecopolis concept, together with the development of its classifications, is therefore overdue. Purpose statement is to develop an objective rationale for ecopolises and to propose their new classification. Conclusion. The basis of an ecopolis classification should be the idea of correlating the elements of their general plans and the type of human activity with the natural mechanisms of receiving, processing and transmitting matter, energy and information between geo-ecosystems, the planet, man, the material part of the ecopolis and the Cosmos. A new ecopolis classification should be based on the principles of multi-dimensional, time-spaced symbiotic coherence with exchange ecosystem networks. With this approach the ecopolis function derives not from a subjective anthropocentric economy but from the holistic objective of the Genesis paradigm; or, in other words, not from the Consequence, but from the Cause.

  6. Classifications of track structures

    International Nuclear Information System (INIS)

    Paretzke, H.G.

    1984-01-01

    When ionizing particles interact with matter they produce random topological structures of primary activations which represent the initial boundary conditions for all subsequent physical, chemical and/or biological reactions. There are two important aspects of research on such track structures: their experimental or theoretical determination on the one hand, and the quantitative classification of these complex structures on the other, the latter being a basic prerequisite for understanding the mechanisms of radiation action. This paper deals only with the latter topic, i.e. the problems encountered in, and possible approaches to, the quantitative ordering and grouping of these multidimensional objects by their degrees of similarity with respect to their efficiency in producing certain final radiation effects, i.e. to their "radiation quality". Various attempts at taxonometric classification with respect to radiation efficiency have been made in basic and applied radiation research, including macro- and microdosimetric concepts as well as track entities and stopping-power-based theories. In this paper no review of those well-known approaches is given, but rather an outline and discussion of alternative methods new to this field of radiation research which have some very promising features and which could possibly solve at least some major classification problems.

  7. The book classification of William Torrey Harris: influences of Bacon and Hegel in library classification

    Directory of Open Access Journals (Sweden)

    Rodrigo de Sales

    2017-09-01

    Full Text Available The studies of library classification generally interact with the historical contextualization approach and with the classification ideas typical of Philosophy. In the 19th century, the North-American philosopher and educator William Torrey Harris developed a book classification at the St. Louis Public School, based on Francis Bacon and Georg Wilhelm Friedrich Hegel. The objective of this essay is to analyze Harris’s classification, reflecting upon his theoretical and philosophical backgrounds. To achieve such objective, this essay adopts a critical-descriptive approach for analysis. Results show some influences of Bacon and Hegel in Harris’s classification.

  8. The predisposition, infection, response and organ failure (PIRO) sepsis classification system: results of hospital mortality using a novel concept and methodological approach.

    Directory of Open Access Journals (Sweden)

    Cristina Granja

    Full Text Available INTRODUCTION: PIRO is a conceptual classification system in which a number of demographic, clinical, biological and laboratory variables are used to stratify patients with sepsis into categories with different outcomes, including mortality rates. OBJECTIVES: To identify variables to be included in each component of PIRO with the aim of improving hospital mortality prediction. METHODS: Patients were selected from the Portuguese ICU-admitted community-acquired sepsis study (SACiUCI). Variables concerning the R and O components included repeated measurements over the first five days of the ICU stay. The trends of these variables were summarized as the initial value at day 1 (D1) and the slope of the tendency over the five days, using a linear mixed model. Logistic regression models were built to assess the best set of covariates predicting hospital mortality. RESULTS: A total of 891 patients (age 60±17 years, 64% men, 38% hospital mortality) were studied. Factors significantly associated with mortality for the P component were gender, age, chronic liver failure, chronic renal failure and metastatic cancer; for the I component, positive blood cultures, guideline-concordant antibiotic therapy and health-care associated sepsis; for the R component, C-reactive protein slope, D1 heart rate, heart rate slope, D1 neutrophils and neutrophils slope; for the O component, D1 serum lactate, serum lactate slope, D1 SOFA and SOFA slope. The relative weight of each component of PIRO was calculated. The combination of these four results into a single-value predictor of hospital mortality presented an AUC-ROC of 0.84 (95% CI: 0.81-0.87) and a goodness-of-fit test (Hosmer and Lemeshow) of p = 0.368. CONCLUSIONS: We identified specific variables associated with each of the four components of PIRO, including biomarkers and a dynamic view of the patient's daily clinical course. This novel approach to the PIRO concept and overall score can be a better predictor of mortality for
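
    As a hedged, simplified analogue of the modelling strategy described above (not the SACiUCI analysis itself), the sketch below combines one illustrative covariate per PIRO component in a logistic regression for hospital mortality and evaluates it with the area under the ROC curve; all data are simulated.

      # Minimal sketch: logistic regression over illustrative P, I, R and O
      # covariates, scored by AUC-ROC on held-out data. Data are simulated.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(3)
      n = 891                                              # cohort size quoted in the abstract
      X = np.column_stack([
          rng.normal(60, 17, n),                           # P: age
          rng.integers(0, 2, n).astype(float),             # I: positive blood cultures (0/1)
          rng.normal(0, 1, n),                             # R: C-reactive protein slope
          rng.normal(2, 1, n),                             # O: day-1 serum lactate
      ])
      logit = -4 + 0.03 * X[:, 0] + 0.5 * X[:, 1] + 0.4 * X[:, 2] + 0.6 * X[:, 3]
      y = rng.binomial(1, 1 / (1 + np.exp(-logit)))        # simulated hospital mortality

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
      print(f"AUC-ROC on held-out data: {auc:.2f}")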

  9. The Displacement of Regimes of Action in the Armed Forces

    DEFF Research Database (Denmark)

    Holsting, Vilhelm Stefan

    The aim of this paper is to explore how one might approach the profession of military command using the concepts of regimes of justification by Luc Boltanski and Laurent Thévenot in an analysis of the contemporary disputes and tensions between political and professional actors and in a time where...... of their contemporary role, responsibility and challenges) and an analysis of the displacement of military regimes in the light of economic and security changes at the societal level. I am arguing that the entrance of new regimes of justification are challenging and displacing the traditional professional justificatory...

  10. TYPES OF POLITICAL REGIMES IN THE IRKUTSK REGION

    Directory of Open Access Journals (Sweden)

    И В Орлова

    2017-12-01

    Full Text Available The authors consider contemporary Western and Russian classifications of regional political regimes and their applicability to Russia. Based on an analysis of political theories, the authors chose the traditional typology of regional political regimes focusing on the minimalist interpretation of democracy (electoral competition) and the methods for identifying regional scenarios introduced by V.Ya. Gelman. The authors study the case of the Irkutsk Region as a region with conflicting elites, in which several regional heads were replaced within a short period. Drawing on contemporary political history, the authors analyze the regional political regime using the following criteria: democracy/autocracy, consolidation/oligopoly, and compromise/conflict relations within the ruling elite. The results of the analysis prove the existence of checks and balances in the political system of the Irkutsk Region. Such a system restrains strong politicians' attempts to monopolize political power in the region. When any political player gains too much influence, other centers of power unite against him and together return the situation to the status quo. The political regime of the Irkutsk Region ensures a relatively high level of political competition; at the same time it is a part of the uncompetitive political regime of the Russian Federation, and is therefore a ‘hybrid democracy’. The authors’ analysis of intra-elite relations in the region revealed a high predisposition to conflict, with the dominant scenario being a ‘war of all against all’.

  11. Transition from weak wave turbulence regime to solitonic regime

    Science.gov (United States)

    Hassani, Roumaissa; Mordant, Nicolas

    2017-11-01

    The Weak Turbulence Theory (WTT) is a statistical theory describing the interaction of a large ensemble of random waves characterized by very different length scales. When both non-linearity and dispersion are weak, a different regime is predicted in which solitons propagate while keeping their shape unchanged. The question under investigation here is which regime the system chooses: weak turbulence or a soliton gas? We report an experimental investigation of wave turbulence at the surface of finite-depth water in the gravity-capillary range. We tune the wave dispersion and the level of nonlinearity by modifying the depth of water and the forcing, respectively. We use space-time resolved profilometry to reconstruct the deformed surface of the water. When decreasing the water depth, we observe a drastic transition between weak turbulence at the weakest forcing and a solitonic regime at stronger forcing. We characterize the transition between both states by studying their Fourier spectra. We also study the efficiency of energy transfer in the weak turbulence regime. We report a loss of efficiency of angular transfer as the dispersion of the waves is reduced, until the system bifurcates into the solitonic regime. This project has received funding from the European Research Council (ERC, Grant Agreement No. 647018-WATU).

  12. Classification in context

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper surveys classification research literature, discusses various classification theories, and shows that the focus has traditionally been on establishing a scientific foundation for classification research. This paper argues that a shift has taken place, and suggests that contemporary...... classification research focus on contextual information as the guide for the design and construction of classification schemes....

  13. Classification of the web

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. This paper suggests that the classification of the Web meets challenges...... that call for inquiries into the theoretical foundation of bibliographic classification theory....

  14. An Accurate CT Saturation Classification Using a Deep Learning Approach Based on Unsupervised Feature Extraction and Supervised Fine-Tuning Strategy

    Directory of Open Access Journals (Sweden)

    Muhammad Ali

    2017-11-01

    Full Text Available Current transformer (CT) saturation is one of the significant problems for protection engineers. If CT saturation is not tackled properly, it can have a disastrous effect on the stability of the power system and may even cause a complete blackout. To cope with CT saturation properly, accurate detection or classification must come first. Recently, deep learning (DL) methods have brought a revolution to the field of artificial intelligence (AI). This paper presents a new DL classification method based on an unsupervised feature extraction and supervised fine-tuning strategy to classify the saturated and unsaturated regions in case of CT saturation. In other words, if the protection system is subjected to CT saturation, the proposed method will correctly classify the different levels of saturation with high accuracy. Traditional AI methods are mostly based on supervised learning and rely heavily on human-crafted features. This paper contributes an unsupervised feature extraction, using autoencoders and deep neural networks (DNNs), to extract features automatically without prior knowledge of the optimal features. To validate the effectiveness of the proposed method, a variety of simulation tests are conducted, and the classification results are analyzed using standard classification metrics. Simulation results confirm that the proposed method classifies the different levels of CT saturation with remarkable accuracy and has unique feature extraction capabilities. Lastly, we provide a potential future research direction to conclude this paper.
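
    The following is a minimal sketch of the two-stage strategy named in the abstract (unsupervised autoencoder pretraining followed by supervised fine-tuning), not the authors' network: layer sizes, the feature dimension and the number of saturation classes are illustrative assumptions, and random arrays stand in for CT waveform features.

      # Hedged sketch: autoencoder pretraining on unlabeled data, then a softmax
      # classifier fine-tuned on labeled data reusing the trained encoder.
      import numpy as np
      from tensorflow import keras
      from tensorflow.keras import layers

      n_features, n_classes = 64, 3                        # assumed feature length and saturation levels
      x_unlabeled = np.random.rand(1000, n_features)       # placeholder unlabeled samples
      x_labeled = np.random.rand(200, n_features)          # placeholder labeled samples
      y_labeled = np.random.randint(0, n_classes, 200)

      # 1) Unsupervised feature extraction: train an autoencoder to reconstruct its input.
      inputs = keras.Input(shape=(n_features,))
      encoded = layers.Dense(32, activation="relu")(inputs)
      encoded = layers.Dense(16, activation="relu")(encoded)
      decoded = layers.Dense(32, activation="relu")(encoded)
      decoded = layers.Dense(n_features, activation="linear")(decoded)
      autoencoder = keras.Model(inputs, decoded)
      autoencoder.compile(optimizer="adam", loss="mse")
      autoencoder.fit(x_unlabeled, x_unlabeled, epochs=10, batch_size=32, verbose=0)

      # 2) Supervised fine-tuning: attach a softmax head to the pretrained encoder.
      clf_out = layers.Dense(n_classes, activation="softmax")(encoded)
      classifier = keras.Model(inputs, clf_out)
      classifier.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                         metrics=["accuracy"])
      classifier.fit(x_labeled, y_labeled, epochs=10, batch_size=32, verbose=0)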

  15. An efficient swarm intelligence approach to feature selection based on invasive weed optimization: Application to multivariate calibration and classification using spectroscopic data

    Science.gov (United States)

    Sheykhizadeh, Saheleh; Naseri, Abdolhossein

    2018-04-01

    Variable selection plays a key role in classification and multivariate calibration. Variable selection methods aim at choosing a set of variables, from a large pool of available predictors, that is relevant to estimating analyte concentrations or to achieving better classification results. Many variable selection techniques have been introduced, among which those based on swarm intelligence optimization methodologies have attracted particular attention over the last few decades, since they are mainly inspired by nature. In this work, a simple new variable selection algorithm is proposed based on the invasive weed optimization (IWO) concept. IWO is a bio-inspired metaheuristic mimicking the ecological behavior of weeds in colonizing and finding an appropriate place for growth and reproduction; it has been shown to be very adaptive and robust to environmental changes. In this paper, the first application of IWO, as a very simple and powerful method, to variable selection is reported using different experimental datasets, including FTIR and NIR data, in order to undertake classification and multivariate calibration tasks. Accordingly, invasive weed optimization - linear discriminant analysis (IWO-LDA) and invasive weed optimization - partial least squares (IWO-PLS) are introduced for multivariate classification and calibration, respectively.
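
    Since IWO itself is only described in outline above, the following bare-bones sketch shows the core loop of the algorithm (fitness-proportional seeding, dispersal with a shrinking standard deviation, and competitive exclusion); the toy sphere objective stands in for a real feature-selection fitness such as cross-validated classification error over the selected variables, and all parameter values are assumptions.

      # Bare-bones invasive weed optimization (IWO) loop on a toy objective.
      import numpy as np

      rng = np.random.default_rng(4)

      def fitness(x):
          return -np.sum(x ** 2)        # toy objective to maximize (sphere function)

      dim, pop_max, iters = 10, 30, 100
      s_min, s_max = 1, 5               # min/max seeds per weed
      sigma_init, sigma_final, n_mod = 1.0, 0.01, 3.0

      population = list(rng.uniform(-5, 5, (10, dim)))
      for it in range(iters):
          # dispersal radius shrinks nonlinearly as iterations progress
          sigma = ((iters - it) / iters) ** n_mod * (sigma_init - sigma_final) + sigma_final
          fits = np.array([fitness(w) for w in population])
          f_lo, f_hi = fits.min(), fits.max()
          offspring = []
          for w, f in zip(population, fits):
              share = 0.5 if f_hi == f_lo else (f - f_lo) / (f_hi - f_lo)
              n_seeds = int(round(s_min + share * (s_max - s_min)))    # fitter weeds seed more
              offspring.extend(w + sigma * rng.normal(size=dim) for _ in range(n_seeds))
          population.extend(offspring)
          population.sort(key=fitness, reverse=True)                   # competitive exclusion
          population = population[:pop_max]

      print(f"best objective value found: {fitness(population[0]):.4f}")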

  16. Defining active sacroiliitis on magnetic resonance imaging (MRI) for classification of axial spondyloarthritis: a consensual approach by the ASAS/OMERACT MRI group

    NARCIS (Netherlands)

    Rudwaleit, M.; Jurik, A. G.; Hermann, K.-G. A.; Landewé, R.; van der Heijde, D.; Baraliakos, X.; Marzo-Ortega, H.; Ostergaard, M.; Braun, J.; Sieper, J.

    2009-01-01

    Magnetic resonance imaging (MRI) of sacroiliac joints has evolved as the most relevant imaging modality for diagnosis and classification of early axial spondyloarthritis (SpA) including early ankylosing spondylitis. To identify and describe MRI findings in sacroiliitis and to reach consensus on

  17. Detecting spatial regimes in ecosystems | Science Inventory ...

    Science.gov (United States)

    Research on early warning indicators has generally focused on assessing temporal transitions, with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory based method, on both terrestrial and aquatic animal data (US Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps, and multivariate analyses such as nMDS (non-metric Multidimensional Scaling) and cluster analysis. We successfully detect spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.
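
    Fisher information estimation as used in that work is involved; as a simpler stand-in that conveys the boundary-detection idea, the sketch below scores community turnover along a simulated spatial transect with the Bray-Curtis dissimilarity between adjacent windows and flags the strongest peak as a candidate regime boundary. The data, the window size and the substitution of Bray-Curtis for Fisher information are all assumptions of this illustration.

      # Simplified stand-in for spatial regime boundary detection: sliding-window
      # community turnover along a transect (Bray-Curtis, not Fisher information).
      import numpy as np
      from scipy.spatial.distance import braycurtis

      rng = np.random.default_rng(5)
      n_sites, n_species, window = 100, 20, 5

      # simulate two spatial regimes: species composition shifts halfway along the transect
      abund = np.vstack([rng.poisson(5, (n_sites // 2, n_species)),
                         rng.poisson(5, (n_sites // 2, n_species)) + np.arange(n_species)])

      turnover = []
      for i in range(window, n_sites - window):
          left = abund[i - window:i].mean(axis=0)          # mean community before site i
          right = abund[i:i + window].mean(axis=0)         # mean community after site i
          turnover.append(braycurtis(left, right))

      boundary = window + int(np.argmax(turnover))
      print(f"strongest candidate boundary near site {boundary}")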

  18. Hazard classification methodology