Flow Regime Identification of Co-Current Downward Two-Phase Flow With Neural Network Approach
International Nuclear Information System (INIS)
Hiroshi Goda; Seungjin Kim; Ye Mi; Finch, Joshua P.; Mamoru Ishii; Jennifer Uhle
2002-01-01
Flow regime identification for an adiabatic, vertical, co-current downward air-water two-phase flow in 25.4 mm ID and 50.8 mm ID round tubes was performed by employing an impedance void meter coupled with a neural network classification approach. This approach minimizes subjective judgment in determining the flow regimes. The signals obtained by the impedance void meter were used to train a self-organizing neural network to categorize the impedance signals into a set number of groups. The characteristic parameters input to the neural network classifier were the mean, standard deviation and skewness of the impedance signals. The classification categories adopted in the present investigation were four widely accepted flow regimes, viz. bubbly, slug, churn-turbulent, and annular flows. These four flow regimes were recognized based upon the conventional flow visualization approach using a high-speed motion analyzer. The resulting flow regime maps classified by the neural network were compared with the results obtained through the flow visualization method, demonstrating the efficiency of the neural network classification for flow regime identification. (authors)
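As a sketch of the pipeline this abstract describes, the code below computes the three named statistical features (mean, standard deviation, skewness) from synthetic impedance-like signals and groups them with a minimal one-dimensional self-organizing map. The signal shapes, map size, and training constants are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from scipy.stats import skew

def signal_features(sig):
    """Mean, standard deviation and skewness of an impedance signal."""
    return np.array([sig.mean(), sig.std(), skew(sig)])

class SOM1D:
    """Minimal one-dimensional self-organizing map (illustrative only)."""
    def __init__(self, n_units, dim, seed=0):
        self.w = np.random.default_rng(seed).normal(size=(n_units, dim))

    def train(self, X, epochs=200, lr0=0.5, sigma0=1.0):
        for t in range(epochs):
            frac = 1.0 - t / epochs
            lr, sigma = lr0 * frac, max(sigma0 * frac, 0.1)
            for x in X:
                # Best-matching unit, then neighborhood-weighted update.
                bmu = np.argmin(((self.w - x) ** 2).sum(axis=1))
                dist = np.abs(np.arange(len(self.w)) - bmu)
                h = np.exp(-dist ** 2 / (2 * sigma ** 2))
                self.w += lr * h[:, None] * (x - self.w)

    def map(self, X):
        return np.array([np.argmin(((self.w - x) ** 2).sum(axis=1)) for x in X])

# Synthetic "bubbly" (low, steady void) vs "slug" (high, oscillating) signals.
rng = np.random.default_rng(1)
bubbly = [rng.normal(0.1, 0.02, 500) for _ in range(20)]
slug = [0.5 + 0.4 * np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 0.05, 500)
        for _ in range(20)]
X = np.array([signal_features(s) for s in bubbly + slug])
X = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize the features
som = SOM1D(n_units=4, dim=3)
som.train(X)
labels = som.map(X)
```

In the paper the resulting groups are then matched against regimes identified by flow visualization; here the two synthetic families simply land on different map units.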
De Facto Exchange Rate Regime Classifications Are Better Than You Think
Michael Bleaney; Mo Tian; Lin Yin
2015-01-01
Several de facto exchange rate regime classifications have been widely used in empirical research, but they are known to disagree with one another to a disturbing extent. We dissect the algorithms employed and argue that they can be significantly improved. We implement the improvements, and show that there is a far higher agreement rate between the modified classifications. We conclude that the current pessimism about de facto exchange rate regime classification schemes is unwarranted.
Flow regime classification in air-magnetic fluid two-phase flow.
Kuwahara, T; De Vuyst, F; Yamaguchi, H
2008-05-21
A new experimental/numerical technique for classifying flow regimes (flow patterns) in air-magnetic fluid two-phase flow is proposed in the present paper. The proposed technique utilizes electromagnetic induction to obtain time-series signals of the electromotive force, allowing a non-contact measurement. Firstly, an experiment is carried out to obtain the time-series signals in a vertical upward air-magnetic fluid two-phase flow. The signals obtained are first processed using two kinds of wavelet transforms; the processed data sets are then used as input vectors for an artificial neural network (ANN) with supervised training. In the present study, flow regimes are classified into bubbly, slug, churn and annular flows, which are generally the main flow regimes. To validate the flow regimes, a visualization experiment is also performed with a glycerin solution that has roughly the same physical properties, i.e., kinematic viscosity and surface tension, as the magnetic fluid used in the present study. The flow regimes from the visualization are used as targets for the ANN and also in estimating the accuracy of the present method. As a result, ANNs using radial basis functions are shown to be the most appropriate for the present classification of flow regimes, leading to small classification errors.
Energy Technology Data Exchange (ETDEWEB)
Gallego, C. J.
2010-03-08
This technical report is focused on the analysis of stochastic processes that switch between different dynamics (also called regimes or mechanisms) over time. So-called switching-regime models consider several underlying functions instead of one. In this case, a classification problem arises, as the current regime has to be assessed at each time step. Identifying the regimes allows regime-switching models to be used for short-term forecasting purposes. Within this framework, identifying the different regimes exhibited by time series is the aim of this work. The proposed approach is based on a statistical tool called the Gamma test. One of the main advantages of this methodology is that no mathematical definition of the different underlying functions is required. Applications with both simulated and real wind power data have been considered. Results on simulated time series show that regimes can be successfully identified under certain hypotheses. Nevertheless, this work highlights that further research is needed for real wind power time series, which usually show different behaviours (e.g. fluctuations or ramps followed by low-variance periods). A better understanding of these events will eventually improve wind power forecasting. (Author) 15 refs.
Directory of Open Access Journals (Sweden)
Fernán Sáenz
2015-04-01
Based on the potential of the weather-type classification method to study synoptic features, this study applies that methodology to identify the main large-scale patterns related to weather in Central America. Using ERA-Interim low-level winds in a domain that encompasses the Intra-Americas Sea, the eastern tropical Pacific, southern North America, Central America and northern South America, the K-means clustering algorithm was applied to find recurrent regimes of low-level winds. Eleven regimes were identified, and good coherency was found between the results and known features of the regional circulation. It was determined that the main large-scale patterns can be either locally forced or a response to tropical-extratropical interactions; local forcing dominates the summer regimes, whereas mid-latitude interactions drive the winter regimes. The study of the relationship between the large-scale patterns and regional precipitation shows that the winter regimes are related to the Caribbean-Pacific precipitation seesaw. The summer regimes, on the other hand, enhance the contrasting Caribbean-Pacific precipitation distribution as a function of the dominant regime. A strong influence of ENSO on the frequency and duration of the regimes was found; the specific effect of ENSO on a regime depends on whether its circulation is locally forced or led by the interaction between the tropics and the mid-latitudes. The study of cold surges using the information from the identified regimes revealed that three regimes can be linked with the occurrence of cold surges that affect Central America and its precipitation. As the winter regimes are largely dependent on mid-latitude interaction with the tropics, the effect that ENSO has on the jet stream is reflected in the winter regimes. An automated analysis of large scale conditions based on reanalysis and/or model data seems useful for both dynamical
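The regime-identification step described here can be sketched with K-means on flattened wind fields. The grid, the two idealized regimes (easterly trades vs. a northerly surge), and all magnitudes below are invented stand-ins for the ERA-Interim data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_days, ny, nx = 200, 8, 10

def wind_day(u0, v0):
    """One day of (u, v) winds on a small grid, with weather noise."""
    u = u0 + rng.normal(0, 0.5, (ny, nx))
    v = v0 + rng.normal(0, 0.5, (ny, nx))
    return np.concatenate([u.ravel(), v.ravel()])   # flatten to one vector

# Two idealized regimes: easterly trades vs. a northerly cold-surge flow.
days = np.array([wind_day(-6, 0) for _ in range(n_days // 2)] +
                [wind_day(-1, -8) for _ in range(n_days // 2)])
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(days)
labels = km.labels_
```

The real analysis finds eleven clusters; with only two well-separated synthetic regimes, K-means recovers the day-to-regime assignment exactly.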
International Nuclear Information System (INIS)
Abbagoni, Baba Musa; Yeung, Hoi
2016-01-01
The identification of flow pattern is a key issue in multiphase flow, which is encountered in the petrochemical industry. It is difficult to identify gas–liquid flow regimes objectively in gas–liquid two-phase flow. This paper presents the feasibility of a clamp-on instrument for objective flow regime classification of two-phase flow using an ultrasonic Doppler sensor and an artificial neural network, which records and processes the ultrasonic signals reflected from the two-phase flow. Experimental data is obtained on a horizontal test rig with a total pipe length of 21 m and 5.08 cm internal diameter carrying air-water two-phase flow under slug, elongated bubble, stratified-wavy and stratified flow regimes. Multilayer perceptron neural networks (MLPNNs) are used to develop the classification model. The classifier requires input features that are representative of the signals. Ultrasound signal features are extracted by applying both power spectral density (PSD) and discrete wavelet transform (DWT) methods to the flow signals. A '1-of-C' coding scheme was adopted to classify the extracted features into one of four flow regime categories. To improve the performance of the flow regime classifier, a second-level neural network was incorporated, using the output of the first-level network as an input feature. Combining the two network models produced a neural network model with higher accuracy than the single models. Classification accuracies are evaluated for both the PSD and the DWT features. The success rates of the two models are: (1) using PSD features, the classifier missed 3 of 24 test datasets, scoring 87.5% accuracy; (2) with the DWT features, the network misclassified only one data point, classifying the flow patterns with up to 95.8% accuracy. This approach has demonstrated the
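A minimal version of the PSD-feature branch of this classifier might look as follows; the synthetic Doppler-like signals, the band count, and the network size are our assumptions, not the instrument's actual output or the paper's architecture.

```python
import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
fs = 1000.0

def doppler_like(f0):
    """Synthetic stand-in for an ultrasonic Doppler record with a dominant
    frequency f0 (real data would come from the clamp-on sensor)."""
    t = np.arange(2048) / fs
    return np.sin(2 * np.pi * f0 * t) + 0.3 * rng.normal(size=t.size)

def psd_features(x, n_bands=8):
    """Band-averaged power spectral density as a compact feature vector."""
    _, pxx = welch(x, fs=fs, nperseg=256)
    bands = np.array_split(pxx, n_bands)
    feats = np.array([b.mean() for b in bands])
    return feats / feats.sum()                 # normalize total power

# Two of the four regimes, given distinct spectral signatures.
X = np.array([psd_features(doppler_like(f)) for f in [20] * 30 + [200] * 30])
y = np.array([0] * 30 + [1] * 30)              # class indices (1-of-C targets)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)
```

The paper additionally feeds first-level outputs into a second-level network; the single MLP above already separates these two clean synthetic classes.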
A New Classification Approach Based on Multiple Classification Rules
Zhongmei Zhou
2014-01-01
A good classifier can correctly predict new data for which the class label is unknown, so it is important to construct a high-accuracy classifier. Hence, classification techniques are very useful in ubiquitous computing. Associative classification achieves higher classification accuracy than some traditional rule-based classification approaches. However, the approach also has two major deficiencies. First, it generates a very large number of association classification rules, especially when t...
Automatic Classification of Offshore Wind Regimes With Weather Radar Observations
DEFF Research Database (Denmark)
Trombe, Pierre-Julien; Pinson, Pierre; Madsen, Henrik
2014-01-01
Weather radar observations are called to play an important role in offshore wind energy. In particular, they can enable the monitoring of weather conditions in the vicinity of large-scale offshore wind farms and thereby notify the arrival of precipitation systems associated with severe wind … and amplitude) using reflectivity observations from a single weather radar system. A categorical sequence of most likely wind regimes is estimated from a wind speed time series by combining a Markov-switching model and a global decoding technique, the Viterbi algorithm. In parallel, attributes of precipitation … systems are extracted from weather radar images. These attributes describe the global intensity, spatial continuity and motion of precipitation echoes on the images. Finally, a CART classification tree is used to find the broad relationships between precipitation attributes and wind regimes …
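The decoding step named above, the Viterbi algorithm applied to a regime sequence, can be implemented compactly. The two-regime Gaussian observation model and all parameter values below are illustrative, not the study's fitted Markov-switching model.

```python
import numpy as np

def viterbi(log_pi, log_A, log_B):
    """Most likely hidden regime sequence.
    log_pi: (K,) initial log-probs; log_A: (K, K) transition log-probs;
    log_B: (T, K) per-step observation log-likelihoods."""
    T, K = log_B.shape
    delta = log_pi + log_B[0]
    psi = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_A        # scores[i, j]: regime i -> j
        psi[t] = scores.argmax(axis=0)         # best predecessor of each j
        delta = scores.max(axis=0) + log_B[t]
    path = np.empty(T, dtype=int)
    path[-1] = delta.argmax()
    for t in range(T - 2, -1, -1):             # backtrack the best path
        path[t] = psi[t + 1, path[t + 1]]
    return path

# Two wind regimes with different mean speeds; Gaussian observation model.
means, sd = np.array([5.0, 12.0]), 1.5
obs = np.concatenate([np.full(30, 5.0), np.full(30, 12.0)]) \
      + np.random.default_rng(0).normal(0, 1.0, 60)
log_B = -0.5 * ((obs[:, None] - means) / sd) ** 2    # up to a constant
log_A = np.log(np.array([[0.95, 0.05], [0.05, 0.95]]))
log_pi = np.log(np.array([0.5, 0.5]))
path = viterbi(log_pi, log_A, log_B)
```

The sticky transition matrix (0.95 on the diagonal) is what makes this "global" decoding smooth out isolated noisy observations, as opposed to per-step classification.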
Synergies between nonproliferation regimes: A pragmatic approach
International Nuclear Information System (INIS)
Findlay, Trevor; Meier, Oliver
2001-01-01
With the recent progress in establishing international nonproliferation regimes, the question of synergies between different verification and monitoring regimes is becoming more acute. Three multilateral and universal nonproliferation organisations, covering safeguards on civil nuclear materials, nuclear testing, and chemical weapons, are up and running. A regime on biological weapons is under negotiation. Several regional organisations concerned with monitoring nonproliferation commitments in the nuclear field are in place; others are being established. Past discussions of synergies between these regimes have suffered from being too far-reaching, and have often not adequately reflected the political difficulties of cooperation between regimes with different membership, scope and institutional set-up. This paper takes a pragmatic look at exploiting synergies and identifies some potential and real overlaps in the work of different verification regimes. It argues for a bottom-up approach and identifies building blocks for collaboration between verification regimes. By realising this more limited potential for cooperation, the ground could be prepared for exploiting other synergies between these regimes. (author)
Exchange rate regimes and monetary arrangements
Directory of Open Access Journals (Sweden)
Ivan Ribnikar
2005-06-01
There is a close relationship between a country's exchange rate regime and its monetary arrangement, so if we are to examine monetary arrangements then exchange rate regimes must first be analysed. Within the conventional and most widely used classification of exchange rate regimes into rigid and flexible, or into polar regimes (hard peg and float) on one side and intermediate regimes on the other, there is a much greater variety among the intermediate regimes. A more precise and, as will be seen, more useful classification of exchange rate regimes is the first topic of the paper. The second topic is how exchange rate regimes influence or determine monetary arrangements and monetary policy or monetary policy regimes: monetary autonomy versus monetary non-autonomy, and discretion in monetary policy versus commitment in monetary policy. Both topics are important for countries on their path to the EU and the euro area.
Directory of Open Access Journals (Sweden)
Xin-Gang Dai
2017-03-01
This study aims to develop a large-scale climate classification for investigating the characteristics of the climate regimes around the Tibetan Plateau based on seasonal precipitation, moisture transport and moisture divergence, using in situ observations and ERA40 reanalysis data. The results indicate that the climate around the Plateau can be attributed to four regimes, situated in East Asia, South Asia, Central Asia and the semi-arid zone in northern Central Asia throughout the dryland of northwestern China, in addition to the Köppen climate classification. There are different collocations of seasonal temperature and precipitation: (1) in phase for the East and South Asia monsoon regimes, (2) anti-phase for the Central Asia regime, and (3) out-of-phase for the westerly regime. The seasonal precipitation concentrations are coupled with moisture divergence, i.e., moisture convergence coincides with the Asian monsoon zone while divergence appears over the Mediterranean-like arid climate region and the westerly-controlled area in the warm season; the pattern reverses in the cold season. In addition, moisture divergence is associated with meridional moisture transport. Northward/southward moisture transport corresponds to moisture convergence/divergence, indicating that the wet and dry seasons are, to a great extent, dominated by meridional moisture transport in these regions. The climate-mean southward transport results in the dry-cold season of the Asian monsoon zone and the dry-warm season, leading to desertification or land degradation in Central Asia and the westerly regime zone. The mean-wind moisture transport (MMT) is the major contributor to total moisture transport, while persistent northward transient eddy moisture transport (TEMT) plays a key role in dry season precipitation, especially in the Asian monsoon zone. The persistent TEMT divergence is an additional mechanism of the out-of-phase collocation in the westerly regime zone. In addition
Directory of Open Access Journals (Sweden)
M. Sivapalan
2012-11-01
Predictions of hydrological responses in ungauged catchments can benefit from a classification scheme that can organize and pool together catchments that exhibit a level of hydrologic similarity, especially similarity in some key variable or signature of interest. Since catchments are complex systems with a level of self-organization arising from the co-evolution of climate and landscape properties, including vegetation, there is much to be gained from developing a classification system based on a comparative study of a population of catchments across climatic and landscape gradients. The focus of this paper is on climate seasonality and the seasonal runoff regime, as characterized by the ensemble mean of within-year variation of climate and runoff. The work on regime behavior is part of an overall study of the physical controls on regional patterns of flow duration curves (FDCs), motivated by the fact that regime behavior leaves a major imprint upon the shape of FDCs, especially their slope. As an exercise in comparative hydrology, the paper seeks to assess the regime behavior of 428 catchments from the MOPEX database simultaneously, classifying and regionalizing them into homogeneous or hydrologically similar groups. A decision tree is developed on the basis of a metric chosen to characterize similarity of regime behavior, using a variant of the Iterative Dichotomiser 3 (ID3) algorithm to form a classification tree and associated catchment classes. In this way, several classes of catchments are distinguished, in which the connection between the catchments' regime behavior and climate and catchment properties becomes clearer. Only four similarity indices are entered into the algorithm, all of which are obtained from smoothed daily regime curves of climatic variables and runoff. Results demonstrate that climate seasonality plays the most significant role in the classification of US catchments, with rainfall timing and climatic aridity index
International Nuclear Information System (INIS)
Kh'yuitt, G.
1980-01-01
An introduction to the problem of two-phase flows is presented. Flow regimes arising in two-phase flows are described, and a classification of these regimes is given. The structures of vertical and horizontal two-phase flows and a method of their identification using regime maps are considered, and the limits of this method's application are discussed. The flooding and flow reversal phenomena, the interrelation of these phenomena, and the transitions from the slug regime to the churn regime and from the churn regime to the annular regime in vertical flows are described. Problems of phase transitions and equilibrium are discussed. Flow regimes in tubes carrying evaporating liquid are also described.
Effective Exchange Rate Classifications and Growth
Justin M. Dubas; Byung-Joo Lee; Nelson C. Mark
2005-01-01
We propose an econometric procedure for obtaining de facto exchange rate regime classifications which we apply to study the relationship between exchange rate regimes and economic growth. Our classification method models the de jure regimes as outcomes of a multinomial logit choice problem conditional on the volatility of a country's effective exchange rate, a bilateral exchange rate and international reserves. An `effective' de facto exchange rate regime classification is then obtained by as...
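The multinomial-logit step of such a classification can be sketched as below; the feature names (exchange-rate volatility, reserve changes) follow the abstract, but all numbers are synthetic and the three regime labels (peg / intermediate / float) are a simplification of the paper's regime set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 300
# Hypothetical country-year features: effective-exchange-rate volatility and
# international-reserve changes (values invented, not the paper's data).
vol = np.concatenate([rng.normal(0.2, 0.1, n),    # peg: low volatility
                      rng.normal(2.0, 0.4, n),    # intermediate
                      rng.normal(6.0, 1.0, n)])   # float: high volatility
reserves = np.concatenate([rng.normal(3.0, 1.0, n),
                           rng.normal(1.5, 1.0, n),
                           rng.normal(0.3, 1.0, n)])
X = np.column_stack([vol, reserves])
y = np.repeat([0, 1, 2], n)                       # peg / intermediate / float
logit = LogisticRegression(max_iter=1000).fit(X, y)
probs = logit.predict_proba(X)                    # regime choice probabilities
```

In the paper the fitted choice probabilities, conditioned on volatilities and reserves, are what define the "effective" de facto classification.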
Knowledge-based approach to video content classification
Chen, Yu; Wong, Edward K.
2001-01-01
A framework for video content classification using a knowledge-based approach is proposed. This approach is motivated by the fact that videos are rich in semantic content, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand sides of rules contain high-level and low-level features, while the right-hand sides contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.
The decision tree approach to classification
Wu, C.; Landgrebe, D. A.; Swain, P. H.
1975-01-01
A class of multistage decision tree classifiers is proposed and studied relative to the classification of multispectral remotely sensed data. The decision tree classifiers are shown to have the potential for improving both the classification accuracy and the computation efficiency. Dimensionality in pattern recognition is discussed and two theorems on the lower bound of logic computation for multiclass classification are derived. The automatic or optimization approach is emphasized. Experimental results on real data are reported, which clearly demonstrate the usefulness of decision tree classifiers.
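A modern stand-in for a multistage decision tree classifier on multispectral-like data might look as follows (scikit-learn's CART-style tree rather than the paper's 1975 design; the band reflectance values are invented):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def band_samples(means, n=100):
    """n synthetic 4-band 'pixels' scattered around per-band means."""
    return rng.normal(means, 0.1, size=(n, len(means)))

# Three hypothetical cover classes with distinct spectral signatures.
water  = band_samples([0.1, 0.2, 0.1, 0.05])
forest = band_samples([0.3, 0.5, 0.2, 0.60])
urban  = band_samples([0.7, 0.6, 0.8, 0.40])
X = np.vstack([water, forest, urban])
y = np.repeat([0, 1, 2], 100)

# Each internal node tests one band threshold: a multistage decision.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
```

The computational appeal the abstract mentions still holds: classifying a pixel costs at most `max_depth` threshold comparisons, regardless of the number of bands.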
An ordinal classification approach for CTG categorization.
Georgoulas, George; Karvelis, Petros; Gavrilis, Dimitris; Stylios, Chrysostomos D; Nikolakopoulos, George
2017-07-01
Evaluation of the cardiotocogram (CTG) is a standard approach employed during pregnancy and delivery, but its interpretation requires high-level expertise to decide whether the recording is Normal, Suspicious or Pathological. Therefore, a number of attempts have been made over the past three decades to develop automated, sophisticated systems. These are usually (multiclass) classification systems that assign a category to the respective CTG; however, most of them do not take into consideration the natural ordering of the categories associated with CTG recordings. In this work, an algorithm that explicitly takes into consideration the ordering of CTG categories, based on a binary decomposition method, is investigated. Results achieved using the C4.5 decision tree as the base classifier show that the ordinal classification approach is marginally better than the traditional multiclass classification approach, which utilizes the standard C4.5 algorithm, for several performance criteria.
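The binary-decomposition idea (train one model per threshold for P(y > k), then difference the cumulative probabilities) can be sketched as below. A scikit-learn decision tree stands in for C4.5, and the data are synthetic; only the decomposition scheme follows the approach described.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class OrdinalClassifier:
    """Ordinal classification via binary decomposition: one binary model per
    threshold P(y > k); any probabilistic base classifier can be plugged in."""
    def __init__(self, base=lambda: DecisionTreeClassifier(max_depth=3,
                                                           random_state=0)):
        self.base = base

    def fit(self, X, y):
        self.classes_ = np.sort(np.unique(y))
        # One binary problem per ordered threshold (all but the top class).
        self.models_ = [self.base().fit(X, (y > k).astype(int))
                        for k in self.classes_[:-1]]
        return self

    def predict(self, X):
        # P(y > k) for each threshold, then P(y = k) by differencing.
        gt = np.column_stack([m.predict_proba(X)[:, 1] for m in self.models_])
        probs = np.column_stack([1 - gt[:, 0],
                                 *(gt[:, i] - gt[:, i + 1]
                                   for i in range(gt.shape[1] - 1)),
                                 gt[:, -1]])
        return self.classes_[probs.argmax(axis=1)]

rng = np.random.default_rng(0)
# Synthetic ordered classes (think Normal < Suspicious < Pathological).
X = np.vstack([rng.normal(m, 0.5, (60, 2)) for m in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 60)
ord_clf = OrdinalClassifier().fit(X, y)
```

The point of the scheme is that each binary model only ever separates "at most k" from "above k", so the class ordering is respected by construction.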
Regime identification in ASDEX Upgrade
International Nuclear Information System (INIS)
Giannone, L; Sips, A C C; Kardaun, O; Spreitler, F; Suttrop, W
2004-01-01
The ability to recognize the transition from L-mode to H-mode, or from H-mode to the improved H-mode, reliably and from a conveniently small number of measurements in real time is of increasing importance for machine control. Discriminant analysis has been applied to regime identification of plasma discharges in the ASDEX Upgrade tokamak. An observation consists of a set of plasma parameters averaged over a time slice in a discharge; the data set consists of all observations over different discharges and time slices. Discriminant analysis yields coefficients allowing the classification of a new observation. The results of a frequentist and a formal Bayesian approach to discriminant analysis are compared. With five plasma variables, a failure rate of 1.3% was achieved for discriminating the L-mode from the H-mode confinement regime, and a failure rate of 5.3% for discriminating the H-mode from the improved H-mode regime. The coefficients derived by discriminant analysis have subsequently been applied to discharges to illustrate the operation of regime identification in a real-time control system.
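As an illustration of the discriminant-analysis step, the sketch below fits linear discriminant analysis to two synthetic five-variable classes and reports a failure rate. The variables are stand-ins, not the actual ASDEX Upgrade plasma parameters.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 200
# Five hypothetical per-time-slice plasma variables per observation;
# the two classes play the roles of L-mode and H-mode.
L_mode = rng.normal(0.0, 1.0, (n, 5))
H_mode = rng.normal(1.5, 1.0, (n, 5))
X = np.vstack([L_mode, H_mode])
y = np.repeat([0, 1], n)

lda = LinearDiscriminantAnalysis().fit(X, y)
failure_rate = 1.0 - lda.score(X, y)       # fraction of misclassified slices
```

The fitted coefficients (`lda.coef_`) are the analogue of the coefficients the paper deploys in the real-time control system: classifying a new observation is a single dot product plus a threshold.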
A Data Mining Classification Approach for Behavioral Malware Detection
Directory of Open Access Journals (Sweden)
Monire Norouzi
2016-01-01
Data mining techniques have numerous applications in malware detection, and classification is one of the most popular data mining techniques. In this paper we present a data mining classification approach to detect malware behavior. We propose different classification methods in order to detect malware based on the features and behavior of each malware sample. A dynamic analysis method is presented for identifying the malware features, and a program is presented for converting a malware behavior execution history XML file into a suitable input for the WEKA tool. To illustrate the performance efficiency on both training and test data, we apply the proposed approaches to a real case-study data set using the WEKA tool. The evaluation results demonstrate the viability of the proposed data mining approach; it is efficient for detecting malware, and behavioral classification of malware can be useful in a behavioral antivirus.
Hydrological Climate Classification: Can We Improve on Köppen-Geiger?
Knoben, W.; Woods, R. A.; Freer, J. E.
2017-12-01
Classification is essential in the study of complex natural systems, yet hydrology so far has no formal way to structure the climate forcing which underlies hydrologic response. Various climate classification systems can be borrowed from other disciplines but these are based on different organizing principles than a hydrological classification might use. From gridded global data we calculate a gridded aridity index, an aridity seasonality index and a rain-vs-snow index, which we use to cluster global locations into climate groups. We then define the membership degree of nearly 1100 catchments to each of our climate groups based on each catchment's climate and investigate the extent to which streamflow responses within each climate group are similar. We compare this climate classification approach with the often-used Köppen-Geiger classification, using statistical tests based on streamflow signature values. We find that three climate indices are sufficient to distinguish 18 different climate types world-wide. Climates tend to change gradually in space and catchments can thus belong to multiple climate groups, albeit with different degrees of membership. Streamflow responses within a climate group tend to be similar, regardless of the catchments' geographical proximity. A Wilcoxon two-sample test based on streamflow signature values for each climate group shows that the new classification can distinguish different flow regimes using this classification scheme. The Köppen-Geiger approach uses 29 climate classes but is less able to differentiate streamflow regimes. Climate forcing exerts a strong control on typical hydrologic response and both change gradually in space. This makes arbitrary hard boundaries in any classification scheme difficult to defend. Any hydrological classification should thus acknowledge these gradual changes in forcing. Catchment characteristics (soil or vegetation type, land use, etc) can vary more quickly in space than climate does, which
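The Wilcoxon two-sample comparison of streamflow signatures between climate groups can be reproduced in miniature; the signature values below are synthetic, not the study's 1100-catchment data.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
# Synthetic streamflow signature (e.g. a baseflow index) for two groups.
group_a = rng.normal(0.6, 0.1, 80)    # wetter, groundwater-dominated group
group_b = rng.normal(0.3, 0.1, 80)    # drier, flashier group

stat, p = ranksums(group_a, group_b)  # Wilcoxon rank-sum two-sample test
distinct = p < 0.05                   # groups have distinguishable regimes
```

This is the kind of test used to check whether a climate classification actually separates flow regimes: a small p-value means the two groups' signature distributions differ, without assuming normality of the signatures.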
Flow Regime Classification and Hydrological Characterization: A Case Study of Ethiopian Rivers
Directory of Open Access Journals (Sweden)
Belete Berhanu
2015-06-01
The spatiotemporal variability of stream flow due to the complex interaction of catchment attributes and rainfall induces complexity in hydrology. Researchers have been trying to address this complexity with a number of approaches; the river flow regime is one of them. The flow regime can be quantified by means of hydrological indices characterizing five components: magnitude, frequency, duration, timing, and rate of change of flow. Accordingly, this study aimed to understand the flow variability of Ethiopian rivers using observed daily flow data from 208 gauging stations in the country. The Hierarchical Ward Clustering method was implemented to group the streams into three flow regimes: (1) ephemeral, (2) intermittent, and (3) perennial. Principal component analysis (PCA) was applied as a second multivariate analysis tool to identify the dominant hydrological indices that cause the variability in the streams. The mean flow per unit catchment area (QmAR) and base flow index (BFI) show an increasing trend from ephemeral through intermittent to perennial streams, whereas the mean zero-flow days ratio (ZFI) and coefficient of variation (CV) show a decreasing trend from ephemeral to perennial flow regimes. Finally, the streams in the three flow regimes were characterized by the mean and standard deviation of the hydrological variables and the shape, slope, and scale of the flow duration curve. Results of this study are the basis for further understanding of the ecohydrological processes of the river basins in Ethiopia.
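The Ward-clustering-plus-PCA workflow on hydrological indices can be sketched as follows. The index values (QmAR, BFI, ZFI, CV) are synthetic caricatures of the three regime types, not the Ethiopian station data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Synthetic indices per station, columns: QmAR, BFI, ZFI, CV.
ephemeral    = np.column_stack([rng.normal(0.1, 0.03, 50),
                                rng.normal(0.05, 0.02, 50),
                                rng.normal(0.8, 0.05, 50),
                                rng.normal(3.0, 0.30, 50)])
intermittent = np.column_stack([rng.normal(0.4, 0.05, 50),
                                rng.normal(0.30, 0.05, 50),
                                rng.normal(0.3, 0.05, 50),
                                rng.normal(1.5, 0.20, 50)])
perennial    = np.column_stack([rng.normal(0.9, 0.05, 50),
                                rng.normal(0.70, 0.05, 50),
                                rng.normal(0.0, 0.01, 50),
                                rng.normal(0.5, 0.10, 50)])
X = np.vstack([ephemeral, intermittent, perennial])
Xz = (X - X.mean(0)) / X.std(0)                # standardize the indices

# Hierarchical Ward clustering cut into three groups (labels 1..3).
labels = fcluster(linkage(Xz, method='ward'), t=3, criterion='maxclust')

# PCA to see which index combinations dominate the variability.
pca = PCA(n_components=2).fit(Xz)
```

With clearly separated synthetic regimes, the three Ward clusters coincide with the three constructed groups, and the first two principal components carry most of the variance.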
Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.
Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel
2017-06-01
Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.
DETERMINANTS OF FOREIGN DIRECT INVESTMENT IN NIGERIA: A MARKOV REGIME-SWITCHING APPROACH
Directory of Open Access Journals (Sweden)
Akinlo A. Enisan
2017-04-01
Several studies have analyzed the movement of foreign direct investment (FDI) in Nigeria using linear approaches. In contrast with all existing studies in Nigeria, this paper estimates several nonlinear FDI equations in which the main determinants of FDI are identified using a Markov regime-switching model (MSM). The approach enables us to observe structural changes, where they exist, in the FDI equations through time. Besides, where the FDI regression equation is truly nonlinear, MSMs fit the data better than linear models. The paper adopts the maximum likelihood methodology of the MSM to identify possible structural changes in level and/or trend and possible changes in the parameters of the independent variables through the transition probabilities. The results show that the FDI process in Nigeria is governed by two different regimes, and a shift from one regime to the other depends on the transition probabilities. The main determinants of FDI are GDP growth, macroeconomic instability, financial development, the exchange rate, inflation and the discount rate. This implies that liberalization which stems inflation and enhances the value of the domestic currency will attract more FDI into the country.
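The regime-inference machinery behind an MSM can be illustrated with a minimal Hamilton filter. Everything below is synthetic and simplified (two Gaussian regimes with known means and a fixed transition matrix), whereas the paper estimates all parameters by maximum likelihood:

```python
import numpy as np

def hamilton_filter(y, mu, sigma, P):
    """Filtered regime probabilities for a Gaussian mean-switching model."""
    evals, evecs = np.linalg.eig(P.T)            # stationary distribution of P
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    prob = pi / pi.sum()
    out = np.empty((len(y), len(mu)))
    for t, yt in enumerate(y):
        prior = P.T @ prob                       # predict: apply transition matrix
        like = np.exp(-0.5 * ((yt - mu) / sigma) ** 2) / sigma
        post = prior * like                      # update with the observation
        prob = post / post.sum()
        out[t] = prob
    return out

rng = np.random.default_rng(1)
# 100 observations from a low-mean regime, then 100 from a high-mean regime
y = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(4.0, 1.0, 100)])
P = np.array([[0.95, 0.05], [0.05, 0.95]])       # persistent regimes
probs = hamilton_filter(y, mu=np.array([0.0, 4.0]),
                        sigma=np.array([1.0, 1.0]), P=P)
print(probs[:100, 0].mean().round(2), probs[100:, 1].mean().round(2))
```

The filtered probabilities assign most of the first half of the sample to regime 0 and most of the second half to regime 1, which is exactly the kind of regime-dating output the paper reads off its estimated model.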
National-Scale Hydrologic Classification & Agricultural Decision Support: A Multi-Scale Approach
Coopersmith, E. J.; Minsker, B.; Sivapalan, M.
2012-12-01
Classification frameworks can help organize catchments exhibiting similarity in hydrologic and climatic terms. Focusing this assessment of "similarity" upon specific hydrologic signatures, in this case the annual regime curve, can facilitate the prediction of hydrologic responses. Agricultural decision-support over a diverse set of catchments throughout the United States depends upon successful modeling of the wetting/drying process without necessitating separate model calibration at every site where such insights are required. To this end, a holistic classification framework is developed to describe both climatic variability (humid vs. arid, winter rainfall vs. summer rainfall) and the draining, storing, and filtering behavior of any catchment, including ungauged or minimally gauged basins. At the national scale, over 400 catchments from the MOPEX database are analyzed to construct the classification system, with over 77% of these catchments ultimately falling into only six clusters. At individual locations, soil moisture models, receiving only rainfall as input, produce correlation values in excess of 0.9 with respect to observed soil moisture measurements. By deploying physical models for predicting soil moisture exclusively from precipitation that are calibrated at gauged locations, overlaying machine learning techniques to improve these estimates, then generalizing the calibration parameters for catchments in a given class, agronomic decision-support becomes available where it is needed rather than only where sensing data are located. [Figure: Classifications of 428 U.S. catchments on the basis of hydrologic regime data, Coopersmith et al., 2012.]
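The core idea of soil moisture predicted exclusively from precipitation can be sketched with a single linear-reservoir bucket model. The rate constant, rainfall series, and noise level below are invented for illustration; the study's calibrated models plus machine-learning corrections are far richer than this:

```python
import numpy as np

def bucket_model(rain, k=0.05, s0=10.0):
    """Linear reservoir: storage recharged by rain, drained at rate k."""
    s = np.empty(len(rain))
    state = s0
    for t, p in enumerate(rain):
        state = state + p - k * state
        s[t] = state
    return s

rng = np.random.default_rng(2)
rain = rng.gamma(shape=0.3, scale=5.0, size=365)   # synthetic daily rainfall
observed = bucket_model(rain, k=0.05) + rng.normal(0, 1.0, 365)  # noisy "truth"
modeled = bucket_model(rain, k=0.06)               # slightly mis-calibrated model
r = np.corrcoef(modeled, observed)[0, 1]
print(round(float(r), 2))
```

Even with a mis-calibrated drainage constant, the rainfall forcing dominates, so the modeled series correlates strongly with the noisy "observations"; this is the effect the abstract's correlation-above-0.9 claim rests on.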
An objective and parsimonious approach for classifying natural flow regimes at a continental scale
Archfield, S. A.; Kennen, J.; Carlisle, D.; Wolock, D.
2013-12-01
Hydroecological stream classification--the process of grouping streams by similar hydrologic responses and, thereby, similar aquatic habitat--has been widely accepted and is often one of the first steps towards developing ecological flow targets. Despite its importance, the last national classification of streamgauges was completed about 20 years ago. A new classification of 1,534 streamgauges in the contiguous United States is presented using a novel and parsimonious approach to understand similarity in ecological streamflow response. This new classification approach uses seven fundamental daily streamflow statistics (FDSS) rather than winnowing down an uncorrelated subset from 200 or more ecologically relevant streamflow statistics (ERSS) commonly used in hydroecological classification studies. The results of this investigation demonstrate that the distributions of 33 tested ERSS are consistently different among the classes derived from the seven FDSS. It is further shown that classification based solely on the 33 ERSS generally does a poorer job in grouping similar streamgauges than the classification based on the seven FDSS. This new classification approach has the additional advantages of overcoming some of the subjectivity associated with the selection of the classification variables and provides a set of robust continental-scale classes of US streamgauges.
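The flavor of classifying gauges on a handful of fundamental daily streamflow statistics, rather than winnowing hundreds of ERSS, can be sketched as follows. The four statistics computed here (mean, coefficient of variation, skewness, lag-1 autocorrelation) are plausible examples and not the paper's exact seven FDSS, and the flow series are synthetic:

```python
import numpy as np

def fdss(q):
    """A few fundamental daily streamflow statistics for one gauge."""
    q = np.asarray(q, dtype=float)
    mean = q.mean()
    cv = q.std() / mean                           # coefficient of variation
    skew = ((q - mean) ** 3).mean() / q.std() ** 3
    ac1 = np.corrcoef(q[:-1], q[1:])[0, 1]        # lag-1 autocorrelation
    return np.array([mean, cv, skew, ac1])

rng = np.random.default_rng(3)
flashy = rng.gamma(0.5, 2.0, 365)       # flashy, highly skewed regime
stable = 10 + rng.normal(0, 0.5, 365)   # stable, near-symmetric regime
stats_flashy, stats_stable = fdss(flashy), fdss(stable)
print(stats_flashy.round(2), stats_stable.round(2))
```

Gauges summarized by such low-dimensional vectors can then be clustered directly; the contrast between the two synthetic regimes shows up immediately in the CV and skewness entries.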
A Risk-Based Ecohydrological Approach to Assessing Environmental Flow Regimes
Mcgregor, Glenn B.; Marshall, Jonathan C.; Lobegeiger, Jaye S.; Holloway, Dean; Menke, Norbert; Coysh, Julie
2018-03-01
For several decades there has been recognition that water resource development alters river flow regimes and impacts ecosystem values. Determining strategies to protect or restore flow regimes to achieve ecological outcomes is a focus of water policy and legislation in many parts of the world. However, consideration of existing environmental flow assessment approaches for application in Queensland identified deficiencies precluding their adoption. Firstly, in managing flows and using ecosystem condition as an indicator of effectiveness, many approaches ignore the fact that river ecosystems are subjected to threatening processes other than flow regime alteration. Secondly, many focus on providing flows for responses without considering how often they are necessary to sustain ecological values in the long term. Finally, few consider requirements at spatial scales relevant to the desired outcomes, with frequent focus on individual places rather than the regions supporting sustainability. Consequently, we developed a risk-based ecohydrological approach that identifies ecosystem values linked to desired ecological outcomes, is sensitive to flow alteration and uses indicators of broader ecosystem requirements. Monitoring and research are undertaken to quantify flow dependencies, and ecological modelling is used to quantify flow-related ecological responses over a historical flow period. The relative risk from different flow management scenarios can then be evaluated at relevant spatial scales. This overcomes the deficiencies identified above and provides a robust and useful foundation upon which to build the information needed to support water planning decisions. Application of the risk assessment approach is illustrated here by two case studies.
Approaches to Substance of Social Infrastructure and to Its Classification
Directory of Open Access Journals (Sweden)
Kyrychenko Sergiy O.
2016-03-01
The article is concerned with studying and analyzing approaches to both the substance and the classification of social infrastructure objects as a specific constellation of subsystems and components. To address this purpose, the following tasks have been formulated: analysis of existing approaches to classifying social infrastructure; classification of the branches of social infrastructure using a function-based approach; and formulation of the author's own definition of the substance of social infrastructure. It has been determined that, to date, social infrastructure is most often classified according to its functional tasks, although other approaches to classification exist. The author's definition has been formulated as follows: social infrastructure is a body of economic branches (public utilities, management, public safety and environment, socio-economic services) whose purpose is to influence the reproductive potential and overall conditions of human activity in the spheres of work, everyday living, family, socio-political life, and spiritual and intellectual development, as well as life activity in general.
Energy Technology Data Exchange (ETDEWEB)
Gueye, A.K. [ESP, UCAD, Dakar (Senegal); Janicot, Serge; Sultan, Benjamin [LOCEAN/IPSL, IRD, Universite Pierre et Marie Curie, Paris cedex 05 (France); Niang, A. [LTI, ESP/UCAD, Dakar (Senegal); Sawadogo, S. [LTI, EPT, Thies (Senegal); Diongue-Niang, A. [ANACIM, Dakar (Senegal); Thiria, S. [LOCEAN/IPSL, UPMC, Paris (France)
2012-11-15
The aim of this work is to define, over the period 1979-2002, the main synoptic weather regimes relevant for understanding the daily variability of rainfall during the summer monsoon season over Senegal. "Interannual" synoptic weather regimes are defined by removing the influence of the mean 1979-2002 seasonal cycle. This differs from Part I, where the seasonal evolution of each year was removed, thereby also removing the contribution of interannual variability. As in Part I, the self-organizing maps approach, a clustering methodology based on non-linear artificial neural networks, is combined with a hierarchical ascendant classification to compute these regimes. Nine weather regimes are identified using mean sea level pressure and the 850 hPa wind field as variables. The composite circulation patterns of these nine weather regimes are very consistent with the associated anomaly patterns of precipitable water, mid-troposphere vertical velocity and rainfall. They are also consistent with the distribution of rainfall extremes. These regimes have then been gathered into different groups. A first group of four regimes is included in an inner circuit and is characterized by a modulation of the semi-permanent trough located along the western coast of West Africa and an opposite modulation to the east. This circuit is important because it associates the two wettest and highly persistent weather regimes over Senegal with the driest and most persistent one. One derivation of this circuit is highlighted, including the two driest regimes and the most persistent one, which can produce long dry sequences. An exit from this circuit is characterised by a filling of the Saharan heat low. An entry into the main circuit involves a southward displacement of the Saharan heat low followed by its deepening. The last weather regime is isolated from the other ones and has no significant impact on Senegal. It is present in June and September, and
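The self-organizing-map step can be sketched with a minimal 1-D SOM on toy 2-D data (the study applies it to sea level pressure and 850 hPa wind fields, followed by a hierarchical classification of the map nodes). All sizes and learning parameters below are illustrative:

```python
import numpy as np

def train_som(X, n_nodes=9, epochs=20, lr0=0.5, radius0=2.0, seed=0):
    """Train a 1-D self-organizing map on the rows of X."""
    rng = np.random.default_rng(seed)
    nodes = X[rng.choice(len(X), n_nodes, replace=False)].astype(float)
    for e in range(epochs):
        lr = lr0 * (1 - e / epochs)                    # decaying learning rate
        radius = max(radius0 * (1 - e / epochs), 0.5)  # shrinking neighbourhood
        for x in X[rng.permutation(len(X))]:
            bmu = np.argmin(((nodes - x) ** 2).sum(1))  # best-matching unit
            dist = np.abs(np.arange(n_nodes) - bmu)     # distance on the 1-D map
            h = np.exp(-dist ** 2 / (2 * radius ** 2))  # neighbourhood weights
            nodes += lr * h[:, None] * (x - nodes)      # pull nodes toward x
    return nodes

rng = np.random.default_rng(8)
# Three well-separated toy "circulation states"
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in ([0, 0], [3, 0], [0, 3])])
nodes = train_som(X)
bmus = np.array([np.argmin(((nodes - x) ** 2).sum(1)) for x in X])
print(np.bincount(bmus[:50]).argmax(), np.bincount(bmus[100:]).argmax())
```

Days mapping to the same (or neighbouring) nodes share a circulation state; a hierarchical clustering of the trained nodes then merges them into a small number of weather regimes, as in the study.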
Lainscsek, Claudia; Weyhenmeyer, Jonathan; Hernandez, Manuel E; Poizner, Howard; Sejnowski, Terrence J
2013-01-01
Time series analysis with delay differential equations (DDEs) reveals non-linear properties of the underlying dynamical system and can serve as a non-linear time-domain classification tool. Here global DDE models were used to analyze short segments of simulated time series from a known dynamical system, the Rössler system, in high noise regimes. In a companion paper, we apply the DDE model developed here to classify short segments of electroencephalographic (EEG) data recorded from patients with Parkinson's disease and healthy subjects. Nine simulated subjects in each of two distinct classes were generated by varying the bifurcation parameter b and keeping the other two parameters (a and c) of the Rössler system fixed. All choices of b were in the chaotic parameter range. We diluted the simulated data using white noise ranging from 10 to -30 dB signal-to-noise ratios (SNR). Structure selection was supervised by selecting the number of terms, delays, and order of non-linearity of the DDE model that best linearly separated the two classes of data. The distance d from the linear dividing hyperplane was then used to assess the classification performance by computing the area A' under the ROC curve. The selected model was tested on untrained data using repeated random sub-sampling validation. DDEs were able to accurately distinguish the two dynamical conditions, and moreover, to quantify the changes in the dynamics. There was a significant correlation between the dynamical bifurcation parameter b of the simulated data and the classification parameter d from our analysis. This correlation still held for new simulated subjects with new dynamical parameters selected from each of the two dynamical regimes. Furthermore, the correlation was robust to added noise, being significant even when the noise was greater than the signal. We conclude that DDE models may be used as a generalizable and reliable classification tool for even small segments of noisy data.
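A miniature of the "global model coefficients as classification features" idea, with the one-dimensional logistic map standing in for the Rössler system: fit the global model x[t+1] = c1*x[t] + c2*x[t]**2 to short noisy segments by least squares and compare the fitted coefficients across two parameter regimes. All values are illustrative, and the noise level is far milder than the paper's:

```python
import numpy as np

def logistic_segment(r, n=200, seed=0):
    """Short noisy trajectory of the logistic map x -> r*x*(1-x)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = rng.uniform(0.2, 0.8)
    for t in range(n - 1):
        x[t + 1] = r * x[t] * (1 - x[t])
    return x + rng.normal(0, 0.01, n)    # additive observation noise

def model_coeffs(x):
    """Least-squares fit of the global model x[t+1] = c1*x[t] + c2*x[t]**2."""
    A = np.column_stack([x[:-1], x[:-1] ** 2])
    coeffs, *_ = np.linalg.lstsq(A, x[1:], rcond=None)
    return coeffs

# Two "dynamical classes": nine noisy subjects per map parameter
c_a = np.array([model_coeffs(logistic_segment(3.7, seed=s)) for s in range(9)])
c_b = np.array([model_coeffs(logistic_segment(3.9, seed=s)) for s in range(9)])
# The fitted coefficients track the bifurcation parameter, separating the classes
print(float(c_a[:, 0].mean()) < float(c_b[:, 0].mean()))
```

The fitted linear coefficient follows the map parameter, so a threshold (or, as in the paper, a separating hyperplane over all model coefficients) distinguishes the two dynamical classes from short segments alone.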
Merkel, Wolfgang
2004-01-01
The development of the term and the analytical concept of totalitarianism have gone through several stages since the 1920s. However, even in its most sophisticated form, the version seen in Friedrich/Brzezinski, the concept exhibits substantial systematic classification problems and analytical weaknesses. This article attempts to frame the type of totalitarian regime within a general typology of political regimes. Special attention is dedicated to the problem of distinguishing autocra...
An Examination of the Nature of Global MODIS Cloud Regimes
Oreopoulos, Lazaros; Cho, Nayeong; Lee, Dongmin; Kato, Seiji; Huffman, George J.
2014-01-01
We introduce global cloud regimes (previously also referred to as "weather states") derived from cloud retrievals that use measurements by the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard the Aqua and Terra satellites. The regimes are obtained by applying clustering analysis on joint histograms of retrieved cloud top pressure and cloud optical thickness. By employing a compositing approach on data sets from satellites and other sources, we examine regime structural and thermodynamical characteristics. We establish that the MODIS cloud regimes tend to form in distinct dynamical and thermodynamical environments and have diverse profiles of cloud fraction and water content. When compositing radiative fluxes from the Clouds and the Earth's Radiant Energy System instrument and surface precipitation from the Global Precipitation Climatology Project, we find that regimes with a radiative warming effect on the atmosphere also produce the largest implied latent heat. Taken as a whole, the results of the study corroborate the usefulness of the cloud regime concept, reaffirm the fundamental nature of the regimes as appropriate building blocks for cloud system classification, clarify their association with standard cloud types, and underscore their distinct radiative and hydrological signatures.
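The regime-derivation step (clustering of flattened joint histograms of cloud top pressure vs. cloud optical thickness) can be sketched as follows; the 42-bin histograms and the two synthetic regimes below are invented stand-ins for MODIS scenes, and the centres are seeded deterministically for simplicity:

```python
import numpy as np

def kmeans(X, k, init, iters=20):
    """Plain k-means with explicit initial centre indices."""
    centers = X[list(init)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

rng = np.random.default_rng(4)
# Two synthetic "regimes": 42-bin joint histograms peaking in different corners
high_thin = rng.poisson(5, (30, 42)).astype(float)
high_thin[:, :6] += 40          # e.g. high/thin (cirrus-like) bins elevated
low_thick = rng.poisson(5, (30, 42)).astype(float)
low_thick[:, -6:] += 40         # e.g. low/thick (stratocumulus-like) bins elevated
X = np.vstack([high_thin, low_thick])
X /= X.sum(axis=1, keepdims=True)               # normalise each histogram

labels = kmeans(X, k=2, init=(0, len(X) - 1))   # seed one centre per group
print((labels[:30] == labels[0]).all(), (labels[30:] == labels[-1]).all())
```

Each cluster centroid is itself a joint histogram, which is what the regime (or "weather state") literature composites against dynamical, radiative and precipitation fields.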
Multivariate Approaches to Classification in Extragalactic Astronomy
Directory of Open Access Journals (Sweden)
Didier Fraix-Burnet
2015-08-01
Clustering objects into synthetic groups is a natural activity of any science. Astrophysics is not an exception and is now facing a deluge of data. For galaxies, the one-century old Hubble classification and the Hubble tuning fork are still largely in use, together with numerous mono- or bivariate classifications most often made by eye. However, a classification must be driven by the data, and sophisticated multivariate statistical tools are used more and more often. In this paper we review these different approaches in order to situate them in the general context of unsupervised and supervised learning. We insist on the astrophysical outcomes of these studies to show that multivariate analyses provide an obvious path toward a renewal of our classification of galaxies and are invaluable tools to investigate the physics and evolution of galaxies.
Belmar, Oscar; Velasco, Josefa; Martinez-Capel, Francisco
2011-05-01
Hydrological classification constitutes the first step of a new holistic framework for developing regional environmental flow criteria: the "Ecological Limits of Hydrologic Alteration" (ELOHA). The aim of this study was to develop a classification for 390 stream sections of the Segura River Basin based on 73 hydrological indices that characterize their natural flow regimes. The hydrological indices were calculated from 25 years of natural monthly flows (1980/81-2005/06) derived from a rainfall-runoff model developed by the Spanish Ministry of Environment and Public Works. These indices included, on a monthly or annual basis, measures of the duration of droughts and of the central tendency and dispersion of flow magnitude (average, low and high flow conditions). Principal component analysis (PCA) indicated high redundancy among most hydrological indices, as well as two gradients: flow magnitude for mainstream rivers and temporal variability for tributary streams. A classification with eight flow-regime classes was chosen as the most easily interpretable in the Segura River Basin, which was supported by ANOSIM analyses. These classes can be simplified into four broader groups with different seasonal discharge patterns: large rivers, perennial stable streams, perennial seasonal streams, and intermittent and ephemeral streams. They showed a high degree of spatial cohesion, following a gradient associated with climatic aridity from NW to SE, and were well defined in terms of the fundamental variables in Mediterranean streams: magnitude and temporal variability of flows. Therefore, this classification is a fundamental tool to support water management and planning in the Segura River Basin. Future research will allow us to study the flow alteration-ecological response relationship for each river type, and set the basis to design scientifically credible environmental flows following the ELOHA framework.
Exploring different approaches for music genre classification
Directory of Open Access Journals (Sweden)
Antonio Jose Homsi Goulart
2012-07-01
In this letter, we present different approaches for music genre classification. The proposed techniques, which are composed of a feature extraction stage followed by a classification procedure, explore both the variations of parameters used as input and the classifier architecture. Tests were carried out with three styles of music, namely blues, classical, and lounge, which are considered informally by some musicians as being “big dividers” among music genres, showing the efficacy of the proposed algorithms and establishing a relationship between the relevance of each set of parameters for each music style and each classifier. In contrast to other works, entropies and fractal dimensions are the features adopted for the classifications.
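The two feature families named above, entropies and fractal dimensions, can be sketched with a spectral entropy and a Higuchi fractal dimension estimate. These particular estimators and the two test signals (a pure tone vs. white noise) are assumptions for illustration, not necessarily the letter's exact definitions:

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy (bits) of the normalised power spectrum."""
    p = np.abs(np.fft.rfft(x)) ** 2
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension: slope of log curve-length vs log scale."""
    n = len(x)
    pts = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                    # one subsampled curve per offset m
            idx = np.arange(m, n, k)
            L = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k) / k
            lengths.append(L)
        pts.append((np.log(1.0 / k), np.log(np.mean(lengths))))
    xs, ys = np.array(pts).T
    return float(np.polyfit(xs, ys, 1)[0])    # fitted slope = dimension estimate

t = np.linspace(0, 1, 8000, endpoint=False)
tone = np.sin(2 * np.pi * 440 * t)                   # ordered, tonal signal
noise = np.random.default_rng(5).normal(size=8000)   # broadband signal
print(spectral_entropy(tone) < spectral_entropy(noise),
      higuchi_fd(tone) < higuchi_fd(noise))
```

Both features rank an ordered tonal signal below broadband noise, which is the kind of ordering that lets them discriminate smoother genres from noisier ones when computed frame by frame.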
Music Genre Classification Systems - A Computational Approach
DEFF Research Database (Denmark)
Ahrendt, Peter
2006-01-01
Automatic music genre classification is the classification of a piece of music into its corresponding genre (such as jazz or rock) by a computer. It is considered to be a cornerstone of the research area Music Information Retrieval (MIR) and closely linked to the other areas in MIR. It is thought… that MIR will be a key element in the processing, searching and retrieval of digital music in the near future. This dissertation is concerned with music genre classification systems and in particular systems which use the raw audio signal as input to estimate the corresponding genre. This is in contrast… to systems which use e.g. a symbolic representation or textual information about the music. The approach to music genre classification systems has here been system-oriented. In other words, all the different aspects of the systems have been considered and it is emphasized that the systems should…
An Extended Spectral-Spatial Classification Approach for Hyperspectral Data
Akbari, D.
2017-11-01
In this paper an extended classification approach for hyperspectral imagery based on both spectral and spatial information is proposed. The spatial information is obtained by an enhanced marker-based minimum spanning forest (MSF) algorithm. Three different methods of dimension reduction are first used to obtain the subspace of the hyperspectral data: (1) unsupervised feature extraction methods including principal component analysis (PCA), independent component analysis (ICA), and minimum noise fraction (MNF); (2) supervised feature extraction including decision boundary feature extraction (DBFE), discriminate analysis feature extraction (DAFE), and nonparametric weighted feature extraction (NWFE); (3) a genetic algorithm (GA). The spectral features obtained are then fed into the enhanced marker-based MSF classification algorithm. In the enhanced MSF algorithm, the markers are extracted from the classification maps obtained by both the SVM and watershed segmentation algorithms. To evaluate the proposed approach, the Pavia University hyperspectral dataset is used. Experimental results show that the proposed approach using GA achieves an overall accuracy approximately 8% higher than that of the original MSF-based algorithm.
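The first stage, unsupervised dimension reduction with PCA, can be sketched with plain NumPy on a synthetic cube. The tiny random scene below only matches the Pavia University image in band count (103 bands, commonly reported at 610 x 340 pixels); the paper's supervised extractors, GA selection, and MSF stage are not sketched here:

```python
import numpy as np

def pca_reduce(cube, n_components):
    """Project an (rows, cols, bands) cube onto its leading principal components."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    X = X - X.mean(axis=0)                    # centre each spectral band
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return (X @ Vt[:n_components].T).reshape(h, w, n_components)

rng = np.random.default_rng(7)
cube = rng.normal(size=(20, 20, 103))         # synthetic 103-band scene
reduced = pca_reduce(cube, n_components=10)
print(reduced.shape)
```

The reduced bands are mutually uncorrelated by construction, which is what makes them a compact spectral input for the subsequent classifier.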
STAR POLYMERS IN GOOD SOLVENTS FROM DILUTE TO CONCENTRATED REGIMES: CROSSOVER APPROACH
Directory of Open Access Journals (Sweden)
S.B.Kiselev
2002-01-01
An introduction is given to the crossover theory of the conformational and thermodynamic properties of star polymers in good solvents. The crossover theory is tested against Monte Carlo simulation data for the structure and thermodynamics of model star polymers. In good solvent conditions, star polymers approach a "universal" limit as N → ∞; however, there are two types of approach towards this limit. In the dilute regime, a critical degree of polymerization N* is found to play a similar role as the Ginzburg number in the crossover theory for critical phenomena in simple fluids. A rescaled penetration function is found to control the free energy of star polymer solutions in the dilute and semidilute regions. This equation of state captures the scaling behaviour of polymer solutions in the dilute/semidilute regimes and also performs well in the concentrated regimes, where the details of the monomer-monomer interactions become important.
Hoekstra, H; Kempenaers, K; Nijs, S
2017-10-01
Variable angle locking compression plates allow for lateral buttress and support of the posterolateral joint surface of tibial plateau fractures. This gives room for improvement of the surgical 3-column classification approach. Our aim was to revise and validate the 3-column classification approach to better guide the surgical planning of tibial plateau fractures extending into the posterolateral corner. In contrast to the 3-column classification approach, in the revised approach the posterior border of the lateral column lies posterior instead of anterior to the fibula. According to the revised 3-column classification approach, extended lateral column fractures are defined as single lateral column fractures extending posteriorly into the posterolateral corner. CT images of 36 patients were reviewed and classified twice online according to the Schatzker and revised 3-column classification approaches by five observers. The intraobserver reliability was calculated using Cohen's kappa and the interobserver reliability was calculated using Fleiss' kappa. The intraobserver reliability showed substantial agreement according to Landis and Koch for both the Schatzker and the revised 3-column classification approach (0.746 vs. 0.782, p = 0.37; Schatzker vs. revised 3-column, respectively). However, the interobserver reliability of the revised 3-column classification approach was significantly higher than that of the Schatzker classification (0.531 vs. 0.669; Schatzker vs. revised 3-column, respectively). With the introduction of variable angle locking compression plates, the revised 3-column classification approach is a very helpful tool in the preoperative surgical planning of tibial plateau fractures, in particular lateral column fractures that extend into the posterolateral corner. The revised 3-column classification approach is rather a practical supplement to the Schatzker classification. It has a significantly higher interobserver reliability as compared to the
A statistical approach to root system classification.
Directory of Open Access Journals (Sweden)
Gernot Bodner
2013-08-01
Plant root systems have a key role in ecology and agronomy. In spite of a fast increase in root studies, there is still no classification that allows distinguishing among the distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for plant functional type identification in ecology can be applied to the classification of root systems. We demonstrate that combining principal component and cluster analysis yields a meaningful classification of rooting types based on morphological traits. The classification method presented is based on a data-defined statistical procedure without an a priori decision on the classifiers. Biplot inspection is used to determine key traits and to ensure stability in cluster-based grouping. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture the most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. The adequacy of commonly available morphological traits for classification is supported by the field data. Three rooting types emerged from the measured data, distinguished by diameter/weight, density and spatial distribution, respectively. Similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We concluded that the data-defined classification is appropriate for integration of knowledge obtained with different root measurement methods and at various scales. Currently root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture details of root diversity efforts in architectural measurement
An automated cirrus classification
Gryspeerdt, Edward; Quaas, Johannes; Goren, Tom; Klocke, Daniel; Brueck, Matthias
2018-05-01
Cirrus clouds play an important role in determining the radiation budget of the earth, but many of their properties remain uncertain, particularly their response to aerosol variations and to warming. Part of the reason for this uncertainty is the dependence of cirrus cloud properties on the cloud formation mechanism, which itself is strongly dependent on the local meteorological conditions. In this work, a classification system (Identification and Classification of Cirrus or IC-CIR) is introduced to identify cirrus clouds by the cloud formation mechanism. Using reanalysis and satellite data, cirrus clouds are separated into four main types: orographic, frontal, convective and synoptic. Through a comparison to convection-permitting model simulations and back-trajectory-based analysis, it is shown that these observation-based regimes can provide extra information on the cloud-scale updraughts and the frequency of occurrence of liquid-origin ice, with the convective regime having higher updraughts and a greater occurrence of liquid-origin ice compared to the synoptic regimes. Despite having different cloud formation mechanisms, the radiative properties of the regimes are not distinct, indicating that retrieved cloud properties alone are insufficient to completely describe them. This classification is designed to be easily implemented in GCMs, helping improve future model-observation comparisons and leading to improved parametrisations of cirrus cloud processes.
Bridging interest, classification and technology gaps in the climate change regime
International Nuclear Information System (INIS)
Gupta, J.; Van der Werff, P.; Gagnon-Lebrun, F.; Van Dijk, I.; Verspeek, F.; Arkesteijn, E.; Van der Meer, J.
2002-01-01
The climate change regime is affected by a major credibility gap; there is a gap between what countries have been stating that they are willing to do and what they actually do. This is visible not just in the inability of the developed countries to stabilise their emissions at 1990 levels by the year 2000 as provided for in the United Nations Framework Convention on Climate Change (FCCC), but by the general reluctance of all countries to ratify the Kyoto Protocol to the Convention (KPFCCC). This research postulates that this credibility gap is affected further by three other types of gaps: 1) the interest gap; 2) the classification gap; and 3) the technology gap. The purpose of this research is thus to identify ways and means to promote industrial transformation in developing countries as a method to address the climate change problem. The title of this project is: Bridging Gaps - Enhancing Domestic and International Technological Collaboration to Enable the Adoption of Climate Relevant Technologies and Practices (CT and Ps) and thereby Foster Participation and Implementation of the Climate Convention (FCCC) by Developing Countries (DCs). In order to enhance technology co-operation, we believe that graduation profiles are needed at the international level and stakeholder involvement at both the national and international levels. refs
A discriminant function approach to ecological site classification in northern New England
James M. Fincher; Marie-Louise Smith
1994-01-01
Describes one approach to ecologically based classification of upland forest community types of the White and Green Mountain physiographic regions. The classification approach is based on an intensive statistical analysis of the relationship between the communities and soil-site factors. Discriminant functions useful in distinguishing between types based on soil-site...
Integrative Chemical-Biological Read-Across Approach for Chemical Hazard Classification
Low, Yen; Sedykh, Alexander; Fourches, Denis; Golbraikh, Alexander; Whelan, Maurice; Rusyn, Ivan; Tropsha, Alexander
2013-01-01
Traditional read-across approaches typically rely on the chemical similarity principle to predict chemical toxicity; however, the accuracy of such predictions is often inadequate due to the underlying complex mechanisms of toxicity. Here we report on the development of a hazard classification and visualization method that draws upon both chemical structural similarity and comparisons of biological responses to chemicals measured in multiple short-term assays ("biological" similarity). The Chemical-Biological Read-Across (CBRA) approach infers each compound's toxicity from those of both chemical and biological analogs whose similarities are determined by the Tanimoto coefficient. Classification accuracy of CBRA was compared to that of classical RA and other methods using chemical descriptors alone, or in combination with biological data. Different types of adverse effects (hepatotoxicity, hepatocarcinogenicity, mutagenicity, and acute lethality) were classified using several biological data types (gene expression profiling and cytotoxicity screening). CBRA-based hazard classification exhibited consistently high external classification accuracy and applicability to diverse chemicals. Transparency of the CBRA approach is aided by the use of radial plots that show the relative contribution of analogous chemical and biological neighbors. Identification of both chemical and biological features that give rise to the high accuracy of CBRA-based toxicity prediction facilitates mechanistic interpretation of the models. PMID:23848138
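The read-across core, nearest neighbours under Tanimoto similarity on binary fingerprints, can be sketched as below. The fingerprints, labels, and density-based "families" are invented; CBRA additionally combines chemical with biological similarity and uses radial-plot visualization, neither of which is sketched here:

```python
import numpy as np

def tanimoto(a, b):
    """Tanimoto coefficient between two binary fingerprints."""
    both = np.logical_and(a, b).sum()
    either = np.logical_or(a, b).sum()
    return both / either if either else 0.0

def read_across(query, fingerprints, labels, k=3):
    """Majority vote over the k most Tanimoto-similar neighbours."""
    sims = np.array([tanimoto(query, fp) for fp in fingerprints])
    nearest = np.argsort(sims)[::-1][:k]       # indices of the k nearest analogs
    return int(round(labels[nearest].mean()))

rng = np.random.default_rng(6)
toxic = (rng.random((10, 64)) < 0.7).astype(int)      # bit-dense invented family
nontoxic = (rng.random((10, 64)) < 0.2).astype(int)   # bit-sparse invented family
fps = np.vstack([toxic, nontoxic])
labels = np.array([1] * 10 + [0] * 10)
query = (rng.random(64) < 0.7).astype(int)            # resembles the "toxic" family
print(read_across(query, fps, labels))
```

A chemical-plus-biological variant would compute one similarity per fingerprint type and combine them (for example by averaging) before ranking neighbours, which is the essential move CBRA makes.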
International Nuclear Information System (INIS)
Berman, Jules
2005-01-01
For over 150 years, pathologists have relied on histomorphology to classify and diagnose neoplasms. Their success has been stunning, permitting the accurate diagnosis of thousands of different types of neoplasms using only a microscope and a trained eye. In the past two decades, cancer genomics has challenged the supremacy of histomorphology by identifying genetic alterations shared by morphologically diverse tumors and by finding genetic features that distinguish subgroups of morphologically homogeneous tumors. The Developmental Lineage Classification and Taxonomy of Neoplasms groups neoplasms by their embryologic origin. The putative value of this classification is based on the expectation that tumors of a common developmental lineage will share common metabolic pathways and common responses to drugs that target these pathways. The purpose of this manuscript is to show that grouping tumors according to their developmental lineage can reconcile certain fundamental discrepancies resulting from morphologic and molecular approaches to neoplasm classification. In this study, six issues in tumor classification are described that exemplify the growing rift between morphologic and molecular approaches to tumor classification: 1) the morphologic separation between epithelial and non-epithelial tumors; 2) the grouping of tumors based on shared cellular functions; 3) the distinction between germ cell tumors and pluripotent tumors of non-germ cell origin; 4) the distinction between tumors that have lost their differentiation and tumors that arise from uncommitted stem cells; 5) the molecular properties shared by morphologically disparate tumors that have a common developmental lineage, and 6) the problem of re-classifying morphologically identical but clinically distinct subsets of tumors. The discussion of these issues in the context of describing different methods of tumor classification is intended to underscore the clinical value of a robust tumor classification. A
Change classification in SAR time series: a functional approach
Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan
2017-10-01
Change detection represents a broad field of research in SAR remote sensing, consisting of many different approaches. Besides the simple recognition of change areas, the analysis of the type, category or class of the change areas is at least as important for creating a comprehensive result. Conventional strategies for change classification are based on supervised or unsupervised land-use/land-cover classifications. The main drawback of such approaches is that the quality of the classification result directly depends on the selection of training and reference data. Additionally, supervised processing methods require an experienced operator who capably selects the training samples. This training step is not necessary when using unsupervised strategies, but meaningful reference data must nevertheless be available for identifying the resulting classes. Consequently, an experienced operator is indispensable. In this study, an innovative concept for the classification of changes in SAR time series data is proposed. Regarding the drawbacks of traditional strategies given above, it copes without using any training data. Moreover, the method can be applied by an operator who does not yet have detailed knowledge of the available scenery; this knowledge is provided by the algorithm. The final step of the procedure, whose main aspect is the iterative optimization of an initial class scheme with respect to the categorized change objects, is the classification of these objects into the resulting classes. This assignment step is the subject of this paper.
ADHD classification using bag of words approach on network features
Solmaz, Berkan; Dey, Soumyabrata; Rao, A. Ravishankar; Shah, Mubarak
2012-02-01
Attention Deficit Hyperactivity Disorder (ADHD) is receiving considerable attention, mainly because it is one of the most common brain disorders among children and little is known about its cause. In this study, we propose a novel approach for the automatic classification of ADHD-conditioned subjects and control subjects using functional Magnetic Resonance Imaging (fMRI) data of resting-state brains. For this purpose, we compute the correlation between every possible voxel pair within a subject over the time frame of the experimental protocol. A network of voxels is constructed by representing a high correlation value between any two voxels as an edge. A Bag-of-Words (BoW) approach is used to represent each subject as a histogram of network features, such as the number of degrees per voxel. The classification is done using a Support Vector Machine (SVM). We also investigate the use of raw intensity values in the time series for each voxel; here, every subject is represented as a combined histogram of network and raw intensity features. Experimental results verified that the classification accuracy improves when the combined histogram is used. We tested our approach on a highly challenging dataset released by NITRC for the ADHD-200 competition and obtained promising results. The dataset not only has a large size but also includes subjects from different demographic and age groups. To the best of our knowledge, this is the first paper to propose the BoW approach for functional brain disorder classification, and we believe that this approach will be useful in the analysis of many brain-related conditions.
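The feature-construction step described in the abstract above (correlation network, then a degree histogram per subject) can be sketched in a few lines of numpy. The correlation threshold, bin count and synthetic data below are illustrative assumptions, not values from the paper; the resulting fixed-length histograms would then be fed to an SVM.

```python
import numpy as np

def degree_histogram(ts, corr_thresh=0.8, n_bins=10):
    """Build a bag-of-words style feature vector for one subject.

    ts: (n_voxels, n_timepoints) array of voxel time series.
    Edges connect voxel pairs whose time-series correlation exceeds
    corr_thresh; the feature vector is a histogram of node degrees.
    """
    corr = np.corrcoef(ts)                      # voxel-by-voxel correlation
    np.fill_diagonal(corr, 0.0)                 # ignore self-correlation
    adj = corr > corr_thresh                    # adjacency matrix of the network
    degrees = adj.sum(axis=1)                   # degree of each voxel
    hist, _ = np.histogram(degrees, bins=n_bins, range=(0, ts.shape[0]))
    return hist / hist.sum()                    # normalised histogram

rng = np.random.default_rng(0)
ts = rng.standard_normal((50, 120))             # 50 voxels, 120 time points
features = degree_histogram(ts)
print(features.shape)                           # one fixed-length vector per subject
```

Each subject is thus reduced to one vector of fixed length regardless of brain size, which is what makes a standard SVM applicable across subjects.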
Melles, S. J.; Jones, N. E.; Schmidt, B. J.
2014-03-01
Conservation and management of fresh flowing waters involves evaluating and managing the effects of cumulative impacts on the aquatic environment from disturbances such as land use change, point and nonpoint source pollution, the creation of dams and reservoirs, mining, and fishing. To assess effects of these changes on associated biotic communities it is necessary to monitor and report on the status of lotic ecosystems. A variety of stream classification methods are available to assist with these tasks, and such methods attempt to provide a systematic approach to modeling and understanding complex aquatic systems at various spatial and temporal scales. Of the vast number of approaches that exist, it is useful to group them into three main types. The first involves modeling longitudinal species turnover patterns within large drainage basins and relating these patterns to environmental predictors collected at reach and upstream catchment scales; the second uses regionalized hierarchical classification to create multi-scale, spatially homogenous aquatic ecoregions by grouping adjacent catchments together based on environmental similarities; and the third approach groups sites together on the basis of similarities in their environmental conditions both within and between catchments, independent of their geographic location. We review the literature with a focus on more recent classifications to examine the strengths and weaknesses of the different approaches. We identify gaps or problems with the current approaches, and we propose an eight-step heuristic process that may assist with development of more flexible and integrated aquatic classifications based on the current understanding, network thinking, and theoretical underpinnings.
The development of a classification schema for arts-based approaches to knowledge translation.
Archibald, Mandy M; Caine, Vera; Scott, Shannon D
2014-10-01
Arts-based approaches to knowledge translation are emerging as powerful interprofessional strategies with potential to facilitate evidence uptake, communication, knowledge, attitude, and behavior change across healthcare provider and consumer groups. These strategies are in the early stages of development. To date, no classification system for arts-based knowledge translation exists, which limits development and understandings of effectiveness in evidence syntheses. We developed a classification schema of arts-based knowledge translation strategies based on two mechanisms by which these approaches function: (a) the degree of precision in key message delivery, and (b) the degree of end-user participation. We demonstrate how this classification is necessary to explore how context, time, and location shape arts-based knowledge translation strategies. Classifying arts-based knowledge translation strategies according to their core attributes extends understandings of the appropriateness of these approaches for various healthcare settings and provider groups. The classification schema developed may enhance understanding of how, where, and for whom arts-based knowledge translation approaches are effective, and enable theorizing of essential knowledge translation constructs, such as the influence of context, time, and location on utilization strategies. The classification schema developed may encourage systematic inquiry into the effectiveness of these approaches in diverse interprofessional contexts. © 2014 Sigma Theta Tau International.
AUTOMATIC APPROACH TO VHR SATELLITE IMAGE CLASSIFICATION
Directory of Open Access Journals (Sweden)
P. Kupidura
2016-06-01
Full Text Available In this paper, we propose a fully automatic classification of VHR satellite images. Unlike the most widespread approaches, supervised classification, which requires prior definition of class signatures, and unsupervised classification, which must be followed by an interpretation of its results, the proposed method requires no human intervention except for the setting of the initial parameters. The presented approach is based on both spectral and textural analysis of the image and consists of three steps. The first step, the analysis of spectral data, relies on NDVI values. Its purpose is to distinguish between basic classes, such as water, vegetation and non-vegetation, which all differ significantly spectrally and thus can be easily extracted based on spectral analysis. The second step relies on granulometric maps. These are the product of local granulometric analysis of an image and present information on the texture of each pixel neighbourhood, depending on the texture grain. The purpose of the texture analysis is to distinguish between classes that are spectrally similar but of different texture, e.g. bare soil from a built-up area, or low vegetation from a wooded area. Due to the use of granulometric analysis, based on mathematical morphology opening and closing, the results are resistant to the border effect (qualifying borders of objects in an image as areas of high texture), which affects other methods of texture analysis such as GLCM statistics or fractal analysis; therefore, the effectiveness of the analysis is relatively high. Several indices based on the values of different granulometric maps have been developed to simplify the extraction of classes of different texture. The third and final step of the process relies on a vegetation index based on the near-infrared and blue bands. Its purpose is to correct partially misclassified pixels. All the indices used in the classification model developed relate to reflectance values.
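The first, spectral step described above can be sketched as follows. This is a minimal numpy illustration; the NDVI thresholds and band values are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalised Difference Vegetation Index from reflectance bands."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

def spectral_step(nir, red, water_t=0.0, veg_t=0.4):
    """First-step labelling: 0 = water, 1 = non-vegetation, 2 = vegetation.
    Thresholds are illustrative, not those of the paper."""
    v = ndvi(nir, red)
    labels = np.ones_like(v, dtype=int)          # default: non-vegetation
    labels[v < water_t] = 0                      # water absorbs NIR strongly
    labels[v > veg_t] = 2                        # healthy vegetation
    return labels

nir = np.array([[0.05, 0.30], [0.60, 0.25]])     # toy 2x2 reflectance bands
red = np.array([[0.10, 0.20], [0.10, 0.22]])
labels = spectral_step(nir, red)
print(labels)                                    # → [[0 1] [2 1]]
```

The later granulometric and index-correction steps would refine the non-vegetation and vegetation masks produced here.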
Toward a common classification approach for biorefinery systems
Cherubini, F.; Jungmeier, G.; Wellisch, M.; Wilke, T.; Skiadas, I.; Ree, van R.; Jong, de E.
2009-01-01
This paper deals with a biorefinery classification approach developed within International Energy Agency (IEA) Bioenergy Task 42. Since production of transportation biofuels is seen as the driving force for future biorefinery developments, a selection of the most interesting transportation biofuels
An ensemble classification approach for improved Land use/cover change detection
Chellasamy, M.; Ferré, T. P. A.; Humlekrog Greve, M.; Larsen, R.; Chinnasamy, U.
2014-11-01
Change Detection (CD) methods based on post-classification comparison are claimed to provide potentially reliable results. They are considered the most obvious quantitative method in the analysis of Land Use / Land Cover (LULC) changes, as they provide "from-to" change information. However, the performance of post-classification comparison approaches depends strongly on the accuracy of the classification of the individual images used for comparison. Hence, we present a classification approach that produces accurate classification results, which in turn yield improved change detection results. Machine learning is part of a broader framework in change detection, in which neural networks have drawn much attention. Neural network algorithms adaptively estimate continuous functions from input data without a mathematical representation of the dependence of output on input. A common practice for classification is to use a Multi-Layer Perceptron (MLP) neural network with the backpropagation learning algorithm for prediction. To increase the ability of learning and prediction, multiple inputs (spectral, texture, topography, and multi-temporal information) are generally stacked to incorporate diversity of information. On the other hand, the literature reports that the backpropagation algorithm exhibits weak and unstable learning when multiple inputs are used and complex datasets with mixed uncertainty levels are involved. To address the problem of learning complex information, we propose an ensemble classification technique that incorporates multiple inputs for classification, unlike the traditional stacking of multiple input data. In this paper, we present an Endorsement Theory based ensemble classification that integrates multiple sources of information, in terms of prediction probabilities, to produce the final classification results. Three different input datasets are used in this study: spectral, texture and indices, from SPOT-4 multispectral imagery captured in 1998 and 2003. Each SPOT image is classified
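The core idea, fusing per-dataset prediction probabilities into a final label rather than stacking raw inputs, can be sketched roughly as below. A plain weighted average stands in for the paper's Endorsement-Theory combination, so this illustrates only the probability-fusion step, not the authors' method.

```python
import numpy as np

def combine_probabilities(prob_list, weights=None):
    """Fuse per-dataset class probabilities into a final decision.

    prob_list: list of (n_pixels, n_classes) probability arrays, one per
    input dataset (e.g. spectral, texture, indices). A weighted average
    stands in here for an Endorsement-Theory combination.
    """
    probs = np.stack(prob_list)                  # (n_datasets, n_pixels, n_classes)
    if weights is None:
        weights = np.ones(len(prob_list)) / len(prob_list)
    fused = np.tensordot(weights, probs, axes=1) # weighted average over datasets
    return fused.argmax(axis=1), fused           # hard labels + fused probabilities

# Toy per-pixel probabilities from three hypothetical per-dataset classifiers
spectral = np.array([[0.7, 0.3], [0.4, 0.6]])
texture  = np.array([[0.6, 0.4], [0.2, 0.8]])
indices  = np.array([[0.8, 0.2], [0.5, 0.5]])
labels, fused = combine_probabilities([spectral, texture, indices])
print(labels)    # class receiving the strongest combined support per pixel
```

Keeping the fusion at the probability level lets each input dataset be modelled by the learner best suited to it, which is the motivation the abstract gives for avoiding naive input stacking.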
Analysis of approaches to classification of forms of non-standard employment
Directory of Open Access Journals (Sweden)
N. V. Dorokhova
2017-01-01
Full Text Available Non-standard forms of employment are currently becoming more widespread, yet there is no clear approach to defining and characterizing non-standard employment. The article analyzes diverse interpretations of the concept, on the basis of which the author concludes that non-standard employment is a complex and contradictory economic category. Different approaches to the classification of forms of non-standard employment are examined. The main forms of non-standard employment considered are the flexible working year, flexible working week, flexible working hours, remote work, on-call work, shift forwarding, agency employment, self-employment, negotiator, underemployment, over-employment, employment on the basis of fixed-term contracts, employment based on contracts of a civil-legal nature, one-time employment, casual employment, temporary employment, secondary employment, and part-time employment. The author's approach to the classification of non-standard forms of employment is based on identifying the impact of atypical employment on the development of human potential. For the purpose of classifying non-standard employment forms from the standpoint of their impact on human development, the following classification criteria are proposed: working conditions, wages and social guarantees, the possibility of workers' participation in management, personal development, and employment stability. Depending on the value each of these criteria takes, a given form of non-standard employment can be characterized as progressive or regressive. The classification of non-standard forms of employment should form the basis of state employment policy.
Toward a common classification approach for biorefinery systems
DEFF Research Database (Denmark)
Cherubini, Francesco; Jungmeier, Gerfried; Wellisch, Maria
2009-01-01
until 2020 is based on their characteristics to be mixed with gasoline, diesel and natural gas, reflecting the main advantage of using the already-existing infrastructure for easier market introduction. This classification approach relies on four main features: (1) platforms; (2) products; (3) feedstock...
Decision Making under Ecological Regime Shift: An Experimental Economic Approach
Kawata, Yukichika
2011-01-01
Environmental economics postulates the assumption of homo economicus and presumes that externality occurs as a result of the rational economic activities of economic agents. This paper examines this assumption using an experimental economic approach in the context of regime shift, which has been receiving increasing attention. We observe that when externality does not exist, economic agents (the subjects of the experiment) act economically rationally, but when externality exists, economic agents avoi...
Praskievicz, S. J.; Luo, C.
2017-12-01
Classification of rivers is useful for a variety of purposes, such as generating and testing hypotheses about watershed controls on hydrology, predicting hydrologic variables for ungaged rivers, and setting goals for river management. In this research, we present a bottom-up (based on machine learning) river classification designed to investigate the underlying physical processes governing rivers' hydrologic regimes. The classification was developed for the entire state of Alabama, based on 248 United States Geological Survey (USGS) stream gages that met criteria for length and completeness of records. Five dimensionless hydrologic signatures were derived for each gage: slope of the flow duration curve (indicator of flow variability), baseflow index (ratio of baseflow to average streamflow), rising limb density (number of rising limbs per unit time), runoff ratio (ratio of long-term average streamflow to long-term average precipitation), and streamflow elasticity (sensitivity of streamflow to precipitation). We used a Bayesian clustering algorithm to classify the gages, based on the five hydrologic signatures, into distinct hydrologic regimes. We then used classification and regression trees (CART) to predict each gaged river's membership in different hydrologic regimes based on climatic and watershed variables. Using existing geospatial data, we applied the CART analysis to classify ungaged streams in Alabama, with the National Hydrography Dataset Plus (NHDPlus) catchment (average area 3 km2) as the unit of classification. The results of the classification can be used for meeting management and conservation objectives in Alabama, such as developing statewide standards for environmental instream flows. Such hydrologic classification approaches are promising for contributing to process-based understanding of river systems.
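Two of the five dimensionless signatures listed above can be computed along these lines. Exact definitions (which quantiles bound the flow duration curve slope, which units, how baseflow is filtered) vary across studies, so this numpy sketch is illustrative rather than the authors' formulation.

```python
import numpy as np

def runoff_ratio(streamflow, precip):
    """Ratio of long-term mean streamflow to long-term mean precipitation
    (both expressed in the same depth units, e.g. mm/day)."""
    return streamflow.mean() / precip.mean()

def fdc_slope(streamflow, lo=0.33, hi=0.66):
    """Slope of the flow duration curve between two exceedance quantiles,
    computed on log-transformed flows; a common indicator of flow
    variability."""
    q_hi = np.quantile(streamflow, 1 - lo)      # flow exceeded 33% of the time
    q_lo = np.quantile(streamflow, 1 - hi)      # flow exceeded 66% of the time
    return (np.log(q_hi) - np.log(q_lo)) / (hi - lo)

rng = np.random.default_rng(1)
flow = rng.lognormal(mean=0.0, sigma=1.0, size=365)   # synthetic daily flows
precip = rng.gamma(shape=2.0, scale=1.5, size=365)    # synthetic daily precipitation
rr = runoff_ratio(flow, precip)
slope = fdc_slope(flow)
print(rr, slope)
```

A vector of such signatures per gage is what the clustering step operates on; being dimensionless, the signatures let gages of very different drainage areas be compared directly.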
An Ultrasonic Pattern Recognition Approach to Welding Defect Classification
International Nuclear Information System (INIS)
Song, Sung Jin
1995-01-01
Classification of flaws in weldments from their ultrasonic scattering signals is very important in quantitative nondestructive evaluation. This problem is ideally suited to a modern ultrasonic pattern recognition technique. Here, a brief discussion of the systematic approach to this methodology is presented, including ultrasonic feature extraction, feature selection and classification. A stronger emphasis is placed on probabilistic neural networks as efficient classifiers for many practical classification problems. In an example, probabilistic neural networks are applied to classify flaws in weldments into three classes: cracks, porosity and slag inclusions. Probabilistic nets are shown to match the high performance of other classifiers without any training-time overhead. In addition, a forward selection scheme for sensitive features is addressed to enhance network performance.
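A probabilistic neural network is essentially a Parzen-window classifier: each training pattern contributes a Gaussian kernel, and a test pattern is assigned to the class with the largest summed kernel response, which is why no iterative training is needed. A minimal numpy sketch, on toy 2-D features rather than real ultrasonic features, is:

```python
import numpy as np

def pnn_classify(X_train, y_train, X_test, sigma=0.5):
    """Probabilistic neural network: a Parzen-window classifier with
    Gaussian kernels. There is no training phase beyond storing the
    patterns, hence no training-time overhead."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        d2 = ((X_train - x) ** 2).sum(axis=1)               # squared distances
        k = np.exp(-d2 / (2.0 * sigma ** 2))                # kernel activations
        scores = [k[y_train == c].mean() for c in classes]  # class likelihoods
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

# Toy "flaw" features for three classes: crack (0), porosity (1), slag (2)
rng = np.random.default_rng(2)
centers = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
X = np.vstack([centers[c] + 0.3 * rng.standard_normal((20, 2)) for c in range(3)])
y = np.repeat([0, 1, 2], 20)
preds = pnn_classify(X, y, centers)
print(preds)          # → [0 1 2]
```

The smoothing parameter sigma plays the role of the PNN spread; in practice it is tuned, e.g. by cross-validation.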
A Cognitive Computing Approach for Classification of Complaints in the Insurance Industry
Forster, J.; Entrup, B.
2017-10-01
In this paper we present and evaluate a cognitive computing approach for the classification of dissatisfaction and of four complaint-specific classes in correspondence documents between insurance clients and an insurance company. The cognitive computing approach combines classical natural language processing methods, machine learning algorithms and the evaluation of hypotheses. It couples a MaxEnt machine learning algorithm with language modelling, tf-idf and sentiment analytics to create a multi-label text classification model. The model is trained and tested with a set of 2500 original insurance communication documents written in German, which have been manually annotated by the partnering insurance company. With an F1-score of 0.9, a reliable text classification component has been implemented and evaluated. An outlook towards a cognitive computing insurance assistant concludes the paper.
A Novel Imbalanced Data Classification Approach Based on Logistic Regression and Fisher Discriminant
Directory of Open Access Journals (Sweden)
Baofeng Shi
2015-01-01
Full Text Available We introduce an imbalanced data classification approach based on logistic regression significant discriminant and Fisher discriminant. First of all, a key-indicator extraction model based on logistic regression significant discriminant and correlation analysis is derived to extract features for customer classification. Secondly, on the basis of linear weighting using the Fisher discriminant, a customer scoring model is established. Then a customer rating model, in which the number of customers across ratings follows a normal distribution, is constructed. The performance of the proposed model and of the classical SVM classification method is evaluated in terms of the ability to correctly classify consumers as default or non-default customers. Empirical results using the data of 2157 customers in financial engineering suggest that the proposed approach performs better than the SVM model in dealing with imbalanced data classification. Moreover, our approach contributes to locating the qualified customers for the banks and the bond investors.
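The Fisher-discriminant scoring step described above can be sketched in numpy. The feature data below are synthetic and the logistic-regression indicator-selection step is omitted, so this only illustrates how a linear credit-scoring direction might be obtained:

```python
import numpy as np

def fisher_weights(X, y):
    """Fisher linear discriminant direction for a binary problem:
    w = Sw^{-1} (mu1 - mu0), used here as a linear scoring weight vector."""
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)  # within-class scatter
    return np.linalg.solve(Sw, mu1 - mu0)

rng = np.random.default_rng(3)
X0 = rng.standard_normal((100, 3))                # non-default customers
X1 = rng.standard_normal((20, 3)) + [2, 1, 0]     # default customers (minority class)
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 20)
w = fisher_weights(X, y)
scores = X @ w                                    # one credit score per customer
print(scores.shape)
```

Because the direction maximises between-class separation relative to within-class scatter, scores remain discriminative even with the 5:1 class imbalance built into the toy data.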
Diagnosing Unemployment: The 'Classification' Approach to Multiple Causation
Rodenburg, P.
2002-01-01
The establishment of appropriate policy measures for fighting unemployment has always been difficult since causes of unemployment are hard to identify. This paper analyses an approach used mainly in the 1960s and 1970s in economics, in which classification is used as a way to deal with such a
A Two-Step Classification Approach to Distinguishing Similar Objects in Mobile LIDAR Point Clouds
He, H.; Khoshelham, K.; Fraser, C.
2017-09-01
Nowadays, lidar is widely used in cultural heritage documentation, urban modeling, and driverless car technology for its fast and accurate 3D scanning ability. However, full exploitation of the potential of point cloud data for efficient and automatic object recognition remains elusive. Recently, feature-based methods have become very popular in object recognition on account of their good performance in capturing object details. Compared with global features describing the whole shape of the object, local features recording the fractional details are more discriminative and are applicable to object classes with considerable similarity. In this paper, we propose a two-step classification approach based on point feature histograms and the bag-of-features method for automatic recognition of similar objects in mobile lidar point clouds. Lamp posts, street lights and traffic signs are grouped as one category in the first-step classification because of their mutual similarity compared with trees and vehicles. A finer classification of the lamp posts, street lights and traffic signs, based on the result of the first step, is implemented in the second step. The proposed two-step classification approach is shown to yield a considerable improvement over the conventional one-step classification approach.
A novel deep learning approach for classification of EEG motor imagery signals.
Tabar, Yousef Rezaei; Halici, Ugur
2017-02-01
Signal classification is an important issue in brain computer interface (BCI) systems. Deep learning approaches have been used successfully in many recent studies to learn features and classify different types of data. However, the number of studies that employ these approaches in BCI applications is very limited. In this study we aim to use deep learning methods to improve the classification performance of EEG motor imagery signals. We investigate convolutional neural networks (CNN) and stacked autoencoders (SAE) to classify EEG motor imagery signals. A new form of input is introduced that combines time, frequency and location information extracted from the EEG signal, and it is used in a CNN with one 1D convolutional layer and one max-pooling layer. We also propose a new deep network that combines CNN and SAE: the features extracted by the CNN are classified through the deep network SAE. The classification performance obtained by the proposed method on BCI competition IV dataset 2b, in terms of kappa value, is 0.547, a 9% improvement over the winning algorithm of the competition. Our results show that deep learning methods provide better classification performance compared to other state-of-the-art approaches, and they can be applied successfully to BCI systems where the amount of data is large due to daily recording.
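The forward pass of the CNN front end described above, one 1D convolutional layer followed by one max-pooling layer, can be illustrated in plain numpy. The channel counts, filter sizes and random data are assumptions for illustration, not the paper's architecture:

```python
import numpy as np

def conv1d(x, kernels, bias):
    """Valid-mode 1D convolution with ReLU: x is (channels, length),
    kernels is (n_filters, channels, width)."""
    n_f, _, w = kernels.shape
    out_len = x.shape[1] - w + 1
    out = np.zeros((n_f, out_len))
    for f in range(n_f):
        for t in range(out_len):
            out[f, t] = np.sum(kernels[f] * x[:, t:t + w]) + bias[f]
    return np.maximum(out, 0.0)                    # ReLU activation

def maxpool1d(x, size=2):
    """Non-overlapping max pooling along the time axis."""
    trimmed = x[:, : x.shape[1] // size * size]
    return trimmed.reshape(x.shape[0], -1, size).max(axis=2)

rng = np.random.default_rng(4)
eeg = rng.standard_normal((3, 32))                 # 3 channels, 32 samples
kernels = 0.1 * rng.standard_normal((4, 3, 5))     # 4 filters of width 5
features = maxpool1d(conv1d(eeg, kernels, np.zeros(4)))
print(features.shape)                              # → (4, 14)
```

In the paper's pipeline, feature maps like these would be flattened and passed to the stacked autoencoder for the final classification.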
MULTI-TEMPORAL REMOTE SENSING IMAGE CLASSIFICATION - A MULTI-VIEW APPROACH
National Aeronautics and Space Administration — MULTI-TEMPORAL REMOTE SENSING IMAGE CLASSIFICATION - A MULTI-VIEW APPROACH VARUN CHANDOLA AND RANGA RAJU VATSAVAI Abstract. Multispectral remote sensing images have...
Sows’ activity classification device using acceleration data – A resource constrained approach
DEFF Research Database (Denmark)
Marchioro, Gilberto Fernandes; Cornou, Cécile; Kristensen, Anders Ringgaard
2011-01-01
This paper discusses the main architectural alternatives and design decisions in order to implement a sows’ activity classification model on electronic devices. The different possibilities are analyzed in practical and technical aspects, focusing on the implementation metrics, like cost......, performance, complexity and reliability. The target architectures are divided into: server based, where the main processing element is a central computer; and embedded based, where the processing is distributed on devices attached to the animals. The initial classification model identifies the activities...... of a heuristic classification approach, focusing on the resource constrained characteristics of embedded systems. The new approach classifies the activities performed by the sows with accuracy close to 90%. It was implemented as a hardware module that can easily be instantiated to provide preprocessed...
Data classification and MTBF prediction with a multivariate analysis approach
International Nuclear Information System (INIS)
Braglia, Marcello; Carmignani, Gionata; Frosolini, Marco; Zammori, Francesco
2012-01-01
The paper presents a multivariate statistical approach that supports the classification of mechanical components, subjected to specific operating conditions, in terms of the Mean Time Between Failure (MTBF). Assessing the influence of working conditions and/or environmental factors on the MTBF is a prerequisite for the development of an effective preventive maintenance plan. However, this task may be demanding, and it is generally performed with ad-hoc experimental methods lacking statistical rigor. To solve this common problem, a step-by-step multivariate data classification technique is proposed. Specifically, a set of structured failure data are classified in a meaningful way by means of: (i) cluster analysis, (ii) multivariate analysis of variance, (iii) feature extraction and (iv) predictive discriminant analysis. This makes it possible not only to define the MTBF of the analyzed components, but also to identify the working parameters that explain most of the variability of the observed data. The approach is finally demonstrated on 126 centrifugal pumps installed in an oil refinery plant; the results obtained demonstrate the quality of the final discrimination, in terms of data classification and failure prediction.
Domain Adaptation for Opinion Classification: A Self-Training Approach
Directory of Open Access Journals (Sweden)
Yu, Ning
2013-03-01
Full Text Available Domain transfer is a widely recognized problem for machine learning algorithms because models built upon one data domain generally do not perform well in another data domain. This is especially a challenge for tasks such as opinion classification, which often has to deal with insufficient quantities of labeled data. This study investigates the feasibility of self-training for dealing with the domain transfer problem in opinion classification by leveraging labeled data in non-target data domain(s) and unlabeled data in the target domain. Specifically, self-training is evaluated for effectiveness in sparse data situations and for feasibility for domain adaptation in opinion classification. Three types of Web content are tested: edited news articles, semi-structured movie reviews, and the informal and unstructured content of the blogosphere. Findings of this study suggest that, when there are limited labeled data, self-training is a promising approach for opinion classification, although the contributions vary across data domains. Significant improvement was demonstrated for the most challenging data domain, the blogosphere, when a domain transfer-based self-training strategy was implemented.
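The self-training loop described above can be sketched generically. A nearest-centroid learner stands in for the classifiers used in the study, and the softmax-over-distances confidence rule is an assumption, so this shows only the mechanics of pseudo-labelling target-domain data:

```python
import numpy as np

def self_train(X_src, y_src, X_tgt, rounds=5, conf=0.8):
    """Self-training with a nearest-centroid base learner. Each round,
    target-domain points classified with high confidence are
    pseudo-labelled and added to the training set; confidence is a
    softmax over negative squared centroid distances."""
    X, y = X_src.copy(), y_src.copy()
    classes = np.unique(y_src)
    pool = X_tgt.copy()                       # unlabeled target-domain data
    for _ in range(rounds):
        if len(pool) == 0:
            break
        centroids = np.array([X[y == c].mean(axis=0) for c in classes])
        d = ((pool[:, None, :] - centroids[None]) ** 2).sum(axis=2)
        p = np.exp(-d) / np.exp(-d).sum(axis=1, keepdims=True)
        keep = p.max(axis=1) > conf           # confident pseudo-labels only
        if not keep.any():
            break
        X = np.vstack([X, pool[keep]])
        y = np.concatenate([y, classes[p[keep].argmax(axis=1)]])
        pool = pool[~keep]                    # remove newly labelled points
    return np.array([X[y == c].mean(axis=0) for c in classes])

rng = np.random.default_rng(5)
X_src = np.vstack([rng.normal(-2, 0.5, (30, 2)), rng.normal(2, 0.5, (30, 2))])
y_src = np.array([0] * 30 + [1] * 30)
# Target domain: the same two opinion classes, slightly shifted
X_tgt = np.vstack([rng.normal(-1.5, 0.5, (30, 2)), rng.normal(2.5, 0.5, (30, 2))])
centroids = self_train(X_src, y_src, X_tgt)
print(centroids.shape)   # one adapted centroid per opinion class
```

The decision boundary drifts toward the target domain as confident target points accumulate, which is the intended effect when labeled target data are unavailable.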
Assessing the Approaches to Classification of the State Financial Control
Baraniuk Yurii R.
2017-01-01
The article is aimed at assessing the approaches to classification of the State financial control, as well as disclosing the relationships and differences between its forms, types and methods. The results of a comparative analysis of existing classifications of the State financial control are covered, and the substantiation of its identification by forms, types and methods of control is explored. Clarification of the interpretation of the concepts of «form of control», «type of control», «sub...
Classification of cancerous cells based on the one-class problem approach
Murshed, Nabeel A.; Bortolozzi, Flavio; Sabourin, Robert
1996-03-01
One of the most important factors in reducing the effect of cancerous diseases is early diagnosis, which requires a good and robust method. With the advancement of computer technologies and digital image processing, the development of a computer-based system has become feasible. In this paper, we introduce a new approach for the detection of cancerous cells based on the one-class problem approach, through which the classification system need only be trained with patterns of cancerous cells. This reduces the burden of the training task by about 50%. Based on this approach, a computer-based classification system is developed using Fuzzy ARTMAP neural networks. Experiments were performed using a set of 542 patterns taken from a sample of breast cancer. Results of the experiment show 98% correct identification of cancerous cells and 95% correct identification of non-cancerous cells.
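The one-class idea, training only on cancerous-cell patterns and flagging anything sufficiently dissimilar, can be illustrated with a simple novelty detector. A nearest-neighbour distance threshold stands in for the Fuzzy ARTMAP network of the paper, and the data are synthetic:

```python
import numpy as np

class OneClassNN:
    """One-class classifier trained only on positive (cancerous) patterns.
    A distance-to-nearest-training-pattern threshold stands in for the
    Fuzzy ARTMAP network used in the paper."""

    def fit(self, X, quantile=0.95):
        self.X = X
        # Threshold from leave-one-out nearest-neighbour distances
        d = np.sqrt(((X[:, None] - X[None]) ** 2).sum(-1))
        np.fill_diagonal(d, np.inf)
        self.thresh = np.quantile(d.min(axis=1), quantile)
        return self

    def predict(self, Z):
        d = np.sqrt(((Z[:, None] - self.X[None]) ** 2).sum(-1)).min(axis=1)
        return (d <= self.thresh).astype(int)   # 1 = cancerous-like

rng = np.random.default_rng(6)
cancer = rng.normal(0, 1, (200, 4))              # training: cancerous patterns only
clf = OneClassNN().fit(cancer)
other = rng.normal(8, 1, (50, 4))                # dissimilar, non-cancerous patterns
print(clf.predict(other).mean())                 # fraction flagged as cancerous
```

As in the abstract, only one class needs labelled training data; everything outside the learned region is treated as non-cancerous, halving the labelling effort.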
Precipitation regime classification for the Mojave Desert: Implications for fire occurrence
Tagestad, Jerry; Brooks, Matthew L.; Cullinan, Valerie; Downs, Janelle; McKinley, Randy
2016-01-01
Long periods of drought or above-average precipitation affect Mojave Desert vegetation condition, biomass and susceptibility to fire. Changes in the seasonality of precipitation alter the likelihood of lightning, a key ignition source for fires. The objectives of this study were to characterize the relationship between recent, historic, and future precipitation patterns and fire. Classifying monthly precipitation data from 1971 to 2010 reveals four precipitation regimes: low winter/low summer, moderate winter/moderate summer, high winter/low summer and high winter/high summer. The two regimes with summer monsoonal precipitation covered only 40% of the Mojave Desert ecoregion but contain 88% of the area burned and 95% of the repeat-burn area. Classifying historic precipitation for early-century (wet) and mid-century (drought) periods reveals distinct shifts in regime boundaries. The early-century results are similar to the current pattern, while the mid-century results show a sizeable reduction in the area of regimes with a strong monsoonal component. Such a shift would suggest that fires during the mid-century period were minimal, and anecdotal records confirm this. Predicted precipitation patterns from downscaled global climate models indicate numerous epochs of high winter precipitation, implying higher fire potential for many multi-decade periods during the next century.
Directory of Open Access Journals (Sweden)
Frank Pennekamp
The development of video-based monitoring methods allows for rapid, dynamic and accurate monitoring of individuals or communities compared to slower traditional methods, with far-reaching ecological and evolutionary applications. Large amounts of data are generated using video-based methods, which can be effectively processed into meaningful ecological information using machine learning (ML) algorithms. ML uses user-defined classes (e.g. species), derived from a subset (i.e. training data) of video-observed quantitative features (e.g. phenotypic variation), to infer classes in subsequent observations. However, phenotypic variation often changes due to environmental conditions, which may lead to poor classification if environmentally induced variation in phenotypes is not accounted for. Here we describe a framework for classifying species under changing environmental conditions based on random forest classification. A sliding window approach was developed that restricts the training data to similar temporal and environmental conditions to improve the classification. We tested our approach by applying the classification framework to experimental data. The experiment used a set of six ciliate species to monitor changes in community structure and behavior over hundreds of generations, in dozens of species combinations and across a temperature gradient. Differences in biotic and abiotic conditions caused simplistic classification approaches to be unsuccessful. In contrast, the sliding window approach allowed classification to be highly successful, as phenotypic differences driven by environmental change could be captured by the classifier. Importantly, classification using the random forest algorithm showed comparable success when validated against traditional, slower manual identification. Our framework allows for reliable classification in dynamic environments and may help to improve strategies for long-term monitoring of species in changing environments.
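The sliding-window idea can be sketched in a few lines: before classifying a sample, keep only training observations recorded under similar environmental conditions (here, a temperature window), so that environmentally induced phenotype drift does not mislead the classifier. A nearest-centroid rule stands in for the random forest used in the study, and all data are hypothetical.

```python
# Sliding-window classification sketch: restrict the training set to a
# window of similar conditions, then classify by nearest class centroid.
# Stand-in for the random forest classifier described in the abstract.

def classify_windowed(sample, train, window=2.0):
    """sample: (temperature, feature); train: list of (temperature, feature, species)."""
    temp, feat = sample
    near = [(f, s) for t, f, s in train if abs(t - temp) <= window]
    by_species = {}
    for f, s in near:
        by_species.setdefault(s, []).append(f)
    centroids = {s: sum(fs) / len(fs) for s, fs in by_species.items()}
    return min(centroids, key=lambda s: abs(centroids[s] - feat))

train = [
    (15, 1.0, "A"), (15, 2.0, "B"),   # at 15 C, species A is smaller than B
    (25, 2.0, "A"), (25, 3.0, "B"),   # at 25 C both grow; ranges overlap across temperatures
]
print(classify_windowed((15, 1.1), train))  # -> "A"
print(classify_windowed((25, 2.1), train))  # -> "A", even though feature 2.1 matches B's 15 C value
```

Without the window, the second sample would be ambiguous, because species B at 15 C has the same feature value as species A at 25 C; restricting the training data to the matching temperature resolves it.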
Fanaian, Safa; Graas, Susan; Jiang, Yong; van der Zaag, Pieter
2015-02-01
The flow regime of rivers, being an integral part of aquatic ecosystems, provides many important services benefiting humans in catchments. Past water resource developments characterized by river embankments and dams, however, were often dominated by one (or few) economic use(s) of water. This results in a dramatically changed flow regime negatively affecting the provision of other ecosystem services sustained by the river flow. This study is intended to demonstrate the value of alternative flow regimes in a river that is highly modified by the presence of large hydropower dams and reservoirs, explicitly accounting for a broad range of flow-dependent ecosystem services. In this study, we propose a holistic approach for conducting an ecological economic assessment of a river's flow regime. This integrates recent advances in the conceptualization and classification of ecosystem services (UK NEA, 2011) with the flow regime evaluation technique developed by Korsgaard (2006). This integrated approach allows for a systematic comparison of the economic values of alternative flow regimes, including those that are considered beneficial for aquatic ecosystems. As an illustration, we applied this combined approach to the Lower Zambezi Basin, Mozambique. Empirical analysis shows that even though re-operating dams to create environmentally friendly flow regimes reduces hydropower benefits, the gains to goods derived from the aquatic ecosystem may offset the forgone hydropower benefits, thereby increasing the total economic value of river flow to society. The proposed integrated flow assessment approach can be a useful tool for welfare-improving decision-making in managing river basins. Copyright © 2014 Elsevier B.V. All rights reserved.
A three-way approach for protein function classification.
Directory of Open Access Journals (Sweden)
Hafeez Ur Rehman
The knowledge of protein functions plays an essential role in understanding biological cells and has a significant impact on human life in areas such as personalized medicine, better crops and improved therapeutic interventions. Due to the expense and inherent difficulty of biological experiments, intelligent methods are generally relied upon for automatic assignment of functions to proteins. The technological advancements in the field of biology are improving our understanding of biological processes and are regularly resulting in new features and characteristics that better describe the role of proteins. These anticipated features cannot be neglected or overlooked in designing more effective classification techniques. A key issue in this context, which is not being sufficiently addressed, is how to build effective classification models and approaches for protein function prediction that incorporate and take advantage of the ever-evolving biological information. In this article, we propose a three-way decision making approach which provides provisions for seeking and incorporating future information. We considered probabilistic rough set based models such as Game-Theoretic Rough Sets (GTRS) and Information-Theoretic Rough Sets (ITRS) for inducing three-way decisions. An architecture of protein function classification with probabilistic rough set based three-way decisions is proposed and explained. Experiments are carried out on a Saccharomyces cerevisiae species dataset obtained from the Uniprot database, with the corresponding functional classes extracted from the Gene Ontology (GO) database. The results indicate that as the level of biological information increases, the number of deferred cases is reduced while maintaining a similar level of accuracy.
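The core of a three-way decision is easy to sketch: a prediction with high probability is accepted, one with low probability is rejected, and the intermediate cases are deferred until more biological information arrives. The thresholds would come from GTRS/ITRS models in the article; the constants below are purely illustrative.

```python
# Three-way decision sketch: accept / reject / defer by probability thresholds.
# ALPHA and BETA are hypothetical constants, not values derived from GTRS/ITRS.

ALPHA, BETA = 0.75, 0.35  # require ALPHA > BETA

def three_way(prob):
    """Decide on a predicted protein function with estimated probability `prob`."""
    if prob >= ALPHA:
        return "accept"    # confidently assign the function
    if prob <= BETA:
        return "reject"    # confidently rule it out
    return "defer"         # boundary region: wait for future information

print(three_way(0.9))   # accept
print(three_way(0.5))   # defer
print(three_way(0.1))   # reject
```

The deferred (boundary) region is exactly where newly arriving features can later tip a case into accept or reject, which is why its size shrinks as the level of biological information increases.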
Liu, Haijian; Wu, Changshan
2018-06-01
Crown-level tree species classification is a challenging task due to the spectral similarity among different tree species. Shadow, underlying objects, and other materials within a crown may decrease the purity of extracted crown spectra and further reduce classification accuracy. To address this problem, an innovative pixel-weighting approach was developed for tree species classification at the crown level. The method utilized high-density discrete LiDAR data for individual tree delineation and Airborne Imaging Spectrometer for Applications (AISA) hyperspectral imagery for pure crown-scale spectra extraction. Specifically, three steps were included: 1) individual tree identification using LiDAR data, 2) pixel-weighted representative crown spectra calculation using hyperspectral imagery, with pixel-based illuminated-leaf fractions estimated using a linear spectral mixture analysis (LSMA) employed as weighting factors, and 3) representative-spectra-based tree species classification using a support vector machine (SVM) approach. Analysis of the results suggests that the developed pixel-weighting approach (OA = 82.12%, Kc = 0.74) performed better than treetop-based (OA = 70.86%, Kc = 0.58) and pixel-majority methods (OA = 72.26%, Kc = 0.62) in terms of classification accuracy. McNemar tests indicated that the differences in accuracy between the pixel-weighting and treetop-based approaches, as well as between the pixel-weighting and pixel-majority approaches, were statistically significant.
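Step 2 above reduces to a weighted average: each pixel's spectrum contributes to the crown's representative spectrum in proportion to its illuminated-leaf fraction, so shadow and understory pixels are down-weighted. A minimal sketch, with hypothetical two-band spectra (the real method uses LSMA-derived fractions over full AISA hyperspectral bands):

```python
# Pixel-weighting sketch: representative crown spectrum as the weighted
# average of pixel spectra, weighted by illuminated-leaf fraction.

def weighted_crown_spectrum(pixels):
    """pixels: list of (illuminated_leaf_fraction, spectrum) pairs."""
    total = sum(w for w, _ in pixels)
    n_bands = len(pixels[0][1])
    return [sum(w * s[b] for w, s in pixels) / total for b in range(n_bands)]

crown = [
    (0.9, [0.40, 0.50]),  # mostly sunlit leaf pixel: dominates the average
    (0.1, [0.05, 0.10]),  # mostly shadow pixel: contributes little
]
spec = weighted_crown_spectrum(crown)
print([round(v, 3) for v in spec])  # -> [0.365, 0.46]
```

A simple pixel-majority method would treat both pixels equally, pulling the crown spectrum toward the shadow values; the weighting keeps it close to the sunlit-leaf signature that actually discriminates species.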
Ahmed, H. M.; Al-azawi, R. J.; Abdulhameed, A. A.
2018-05-01
Huge efforts have been put into developing diagnostic methods for skin cancer. In this paper, two different approaches are addressed for detecting skin cancer in dermoscopy images. The first is a global approach that uses global features for classifying skin lesions, whereas the second is a local approach that uses local features. The aim of this paper is to select the best approach for skin lesion classification. The dataset used in this paper consists of 200 dermoscopy images from Pedro Hispano Hospital (PH2). The achieved results are: sensitivity of about 96%, specificity of about 100%, precision of about 100%, and accuracy of about 97% for the globalization approach, while the localization approach achieves about 100% on all four measures. These results show that the localization approach achieves acceptable accuracy and performs better than the globalization approach for skin cancer lesion classification.
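The four reported measures follow directly from confusion-matrix counts, which is worth making explicit since papers sometimes mix them up. A quick sketch of the definitions; the counts below are hypothetical values chosen only to roughly mirror the global-method rates, not the actual PH2 results.

```python
# Confusion-matrix metrics sketch: tp/tn/fp/fn are hypothetical counts.

def metrics(tp, tn, fp, fn):
    return {
        "sensitivity": tp / (tp + fn),            # fraction of cancers detected
        "specificity": tn / (tn + fp),            # fraction of benign correctly cleared
        "precision":   tp / (tp + fp),            # fraction of positive calls that are right
        "accuracy":    (tp + tn) / (tp + tn + fp + fn),
    }

m = metrics(tp=48, tn=146, fp=0, fn=6)
print({k: round(v, 2) for k, v in m.items()})
```

With zero false positives, specificity and precision are both exactly 1.0 regardless of the other counts, which is why the two measures can coincide at 100% while sensitivity lags behind.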
A texton-based approach for the classification of lung parenchyma in CT images
DEFF Research Database (Denmark)
Gangeh, Mehrdad J.; Sørensen, Lauge; Shaker, Saher B.
2010-01-01
In this paper, a texton-based classification system based on raw pixel representation along with a support vector machine with radial basis function kernel is proposed for the classification of emphysema in computed tomography images of the lung. The proposed approach is tested on 168 annotated...... regions of interest consisting of normal tissue, centrilobular emphysema, and paraseptal emphysema. The results show the superiority of the proposed approach to common techniques in the literature including moments of the histogram of filter responses based on Gaussian derivatives. The performance...
A hybrid ensemble learning approach to star-galaxy classification
Kim, Edward J.; Brunner, Robert J.; Carrasco Kind, Matias
2015-10-01
There exist a variety of star-galaxy classification techniques, each with its own strengths and weaknesses. In this paper, we present a novel meta-classification framework that combines and fully exploits different techniques to produce a more robust star-galaxy classification. To demonstrate this hybrid, ensemble approach, we combine a purely morphological classifier, a supervised machine learning method based on random forests, an unsupervised machine learning method based on self-organizing maps, and a hierarchical Bayesian template-fitting method. Using data from the CFHTLenS survey (Canada-France-Hawaii Telescope Lensing Survey), we consider different scenarios: when a high-quality training set is available with spectroscopic labels from DEEP2 (Deep Extragalactic Evolutionary Probe Phase 2), SDSS (Sloan Digital Sky Survey), VIPERS (VIMOS Public Extragalactic Redshift Survey), and VVDS (VIMOS VLT Deep Survey), and when the demographics of sources in a low-quality training set do not match the demographics of objects in the test data set. We demonstrate that our Bayesian combination technique improves the overall performance over any individual classification method in these scenarios. Thus, strategies that combine the predictions of different classifiers may prove to be optimal in currently ongoing and forthcoming photometric surveys, such as the Dark Energy Survey and the Large Synoptic Survey Telescope.
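One simple way to fuse heterogeneous classifiers, sketched below, is a naive-Bayes-style product rule: multiply the per-class probabilities from each classifier and renormalize. The paper's hierarchical Bayesian combination is more elaborate than this, and the probabilities shown are hypothetical.

```python
# Classifier-combination sketch: product rule over per-class probabilities.
# A simplified stand-in for the Bayesian combination described in the abstract.

def combine(prob_sets):
    """prob_sets: list of {class: probability} dicts, one per classifier."""
    classes = list(prob_sets[0].keys())
    raw = {c: 1.0 for c in classes}
    for probs in prob_sets:
        for c in classes:
            raw[c] *= probs[c]
    z = sum(raw.values())                     # renormalize to a distribution
    return {c: raw[c] / z for c in classes}

votes = [
    {"star": 0.6, "galaxy": 0.4},   # e.g. morphological classifier
    {"star": 0.7, "galaxy": 0.3},   # e.g. random forest
    {"star": 0.2, "galaxy": 0.8},   # e.g. template fitting, which disagrees
]
fused = combine(votes)
print(round(fused["star"], 3))  # -> 0.467
```

Note how the single confident dissenter pulls the fused probability below 0.5 even though two of three classifiers favor "star": the product rule rewards agreement in confidence, not just majority vote.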
Jonczyk, Jennine; Haygarth, Phil; Quinn, Paul; Reaney, Sim
2014-05-01
A high temporal resolution data set from the Eden Demonstration Test Catchment (DTC) project is used to investigate the processes causing pollution and the influence of the temporal sampling regime on the WFD classification of three catchments. This data highlights that WFD standards may not be fit for purpose. The Eden DTC project is part of a UK government-funded project designed to provide robust evidence regarding how diffuse pollution can be cost-effectively controlled to improve and maintain water quality in rural river catchments. The impact of multiple water quality parameters on ecosystems and sustainable food production is being studied at the catchment scale. Three focus catchments, approximately 10 km2 each, have been selected to represent the different farming practices and geophysical characteristics across the Eden catchment, Northern England. A field experimental programme has been designed to monitor the dynamics of agricultural diffuse pollution at multiple scales using state-of-the-art sensors providing continuous real-time data. The data set, which includes Total Phosphorus and Total Reactive Phosphorus, Nitrate, Ammonium, pH, Conductivity, Turbidity and Chlorophyll a, reveals the frequency and duration of nutrient concentration target exceedance, which arises from the prevalence of storm events of increasing magnitude. This data set is sub-sampled at different time intervals to explore how different sampling regimes affect our understanding of nutrient dynamics and the ramifications of the different regimes for WFD chemical status. This presentation seeks to identify an optimum temporal resolution of data for effective catchment management and to question the usefulness of the WFD status metric for determining the health of a system. Criteria based on high-frequency, short-duration events need to be accounted for.
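The sub-sampling experiment can be sketched directly: thin a high-frequency concentration series to coarser intervals and recompute the fraction of samples exceeding a target. Short storm-driven exceedance events can vanish entirely under sparse sampling, which is the crux of the argument above. The series and target below are hypothetical.

```python
# Sub-sampling sketch: how the sampling interval changes an exceedance estimate.

def exceedance_fraction(series, target, step=1):
    """Fraction of retained samples above target, keeping every step-th value."""
    sub = series[::step]
    return sum(v > target for v in sub) / len(sub)

# hypothetical hourly concentrations with one short storm pulse
series = [0.1] * 20 + [0.9, 0.8, 0.7] + [0.1] * 20
print(exceedance_fraction(series, 0.5, step=1))   # high-frequency view sees the pulse
print(exceedance_fraction(series, 0.5, step=24))  # -> 0.0: daily sampling misses it entirely
```

A status classification computed from the daily regime would report zero exceedance here, illustrating why the choice of temporal resolution can flip a WFD-style assessment.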
Assessing the Approaches to Classification of the State Financial Control
Directory of Open Access Journals (Sweden)
Baraniuk Yurii R.
2017-11-01
The article is aimed at assessing the approaches to classification of the State financial control, as well as disclosing the relationships and differences between its forms, types and methods. The results of a comparative analysis of existing classifications of the State financial control are covered, and the substantiation of its identification by forms, types and methods of control is explored. Clarified interpretations of the concepts of «form of control», «type of control», «subtype of control», «method of control» and «methodical reception of control» are provided. It is determined that the form of the State financial control is a manifestation of the internal organization of control and the methods of its carrying out; a model of classification of the State financial control is substantiated; attributes of the first and second order are allocated; the substantiation of methods and techniques is improved; and their composition and structure are identified. This approach allows general questions of the State financial control to be divided into theoretical and practical ones and, taking into consideration the expansion of the list of objects of the State financial control, will help to improve its methodology.
A practicable approach for periodontal classification
Mittal, Vishnu; Bhullar, Raman Preet K.; Bansal, Rachita; Singh, Karanprakash; Bhalodi, Anand; Khinda, Paramjit K.
2013-01-01
The diagnosis and classification of periodontal diseases has long remained a dilemma. Two distinct concepts have been used to define diseases: essentialism and nominalism. The essentialistic concept implies the real existence of disease, whereas the nominalistic concept states that the names of diseases are a convenient way of stating concisely the endpoint of a diagnostic process. It generally advances from assessment of symptoms and signs toward knowledge of causation, and gives a feasible option to name a disease whose etiology is either unknown or too complex to assess in routine clinical practice. Various classifications have been proposed by the American Academy of Periodontology (AAP) in 1986, 1989 and 1999. The AAP 1999 classification is among the most widely used, but it too has demerits that impede its use in day-to-day practice. Hence a classification and diagnostic system is required which can help the clinician to assess the patient's needs and provide suitable treatment in harmony with the diagnosis for that particular case. Here is an attempt to propose a practicable classification and diagnostic system of periodontal diseases for better treatment outcomes. PMID:24379855
Optimization of a Non-traditional Unsupervised Classification Approach for Land Cover Analysis
Boyd, R. K.; Brumfield, J. O.; Campbell, W. J.
1982-01-01
The conditions under which a hybrid of clustering and canonical analysis for image classification produces optimum results were analyzed. The approach involves generating classes by clustering for input to canonical analysis. The importance of the number of clusters input and the effect of other parameters of the clustering algorithm (ISOCLS) were examined. The approach derives its final result by clustering the canonically transformed data, so the importance of the number of clusters requested in this final stage was also examined. The effects of these variables were studied in terms of the average separability (as measured by transformed divergence) of the final clusters, the transformation matrices resulting from different numbers of input classes, and the accuracy of the final classifications. The research was performed with LANDSAT MSS data over the Hazleton/Berwick, Pennsylvania area. Final classifications were compared pixel by pixel with an existing geographic information system to provide an indication of their accuracy.
Effect of a standardized treatment regime for infection after osteosynthesis.
Hellebrekers, Pien; Leenen, Luke P H; Hoekstra, Meriam; Hietbrink, Falco
2017-03-09
Infection after osteosynthesis is an important complication with significant morbidity and even mortality. These infections are often caused by biofilm-producing bacteria. Treatment algorithms dictate an aggressive approach with surgical debridement and antibiotic treatment. The aim of this study is to analyze the effect of such an aggressive standardized treatment regime with implant retention for acute infection after osteosynthesis. The regime consisted of implant retention, thorough surgical debridement, and immediate antibiotic combination therapy with rifampicin. The primary outcome was success, defined as consolidation of the fracture and resolved symptoms of infection. Culture and susceptibility testing were performed to identify bacteria and resistance patterns. Univariate analysis was conducted on patient-related factors in association with primary success and antibiotic resistance. Forty-nine patients were included for analysis. The primary success rate was 63% and the overall success rate 88%. Factors negatively associated with primary success were the following: Gustilo classification (P = 0.023), a higher number of debridements needed (P = 0.015), inability of primary closure (P = 0.017), and subsequent application of vacuum therapy (P = 0.030). Adherence to the treatment regime was positively related to primary success (P = 0.034). The described treatment protocol results in high success rates, comparable with the success rates achieved with staged exchange in prosthetic joint infection treatment.
A study of earthquake-induced building detection by object oriented classification approach
Sabuncu, Asli; Damla Uca Avci, Zehra; Sunar, Filiz
2017-04-01
Among natural hazards, earthquakes are the most destructive disasters, causing huge loss of life, heavy infrastructure damage and great financial losses every year all around the world. According to earthquake statistics, more than a million earthquakes occur each year, equal to roughly two earthquakes per minute. Since 2001, natural disasters have caused more than 780,000 deaths, approximately 60% of which were due to earthquakes. A great earthquake took place at 38.75 N 43.36 E in Van Province in the eastern part of Turkey on October 23rd, 2011. 604 people died and about 4000 buildings were seriously damaged or collapsed in this earthquake. In recent years, the use of the object-oriented classification approach based on different object features, such as spectral, textural, shape and spatial information, has gained importance and become widespread for the classification of high-resolution satellite images and orthophotos. The motivation of this study is to detect the collapsed buildings and debris areas after the earthquake by using very high-resolution satellite images and orthophotos with object-oriented classification, and also to see how well remote sensing technology performed in determining the collapsed buildings. In this study, two different land surfaces were selected as homogeneous and heterogeneous case study areas. In the first step of the application, multi-resolution segmentation was applied and optimum parameters were selected to obtain the objects in each area after testing different color/shape and compactness/smoothness values. In the next step, two different classification approaches, namely "supervised" and "unsupervised", were applied and their classification performances were compared. Object-Based Image Analysis (OBIA) was performed using eCognition software.
A Transform-Based Feature Extraction Approach for Motor Imagery Tasks Classification
Khorshidtalab, Aida; Mesbah, Mostefa; Salami, Momoh J. E.
2015-01-01
In this paper, we present a new motor imagery classification method in the context of electroencephalography (EEG)-based brain–computer interface (BCI). This method uses a signal-dependent orthogonal transform, referred to as linear prediction singular value decomposition (LP-SVD), for feature extraction. The transform defines the mapping as the left singular vectors of the LP coefficient filter impulse response matrix. Using a logistic tree-based model classifier, the extracted features are classified into one of four motor imagery movements. The proposed approach was first benchmarked against two related state-of-the-art feature extraction approaches, namely discrete cosine transform (DCT) and adaptive autoregressive (AAR)-based methods. By achieving an accuracy of 67.35%, the LP-SVD approach outperformed the other approaches by large margins (25% compared with DCT and 6% compared with AAR-based methods). To further improve the discriminatory capability of the extracted features and reduce the computational complexity, we enlarged the extracted feature subset by incorporating two extra features, namely the Q- and Hotelling's T^2 statistics of the transformed EEG, and introduced a new EEG channel selection method. The performance of the EEG classification based on the expanded feature set and channel selection method was compared with that of a number of state-of-the-art classification methods previously reported with the BCI IIIa competition data set. Our method came second with an average accuracy of 81.38%. PMID:27170898
Classification of innovations: approaches and consequences
Directory of Open Access Journals (Sweden)
Jakub Tabas
2011-01-01
Currently, innovations are perceived as the lifeblood of businesses. Even though innovations have the potential to transform companies or entire industries, they are highly risky; at the same time, they have become a necessity for companies' development and survival on the markets. In the theory it is rather difficult to find a comprehensive definition of innovation, and settling on a general definition becomes more and more difficult with the growing number of domains where innovations, or possible innovations, start to appear in the form of added value to something that already exists. The definition of innovation has come through a long process of development: from the early definition of Schumpeter, who connected innovation especially with changes in products or production processes, to recent definitions based on the added value for society. One possible approach to defining the content of innovation is to base the definition on a classification of innovations. In the article, the authors analyse existing classifications of innovations in order to define the general content of innovation and thereby confirm (or reject) the definition of innovation derived in their previous work, where they state that innovation is a change that leads to gaining profit for an individual, for a business entity, or for society, while the profit is not only the accounting one but the economic profit. The article is based especially on secondary research; the authors employ the method of analysis with the aim of confronting various classification-based definitions of innovation, together with the methods of comparison and synthesis.
Head Pose Estimation on Eyeglasses Using Line Detection and Classification Approach
Setthawong, Pisal; Vannija, Vajirasak
This paper proposes a unique approach for head pose estimation of subjects with eyeglasses by using a combination of line detection and classification approaches. Head pose estimation is considered an important non-verbal form of communication and could also be used in the area of Human-Computer Interaction. A major improvement of the proposed approach is that it allows estimation of head poses at high yaw/pitch angles when compared with existing geometric approaches, does not require expensive data preparation and training, and is generally fast when compared with other approaches.
About the high flow regime of the rivers of Kosovo and Metohia
Directory of Open Access Journals (Sweden)
Živković Nenad
2009-01-01
Using examples from Kosovo and Metohia, this paper points to some problems in the domain of hydrogeographic regionalization. The river water regime, especially the phase of high flows which marks this regime, has been the topic of almost all research treating the water resources of drainage basins. However, what has not been achieved so far is a unique solution by which rivers could be classified according to this feature. These examples show that neither some older methods, based on genetic analysis of hydrographs and of a global type, nor some recent ones, with a lot of quantitative input and regional approaches, can answer with certainty all the challenges which river regimes pose. This work shows that apart from the climatic, orographic and physiognomic features of drainage basins, the periods of data processing and the analysis of individual intra-annual series of discharges are very important as well. Discretization into time periods shorter than one month, as well as elimination of extreme discharge values from the long-term series, is recommended for future research.
Directory of Open Access Journals (Sweden)
A. V. Malov
2018-01-01
The review article reveals the content of the concept of the Food Regime, which is little known in the Russian academic literature. Based on a retrospective analysis, the author monitors and codifies the semantic dynamics of the term from its original interpretations to modern formulations. With the help of historical and comparative methods, the article rehabilitates the academic merits of D. Puchala and R. Hopkins, authors who used the concept of the Food Regime a few years before its universally recognized origin and official scientific debut. The author applies the method of ascending from the abstract to the concrete to demonstrate a classification of Food Regimes compiled on the basis of geopolitical interests in the sphere of international production, consumption and distribution of foodstuffs. The characteristic features of historically formed Food Regimes are described in chronological order, and modern tendencies possessing reformist potential are identified. In particular, it is established that the idea of Food Sovereignty (an alternative to the modern Corporate Food Regime) is the subject of acute academic disputes. The discussion between P. McMichael and H. Bernstein devoted to the "peasant question", the mobilization frame of the Food Sovereignty strategy, is analyzed using the secondary data processing method. Through this critical analysis, the author concludes that following the principles of the Food Sovereignty strategy is necessary to prevent the catastrophic prospects associated with ecosystem degradation, accelerated soil erosion, the complete disappearance of biodiversity and corporate autocracy. The author is convinced that the idea of Food Sovereignty can ward off the energetic liberalization of nature, the intensive privatization of life and the rapid monetization of unconditioned human reflexes.
A structuralist approach in the study of evolution and classification
Hammen, van der L.
1985-01-01
A survey is given of structuralism as a method that can be applied in the study of evolution and classification. The results of a structuralist approach are illustrated by examples from the laws underlying numerical changes, from the laws underlying changes in the chelicerate life-cycle, and from
International Nuclear Information System (INIS)
Esh, D.W.; Pinkston, K.E.; Barr, C.S.; Bradford, A.H.; Ridge, A.Ch.
2009-01-01
Nuclear Regulatory Commission (NRC) staff has developed a concentration averaging approach and guidance for the review of Department of Energy (DOE) non-HLW determinations. Although the approach was focused on this specific application, concentration averaging is generally applicable to waste classification and thus has implications for waste management decisions as discussed in more detail in this paper. In the United States, radioactive waste has historically been classified into various categories for the purpose of ensuring that the disposal system selected is commensurate with the hazard of the waste such that public health and safety will be protected. However, the risk from the near-surface disposal of radioactive waste is not solely a function of waste concentration but is also a function of the volume (quantity) of waste and its accessibility. A risk-informed approach to waste classification for near-surface disposal of low-level waste would consider the specific characteristics of the waste, the quantity of material, and the disposal system features that limit accessibility to the waste. NRC staff has developed example analytical approaches to estimate waste concentration, and therefore waste classification, for waste disposed in facilities or with configurations that were not anticipated when the regulation for the disposal of commercial low-level waste (i.e. 10 CFR Part 61) was developed. (authors)
Gallart, F.; Prat, N.; García-Roger, E. M.; Latron, J.; Rieradevall, M.; Llorens, P.; Barberá, G. G.; Brito, D.; de Girolamo, A. M.; Lo Porto, A.; Neves, R.; Nikolaidis, N. P.; Perrin, J. L.; Querner, E. P.; Quiñonero, J. M.; Tournoud, M. G.; Tzoraki, O.; Froebrich, J.
2011-10-01
Temporary streams are those water courses that undergo the recurrent cessation of flow or the complete drying of their channel. The biological communities in temporary stream reaches are strongly dependent on the temporal changes of the aquatic habitats determined by the hydrological conditions. The structural and functional characteristics of the aquatic fauna therefore cannot be used to assess the ecological quality of a temporary stream reach without taking into account the controls imposed by the hydrological regime. This paper develops some methods for analysing temporary streams' aquatic regimes, based on the definition of six aquatic states that summarize the sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: flood, riffles, connected, pools, dry and arid. We used the water discharge records from gauging stations, or simulations using rainfall-runoff models, to infer the temporal patterns of occurrence of these states in the aquatic states frequency graph we developed. The visual analysis of this graph is complemented by two metrics based on the permanence of flow and the seasonal predictability of zero-flow periods. Finally, a classification of the aquatic regimes of temporary streams in terms of their influence over the development of aquatic life is put forward, defining Permanent, Temporary-pools, Temporary-dry and Episodic regime types. All these methods were tested with data from eight temporary streams around the Mediterranean from the MIRAGE project, and their application was a precondition to assessing the ecological quality of these streams using the methods currently prescribed in the European Water Framework Directive for macroinvertebrate communities.
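The two flow-regime metrics described above can be sketched in a few lines. This is an illustrative reading of them (fraction of days with flow, and the monthly frequency of zero-flow days as a proxy for seasonal predictability), not the exact formulas from the MIRAGE work; the function names `flow_permanence` and `zero_flow_by_month` are hypothetical:

```python
from datetime import date, timedelta

def flow_permanence(discharge):
    """Fraction of days with non-zero flow (illustrative version of a
    flow-permanence metric; the paper's exact definition may differ)."""
    return sum(1 for q in discharge if q > 0) / len(discharge)

def zero_flow_by_month(dates, discharge):
    """Frequency of zero-flow days per calendar month (1..12), a simple
    stand-in for a seasonal-predictability analysis of dry periods."""
    counts = {m: 0 for m in range(1, 13)}
    totals = {m: 0 for m in range(1, 13)}
    for d, q in zip(dates, discharge):
        totals[d.month] += 1
        if q <= 0:
            counts[d.month] += 1
    return {m: counts[m] / totals[m] for m in range(1, 13) if totals[m]}

# Usage on a synthetic record: a stream that dries every July and August.
dates = [date(2000, 1, 1) + timedelta(days=i) for i in range(366)]
discharge = [0.0 if d.month in (7, 8) else 1.0 for d in dates]
print(flow_permanence(discharge))
print(zero_flow_by_month(dates, discharge)[7])
```

A regime with high flow permanence and concentrated, predictable zero-flow months would fall toward the Temporary-dry class in the paper's terms.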
Paul, Subir; Nagesh Kumar, D.
2018-04-01
Hyperspectral (HS) data comprise continuous spectral responses of hundreds of narrow spectral bands with very fine spectral resolution or bandwidth, which offer feature identification and classification with high accuracy. In the present study, a Mutual Information (MI) based Segmented Stacked Autoencoder (S-SAE) approach for spectral-spatial classification of HS data is proposed to reduce the complexity and computational time compared to Stacked Autoencoder (SAE) based feature extraction. A non-parametric dependency measure (MI) based spectral segmentation is proposed instead of linear and parametric dependency measures, in order to capture both linear and nonlinear inter-band dependency for spectral segmentation of the HS bands. Morphological profiles are then created corresponding to the segmented spectral features to assimilate the spatial information into the spectral-spatial classification approach. Two non-parametric classifiers, Support Vector Machine (SVM) with a Gaussian kernel and Random Forest (RF), are used for classification of the three most widely used HS datasets. Results of the numerical experiments show that SVM with a Gaussian kernel provides better results for the Pavia University and Botswana datasets, whereas RF performs better for the Indian Pines dataset. The experiments performed with the proposed methodology provide encouraging results compared to numerous existing approaches.
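The MI-based spectral segmentation idea can be sketched as follows. This is a minimal illustration, assuming a plug-in (histogram) MI estimator and a simple greedy rule that starts a new band segment whenever the MI between consecutive bands drops below a threshold; the paper's estimator and segmentation rule may differ, and both function names are hypothetical:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """MI (in nats) between two band vectors via a joint histogram
    (plug-in estimator)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def segment_bands(cube, threshold, bins=16):
    """Greedy segmentation of a (pixels x bands) matrix: break between
    consecutive bands whose MI falls below `threshold`."""
    n_pixels, n_bands = cube.shape
    breaks = [0]
    for b in range(1, n_bands):
        if mutual_information(cube[:, b - 1], cube[:, b], bins) < threshold:
            breaks.append(b)
    return [(s, e) for s, e in zip(breaks, breaks[1:] + [n_bands])]
```

Each resulting segment would then feed its own (stacked) autoencoder, which is what reduces the computational cost relative to one SAE over all bands.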
Energy-efficiency based classification of the manufacturing workstation
Frumuşanu, G.; Afteni, C.; Badea, N.; Epureanu, A.
2017-08-01
EU Directive 92/75/EC established for the first time an energy consumption labelling scheme, further implemented by several other directives. As a consequence, many products (e.g. home appliances, tyres, light bulbs, houses) nowadays carry an EU Energy Label when offered for sale or rent. Several energy consumption models of manufacturing equipment have also been developed. This paper proposes an energy-efficiency based classification of the manufacturing workstation, aiming to characterize its energetic behaviour. The concept of energy efficiency of the manufacturing workstation is defined. On this basis, a classification methodology has been developed. It refers to specific criteria and their evaluation modalities, together with the definition and delimitation of energy efficiency classes. The position of an energy class is defined by the amount of energy needed by the workstation at the middle point of its operating domain, while its extension is determined by the value of the first coefficient of the Taylor series that approximates the dependence between the energy consumption and the chosen parameter of the working regime. The main domain of interest for this classification appears to be the optimization of manufacturing activity planning and programming. A case study on classifying an actual lathe from the energy-efficiency point of view, based on two different approaches (analytical and numerical), is also included.
Directory of Open Access Journals (Sweden)
Íñigo Molina
2012-11-01
This paper proposes an optimization relaxation approach based on the analogue Hopfield Neural Network (HNN) for cluster refinement of pre-classified Polarimetric Synthetic Aperture Radar (PolSAR) image data. We consider the initial classification provided by the maximum-likelihood classifier based on the complex Wishart distribution, which is then supplied to the HNN optimization approach. The goal is to improve the classification results obtained by the Wishart approach. The classification improvement is verified by computing a cluster separability coefficient and a measure of homogeneity within the clusters. During the HNN optimization process, for each iteration and for each pixel, two consistency coefficients are computed, taking into account two types of relations between the pixel under consideration and its corresponding neighbors. Based on these coefficients and on the information coming from the pixel itself, the pixel under study is re-classified. Different experiments are carried out to verify that the proposed approach outperforms other strategies, achieving the best results in terms of separability and a trade-off with homogeneity, preserving relevant structures in the image. The performance is also measured in terms of computational central processing unit (CPU) time.
A Systematic Approach to Food Variety Classification as a Tool in ...
African Journals Online (AJOL)
A Systematic Approach to Food Variety Classification as a Tool in Dietary ... and food variety (count of all dietary items consumed during the recall period up to the ... This paper presents a pilot study carried out with an aim of demonstrating the ...
International Nuclear Information System (INIS)
Liu, Yaming; Chen, Sheng; Liu, Shi; Feng, Yongxin; Xu, Kai; Zheng, Chuguang
2016-01-01
MILD oxyfuel combustion has been attracting increasing attention as a promising clean combustion technology. How to design a pathway to reach the MILD oxyfuel combustion regime, and what can provide a theoretical guide for designing such a pathway, are two critical questions that need to be answered. So far there has been no open literature on these issues. A type of combustion regime classification map proposed in our previous work, based on the so-called "Hot Diluted Diffusion Ignition" (HDDI) configuration, is adopted here as a simple but useful tool to solve these problems. Firstly, we comprehensively analyze the influences of various dilution atmospheres and fuel types on combustion regimes. The combustion regime classification maps are constructed according to these analyses. Subsequently, we compare the map for the air-firing condition with its oxyfuel counterpart. With the aid of a second-law (entropy) analysis of the maps, it is easy to identify in advance the major contributors to entropy generation in the various combustion regimes, which is crucial for combustion system optimization. Moreover, we find for the first time that a combustion regime classification map may also be used as a safety indicator. With the aid of these maps, some conclusions in previous publications can be explained more straightforwardly. - Highlights: • Analyze the influences of different fuels and dilution atmospheres on combustion regimes for the first time. • Provide a theoretical guide for practical operation to establish MILD oxyfuel combustion for the first time. • A new finding that expands the use of combustion regime maps to practical operation and combustion optimization.
Improving oil classification quality from oil spill fingerprint beyond six sigma approach.
Juahir, Hafizan; Ismail, Azimah; Mohamed, Saiful Bahri; Toriman, Mohd Ekhwan; Kassim, Azlina Md; Zain, Sharifuddin Md; Ahmad, Wan Kamaruzaman Wan; Wah, Wong Kok; Zali, Munirah Abdul; Retnam, Ananthy; Taib, Mohd Zaki Mohd; Mokhtar, Mazlin
2017-07-15
This study applies quality engineering to oil spill classification based on oil spill fingerprinting from GC-FID and GC-MS, employing the six-sigma approach. The oil spills were recovered from various water areas of Peninsular Malaysia and Sabah (East Malaysia). The six-sigma methodology effectively serves as the problem-solving framework for oil classification extracted from the complex mixtures of the spilled-oil dataset. Linking the six-sigma analysis with quality engineering improved the organizational performance needed to achieve the objectives of environmental forensics. The study reveals that oil spills are discriminated into four groups, viz. diesel, hydrocarbon fuel oil (HFO), mixture of oil lubricant and fuel oil (MOLFO) and waste oil (WO), according to the similarity of their intrinsic chemical properties. Validation confirmed that the four discriminant components, diesel, HFO, MOLFO and WO, dominate the oil types with a total variance of 99.51%, with ANOVA giving F stat > F critical at the 95% confidence level and a Chi-square goodness-of-fit statistic of 74.87. The results of this study reveal that by employing the six-sigma approach in a data-driven problem such as oil spill classification, good decision making can be expedited. Copyright © 2017. Published by Elsevier Ltd.
A Biologically Inspired Approach to Frequency Domain Feature Extraction for EEG Classification
Directory of Open Access Journals (Sweden)
Nurhan Gursel Ozmen
2018-01-01
Classification of the electroencephalogram (EEG) signal is important in mental decoding for brain-computer interfaces (BCI). We introduce a feature extraction approach based on frequency domain analysis to improve the classification performance on different mental tasks using single-channel EEG. This biologically inspired method extracts the most discriminative spectral features from power spectral densities (PSDs) of the EEG signals. We applied our method to a dataset of six subjects who performed five different imagination tasks: (i) resting state, (ii) mental arithmetic, (iii) imagination of left-hand movement, (iv) imagination of right-hand movement, and (v) imagination of the letter "A". Pairwise and multiclass classifications were performed on single-channel EEG using Linear Discriminant Analysis and Support Vector Machines. Our method produced results (mean classification accuracy of 83.06% for binary classification and 91.85% for multiclass classification) that are on par with the state-of-the-art methods, using single-channel EEG with low computational cost. Among all task pairs, mental arithmetic versus letter imagination yielded the best result (mean classification accuracy of 90.29%), indicating that this task pair could be the most suitable pair for a binary-class BCI. This study contributes to the development of single-channel BCI, as well as to finding the best task pair for user-defined applications.
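The PSD-feature-plus-linear-classifier pipeline above can be sketched on synthetic signals. This is a generic illustration (Welch PSD, classic theta/alpha/beta band powers, LDA), not the paper's discriminative feature selection; the sampling rate `FS` and band edges are assumptions:

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 128  # sampling rate in Hz (assumed, not from the paper)

def band_powers(sig, bands=((4, 8), (8, 13), (13, 30))):
    """Mean Welch-PSD power in theta/alpha/beta bands: a generic
    frequency-domain feature vector for one EEG epoch."""
    freqs, psd = welch(sig, fs=FS, nperseg=FS)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]

# Synthetic stand-in for two mental tasks: alpha- vs beta-dominant signals.
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / FS)
make = lambda f: np.sin(2 * np.pi * f * t) + 0.5 * rng.normal(size=t.size)
X = np.array([band_powers(make(10)) for _ in range(40)]
             + [band_powers(make(20)) for _ in range(40)])
y = np.array([0] * 40 + [1] * 40)

clf = LinearDiscriminantAnalysis().fit(X[::2], y[::2])  # train on half
print(clf.score(X[1::2], y[1::2]))  # held-out accuracy; should be high
```

On real single-channel EEG the separation is of course far weaker, which is why the paper's feature selection step matters.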
DEFF Research Database (Denmark)
Chellasamy, Menaka; Ferre, Ty; Greve, Mogens Humlekrog
2016-01-01
Beginning in 2015, Danish farmers are obliged to meet specific crop diversification rules, based on total land area and the number of crops cultivated, to be eligible for new greening subsidies. Hence, there is a need for the Danish government to extend their subsidy control system to verify farmers' declarations to warrant greening payments under the new crop diversification rules. Remote Sensing (RS) technology has been used since 1992 to control farmers' subsidies in Denmark. However, a proper RS-based approach is yet to be finalised to validate the new crop diversity requirements designed for assessing compliance under the recent subsidy scheme (2014-2020). This study uses an ensemble classification approach (proposed by the authors in previous studies) for validating the crop diversity requirements of the new rules. The approach uses a neural network ensemble classification system with bi-temporal (spring
Delavarian, Mona; Towhidkhah, Farzad; Gharibzadeh, Shahriar; Dibajnia, Parvin
2011-07-12
Automatic classification of behavioral disorders with many similarities (e.g. in symptoms) will help psychiatrists to concentrate on the correct disorder and its treatment as soon as possible, to avoid wasting time on diagnosis, and to increase diagnostic accuracy. In this study, we tried to differentiate and classify (diagnose) 306 children with many similar symptoms and different behavioral disorders, such as ADHD, depression, anxiety, comorbid depression and anxiety, and conduct disorder, with high accuracy. Classification was based on the symptoms and their severity. After examining 16 different available classifiers using PRTools, we propose the nearest mean classifier as the most accurate classifier, with 96.92% accuracy in this research. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Sustainability in product development: a proposal for classification of approaches
Directory of Open Access Journals (Sweden)
Patrícia Flores Magnago
2012-06-01
Product development is a process that addresses sustainability issues inside companies. Many approaches concerning sustainability have been discussed in academia, such as Natural Capitalism, Design for Environment (DfE) and Life Cycle Analysis (LCA), but a question arises: which is indicated for what circumstance? The aim of this article is to propose a classification, based on a literature review, for 15 of these approaches. The criteria were: (i) approach nature, (ii) organization level, (iii) integration level in the Product Development Process (PDP), and (iv) approach relevance for the sustainability dimensions. Common terms allowed the establishment of connections among the approaches. As a result, the researchers concluded that, although the approaches come from distinct knowledge areas, they are not mutually exclusive; on the contrary, they may be used in a complementary way by managers. The combined use of complementary approaches is finally suggested in the paper.
A dictionary learning approach for human sperm heads classification.
Shaker, Fariba; Monadjemi, S Amirhassan; Alirezaie, Javad; Naghsh-Nilchi, Ahmad Reza
2017-12-01
To diagnose infertility in men, semen analysis is conducted in which sperm morphology is one of the factors that are evaluated. Since manual assessment of sperm morphology is time-consuming and subjective, automatic classification methods are being developed. Automatic classification of sperm heads is a complicated task due to the intra-class differences and inter-class similarities of class objects. In this research, a Dictionary Learning (DL) technique is utilized to construct a dictionary of sperm head shapes. This dictionary is used to classify the sperm heads into four different classes. Square patches are extracted from the sperm head images. Columnized patches from each class of sperm are used to learn class-specific dictionaries. The patches from a test image are reconstructed using each class-specific dictionary and the overall reconstruction error for each class is used to select the best matching class. Average accuracy, precision, recall, and F-score are used to evaluate the classification method. The method is evaluated using two publicly available datasets of human sperm head shapes. The proposed DL based method achieved an average accuracy of 92.2% on the HuSHeM dataset, and an average recall of 62% on the SCIAN-MorphoSpermGS dataset. The results show a significant improvement compared to a previously published shape-feature-based method. We have achieved high-performance results. In addition, our proposed approach offers a more balanced classifier in which all four classes are recognized with high precision and recall. In this paper, we use a Dictionary Learning approach in classifying human sperm heads. It is shown that the Dictionary Learning method is far more effective in classifying human sperm heads than classifiers using shape-based features. Also, a dataset of human sperm head shapes is introduced to facilitate future research. Copyright © 2017 Elsevier Ltd. All rights reserved.
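The reconstruction-error classification scheme described above (learn one dictionary per class, pick the class that reconstructs a test sample best) can be sketched with scikit-learn's `DictionaryLearning`. This is a minimal sketch on synthetic vectors standing in for columnized patches, not the authors' pipeline or datasets; the class geometry (two random 2-D subspaces) is invented to make the example self-contained:

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(1)
DIM = 16  # stand-in for a columnized image patch

def make_class(basis, n):
    """Samples in the 2-D subspace spanned by `basis`, plus slight noise."""
    return rng.normal(size=(n, 2)) @ basis + 0.01 * rng.normal(size=(n, DIM))

basis_a, basis_b = rng.normal(size=(2, DIM)), rng.normal(size=(2, DIM))
train = {"A": make_class(basis_a, 80), "B": make_class(basis_b, 80)}

# One class-specific dictionary per class, as in the paper's scheme.
dicts = {c: DictionaryLearning(n_components=2, transform_algorithm="omp",
                               transform_n_nonzero_coefs=2,
                               random_state=0).fit(x)
         for c, x in train.items()}

def classify(sample):
    """Assign the class whose dictionary reconstructs the sample best."""
    def err(d):
        code = d.transform(sample[None, :])
        return np.linalg.norm(sample - code @ d.components_)
    return min(dicts, key=lambda c: err(dicts[c]))
```

The real method works on square patches extracted from sperm-head images and sums patch-wise reconstruction errors per image, but the decision rule is the same min-error comparison.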
Gallart, F.; Prat, N.; García-Roger, E. M.; Latron, J.; Rieradevall, M.; Llorens, P.; Barberá, G. G.; Brito, D.; De Girolamo, A. M.; Lo Porto, A.; Buffagni, A.; Erba, S.; Neves, R.; Nikolaidis, N. P.; Perrin, J. L.; Querner, E. P.; Quiñonero, J. M.; Tournoud, M. G.; Tzoraki, O.; Skoulikidis, N.; Gómez, R.; Sánchez-Montoya, M. M.; Froebrich, J.
2012-09-01
Temporary streams are those water courses that undergo the recurrent cessation of flow or the complete drying of their channel. The structure and composition of biological communities in temporary stream reaches are strongly dependent on the temporal changes of the aquatic habitats determined by the hydrological conditions. Therefore, the structural and functional characteristics of the aquatic fauna cannot be used to assess the ecological quality of a temporary stream reach without taking into account the controls imposed by the hydrological regime. This paper develops methods for analysing temporary streams' aquatic regimes, based on the definition of six aquatic states that summarize the transient sets of mesohabitats occurring on a given reach at a particular moment, depending on the hydrological conditions: Hyperrheic, Eurheic, Oligorheic, Arheic, Hyporheic and Edaphic. When the hydrological conditions lead to a change in the aquatic state, the structure and composition of the aquatic community change according to the new set of available habitats. We used the water discharge records from gauging stations, or simulations with rainfall-runoff models, to infer the temporal patterns of occurrence of these states in the Aquatic States Frequency Graph we developed. The visual analysis of this graph is complemented by two metrics which describe the permanence of flow and the seasonal predictability of zero-flow periods. Finally, a classification of temporary streams into four aquatic regimes in terms of their influence over the development of aquatic life is updated from the existing classifications, with stream aquatic regimes defined as Permanent, Temporary-pools, Temporary-dry and Episodic. While aquatic regimes describe the long-term overall variability of the hydrological conditions of the river section and have been used for many years by hydrologists and ecologists, aquatic states describe the availability of mesohabitats in given periods that
A Similarity-Based Approach for Audiovisual Document Classification Using Temporal Relation Analysis
Directory of Open Access Journals (Sweden)
Ferrane Isabelle
2011-01-01
Abstract: We propose a novel approach for video classification based on the analysis of the temporal relationships between the basic events in audiovisual documents. Starting from basic segmentation results, we define a new representation method called the Temporal Relation Matrix (TRM). Each document is then described by a set of TRMs, the analysis of which makes higher-level events stand out. This representation was first designed to analyze any audiovisual document in order to find events that may well characterize its content and its structure. The aim of this work is to use this representation to compute a similarity measure between two documents. Approaches for audiovisual document classification are presented and discussed. Experiments are carried out on a set of 242 video documents, and the results show the efficiency of our proposals.
International Nuclear Information System (INIS)
Pilat, Joseph F.; Budlong-Sylvester, K.W.
2004-01-01
Following the 1998 nuclear tests in South Asia and later reinforced by revelations about North Korean and Iraqi nuclear activities, there has been growing concern about increasing proliferation dangers. At the same time, the prospects of radiological/nuclear terrorism are seen to be rising - since 9/11, concern over a proliferation/terrorism nexus has never been higher. In the face of this growing danger, there are urgent calls for stronger measures to strengthen the current international nuclear nonproliferation regime, including recommendations to place civilian processing of weapon-useable material under multinational control. As well, there are calls for entirely new tools, including military options. As proliferation and terrorism concerns grow, the regime is under pressure and there is a temptation to consider fundamental changes to the regime. In this context, this paper will address the following: Do we need to change the regime centered on the Treaty on the Nonproliferation of Nuclear Weapons (NPT) and the International Atomic Energy Agency (IAEA)? What improvements could ensure it will be the foundation for the proliferation resistance and physical protection needed if nuclear power grows? What will make it a viable centerpiece of future nonproliferation and counterterrorism approaches?
Classification of gene expression data: A hubness-aware semi-supervised approach.
Buza, Krisztian
2016-04-01
Classification of gene expression data is the common denominator of various biomedical recognition tasks. However, obtaining class labels for large training samples may be difficult or even impossible in many cases. Therefore, semi-supervised classification techniques are required, as semi-supervised classifiers take advantage of unlabeled data. Gene expression data is high-dimensional, which gives rise to the phenomena known under the umbrella of the curse of dimensionality, one of its recently explored aspects being the presence of hubs, or hubness for short. Therefore, hubness-aware classifiers have been developed recently, such as the Naive Hubness-Bayesian k-Nearest Neighbor (NHBNN). In this paper, we propose a semi-supervised extension of NHBNN which follows the self-training schema. As one of the core components of self-training is the certainty score, we propose a new hubness-aware certainty score. We performed experiments on publicly available gene expression data. These experiments show that the proposed classifier outperforms its competitors. We investigated the impact of each of the components (classification algorithm, semi-supervised technique, hubness-aware certainty score) separately and showed that each of these components is relevant to the performance of the proposed approach. Our results imply that our approach may increase classification accuracy and reduce computational costs (i.e., runtime). Based on the promising results presented in the paper, we envision that hubness-aware techniques will be used in various other biomedical machine learning tasks. In order to accelerate this process, we have made an implementation of hubness-aware machine learning techniques publicly available in the PyHubs software package (http://www.biointelligence.hu/pyhubs), implemented in Python, one of the most popular programming languages of data science. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
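The self-training schema at the core of the paper can be sketched generically: train on the labeled set, pseudo-label the unlabeled points whose certainty score exceeds a threshold, fold them into the training set, and repeat. The sketch below uses a plain kNN base classifier with a max-posterior certainty score; the paper's NHBNN classifier and its hubness-aware certainty score are not reimplemented here, and the function name is hypothetical:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def self_training_knn(X_lab, y_lab, X_unlab, threshold=0.9, k=5, rounds=10):
    """Generic self-training loop: pseudo-label confident unlabeled points
    and retrain. Certainty = max predicted class probability."""
    X_lab, y_lab, unlab = X_lab.copy(), y_lab.copy(), X_unlab.copy()
    for _ in range(rounds):
        if len(unlab) == 0:
            break
        clf = KNeighborsClassifier(n_neighbors=k).fit(X_lab, y_lab)
        proba = clf.predict_proba(unlab)
        certain = proba.max(axis=1) >= threshold
        if not certain.any():
            break
        # Move the confidently pseudo-labelled points into the training set.
        X_lab = np.vstack([X_lab, unlab[certain]])
        y_lab = np.concatenate(
            [y_lab, clf.classes_[proba[certain].argmax(axis=1)]])
        unlab = unlab[~certain]
    return KNeighborsClassifier(n_neighbors=k).fit(X_lab, y_lab)
```

The certainty score is the component the paper replaces: a hubness-aware score down-weights the votes of "bad hubs", points that appear among the nearest neighbors of many samples of the wrong class.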
Preparing for Upheaval in North Korea: Assuming North Korean Regime Collapse
2013-12-01
Keywords: Reintegration, Stabilization Operation, Regime collapse, Songbun, Juche. 87 pages. ... corn and wheat flour. On special occasions, distinguished individuals are even given luxury consumer goods such as wristwatches and TV sets as
The Name of the Rose: Classifying 1930s Exchange-Rate Regimes
Scott Andrew Urban
2009-01-01
There is an implicit consensus that 1930s exchange-rate regimes can be characterised as some variant of ‘floating’. This paper applies an adaptation of modern methodologies of exchange-rate regime classification to a panel of 47 countries in weekly observations between January 1919 and August 1939. On the basis of modern benchmarks, the 1930s world monetary system would not be considered ‘floating’ or even ‘managed floating’. One implication is that today’s fiat-based, managed-floating intern...
Heuristic approach to the classification of postpartum endometritis and its forms
Directory of Open Access Journals (Sweden)
E. A. Balashova
2017-01-01
The work is dedicated to the development of a method of automated medical diagnosis based on the description of biomedical systems using two parameters: energy, reflecting the interaction of the system's elements, and entropy, characterizing the organization of the system. Violations of the energy-entropy cycle of biomedical systems are reflected in the symptoms of the disease. The statistical link between the symptoms of the body's condition and the nature of excitation of its elements is best expressed in a heuristic description of the system state. High-accuracy classification of the patient's condition is achieved by using heuristic detection methods. The proposed approach makes it possible to estimate the probability of correct diagnosis, increases the accuracy of the classification, and estimates the minimum size of the training sample and the capacity of its constituent signs. The classification technique consists of averaging the characteristic values in the selected classes, composing a symptom complex of the most important signs of the disease, conducting a "rough" diagnosis with threshold rules that distinguish severe forms of the disease, and then performing differential diagnosis of disease severity. The proposed method was tested on the classification of the forms of postpartum endometritis (mild, moderate, severe). The training sample contained 70 case histories. The symptom complex used to classify the patient's condition comprised 17 characteristics. Threshold diagnosis made it possible to establish the presence of disease and to separate severe forms. Differential diagnosis was used for the classification of mild and moderate severity of postpartum endometritis. The accuracy of the classification of the forms of postpartum endometritis amounted to 97.1%.
Bottom-up and Top-down: An alternate classification of LD authoring approaches
Sodhi, Tim; Miao, Yongwu; Brouns, Francis; Koper, Rob
2007-01-01
Sodhi, T., Miao, Y., Brouns, F., & Koper, E. J. R. (2007). Bottom-up and Top-down: An alternate classification of LD authoring approaches. Paper presented at the TENCompetence Open Workshop on Current research on IMS Learning Design and Lifelong Competence Development Infrastructures. June, 21-22,
Directory of Open Access Journals (Sweden)
Aoife eRoebuck
2015-08-01
Obstructive sleep apnoea (OSA) is a disorder characterised by repeated pauses in breathing during sleep, which lead to deoxygenation and voiced chokes at the end of each episode. OSA is associated with daytime sleepiness and an increased risk of serious conditions such as cardiovascular disease, diabetes and stroke. Between 2% and 7% of the adult population globally has OSA, but it is estimated that up to 90% of those are undiagnosed and untreated. Diagnosis of OSA requires expensive and cumbersome screening. Audio offers a potential non-contact alternative, particularly with the ubiquity of excellent signal processing on every phone. Previous studies have focused on the classification of snoring and apnoeic chokes. However, such approaches require accurate identification of events, which leads to limited accuracy and small study populations. In this work we propose an alternative approach which uses multiscale entropy (MSE) coefficients presented to a classifier to identify disorder in vocal patterns indicative of sleep apnoea. A database of 858 patients was used, the largest reported in this domain. Apnoeic choke, snore, and noise events encoded with speech analysis features were input into a linear classifier. Coefficients of MSE derived from the first 4 hours of each recording were used to train and test a random forest to classify patients as apnoeic or not. Standard speech analysis approaches for event classification achieved an out-of-sample accuracy (Ac) of 76.9% with a sensitivity (Se) of 29.2% and a specificity (Sp) of 88.7%, but with high variance. For OSA severity classification, MSE provided an out-of-sample Ac of 79.9%, Se of 66.0% and Sp of 88.8%. Including demographic information improved the MSE-based classification performance to Ac = 80.5%, Se = 69.2%, Sp = 87.9%. These results indicate that audio recordings could be used in screening for OSA, but are generally under-sensitive.
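The MSE coefficients used above can be sketched with a plain-NumPy implementation of the textbook recipe: coarse-grain the series at each scale, then compute sample entropy with the tolerance `r` fixed from the original series. The defaults `m = 2` and `r = 0.15 * std` are common conventions, assumed here rather than taken from the paper:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    n = len(x)
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(n - mm)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int((dist <= r).sum())
        return count
    b, a = count_matches(m), count_matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -np.log(a / b)

def multiscale_entropy(x, max_scale=4, m=2, r=None):
    """Coarse-grain the series at each scale, then compute SampEn,
    keeping r fixed from the original series (standard MSE recipe)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    out = []
    for s in range(1, max_scale + 1):
        trimmed = x[: (len(x) // s) * s]
        coarse = trimmed.reshape(-1, s).mean(axis=1)
        out.append(sample_entropy(coarse, m, r))
    return out
```

For white noise the resulting MSE curve decreases with scale, whereas structured physiological signals hold their entropy across scales; it is this shape of the curve, fed to the random forest, that carries the diagnostic information.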
Directory of Open Access Journals (Sweden)
Sebastian D’Oleire-Oltmanns
2011-08-01
Urban areas develop on formal and informal levels. Informal development is often highly dynamic, leading to a lag in spatial information about urban structure types. In this work, an object-based remote sensing approach is presented to map the migrant-housing urban structure type in the Pearl River Delta, China. SPOT5 data were utilized for the classification (auxiliary data, particularly up-to-date cadastral data, were not available). A hierarchically structured classification process was used to create spectral independence from single satellite scenes and to arrive at a transferable classification process. Using the presented classification approach, an overall classification accuracy for migrant housing of 68.0% is attained.
Manners, R.; Wilcox, A. C.; Merritt, D. M.
2016-12-01
The ecogeomorphic response of riparian ecosystems to a change in hydrologic properties is difficult to predict because of the interactions and feedbacks among plants, water, and sediment. Most riparian models of community dynamics assume a static channel, yet geomorphic processes strongly control the establishment and survival of riparian vegetation. Using a combination of approaches that includes empirical relationships and hydrodynamic models, we model the coupled vegetation-topographic response to a shift in the flow regime at three cross-sections on the Yampa and Green Rivers in Dinosaur National Monument. The locations represent the variable geomorphology and vegetation composition of these canyon-bound rivers. We account for the inundation and hydraulic properties of vegetation plots surveyed over three years within the International River Interface Cooperative (iRIC) Fastmech model, equipped with a vegetation module that accounts for flexible stems and plant reconfiguration. The presence of functional groupings of plants, i.e. those plants that respond similarly to environmental factors such as water availability and disturbance, is determined from flow response curves developed for the Yampa River. Using field measurements of vegetation morphology, distance from the channel centerline, and dominant particle size, together with modeled inundation properties, we develop an empirical relationship between these variables and topographic change. We evaluate vegetation and channel form changes over decadal timescales, allowing for the integration of processes over time. From our analyses, we identify thresholds in the flow regime that alter the distribution of plants and reduce geomorphic complexity, predominantly through side-channel and backwater infilling. Simplification of some processes (e.g., empirically-derived sedimentation) and detailed treatment of others (e.g., plant-flow interactions) allows us to model the coupled dynamics of riparian ecosystems and evaluate the impact of
An intelligent software approach to ultrasonic flaw classification in weldments
International Nuclear Information System (INIS)
Song, Sung Jin; Kim, Hak Joon; Lee, Hyun
1997-01-01
Ultrasonic pattern recognition is the most effective approach to the problem of discriminating types of flaws in weldments based on ultrasonic flaw signals. In spite of significant progress in this methodology, it has not been widely used in practical ultrasonic inspection of weldments in industry. Hence, for the convenient application of this approach in many practical situations, we develop intelligent ultrasonic signature classification software which can discriminate types of flaws in weldments using various tools in artificial intelligence, such as neural networks. This software shows excellent performance in an experimental problem where flaws in weldments are classified into the two categories of cracks and non-cracks.
Directory of Open Access Journals (Sweden)
Abdullah M. Iliyasu
2017-12-01
A quantum hybrid (QH) intelligent approach that blends the adaptive search capability of the quantum-behaved particle swarm optimisation (QPSO) method with the intuitionistic rationality of the traditional fuzzy k-nearest neighbours (Fuzzy k-NN) algorithm (known simply as the Q-Fuzzy approach) is proposed for efficient feature selection and classification of cells in cervical smear (CS) images. From an initial multitude of 17 features describing the geometry, colour, and texture of the CS images, the QPSO stage of our proposed technique is used to select the best subset of features (i.e., global best particles) that represent a pruned-down collection of seven features. Using a dataset of almost 1000 images, performance evaluation of our proposed Q-Fuzzy approach assesses the impact of our feature selection on classification accuracy by way of three experimental scenarios that are compared alongside two other approaches: the All-features approach (i.e., classification without prior feature selection) and another hybrid technique combining the standard PSO algorithm with the Fuzzy k-NN technique (the P-Fuzzy approach). In the first and second scenarios, we further divided the assessment criteria into classification accuracy based on the choice of best features and accuracy across the different categories of cervical cells. In the third scenario, we introduced new QH hybrid techniques, i.e., QPSO combined with other supervised learning methods, and compared the classification accuracy alongside our proposed Q-Fuzzy approach. Furthermore, we employed statistical approaches to establish qualitative agreement with regard to the feature selection in experimental scenarios 1 and 3. The synergy between the QPSO and Fuzzy k-NN in the proposed Q-Fuzzy approach improves classification accuracy, as manifested in the reduction in the number of cell features, which is crucial for effective cervical cancer detection and diagnosis.
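The Fuzzy k-NN stage described above assigns class memberships by distance-weighted neighbour voting. A minimal standard-library sketch of that idea follows; it is not the authors' implementation, and the weighting exponent `m` and the toy data are illustrative assumptions:

```python
import math

def fuzzy_knn(train, labels, query, k=3, m=2):
    """Classify `query` by fuzzy k-NN: each of the k nearest training
    points votes for its label with weight 1/d^(2/(m-1)) (Keller-style
    crisp-label memberships); memberships are normalised to sum to 1."""
    nearest = sorted(
        (math.dist(x, query), y) for x, y in zip(train, labels)
    )[:k]
    weights = {}
    for d, y in nearest:
        w = 1.0 / (d ** (2 / (m - 1)) + 1e-9)  # epsilon avoids div-by-zero
        weights[y] = weights.get(y, 0.0) + w
    total = sum(weights.values())
    memberships = {y: w / total for y, w in weights.items()}
    # predicted class = largest membership
    return max(memberships, key=memberships.get), memberships
```

A call such as `fuzzy_knn(points, labels, (0.5, 0.5))` returns both the crisp prediction and the per-class membership degrees, which is the feature the Q-Fuzzy approach exploits.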
Typification of the thermal regime of the air in Nicaragua
International Nuclear Information System (INIS)
Lecha Estela, Luis; Hernandez Perez, Vidal; Prado Zambrana, Carmen
1994-01-01
In this work, the method of thermal regime classification is applied to evaluate the heat resources of the country, as a first step towards knowing and rationally employing the national climatic resources. The interaction between the spatio-temporal distribution of the thermal regime and the main climatic factors is analyzed, showing the differences encountered between the geographic zones of the country and, moreover, their vertical structure. The results have practical utility in several branches of the national economy and were included in the work to prepare the Climatic Atlas of Nicaragua.
DEFF Research Database (Denmark)
Zhang, Yue; Dragoni, Nicola; Wang, Jiangtao
2015-01-01
…efficiency to facilitate the design of fault detection methods and the evaluation of their energy efficiency. Following the same design principle of the fault detection framework, the paper proposes a classification for fault detection approaches. The classification is applied to a number of fault detection…
Clary, Renee; Wandersee, James
2013-01-01
In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…
A Graph Cut Approach to Artery/Vein Classification in Ultra-Widefield Scanning Laser Ophthalmoscopy.
Pellegrini, Enrico; Robertson, Gavin; MacGillivray, Tom; van Hemert, Jano; Houston, Graeme; Trucco, Emanuele
2018-02-01
The classification of blood vessels into arterioles and venules is a fundamental step in the automatic investigation of retinal biomarkers for systemic diseases. In this paper, we present a novel technique for vessel classification on ultra-wide-field-of-view images of the retinal fundus acquired with a scanning laser ophthalmoscope. To the best of our knowledge, this is the first time that a fully automated artery/vein classification technique for this type of retinal imaging with no manual intervention has been presented. The proposed method exploits hand-crafted features based on local vessel intensity and vascular morphology to formulate a graph representation from which a globally optimal separation between the arterial and venular networks is computed by a graph cut approach. The technique was tested on three different data sets (one publicly available and two local) and achieved an average classification accuracy of 0.883 in the largest data set.
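The globally optimal artery/vein separation above rests on a minimum s-t cut. The sketch below shows the generic mechanism with a small Edmonds-Karp max-flow over an adjacency-matrix graph; the paper's actual graph is built from vessel intensity and morphology features, which this toy capacity matrix does not attempt to model:

```python
from collections import deque

def min_cut_partition(n, cap, s, t):
    """Compute a max s-t flow by BFS augmenting paths (Edmonds-Karp),
    then return the set of nodes on the source side of a minimum cut
    (nodes still reachable from s in the residual graph)."""
    flow = [[0] * n for _ in range(n)]
    while True:
        # breadth-first search for an augmenting path
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            break  # no augmenting path left: flow is maximal
        # trace the path back and push the bottleneck amount
        path, v = [], t
        while v != s:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(cap[u][v] - flow[u][v] for u, v in path)
        for u, v in path:
            flow[u][v] += aug
            flow[v][u] -= aug
    # source side of the min cut = residual-reachable nodes
    seen, q = {s}, deque([s])
    while q:
        u = q.popleft()
        for v in range(n):
            if v not in seen and cap[u][v] - flow[u][v] > 0:
                seen.add(v)
                q.append(v)
    return seen
```

In an artery/vein setting, nodes on the source side would be labelled "artery" and the rest "venule" (or vice versa), with capacities derived from the hand-crafted features.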
A Neuro-Fuzzy Approach in the Classification of Students’ Academic Performance
Directory of Open Access Journals (Sweden)
Quang Hung Do
2013-01-01
Classifying student academic performance with high accuracy facilitates admission decisions and enhances educational services at educational institutions. The purpose of this paper is to present a neuro-fuzzy approach for classifying students into different groups. The neuro-fuzzy classifier used previous exam results and other related factors as input variables and labeled students based on their expected academic performance. The results showed that the proposed approach achieved a high accuracy. The results were also compared with those obtained from other well-known classification approaches, including support vector machine, Naive Bayes, neural network, and decision tree approaches. The comparative analysis indicated that the neuro-fuzzy approach performed better than the others. It is expected that this work may be used to support student admission procedures and to strengthen the services of educational institutions.
Understanding energy-related regimes: A participatory approach from central Australia
International Nuclear Information System (INIS)
Foran, Tira; Fleming, David; Spandonide, Bruno; Williams, Rachel; Race, Digby
2016-01-01
For a particular community, what energy-related innovations constitute no-regrets strategies? We present a methodology to understand how alternative energy consuming activities and policy regimes impact on current and future liveability of socio-culturally diverse communities facing climate change. Our methodology augments the energy policy literature by harnessing three concepts (collaborative governance, innovation and political economic regime of provisioning) to support dialogue around changing energy-related activities. We convened workshops in Alice Springs, Australia to build capability to identify no-regrets energy-related housing or transport activities and strategies. In preparation, we interviewed policy actors and constructed three new housing-related future scenarios. After discussing the scenarios, policy and research actors prioritised five socio-technical activities or strategies. Evaluations indicate participants enjoyed opportunities given by the methodology to have focussed discussions about activities and innovation, while requesting more socially nuanced scenario storylines. We discuss implications for theory and technique development. - Highlights: •Energy-related activities and regimes frustrate pro-sustainability action. •Participatory workshops increased understanding of activities and regimes. •Workshops used a novel combination of governance and social theories. •Results justify inclusive dialogue around building energy standards and transport options.
Energy Technology Data Exchange (ETDEWEB)
Ma, Deqiang [Fujian Provincial Key Laboratory for Coastal Ecology and Environmental Studies, Xiamen University, 361102 (China); Coastal and Ocean Management Institute, Xiamen University, 361102 (China); Fang, Qinhua, E-mail: qhfang@xmu.edu.cn [Fujian Provincial Key Laboratory for Coastal Ecology and Environmental Studies, Xiamen University, 361102 (China); Coastal and Ocean Management Institute, Xiamen University, 361102 (China); Guan, Song [Coastal and Ocean Management Institute, Xiamen University, 361102 (China)
2016-01-15
In 2004, the United Nations launched an Ad Hoc Open-ended Informal Working Group to study issues relating to the conservation and sustainable use of marine biological diversity in areas beyond national jurisdiction. Since then, the topic of governing marine areas beyond national jurisdiction (ABNJ) has been widely discussed by politicians, policy makers and scholars. As one of the management tools to protect marine biodiversity in ABNJ, environmental impact assessment (EIA) has been widely recognized and accepted by the international community; however, the biggest challenge is how to effectively implement the EIA regime in ABNJ. This paper explores the impacts of anthropogenic activities in ABNJ on marine ecosystems, reviews the existing legal regime for EIA in ABNJ and discusses possible measures to strengthen the implementation of EIA in ABNJ. - Highlights: • We identify human activities in ABNJ and their impacts on marine ecosystems. • We analyze the characters and gaps of the existing legal regime for EIA in ABNJ. • We analyze the pros and cons of alternative approaches of EIA in ABNJ.
A novel underwater dam crack detection and classification approach based on sonar images.
Shi, Pengfei; Fan, Xinnan; Ni, Jianjun; Khan, Zubair; Li, Min
2017-01-01
Underwater dam crack detection and classification based on sonar images is a challenging task because underwater environments are complex and because cracks are quite random and diverse in nature. Furthermore, obtainable sonar images are of low resolution. To address these problems, a novel underwater dam crack detection and classification approach based on sonar imagery is proposed. First, the sonar images are divided into image blocks. Second, a clustering analysis of a 3-D feature space is used to obtain the crack fragments. Third, the crack fragments are connected using an improved tensor voting method. Fourth, a minimum spanning tree is used to obtain the crack curve. Finally, an improved evidence theory combined with fuzzy rule reasoning is proposed to classify the cracks. Experimental results show that the proposed approach is able to detect underwater dam cracks and classify them accurately and effectively under complex underwater environments.
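The crack-curve step above relies on a minimum spanning tree over the connected fragments. A generic Kruskal sketch with path-compressed union-find follows; the node indices and edge weights are illustrative and do not reproduce the paper's fragment graph:

```python
def kruskal_mst(n, edges):
    """Minimum spanning tree via Kruskal's algorithm.
    `edges` is a list of (weight, u, v) tuples over nodes 0..n-1;
    returns the chosen (u, v, weight) edges."""
    parent = list(range(n))

    def find(x):
        # union-find root lookup with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):  # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:               # keep the edge only if it joins two trees
            parent[ru] = rv
            mst.append((u, v, w))
    return mst
```

For crack extraction, the tree's longest path (or its branches) would then be traced to recover the crack curve; that tracing step is omitted here.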
International Nuclear Information System (INIS)
Girardin, Mathieu
2014-01-01
Two-phase flows in Pressurized Water Reactors belong to a wide range of Mach number flows. Computing accurate approximate solutions of those flows may be challenging from a numerical point of view, as classical finite volume methods are too diffusive in the low Mach regime. In this thesis, we are interested in designing and studying some robust numerical schemes that are stable for large time steps and accurate even on coarse meshes for a wide range of flow regimes. An important feature is the strategy used to construct those schemes. We use a mixed implicit-explicit strategy based on an operator splitting to solve fast and slow phenomena separately. Then, we introduce a modification of a Suliciu type relaxation scheme to improve the accuracy of the numerical scheme in some regimes of interest. Two approaches have been used to assess the ability of our numerical schemes to deal with a wide range of flow regimes. The first approach, based on the asymptotic preserving property, has been used for the gas dynamics equations with stiff source terms. The second approach, based on the all-regime property, has been used for the gas dynamics equations and the homogeneous two-phase flow models HRM and HEM in the low Mach regime. We obtained some robustness and stability properties for our numerical schemes. In particular, some discrete entropy inequalities are shown. Numerical evidence, in 1D and in 2D on unstructured meshes, assesses the gain in terms of accuracy and CPU time of those asymptotic preserving and all-regime numerical schemes in comparison with classical finite volume methods. (author)
Knowledge-based sea ice classification by polarimetric SAR
DEFF Research Database (Denmark)
Skriver, Henning; Dierking, Wolfgang
2004-01-01
Polarimetric SAR images acquired at C- and L-band over sea ice in the Greenland Sea, Baltic Sea, and Beaufort Sea have been analysed with respect to their potential for ice type classification. The polarimetric data were gathered by the Danish EMISAR and the US AIRSAR which both are airborne...... systems. A hierarchical classification scheme was chosen for sea ice because our knowledge about magnitudes, variations, and dependences of sea ice signatures can be directly considered. The optimal sequence of classification rules and the rules themselves depend on the ice conditions/regimes. The use...... of the polarimetric phase information improves the classification only in the case of thin ice types but is not necessary for thicker ice (above about 30 cm thickness)...
Regime-Based Versus Static Asset Allocation: Letting the Data Speak
DEFF Research Database (Denmark)
Nystrup, Peter; Hansen, Bo William; Madsen, Henrik
2015-01-01
Regime shifts present a big challenge to traditional strategic asset allocation. This article investigates whether regime-based asset allocation can effectively respond to changes in financial regimes at the portfolio level, in an effort to provide better long-term results than more static approaches can offer. The authors center their regime-based approach around a regime-switching model with time-varying parameters that can match financial markets' tendency to change behavior abruptly and the fact that the new behavior often persists for several periods after a change. In an asset universe…
Raymond, Karren-Lee; Kannis-Dymand, Lee; Lovell, Geoff P
2016-10-01
This study examined a graduated severity level approach to food addiction classification against associations with World Health Organization obesity classifications (body mass index, kg/m²) among 408 people with type 2 diabetes. A survey including the Yale Food Addiction Scale and several demographic questions demonstrated four distinct Yale Food Addiction Scale symptom severity groups (in line with Diagnostic and Statistical Manual of Mental Disorders (5th ed.) severity indicators): non-food addiction, mild food addiction, moderate food addiction and severe food addiction. Analysis of variance with post hoc tests demonstrated that each severity classification group differed significantly in body mass index, with each grouping being associated with increased World Health Organization obesity classifications. These findings have implications for diagnosing food addiction and implementing treatment and prevention methodologies for obesity among people with type 2 diabetes.
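The WHO obesity classifications referenced above are threshold bands on BMI = weight / height². A small sketch using the standard WHO cutoffs (18.5, 25, 30, 35, 40 kg/m²); the function name and return format are illustrative:

```python
def who_bmi_class(weight_kg, height_m):
    """Return (BMI, WHO class label) using the standard WHO cutoffs."""
    bmi = weight_kg / height_m ** 2
    cutoffs = [
        (18.5, "underweight"),
        (25.0, "normal"),
        (30.0, "overweight"),
        (35.0, "obese class I"),
        (40.0, "obese class II"),
    ]
    for cut, label in cutoffs:
        if bmi < cut:           # first cutoff the BMI falls below
            return bmi, label
    return bmi, "obese class III"
```

For example, 70 kg at 1.75 m gives a BMI of about 22.9 and the "normal" band.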
A hybrid clustering and classification approach for predicting crash injury severity on rural roads.
Hasheminejad, Seyed Hessam-Allah; Zahedi, Mohsen; Hasheminejad, Seyed Mohammad Hossein
2018-03-01
As a threat to transportation systems, traffic crashes have a wide range of social consequences for governments. Traffic crashes are increasing in developing countries, and Iran, as a developing country, is not immune from this risk. There are several studies in the literature that predict traffic crash severity based on artificial neural networks (ANNs), support vector machines and decision trees. This paper attempts to investigate the crash injury severity of rural roads by using a hybrid clustering and classification approach to compare the performance of classification algorithms before and after applying the clustering. In this paper, a novel rule-based genetic algorithm (GA) is proposed to predict crash injury severity, which is evaluated by performance criteria in comparison with classification algorithms like ANN. The results obtained from an analysis of 13,673 crashes (5600 property damage, 778 fatal crashes, 4690 slight injuries and 2605 severe injuries) on rural roads in Tehran Province of Iran during 2011-2013 revealed that the proposed GA method outperforms other classification algorithms based on classification metrics like precision (86%), recall (88%) and accuracy (87%). Moreover, the proposed GA method has the highest level of interpretability, is easy to understand and provides feedback to analysts.
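The precision, recall, and accuracy figures quoted above follow the usual confusion-matrix definitions. A one-vs-rest sketch on toy labels (the class names and data are illustrative, not the Tehran crash data):

```python
def classification_metrics(y_true, y_pred, positive):
    """Precision, recall, and overall accuracy for one class
    treated as the positive label (one-vs-rest)."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == positive and p == positive for t, p in pairs)
    fp = sum(t != positive and p == positive for t, p in pairs)
    fn = sum(t == positive and p != positive for t, p in pairs)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    accuracy = sum(t == p for t, p in pairs) / len(pairs)
    return precision, recall, accuracy
```

Per-class values would typically be averaged (macro or weighted) to get the single figures reported in the abstract.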
Using blocking approach to preserve privacy in classification rules by inserting dummy Transaction
Directory of Open Access Journals (Sweden)
Doryaneh Hossien Afshari
2017-03-01
The increasing rate of data sharing among organizations increases the risk of leaking sensitive knowledge. Addressing this problem raises the importance of privacy preservation within the process of data sharing. This study focuses on privacy preservation in classification rule mining as a data mining technique. We propose a blocking algorithm for hiding sensitive classification rules. In the solution, rule hiding occurs as a result of editing a set of transactions which satisfy sensitive classification rules. The proposed approach tries to deceive and block adversaries by inserting dummy transactions. Finally, the solution has been evaluated and compared with other available solutions. Results show that limiting the number of attributes existing in each sensitive rule will lead to a decrease in both the number of lost rules and the production rate of ghost rules.
Directory of Open Access Journals (Sweden)
Meriastuti - Ginting
2015-07-01
Inventory is considered the most expensive, yet important, asset of any company, representing approximately 50% of total investment. Inventory cost has become one of the major contributors to inefficiency, so it should be managed effectively. This study aims to propose an alternative inventory model, using an ABC multi-criteria classification approach to minimize total cost. By combining FANP (Fuzzy Analytical Network Process) and TOPSIS (Technique of Order Preferences by Similarity to the Ideal Solution), the ABC multi-criteria classification approach identified 12 of 69 inventory items as an "outstandingly important class" that contributed 80% of total inventory cost. This finding is then used as the basis for determining the proposed continuous review inventory model. This study found that by using trapezoidal fuzzy costs, the inventory turnover ratio can be increased, and inventory cost can be decreased by 78% for each item in the "class A" inventory. Keywords: ABC multi-criteria classification, FANP-TOPSIS, continuous review inventory model, lead-time demand distribution, trapezoidal fuzzy number
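The single-criterion core of ABC classification, on which the multi-criteria FANP-TOPSIS ranking builds, is a cumulative-cost cut: items are ranked by annual cost and the top contributors up to roughly 80% of total cost form class A. A sketch with assumed 80%/95% cutoffs and illustrative item data:

```python
def abc_classify(items, a_cut=0.80, b_cut=0.95):
    """Assign ABC classes by cumulative share of total annual cost.
    `items` maps item name -> annual cost; cutoffs are assumptions
    (classic 80/95 split), not values from any specific study."""
    ranked = sorted(items.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(items.values())
    classes, cum = {}, 0.0
    for name, cost in ranked:
        cum += cost / total
        # class A until a_cut of cost is covered, B until b_cut, else C
        classes[name] = "A" if cum <= a_cut else "B" if cum <= b_cut else "C"
    return classes
```

A multi-criteria variant would replace the raw cost with a composite score (e.g., a TOPSIS closeness coefficient) before ranking.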
Dynamic Allocation or Diversification: A Regime-Based Approach to Multiple Assets
DEFF Research Database (Denmark)
Nystrup, Peter; Hansen, Bo William; Larsen, Henrik Olejasz
2018-01-01
…behavior and a new, more intuitive way of inferring the hidden market regimes. The empirical results show that regime-based asset allocation is profitable, even when compared to a diversified benchmark portfolio. The results are robust because they are based on available market data with no assumptions about forecasting skills…
Kavzoglu, Taskin; Erdemir, Merve Yildiz; Tonbul, Hasan
2017-07-01
In object-based image analysis, obtaining representative image objects is an important prerequisite for a successful image classification. The major threat is the issue of scale selection due to the complex spatial structure of landscapes portrayed as an image. This study proposes a two-stage approach to conduct regionalized multiscale segmentation. In the first stage, an initial high-level segmentation is applied through a "broadscale," and a set of image objects characterizing natural borders of the landscape features are extracted. Contiguous objects are then merged to create regions by considering their normalized difference vegetation index resemblance. In the second stage, optimal scale values are estimated for the extracted regions, and multiresolution segmentation is applied with these settings. Two satellite images with different spatial and spectral resolutions were utilized to test the effectiveness of the proposed approach and its transferability to different geographical sites. Results were compared to those of image-based single-scale segmentation and it was found that the proposed approach outperformed the single-scale segmentations. Using the proposed methodology, significant improvement in terms of segmentation quality and classification accuracy (up to 5%) was achieved. In addition, the highest classification accuracies were produced using fine-scale values.
A coarse-to-fine approach for medical hyperspectral image classification with sparse representation
Chang, Lan; Zhang, Mengmeng; Li, Wei
2017-10-01
A coarse-to-fine approach with sparse representation is proposed for medical hyperspectral image classification in this work. A segmentation technique with different scales is employed to exploit the edges of the input image, where coarse super-pixel patches provide global classification information while fine ones further provide detail information. Unlike a common RGB image, a hyperspectral image has multiple bands, which allows the cluster centers to be adjusted with higher precision. After segmentation, each super-pixel is classified by the recently developed sparse representation-based classification (SRC), which assigns a label to the testing samples in one local patch by means of a sparse linear combination of all the training samples. Furthermore, segmentation with multiple scales is employed because a single scale is not suitable for the complicated distribution of medical hyperspectral imagery. Finally, classification results for different sizes of super-pixel are fused by a fusion strategy, offering at least two benefits: (1) the final result is obviously superior to that of segmentation with a single scale, and (2) the fusion process significantly simplifies the choice of scales. Experimental results using real medical hyperspectral images demonstrate that the proposed method outperforms the state-of-the-art SRC.
Zorman, Milan; Sánchez de la Rosa, José Luis; Dinevski, Dejan
2011-12-01
It is not very often that a symbol-based machine learning approach is used for the purpose of image classification and recognition. In this paper we present such an approach, which we first used on follicular lymphoma images. Lymphoma is a broad term encompassing a variety of cancers of the lymphatic system. Lymphoma is differentiated by the type of cell that multiplies and how the cancer presents itself. It is very important to get an exact diagnosis regarding lymphoma and to determine the treatments that will be most effective for the patient's condition. Our work was focused on the identification of lymphomas by finding follicles in microscopy images provided by the Laboratory of Pathology in the University Hospital of Tenerife, Spain. We divided our work into two stages: in the first stage we did image pre-processing and feature extraction, and in the second stage we used different symbolic machine learning approaches for pixel classification. Symbolic machine learning approaches are often neglected when looking for image analysis tools. Although they are known for a very appropriate knowledge representation, they are also claimed to lack computational power. The results we obtained are very promising and show that symbolic approaches can be successful in image analysis applications.
Lazcano, R.; Madroñal, D.; Fabelo, H.; Ortega, S.; Salvador, R.; Callicó, G. M.; Juárez, E.; Sanz, C.
2017-10-01
Hyperspectral Imaging (HI) assembles high resolution spectral information from hundreds of narrow bands across the electromagnetic spectrum, thus generating 3D data cubes in which each pixel gathers the spectral information of the reflectance of every spatial pixel. As a result, each image is composed of large volumes of data, which turns its processing into a challenge, as performance requirements have been continuously tightened. For instance, new HI applications demand real-time responses. Hence, parallel processing becomes a necessity to achieve this requirement, so the intrinsic parallelism of the algorithms must be exploited. In this paper, a spatial-spectral classification approach has been implemented using a dataflow language known as RVC-CAL. This language represents a system as a set of functional units, and its main advantage is that it simplifies the parallelization process by mapping the different blocks over different processing units. The spatial-spectral classification approach aims at refining the classification results previously obtained by using a K-Nearest Neighbors (KNN) filtering process, in which both the pixel spectral value and the spatial coordinates are considered. To do so, KNN needs two inputs: a one-band representation of the hyperspectral image and the classification results provided by a pixel-wise classifier. Thus, the spatial-spectral classification algorithm is divided into three different stages: a Principal Component Analysis (PCA) algorithm for computing the one-band representation of the image, a Support Vector Machine (SVM) classifier, and the KNN-based filtering algorithm. The parallelization of these algorithms shows promising results in terms of computational time, as mapping them over different cores yields a speedup of 2.69x when using 3 cores. Consequently, the experimental results demonstrate that real-time processing of hyperspectral images is achievable.
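The KNN refinement stage described above can be sketched as a majority filter in a joint (x, y, spectral value) space: each pixel takes the most common label among its nearest neighbours. This is a simplified stand-in for the paper's PCA+SVM+KNN chain; the toy pixels, labels, and k are illustrative assumptions:

```python
import math
from collections import Counter

def knn_label_filter(pixels, labels, k=3):
    """Refine per-pixel labels: each pixel adopts the majority label
    of its k nearest neighbours in (x, y, value) feature space.
    `pixels` is a list of (x, y, value) tuples aligned with `labels`."""
    refined = []
    for i, p in enumerate(pixels):
        nearest = sorted(
            (math.dist(p, q), labels[j])
            for j, q in enumerate(pixels) if j != i
        )[:k]
        refined.append(Counter(lab for _, lab in nearest).most_common(1)[0][0])
    return refined
```

The filter smooths isolated misclassifications while respecting both spatial proximity and spectral similarity; a production version would use the PCA one-band image as the value channel and a spatial index instead of the O(n²) scan.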
Kim, Jiseon
2010-01-01
Classification testing has been widely used to make categorical decisions by determining whether an examinee has a certain degree of ability required by established standards. As computer technologies have developed, classification testing has become more computerized. Several approaches have been proposed and investigated in the context of…
TYPES OF POLITICAL REGIMES IN THE IRKUTSK REGION
Directory of Open Access Journals (Sweden)
И В Орлова
2017-12-01
The authors consider contemporary Western and Russian classifications of regional political regimes and their applicability to Russia. Based on the analysis of political theories, the authors chose the traditional typology of regional political regimes focusing on the minimalist interpretation of democracy (electoral competition) and the methods for identifying regional scenarios introduced by V.Ya. Gelman. The authors study the case of the Irkutsk Region as a region with conflicting elites, in which several regional heads were replaced within a short period. Based on contemporary political history, the authors analyze the regional political regime using the following criteria: democracy/autocracy, consolidation/oligopoly, compromise/conflict relations within the ruling elite. The results of the analysis prove the existence of checks and balances in the political system of the Irkutsk Region. Such a system restrains strong politicians' attempts to monopolize political power in the region. When any political player gains too much influence, other centers of power unite against him and together return the situation to the status quo. The political regime of the Irkutsk Region ensures a relatively high level of political competition; at the same time, it is part of the uncompetitive political regime of the Russian Federation, and is therefore a 'hybrid democracy'. The authors' analysis of intra-elite relations in the region revealed a high predisposition to conflict, with the dominant scenario being a 'war of all against all'.
Directory of Open Access Journals (Sweden)
R. Jegadeeshwaran
2015-03-01
In an automobile, the brake system is an essential part responsible for control of the vehicle. Any failure in the brake system impacts the vehicle's motion and can generate catastrophic effects on vehicle and passenger safety. Thus the brake system plays a vital role in an automobile, and hence condition monitoring of the brake system is essential. Vibration-based condition monitoring using machine learning techniques is gaining momentum. This study is one such attempt to perform condition monitoring of a hydraulic brake system through vibration analysis. In this research, the performance of a Clonal Selection Classification Algorithm (CSCA) for brake fault diagnosis has been reported. A hydraulic brake system test rig was fabricated. Under good and faulty conditions of the brake system, vibration signals were acquired using a piezoelectric transducer. Statistical parameters were extracted from the vibration signal. The best feature set was identified for classification using an attribute evaluator. The selected features were then classified using CSCA. The classification accuracy of this artificial intelligence technique has been compared with other machine learning approaches and discussed. The Clonal Selection Classification Algorithm performs better and gives the maximum classification accuracy (96%) for the fault diagnosis of a hydraulic brake system.
Directory of Open Access Journals (Sweden)
Rodrigo de Sales
2017-09-01
Studies of library classification generally interact with the historical contextualization approach and with classification ideas typical of philosophy. In the 19th century, the North American philosopher and educator William Torrey Harris developed a book classification at the St. Louis Public School, based on Francis Bacon and Georg Wilhelm Friedrich Hegel. The objective of this essay is to analyze Harris's classification, reflecting upon its theoretical and philosophical backgrounds. To achieve this objective, the essay adopts a critical-descriptive approach for analysis. Results show influences of Bacon and Hegel in Harris's classification.
Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use
Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil
2013-01-01
The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures exploiting characteristic signatures of such histograms. Two histogram-matching classifiers were evaluated and compared to the standard nearest-neighbor-to-mean classifier. An ADS40 airborne multispectral image of San Diego, California was used for assessing the utility of curve-matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram-matching classifiers consistently performed better than the one based on the standard nearest-neighbor-to-mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
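A minimal sketch of the contrast the study draws, a nearest-mean rule versus a histogram curve matcher, might look like this; the digital-number samples, bin count, and class names are invented:

```python
import numpy as np

BINS = np.linspace(0, 255, 33)  # 32 digital-number bins

def hist(values):
    h, _ = np.histogram(values, bins=BINS, density=True)
    return h

def classify_by_mean(obj, class_means):
    # Standard rule: assign to the class with the nearest mean.
    return min(class_means, key=lambda c: abs(np.mean(obj) - class_means[c]))

def classify_by_histogram(obj, class_hists):
    # Histogram intersection: larger overlap means a better match.
    h = hist(obj)
    return max(class_hists, key=lambda c: np.minimum(h, class_hists[c]).sum())

rng = np.random.default_rng(1)
water = rng.normal(40, 5, 500)
shadow = rng.normal(45, 25, 500)   # similar mean, much wider spread
classes_m = {"water": water.mean(), "shadow": shadow.mean()}
classes_h = {"water": hist(water), "shadow": hist(shadow)}

test_obj = rng.normal(45, 25, 200)  # a shadow-like image object
print(classify_by_mean(test_obj, classes_m),
      classify_by_histogram(test_obj, classes_h))
```

The point of the curve matcher is that it uses the whole distribution of an object's pixel values rather than a single summary statistic.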
Ashrafi, Mahmoud Reza; Tavasoli, Ali Reza
2017-05-01
Childhood leukodystrophies are a growing category of neurological disorders in pediatric neurology practice. With the help of new advanced genetic studies such as whole exome sequencing (WES) and whole genome sequencing (WGS), the list of childhood heritable white matter disorders has grown to more than one hundred disorders. During the last three decades, the basic concepts and definitions, classification, diagnostic approach, and medical management of these disorders have changed considerably. Pattern recognition based on brain magnetic resonance imaging (MRI) has played an important role in this process. We reviewed the latest Global Leukodystrophy Initiative (GLIA) expert opinions on definition, new classification, diagnostic approach, and medical management, including emerging treatments, for pediatric leukodystrophies. Copyright © 2017 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.
The concept of regime values: Are revitalization and regime change possible?
Overeem, P.
2015-01-01
Among the plethora of public values, one special class is that of “regime values.” This notion plays a central role in the constitutional approach to public administration mainly developed by the late John A. Rohr. In this article, an attempt is made to assess the viability of Rohr’s concept of
Botsis, Taxiarchis; Woo, Emily Jane; Ball, Robert
2013-07-01
Automating the classification of adverse event reports is an important step to improve the efficiency of vaccine safety surveillance. Previously we showed it was possible to classify reports using features extracted from the text of the reports. The aim of this study was to use the information encoded in the Medical Dictionary for Regulatory Activities (MedDRA®) in the US Vaccine Adverse Event Reporting System (VAERS) to support and evaluate two classification approaches: a multiple information retrieval strategy and a rule-based approach. To evaluate the performance of these approaches, we selected the conditions of anaphylaxis and Guillain-Barré syndrome (GBS). We used MedDRA® Preferred Terms stored in the VAERS, and two standardized medical terminologies, the Brighton Collaboration (BC) case definitions and Standardized MedDRA® Queries (SMQ), to classify two sets of reports for GBS and anaphylaxis. Two approaches were used: (i) the rule-based instruments made available by the two terminologies (the Automatic Brighton Classification [ABC] tool and the SMQ algorithms); and (ii) the vector space model. We found that the rule-based instruments, particularly the SMQ algorithms, achieved a high degree of specificity; however, there was a cost in terms of sensitivity in all but the narrow GBS SMQ algorithm, which outperformed the remaining approaches (sensitivity in the testing set was 99.06 % for this algorithm vs. 93.40 % for the vector space model). In the case of anaphylaxis, the vector space model achieved higher sensitivity compared with the best values of both the ABC tool and the SMQ algorithms in the testing set (86.44 % vs. 64.11 % and 52.54 %, respectively). Our results showed the superiority of the vector space model over the existing rule-based approaches irrespective of the standardized medical knowledge represented by either the SMQ or the BC case definition. The vector space model might make automation of case definitions for
State Traditions and Language Regimes: A Historical Institutionalism Approach to Language Policy
Directory of Open Access Journals (Sweden)
Sonntag Selma K.
2015-12-01
This paper is an elaboration of a theoretical framework we developed in the introductory chapter of our co-edited volume, State Traditions and Language Regimes (McGill-Queen's University Press, 2015). Using a historical institutionalism approach derived from political science, we argue that language policies need to be understood in terms of their historical and institutional context. The concept of 'state tradition' focuses our attention on the relative autonomy of the state in terms of its normative and institutional traditions that lead to particular path dependencies of language policy choices, subject to change at critical junctures. 'Language regime' is the conceptual link between state traditions and language policy choices: it allows us to analytically conceptualize how and why these choices are made and how and why they change. We suggest that our framework offers a more robust analysis of language politics than other approaches found in sociolinguistics and normative theory. It also challenges political science to become more engaged with scholarly debate on language policy and linguistic diversity.
Directory of Open Access Journals (Sweden)
Dong Jiang
Land cover data represent a fundamental data source for various types of scientific research. The classification of land cover based on satellite data is a challenging task, and an efficient classification method is needed. In this study, an automatic scheme is proposed for the classification of land use using multispectral remote sensing images based on change detection and a semi-supervised classifier. The satellite image can be automatically classified using only the prior land cover map and existing images; therefore human involvement is reduced to a minimum, ensuring the operability of the method. The method was tested in the Qingpu District of Shanghai, China. Using Environment Satellite 1 (HJ-1) images of 2009 with 30 m spatial resolution, the areas were classified into five main types of land cover based on previous land cover data and spectral features. The results agreed well with validation land cover maps, with a Kappa value of 0.79 and statistical area biases in proportion less than 6%. This study proposed a simple semi-automatic approach for land cover classification using prior maps with satisfactory accuracy, which integrates the accuracy of visual interpretation with the performance of automatic classification methods. The method can conveniently be used for land cover mapping in areas lacking ground reference information or for identifying rapid variation of land cover regions (such as rapid urbanization).
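The core of the semi-supervised scheme described above can be sketched as follows: pixels that appear unchanged keep their prior-map label and act as training samples, while changed pixels are relabelled by the nearest class centroid. Band values, the change threshold, and labels are all illustrative, not the study's:

```python
import numpy as np

def classify(prior_labels, old_img, new_img, change_thresh=0.2):
    """Relabel changed pixels using centroids learned from unchanged ones."""
    diff = np.linalg.norm(new_img - old_img, axis=-1)
    unchanged = diff < change_thresh
    labels = prior_labels.copy()
    classes = np.unique(prior_labels[unchanged])
    centroids = {c: new_img[unchanged & (prior_labels == c)].mean(axis=0)
                 for c in classes}
    for idx in np.argwhere(~unchanged):
        px = new_img[tuple(idx)]
        labels[tuple(idx)] = min(
            centroids, key=lambda c: np.linalg.norm(px - centroids[c]))
    return labels

# Tiny 2x2, two-band demo: one pixel changes from class 0 to class 1.
prior = np.array([[0, 0], [1, 1]])
old = np.array([[[.1, .1], [.1, .1]], [[.9, .9], [.9, .9]]])
new = old.copy()
new[0, 1] = [.9, .9]
print(classify(prior, old, new))
```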
Couette flow regimes with heat transfer in rarefied gas
Energy Technology Data Exchange (ETDEWEB)
Abramov, A. A., E-mail: alabr54@mail.ru; Butkovskii, A. V., E-mail: albutkov@mail.ru [Zhukovski Central Aerohydrodynamics Institute (Russian Federation)
2013-06-15
Based on numerical solution of the Boltzmann equation by direct statistical simulation, Couette flow with heat transfer is studied over a broad range of plate temperature ratios and Mach numbers of the moving plate. A classification of flow regimes by the form of the dependences of the energy flux and friction stress on the Knudsen number Kn is proposed. These dependences can be monotonic or nonmonotonic and can have maxima. Situations are possible in which the dependence of the energy flux transferred to a plate on Kn has a minimum, while the dependence of the friction stress is monotonic or even has a maximum. Also, regimes exist in which the dependence of the energy flux on Kn has a maximum while the dependence of the friction stress is monotonic, and vice versa.
Exchange rate regimes and macroeconomic instabilities in Sub-Saharan Africa
Directory of Open Access Journals (Sweden)
Yaya Camara Seydou
2015-01-01
This article addresses macroeconomic instabilities according to exchange rate regimes in Sub-Saharan Africa (SSA). Based on the International Monetary Fund's de facto classification of exchange rate regimes, the global sample, SSA, is first divided into two subsamples, countries within the CFA franc zone (ZCFA) and those outside the CFA franc zone (HZCFA), and then into four categories: the West African Economic and Monetary Union (WAEMU), the Central African Economic and Monetary Community, the countries outside the CFA franc zone with fixed exchange rate regimes (HZCFA-FIX), and the countries outside the CFA franc zone with flexible exchange rate regimes (HZCFA-FLEX). By applying advanced statistical and econometric methods to internal and external macroeconomic equilibrium conditions, we show that inflation, GDP (or output), and the real exchange rate (RER) are very volatile in SSA. However, they are more volatile in the HZCFA group than in the ZCFA group, and higher in the HZCFA-FIX group than in the HZCFA-FLEX group. Moreover, high instability of inflation is combined with instability of output and the RER.
Predictive Models of the Hydrological Regime of Unregulated Streams in Arizona
Anning, David W.; Parker, John T.C.
2009-01-01
Three statistical models were developed by the U.S. Geological Survey in cooperation with the Arizona Department of Environmental Quality to improve the predictability of flow occurrence in unregulated streams throughout Arizona. The models can be used to predict the probabilities of the hydrological regime being one of four categories developed by this investigation: perennial, which has streamflow year-round; nearly perennial, which has streamflow 90 to 99.9 percent of the year; weakly perennial, which has streamflow 80 to 90 percent of the year; or nonperennial, which has streamflow less than 80 percent of the year. The models were developed to assist the Arizona Department of Environmental Quality in selecting sites for participation in the U.S. Environmental Protection Agency's Environmental Monitoring and Assessment Program. One model was developed for each of the three hydrologic provinces in Arizona - the Plateau Uplands, the Central Highlands, and the Basin and Range Lowlands. The models for predicting the hydrological regime were calibrated using statistical methods and explanatory variables of discharge, drainage-area, altitude, and location data for selected U.S. Geological Survey streamflow-gaging stations and a climate index derived from annual precipitation data. Models were calibrated on the basis of streamflow data from 46 stations for the Plateau Uplands province, 82 stations for the Central Highlands province, and 90 stations for the Basin and Range Lowlands province. The models were developed using classification trees that facilitated the analysis of mixed numeric and factor variables. In all three models, a threshold stream discharge was the initial variable to be considered within the classification tree and was the single most important explanatory variable. If a stream discharge value at a station was below the threshold, then the station record was classified as nonperennial. If, however, the stream discharge was above the threshold
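The classification-tree logic described above (a discharge threshold first, then further splits) can be sketched as nested threshold tests; the split values below are invented for illustration and are not those of the calibrated USGS models:

```python
# Hypothetical classification tree: an initial discharge threshold,
# then a climate-index split. All thresholds are made up.
def hydrologic_regime(median_discharge_cfs, climate_index):
    if median_discharge_cfs < 0.1:     # initial, most important split
        return "nonperennial"
    if climate_index < 0.5:            # drier sites
        return "weakly perennial"
    if median_discharge_cfs < 5.0:
        return "nearly perennial"
    return "perennial"

print(hydrologic_regime(0.05, 0.9))   # nonperennial
print(hydrologic_regime(20.0, 0.9))   # perennial
```

In practice such trees are grown automatically from the station data rather than hand-written, but the fitted model reduces to exactly this kind of nested rule.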
Fire Regime Characteristics along Environmental Gradients in Spain
Directory of Open Access Journals (Sweden)
María Vanesa Moreno
2016-11-01
Concern regarding global change has increased the need to understand the relationship between fire regime characteristics and the environment. Pyrogeographical theory suggests that fire regimes are constrained by climate, vegetation, and fire ignition processes, but it is not obvious how fire regime characteristics are related to those factors. We used a three-matrix approach with a multivariate statistical methodology that combined an ordination method and fourth-corner analysis for hypothesis testing to investigate the relationship between fire regime characteristics and environmental gradients across Spain. Our results suggest that fire regime characteristics (i.e., density and seasonality of fire activity) are constrained primarily by direct gradients based on climate, population, and resource gradients based on forest potential productivity. Our results can be used to establish a predictive model for how fire regimes emerge in order to support fire management, particularly as global environmental changes impact fire regime characteristics.
International Nuclear Information System (INIS)
Gaudio, P; Gelfusa, M; Lupelli, I; Murari, A; Vega, J
2014-01-01
A new approach to determine the power law expressions for the threshold between the H and L mode of confinement is presented. The method is based on two powerful machine learning tools for classification: neural networks and support vector machines. Using as inputs clear examples of the systems on either side of the transition, the machine learning tools learn the input–output mapping corresponding to the equations of the boundary separating the confinement regimes. Systematic tests with synthetic data show that the machine learning tools provide results competitive with traditional statistical regression and more robust against random noise and systematic errors. The developed tools have then been applied to the multi-machine International Tokamak Physics Activity International Global Threshold Database of validated ITER-like Tokamak discharges. The machine learning tools converge on the same scaling law parameters obtained with non-linear regression. On the other hand, the developed tools allow a reduction of 50% of the uncertainty in the extrapolations to ITER. Therefore the proposed approach can effectively complement traditional regression since its application poses much less stringent requirements on the experimental data, to be used to determine the scaling laws, because they do not require examples exactly at the moment of the transition. (paper)
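A rough stand-in for the boundary-learning idea: in logarithmic variables a power-law threshold becomes a linear boundary, so the weights of a simple linear classifier play the role of the scaling-law exponents. The data and the exponents (0.7 and 0.8) are synthetic, and plain logistic regression stands in for the paper's neural networks and support vector machines:

```python
import numpy as np

rng = np.random.default_rng(2)
n = rng.uniform(0.5, 5.0, 4000)    # density-like variable (arbitrary units)
B = rng.uniform(1.0, 4.0, 4000)    # field-like variable (arbitrary units)
P = rng.uniform(0.1, 30.0, 4000)   # heating power (arbitrary units)
# Synthetic "true" threshold: P_th = n**0.7 * B**0.8 (invented exponents).
y = (np.log(P) > 0.7 * np.log(n) + 0.8 * np.log(B)).astype(float)  # 1 = H mode

X = np.column_stack([np.log(P), np.log(n), np.log(B), np.ones_like(P)])
w = np.zeros(4)
for _ in range(20000):             # plain full-batch gradient descent
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.2 * X.T @ (p - y) / len(y)

# The boundary w.x = 0 rearranges to log P = a*log n + b*log B + c, so
# -w[1]/w[0] and -w[2]/w[0] roughly recover the exponents 0.7 and 0.8.
print(-w[1] / w[0], -w[2] / w[0])
```

As in the paper, the classifier only needs examples on either side of the transition, not samples taken exactly at the transition itself.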
Solar wind classification from a machine learning perspective
Heidrich-Meisner, V.; Wimmer-Schweingruber, R. F.
2017-12-01
It is well known that the ubiquitous solar wind comes in at least two varieties, the slow solar wind and the coronal hole wind. This simplified view of two solar wind types has been frequently challenged. Existing solar wind categorization schemes rely mainly on different combinations of the solar wind proton speed, the O and C charge state ratios, the Alfvén speed, the expected proton temperature, and the specific proton entropy. In available solar wind classification schemes, solar wind from stream interaction regions is often considered either coronal hole wind or slow solar wind, although its plasma properties differ from those of "pure" coronal hole or slow solar wind. As shown in Neugebauer et al. (2016), even if only two solar wind types are assumed, available solar wind categorization schemes differ considerably for intermediate solar wind speeds. Thus, the decision boundary between the coronal hole and the slow solar wind is so far not well defined. In this situation, a machine learning approach to solar wind classification can provide an additional perspective. We apply a well-known machine learning method, k-means, to the task of solar wind classification in order to answer the following questions: (1) How many solar wind types can reliably be identified in our data set, comprised of ten years of solar wind observations from the Advanced Composition Explorer (ACE)? (2) Which combinations of solar wind parameters are particularly useful for solar wind classification? Potential subtypes of slow solar wind are of particular interest because they can provide hints of respective different source regions or release mechanisms of slow solar wind.
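A minimal k-means sketch in the spirit of the task described above; the two synthetic clusters stand in for slow and coronal hole wind, and real ACE measurements would replace them:

```python
import numpy as np

def kmeans(X, init_idx, iters=20):
    """Lloyd's algorithm; deterministic seeding here for reproducibility
    (a scheme like k-means++ would be used in practice)."""
    centers = X[init_idx].astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0)
                            for j in range(len(centers))])
    return labels, centers

rng = np.random.default_rng(3)
slow = np.column_stack([rng.normal(400, 30, 300),     # proton speed, km/s
                        rng.normal(0.30, 0.05, 300)])  # charge-state proxy
fast = np.column_stack([rng.normal(650, 40, 300),
                        rng.normal(0.10, 0.03, 300)])
X = np.vstack([slow, fast])
X = (X - X.mean(0)) / X.std(0)   # scale features before clustering
labels, centers = kmeans(X, [0, len(X) - 1])
```

Varying k and the chosen parameter combinations, as the study proposes, then shows how many wind types the data support.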
Buried penis: classification and surgical approach.
Hadidi, Ahmed T
2014-02-01
The purpose of this study was to describe a morphological classification of congenital buried penis (BP) and present a versatile surgical approach for correction. Sixty-one patients referred with BP were classified into 3 grades according to morphological findings: Grade I, 29 patients with a Longer Inner Prepuce (LIP) only; Grade II, 20 patients who presented with LIP associated with an indrawn penis that required division of the fundiform and suspensory ligaments; and Grade III, 12 patients who had, in addition to the above, excess suprapubic fat. A ventral midline penile incision extending from the tip of the prepuce down to the penoscrotal junction was used in all patients. The operation was tailored according to the BP grade. All patients underwent circumcision. Mean follow-up was 3 years (range 1 to 10). All 61 patients had an abnormally long inner prepuce (LIP). Forty-seven patients had a short penile shaft. Early improvement was noted in all cases. Satisfactory results were achieved in all 29 patients in Grade I and in 27 patients in Grades II and III. Five children (Grades II and III) required further surgery (9%). Congenital buried penis is a spectrum characterized by LIP and may additionally include a short penile shaft, abnormal attachment of the fundiform and suspensory ligaments, and excess suprapubic fat. Congenital Mega Prepuce (CMP) is a variant of Grade I BP, with LIP characterized by intermittent ballooning of the genital area. Copyright © 2014 Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Xiaodong Na
2018-05-01
Zhalong wetland is a globally important breeding habitat for many rare migratory bird species. Prompted by the high demand for temporal and spatial information about the wetland's hydrological regimes and landscape patterns, eight time series Radarsat-2 images were utilized to detect the flooding characteristics of the Zhalong wetland. Subsequently, a random forest model was built to discriminate wetlands from other land cover types, combining optical, radar, and hydrological regime data derived from multitemporal synthetic aperture radar (SAR) images. The results showed that hydrological regime variables, including flooding extent and flooding frequency, derived from multitemporal SAR images improve the land cover classification accuracy in the natural wetlands distribution area. The permutation importance scores derived from the random forest classifier indicate that the normalized difference vegetation index (NDVI) calculated from optical imagery and the flooding frequency derived from multitemporal SAR imagery were the most important variables for land cover mapping. Accuracy testing indicates that the addition of hydrological regime features effectively depressed the omission error rates (from 52.14% to 2.88%) of marsh and the commission error (from 77.34% to 51.27%) of meadow, thereby improving the overall classification accuracy (from 76.49% to 91.73%). The hydrological regimes and land cover monitoring in the typical wetlands are important for eco-hydrological modeling, biodiversity conservation, and regional ecology and water security.
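A toy illustration of the reported accuracy gain: marsh and meadow overlap in an optical index alone but separate once a flooding-frequency feature is added. A nearest-centroid rule replaces the paper's random forest, and all values are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
# Columns: NDVI-like index, flooding frequency (both synthetic).
marsh = np.column_stack([rng.normal(0.55, 0.08, 400),
                         rng.normal(0.70, 0.10, 400)])
meadow = np.column_stack([rng.normal(0.55, 0.08, 400),
                          rng.normal(0.10, 0.10, 400)])

def nearest_centroid_accuracy(a, b, dims):
    """Two-class nearest-centroid accuracy using only the given columns."""
    ca, cb = a[:, dims].mean(0), b[:, dims].mean(0)
    correct = (np.linalg.norm(a[:, dims] - ca, axis=1) <
               np.linalg.norm(a[:, dims] - cb, axis=1)).sum()
    correct += (np.linalg.norm(b[:, dims] - cb, axis=1) <
                np.linalg.norm(b[:, dims] - ca, axis=1)).sum()
    return correct / (len(a) + len(b))

print(nearest_centroid_accuracy(marsh, meadow, [0]))     # NDVI only, ~chance
print(nearest_centroid_accuracy(marsh, meadow, [0, 1]))  # with flooding freq.
```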
IAEA Classification of Uranium Deposits
International Nuclear Information System (INIS)
Bruneton, Patrice
2014-01-01
Classifications of uranium deposits follow two general approaches, focusing on: • descriptive features such as the geotectonic position, the host rock type, the orebody morphology, etc.: «geologic classification»; • or on genetic aspects: «genetic classification»
Signal classification using global dynamical models, Part I: Theory
International Nuclear Information System (INIS)
Kadtke, J.; Kremliovsky, M.
1996-01-01
Detection and classification of signals is one of the principal areas of signal processing, and the utilization of nonlinear information has long been considered as a way of improving performance beyond standard linear (e.g. spectral) techniques. Here, we develop a method for using global models of chaotic dynamical systems theory to define a signal classification processing chain, which is sensitive to nonlinear correlations in the data. We use it to demonstrate classification in high noise regimes (negative SNR), and argue that classification probabilities can be directly computed from ensemble statistics in the model coefficient space. We also develop a modification for non-stationary signals (i.e. transients) using non-autonomous ODEs. In Part II of this paper, we demonstrate the analysis on actual open ocean acoustic data from marine biologics. copyright 1996 American Institute of Physics
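The central idea, fitting a global dynamical model to a signal and using the fitted coefficients as classification features, can be sketched with a quadratic map and least squares; the logistic-map series is a stand-in for real data:

```python
import numpy as np

def model_coefficients(x):
    """Least-squares fit of x[n+1] = c0 + c1*x[n] + c2*x[n]**2; the
    coefficient vector (c0, c1, c2) serves as a feature for classification."""
    A = np.column_stack([np.ones(len(x) - 1), x[:-1], x[:-1] ** 2])
    coeffs, *_ = np.linalg.lstsq(A, x[1:], rcond=None)
    return coeffs

# Chaotic logistic map x[n+1] = r*x[n]*(1 - x[n]) with r = 3.9, i.e.
# c0 = 0, c1 = 3.9, c2 = -3.9 in the model above.
x = np.empty(500)
x[0] = 0.3
for n in range(499):
    x[n + 1] = 3.9 * x[n] * (1 - x[n])

c = model_coefficients(x)
print(np.round(c, 3))   # ≈ [0., 3.9, -3.9]
```

Signals from different sources land in different regions of the coefficient space, which is where the ensemble statistics mentioned above come in.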
Transient dynamics of a quantum-dot: From Kondo regime to mixed valence and to empty orbital regimes
Cheng, YongXi; Li, ZhenHua; Wei, JianHua; Nie, YiHang; Yan, YiJing
2018-04-01
Based on the hierarchical equations of motion approach, we study the time-dependent transport properties of a strongly correlated quantum dot system in the Kondo regime (KR), mixed valence regime (MVR), and empty orbital regime (EOR). We find that the transient current in KR shows the strongest nonlinear response and the most distinct oscillation behaviors. Both behaviors become weaker in MVR and diminish in EOR. To understand the physical insight, we examine also the corresponding dot occupancies and the spectral functions, with their dependence on the Coulomb interaction, temperature, and applied step bias voltage. The above nonlinear and oscillation behaviors could be understood as the interplay between dynamical Kondo resonance and single electron resonant-tunneling.
The effects of crude oil shocks on stock market shifts behaviour: A regime switching approach
Energy Technology Data Exchange (ETDEWEB)
Aloui, Chaker; Jammazi, Rania [International Finance Group-Tunisia, Faculty of Management and Economic Sciences of Tunis, Boulevard du 7 novembre, El Manar University, B.P. 248, C.P. 2092, Tunis Cedex (Tunisia)
2009-09-15
In this paper we develop a two-regime Markov-switching EGARCH model introduced by Henry [Henry, O., 2009. Regime switching in the relationship between equity returns and short-term interest rates. Journal of Banking and Finance 33, 405-414.] to examine the relationship between crude oil shocks and stock markets. An application to the stock markets of the UK, France, and Japan over the sample period January 1989 to December 2007 illustrates plausible results. We detect two episodes of series behaviour: one relative to a low mean/high variance regime and the other to a high mean/low variance regime. Furthermore, there is evidence that common recessions coincide with the low mean/high variance regime. In addition, we allow both real stock returns and the probability of transitions from one regime to another to depend on the net oil price increase variable. The findings show that rises in the oil price have a significant role in determining both the volatility of stock returns and the probability of transition across regimes. (author)
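The two-regime behaviour described above can be illustrated by simulating a two-state Markov chain with regime-dependent mean and variance; the transition probabilities and moments below are invented, not the paper's estimated EGARCH parameters:

```python
import numpy as np

rng = np.random.default_rng(5)
P = np.array([[0.95, 0.05],    # regime 0: low mean / high variance
              [0.02, 0.98]])   # regime 1: high mean / low variance
mu, sigma = [-0.1, 0.05], [2.0, 0.5]

state, states, returns = 0, [], []
for _ in range(5000):
    states.append(state)
    returns.append(rng.normal(mu[state], sigma[state]))
    state = rng.choice(2, p=P[state])    # Markov transition
states, returns = np.array(states), np.array(returns)

print(returns[states == 0].std() > returns[states == 1].std())  # True
```

The estimation problem the paper solves is the inverse of this simulation: recovering the regime parameters and transition probabilities (here also driven by the oil price variable) from the observed returns alone.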
The global safety regime - Setting the stage
International Nuclear Information System (INIS)
Meserve, R.A.
2005-01-01
The existing global safety regime has arisen from the exercise of sovereign authority, with an overlay of voluntary international cooperation from a network of international and regional organizations and intergovernmental agreements. This system has, in the main, served us well. For several reasons, the time is ripe to consider the desired shape of a future global safety regime and to take steps to achieve it. First, every nation's reliance on nuclear power is hostage to some extent to safety performance elsewhere in the world because of the effects on public attitudes, and hence there is an interest in ensuring achievement of common standards. Second, the world is increasingly interdependent and the vendors of nuclear power plants seek to market their products throughout the globe. Efficiency would arise from the avoidance of needless differences in approach that require custom modifications from country to country. Finally, we have much to learn from each other and a common effort would strengthen us all. Such an effort might also serve to enhance public confidence. Some possible characteristics of such a regime can be identified. The regime should reflect a global consensus on the level of safety that should be achieved. There should be sufficient standardization of approach so that expertise and equipment can be used everywhere without significant modification. There should be efforts to ensure a fundamental commitment to safety and the encouragement of a safety culture. And there should be efforts to adopt more widely the best regulatory practices, recognizing that some modifications in approach may be necessary to reflect each nation's legal and social culture. At the same time, the regime should have the characteristics of flexibility, transparency, stability, practicality, and encouragement of competence. (author)
Gauduel, Y. A.
2017-05-01
A major challenge of spatio-temporal radiation biomedicine concerns the understanding of biophysical events triggered by an initial energy deposition inside confined ionization tracks. This contribution deals with an interdisciplinary approach that concerns cutting-edge advances in real-time radiation events, considering the potentialities of innovating strategies based on ultrafast laser science, from femtosecond photon sources to advanced techniques of ultrafast TW laser-plasma accelerators. Recent advances of powerful TW laser sources (~10^19 W cm^-2) and laser-plasma interactions providing ultra-short relativistic particle beams in the energy domain 5-200 MeV open promising opportunities for the development of high energy radiation femtochemistry (HERF) in the prethermal regime of secondary low-energy electrons and for the real-time imaging of radiation-induced biomolecular alterations at the nanoscopic scale. New developments would make it possible to correlate early radiation events triggered by ultrashort radiation sources with a molecular approach to Relative Biological Effectiveness (RBE). These emerging research developments are crucial to understand simultaneously, at the sub-picosecond and nanometric scales, the early consequences of ultra-short-pulsed radiation on biomolecular environments or integrated biological entities. This innovating approach would be applied to biomedically relevant concepts such as the emerging domain of real-time nanodosimetry for targeted pro-drug activation and pulsed radio-chemotherapy of cancers.
An attempt of classification of theoretical approaches to national identity
Directory of Open Access Journals (Sweden)
Milošević-Đorđević Jasna S.
2003-01-01
Complex social concepts are necessarily defined in different ways and approached from the perspective of different scientific disciplines; it is therefore difficult to define them precisely without overlap of meaning with other similar concepts. This paper attempts a theoretical classification of national identity and differentiates that concept from related concepts (race, ethnic group, nation, national background, authoritarianism, patriarchy). Theoretical assessments are classified into two groups: those dealing with the nature of national identity, and those stating one or more dimensions of national identity crucial for its determination. In contrast to the primordialist concept of national identity, which describes it as a fundamental, deeply rooted human feature, numerous contemporary theoretical approaches (instrumentalist, constructivist, functionalist) emphasize the changeable, fluid, instrumental function of national identity. Fundamental determinants of national identity are: language, culture (music, traditional myths), state symbols (territory, citizenship), self-categorization, religion, and a set of personal characteristics and values.
Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier
2009-01-01
The increasing capability of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current tendency in classification is oriented towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used for the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989
Abatement costs of post-Kyoto climate regimes
International Nuclear Information System (INIS)
Elzen, Michel den; Lucas, Paul; Vuuren, Detlef van
2005-01-01
This article analyses the abatement costs of three post-Kyoto regimes for differentiating commitments compatible with stabilising atmospheric greenhouse gas concentrations at 550 ppmv CO2-equivalent in 2100. The three regimes explored are: (1) the Multi-Stage approach, which assumes a gradual increase in the number of Parties involved adopting either emission intensity or reduction targets; (2) the Brazilian Proposal approach, i.e. the allocation of reductions based on countries' contribution to temperature increase; (3) Contraction and Convergence, with full participation in convergence of per capita emission allowances. In 2050, the global costs increase up to about 1% of world GDP, ranging from 0.5% to 1.5% depending on the baseline scenario and marginal abatement costs. Four groups of regions can be identified on the basis of similar costs (expressed as a percentage of GDP): (1) OECD regions with average costs; (2) the FSU, the Middle East, and Latin America with high costs; (3) South-East Asia and East Asia (incl. China) with low costs; and (4) South Asia (incl. India) and Africa with net gains from emissions trading for most regimes. The Brazilian Proposal approach gives the highest costs for groups 1 and 2. The distribution of costs for the Contraction and Convergence approach depends strongly on the convergence year. The Multi-Stage approach and Contraction and Convergence (convergence year 2050) seem to result in relatively the most even distribution of costs amongst all Parties.
Comparison of four approaches to a rock facies classification problem
Dubois, M.K.; Bohling, Geoffrey C.; Chakrabarti, S.
2007-01-01
In this study, seven classifiers based on four different approaches were tested on a rock facies classification problem: classical parametric methods using Bayes' rule, and non-parametric methods using fuzzy logic, k-nearest neighbor, and a feed-forward, back-propagating artificial neural network. The objective was to determine the most effective classifier for geologic facies prediction in wells without cores in the Panoma gas field in southwest Kansas. The study data include 3600 samples with known rock facies class (from core), each sample having either four or five measured properties (wire-line log curves) and two derived geologic properties (geologic constraining variables). The sample set was divided into two subsets, one for training and one for testing the ability of the trained classifier to correctly assign classes. Artificial neural networks clearly outperformed all other classifiers and are effective tools for this particular classification problem. Classical parametric models were inadequate due to the nature of the predictor variables (high-dimensional and not linearly correlated) and the feature space of the classes (overlapping). The other non-parametric methods tested, k-nearest neighbor and fuzzy logic, would need considerable improvement to match the neural network's effectiveness, but further work, possibly combining certain aspects of the three non-parametric methods, may be justified. © 2006 Elsevier Ltd. All rights reserved.
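The train/test protocol described above can be sketched with a minimal k-nearest-neighbor classifier in plain Python; the two-feature samples and facies names below are invented for illustration and are not the Panoma well-log data.

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training samples
    (Euclidean distance in feature space)."""
    dists = sorted((math.dist(x, t), y) for t, y in zip(train_X, train_y))
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# toy two-feature "log curve" samples for two hypothetical facies
train_X = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25),
           (0.9, 0.8), (0.8, 0.9), (0.85, 0.75)]
train_y = ["shale", "shale", "shale",
           "carbonate", "carbonate", "carbonate"]
test_X = [(0.12, 0.18), (0.88, 0.82)]
preds = [knn_predict(train_X, train_y, x) for x in test_X]
print(preds)
```

Held-out samples are scored exactly as in the study's protocol: the classifier never sees the test subset during "training" (here, simply storing the labeled samples).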
She Son Gun
2014-01-01
Approaches to developing a classification of state methods of regulating the economy are considered. On the basis of this review, a complex method of state regulation of business activity is substantiated. The proposed principles allow improving public administration and can be used in industry concepts and state programmes supporting small business in fishery.
Energy Technology Data Exchange (ETDEWEB)
Albert, Christopher G.; Heyn, Martin F.; Kapper, Gernot; Kernbichler, Winfried; Martitsch, Andreas F. [Fusion@ÖAW, Institut für Theoretische Physik - Computational Physics, Technische Universität Graz, Petersgasse 16, 8010 Graz (Austria); Kasilov, Sergei V. [Fusion@ÖAW, Institut für Theoretische Physik - Computational Physics, Technische Universität Graz, Petersgasse 16, 8010 Graz (Austria); Institute of Plasma Physics, National Science Center “Kharkov Institute of Physics and Technology,” ul. Akademicheskaya 1, 61108 Kharkov (Ukraine)
2016-08-15
Toroidal torque generated by neoclassical viscosity caused by external non-resonant, non-axisymmetric perturbations has a significant influence on toroidal plasma rotation in tokamaks. In this article, a derivation for the expressions of toroidal torque and radial transport in resonant regimes is provided within quasilinear theory in canonical action-angle variables. The proposed approach treats all low-collisional quasilinear resonant neoclassical toroidal viscosity regimes including superbanana-plateau and drift-orbit resonances in a unified way and allows for magnetic drift in all regimes. It is valid for perturbations on toroidally symmetric flux surfaces of the unperturbed equilibrium without specific assumptions on geometry or aspect ratio. The resulting expressions are shown to match the existing analytical results in the large aspect ratio limit. Numerical results from the newly developed code NEO-RT are compared to calculations by the quasilinear version of the code NEO-2 at low collisionalities. The importance of the magnetic shear term in the magnetic drift frequency and a significant effect of the magnetic drift on drift-orbit resonances are demonstrated.
Automated lung nodule classification following automated nodule detection on CT: A serial approach
International Nuclear Information System (INIS)
Armato, Samuel G. III; Altman, Michael B.; Wilkie, Joel; Sone, Shusuke; Li, Feng; Doi, Kunio; Roy, Arunabha S.
2003-01-01
We have evaluated the performance of an automated classifier applied to the task of differentiating malignant and benign lung nodules in low-dose helical computed tomography (CT) scans acquired as part of a lung cancer screening program. The nodules classified in this manner were initially identified by our automated lung nodule detection method, so that the output of automated lung nodule detection was used as input to automated lung nodule classification. This study begins to narrow the distinction between the 'detection task' and the 'classification task'. Automated lung nodule detection is based on two- and three-dimensional analyses of the CT image data. Gray-level-thresholding techniques are used to identify initial lung nodule candidates, for which morphological and gray-level features are computed. A rule-based approach is applied to reduce the number of nodule candidates that correspond to non-nodules, and the features of remaining candidates are merged through linear discriminant analysis to obtain final detection results. Automated lung nodule classification merges the features of the lung nodule candidates identified by the detection algorithm that correspond to actual nodules through another linear discriminant classifier to distinguish between malignant and benign nodules. The automated classification method was applied to the computerized detection results obtained from a database of 393 low-dose thoracic CT scans containing 470 confirmed lung nodules (69 malignant and 401 benign nodules). Receiver operating characteristic (ROC) analysis was used to evaluate the ability of the classifier to differentiate between nodule candidates that correspond to malignant nodules and nodule candidates that correspond to benign lesions. The area under the ROC curve for this classification task attained a value of 0.79 during a leave-one-out evaluation.
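The ROC figure quoted above (area 0.79) can be computed from classifier scores without plotting a curve: the AUC equals the Mann-Whitney probability that a randomly chosen positive case outscores a randomly chosen negative one. The scores below are hypothetical, not the study's discriminant outputs.

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive (malignant) score exceeds a
    random negative (benign) score, counting ties as half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# hypothetical discriminant scores for malignant (pos) and benign (neg) candidates
pos = [0.9, 0.8, 0.7, 0.4]
neg = [0.6, 0.5, 0.3, 0.2, 0.1]
print(round(roc_auc(pos, neg), 3))  # 0.9 for these toy scores
```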
Directory of Open Access Journals (Sweden)
Saeid Hamzeh
2016-10-01
Land suitability classification is important in planning and managing sustainable land use. Most approaches to land suitability analysis combine a large number of land and soil parameters, and are time-consuming and costly. In this study, a potentially useful technique (a combined feature selection and fuzzy-AHP method) to increase the efficiency of land suitability analysis is presented. To this end, three different feature selection algorithms (random search, best search and genetic methods) were used to determine the most effective parameters for land suitability classification for the cultivation of barley in the Shavur Plain, southwest Iran. Next, land suitability classes were calculated for all methods by using the fuzzy-AHP approach. Salinity (electrical conductivity, EC), alkalinity (exchangeable sodium percentage, ESP), wetness and soil texture were selected using the random search method. Gypsum, EC, ESP and soil texture were selected using both the best search and genetic methods. The results show strong agreement between the standard fuzzy-AHP method and the methods presented in this study: the Kappa coefficients were 0.82, 0.79 and 0.79 for the random search, best search and genetic methods, respectively, compared with the standard fuzzy-AHP method. Our results indicate that EC, ESP, soil texture and wetness are the most effective features for evaluating land suitability for the cultivation of barley in the study area, and that the use of these parameters, together with their appropriate weights as obtained from fuzzy-AHP, can yield good results for land suitability classification. The combined feature selection and fuzzy-AHP approach thus has the potential to save time and money in land suitability classification.
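The fuzzy-AHP weighting step mentioned above rests on deriving priority weights from a pairwise comparison matrix. A minimal sketch, assuming an invented set of judgements over the four selected criteria (EC, ESP, texture, wetness), uses power iteration to approximate the principal eigenvector:

```python
def ahp_weights(pairwise, iters=100):
    """Priority weights of an AHP pairwise comparison matrix: the
    normalized principal eigenvector, approximated by power iteration."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# hypothetical judgements: pairwise[i][j] = relative importance of
# criterion i over criterion j, for (EC, ESP, texture, wetness)
pairwise = [
    [1.0,   2.0,   3.0,   5.0],  # EC
    [1/2.0, 1.0,   2.0,   3.0],  # ESP
    [1/3.0, 1/2.0, 1.0,   2.0],  # texture
    [1/5.0, 1/3.0, 1/2.0, 1.0],  # wetness
]
w = ahp_weights(pairwise)
print([round(x, 3) for x in w])
```

The weights sum to one and rank the criteria in the order implied by the judgements; a full AHP workflow would also check the consistency ratio of the matrix.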
Garbin, Silvia; Alessi Celegon, Elisa; Fanton, Pietro; Botter, Gianluca
2017-04-01
The temporal variability of river flow regime is a key feature structuring and controlling fluvial ecological communities and ecosystem processes. In particular, streamflow variability induced by climate/landscape heterogeneities or other anthropogenic factors significantly affects the connectivity between streams, with notable implications for river fragmentation. Hydrologic connectivity is a fundamental property that guarantees species persistence and ecosystem integrity in riverine systems. In riverine landscapes, most ecological transitions are flow-dependent and the structure of flow regimes may affect ecological functions of endemic biota (i.e., fish spawning or grazing of invertebrate species). Therefore, minimum flow thresholds must be guaranteed to support specific ecosystem services, like fish migration, aquatic biodiversity and habitat suitability. In this contribution, we present a probabilistic approach aiming at a spatially-explicit, quantitative assessment of hydrologic connectivity at the network scale as derived from river flow variability. Dynamics of daily streamflows are estimated based on catchment-scale climatic and morphological features, integrating a stochastic, physically based approach that accounts for the stochasticity of rainfall with a water balance model and a geomorphic recession flow model. The non-exceedance probability of ecologically meaningful flow thresholds is used to evaluate the fragmentation of individual stream reaches, and the ensuing network-scale connectivity metrics. A multi-dimensional Poisson process for the stochastic generation of rainfall is used to evaluate the impact of climate signature on reach-scale and catchment-scale connectivity. The analysis shows that streamflow patterns and network-scale connectivity are influenced by the topology of the river network and the spatial variability of climatic properties (rainfall, evapotranspiration). The framework offers a robust basis for the prediction of the impact of
PATTERN CLASSIFICATION APPROACHES TO MATCHING BUILDING POLYGONS AT MULTIPLE SCALES
Directory of Open Access Journals (Sweden)
X. Zhang
2012-07-01
Matching of building polygons with different levels of detail is crucial in the maintenance and quality assessment of multi-representation databases. Two general problems need to be addressed in the matching process: (1) which criteria are suitable? (2) how can different criteria be combined effectively to make decisions? This paper focuses mainly on the second issue and views data matching as a supervised pattern classification problem. Several classifiers (i.e. decision trees, Naive Bayes and support vector machines) are evaluated for the matching task. Four criteria (i.e. position, size, shape and orientation) are used to extract information for these classifiers. Evidence shows that these classifiers outperformed the weighted average approach.
Sow-activity classification from acceleration patterns
DEFF Research Database (Denmark)
Escalante, Hugo Jair; Rodriguez, Sara V.; Cordero, Jorge
2013-01-01
This paper describes a supervised learning approach to sow-activity classification from accelerometer measurements. In the proposed methodology, pairs of accelerometer measurements and activity types are considered as labeled instances of a usual supervised classification task. Under this scenario, sow-activity classification can be approached with standard machine learning methods for pattern classification. Individual predictions for elements of time series of arbitrary length are combined to classify it as a whole. An extensive comparison of representative learning algorithms, including neural networks, support vector machines, and ensemble methods, is presented. Experimental results are reported using a data set for sow-activity classification collected in a real production herd. The data set, which has been widely used in related works, includes measurements from active (feeding…)
Chen, D. W.; Sengupta, S. K.; Welch, R. M.
1989-01-01
This paper compares the results of cloud-field classification derived from two simplified vector approaches, the Sum and Difference Histogram (SADH) and the Gray Level Difference Vector (GLDV), with the results produced by the Gray Level Cooccurrence Matrix (GLCM) approach described by Welch et al. (1988). It is shown that the SADH method produces accuracies equivalent to those obtained using the GLCM method, while the GLDV method fails to resolve error clusters. Compared to the GLCM method, the SADH method leads to a 31 percent saving in run time and a 50 percent saving in storage requirements, while the GLDV approach leads to a 40 percent saving in run time and an 87 percent saving in storage requirements.
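The SADH features referenced above come from Unser-style sum and difference histograms. A minimal sketch for one displacement, with invented 3x3 patches, computes the histogram-based mean and contrast:

```python
def sadh_features(img, dx=1, dy=0):
    """Sum- and difference-histogram texture features for one pixel
    displacement (dx, dy): returns (mean, contrast).
    mean = (1/2) * sum_s s * P_sum(s);  contrast = sum_d d^2 * P_diff(d)."""
    rows, cols = len(img), len(img[0])
    sums, diffs = {}, {}
    count = 0
    for r in range(rows - dy):
        for c in range(cols - dx):
            s = img[r][c] + img[r + dy][c + dx]
            d = img[r][c] - img[r + dy][c + dx]
            sums[s] = sums.get(s, 0) + 1
            diffs[d] = diffs.get(d, 0) + 1
            count += 1
    mean = 0.5 * sum(s * n for s, n in sums.items()) / count
    contrast = sum(d * d * n for d, n in diffs.items()) / count
    return mean, contrast

# a smooth patch and a checkerboard patch: same mean, very different contrast
smooth = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]
checker = [[0, 10, 0], [10, 0, 10], [0, 10, 0]]
print(sadh_features(smooth), sadh_features(checker))
```

The savings reported in the abstract come from the fact that the two 1-D histograms replace the full 2-D cooccurrence matrix while supporting comparable feature sets.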
Mackinejad, Kioumars; Sharifi, Vandad
2006-01-01
In this paper the importance of Wittgenstein's philosophical ideas for the justification of a dimensional approach to the classification of mental disorders is discussed. Some of his basic concepts in his Philosophical Investigations, such as 'family resemblances', 'grammar' and 'language-game' and their relations to the concept of mental disorder are explored.
Shifting balance of thermokarst lake ice regimes across the Arctic Coastal Plain of northern Alaska
Arp, Christopher D.; Jones, Benjamin M.; Lu, Zong; Whitman, Matthew S.
2012-01-01
The balance of thermokarst lakes with bedfast- and floating-ice regimes across Arctic lowlands regulates heat storage, permafrost thaw, winter-water supply, and over-wintering aquatic habitat. Using a time-series of late-winter synthetic aperture radar (SAR) imagery to distinguish lake ice regimes in two regions of the Arctic Coastal Plain of northern Alaska from 2003–2011, we found that 18% of the lakes had intermittent ice regimes, varying between bedfast-ice and floating-ice conditions. Comparing this dataset with a radar-based lake classification from 1980 showed that 16% of the bedfast-ice lakes had shifted to floating-ice regimes. A simulated lake ice thinning trend of 1.5 cm/yr since 1978 is believed to be the primary factor driving this form of lake change. The most profound impacts of this regime shift in Arctic lakes may be an increase in the landscape-scale thermal offset created by additional lake heat storage and its role in talik development in otherwise continuous permafrost as well as increases in over-winter aquatic habitat and winter-water supply.
Directory of Open Access Journals (Sweden)
Benjamin W. Heumann
2011-11-01
Mangroves provide valuable ecosystem goods and services such as carbon sequestration, habitat for terrestrial and marine fauna, and coastal hazard mitigation. The use of satellite remote sensing to map mangroves has become widespread as it can provide accurate, efficient, and repeatable assessments. Traditional remote sensing approaches have failed to accurately map fringe mangroves and true mangrove species due to relatively coarse spatial resolution and/or spectral confusion with landward vegetation. This study demonstrates the use of the new WorldView-2 sensor, object-based image analysis (OBIA), and support vector machine (SVM) classification to overcome both of these limitations. An exploratory spectral separability analysis showed that individual mangrove species could not be spectrally separated, but a distinction between true and associate mangrove species could be made. An OBIA classification was used that combined a decision-tree classification with the machine-learning SVM classification. Results showed an overall accuracy greater than 94% (kappa = 0.863) for classifying true mangrove species and other dense coastal vegetation at the object level. There remain serious challenges to accurately mapping fringe mangroves using remote sensing data due to the spectral similarity of mangrove and associate species, the lack of clear zonation between species, and mixed-pixel effects, especially when vegetation is sparse or degraded.
Directory of Open Access Journals (Sweden)
Claire Cutler
2013-05-01
International investment agreements are foundational instruments in a transnational investment regime that governs how states regulate the foreign-owned assets and the foreign investment activities of private actors. Over 3,000 investment agreements between states govern key governmental powers and form the basis for an emerging transnational investment regime. This transnational regime significantly decentralizes, denationalizes, and privatizes decision-making and policy choices over foreign investment. Investment agreements set limits to state action in a number of areas of vital public concern, including the protection of human and labour rights, the environment, and sustainable development. They determine the distribution of power between foreign investors and host states and their societies. However, the societies in which they operate seldom have any input into the terms or operation of these agreements, raising crucial questions of their democratic legitimacy as mechanisms of governance. This paper draws on political science and law to explore the political economy of international investment agreements and asks whether these agreements are potential vehicles for promoting international human rights. The analysis provides an historical account of the investment regime, while a review of the political economy of international investment agreements identifies what appears to be a paradox at the core of their operation. It then examines contract theory for insight into this apparent paradox and considers whether investment agreements are suitable mechanisms for advancing international human rights.
Juniati, E.; Arrofiqoh, E. N.
2017-09-01
Information extraction from remote sensing data, especially land cover, can be obtained by digital classification. In practice, some analysts are more comfortable using visual interpretation to retrieve land cover information, but that is highly influenced by the subjectivity and knowledge of the interpreter, and is time-consuming. Digital classification can be done in several ways, depending on the chosen mapping approach and the assumptions on data distribution. This study compared several classification methods for different data types at the same location. The data used were Landsat 8 satellite imagery, SPOT 6 imagery and orthophotos. In practice, these data are used to produce land cover maps at 1:50,000 scale for Landsat, 1:25,000 for SPOT and 1:5,000 for orthophotos, but using visual interpretation to retrieve the information. The maximum likelihood classifier (MLC), a pixel-based parametric approach, was applied to these data, as was an artificial neural network classifier, a pixel-based non-parametric approach. Moreover, this study applied object-based classifiers to the data. The classification system implemented is the land cover classification of the Indonesian topographic map. The classification was applied to each data source, in order to recognize patterns and to assess the consistency of the land cover maps produced from each data set. Furthermore, the study analyses the benefits and limitations of the use of these methods.
A neural network approach for radiographic image classification in NDT
International Nuclear Information System (INIS)
Lavayssiere, B.
1993-05-01
Radiography is used by EDF for pipe inspection in nuclear power plants in order to detect defects. The radiographs obtained are then digitized following a well-defined protocol. EDF's aim is to develop a non-destructive testing system for recognizing defects. In this note, we describe the procedure for recognizing areas with defects. We first present the digitization protocol, characterize the poor quality of the images under study and propose a procedure to enhance defects. We then examine the problem raised by the choice of good features for classification. After having shown that statistical or standard textural features such as homogeneity, entropy or contrast are not relevant, we develop a geometrical-statistical approach based on the cooperation between a study of signal correlations and an analysis of regional extrema. The principle consists of analysing and comparing, for areas with and without defects, the evolution of conditional probability matrices for increasing neighbourhood sizes, the shape of variograms and the location of regional minima. We demonstrate that the anisotropy and surface of series of 'comet tails' associated with probability matrices, variogram slopes and statistical indices, and the location of regional extrema are features able to discriminate areas with defects from areas without any. The classification is then realized by a neural network, whose structure, properties and learning mechanisms are detailed. Finally we discuss the results. (author). 5 figs., 21 refs.
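One of the discriminating features above is the variogram slope. A minimal sketch of an empirical 1-D semivariogram, with invented signals standing in for profiles across sound and defective areas:

```python
def empirical_variogram(z, max_lag):
    """Empirical semivariogram of a 1-D signal:
    gamma(h) = (1/2) * mean of (z[i+h] - z[i])^2 over all pairs at lag h."""
    gammas = []
    for h in range(1, max_lag + 1):
        sq = [(z[i + h] - z[i]) ** 2 for i in range(len(z) - h)]
        gammas.append(0.5 * sum(sq) / len(sq))
    return gammas

# a slowly varying profile vs. an erratic one: the erratic signal
# reaches a high semivariance already at lag 1 (steep variogram slope)
sound = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
defect = [0, 9, 1, 8, 2, 7, 3, 6, 4, 5]
print(empirical_variogram(sound, 3))
print(empirical_variogram(defect, 3))
```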
Chang Chien, Kuang-Che; Fetita, Catalin; Brillet, Pierre-Yves; Prêteux, Françoise; Chang, Ruey-Feng
2009-02-01
Multi-detector computed tomography (MDCT) has high accuracy and specificity in volumetrically capturing serial images of the lung, increasing the capability of computerized classification of lung tissue in medical research. This paper proposes a three-dimensional (3D) automated approach based on mathematical morphology and fuzzy logic for quantifying and classifying interstitial lung diseases (ILDs) and emphysema. The proposed methodology is composed of several stages: (1) an image multi-resolution decomposition scheme based on a 3D morphological filter is used to detect and analyze the different density patterns of the lung texture. Then, (2) for each pattern in the multi-resolution decomposition, six features are computed, for which fuzzy membership functions define a probability of association with a pathology class. Finally, (3) for each pathology class, the probabilities are combined according to the weight assigned to each membership function, and two threshold values are used to decide the final class of the pattern. The proposed approach was tested on 10 MDCT cases and the classification accuracy was: emphysema: 95%, fibrosis/honeycombing: 84% and ground glass: 97%.
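Stage (3) above, the weighted combination of fuzzy memberships with two decision thresholds, can be sketched as follows; the triangular membership shapes, weights, and thresholds are invented for illustration, not the paper's calibrated functions:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(features, classes, accept=0.6, reject=0.3):
    """Weighted sum of per-feature memberships per class; the two
    thresholds split decisions into accept / undecided / none."""
    scores = {}
    for name, spec in classes.items():
        scores[name] = sum(w * tri(features[f], *params)
                           for f, (params, w) in spec.items())
    best, score = max(scores.items(), key=lambda kv: kv[1])
    if score >= accept:
        return best
    return "undecided" if score >= reject else "none"

# hypothetical membership shapes (density in HU, pattern size in mm) and weights
classes = {
    "emphysema":    {"density": ((-1000, -950, -880), 0.7),
                     "size":    ((2, 8, 20), 0.3)},
    "ground_glass": {"density": ((-800, -700, -550), 0.7),
                     "size":    ((2, 10, 30), 0.3)},
}
print(classify({"density": -945, "size": 8}, classes))
```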
Fire regimes of quaking aspen in the Mountain West
Shinneman, Douglas J.; Baker, William L.; Rogers, Paul C.; Kulakowski, Dominik
2013-01-01
Quaking aspen (Populus tremuloides Michx.) is the most widespread tree species in North America, and it is found throughout much of the Mountain West (MW) across a broad range of bioclimatic regions. Aspen typically regenerates asexually and prolifically after fire, and due to its seral status in many western conifer forests, aspen is often considered dependent upon disturbance for persistence. In many landscapes, historical evidence for post-fire aspen establishment is clear, and following extended fire-free periods senescing or declining aspen overstories sometimes lack adequate regeneration and are succeeding to conifers. However, aspen also forms relatively stable stands that contain little or no evidence of historical fire. In fact, aspen woodlands range from highly fire-dependent, seral communities to relatively stable, self-replacing, non-seral communities that do not require fire for persistence. Given the broad geographic distribution of aspen, fire regimes in these forests likely co-vary spatially with changing community composition, landscape setting, and climate, and temporally with land use and climate – but relatively few studies have explicitly focused on these important spatiotemporal variations. Here we reviewed the literature to summarize aspen fire regimes in the western US and highlight knowledge gaps. We found that only about one-fourth of the 46 research papers assessed for this review could be considered fire history studies (in which mean fire intervals were calculated), and all but one of these were based primarily on data from fire-scarred conifers. Nearly half of the studies reported at least some evidence of persistent aspen in the absence of fire. We also found that large portions of the MW have had little or no aspen fire history research. As a result of this review, we put forth a classification framework for aspen that is defined by key fire regime parameters (fire severity and probability), and that reflects underlying biophysical
Colombia: Territorial classification
International Nuclear Information System (INIS)
Mendoza Morales, Alberto
1998-01-01
The article discusses approaches to territorial classification, thematic axes, management principles, territorial occupation, political-administrative units and administrative regions, among other topics. Territorial classification is understood as the spatial distribution across the country's territory of geographical configurations, human communities, political-administrative units and land uses, urban and rural, existing and proposed.
Caumes, Géraldine; Borrel, Alexandre; Abi Hussein, Hiba; Camproux, Anne-Claude; Regad, Leslie
2017-09-01
Small molecules interact with their protein target in surface cavities known as binding pockets. Pocket-based approaches are very useful in all phases of drug design, and their first step is estimating the binding pocket based on protein structure. The available pocket-estimation methods produce different pockets for the same target. The aim of this work is to investigate the effects of different pocket-estimation methods on the results of pocket-based approaches. We focused on the effect of three pocket-estimation methods on a pocket-ligand (PL) classification. This pocket-based approach is useful for understanding the correspondence between the pocket and ligand spaces and for developing pharmacological profiling models. We found that pocket-estimation methods yield different binding pockets in terms of boundaries and properties. These differences are responsible for the variation in the PL classification results, which can have an impact on the detected correspondence between pocket and ligand profiles. Thus, we highlight the importance of the choice of pocket-estimation method in pocket-based approaches. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Body size distributions signal a regime shift in a lake ...
Communities of organisms, from mammals to microorganisms, have discontinuous distributions of body size. This pattern of size structuring is a conservative trait of community organization and is a product of processes that occur at multiple spatial and temporal scales. In this study, we assessed whether body size patterns serve as an indicator of a threshold between alternative regimes. Over the past 7000 years, the biological communities of Foy Lake (Montana, USA) have undergone a major regime shift owing to climate change. We used a palaeoecological record of diatom communities to estimate diatom sizes, and then analysed the discontinuous distribution of organism sizes over time. We used Bayesian classification and regression tree models to determine that all time intervals exhibited aggregations of sizes separated by gaps in the distribution, and found a significant change in diatom body size distributions approximately 150 years before the identified ecosystem regime shift. We suggest that discontinuity analysis is a useful addition to the suite of tools for the detection of early warning signals of regime shifts.
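The gap structure discussed above can be probed with a much cruder stand-in for the Bayesian CART analysis: flag gaps between adjacent sorted body sizes that are large relative to the typical gap. The sizes below are invented:

```python
def size_gaps(sizes, factor=3.0):
    """Flag discontinuities: gaps between adjacent (sorted) body sizes
    exceeding `factor` times the median gap. A crude illustrative proxy,
    not the Bayesian classification and regression trees used in the study."""
    s = sorted(sizes)
    gaps = [b - a for a, b in zip(s, s[1:])]
    med = sorted(gaps)[len(gaps) // 2]
    return [(s[i], s[i + 1]) for i, g in enumerate(gaps) if g > factor * med]

# two size aggregations separated by one large gap
sizes = [1.0, 1.1, 1.2, 1.3, 5.0, 5.1, 5.2, 5.3]
print(size_gaps(sizes))
```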
An efficient approach for video action classification based on 3d Zernike moments
Lassoued , Imen; Zagrouba , Ezzedine; Chahir , Youssef
2011-01-01
Action recognition in video and still images is one of the most challenging research topics in pattern recognition and computer vision. This paper proposes a new method for video action classification based on 3D Zernike moments, which aim to capture both structural and temporal information of a time-varying sequence. The originality of this approach consists in representing actions in video sequences by a three-dimensional shape obtained from different silhouettes…
A Visual Analytics Approach for Correlation, Classification, and Regression Analysis
Energy Technology Data Exchange (ETDEWEB)
Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)
2012-02-01
New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive, parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
Directory of Open Access Journals (Sweden)
Charles R. Lane
2014-12-01
Although remote sensing technology has long been used in wetland inventory and monitoring, the accuracy and detail level of wetland maps derived with moderate-resolution imagery and traditional techniques have been limited and often unsatisfactory. We explored and evaluated the utility of a newly launched high-resolution, eight-band satellite system (WorldView-2; WV2) for identifying and classifying freshwater deltaic wetland vegetation and aquatic habitats in the Selenga River Delta of Lake Baikal, Russia, using a hybrid approach and a novel application of Indicator Species Analysis (ISA). We achieved an overall classification accuracy of 86.5% (Kappa coefficient: 0.85) for 22 classes of aquatic and wetland habitats and found that additional metrics, such as the Normalized Difference Vegetation Index and image texture, were valuable for improving the overall classification accuracy and particularly for discriminating among certain habitat classes. Our analysis demonstrated that including WV2's four spectral bands from parts of the spectrum less commonly used in remote sensing analyses, along with the more traditional bandwidths, increased the overall classification accuracy by ~4%, with considerably larger increases in our ability to discriminate certain communities. The coastal band improved the differentiation of open water and aquatic (i.e., vegetated) habitats, and the yellow, red-edge, and near-infrared 2 bands improved discrimination among different vegetated aquatic and terrestrial habitats. The use of ISA provided statistical rigor in developing associations between spectral classes and field-based data. Our analyses demonstrated the utility of a hybrid approach and the benefit of additional bands and metrics in providing the first spatially explicit mapping of a large and heterogeneous wetland system.
Directory of Open Access Journals (Sweden)
Newton Paulo Bueno
2013-04-01
This paper argues that the sophisticated techniques presently used by economists to forecast macroeconomic variables (non-structural forecasting methods in general, and regime-switching models in particular) do not seem very effective at anticipating radical regime shifts such as the one that recently occurred in the world economy. Thus, in order to improve their accuracy, they should be complemented by more holistic approaches. The general purpose of the paper is to show that the system dynamics methodology, which allows identifying the critical feedback loops that drive complex systems' dynamics, seems especially well suited to be one of those complementary approaches. To reach that goal, we present a systemic algorithm for identifying regime-shift processes such as the ones that take place when an economy, after years of continued expansion, is hit by the effects of a financial bubble burst.
Khan, Faisal; Enzmann, Frieder; Kersten, Michael
2016-03-01
Image processing of X-ray-computed polychromatic cone-beam micro-tomography (μXCT) data of geological samples mainly involves artefact reduction and phase segmentation. For the former, the main beam-hardening (BH) artefact is removed by applying a best-fit quadratic surface algorithm to a given image data set (reconstructed slice), which minimizes the BH offsets of the attenuation data points from that surface; Matlab code for this approach is provided in the Appendix. The final BH-corrected image is extracted from the residual data, i.e. the difference between the fitted surface values and the original grey-scale values. For the segmentation, we propose a novel least-squares support vector machine (LS-SVM) approach, an algorithm for pixel-based multi-phase classification. A receiver operating characteristic (ROC) analysis performed on BH-corrected and uncorrected samples shows that BH correction is in fact an important prerequisite for accurate multi-phase classification. The combination of the two approaches was thus used to successfully classify three multi-phase rock core samples of varying complexity.
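The beam-hardening step described in the abstract, fitting a quadratic surface to a reconstructed slice by least squares and keeping the residuals, can be sketched as follows (an illustrative NumPy version, not the authors' Matlab code; the function name and array handling are ours):

```python
import numpy as np

def beam_hardening_correct(img):
    """Remove a smooth beam-hardening trend by subtracting the best-fit
    quadratic surface z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2,
    found by ordinary least squares over all pixels."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x = xx.ravel().astype(float)
    y = yy.ravel().astype(float)
    # Design matrix of the six quadratic-surface terms
    A = np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])
    coeff, *_ = np.linalg.lstsq(A, img.ravel().astype(float), rcond=None)
    surface = (A @ coeff).reshape(h, w)
    # Residuals = BH-corrected image
    return img - surface
```

On a slice whose low-frequency trend really is quadratic, the residual image retains only the sample structure sitting on top of that trend.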
Haaf, Ezra; Barthel, Roland
2016-04-01
When assessing hydrogeological conditions at the regional scale, the analyst is often confronted with uncertainty about structures, inputs and processes while having to base inference on scarce and patchy data. Haaf and Barthel (2015) proposed a concept for handling this predicament by developing a groundwater systems classification framework in which information is transferred from similar, but well-explored and better understood systems to poorly described ones. The concept is based on the central hypothesis that similar systems react similarly to the same inputs, and vice versa. It is conceptually related to PUB (Prediction in Ungauged Basins), where the organization of systems and processes by quantitative methods is used to improve understanding and prediction. Furthermore, using the framework, it is expected that regional conceptual and numerical models can be checked or enriched by ensemble-generated data from neighborhood-based estimators. In a first step, groundwater hydrographs from a large dataset in Southern Germany are compared in an effort to identify structural similarity in groundwater dynamics. A number of approaches to grouping hydrographs, mostly based on a similarity measure, can be found in the literature, but they have previously only been used in local-scale studies. These are tested alongside different global feature extraction techniques. The resulting classifications are then compared to a visual, expert-assessment-based classification which serves as a reference. A ranking of the classification methods is carried out and differences are shown. Selected groups from the classifications are related to geological descriptors. Here we present the most promising results from a comparison of classifications based on series correlation, different series distances and series features, such as the coefficients of the discrete Fourier transform and the intrinsic mode functions of empirical mode decomposition. Additionally, we show examples of classes
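Two of the ingredients named above, a correlation-based series distance and discrete-Fourier-transform features, can be sketched in a few lines (a minimal illustration with names of our own choosing, not the authors' implementation):

```python
import numpy as np

def corr_distance(a, b):
    """Series distance based on Pearson correlation:
    0 for identical shape, 2 for perfectly anti-correlated series."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return 1.0 - float(np.mean(a * b))

def dft_features(series, k=4):
    """Global features of a hydrograph: magnitudes of the first k
    non-constant DFT coefficients, normalised so amplitude scaling
    does not matter."""
    mags = np.abs(np.fft.rfft(series - series.mean()))[1:k + 1]
    return mags / (mags.sum() + 1e-12)
```

Pairwise distances or feature vectors like these can then be fed into any standard clustering routine to group hydrographs.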
Rule-guided human classification of Volunteered Geographic Information
Ali, Ahmed Loai; Falomir, Zoe; Schmid, Falko; Freksa, Christian
2017-05-01
During the last decade, web technologies and location-sensing devices have evolved, generating a form of crowdsourcing known as Volunteered Geographic Information (VGI). VGI acts as a platform for spatial data collection, in particular when a group of public participants is involved in collaborative mapping activities: they work together to collect, share, and use information about geographic features. VGI exploits participants' local knowledge to produce rich data sources. However, the resulting data inherits problematic classification. In VGI projects, the challenges of data classification are due to the following: (i) data is prone to subjective classification, (ii) most projects rely on remote contributions and flexible contribution mechanisms, and (iii) spatial data is uncertain and geographic features lack strict definitions. These factors lead to various forms of problematic classification: inconsistent, incomplete, and imprecise data classification. This research addresses classification appropriateness. Whether the classification of an entity is appropriate or inappropriate is related to quantitative and/or qualitative observations. Small differences between observations may not be recognizable, particularly for non-expert participants. Hence, in this paper, the problem is tackled by developing a rule-guided classification approach. This approach exploits the data mining technique of Association Classification (AC) to extract descriptive (qualitative) rules for specific geographic features. The rules are extracted based on the investigation of qualitative topological relations between target features and their context. Afterwards, the extracted rules are used to develop a recommendation system able to guide participants to the most appropriate classification. The approach proposes two scenarios to guide participants towards enhancing the quality of data classification. An empirical study is conducted to investigate the classification of grass
Service regime: An empirical analysis of innovation patterns in service firms.
Chang, Y.C.; Linton, J.D.; Linton, Jonathan; Chen, M.N.
2012-01-01
The concept of service regime is developed to extend and test Miozzo and Soete's service taxonomy. Derived from the synthesis approach of service innovation, the service regime considers sources of innovation, innovation trajectories, and appropriability. Hypotheses on firm patterns of innovation
Classification of boreal forest by satellite and inventory data using neural network approach
Romanov, A. A.
2012-12-01
The main objective of this research was to develop a methodology for land cover classification of the boreal forest (Siberian taiga) at a high level of accuracy. The study area covers several parts of Central Siberia along the Yenisei River (60-62 degrees North latitude): the right bank includes mixed forest and dark taiga, the left bank pine forests; the sites were thus selected as highly heterogeneous yet statistically comparable surfaces in terms of spectral characteristics. Two main types of data were used: time series of medium-resolution satellite images (Landsat 5, 7 and SPOT 4) and inventory datasets from fieldwork (used to prepare training sample sets). Field data collection included a short botanical description (type/species of vegetation, density, compactness of the crowns, individual height and representative max/min diameters of each type, and surface altitude of the plot); the geometric extent of each training sample unit corresponded to the spatial resolution of the satellite images and was geo-referenced (datasets were prepared both for preliminary processing and for verification). The network of test plots was irregular and determined by a landscape-oriented approach. The thematic data processing focused on the use of neural networks (including fuzzy logic); accordingly, the field study results were converted into input parameters describing the vegetation type/species of each unit and its degree of variability. The proposed approach processes the time series separately for each image, mainly for verification: the acquisition parameters (time, albedo) are taken into consideration in order to assess the quality of the mapping. The input variables for the networks were the sensor bands, surface altitude, solar angles and, for a few experiments, land surface temperature; attention was also given to the formation of the class formula on the basis of statistical pre-processing of results of
A Point Cloud Classification Approach Based on Vertical Structures of Ground Objects
Zhao, Y.; Hu, Q.; Hu, W.
2018-04-01
This paper proposes a novel method for point cloud classification using the vertical structural characteristics of ground objects. Since urbanization is developing rapidly nowadays, urban ground objects also change frequently. Conventional photogrammetric methods cannot satisfy the requirement of updating ground object information efficiently, so LiDAR (Light Detection and Ranging) technology is employed to accomplish this task. LiDAR data, namely point cloud data, provides detailed three-dimensional coordinates of ground objects, but this kind of data is discrete and unorganized. To accomplish ground object classification with a point cloud, we first construct horizontal grids and vertical layers to organize the point cloud data, then calculate vertical characteristics, including density and measures of dispersion, and form a characteristic curve for each grid cell. With the help of PCA processing and the K-means algorithm, we analyze the similarities and differences of the characteristic curves. Curves that have similar features are classified into the same class, and the points corresponding to these curves are classified accordingly. The whole process is simple but effective, and this approach does not need the assistance of other data sources. In this study, point cloud data are classified into three classes: vegetation, buildings, and roads. When the horizontal grid spacing and vertical layer spacing are 3 m and 1 m respectively, the vertical characteristic is set as density, and the number of dimensions after PCA processing is 11, the overall precision of the classification result is about 86.31%. The result can help us quickly understand the distribution of various ground objects.
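The pipeline sketched in the abstract (grid and layer binning into per-cell vertical density curves, PCA, then K-means) can be illustrated roughly as follows. This is our own minimal sketch under the stated 3 m / 1 m settings, not the authors' code; the function names and the tiny Lloyd's-algorithm K-means are ours:

```python
import numpy as np

def density_curves(points, grid=3.0, layer=1.0, n_layers=10):
    """Per horizontal grid cell, a vertical density curve: the fraction of
    the cell's points falling into each height layer."""
    ij = np.floor(points[:, :2] / grid).astype(int)
    lz = np.clip(np.floor(points[:, 2] / layer).astype(int), 0, n_layers - 1)
    curves = {}
    for (i, j), l in zip(map(tuple, ij), lz):
        curves.setdefault((i, j), np.zeros(n_layers))[l] += 1
    return {cell: v / v.sum() for cell, v in curves.items()}

def pca(X, k):
    """Project rows of X onto the k leading principal components (via SVD)."""
    Xc = X - X.mean(0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm; returns a cluster label per row."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.stack([X[labels == c].mean(0) if (labels == c).any()
                            else centers[c] for c in range(k)])
    return labels
```

Stacking the curves into a matrix, reducing with `pca(..., 11)` and clustering with `kmeans(..., 3)` mirrors the vegetation/building/road grouping described above.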
Classification of Radioactive Waste. General Safety Guide
Energy Technology Data Exchange (ETDEWEB)
NONE
2009-11-15
This publication is a revision of an earlier Safety Guide of the same title issued in 1994. It recommends revised waste management strategies that reflect changes in practices and approaches since then. It sets out a classification system for the management of waste prior to disposal and for disposal, driven by long term safety considerations. It includes a number of schemes for classifying radioactive waste that can be used to assist with planning overall national approaches to radioactive waste management and to assist with operational management at facilities. Contents: 1. Introduction; 2. The radioactive waste classification scheme; Appendix: The classification of radioactive waste; Annex I: Evolution of IAEA standards on radioactive waste classification; Annex II: Methods of classification; Annex III: Origin and types of radioactive waste.
Emotion models for textual emotion classification
Bruna, O.; Avetisyan, H.; Holub, J.
2016-11-01
This paper deals with textual emotion classification, which has gained attention in recent years. Emotion classification is used in user experience, product evaluation, national security, and tutoring applications. It attempts to detect the emotional content in the input text and, based on different approaches, establish what kind of emotional content is present, if any. Textual emotion classification is the most difficult to handle, since it relies mainly on linguistic resources and introduces many challenges in assigning text to an emotion represented by a proper model. A crucial part of each emotion detector is its emotion model. The focus of this paper is to introduce the emotion models used for classification. Categorical and dimensional models of emotion are explained and some more advanced approaches are mentioned.
Regime-Switching Risk: To Price or Not to Price?
Directory of Open Access Journals (Sweden)
Tak Kuen Siu
2011-01-01
“normative” issues to be addressed in pricing contingent claims under a Markovian, regime-switching, Black-Scholes-Merton model. We address this issue using a minimal relative entropy approach. Firstly, we apply a martingale representation for a double martingale to characterize the canonical space of equivalent martingale measures which may be viewed as the largest space of equivalent martingale measures to incorporate both the diffusion risk and the regime-switching risk. Then we show that an optimal equivalent martingale measure over the canonical space selected by minimizing the relative entropy between an equivalent martingale measure and the real-world probability measure does not price the regime-switching risk. The optimal measure also justifies the use of the Esscher transform for option valuation in the regime-switching market.
Ta, Goh Choo; Mokhtar, Mazlin Bin; Peterson, Peter John; Yahaya, Nadzri Bin
2011-01-01
The European Union (EU) and the World Health Organization (WHO) have applied different approaches to facilitate the implementation of the UN Globally Harmonized System of Classification and Labelling of Chemicals (GHS). The EU applied the mandatory approach by gazetting EU Regulation 1272/2008, incorporating GHS elements on classification, labelling and packaging of substances and mixtures, in 2008; whereas the WHO utilized a voluntary approach by incorporating GHS elements in the WHO guidelines entitled 'WHO Recommended Classification of Pesticides by Hazard' in 2009. We report on an analysis of both the mandatory and voluntary approaches practiced by the EU and the WHO respectively, with close reference to the GHS 'purple book'. Our findings indicate that the mandatory approach practiced by the EU covers all the GHS elements referred to in the second revised edition of the GHS 'purple book'. Hence we can conclude that the EU has implemented the GHS, particularly for industrial chemicals. On the other hand, the WHO guidelines published in 2009 should be revised to address concerns raised in this paper. In addition, both mandatory and voluntary approaches should be carefully examined because the classification results may differ.
Classification and description of world formation types
D. Faber-Langendoen; T. Keeler-Wolf; D. Meidinger; C. Josse; A. Weakley; D. Tart; G. Navarro; B. Hoagland; S. Ponomarenko; G. Fults; Eileen Helmer
2016-01-01
An ecological vegetation classification approach has been developed in which a combination of vegetation attributes (physiognomy, structure, and floristics) and their response to ecological and biogeographic factors are used as the basis for classifying vegetation types. This approach can help support international, national, and subnational classification efforts. The...
Transport regimes spanning magnetization-coupling phase space
Baalrud, Scott D.; Daligault, Jérôme
2017-10-01
The manner in which transport properties vary over the entire parameter-space of coupling and magnetization strength is explored. Four regimes are identified based on the relative size of the gyroradius compared to other fundamental length scales: the collision mean free path, Debye length, distance of closest approach, and interparticle spacing. Molecular dynamics simulations of self-diffusion and temperature anisotropy relaxation spanning the parameter space are found to agree well with the predicted boundaries. Comparison with existing theories reveals regimes where they succeed, where they fail, and where no theory has yet been developed.
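The four regime boundaries described above rest on comparing the gyroradius with a handful of standard plasma length scales. As a hedged illustration (our own sketch in SI units for a one-component plasma, not the authors' analysis code), these scales can be computed directly:

```python
import numpy as np

# Physical constants (SI), hard-coded so the sketch is self-contained
E0 = 8.8541878128e-12   # vacuum permittivity
KB = 1.380649e-23       # Boltzmann constant
QE = 1.602176634e-19    # elementary charge
ME = 9.1093837015e-31   # electron mass

def length_scales(n, T, B, m=ME, q=QE):
    """Fundamental lengths whose ratios to the gyroradius delineate
    the transport regimes: n in m^-3, T in K, B in T."""
    vth = np.sqrt(KB * T / m)                    # thermal speed
    return {
        "gyroradius": m * vth / (q * B),         # thermal gyroradius
        "debye": np.sqrt(E0 * KB * T / (n * q**2)),
        "closest_approach": q**2 / (4 * np.pi * E0 * KB * T),  # Landau length
        "spacing": n ** (-1.0 / 3.0),            # interparticle spacing
    }
```

For a weakly coupled plasma the expected ordering (closest approach far below the interparticle spacing, which sits below the Debye length) falls out immediately.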
Waste classification: a management approach
International Nuclear Information System (INIS)
Wickham, L.E.
1984-01-01
A waste classification system designed to quantify the total hazard of a waste has been developed by the Low-Level Waste Management Program. As originally conceived, the system was designed to deal with mixed radioactive waste. The methodology has been developed and successfully applied to radiological and chemical wastes, both individually and mixed together. Management options to help evaluate the financial and safety trade-offs between waste segregation, waste treatment, container types, and site factors are described. Using the system provides a very simple and cost-effective way of making quick assessments of a site's capabilities to contain waste materials. 3 references
Chorological classification approach for species and ecosystem conservation practice
Rogova, T. V.; Kozevnikova, M. V.; Prokhorov, V. E.; Timofeeva, N. O.
2018-01-01
The habitat type allocation approach based on the EUNIS Habitat Classification and the JUICE version 7 software is used for the conservation of species and ecosystem biodiversity. Using vegetation plots from the Vegetation Database of Tatarstan, included in EVA (European Vegetation Archive) and GIVD (Global Index of Vegetation-plots Databases), habitat types of dry meadows and steppes are distinguished by the differing composition of the leading families of their flora: Asteraceae, Fabaceae, Poaceae and Rosaceae. E12a (Semi-dry perennial calcareous grassland) and E12b (Perennial calcareous grassland and basic steppes) were identified. The selected group of relevés that do not correspond to any of the EUNIS types can be considered specific to the ecotone forest-steppe landscapes of the southeast of the Republic of Tatarstan. In all types of studied habitats, rare and protected plant species are noted, most of which are South-East-European-Asian species.
Directory of Open Access Journals (Sweden)
De Pedro, Emiliano
2009-07-01
Multivariate classification models to classify Iberian pigs reared under real farm conditions according to feeding regime were developed using either fatty acid composition or NIR spectral data of liquid fat samples. A total of 121 subcutaneous fat samples were taken from Iberian pig carcasses belonging to 5 batches reared under different feeding systems. Once the liquid sample was extracted from each subcutaneous fat sample, the percentages of 11 fatty acids (C14:0, C16:0, C16:1, C17:0, C17:1, C18:0, C18:1, C18:2, C18:3, C20:0 and C20:1) were determined. At the same time, the near-infrared (NIR) spectrum of each liquid sample was obtained. Linear Discriminant Analysis (LDA) was used as the pattern recognition method to develop the multivariate models. Classification errors of the LDA models were 0.0% for the models generated using NIR spectral data and 1.7% for the model generated using fatty acid composition. The results confirm the possibility of discriminating liquid fat samples from Iberian pigs reared under different feeding regimes on real farms by using NIR spectral data or fatty acid composition. The classification errors obtained with models generated from NIR spectral data were lower than those obtained with models based on fatty acid composition.
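Linear Discriminant Analysis with equal priors reduces to assigning each sample to the class whose mean is nearest in Mahalanobis distance under a pooled covariance. The sketch below illustrates that classical formulation on feature vectors such as the 11 fatty acid percentages; it is our own minimal NumPy version, not the software used in the study:

```python
import numpy as np

class PooledLDA:
    """Gaussian classifier with a shared (pooled) covariance, equivalent to
    classical LDA with equal priors: assign each sample to the class whose
    mean is nearest in Mahalanobis distance."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.stack([X[y == c].mean(0) for c in self.classes_])
        # Pool the within-class scatter across all classes
        Xc = np.concatenate([X[y == c] - X[y == c].mean(0) for c in self.classes_])
        cov = Xc.T @ Xc / (len(X) - len(self.classes_))
        self.prec_ = np.linalg.pinv(cov)
        return self

    def predict(self, X):
        d = [((X - m) @ self.prec_ * (X - m)).sum(1) for m in self.means_]
        return self.classes_[np.argmin(d, axis=0)]
```

With well-separated feeding-regime classes, as the reported 0.0-1.7% errors suggest, such a linear rule is already sufficient.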
Directory of Open Access Journals (Sweden)
Arran Schlosberg
2014-05-01
Improvements in the speed and cost of genome sequencing are resulting in increasing numbers of novel non-synonymous single nucleotide polymorphisms (nsSNPs) in genes known to be associated with disease. The large number of nsSNPs makes laboratory-based classification infeasible, and familial co-segregation with disease is not always possible. In-silico methods for classification or triage are thus utilised. A popular tool based on multiple-species sequence alignments (MSAs) and work by Grantham, Align-GVGD, has been shown to underestimate deleterious effects, particularly as sequence numbers increase. We utilised the DEFLATE compression algorithm to account for expected variation across a number of species. With the adjusted Grantham measure we derived a means of quantitatively clustering known neutral and deleterious nsSNPs from the same gene; this was then used to assign novel variants to the most appropriate cluster as a means of binary classification. Scaling of clusters allows for inter-gene comparison of variants through a single pathogenicity score. The approach improves upon the classification accuracy of Align-GVGD while correcting for sensitivity to large MSAs. Open-source code and a web server are made available at https://github.com/aschlosberg/CompressGV.
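The core intuition, using DEFLATE to quantify the expected variation in an alignment column, can be demonstrated with the standard library alone. This is a hedged toy sketch of the idea (the function name and ratio definition are ours; see the CompressGV repository for the actual method):

```python
import zlib

def column_compressibility(column):
    """DEFLATE-based proxy for the expected variation of an MSA column:
    a highly conserved column compresses well (low ratio), while a
    variable column does not."""
    data = "".join(column).encode()
    return len(zlib.compress(data, 9)) / len(data)
```

Comparing the ratio of a conserved column against a variable one gives a per-column weight for adjusting a Grantham-style deviation score.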
Weidinger, Simon; Knap, Michael
We study the regimes of heating in the periodically driven O(N) model, which represents a generic model for interacting quantum many-body systems. By computing the absorbed energy with a non-equilibrium Keldysh Green's function approach, we establish three dynamical regimes: at short times a single-particle dominated regime, at intermediate times a stable Floquet prethermal regime in which the system ceases to absorb, and at parametrically late times a thermalizing regime. Our simulations suggest that in the thermalizing regime the absorbed energy grows algebraically in time with an exponent that approaches the universal value of 1/2, and is thus significantly slower than linear Joule heating. Our results demonstrate the parametric stability of prethermal states in a generic many-body system driven at frequencies that are comparable to its microscopic scales. This paves the way for realizing exotic quantum phases, such as time crystals or interacting topological phases, in the prethermal regime of interacting Floquet systems. We acknowledge support from the Technical University of Munich - Institute for Advanced Study, funded by the German Excellence Initiative and the European Union FP7 under Grant agreement 291763, and from the DFG Grant No. KN 1254/1-1.
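A claim of algebraic growth E(t) ~ A t^b with b approaching 1/2 is typically checked by a linear fit in log-log space. As a generic illustration (not the authors' analysis pipeline), such an exponent estimate is one line of NumPy:

```python
import numpy as np

def growth_exponent(t, E):
    """Estimate b in E(t) ~ A * t**b by linear regression of
    log E against log t; the slope is the exponent."""
    slope, _intercept = np.polyfit(np.log(t), np.log(E), 1)
    return slope
```

Applied to absorbed-energy curves in the thermalizing regime, a slope near 0.5 would reproduce the sub-Joule heating reported above.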
Classification of Arctic, Mid-Latitude and Tropical Clouds in the Mixed-Phase Temperature Regime
Costa, Anja; Afchine, Armin; Luebke, Anna; Meyer, Jessica; Dorsey, James R.; Gallagher, Martin W.; Ehrlich, André; Wendisch, Manfred; Krämer, Martina
2016-04-01
The degree of glaciation and the sizes and habits of ice particles formed in mixed-phase clouds remain not fully understood. However, these properties define the mixed clouds' radiative impact on the Earth's climate, and thus a correct representation of this cloud type in global climate models is important for improved certainty of climate predictions. This study focuses on the occurrence and characteristics of two types of clouds in the mixed-phase temperature regime (238-275 K): coexistence clouds (Coex), in which both liquid drops and ice crystals exist, and fully glaciated clouds that develop in the Wegener-Bergeron-Findeisen regime (WBF clouds). We present an extensive dataset obtained by the Cloud and Aerosol Particle Spectrometer NIXE-CAPS, covering Arctic, mid-latitude and tropical regions. In total, we spent 45.2 hours within clouds in the mixed-phase temperature regime during five field campaigns (Arctic: VERDI, 2012 and RACEPAC, 2014 - Northern Canada; mid-latitude: COALESC, 2011 - UK and ML-Cirrus, 2014 - central Europe; tropics: ACRIDICON, 2014 - Brazil). We show that WBF and Coex clouds can be identified via cloud particle size distributions. The classified datasets are used to analyse the temperature dependence of both cloud types as well as the range and frequency of cloud particle concentrations and sizes. One result is that Coex clouds containing supercooled liquid drops are found down to temperatures of -40 °C only in tropical mixed clouds, while in the Arctic and mid-latitudes no liquid drops are observed below about -20 °C. In addition, we show that the cloud particles' aspherical fractions - derived from polarization signatures of particles with diameters between 20 and 50 micrometers - differ significantly between WBF and Coex clouds. In Coex clouds, the aspherical fraction of cloud particles is generally very low, but increases with decreasing temperature. In WBF clouds, where all cloud particles are ice, about 20-40% of the cloud
Land-Use and Land-Cover Mapping Using a Gradable Classification Method
Directory of Open Access Journals (Sweden)
Keigo Kitada
2012-05-01
Conventional spectral-based classification methods have significant limitations in the digital classification of urban land-use and land-cover classes from high-resolution remotely sensed data because of the lack of consideration given to the spatial properties of images. To recognize the complex distribution of urban features in high-resolution image data, texture information consisting of a group of pixels should be considered. Lacunarity is an index used to characterize different texture appearances. It is often reported that land-use and land-cover in urban areas can be effectively classified using the lacunarity index with high-resolution images. However, the applicability of the maximum-likelihood approach for hybrid analysis has not been reported. A more effective approach that employs both the original spectral data and the lacunarity index can be expected to improve classification accuracy. A new classification procedure, referred to as the "gradable classification method", is proposed in this study. This method improves classification accuracy in incremental steps. The proposed approach integrates several classification maps created from the original images and from lacunarity maps, which consist of lacunarity values, to create a new classification map. The results of this study confirm the suitability of the gradable classification approach, which produced a higher overall accuracy (68%) and kappa coefficient (0.64) than those (65% and 0.60, respectively) obtained with the maximum-likelihood approach.
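The lacunarity index used above is commonly computed with the gliding-box algorithm: slide a box over the image and take the second moment of the box "mass" over the squared first moment. A minimal sketch of that standard formulation (ours, not the paper's implementation) is:

```python
import numpy as np

def lacunarity(img, box):
    """Gliding-box lacunarity of a binary image:
    Lambda(r) = E[M^2] / E[M]^2 = var(M)/mean(M)^2 + 1,
    where M is the occupied-pixel count in each box position."""
    h, w = img.shape
    masses = np.array([img[i:i + box, j:j + box].sum()
                       for i in range(h - box + 1)
                       for j in range(w - box + 1)], float)
    return masses.var() / masses.mean() ** 2 + 1.0
```

A homogeneous texture yields a value of 1, while gappy, clustered textures yield larger values; computing this per pixel neighborhood gives the lacunarity maps fed into the classifier.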
International Nuclear Information System (INIS)
Päßler, Sebastian; Fischer, Wolf-Joachim; Wolff, Matthias
2012-01-01
Obesity and nutrition-related diseases are currently growing challenges for medicine. A precise and time-saving method for food intake monitoring is needed. For this purpose, an approach based on the classification of sounds produced during food intake is presented. Sounds are recorded non-invasively by miniature microphones in the outer ear canal. A database of 51 participants eating seven types of food and consuming one drink has been developed for algorithm development and model training. The database is labeled manually using a protocol with instructions for annotation. The annotation procedure is evaluated using Cohen's kappa coefficient. Food intake activity is detected by comparing the signal energy of in-ear sounds to environmental sounds recorded by a reference microphone. Hidden Markov models are used for the recognition of single chew or swallowing events. Intake cycles are modeled as event sequences in finite-state grammars. Classification of consumed food is realized by a finite-state grammar decoder based on the Viterbi algorithm. We achieved a detection accuracy of 83% and a food classification accuracy of 79% on a test set of 10% of all records. Our approach addresses the need to monitor the time and occurrence of eating. With differentiation of consumed food, a first step toward the goal of meal weight estimation is taken. (paper)
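The decoder named above rests on the standard Viterbi algorithm: given transition and emission probabilities, recover the most likely hidden event sequence (e.g. chew vs. swallow) behind an observation sequence. The following is a generic textbook implementation in log space (our own sketch, not the paper's grammar decoder; the toy probabilities in the test are invented):

```python
import numpy as np

def viterbi(obs, start, trans, emit):
    """Most likely hidden-state path for a discrete HMM (log domain).
    start: (S,) initial probs; trans: (S,S) transition probs;
    emit: (S,O) emission probs; obs: list of observation indices."""
    logd = np.log(start) + np.log(emit[:, obs[0]])
    back = []
    for o in obs[1:]:
        scores = logd[:, None] + np.log(trans)   # (from_state, to_state)
        back.append(np.argmax(scores, axis=0))   # best predecessor per state
        logd = scores.max(axis=0) + np.log(emit[:, o])
    path = [int(np.argmax(logd))]
    for bp in reversed(back):                    # trace back the best path
        path.append(int(bp[path[-1]]))
    return path[::-1]
```

Chaining event-level decodes through a finite-state grammar then yields the intake-cycle classification described in the abstract.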
PHOTOMETRIC SUPERNOVA CLASSIFICATION WITH MACHINE LEARNING
Energy Technology Data Exchange (ETDEWEB)
Lochner, Michelle; Peiris, Hiranya V.; Lahav, Ofer; Winter, Max K. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); McEwen, Jason D., E-mail: dr.michelle.lochner@gmail.com [Mullard Space Science Laboratory, University College London, Surrey RH5 6NT (United Kingdom)
2016-08-01
Automated photometric supernova classification has become an active area of research in recent years in light of current and upcoming imaging surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope, given that spectroscopic confirmation of type for all supernovae discovered will be impossible. Here, we develop a multi-faceted classification pipeline, combining existing and new approaches. Our pipeline consists of two stages: extracting descriptive features from the light curves and classification using a machine learning algorithm. Our feature extraction methods vary from model-dependent techniques, namely SALT2 fits, to more independent techniques that fit parametric models to curves, to a completely model-independent wavelet approach. We cover a range of representative machine learning algorithms, including naive Bayes, k-nearest neighbors, support vector machines, artificial neural networks, and boosted decision trees (BDTs). We test the pipeline on simulated multi-band DES light curves from the Supernova Photometric Classification Challenge. Using the commonly used area under the curve (AUC) of the Receiver Operating Characteristic as a metric, we find that the SALT2 fits and the wavelet approach, with the BDTs algorithm, each achieve an AUC of 0.98, where 1 represents perfect classification. We find that a representative training set is essential for good classification, whatever the feature set or algorithm, with implications for spectroscopic follow-up. Importantly, we find that by using either the SALT2 or the wavelet feature sets with a BDT algorithm, accurate classification is possible purely from light curve data, without the need for any redshift information.
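The AUC metric quoted above has a convenient rank-statistic form: it equals the probability that a randomly chosen positive example outscores a randomly chosen negative one (the Mann-Whitney statistic). A minimal sketch of that computation (ours, purely to illustrate the metric, not the paper's evaluation code):

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic:
    fraction of positive/negative pairs where the positive scores higher,
    counting ties as one half."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return float(wins / (pos.size * neg.size))
```

An AUC of 0.98, as reported for the SALT2 and wavelet feature sets with BDTs, means 98% of such pairs are ranked correctly.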
Early detection of ecosystem regime shifts
DEFF Research Database (Denmark)
Lindegren, Martin; Dakos, Vasilis; Groeger, Joachim P.
2012-01-01
methods may have limited utility in ecosystem-based management as they show no or weak potential for early-warning. We therefore propose a multiple method approach for early detection of ecosystem regime shifts in monitoring data that may be useful in informing timely management actions in the face...
Classification by a neural network approach applied to non destructive testing
International Nuclear Information System (INIS)
Lefevre, M.; Preteux, F.; Lavayssiere, B.
1995-01-01
Radiography is used by EDF for pipe inspection in nuclear power plants in order to detect defects. The radiographs obtained are then digitized in a well-defined protocol. EDF's aim is to develop a non-destructive testing system for recognizing defects. In this paper, we describe the procedure for recognizing areas with defects. We first present the digitization protocol, characterize the poor quality of the images under study, and propose a procedure to enhance defects. We then examine the problem raised by the choice of good features for classification. After showing that statistical or standard textural features such as homogeneity, entropy or contrast are not relevant, we develop a geometrical-statistical approach based on the cooperation between a study of signal correlations and an analysis of regional extrema. The principle consists of analysing and comparing, for areas with and without defects, the evolution of conditional probability matrices for increasing neighborhood sizes, the shape of variograms, and the location of regional minima. We demonstrate that the anisotropy and surface of series of 'comet tails' associated with probability matrices, variogram slopes and statistical indices, and the locations of regional extrema are features able to discriminate areas with defects from areas without any. The classification is then performed by a neural network, whose structure, properties and learning mechanisms are detailed. Finally we discuss the results. (authors). 21 refs., 5 figs
Nguyen, D.; Wagner, W.; Naeimi, V.; Cao, S.
2015-04-01
Recent studies have shown the potential of Synthetic Aperture Radar (SAR) for mapping rice fields and some other vegetation types. For rice field classification, conventional techniques have mostly been used, including manual threshold-based and supervised classification approaches. The challenge of the threshold-based approach is to find acceptable thresholds for each individual SAR scene. Furthermore, the influence of the local incidence angle on backscatter hinders using a single threshold for the entire scene. Similarly, the supervised classification approach requires different training samples for different output classes. In the case of rice crops, supervised classification using temporal data requires different training datasets to perform the classification procedure, which might lead to inconsistent mapping results. In this study we present an automatic method to identify rice crop areas by extracting phenological parameters after performing an empirical regression-based normalization of the backscatter to a reference incidence angle. The method is evaluated in the Red River Delta (RRD), Vietnam using the time series of ENVISAT Advanced SAR (ASAR) Wide Swath (WS) mode data. Compared to the reference data, the rice mapping algorithm achieves Completeness (User accuracy), Correctness (Producer accuracy) and Quality (Overall accuracy) of 88.8%, 92.5% and 83.9%, respectively. The total area of the classified rice fields corresponds well to the total rice cultivation area given by the official statistics in Vietnam (R2 = 0.96). The results indicate that a phenology-based classification approach using incidence-angle-normalized backscatter time series can achieve high classification accuracies. In addition, the method is not only useful for large-scale early mapping of rice fields in the Red River Delta using the current and future C-band Sentinel-1A&B backscatter data but also might be applied to other rice
A simplified approach for the molecular classification of glioblastomas.
Directory of Open Access Journals (Sweden)
Marie Le Mercier
Full Text Available Glioblastoma (GBM) is the most common malignant primary brain tumor in adults and exhibits striking aggressiveness. Although GBMs constitute a single histological entity, they exhibit considerable variability in biological behavior, resulting in significant differences in terms of prognosis and response to treatment. In an attempt to better understand the biology of GBM, many groups have performed high-scale profiling studies based on gene or protein expression. These studies have revealed the existence of several GBM subtypes. Although a clear consensus has yet to emerge, two to four major subtypes have been identified. Interestingly, these different subtypes are associated with both differential prognoses and responses to therapy. In the present study, we investigated an alternative immunohistochemistry (IHC)-based approach to achieve a molecular classification for GBM. For this purpose, a cohort of 100 surgical GBM samples was retrospectively evaluated by immunohistochemical analysis of EGFR, PDGFRA and p53. The quantitative analysis of these immunostainings allowed us to identify the following two GBM subtypes: the "Classical-like" (CL) subtype, characterized by EGFR-positive and p53- and PDGFRA-negative staining, and the "Proneural-like" (PNL) subtype, characterized by p53- and/or PDGFRA-positive staining. This classification represents an independent prognostic factor in terms of overall survival compared to age, extent of resection and adjuvant treatment, with a significantly longer survival associated with the PNL subtype. Moreover, these two GBM subtypes exhibited different responses to chemotherapy. The addition of temozolomide to conventional radiotherapy significantly improved the survival of patients belonging to the CL subtype, but it did not affect the survival of patients belonging to the PNL subtype. We have thus shown that it is possible to differentiate between clinically relevant subtypes of GBM by using IHC
Classification of Marital Relationships: An Empirical Approach.
Snyder, Douglas K.; Smith, Gregory T.
1986-01-01
Derives an empirically based classification system of marital relationships, employing a multidimensional self-report measure of marital interaction. Spouses' profiles on the Marital Satisfaction Inventory for samples of clinic and nonclinic couples were subjected to cluster analysis, resulting in separate five-group typologies for husbands and…
Classification of Noisy Data: An Approach Based on Genetic Algorithms and Voronoi Tessellation
DEFF Research Database (Denmark)
Khan, Abdul Rauf; Schiøler, Henrik; Knudsen, Torben
Classification is one of the major constituents of the data-mining toolkit. The well-known methods for classification are built on either the principle of logic or statistical/mathematical reasoning. In this article we propose: (1) a different strategy, which is based on the po...
Vuurpijl, L.; Schomaker, L.
2000-01-01
This paper describes a two-stage classification method for (1) classification of isolated characters and (2) verification of the classification result. Character prototypes are generated using hierarchical clustering. For those prototypes known to sometimes produce wrong classification results, a
Efficient Fingercode Classification
Sun, Hong-Wei; Law, Kwok-Yan; Gollmann, Dieter; Chung, Siu-Leung; Li, Jian-Bin; Sun, Jia-Guang
In this paper, we present an efficient fingerprint classification algorithm, an essential component in many critical security application systems, e.g. systems in the e-government and e-finance domains. Fingerprint identification is one of the most important security requirements in homeland security systems such as personnel screening and anti-money laundering. The problem of fingerprint identification involves searching (matching) the fingerprint of a person against the fingerprints of all registered persons. To enhance performance and reliability, a common approach is to reduce the search space by first classifying the fingerprints and then performing the search in the respective class. Jain et al. proposed a fingerprint classification algorithm based on a two-stage classifier, which uses a K-nearest neighbor classifier in its first stage. The algorithm is based on the fingercode representation, an encoding of fingerprints that has been demonstrated to be an effective fingerprint biometric scheme because of its ability to capture both local and global details in a fingerprint image. We enhance this approach by improving the efficiency of the K-nearest neighbor classifier for fingercode-based fingerprint classification. Our research first investigates various fast search algorithms in vector quantization (VQ) and their potential application in fingerprint classification, and then proposes two efficient algorithms based on the pyramid-based search algorithms in VQ. Experimental results on DB1 of FVC 2004 demonstrate that our algorithms can outperform the full search algorithm and the original pyramid-based search algorithms in terms of computational efficiency without sacrificing accuracy.
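The first-stage K-nearest-neighbor step can be sketched in a few lines; the toy 2-D "fingercode" vectors and class labels below are illustrative assumptions, not the FVC 2004 data or the actual fingercode encoding:

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Majority vote among the k training fingercodes closest (Euclidean
    distance) to the query vector. train is a list of (vector, class) pairs.
    This is the full-search baseline; the paper's pyramid-based VQ search
    prunes candidates to reach the same answer faster."""
    neighbors = sorted(train, key=lambda vc: math.dist(vc[0], query))[:k]
    return Counter(c for _, c in neighbors).most_common(1)[0][0]

# Toy vectors: class A clusters near (0, 0), class B near (10, 10).
train = [((0, 0), "A"), ((1, 0), "A"), ((0, 1), "A"),
         ((10, 10), "B"), ((9, 10), "B"), ((10, 9), "B")]
print(knn_classify(train, (0.5, 0.5)))  # -> A
```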
Regime shifts and resilience in China's coastal ecosystems.
Zhang, Ke
2016-02-01
Regime shift often results in large, abrupt, and persistent changes in the provision of ecosystem services and can therefore have significant impacts on human wellbeing. Understanding regime shifts has profound implications for ecosystem recovery and management. China's coastal ecosystems have experienced substantial deterioration within the past decades, at a scale and speed the world has never seen before. Yet, information about this coastal ecosystem change from a dynamics perspective is quite limited. In this review, I synthesize existing information on coastal ecosystem regime shifts in China and discuss their interactions and cascading effects. The accumulation of regime shifts in China's coastal ecosystems suggests that the desired system resilience has been profoundly eroded, increasing the potential of abrupt shifts to undesirable states at a larger scale, especially given multiple escalating pressures. Policy and management strategies need to incorporate resilience approaches in order to cope with future challenges and avoid major losses in China's coastal ecosystem services.
Active Learning for Text Classification
Hu, Rong
2011-01-01
Text classification approaches are used extensively to solve real-world challenges. The success or failure of text classification systems hangs on the datasets used to train them; without a good dataset it is impossible to build a quality system. This thesis examines the applicability of active learning in text classification for the rapid and economical creation of labelled training data. Four main contributions are made in this thesis. First, we present two novel selection strategies to cho...
Detecting spatial regimes in ecosystems
Sundstrom, Shana M.; Eason, Tarsha; Nelson, R. John; Angeler, David G.; Barichievy, Chris; Garmestani, Ahjond S.; Graham, Nicholas A.J.; Granholm, Dean; Gunderson, Lance; Knutson, Melinda; Nash, Kirsty L.; Spanbauer, Trisha; Stow, Craig A.; Allen, Craig R.
2017-01-01
Research on early warning indicators has generally focused on assessing temporal transitions with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory-based method, on both terrestrial and aquatic animal data (U.S. Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps and multivariate analyses such as nMDS and cluster analysis. We successfully detected spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.
Directory of Open Access Journals (Sweden)
Toni Maree Dwan
2015-03-01
Full Text Available Approaches to classifying neuropsychological impairment after brain tumor vary according to testing level (individual tests, domains or global index) and source of reference (i.e., norms, controls and premorbid functioning). This study aimed to compare rates of impairment according to different classification approaches. Participants were 44 individuals (57% female) with a primary brain tumor diagnosis (mean age = 45.6 years) and 44 matched control participants (59% female, mean age = 44.5 years). All participants completed a test battery assessing premorbid IQ (Wechsler Adult Reading Test), attention/processing speed (Digit Span, Trail Making Test A), memory (Hopkins Verbal Learning Test – Revised, Rey-Osterrieth Complex Figure recall) and executive function (Trail Making Test B, Rey-Osterrieth Complex Figure copy, Controlled Oral Word Association Test). Results indicated that across the different sources of reference, 86-93% of participants were classified as impaired at a test-specific level, 61-73% at a domain-specific level, and 32-50% at a global level. Rates of impairment did not significantly differ according to source of reference (p > .05); however, at the individual participant level, classification based on estimated premorbid IQ was often inconsistent with classification based on the norms or controls. Participants with brain tumor performed significantly poorer than matched controls on tests of neuropsychological functioning, including executive function (p = .001) and memory (p < .05). These results highlight the need to examine individuals' performance across a multi-faceted neuropsychological test battery to avoid over- or under-estimation of impairment.
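A norms- or controls-referenced impairment decision of the kind compared above typically reduces to a z-score cutoff; a minimal sketch (the 1.5-SD cutoff and the control scores are illustrative assumptions, not the study's exact criteria):

```python
import math

def z_score(score, reference):
    """Standardize a test score against a reference sample (e.g., matched
    controls or normative data), using the sample standard deviation."""
    n = len(reference)
    mean = sum(reference) / n
    sd = math.sqrt(sum((s - mean) ** 2 for s in reference) / (n - 1))
    return (score - mean) / sd

def impaired(score, reference, cutoff=-1.5):
    """Flag impairment when performance falls 1.5 SD or more below the
    reference mean -- one common convention; cutoffs vary across studies."""
    return z_score(score, reference) <= cutoff

controls = [10, 12, 14]        # illustrative control scores: mean 12, SD 2
print(z_score(8, controls))    # -> -2.0
print(impaired(8, controls))   # -> True
print(impaired(11, controls))  # z = -0.5 -> False
```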
Directory of Open Access Journals (Sweden)
LI Jian-Wei
2014-08-01
Full Text Available On the basis of the cluster validity function based on geometric probability in literature [1, 2], we propose a cluster analysis method based on geometric probability for processing large amounts of data in a rectangular area. The basic idea is top-down stepwise refinement: first into categories, then into subcategories. At every clustering level, the cluster validity function based on geometric probability is applied first to determine the clusters and the gathering direction, and then the cluster centers and borders are determined. Through TM remote sensing image classification examples, the method is compared with the supervised and unsupervised classification in ERDAS and with the cluster analysis method based on geometric probability in a two-dimensional square proposed in literature [2]. Results show that the proposed method can significantly improve classification accuracy.
Weather types and the regime of wildfires in Portugal
Pereira, M. G.; Trigo, R. M.; Dacamara, C. C.
2009-04-01
An objective classification scheme, as developed by Trigo and DaCamara (2000), was applied to classify the daily atmospheric circulation affecting Portugal between 1980 and 2007 into a set of 10 basic weather types (WTs). The classification scheme relies on a set of atmospheric circulation indices, namely southerly flow (SF), westerly flow (WF), total flow (F), southerly shear vorticity (ZS), westerly shear vorticity (ZW) and total vorticity (Z). The weather-typing approach, together with surface meteorological variables (e.g. intensity and direction of the geostrophic wind, maximum and minimum temperature, and precipitation), was then associated with wildfire events as recorded in the official Portuguese fire database, which contains information on each fire that occurred in the 18 districts of Continental Portugal within the same period (>450,000 events). The objective of this study is to explore the dependence of wildfire activity on weather and climate and to evaluate the potential of WTs to discriminate among recorded wildfires with respect to their occurrence and development. Results show that days characterised by surface flow with an eastern component (i.e. NE, E and SE) account for a high percentage of daily burnt area, as opposed to surface westerly flow (NW, W and SW), which represents about a quarter of the total number of days but accounts for only a very low percentage of active fires and of burnt area. Meteorological variables such as minimum and maximum temperatures, which are closely associated with surface wind intensity and direction, also discriminate well between the different types of fire events. Trigo R.M., DaCamara C. (2000) "Circulation Weather Types and their impact on the precipitation regime in Portugal". Int J of Climatology, 20, 1559-1581.
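The directional part of such a circulation-index scheme maps the resultant of the westerly (WF) and southerly (SF) flow components onto compass sectors; a sketch under the convention that types name where the flow comes from (the vorticity-based cyclonic, anticyclonic and hybrid types of the full Trigo and DaCamara scheme are omitted here):

```python
import math

def flow_sector(sf, wf):
    """Assign one of eight 45-degree directional types from the southerly (sf)
    and westerly (wf) flow components. Pure southerly flow (sf > 0, wf = 0)
    is type 'S', pure westerly flow is 'W', and so on. A simplified sketch:
    the full weather-typing scheme adds cyclonic/anticyclonic and hybrid
    types based on the shear vorticity indices."""
    sectors = ["S", "SW", "W", "NW", "N", "NE", "E", "SE"]
    angle = math.degrees(math.atan2(wf, sf)) % 360
    return sectors[int(((angle + 22.5) % 360) // 45)]

print(flow_sector(1, 0))   # pure southerly flow -> S
print(flow_sector(0, -1))  # pure easterly flow  -> E
print(flow_sector(1, 1))   # flow from the southwest -> SW
```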
Martin, Steve; Nutley, Sandra; Downe, James; Grace, Clive
2016-03-01
Approaches to performance assessment have been described as 'performance regimes', but there has been little analysis of what is meant by this concept and whether it has any real value. We draw on four perspectives on regimes - 'institutions and instruments', 'risk regulation regimes', 'internal logics and effects' and 'analytics of government' - to explore how the concept of a multi-dimensional regime can be applied to performance assessment in public services. We conclude that the concept is valuable. It helps to frame comparative and longitudinal analyses of approaches to performance assessment and draws attention to the ways in which public service performance regimes operate at different levels, how they change over time and what drives their development. Areas for future research include analysis of the impacts of performance regimes and interactions between their visible features (such as inspections, performance indicators and star ratings) and the veiled rationalities which underpin them.
School Refusal Behavior: Classification, Assessment, and Treatment Issues.
Lee, Marcella I.; Miltenberger, Raymond G.
1996-01-01
Discusses diagnostic and functional classification, assessment, and treatment approaches for school refusal behavior. Diagnostic classification focuses on separation anxiety disorder, specific phobia, social phobia, depression, and truancy. Functional classification focuses on the maintaining consequences of the behavior, such as avoidance of…
Directory of Open Access Journals (Sweden)
Pengfei Li
2014-01-01
Full Text Available To deal with the difficulty of obtaining a large number of fault samples under practical conditions for mechanical fault diagnosis, a hybrid method combining wavelet packet decomposition and support vector classification (SVC) is proposed. The wavelet packet is employed to decompose the vibration signal to obtain the energy ratio in each frequency band. Taking the energy ratios as feature vectors, the pattern recognition results are obtained by the SVC. The rolling bearing and gear fault diagnostic results on a typical experimental platform show that the present approach is robust to noise and has higher classification accuracy, and thus provides a better way to diagnose mechanical faults under the condition of small fault samples.
Quantifying the fire regime distributions for severity in Yosemite National Park, California, USA
Thode, Andrea E.; van Wagtendonk, Jan W.; Miller, Jay D.; Quinn, James F.
2011-01-01
This paper quantifies current fire severity distributions for 19 different fire-regime types in Yosemite National Park, California, USA. Landsat Thematic Mapper remote sensing data are used to map burn severity for 99 fires (cumulatively over 97 000 ha) that burned in Yosemite over a 20-year period. These maps are used to quantify the frequency distributions of fire severity by fire-regime type. A classification is created for the resultant distributions and they are discussed within the context of four vegetation zones: the foothill shrub and woodland zone; the lower montane forest zone; the upper montane forest zone and the subalpine forest zone. The severity distributions can form a building block from which to discuss current fire regimes across the Sierra Nevada in California. This work establishes a framework for comparing the effects of current fires on our landscapes with our notions of how fires historically burned, and how current fire severity distributions differ from our desired future conditions. As this process is refined, a new set of information will be available to researchers and land managers to help understand how fire regimes have changed from the past and how we might attempt to manage them in the future.
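The frequency distribution of severity classes for one fire-regime type can be tabulated directly from a burn-severity index such as dNBR; the class thresholds below are illustrative placeholders, not the calibrated values used in the Yosemite study:

```python
def severity_distribution(index_values, thresholds=(69, 315, 640)):
    """Bin burn-severity index values into four classes (unchanged, low,
    moderate, high) and return their relative frequencies. The threshold
    values are hypothetical; real studies calibrate them against field data."""
    counts = [0, 0, 0, 0]
    for v in index_values:
        # Number of thresholds met gives the class index (0-3).
        counts[sum(v >= t for t in thresholds)] += 1
    n = len(index_values)
    return [c / n for c in counts]

# Four pixels, one per class, give a uniform severity distribution.
print(severity_distribution([10, 100, 400, 700]))  # -> [0.25, 0.25, 0.25, 0.25]
```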
Improvement of Classification of Enterprise Circulating Funds
Rohanova Hanna O.
2014-01-01
The goal of the article is to reveal possibilities for increasing the efficiency of managing enterprise circulating funds by improving their classification features. Having analysed, systematised and supplemented the approaches of many economists to the classification of enterprise circulating funds, the article offers a grouping of classification features of enterprise circulating funds. As a result of the study, the article offers an expanded classification of circulating funds, ...
The development of a classification system for inland aquatic ...
African Journals Online (AJOL)
A classification system is described that was developed for inland aquatic ecosystems in South Africa, including wetlands. The six-tiered classification system is based on a top-down, hierarchical classification of aquatic ecosystems, following the functionally-oriented hydrogeomorphic (HGM) approach to classification but ...
The Importance of Classification to Business Model Research
Susan Lambert
2015-01-01
Purpose: To bring to the fore the scientific significance of classification and its role in business model theory building. To propose a method by which existing classifications of business models can be analyzed and new ones developed. Design/Methodology/Approach: A review of the scholarly literature relevant to classifications of business models is presented along with a brief overview of classification theory applicable to business model research. Existing business model classification...
Specific classification of financial analysis of enterprise activity
Directory of Open Access Journals (Sweden)
Synkevych Nadiia I.
2014-01-01
Full Text Available Although a wide variety of classifications of types of financial analysis of enterprise activity can be found in modern scientific literature, differing in their approach to classification and in the number and content of classification features, no comprehensive comparison and analysis of the existing classifications has been done. This explains the urgency of this study. The article examines classifications of types of financial analysis proposed by various scientists and presents its own approach to this problem. Based on the results of the analysis, the article improves and builds up a specific classification of financial analysis of enterprise activity and offers a classification by the following features: objects, subjects, goals of study, automation level, time period of the analytical base, scope of study, organisation system, classification features of the subject, spatial belonging, sufficiency, information sources, periodicity, criterial base, method of data selection for analysis, and time direction. All types of financial analysis differ significantly in their inherent properties and parameters depending on the goals of the financial analysis. The developed specific classification provides subjects of financial analysis of enterprise activity with a possibility to identify the specific type of financial analysis that would correctly meet the set goals.
Maximum mutual information regularized classification
Wang, Jim Jing-Yan
2014-09-07
In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
Maximum mutual information regularized classification
Wang, Jim Jing-Yan; Wang, Yi; Zhao, Shiguang; Gao, Xin
2014-01-01
In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
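The regularizer's central quantity, the mutual information between classification responses and true labels, has a simple plug-in estimate from empirical counts; a sketch (the paper embeds this quantity in a gradient-based objective via entropy estimation, which is not reproduced here):

```python
import math
from collections import Counter

def mutual_information(responses, labels):
    """Plug-in estimate of I(R; Y) = H(R) + H(Y) - H(R, Y) in bits,
    from the empirical distributions of discrete responses and labels."""
    n = len(responses)
    def entropy(items):
        return -sum((c / n) * math.log2(c / n) for c in Counter(items).values())
    return entropy(responses) + entropy(labels) - entropy(list(zip(responses, labels)))

# Responses that match a balanced binary label exactly carry 1 bit about it;
# responses independent of the labels carry none.
print(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))  # -> 1.0
print(mutual_information([0, 1, 0, 1], [0, 0, 1, 1]))  # -> 0.0
```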
A Novel Approach to ECG Classification Based upon Two-Layered HMMs in Body Sensor Networks
Liang, Wei; Zhang, Yinlong; Tan, Jindong; Li, Yang
2014-01-01
This paper presents a novel approach to ECG signal filtering and classification. Unlike the traditional techniques which aim at collecting and processing the ECG signals with the patient being still, lying in bed in hospitals, our proposed algorithm is intentionally designed for monitoring and classifying the patient's ECG signals in the free-living environment. The patients are equipped with wearable ambulatory devices the whole day, which facilitates the real-time heart attack detection. In ECG preprocessing, an integral-coefficient-band-stop (ICBS) filter is applied, which omits time-consuming floating-point computations. In addition, two-layered Hidden Markov Models (HMMs) are applied to achieve ECG feature extraction and classification. The periodic ECG waveforms are segmented into ISO intervals, P subwave, QRS complex and T subwave respectively in the first HMM layer where expert-annotation assisted Baum-Welch algorithm is utilized in HMM modeling. Then the corresponding interval features are selected and applied to categorize the ECG into normal type or abnormal type (PVC, APC) in the second HMM layer. For verifying the effectiveness of our algorithm on abnormal signal detection, we have developed an ECG body sensor network (BSN) platform, whereby real-time ECG signals are collected, transmitted, displayed and the corresponding classification outcomes are deduced and shown on the BSN screen. PMID:24681668
The Displacement of Regimes of Action in the Armed Forces
DEFF Research Database (Denmark)
Holsting, Vilhelm Stefan
The aim of this paper is to explore how one might approach the profession of military command using the concepts of regimes of justification by Luc Boltanski and Laurent Thévenot, in an analysis of the contemporary disputes and tensions between political and professional actors and in a time where... of their contemporary role, responsibility and challenges) and an analysis of the displacement of military regimes in the light of economic and security changes at the societal level. I argue that the entrance of new regimes of justification is challenging and displacing the traditional professional justificatory...
A Novel Anti-classification Approach for Knowledge Protection.
Lin, Chen-Yi; Chen, Tung-Shou; Tsai, Hui-Fang; Lee, Wei-Bin; Hsu, Tien-Yu; Kao, Yuan-Hung
2015-10-01
Classification is the problem of identifying the category to which new data belong, on the basis of a set of training data whose category membership is known. Its applications are widespread, for example in the medical science domain. The protection of classification knowledge has received increasing attention in recent years because of the popularity of cloud environments. In this paper, we propose a Shaking Sorted-Sampling (triple-S) algorithm for protecting the classification knowledge of a dataset. The triple-S algorithm sorts the data of an original dataset according to the projection results of principal components analysis so that the features of adjacent data are similar. Then, we generate noise data with incorrect classes and add those data to the original dataset. In addition, we develop an effective positioning strategy, determining the positions at which noise data are added to the original dataset, to ensure the restoration of the original dataset after removing those noise data. The experimental results show that the disturbance effect of the triple-S algorithm on the CLC, MySVM, and LibSVM classifiers increases as the noise data ratio increases. In addition, compared with existing methods, the disturbance effect of the triple-S algorithm is more significant on MySVM and LibSVM once a certain amount of noise data has been added to the original dataset.
Improving the Computational Performance of Ontology-Based Classification Using Graph Databases
Directory of Open Access Journals (Sweden)
Thomas J. Lampoltshammer
2015-07-01
Full Text Available The increasing availability of very high-resolution remote sensing imagery (i.e., from satellites, airborne laser scanning, or aerial photography) represents both a blessing and a curse for researchers. The manual classification of these images, or other similar geo-sensor data, is time-consuming and leads to subjective and non-deterministic results. Due to this fact, (semi-)automated classification approaches are in high demand in affected research areas. Ontologies provide a proper way of automated classification for various kinds of sensor data, including remotely sensed data. However, the processing of data entities—so-called individuals—is one of the most cost-intensive computational operations within ontology reasoning. Therefore, an approach based on graph databases is proposed to overcome the issue of high time consumption in the classification task. The introduced approach shifts the classification task from the classical Protégé environment and its common reasoners to the proposed graph-based approaches. For validation, the authors tested the approach on a simulation scenario based on a real-world example. The results demonstrate a quite promising improvement of classification speed—up to 80,000 times faster than the Protégé-based approach.
Characterising the hydrological regime of an ungauged temporary river system: a case study.
D'Ambrosio, Ersilia; De Girolamo, Anna Maria; Barca, Emanuele; Ielpo, Pierina; Rulli, Maria Cristina
2017-06-01
Temporary streams are characterised by specific hydrological regimes, which influence ecosystem processes, groundwater and surface water interactions, sediment regime, nutrient delivery, water quality and ecological status. This paper presents a methodology to characterise and classify the regime of a temporary river in Southern Italy based on hydrological indicators (HIs) computed from long-term daily flow records. Using principal component analysis (PCA), a set of non-redundant indices was identified that describes the main characteristics of the hydrological regime in the study area. The indicators identified were the annual maximum 30- and 90-day means (DH4 and DH5), the number of zero-flow days (DL6), flow permanence (MF) and the 6-month seasonal predictability of dry periods (SD6). A methodology was also tested to estimate the selected HIs in ungauged river reaches. Watershed characteristics such as catchment area, gauging station elevation, mean watershed slope, mean annual rainfall, land use, soil hydraulic conductivity and available water content were derived for each site. The selected indicators were then linked to the catchment characteristics using regression analysis. Finally, MF and SD6 were used to classify the river reaches on the basis of their degree of intermittency. The methodology presented in this paper constitutes a useful tool for ecologists and water resource managers in the Water Framework Directive implementation process, which requires a characterisation of the hydrological regime and a 'river type' classification for all water bodies.
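The PCA-based pruning of redundant indicators can be illustrated as follows. The selection rule used here (keep one indicator per retained component, the one with the largest absolute loading) is a common simplification and not necessarily the authors' exact procedure; indicator names are those from the abstract:

```python
import numpy as np

def select_indicators(H, names, var_threshold=0.9):
    """Retain enough principal components to explain `var_threshold`
    of the variance, then keep one representative indicator per
    component: the one with the largest absolute loading."""
    Z = (H - H.mean(axis=0)) / H.std(axis=0)      # standardize indicators
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    explained = np.cumsum(s ** 2) / np.sum(s ** 2)
    k = int(np.searchsorted(explained, var_threshold)) + 1
    chosen = []
    for comp in Vt[:k]:
        name = names[int(np.argmax(np.abs(comp)))]
        if name not in chosen:
            chosen.append(name)
    return chosen
```

On a synthetic table where two indicators are nearly collinear and a third is independent, the routine keeps one of the collinear pair plus the independent one.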
Improving Prediction of Large-scale Regime Transitions
Gyakum, J. R.; Roebber, P.; Bosart, L. F.; Honor, A.; Bunker, E.; Low, Y.; Hart, J.; Bliankinshtein, N.; Kolly, A.; Atallah, E.; Huang, Y.
2017-12-01
Cool season atmospheric predictability over the CONUS on subseasonal time scales (1-4 weeks) is critically dependent upon the structure, configuration, and evolution of the North Pacific jet stream (NPJ). The NPJ can be perturbed on its tropical side on synoptic time scales by recurving and transitioning tropical cyclones (TCs) and on subseasonal time scales by longitudinally varying convection associated with the Madden-Julian Oscillation (MJO). Likewise, the NPJ can be perturbed on its poleward side on synoptic time scales by midlatitude and polar disturbances that originate over the Asian continent. These midlatitude and polar disturbances can often trigger downstream Rossby wave propagation across the North Pacific, North America, and the North Atlantic. The project team is investigating the following multiscale processes and features: the spatiotemporal distribution of cyclone clustering over the Northern Hemisphere; cyclone clustering as influenced by atmospheric blocking and the phases and amplitudes of the major teleconnection indices, ENSO and the MJO; composite and case study analyses of representative cyclone clustering events to establish the governing dynamics; regime change predictability horizons associated with cyclone clustering events; Arctic air mass generation and modification; life cycles of the MJO; and poleward heat and moisture transports of subtropical air masses. A critical component of the study is weather regime classification. These classifications are defined through: the spatiotemporal clustering of surface cyclogenesis; a general circulation metric combining 500-hPa and dynamic tropopause data; and Self-Organizing Maps (SOMs), constructed from dynamic tropopause and 850-hPa equivalent potential temperature data. The resultant lattice of nodes is used to categorize synoptic classes and their predictability, as well as to determine the robustness of the CFSv2 model climate relative to observations. Transition pathways between these
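A self-organizing map of the kind used for the synoptic classification above can be sketched compactly. This is a generic 1-D SOM on toy two-dimensional data, not the study's configuration: the grid size, input fields, and training schedule are all illustrative:

```python
import numpy as np

def train_som(X, n_nodes=4, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal 1-D self-organizing map: each node holds a prototype
    pattern; the best-matching node and its grid neighbours are pulled
    toward each presented sample, with shrinking rate and radius."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_nodes, replace=False)].astype(float)
    grid = np.arange(n_nodes)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                 # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 1e-3    # decaying neighbourhood
        for x in X[rng.permutation(len(X))]:
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best-matching unit
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            W += lr * h[:, None] * (x - W)
    return W

def classify(W, x):
    """Assign a sample to the node (synoptic class) of its best match."""
    return int(np.argmin(((W - x) ** 2).sum(axis=1)))
```

After training on two well-separated clusters, samples from the two clusters map to different nodes, which is the basic mechanism behind using the node lattice as a set of synoptic classes.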
Xu, Dazhi; Cao, Jianshu
2016-08-01
The concept of the polaron, which emerged from condensed-matter physics, describes the dynamical interaction of a moving particle with its surrounding bosonic modes. This concept has been developed into a useful method for treating open quantum systems across the complete range of system-bath coupling strengths. In particular, the polaron transformation approach is valid in the intermediate coupling regime, where the Redfield equation or Fermi's golden rule fails. In the polaron frame, the equilibrium distribution obtained by perturbative expansion deviates from the canonical distribution, going beyond the usual weak-coupling assumption in thermodynamics. A polaron-transformed Redfield equation (PTRE) not only reproduces the dissipative quantum dynamics but also provides an accurate and efficient way to calculate non-equilibrium steady states. Applications of the PTRE approach to problems such as exciton diffusion, heat transport and light-harvesting energy transfer are presented.
Classification of line features from remote sensing data
Kolankiewiczová, Soňa
2009-01-01
This work deals with the object-based classification of high-resolution data. The aim of the thesis is to develop an acceptable classification process for linear features (roads and railways) from high-resolution satellite images. The first part surveys different approaches to linear feature classification and compares the theoretical differences between object-oriented and pixel-based classification. The linear feature classification itself was developed in the second part. The high-resolution...
Classification, diagnosis, and approach to treatment for angioedema
DEFF Research Database (Denmark)
Cicardi, M; Aberer, W; Banerji, A
2014-01-01
Angioedema is defined as localized and self-limiting edema of the subcutaneous and submucosal tissue, due to a temporary increase in vascular permeability caused by the release of vasoactive mediator(s). When angioedema recurs without significant wheals, the patient should be diagnosed with angioedema as a distinct disease. In the absence of an accepted classification, the different types of angioedema are not uniquely identified. For this reason, the European Academy of Allergy and Clinical Immunology gave its patronage to a consensus conference aimed at classifying angioedema. Four types of acquired and three types of hereditary angioedema were identified as separate forms from the analysis of the literature and were presented in detail at the meeting. Here, we summarize the analysis of the data and the resulting classification of angioedema.
Solti, Imre; Cooke, Colin R; Xia, Fei; Wurfel, Mark M
2009-11-01
This paper compares the performance of keyword- and machine learning-based chest x-ray report classification for Acute Lung Injury (ALI). ALI mortality is approximately 30 percent. High mortality is, in part, a consequence of delayed manual chest x-ray classification. An automated system could reduce the time to recognize ALI and lead to reductions in mortality. For our study, 96 and 857 chest x-ray reports in two corpora were labeled by domain experts for ALI. We developed a keyword-based and a Maximum Entropy-based classification system. Word unigrams and character n-grams provided the features for the machine learning system. The Maximum Entropy algorithm with character 6-grams achieved the highest performance (Recall=0.91, Precision=0.90 and F-measure=0.91) on the 857-report corpus. This study has shown that for the classification of ALI chest x-ray reports, the machine learning approach is superior to the keyword-based system and achieves results comparable to the highest performing physician annotators.
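Character n-gram features of the kind used by the Maximum Entropy system are easy to extract. The sketch below pairs them with a nearest-centroid classifier as a simple stand-in: the paper's actual classifier was Maximum Entropy, and the example report snippets here are invented:

```python
import math
from collections import Counter

def char_ngrams(text, n=6):
    """Character n-gram counts of a report (n=6 performed best in the study)."""
    text = " ".join(text.lower().split())       # normalize case and whitespace
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    num = sum(v * b.get(k, 0) for k, v in a.items())
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def centroid_classify(train, text, n=6):
    """train: {label: [report texts]}. Nearest-centroid stand-in for the
    paper's Maximum Entropy classifier, over the same n-gram features."""
    feats = char_ngrams(text, n)
    best, best_sim = None, -1.0
    for label, texts in train.items():
        centroid = Counter()
        for t in texts:
            centroid.update(char_ngrams(t, n))
        sim = cosine(feats, centroid)
        if sim > best_sim:
            best, best_sim = label, sim
    return best
```

Overlapping character 6-grams capture multi-word cues ("ateral", "infilt") without any tokenization, which is part of why they transfer well to noisy clinical text.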
Use of a Novel Grammatical Inference Approach in Classification of Amyloidogenic Hexapeptides
Directory of Open Access Journals (Sweden)
Wojciech Wieczorek
2016-01-01
Full Text Available The present paper is a novel contribution to the field of bioinformatics by using grammatical inference in the analysis of data. We developed an algorithm for generating star-free regular expressions which turned out to be good recommendation tools, as they are characterized by a relatively high correlation coefficient between the observed and predicted binary classifications. The experiments have been performed for three datasets of amyloidogenic hexapeptides, and our results are compared with those obtained using the graph approaches, the current state-of-the-art methods in heuristic automata induction, and the support vector machine. The results showed the superior performance of the new grammatical inference algorithm on fixed-length amyloid datasets.
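A star-free regular expression acting as a binary hexapeptide classifier might look like the following. The pattern here is purely illustrative; real patterns are induced from training data by the grammatical-inference algorithm, not hand-written:

```python
import re

# Purely illustrative pattern: hydrophobic residues (V, I, L, F) at
# alternating positions. Real star-free expressions are induced from
# the amyloid datasets by the grammatical-inference algorithm.
AMYLOID_RE = re.compile(r"^[VILF][VILF][A-Z][VILF][A-Z][VILF]$")

def classify_hexapeptide(seq):
    """Binary classification of a six-residue peptide:
    1 = predicted amyloidogenic, 0 = not."""
    if len(seq) != 6:
        raise ValueError("expected a hexapeptide")
    return 1 if AMYLOID_RE.match(seq) else 0
```

Because the datasets contain fixed-length sequences, anchored expressions over a finite alphabet suffice, which is what makes star-free (loop-free) expressions a natural hypothesis class here.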
Fific, Mario; Little, Daniel R.; Nosofsky, Robert M.
2010-01-01
We formalize and provide tests of a set of logical-rule models for predicting perceptual classification response times (RTs) and choice probabilities. The models are developed by synthesizing mental-architecture, random-walk, and decision-bound approaches. According to the models, people make independent decisions about the locations of stimuli…
Classification of Arctic, midlatitude and tropical clouds in the mixed-phase temperature regime
Costa, Anja; Meyer, Jessica; Afchine, Armin; Luebke, Anna; Günther, Gebhard; Dorsey, James R.; Gallagher, Martin W.; Ehrlich, Andre; Wendisch, Manfred; Baumgardner, Darrel; Wex, Heike; Krämer, Martina
2017-10-01
The degree of glaciation of mixed-phase clouds constitutes one of the largest uncertainties in climate prediction. In order to better understand cloud glaciation, cloud spectrometer observations are presented in this paper, which were made in the mixed-phase temperature regime between 0 and -38 °C (273 to 235 K), where cloud particles can either be frozen or liquid. The extensive data set covers four airborne field campaigns providing a total of 139 000 1 Hz data points (38.6 h within clouds) over Arctic, midlatitude and tropical regions. We develop algorithms, combining the information on number concentration, size and asphericity of the observed cloud particles to classify four cloud types: liquid clouds, clouds in which liquid droplets and ice crystals coexist, fully glaciated clouds after the Wegener-Bergeron-Findeisen process and clouds where secondary ice formation occurred. We quantify the occurrence of these cloud groups depending on the geographical region and temperature and find that liquid clouds dominate our measurements during the Arctic spring, while clouds dominated by the Wegener-Bergeron-Findeisen process are most common in midlatitude spring. The coexistence of liquid water and ice crystals is found over the whole mixed-phase temperature range in tropical convective towers in the dry season. Secondary ice is found at midlatitudes at -5 to -10 °C (268 to 263 K) and at higher altitudes, i.e. lower temperatures in the tropics. The distribution of the cloud types with decreasing temperature is shown to be consistent with the theory of evolution of mixed-phase clouds. With this study, we aim to contribute to a large statistical database on cloud types in the mixed-phase temperature regime.
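The threshold logic behind such a four-way classification can be sketched as below. The cut-off values are invented for illustration and are not the published algorithm's thresholds, which combine number concentration, size, and asphericity:

```python
def classify_cloud(temperature_c, ice_fraction, ice_number_conc_per_l):
    """Toy decision rules in the spirit of the paper's classification;
    all thresholds are illustrative, not the published values."""
    if not -38.0 <= temperature_c <= 0.0:
        return "outside mixed-phase regime"
    if ice_fraction < 0.1:
        return "liquid"
    if ice_fraction > 0.9:
        # unusually many ice crystals at warm subzero temperatures hint
        # at secondary ice production (e.g. rime splintering)
        if temperature_c > -10.0 and ice_number_conc_per_l > 100.0:
            return "secondary ice"
        return "fully glaciated (WBF)"
    return "coexistence"
```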
International Nuclear Information System (INIS)
Liles, D.R.
1982-01-01
Internal boundaries in multiphase flow greatly complicate fluid-dynamic and heat-transfer descriptions. Different flow regimes or topological configurations can have radically dissimilar interfacial and wall mass, momentum, and energy exchanges. Proper modeling of the flow dynamics requires estimates of these rates. In this paper the common flow regimes for gas-liquid systems are defined and the techniques used to estimate the extent of a particular regime are described. Also, the current computer-code procedures are delineated and a potentially better method is introduced.
V. Baban; I. Gamaliy.
2014-01-01
This paper analyses the hydrological regime of the fishery water reservoirs in the basin of the Southern Bug in the Vinnytsia region. The investigated water bodies were systematized by the authors on the basis of a previously developed classification and typology of water reservoirs.
Water regime of steam power plants
International Nuclear Information System (INIS)
Oesz, Janos
2011-01-01
The water regime of water-steam thermal power plants (the secondary side of pressurized water reactors (PWRs); fossil-fired thermal power plants - referred to as steam power plants) has changed in the past 30 years, due to a shift from a water chemistry approach to a water regime approach. The article summarizes measures (realised by the chemists of NPP Paks) by which the secondary side of NPP Paks has become a high-purity water-steam power plant and by which the water chemistry stress corrosion risk of the heat transfer tubes in the VVER-440 steam generators was minimized. The measures can also be applied to the water regime of fossil-fired thermal power plants with super- and subcritical steam pressure. Based on the reliability analogue of PWR steam generators, the water regime can be defined as the harmony of construction, material(s) and water chemistry, which needs to be provided not only in the steam generators (boiler) but in each heat exchanger of the steam power plant: - Construction determines the processes of flow, heat and mass transfer and their local inequalities; - Material(s) determines the minimal rate of general corrosion and the sensitivity to local corrosion damage; - Water chemistry influences the general corrosion of the material(s) and the transport of corrosion products, as well as the formation of a local corrosion environment. (orig.)
What Do We Know about Hybrid Regimes after Two Decades of Scholarship?
Directory of Open Access Journals (Sweden)
Mariam Mufti
2018-06-01
Full Text Available In two decades of scholarship on hybrid regimes two significant advancements have been made. First, scholars have emphasized that the hybrid regimes that emerged in the post-Cold War era should not be treated as diminished sub-types of democracy, and second, regime type is a multi-dimensional concept. This review essay further contends that losing the lexicon of hybridity and focusing on a single dimension of regime type—flawed electoral competition—has prevented an examination of extra-electoral factors that are necessary for understanding how regimes are differently hybrid, why there is such immense variation in the outcome of elections and why these regimes are constantly in flux. Therefore, a key recommendation emerging from this review of the scholarship is that to achieve a more thorough, multi-dimensional assessment of hybrid regimes, further research ought to be driven by nested research designs in which qualitative and quantitative approaches can be used to advance mid-range theory building.
Classification of wetlands and deepwater habitats of the United States
Cowardin, L.M.; Carter, V.; Golet, F.C.; LaRoe, E.T.
1985-01-01
This classification, to be used in a new inventory of wetlands and deepwater habitats of the United States, is intended to describe ecological taxa, arrange them in a system useful to resource managers, furnish units for mapping, and provide uniformity of concepts and terms. Wetlands are defined by plants (hydrophytes), soils (hydric soils), and frequency of flooding. Ecologically related areas of deep water, traditionally not considered wetlands, are included in the classification as deepwater habitats. Systems form the highest level of the classification hierarchy; five are defined: Marine, Estuarine, Riverine, Lacustrine, and Palustrine. Marine and Estuarine Systems each have two Subsystems, Subtidal and Intertidal; the Riverine System has four Subsystems, Tidal, Lower Perennial, Upper Perennial, and Intermittent; the Lacustrine has two, Littoral and Limnetic; and the Palustrine has no Subsystems. Within the Subsystems, Classes are based on substrate material and flooding regime, or on vegetative life form. The same Classes may appear under one or more of the Systems or Subsystems. Six Classes are based on substrate and flooding regime: (1) Rock Bottom with a substrate of bedrock, boulders, or stones; (2) Unconsolidated Bottom with a substrate of cobbles, gravel, sand, mud, or organic material; (3) Rocky Shore with the same substrates as Rock Bottom; (4) Unconsolidated Shore with the same substrates as Unconsolidated Bottom; (5) Streambed with any of the substrates; and (6) Reef with a substrate composed of the living and dead remains of invertebrates (corals, mollusks, or worms). The bottom Classes, (1) and (2) above, are flooded all or most of the time and the shore Classes, (3) and (4), are exposed most of the time. The Class Streambed is restricted to channels of intermittent streams and tidal channels that are dewatered at low tide. The life form of the dominant vegetation defines the five Classes based on vegetative form: (1) Aquatic Bed, dominated by plants
Picard, K.; Nanson, R.; Huang, Z.; Nichol, S.; McCulloch, M.
2017-12-01
The acquisition of high-resolution marine geophysical data has intensified in recent years (e.g. multibeam echo-sounding, sub-bottom profiling). This progress provides the opportunity to classify and map the seafloor in greater detail, using new methods that preserve the links between processes and morphology. Geoscience Australia has developed a new genetic classification approach, nested within the Harris et al. (2014) global seafloor mapping framework. The approach divides parent units into sub-features based on established classification schemes and feature descriptors defined by Bradwell et al. (2016: http://nora.nerc.ac.uk/), the International Hydrographic Organization (https://www.iho.int) and the Coastal Marine and Ecological Classification Standard (https://www.cmecscatalog.org). Owing to the ecological significance of submarine canyon systems in particular, much recent attention has focused on defining their variation in form and process, whereby they can be classified using a range of topographic metrics, fluvial dis/connection and shelf-incising status. The Perth Canyon is incised into the continental slope and shelf of southwest Australia, covering an area of >1500 km2 and extending from 4700 m water depth to the shelf break at 170 m. The canyon sits within a Marine Protected Area, incorporating a Marine National Park and Habitat Protection Zone in recognition of its benthic and pelagic biodiversity values. However, detailed information on the spatial patterns of the seabed habitats that influence this biodiversity is lacking. Here we use 20 m resolution bathymetry and acoustic backscatter data acquired in 2015 by the Schmidt Ocean Institute, plus sub-bottom datasets and sediment samples collected by Geoscience Australia in 2005, to apply the new geomorphic classification system to the Perth Canyon. This presentation will show the results of the geomorphic feature mapping of the canyon and its application to better defining potential benthic habitats.
Second regime tokamak operation at large aspect ratio
International Nuclear Information System (INIS)
Navratil, G.A.
1989-01-01
This paper reviews the need for high beta in economic tokamak reactors and summarizes recent results on the scaling of the second regime beta limit for high-n ballooning modes using optimized pressure profiles, as well as results on low-n mode stability at the first regime beta limit from the Columbia HBT tokamak. While several experiments have studied ballooning limits using high εβp plasmas, the most important question for the use of the second stability regime for tokamak reactor improvement is how to achieve these high values of εβp while at the same time increasing the value of beta to several times the Troyon beta limit. An approach to the study of this key question on beta limits using modest sized, large aspect ratio tokamaks is described. (author). 28 refs, 7 figs, 1 tab
Classification of lung sounds using higher-order statistics: A divide-and-conquer approach.
Naves, Raphael; Barbosa, Bruno H G; Ferreira, Danton D
2016-06-01
Lung sound auscultation is one of the most commonly used methods to evaluate respiratory diseases. However, the effectiveness of this method depends on the physician's training. If the physician does not have the proper training, he/she will be unable to distinguish between normal and abnormal sounds generated by the human body. Thus, the aim of this study was to implement a pattern recognition system to classify lung sounds. We used a dataset composed of five types of lung sounds: normal, coarse crackle, fine crackle, monophonic and polyphonic wheezes. We used higher-order statistics (HOS) to extract features (second-, third- and fourth-order cumulants), Genetic Algorithms (GA) and Fisher's Discriminant Ratio (FDR) to reduce dimensionality, and k-Nearest Neighbors and Naive Bayes classifiers to recognize the lung sound events in a tree-based system. We used the cross-validation procedure to analyze the classifiers' performance and Tukey's Honestly Significant Difference criterion to compare the results. Our results showed that the Genetic Algorithms outperformed Fisher's Discriminant Ratio for feature selection. Moreover, each lung class had a different signature pattern according to its cumulants, showing that HOS is a promising feature extraction tool for lung sounds. Furthermore, the proposed divide-and-conquer approach can accurately classify different types of lung sounds. The classification accuracy obtained by the best tree-based classifier was 98.1% on training data and 94.6% on validation data. The proposed approach achieved good results even using only one feature extraction tool (higher-order statistics). Additionally, the implementation of the proposed classifier in an embedded system is feasible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
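The zero-lag second-, third- and fourth-order cumulants used as HOS features can be computed directly from a signal. This is a minimal sketch; the study may additionally use cumulants at non-zero lags:

```python
import numpy as np

def hos_features(x):
    """Zero-lag higher-order cumulants of a zero-meaned signal:
    c2 is the variance, c3 relates to skewness (zero for any symmetric
    distribution), c4 to excess kurtosis (zero for a Gaussian)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    c2 = np.mean(x ** 2)
    c3 = np.mean(x ** 3)
    c4 = np.mean(x ** 4) - 3.0 * c2 ** 2
    return np.array([c2, c3, c4])
```

Because c3 and c4 vanish for Gaussian noise, these features respond specifically to the non-Gaussian structure (crackles, wheezes) that distinguishes the lung sound classes.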
Murat, Miraemiliana; Chang, Siow-Wee; Abu, Arpah; Yap, Hwa Jen; Yong, Kien-Thai
2017-01-01
Plants play a crucial role in foodstuff, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, it is a difficult task to identify plant species because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable since it can help specialists as well as the public in identifying plant species easily. Shape descriptors were applied on the myDAUN dataset that contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on literature review, this is the first study in the development of tropical shrub species image dataset and classification using a hybrid of leaf shape and machine learning approach. Four types of shape descriptors were used in this study namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptor, as well as the combination of hybrid descriptors were tested and compared. The tropical shrub species are classified using six different classifiers, which are artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three types of feature selection methods were tested in the myDAUN dataset, Relief, Correlation-based feature selection (CFS) and Pearson's coefficient correlation (PCC). The well-known Flavia dataset and Swedish Leaf dataset were used as the validation dataset on the proposed methods. The results showed that the hybrid of all descriptors of ANN outperformed the other classifiers with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia dataset and 99
Transparent electrodes in the terahertz regime – a new approach
DEFF Research Database (Denmark)
Malureanu, Radu; Song, Z.; Zalkovskij, Maksim
We suggest a new possibility for obtaining a transparent metallic film, thus allowing for completely transparent electrodes. By placing a complementary composite layer on top of the electrode, we can cancel the back-scattering of the latter, thus obtaining a perfectly transparent structure. For ease of fabrication, we performed the first experiments in the THz regime, but the concept is applicable to the entire electromagnetic spectrum. We show that the experiments and theory match each other perfectly.
A Cointegrated Regime-Switching Model Approach with Jumps Applied to Natural Gas Futures Prices
Directory of Open Access Journals (Sweden)
Daniel Leonhardt
2017-09-01
Full Text Available Energy commodities and their futures naturally show cointegrated price movements. However, there is empirical evidence that the prices of futures with different maturities might have, e.g., different jump behaviours in different market situations. Observing commodity futures over time, there is also evidence for different states of the underlying volatility of the futures. In this paper, we therefore allow for cointegration of the term structure within a multi-factor model, which includes seasonality, as well as joint and individual jumps in the price processes of futures with different maturities. The seasonality in this model is realized via a deterministic function, and the jumps are represented with thinned-out compound Poisson processes. The model also includes a regime-switching approach that is modelled through a Markov chain and extends the class of geometric models. We show how the model can be calibrated to empirical data and give some practical applications.
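A toy simulation of a Markov regime-switching process with thinned Poisson jumps conveys the model class described above. All parameter values below are illustrative, and the seasonality and cointegration components of the full model are omitted:

```python
import numpy as np

def simulate_regime_switching(n=1000, seed=0):
    """Simulate a toy log-price path with two Markov volatility regimes
    (calm / volatile) and a thinned compound-Poisson jump component.
    All parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    P = np.array([[0.98, 0.02],          # regime transition matrix
                  [0.05, 0.95]])
    sigma = np.array([0.01, 0.04])       # per-regime diffusion volatility
    jump_prob = np.array([0.01, 0.10])   # per-regime jump intensity per step
    state, x = 0, 0.0
    states, path = [], []
    for _ in range(n):
        state = int(rng.choice(2, p=P[state]))    # Markov regime switch
        dx = sigma[state] * rng.normal()          # diffusion increment
        if rng.random() < jump_prob[state]:       # thinned Poisson arrival
            dx += rng.normal(0.0, 0.1)            # jump size
        x += dx
        states.append(state)
        path.append(x)
    return np.array(states), np.array(path)
```

In a calibration setting, the hidden regime path would be inferred (e.g. via the Hamilton filter) rather than observed as it is in this simulation.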
Modelling regime shifts in the southern Benguela: a frame-based ...
African Journals Online (AJOL)
Modelling regime shifts in the southern Benguela: a frame-based approach. MD Smith, A Jarre. Abstract. This study explores the usefulness of a frame-based modelling approach in the southern Benguela upwelling ecosystem, with four frames describing observed small pelagic fish dominance patterns. We modelled the ...
A Novel Approach to ECG Classification Based upon Two-Layered HMMs in Body Sensor Networks
Directory of Open Access Journals (Sweden)
Wei Liang
2014-03-01
Full Text Available This paper presents a novel approach to ECG signal filtering and classification. Unlike the traditional techniques which aim at collecting and processing the ECG signals with the patient being still, lying in bed in hospitals, our proposed algorithm is intentionally designed for monitoring and classifying the patient’s ECG signals in the free-living environment. The patients are equipped with wearable ambulatory devices the whole day, which facilitates the real-time heart attack detection. In ECG preprocessing, an integral-coefficient-band-stop (ICBS filter is applied, which omits time-consuming floating-point computations. In addition, two-layered Hidden Markov Models (HMMs are applied to achieve ECG feature extraction and classification. The periodic ECG waveforms are segmented into ISO intervals, P subwave, QRS complex and T subwave respectively in the first HMM layer where expert-annotation assisted Baum-Welch algorithm is utilized in HMM modeling. Then the corresponding interval features are selected and applied to categorize the ECG into normal type or abnormal type (PVC, APC in the second HMM layer. For verifying the effectiveness of our algorithm on abnormal signal detection, we have developed an ECG body sensor network (BSN platform, whereby real-time ECG signals are collected, transmitted, displayed and the corresponding classification outcomes are deduced and shown on the BSN screen.
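The second-layer decision described above (pick the class whose HMM assigns the highest likelihood to a beat's feature sequence) reduces to the forward algorithm. This is a minimal discrete-observation sketch with invented toy models, not the paper's two-layered, expert-annotated HMMs:

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log-likelihood of a discrete observation
    sequence under an HMM with initial probs pi, transitions A, emissions B."""
    alpha = pi * B[:, obs[0]]
    s = alpha.sum()
    loglik = np.log(s)
    alpha = alpha / s                 # rescale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        loglik += np.log(s)
        alpha = alpha / s
    return loglik

def classify_beat(obs, models):
    """Pick the class (e.g. normal vs. PVC) whose HMM gives the
    observation sequence the highest likelihood."""
    return max(models, key=lambda name: forward_loglik(obs, *models[name]))
```

In the paper's setting the observations would be interval features of the segmented P, QRS and T waves, and the per-class HMMs would be trained with the Baum-Welch algorithm.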
Pattern Recognition Approaches for Breast Cancer DCE-MRI Classification: A Systematic Review.
Fusco, Roberta; Sansone, Mario; Filice, Salvatore; Carone, Guglielmo; Amato, Daniela Maria; Sansone, Carlo; Petrillo, Antonella
2016-01-01
We performed a systematic review of several pattern analysis approaches for classifying breast lesions using dynamic, morphological, and textural features in dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). Several machine learning approaches, namely artificial neural networks (ANN), support vector machines (SVM), linear discriminant analysis (LDA), tree-based classifiers (TC), and Bayesian classifiers (BC), and features used for classification are described. The findings of a systematic review of 26 studies are presented. The sensitivity and specificity are respectively 91 and 83 % for ANN, 85 and 82 % for SVM, 96 and 85 % for LDA, 92 and 87 % for TC, and 82 and 85 % for BC. The sensitivity and specificity are respectively 82 and 74 % for dynamic features, 93 and 60 % for morphological features, 88 and 81 % for textural features, 95 and 86 % for a combination of dynamic and morphological features, and 88 and 84 % for a combination of dynamic, morphological, and other features. LDA and TC have the best performance. A combination of dynamic and morphological features gives the best performance.
Vafaee Sharbaf, Fatemeh; Mosafer, Sara; Moattar, Mohammad Hossein
2016-06-01
This paper proposes an approach for gene selection in microarray data. The proposed approach consists of a primary filter stage using the Fisher criterion, which reduces the initial set of genes and hence the search space and time complexity. Then, a wrapper approach based on cellular learning automata (CLA) optimized with the ant colony method (ACO) is used to find the set of features which improve the classification accuracy. CLA is applied due to its capability to learn and model complicated relationships. The features selected in the last phase are evaluated using the ROC curve, and the smallest, most effective feature subset is determined. The classifiers evaluated in the proposed framework are K-nearest neighbor, support vector machine, and naïve Bayes. The proposed approach is evaluated on 4 microarray datasets. The evaluations confirm that the proposed approach can find the smallest subset of genes while approaching the maximum accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
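The Fisher-criterion filter stage described above can be sketched as follows: for a two-class problem, each gene is scored by between-class separation over within-class scatter, and only the top-ranked genes pass on to the wrapper phase (the CLA/ACO wrapper itself is not reproduced here; names are illustrative):

```python
import numpy as np

def fisher_score(X, y):
    """Fisher criterion per feature for a two-class problem:
    (mu1 - mu2)^2 / (var1 + var2)."""
    X, y = np.asarray(X, float), np.asarray(y)
    c0, c1 = X[y == 0], X[y == 1]
    num = (c0.mean(axis=0) - c1.mean(axis=0)) ** 2
    den = c0.var(axis=0) + c1.var(axis=0) + 1e-12  # guard against zero variance
    return num / den

def filter_genes(X, y, k):
    """Keep the indices of the k genes with the highest Fisher score."""
    return np.argsort(fisher_score(X, y))[::-1][:k]
```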
Classification system for acute and chronic radiation treatment sequelae
International Nuclear Information System (INIS)
Seegenschmiedt, M.H.; Sauer, R.
1993-01-01
A classification system in the German language is proposed for scoring acute and chronic treatment sequelae after radiotherapy. It includes all important organs and organ systems. The proposed grading corresponds to the four-scale system of the WHO and UICC. The system is also compatible with the RTOG and EORTC acute and late radiation morbidity scoring criteria. This facilitates data transfer for retrospective and prospective analysis of monomodal and multimodal radiotherapy treatment regimes. We recommend using this scoring system in all German-speaking countries for multicentric prospective studies. It is possible that organ-specific refinements of the toxicity grading will be developed in the future. These additions should conform with (inter)national standards and apply the same four-scale grading of this classification system. (orig.)
Notes on the Emerging Accreditation Regimes in Australia and New Zealand
Boehringer, Kristian; Blyth, Sue; Scott, Fionna
2012-01-01
In recent years, new higher education regulatory regimes have emerged in both New Zealand and Australia. In Australia, the new Tertiary Education Quality and Standards Agency (TEQSA) employs a risk management approach while the New Zealand Quality Agency (NZQA) has adopted an evaluative approach. In practice, these varying approaches create real…
Precipitation regimes over central Greenland inferred from 5 years of ICECAPS observations
Pettersen, Claire; Bennartz, Ralf; Merrelli, Aronne J.; Shupe, Matthew D.; Turner, David D.; Walden, Von P.
2018-04-01
A novel method for classifying Arctic precipitation using ground-based remote sensors is presented. Using differences in the spectral variation of microwave absorption and scattering properties of cloud liquid water and ice, this method can distinguish between different types of snowfall events depending on the presence or absence of condensed liquid water in the clouds that generate the precipitation. The classification reveals two distinct, primary regimes of precipitation over the Greenland Ice Sheet (GIS): one originating from fully glaciated ice clouds and the other from mixed-phase clouds. Five years of co-located, multi-instrument data from the Integrated Characterization of Energy, Clouds, Atmospheric state, and Precipitation at Summit (ICECAPS) are used to examine cloud and meteorological properties and patterns associated with each precipitation regime. The occurrence and accumulation of the precipitation regimes are identified and quantified. Cloud and precipitation observations from additional ICECAPS instruments illustrate distinct characteristics for each regime. Additionally, reanalysis products and back-trajectory analysis show different synoptic-scale forcings associated with each regime. Precipitation over the central GIS exhibits unique microphysical characteristics due to the high surface elevations as well as connections to specific large-scale flow patterns. Snowfall originating from the ice clouds is coupled to deep, frontal cloud systems advecting up and over the southeast Greenland coast to the central GIS. These events appear to be associated with individual storm systems generated by low pressure over Baffin Bay and Greenland lee cyclogenesis. Snowfall originating from mixed-phase clouds is shallower and has characteristics typical of supercooled cloud liquid water layers, and slowly propagates from the south and southwest of Greenland along a quiescent flow above the GIS.
Urogenital tuberculosis: definition and classification.
Kulchavenya, Ekaterina
2014-10-01
To improve the approach to the diagnosis and management of urogenital tuberculosis (UGTB), we need a clear and unique classification. UGTB remains an important problem, especially in developing countries, but it is often an overlooked disease. As with any other infection, UGTB should be cured by antibacterial therapy, but because of late diagnosis it may often require surgery. The scientific literature dedicated to this problem was critically analyzed and juxtaposed with the author's own more than 30 years' experience in tuberculosis urology. The conception, terms and definitions were consolidated into one system; a stage-by-stage classification as well as complications are presented. Classification of any disease includes division into forms and stages and exact definitions for each stage. Clinical features and symptoms vary significantly between different forms and stages of UGTB. A simple diagnostic algorithm was constructed. UGTB is a multivariant disease, and a standard unified approach to it is impossible. A clear definition as well as a unique classification are necessary for a realistic estimation of its epidemiology and the optimization of therapy. The term 'UGTB' alone carries insufficient information to estimate therapy, surgery and prognosis, or to evaluate the epidemiology.
Building and Solving Odd-One-Out Classification Problems: A Systematic Approach
Ruiz, Philippe E.
2011-01-01
Classification problems ("find the odd-one-out") are frequently used as tests of inductive reasoning to evaluate human or animal intelligence. This paper introduces a systematic method for building the set of all possible classification problems, followed by a simple algorithm for solving the problems of the R-ASCM, a psychometric test derived…
On Internet Traffic Classification: A Two-Phased Machine Learning Approach
Directory of Open Access Journals (Sweden)
Taimur Bakhshi
2016-01-01
Traffic classification utilizing flow measurement enables operators to perform essential network management. Flow accounting methods such as NetFlow are, however, considered inadequate for classification, requiring additional packet-level information, host behaviour analysis, and specialized hardware, which limits their practical adoption. This paper aims to overcome these challenges by proposing a two-phased machine learning classification mechanism with NetFlow as input. The individual flow classes are derived per application through k-means and are further used to train a C5.0 decision tree classifier. As part of validation, the initial unsupervised phase used flow records of fifteen popular Internet applications that were collected and independently subjected to k-means clustering to determine the unique flow classes generated per application. The derived flow classes were afterwards used to train and test a supervised C5.0-based decision tree. The resulting classifier reported an average accuracy of 92.37% on approximately 3.4 million test cases, increasing to 96.67% with adaptive boosting. The classifier specificity factor, which accounted for differentiating content-specific from supplementary flows, ranged between 98.37% and 99.57%. Furthermore, the computational performance and accuracy of the proposed methodology in comparison with similar machine learning techniques leads us to recommend its extension to other applications in achieving highly granular real-time traffic classification.
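The two-phase idea above (unsupervised k-means to derive flow classes, then a supervised classifier trained on those cluster labels) can be sketched minimally as follows. A nearest-centroid rule stands in for the paper's C5.0 decision tree, and all names and data are illustrative:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: returns centroids and per-sample cluster labels."""
    rng = np.random.default_rng(seed)
    cent = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each flow record to its nearest centroid
        lab = np.argmin(((X[:, None] - cent[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(lab == j):
                cent[j] = X[lab == j].mean(axis=0)
    return cent, lab

# Phase 2 in the paper trains a C5.0 decision tree on the derived labels;
# here a nearest-centroid rule stands in for that supervised classifier.
def classify_flow(x, centroids):
    return int(np.argmin(((centroids - x) ** 2).sum(-1)))
```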
International Nuclear Information System (INIS)
Sharaevsky, L.G.; Sharaevskaya, E.I.; Domashev, E.D.; Arkhypov, A.P.; Kolochko, V.N.
2002-01-01
The paper deals with one of the acute problems for nuclear energy: the recognition of accident regimes of NPPs using noise signal diagnostics methodology. The methodology involves transformation of the random noise signals of the main technological parameters at the exit of a nuclear facility (neutron flux, dynamic pressure, etc.), which contain important information about the technical status of the equipment. Effective algorithms for the identification of random processes were developed. After proper transformation, the signals were considered as multidimensional random vectors. Automatic classification of these vectors in the developed algorithms is realized on the basis of probability functions, in particular a Bayes classifier and decision functions. Until now there have been no mathematical models for recognizing thermal-hydraulic regimes of fuel assemblies from the acoustic and neutron noise parameters in the core of nuclear facilities. Two mathematical models for the analysis of the random processes subjected to automatic classification are proposed: a statistical one (using a Bayes classifier on the acoustic spectral density of diagnostic signals) and a geometrical one (based on the formation of a dividing hyperplane in the feature space). The theoretical basis for the bubble boiling regimes in the fuel assemblies is formulated as the identification of these regimes from the random parameters of the auto spectral density (ASD) of acoustic noise measured in the fuel assemblies (dynamic pressure in the upper plenum in this paper). The elaborated algorithms allow recognition of the realistic status of the fuel assemblies. For verification of the proposed mathematical models, an analysis of experimental measurements was carried out. The study of boiling onset and the determination of local values of the flow parameters in a seven-beam fuel assembly (length of 1.3 m, diameter of 6 mm) have shown correct identification of the bubble boiling regimes. The experimental measurements on
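As a minimal illustration of the statistical model described above, a Bayes classifier operating on spectral-density features, a Gaussian class-conditional classifier might look as follows. This is a generic sketch under the assumption of independent Gaussian features, not the authors' actual implementation:

```python
import numpy as np

class GaussianBayes:
    """Minimal Bayes classifier with per-class independent Gaussians,
    a stand-in for the paper's statistical (Bayes) model on ASD features."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = {c: X[y == c].mean(0) for c in self.classes}
        self.var = {c: X[y == c].var(0) + 1e-9 for c in self.classes}
        self.prior = {c: np.mean(y == c) for c in self.classes}
        return self

    def predict(self, x):
        # pick the class maximizing log prior + Gaussian log-likelihood
        def loglik(c):
            return (np.log(self.prior[c])
                    - 0.5 * np.sum(np.log(2 * np.pi * self.var[c])
                                   + (x - self.mu[c]) ** 2 / self.var[c]))
        return max(self.classes, key=loglik)
```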
Land cover classification of Landsat 8 satellite data based on Fuzzy Logic approach
Taufik, Afirah; Sakinah Syed Ahmad, Sharifah
2016-06-01
The aim of this paper is to propose a method to classify the land covers of a satellite image based on a fuzzy rule-based system approach. The study uses bands in Landsat 8 and other indices, such as the Normalized Difference Water Index (NDWI), Normalized Difference Built-up Index (NDBI) and Normalized Difference Vegetation Index (NDVI), as input for the fuzzy inference system. The three selected indices represent our three main classes, called water, built-up land, and vegetation. The combination of the original multispectral bands and selected indices provides more information about the image. The parameter selection of the fuzzy membership functions is performed by using a supervised method known as ANFIS (adaptive neuro-fuzzy inference system) training. The fuzzy system is tested for the classification on the land cover image that covers the Klang Valley area. The results showed that the fuzzy system approach is effective and can be explored and implemented for other areas of Landsat data.
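The three indices feeding the fuzzy inference system are standard band ratios. The sketch below computes them from Landsat 8 reflectances and uses a crisp winner-takes-all rule as a stand-in for the paper's ANFIS-tuned fuzzy rules; the reflectance values are invented:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def ndwi(green, nir):
    """Normalized Difference Water Index (McFeeters form)."""
    return (green - nir) / (green + nir)

def ndbi(swir, nir):
    """Normalized Difference Built-up Index."""
    return (swir - nir) / (swir + nir)

def classify_pixel(green, red, nir, swir):
    """Crisp stand-in for the fuzzy rules: assign the class whose
    characteristic index responds most strongly."""
    scores = {"water": ndwi(green, nir),
              "vegetation": ndvi(nir, red),
              "built-up": ndbi(swir, nir)}
    return max(scores, key=scores.get)
```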
Interfacial area concentration in gas–liquid bubbly to churn-turbulent flow regime
International Nuclear Information System (INIS)
Ozar, B.; Dixit, A.; Chen, S.W.; Hibiki, T.; Ishii, M.
2012-01-01
Highlights: ► A systematic approach to predict the interfacial area concentration is presented. ► A two-group approach for categorizing bubbles is used. ► Prediction of Group-1 bubble size and void fraction are key elements of this work. ► The proposed approach compares well with selected databases. - Abstract: There are very few established correlations to predict the interfacial area concentration beyond the bubbly flow regime, in the cap-slug and churn-turbulent flow regimes. The present study shows a systematic approach to estimate the interfacial area concentration in the bubbly, cap-slug and churn-turbulent flow regimes. Ishii and Mishima’s (1980) formulation and the two-group approach for categorizing bubbles (Group-1: spherical or distorted bubbles, Group-2: cap bubbles) are used to estimate the interfacial area concentration. The key parameters in this framework are the estimation of the Group-1 bubble size and the amount of void in the liquid slug, which is a function of the Group-1 void fraction. Hibiki and Ishii’s (2002) correlation is utilized to predict the size of the Group-1 bubbles. A correlation is developed to estimate the Group-1 void fraction. The developed model for the estimation of interfacial area concentration is compared with three existing datasets: data for air–water flow taken in an annular geometry and a round tube, and data for an air–NaOH solution taken in a round tube. The estimation accuracies for these data sets are ±36.4%, ±26.5% and ±37.4%, respectively. These datasets cover a wide range of flow regimes and different physical properties.
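The backbone of such estimates is the geometric relation a_i = 6α/D_sm between void fraction and Sauter mean diameter, summed over the two bubble groups. The sketch below shows only this geometric step; the paper's closure correlations for Group-1 size and void fraction are not reproduced:

```python
def interfacial_area_concentration(alpha1, d_sm1, alpha2, d_sm2):
    """Two-group interfacial area concentration (1/m), using the
    standard geometric relation a_i = 6 * alpha / D_sm per bubble group.
    Group 1: spherical/distorted bubbles; Group 2: cap bubbles."""
    return 6.0 * alpha1 / d_sm1 + 6.0 * alpha2 / d_sm2
```

For example, a Group-1 void fraction of 0.1 with a 3 mm Sauter mean diameter and no Group-2 bubbles gives 200 m^-1.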
Contextual segment-based classification of airborne laser scanner data
Vosselman, George; Coenen, Maximilian; Rottensteiner, Franz
2017-01-01
Classification of point clouds is needed as a first step in the extraction of various types of geo-information from point clouds. We present a new approach to contextual classification of segmented airborne laser scanning data. Potential advantages of segment-based classification are easily offset
Zhang, Yong; Gong, Dun-Wei; Cheng, Jian
2017-01-01
Feature selection is an important data-preprocessing technique in classification problems such as bioinformatics and signal processing. Generally, there are some situations where a user is interested in not only maximizing the classification performance but also minimizing the cost that may be associated with features. This kind of problem is called cost-based feature selection. However, most existing feature selection approaches treat this task as a single-objective optimization problem. This paper presents the first study of multi-objective particle swarm optimization (PSO) for cost-based feature selection problems. The task of this paper is to generate a Pareto front of nondominated solutions, that is, feature subsets, to meet different requirements of decision-makers in real-world applications. In order to enhance the search capability of the proposed algorithm, a probability-based encoding technology and an effective hybrid operator, together with the ideas of the crowding distance, the external archive, and the Pareto domination relationship, are applied to PSO. The proposed PSO-based multi-objective feature selection algorithm is compared with several multi-objective feature selection algorithms on five benchmark datasets. Experimental results show that the proposed algorithm can automatically evolve a set of nondominated solutions, and it is a highly competitive feature selection method for solving cost-based feature selection problems.
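The Pareto machinery underlying such a method reduces to a dominance test over the two minimized objectives, classification error and feature cost. A minimal sketch (the PSO search itself is omitted; points are invented (error, cost) pairs):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the nondominated subset among (error, cost) pairs."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```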
Transition from weak wave turbulence regime to solitonic regime
Hassani, Roumaissa; Mordant, Nicolas
2017-11-01
The Weak Turbulence Theory (WTT) is a statistical theory describing the interaction of a large ensemble of random waves characterized by very different length scales. For both weak non-linearity and weak dispersion, a different regime is predicted in which solitons propagate while keeping their shape unchanged. The question under investigation here is which regime, weak turbulence or a soliton gas, the system chooses. We report an experimental investigation of wave turbulence at the surface of finite-depth water in the gravity-capillary range. We tune the wave dispersion and the level of nonlinearity by modifying the depth of water and the forcing, respectively. We use space-time resolved profilometry to reconstruct the deformed surface of the water. When decreasing the water depth, we observe a drastic transition between weak turbulence at the weakest forcing and a solitonic regime at stronger forcing. We characterize the transition between both states by studying their Fourier spectra. We also study the efficiency of energy transfer in the weak turbulence regime. We report a loss of efficiency of angular transfer as the dispersion of the waves is reduced, until the system bifurcates into the solitonic regime. This project has received funding from the European Research Council (ERC, Grant Agreement No. 647018-WATU).
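The control parameter here enters through the finite-depth gravity-capillary dispersion relation, which the sketch below evaluates; the surface tension and density are nominal clean-water figures, not the experiment's measured values:

```python
import math

def omega(k, depth, g=9.81, sigma=0.072, rho=1000.0):
    """Angular frequency (rad/s) of gravity-capillary waves on water of
    finite depth h: omega^2 = (g*k + sigma*k^3/rho) * tanh(k*h).
    Reducing the depth weakens the dispersion (tanh(k*h) -> k*h)."""
    return math.sqrt((g * k + sigma * k ** 3 / rho) * math.tanh(k * depth))
```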
AOSpine subaxial cervical spine injury classification system
Vaccaro, Alexander R.; Koerner, John D.; Radcliff, Kris E.; Oner, F. Cumhur; Reinhold, Maximilian; Schnake, Klaus J.; Kandziora, Frank; Fehlings, Michael G.; Dvorak, Marcel F.; Aarabi, Bizhan; Rajasekaran, Shanmuganathan; Schroeder, Gregory D.; Kepler, Christopher K.; Vialle, Luiz R.
2016-01-01
Purpose: This project describes a morphology-based subaxial cervical spine traumatic injury classification system. Using the same approach as the thoracolumbar system, the goal was to develop a comprehensive yet simple classification system with high intra- and interobserver reliability to be used
Three naive Bayes approaches for discrimination-free classification
Calders, T.G.K.; Verwer, S.E.
2010-01-01
In this paper, we investigate how to modify the naive Bayes classifier in order to perform classification that is restricted to be independent with respect to a given sensitive attribute. Such independency restrictions occur naturally when the decision process leading to the labels in the data-set
A Parallel Adaboost-Backpropagation Neural Network for Massive Image Dataset Classification
Cao, Jianfang; Chen, Lichao; Wang, Min; Shi, Hao; Tian, Yun
2016-01-01
Image classification uses computers to simulate human understanding and cognition of images by automatically categorizing images. This study proposes a faster image classification approach that parallelizes the traditional Adaboost-Backpropagation (BP) neural network using the MapReduce parallel programming model. First, we construct a strong classifier by assembling the outputs of 15 BP neural networks (which are individually regarded as weak classifiers) based on the Adaboost algorithm. Second, we design Map and Reduce tasks for both the parallel Adaboost-BP neural network and the feature extraction algorithm. Finally, we establish an automated classification model by building a Hadoop cluster. We use the Pascal VOC2007 and Caltech256 datasets to train and test the classification model. The results are superior to those obtained using traditional Adaboost-BP neural network or parallel BP neural network approaches. Our approach increased the average classification accuracy rate by approximately 14.5% and 26.0% compared to the traditional Adaboost-BP neural network and parallel BP neural network, respectively. Furthermore, the proposed approach requires less computation time and scales very well as evaluated by speedup, sizeup and scaleup. The proposed approach may provide a foundation for automated large-scale image classification and demonstrates practical value. PMID:27905520
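The Adaboost assembly step described above, weighting each weak classifier by its training error and taking a signed weighted vote, can be sketched as follows. The weak learners themselves (15 BP networks in the paper) are abstracted away; only the combination rule is shown:

```python
import math

def adaboost_alphas(errors):
    """Weight for each weak classifier from its weighted training error:
    alpha = 0.5 * ln((1 - err) / err). Lower error -> larger weight."""
    return [0.5 * math.log((1 - e) / e) for e in errors]

def strong_classify(weak_outputs, alphas):
    """Combine weak outputs in {-1, +1} into the strong decision by
    a signed, alpha-weighted vote."""
    s = sum(a * h for a, h in zip(alphas, weak_outputs))
    return 1 if s >= 0 else -1
```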
International regime formation: Ozone depletion and global climate change
International Nuclear Information System (INIS)
Busmann, N.E.
1994-03-01
Two theoretical perspectives, neorealism and neoliberal institutionalism, dominate in international relations. An assessment is made of whether these perspectives provide compelling explanations of why a regime with specific targets and timetables was formed for ozone depletion, while a regime with such specificity was not formed for global climate change. In so doing, the assumptions underlying neorealism and neoliberal institutionalism are examined. A preliminary assessment is offered of the policymaking and institutional bargaining process. Patterns of interstate behavior are evolving toward broader forms of cooperation, at least with regard to global environmental issues, although this process is both slow and cautious. State coalitions on specific issues are not yet powerful enough to create a strong community of states in which states are willing to devolve power to international institutions. It is shown that regime analysis is a useful analytic framework, but it should not be mistaken for theory. Regime analysis provides an organizational framework offering a set of questions regarding the principles and norms that govern cooperation and conflict in an issue area, and whether forces independent of states exist which affect the scope of state behavior. An examination of both neorealism and neoliberal institutionalism, embodied by four approaches to regime formation, demonstrates that neither has sufficient scope to account for contextual dynamics in either the ozone depletion or global climate change regime formation processes. 261 refs
Classification as clustering: a Pareto cooperative-competitive GP approach.
McIntyre, Andrew R; Heywood, Malcolm I
2011-01-01
Intuitively population based algorithms such as genetic programming provide a natural environment for supporting solutions that learn to decompose the overall task between multiple individuals, or a team. This work presents a framework for evolving teams without recourse to prespecifying the number of cooperating individuals. To do so, each individual evolves a mapping to a distribution of outcomes that, following clustering, establishes the parameterization of a (Gaussian) local membership function. This gives individuals the opportunity to represent subsets of tasks, where the overall task is that of classification under the supervised learning domain. Thus, rather than each team member representing an entire class, individuals are free to identify unique subsets of the overall classification task. The framework is supported by techniques from evolutionary multiobjective optimization (EMO) and Pareto competitive coevolution. EMO establishes the basis for encouraging individuals to provide accurate yet nonoverlapping behaviors; whereas competitive coevolution provides the mechanism for scaling to potentially large unbalanced datasets. Benchmarking is performed against recent examples of nonlinear SVM classifiers over 12 UCI datasets with between 150 and 200,000 training instances. Solutions from the proposed coevolutionary multiobjective GP framework appear to provide a good balance between classification performance and model complexity, especially as the dataset instance count increases.
Buoyancy-driven chaotic regimes during solute dispersion in pore networks
International Nuclear Information System (INIS)
Tsakiroglou, C.D.; Theodoropoulou, M.A.; Karoutsos, V.
2005-01-01
Periodic and quasi-periodic solute dispersion regimes are favored by relatively high Pe values and low degree of pore scale heterogeneities, whereas chaotic regimes are favored by low Pe values and high degree of pore-scale heterogeneities. Some ambiguity concerning the classification of the observed solute dispersion regimes is due to the fact that the short length of the time series does not allow the processing of datasets with the nonlinear methods of state-space analysis. (authors)
Directory of Open Access Journals (Sweden)
I. Dombrowsky
2008-02-01
At the same time, the Commission's reporting to the public served as an enforcement mechanism. From a methodological point of view, the paper highlights the opportunities and limitations of a combined quantitative and qualitative approach to determining regime effectiveness.
International Nuclear Information System (INIS)
Rank, Christopher M; Tremmel, Christoph; Hünemohr, Nora; Nagel, Armin M; Jäkel, Oliver; Greilich, Steffen
2013-01-01
In order to benefit from the highly conformal irradiation of tumors in ion radiotherapy, sophisticated treatment planning and simulation are required. The purpose of this study was to investigate the potential of MRI for ion radiotherapy treatment plan simulation and adaptation using a classification-based approach. Firstly, a voxelwise tissue classification was applied to derive pseudo CT numbers from MR images using up to 8 contrasts. Appropriate MR sequences and parameters were evaluated in cross-validation studies of three phantoms. Secondly, ion radiotherapy treatment plans were optimized using both MRI-based pseudo CT and reference CT and recalculated on reference CT. Finally, a target shift was simulated and a treatment plan adapted to the shift was optimized on a pseudo CT and compared to reference CT optimizations without plan adaptation. The derivation of pseudo CT values led to mean absolute errors in the range of 81–95 HU. The most significant deviations appeared at borders between air and different tissue classes and originated from partial volume effects. Simulations of ion radiotherapy treatment plans using pseudo CT for optimization revealed only small underdosages in distal regions of a target volume, with deviations of the mean dose of the PTV between 1.4–3.1% compared to reference CT optimizations. A plan adapted to the target volume shift and optimized on the pseudo CT exhibited a target dose coverage comparable to that of a non-adapted plan optimized on a reference CT. We were able to show that an MRI-based derivation of pseudo CT values using a purely statistical classification approach is feasible although no physical relationship exists. Large errors appeared at compact bone classes and came from an imperfect distinction of bones and other tissue types in MRI. In simulations of treatment plans, it was demonstrated that these deviations are comparable to uncertainties of a target volume shift of 2 mm in two directions indicating that especially
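The core of a classification-based pseudo CT is a lookup from voxel tissue class to a nominal CT number. The sketch below uses generic HU placeholders; the study's actual tissue classes and fitted values differ:

```python
# nominal CT numbers per tissue class (illustrative placeholders, in HU)
PSEUDO_HU = {"air": -1000, "fat": -100, "soft_tissue": 40, "bone": 700}

def pseudo_ct(class_map):
    """Replace each voxel's tissue label, as produced by the voxelwise
    MR classification, with its nominal CT number."""
    return [[PSEUDO_HU[c] for c in row] for row in class_map]
```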
Classification of iconic images
Zrianina, Mariia; Kopf, Stephan
2016-01-01
Iconic images represent an abstract topic and use a presentation that is intuitively understood within a certain cultural context. For example, the abstract topic “global warming” may be represented by a polar bear standing alone on an ice floe. Such images are widely used in media and their automatic classification can help to identify high-level semantic concepts. This paper presents a system for the classification of iconic images. It uses a variation of the Bag of Visual Words approach wi...
Directory of Open Access Journals (Sweden)
Pérez-Marín, D.
2013-04-01
The classification of Iberian pig carcasses into different commercial categories according to feeding regime was evaluated by means of a non-destructive analysis of the subcutaneous adipose tissue using Near Infrared Spectroscopy (NIRS). A quantitative approach was used to predict the Acorn-Grass Weight Gain Index (AGWGI), and a set of criteria was established for commercial classification purposes. A total of 719 animals belonging to various batches, reflecting a wide range of feeding regimes, production systems and years, were analyzed with a view to developing and evaluating quantitative NIRS models. Results for the external validation of these models indicate that NIRS made clear differentiation of batches as a function of three feeding regimes (Acorn, Recebo and Feed) possible with high accuracy, on the basis of the mean representative spectra of each batch. Moreover, individual analysis of the animals showed a broad consensus between field inspection information and the classification based on the AGWGI NIRS prediction, especially for the extreme categories (Acorn and Feed). [Spanish abstract, translated:] The classification of Iberian pig carcasses into different commercial categories according to feeding regime was evaluated through non-destructive analysis of subcutaneous adipose tissue samples by Near Infrared Spectroscopy (NIRS). Starting from a quantitative approach to predict the Montanera Replacement Index (IRM), a series of criteria was established for commercial classification. A total of 719 animals belonging to various batches, covering a wide variability of samples from different feeding regimes, seasons and production systems, were analyzed for the development and evaluation of the quantitative NIRS models. The external validation results of the models indicated that it is possible to discriminate with great accuracy between batches of different categories (Bellota, Recebo and Cebo), on the basis
THE NITROGEN REGIME OF THE SASAR RIVER, IN BAIA MARE SECTION, THE PERIOD 2000-2010
Directory of Open Access Journals (Sweden)
ADRIANA MUNTEAN
2011-03-01
The city of Baia Mare, the residence of Maramures county, is known as one of the most polluted cities in Romania, following a long history of mining activities and ore processing. The improper treatment of wastewater from flotation and from the cyanide treatment of ores, and its discharge into the Sasar river, led slowly but surely to the destruction of the ecosystem. In addition to mining activities, the metallurgical activities in the area have, of course, also contributed. One of the constant sources of pollution is the disposal of sewage wastewater (treated poorly or not at all) into the waters of the Sasar river. The nitrogen regime may provide clues as to the possibility of various forms of life developing, being an indicator of the nutrient regime of aquatic life. This study aims to assess the quality of the nitrogen regime of the Sasar river in the sections upstream and downstream of Baia Mare in the period 2000-2010, with reference to Order 161/2006 regarding the classification of surface water quality to determine the ecological status of water bodies.
Sentiment classification with interpolated information diffusion kernels
Raaijmakers, S.
2007-01-01
Information diffusion kernels - similarity metrics in non-Euclidean information spaces - have been found to produce state-of-the-art results for document classification. In this paper, we present a novel approach to global sentiment classification using these kernels. We carry out a large array of
Directory of Open Access Journals (Sweden)
Karayannis Nicholas V
2012-02-01
Background: Several classification schemes, each with its own philosophy and categorizing method, subgroup low back pain (LBP) patients with the intent to guide treatment. Physiotherapy-derived schemes usually have a movement impairment focus, but the extent to which other biological, psychological, and social factors of pain are encompassed requires exploration. Furthermore, within the prevailing 'biological' domain, the overlap of subgrouping strategies within the orthopaedic examination remains unexplored. The aim of this study was "to review and clarify through developer/expert survey, the theoretical basis and content of physical movement classification schemes, determine their relative reliability and similarities/differences, and to consider the extent of incorporation of the bio-psycho-social framework within the schemes". Methods: A database search for relevant articles related to LBP and subgrouping or classification was conducted. Five dominant movement-based schemes were identified: the Mechanical Diagnosis and Treatment (MDT), Treatment Based Classification (TBC), Pathoanatomic Based Classification (PBC), Movement System Impairment Classification (MSI), and O'Sullivan Classification System (OCS) schemes. Data were extracted and a survey sent to the classification scheme developers/experts to clarify operational criteria, reliability, decision-making, and converging/diverging elements between schemes. Survey results were integrated into the review and approval obtained for accuracy. Results: Considerable diversity exists between schemes in how movement informs subgrouping and in the consideration of broader neurosensory, cognitive, emotional, and behavioural dimensions of LBP. Despite differences in assessment philosophy, a common element lies in their objective to identify a movement pattern related to a pain reduction strategy. Two dominant movement paradigms emerge: (i) loading strategies (MDT, TBC, PBC) aimed at eliciting a phenomenon
EEG Eye State Identification Using Incremental Attribute Learning with Time-Series Classification
Directory of Open Access Journals (Sweden)
Ting Wang
2014-01-01
Full Text Available Eye state identification is a common time-series classification problem and a hot spot in recent research. Electroencephalography (EEG) is widely used in eye state classification to detect human cognition state. Previous research has validated the feasibility of machine learning and statistical approaches for EEG eye state classification. This paper proposes a novel approach for EEG eye state identification using incremental attribute learning (IAL) based on neural networks. IAL is a machine learning strategy which gradually imports and trains features one by one. Previous studies have verified that such an approach is applicable to a number of pattern recognition problems. However, little of this earlier work applied IAL to time-series problems, so it was still unknown whether IAL could cope with time-series problems like EEG eye state classification. Experimental results in this study demonstrate that, with proper feature extraction and feature ordering, IAL can not only efficiently cope with time-series classification problems, but also exhibit better classification performance, in terms of classification error rates, than conventional and some other approaches.
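As a rough illustration of the incremental attribute learning idea described above, the sketch below grows a feature set one attribute at a time (in a given order) and keeps a feature only if it reduces training error. The nearest-mean base classifier, the toy data, and the acceptance rule are all illustrative assumptions, not the paper's neural-network method.

```python
import statistics

def nearest_mean_predict(train, labels, sample, feats):
    """Assign `sample` to the class whose per-feature mean is closest,
    using only the feature indices in `feats`."""
    best, best_d = None, float("inf")
    for c in sorted(set(labels)):
        rows = [x for x, lab in zip(train, labels) if lab == c]
        d = sum((statistics.mean(r[f] for r in rows) - sample[f]) ** 2
                for f in feats)
        if d < best_d:
            best, best_d = c, d
    return best

def incremental_attribute_learning(train, labels, order):
    """Import attributes one by one; keep each only if it strictly
    reduces the training error of the growing feature set."""
    kept, best_err = [], float("inf")
    for f in order:
        trial = kept + [f]
        err = sum(nearest_mean_predict(train, labels, x, trial) != lab
                  for x, lab in zip(train, labels))
        if err < best_err:
            kept, best_err = trial, err
    return kept

# Toy data: feature 0 separates the classes, feature 1 is noise.
X = [[0.0, 5.0], [0.2, 1.0], [1.0, 4.8], [1.2, 1.2]]
y = [0, 0, 1, 1]
print(incremental_attribute_learning(X, y, order=[0, 1]))  # → [0]
```

On this toy data the noisy second attribute is rejected because it cannot improve on the already perfect training error, which mirrors why feature ordering matters in IAL.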
Multispectral Image classification using the theories of neural networks
International Nuclear Information System (INIS)
Ardisasmita, M.S.; Subki, M.I.R.
1997-01-01
Image classification is one of the important parts of digital image analysis. The objective of image classification is to identify and regroup the features occurring in an image into one or several classes in terms of the object. Basic to the understanding of multispectral classification is the concept of the spectral response of an object as a function of the electromagnetic radiation and the wavelength of the spectrum. New approaches to classification have been developed to improve the results of analysis; these state-of-the-art classifiers are based upon the theories of neural networks. Neural network classifiers are algorithms which mimic the computational abilities of the human brain. Artificial neurons are simple emulations of biological neurons; they take in information from sensors or other artificial neurons, perform very simple operations on these data, and pass the results on to other artificial neurons, which together learn to recognize the spectral signature of each image pixel. Neural network image classification has been divided into supervised and unsupervised training procedures. In the supervised approach, examples of each cover type can be located and the computer can compute spectral signatures to categorize all pixels in a digital image into several land cover classes. In unsupervised classification, spectral signatures are generated by mathematical grouping, which does not require analyst-specified training data. Thus, in the supervised approach we define useful information categories and then examine their spectral separability; in the unsupervised approach the computer determines spectrally separable classes and then we define their information value
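The supervised signature step described above can be sketched as a minimum-distance classifier: compute a mean spectral signature per training class, then assign each pixel to the nearest signature. The two-band toy pixels and class names below are invented for illustration; they stand in for real multispectral training data.

```python
def class_signatures(pixels, labels):
    """Mean spectral vector (signature) for each training class."""
    groups = {}
    for p, c in zip(pixels, labels):
        groups.setdefault(c, []).append(p)
    return {c: [sum(band) / len(rows) for band in zip(*rows)]
            for c, rows in groups.items()}

def classify(pixel, sigs):
    """Assign the pixel to the class whose signature is nearest
    (minimum squared Euclidean distance in spectral space)."""
    return min(sigs, key=lambda c: sum((a - b) ** 2
                                       for a, b in zip(sigs[c], pixel)))

# Toy two-band training pixels for hypothetical "water" and "veg" classes.
train = [[10, 20], [12, 22], [80, 60], [82, 58]]
labels = ["water", "water", "veg", "veg"]
sigs = class_signatures(train, labels)
print(classify([11, 21], sigs))  # → water
```

An unsupervised variant would first cluster the pixels to obtain the signatures and leave the analyst to name the resulting classes afterwards.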
A hierarchical approach of hybrid image classification for land use and land cover mapping
Directory of Open Access Journals (Sweden)
Rahdari Vahid
2018-01-01
Full Text Available Remote sensing data analysis can provide thematic maps describing land use and land cover (LULC) in a short period. Using a proper image classification method in an area is important to overcome the possible limitations of satellite imagery for producing land-use and land-cover maps. In the present study, a hierarchical hybrid image classification method was used to produce LULC maps from Landsat Thematic Mapper (TM) imagery for 1998 and Operational Land Imager (OLI) imagery for 2016. Images were classified using the proposed hybrid image classification method, a vegetation cover crown percentage map from the normalized difference vegetation index, Fisher supervised classification, and object-based image classification methods. Accuracy assessment results showed that the hybrid classification method produced maps with a total accuracy of up to 84 percent and a kappa statistic of 0.81. Results of this study showed that the proposed classification method worked better with the OLI sensor than with TM. Although OLI has a higher radiometric resolution than TM, the LULC map produced from TM is almost as accurate as the OLI map, because of the LULC definitions and image classification methods used.
MeMoVolc report on classification and dynamics of volcanic explosive eruptions
Bonadonna, C.; Cioni, R.; Costa, A.; Druitt, T.; Phillips, J.; Pioli, L.; Andronico, D.; Harris, A.; Scollo, S.; Bachmann, O.; Bagheri, G.; Biass, S.; Brogi, F.; Cashman, K.; Dominguez, L.; Dürig, T.; Galland, O.; Giordano, G.; Gudmundsson, M.; Hort, M.; Höskuldsson, A.; Houghton, B.; Komorowski, J. C.; Küppers, U.; Lacanna, G.; Le Pennec, J. L.; Macedonio, G.; Manga, M.; Manzella, I.; Vitturi, M. de'Michieli; Neri, A.; Pistolesi, M.; Polacci, M.; Ripepe, M.; Rossi, E.; Scheu, B.; Sulpizio, R.; Tripoli, B.; Valade, S.; Valentine, G.; Vidal, C.; Wallenstein, N.
2016-11-01
Classifications of volcanic eruptions were first introduced in the early twentieth century mostly based on qualitative observations of eruptive activity, and over time, they have gradually been developed to incorporate more quantitative descriptions of the eruptive products from both deposits and observations of active volcanoes. Progress in physical volcanology, and increased capability in monitoring, measuring and modelling of explosive eruptions, has highlighted shortcomings in the way we classify eruptions and triggered a debate around the need for eruption classification and the advantages and disadvantages of existing classification schemes. Here, we (i) review and assess existing classification schemes, focussing on subaerial eruptions; (ii) summarize the fundamental processes that drive and parameters that characterize explosive volcanism; (iii) identify and prioritize the main research that will improve the understanding, characterization and classification of volcanic eruptions and (iv) provide a roadmap for producing a rational and comprehensive classification scheme. In particular, classification schemes need to be objective-driven and simple enough to permit scientific exchange and promote transfer of knowledge beyond the scientific community. Schemes should be comprehensive and encompass a variety of products, eruptive styles and processes, including for example, lava flows, pyroclastic density currents, gas emissions and cinder cone or caldera formation. Open questions, processes and parameters that need to be addressed and better characterized in order to develop more comprehensive classification schemes and to advance our understanding of volcanic eruptions include conduit processes and dynamics, abrupt transitions in eruption regime, unsteadiness, eruption energy and energy balance.
Differential Classification of Dementia
Directory of Open Access Journals (Sweden)
E. Mohr
1995-01-01
Full Text Available In the absence of biological markers, dementia classification remains complex both in terms of characterization as well as early detection of the presence or absence of dementing symptoms, particularly in diseases with possible secondary dementia. An empirical, statistical approach using neuropsychological measures was therefore developed to distinguish demented from non-demented patients and to identify differential patterns of cognitive dysfunction in neurodegenerative disease. Age-scaled neurobehavioral test results (Wechsler Adult Intelligence Scale-Revised and Wechsler Memory Scale) from Alzheimer's (AD) and Huntington's (HD) patients, matched for intellectual disability, as well as normal controls were used to derive a classification formula. Stepwise discriminant analysis accurately (99% correct) distinguished controls from demented patients, and separated the two patient groups (79% correct). Variables discriminating between HD and AD patient groups consisted of complex psychomotor tasks, visuospatial function, attention and memory. The reliability of the classification formula was demonstrated with a new, independent sample of AD and HD patients which yielded virtually identical results (classification accuracy for dementia: 96%; AD versus HD: 78%). To validate the formula, the discriminant function was applied to Parkinson's (PD) patients, 38% of whom were classified as demented. The validity of the classification was demonstrated by significant PD subgroup differences on measures of dementia not included in the discriminant function. Moreover, a majority of demented PD patients (65%) were classified as having an HD-like pattern of cognitive deficits, in line with previous reports of the subcortical nature of PD dementia. This approach may thus be useful in classifying presence or absence of dementia and in discriminating between dementia subtypes in cases of secondary or coincidental dementia.
Directory of Open Access Journals (Sweden)
Y.Dorosh
2016-10-01
Full Text Available This article analyzes the legal framework of restrictions in land use and their regime-forming facilities (laws, regulations, rules, standards and classifications). It finds that the current classification provided for the conduct of the State Land Cadastre is flawed because it does not cover all kinds of restrictions, making it impossible to use for practical purposes. Therefore, we propose classifying territorial restrictions in the use of land by types and species. The classification confirms the expediency of distinguishing the meaningful component of land management projects that establish the limits of restrictions in land use and their regime-forming objects from the standard procedure of development for all types of project documentation provided by the Law of Ukraine "On Land Management". The article contains an updated block model for drafting land management projects that establish the limits of restrictions on land use and regime facilities. Such a project includes: (1) the drafting task; (2) an explanatory note; (3) the decision of the local government on drafting; (4) a characterization of the natural environment; (5) a certificate containing a summary of the land (territory); (6) a cartogram of agro-industrial groups of soils and steep slopes; (7) geodetic surveys and land management materials; (8) information on the current state of land use and protection (including restrictions on land use registered in the State Land Cadastre); (9) a description of the territory by established usage of land of the natural reserve fund and other environmental protection, health, recreational, historical, cultural and forestry purposes, land and water resources protection zones, restrictions in land use and their regime facilities; (10) within the limits of a settlement, a copy of the graphic part of the master plan of the settlement (if applicable), and outside the settlement, a copy of the appropriate planning documentation (if any) and a copy of the decision on the
Early signatures of regime shifts in gene expression dynamics
Pal, Mainak; Pal, Amit Kumar; Ghosh, Sayantari; Bose, Indrani
2013-06-01
Recently, a large number of studies have been carried out on the early signatures of sudden regime shifts in systems as diverse as ecosystems, financial markets, population biology and complex diseases. The signatures of regime shifts in gene expression dynamics are less systematically investigated. In this paper, we consider sudden regime shifts in the gene expression dynamics described by a fold-bifurcation model involving bistability and hysteresis. We consider two alternative models, models 1 and 2, of competence development in the bacterial population B. subtilis and determine some early signatures of the regime shifts between competence and noncompetence. We use both deterministic and stochastic formalisms for the purpose of our study. The early signatures studied include the critical slowing down as a transition point is approached, rising variance and the lag-1 autocorrelation function, skewness and a ratio of two mean first passage times. Some of the signatures could provide the experimental basis for distinguishing between bistability and excitability as the correct mechanism for the development of competence.
Early signatures of regime shifts in gene expression dynamics
International Nuclear Information System (INIS)
Pal, Mainak; Pal, Amit Kumar; Ghosh, Sayantari; Bose, Indrani
2013-01-01
Recently, a large number of studies have been carried out on the early signatures of sudden regime shifts in systems as diverse as ecosystems, financial markets, population biology and complex diseases. The signatures of regime shifts in gene expression dynamics are less systematically investigated. In this paper, we consider sudden regime shifts in the gene expression dynamics described by a fold-bifurcation model involving bistability and hysteresis. We consider two alternative models, models 1 and 2, of competence development in the bacterial population B. subtilis and determine some early signatures of the regime shifts between competence and noncompetence. We use both deterministic and stochastic formalisms for the purpose of our study. The early signatures studied include the critical slowing down as a transition point is approached, rising variance and the lag-1 autocorrelation function, skewness and a ratio of two mean first passage times. Some of the signatures could provide the experimental basis for distinguishing between bistability and excitability as the correct mechanism for the development of competence. (paper)
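A minimal sketch of the early signatures named above: rising variance, lag-1 autocorrelation, and skewness computed over a sliding window as a transition point is approached. The toy series is invented; a real analysis would use simulated or measured gene expression trajectories, and the mean-first-passage-time ratio is not sketched here.

```python
import statistics

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a series."""
    m = statistics.mean(x)
    num = sum((a - m) * (b - m) for a, b in zip(x[:-1], x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den

def skewness(x):
    """Population skewness (third standardized moment)."""
    m = statistics.mean(x)
    s = statistics.pstdev(x)
    return sum((a - m) ** 3 for a in x) / (len(x) * s ** 3)

def early_signatures(series, window):
    """Sliding-window (variance, lag-1 autocorrelation, skewness);
    rising values are candidate warnings of an approaching regime shift."""
    out = []
    for i in range(len(series) - window + 1):
        w = series[i:i + window]
        out.append((statistics.pvariance(w), lag1_autocorr(w), skewness(w)))
    return out

# Invented trajectory whose fluctuations grow toward the end,
# as expected near a fold bifurcation (critical slowing down).
series = [1.0, 1.1, 0.9, 1.0, 1.2, 0.5, 1.8, 0.2, 2.4, 0.1]
sig = early_signatures(series, window=5)
```

Here the variance of the last window exceeds that of the first, which is the qualitative pattern these indicators look for before a sudden shift.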
Malekzadeh, Shima; Roohi, Ehsan
2015-06-01
Here we aimed to investigate various droplet formation regimes in a two-dimensional T-junction microchannel geometry using the open-source software OpenFOAM. Two incompressible fluids, a continuous phase in the main channel and a dispersed phase in the lateral channel, have been considered. The interFoam solver was used to simulate laminar flow with two incompressible and isothermal phases. We evaluated the capability of the "Compressive Interface Capturing Scheme for Arbitrary Meshes" (CICSAM) volume of fluid (VOF) technique of OpenFOAM for modeling droplet formation and movement in different regimes. The flow behavior in the T-junction microchannel was examined over a wide range of capillary numbers (0.006 to 0.12), volume flow rate ratios (0.125, 0.25, 0.5), and contact angles (130° to 180°) in the squeezing, dripping and jetting regimes. The importance of parameters such as contact angle, capillary number, flow rate ratio, and Reynolds number at the time of separation, as well as in the formation of droplets, was investigated in different regimes. We found that droplet detachment time increases with increasing contact angle in the squeezing regime, while increasing the contact angle in the dripping regime decreases the droplet detachment time. We compare the roles of pressure gradient and shear stress forces in the droplet formation process in both the dripping and squeezing regimes in detail. We also provide a classification of two-phase flow regimes in the investigated T-junction microchannel in terms of three main parameters: flow rate ratio, contact angle, and capillary number.
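The three-parameter regime classification mentioned at the end of the abstract can be caricatured as a simple decision rule over capillary number, flow rate ratio, and contact angle. The cutoff values below are hypothetical placeholders for illustration only, not thresholds reported by the study.

```python
def droplet_regime(capillary, flow_ratio, contact_angle):
    """Toy T-junction regime rule. All numeric cutoffs are invented
    placeholders, NOT values from the paper."""
    if capillary < 0.01 and flow_ratio <= 0.5:
        return "squeezing"   # interface blocks the channel; pressure-driven pinch-off
    if capillary < 0.05 and contact_angle >= 150:
        return "dripping"    # shear detaches droplets near the junction
    return "jetting"         # dispersed phase extends as a jet before breakup

print(droplet_regime(0.006, 0.125, 180))  # → squeezing
```

A fitted version of such a rule would be obtained by labeling simulation runs across the studied parameter ranges and learning the regime boundaries from them.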
A decision-theoretic approach for segmental classification
Yau, Christopher; Holmes, Christopher C.
2013-01-01
This paper is concerned with statistical methods for the segmental classification of linear sequence data, where the task is to segment and classify the data according to an underlying hidden discrete state sequence. Such analysis is commonplace in the empirical sciences, including genomics, finance and speech processing. In particular, we are interested in answering the following question: given data $y$ and a statistical model $\pi(x,y)$ of the hidden states $x$, what should we report as the ...
Prakash, Bhaskaran David; Esuvaranathan, Kesavan; Ho, Paul C; Pasikanti, Kishore Kumar; Chan, Eric Chun Yong; Yap, Chun Wei
2013-05-21
A fully automated and computationally efficient Pearson's correlation change classification (APC3) approach is proposed and shown to have overall comparable performance, with both an average accuracy and an average AUC of 0.89 ± 0.08, but is 3.9 to 7 times faster, easier to use, and has low outlier susceptibility in contrast to other dimensional reduction and classification combinations using only the total ion chromatogram (TIC) intensities of GC/MS data. The use of only the TIC permits the possible application of APC3 to other metabonomic data such as LC/MS TICs or NMR spectra. A RapidMiner implementation is available for download at http://padel.nus.edu.sg/software/padelapc3.
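The correlation-matching core of a TIC-based classifier can be sketched as follows: assign a sample's total ion chromatogram to the class profile with which it has the highest Pearson correlation. This shows only the matching step with invented toy profiles; the published APC3 algorithm, which classifies by correlation change, is not reproduced here.

```python
import statistics

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length series."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a)
           * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def classify_tic(sample, class_profiles):
    """Assign a TIC to the class whose mean profile it correlates with best."""
    return max(class_profiles,
               key=lambda c: pearson(sample, class_profiles[c]))

# Invented mean TIC profiles for two hypothetical classes.
profiles = {"control": [1.0, 2.0, 3.0, 4.0], "case": [4.0, 3.0, 2.0, 1.0]}
sample = [1.1, 2.0, 2.9, 4.2]
print(classify_tic(sample, profiles))  # → control
```

Because Pearson correlation is scale- and offset-invariant, this matching is insensitive to overall intensity differences between runs, one reason correlation-based TIC methods are attractive.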
The rationale behind Pierre Duhem's natural classification.
Bhakthavatsalam, Sindhuja
2015-06-01
The central concern of this paper is the interpretation of Duhem's attitude towards physical theory. Based on his view that the classification of experimental laws yielded by theory progressively approaches a natural classification (a classification reflecting that of underlying realities), Duhem has been construed as a realist of sorts in recent literature. Here I argue that his positive attitude towards the theoretic classification of laws had rather to do with the pragmatic rationality of the physicist. Duhem's idea of natural classification was an intuitive idea in the mind of the physicist that had to be affirmed in order to justify the physicist's pursuit of theory. Copyright © 2015 Elsevier Ltd. All rights reserved.
CLASSIFICATION ALGORITHMS FOR BIG DATA ANALYSIS, A MAP REDUCE APPROACH
Directory of Open Access Journals (Sweden)
V. A. Ayma
2015-03-01
Full Text Available For many years, the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made. Besides this issue, the increasing amount of data generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
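A toy sketch of the MapReduce pattern for distributed classification: the map step fits a small model on each data partition and the reduce step combines the per-partition models by majority vote at prediction time. The per-class-mean model and the tiny data splits are invented stand-ins; this does not reproduce the Hadoop/WEKA implementation described above.

```python
from collections import Counter

def map_train(partition):
    """Map step: fit a tiny model (per-class feature means) on one split."""
    sums, counts = {}, Counter()
    for x, y in partition:
        counts[y] += 1
        s = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(model, x):
    """Nearest class mean under one partition's model."""
    return min(model, key=lambda c: sum((a - b) ** 2
                                        for a, b in zip(model[c], x)))

def reduce_vote(models, x):
    """Reduce step: combine per-partition models by majority vote."""
    return Counter(predict(m, x) for m in models).most_common(1)[0][0]

# Two partitions of a toy training set, as a cluster scheduler might split it.
p1 = [([0.0], "a"), ([1.0], "b")]
p2 = [([0.2], "a"), ([0.9], "b")]
models = [map_train(p) for p in (p1, p2)]
print(reduce_vote(models, [0.1]))  # → a
```

The design point is that map_train touches only its own partition, so training parallelizes trivially; only the small fitted models travel to the reducer.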
Mapping Land Management Regimes in Western Ukraine Using Optical and SAR Data
Directory of Open Access Journals (Sweden)
Jan Stefanski
2014-06-01
Full Text Available The global demand for agricultural products is surging due to population growth, more meat-based diets, and the increasing role of bioenergy. Three strategies can increase agricultural production: (1) expanding agriculture into natural ecosystems; (2) intensifying existing farmland; or (3) recultivating abandoned farmland. Because agricultural expansion entails substantial environmental trade-offs, intensification and recultivation are currently gaining increasing attention. Assessing where these strategies may be pursued, however, requires improved spatial information on land use intensity, including where farmland is active and fallow. We developed a framework to integrate optical and radar data in order to advance the mapping of three farmland management regimes: (1) large-scale, mechanized agriculture; (2) small-scale, subsistence agriculture; and (3) fallow or abandoned farmland. We applied this framework to our study area in western Ukraine, a region characterized by marked spatial heterogeneity in management intensity due to the legacies of Soviet land management, the breakdown of the Soviet Union in 1991, and the recent integration of this region into world markets. We mapped land management regimes using a hierarchical, object-based framework. Image segmentation for delineating objects was performed using the Superpixel Contour algorithm. We then applied Random Forest classification to map land management regimes and validated our map using randomly sampled in-situ data obtained during an extensive field campaign. Our results showed that farmland management regimes were mapped reliably, resulting in a final map with an overall accuracy of 83.4%. Comparing our land management regimes map with a soil map revealed that most fallow land occurred on soils marginally suited for agriculture, but some areas within our study region contained considerable potential for recultivation. Overall, our study highlights the potential for an improved
Ranking Regime and the Future of Vernacular Scholarship
Ishikawa, Mayumi
2014-01-01
World university rankings and their global popularity present a number of far-reaching impacts for vernacular scholarship. This article employs a multidimensional approach to analyze the ranking regime's threat to local scholarship and knowledge construction through a study of Japanese research universities. First, local conditions that have led…
Classifications of track structures
International Nuclear Information System (INIS)
Paretzke, H.G.
1984-01-01
When ionizing particles interact with matter they produce random topological structures of primary activations which represent the initial boundary conditions for all subsequent physical, chemical and/or biological reactions. There are two important aspects of research on such track structures: their experimental or theoretical determination on the one hand, and the quantitative classification of these complex structures on the other, the latter being a basic prerequisite for understanding the mechanisms of radiation action. This paper deals only with the latter topic, i.e. the problems encountered in, and possible approaches to, the quantitative ordering and grouping of these multidimensional objects by their degrees of similarity with respect to their efficiency in producing certain final radiation effects, i.e. their "radiation quality." Various attempts at taxonometric classification with respect to radiation efficiency have been made in basic and applied radiation research, including macro- and microdosimetric concepts as well as track entities and stopping-power-based theories. This paper does not review those well-known approaches, but rather outlines and discusses alternative methods new to this field of radiation research which have some very promising features and which could possibly solve at least some major classification problems
Decision tree approach for classification of remotely sensed satellite ...
Indian Academy of Sciences (India)
sensed satellite data using open source support. Richa Sharma .... Decision tree classification techniques have been .... the USGS Earth Resource Observation Systems. (EROS) ... for shallow water, 11% were for sparse and dense built-up ...
Classification of multiple sclerosis lesions using adaptive dictionary learning.
Deshpande, Hrishikesh; Maurel, Pierre; Barillot, Christian
2015-12-01
This paper presents a sparse representation and adaptive dictionary learning based method for automated classification of multiple sclerosis (MS) lesions in magnetic resonance (MR) images. Manual delineation of MS lesions is a time-consuming task, requiring neuroradiology experts to analyze huge volumes of MR data. This, in addition to the high intra- and inter-observer variability, necessitates automated MS lesion classification methods. Among the many image representation models and classification methods that can be used for this purpose, we investigate the use of sparse modeling. In recent years, sparse representation has evolved as a tool for modeling data using a few basis elements of an over-complete dictionary and has found applications in many image processing tasks, including classification. We propose a supervised classification approach that learns dictionaries specific to the lesions and to the individual healthy brain tissues, which include white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF). The size of the dictionary learned for each class plays a major role in data representation, and it is an even more crucial element in the case of competitive classification. Our approach adapts the size of the dictionary for each class, depending on the complexity of the underlying data. The algorithm is validated using 52 multi-sequence MR images acquired from 13 MS patients. The results demonstrate the effectiveness of our approach in MS lesion classification. Copyright © 2015 Elsevier Ltd. All rights reserved.
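A degenerate sketch of classification with class-specific dictionaries: each class's dictionary holds a few atoms (here simply raw training vectors), and a sample is assigned to the class whose dictionary reconstructs it with the smallest residual at sparsity level one (a single scaled atom). The adaptive dictionary sizing and dictionary learning of the paper are not reproduced, and the tissue vectors are invented toy data.

```python
def residual(atoms, x):
    """Smallest reconstruction error of x using one scaled atom
    (sparsity level one, least-squares coefficient per atom)."""
    best = float("inf")
    for d in atoms:
        dd = sum(v * v for v in d)
        if dd == 0:
            continue
        a = sum(u * v for u, v in zip(d, x)) / dd   # least-squares coefficient
        err = sum((xi - a * di) ** 2 for xi, di in zip(x, d))
        best = min(best, err)
    return best

def classify(dictionaries, x):
    """Assign x to the class whose dictionary reconstructs it best."""
    return min(dictionaries, key=lambda c: residual(dictionaries[c], x))

# Toy per-class dictionaries: atoms are illustrative feature vectors.
dicts = {"lesion": [[1.0, 0.0], [0.9, 0.1]],
         "wm":     [[0.0, 1.0], [0.1, 0.9]]}
print(classify(dicts, [0.95, 0.05]))  # → lesion
```

In the full method the atoms are learned (not copied from training data) and the number of atoms per class is adapted to the complexity of that class, which is what makes the competitive residual comparison fair.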
Directory of Open Access Journals (Sweden)
Gabriel Cepaluni
2005-06-01
Full Text Available This article demonstrates that there is not one international regime theory, but a set of theoretical and empirical studies that, alone or together, do not constitute a "general theory" of International Relations. Three approaches summarise the main debates on regimes: structural realism, neoliberalism, and cognitivism. The first perspective, the realist, considers power as the main concept for explaining international regimes. Neoliberalism considers interest as the main analytical tool to understand the creation and maintenance of regimes. Finally, cognitivism places ideas and values at the center of its explanations. After establishing these perspectives, the pharmaceutical patents dispute between Brazil and the United States (1988-2001) is analyzed, utilizing insights gained from the study of international regimes, privileging the neoliberal approach. Building on the conflict between Brazil and the United States, the article also outlines some strategies that developing countries can use to maximize their gains in the international arena.
Characteristics and application study of AP1000 NPPs equipment reliability classification method
International Nuclear Information System (INIS)
Guan Gao
2013-01-01
The AP1000 nuclear power plant applies an integrated approach to establish equipment reliability classification, which includes probabilistic risk assessment techniques, Maintenance Rule administration, power production reliability classification and the functional equipment group bounding method, and ultimately classifies equipment reliability into four levels. This classification process and its results are very different from those of classical RCM and streamlined RCM. This paper studies the characteristics of the AP1000 equipment reliability classification approach, argues that equipment reliability classification should effectively support maintenance strategy development and work process control, and recommends using a combined RCM method to establish the future equipment reliability programs of AP1000 nuclear power plants. (authors)
Decision tree approach for classification of remotely sensed satellite
Indian Academy of Sciences (India)
DTC) algorithm for classification of remotely sensed satellite data (Landsat TM) using open source support. The decision tree is constructed by recursively partitioning the spectral distribution of the training dataset using WEKA, open source ...
Fixing extensions to general relativity in the nonlinear regime
Cayuso, Juan; Ortiz, Néstor; Lehner, Luis
2017-10-01
The question of what gravitational theory could supersede General Relativity has been central in theoretical physics for decades. Many disparate alternatives have been proposed, motivated by cosmology, quantum gravity and phenomenological considerations, and have been subjected to tests derived from cosmological, solar system and pulsar observations, typically restricted to linearized regimes. Gravitational waves from compact binaries provide new opportunities to probe these theories in the strongly gravitating/highly dynamical regimes. To this end, however, a reliable understanding of the dynamics in such regimes is required. Unfortunately, most of these theories fail to define well-posed initial value problems, which at face value prevents them from meeting this challenge. In this work, we introduce a consistent program able to remedy this situation. This program is inspired by the approach to "fixing" viscous relativistic hydrodynamics introduced by Israel and Stewart in the late 1970s. We illustrate how to implement this approach to control undesirable effects of higher-order derivatives in gravity theories, and argue that the modified system still captures the true dynamics of the putative underlying theories in 3+1 dimensions. We sketch the implementation of this idea in a couple of effective theories of gravity, one in the context of Noncommutative Geometry and one in the context of Chern-Simons modified General Relativity.
RESEARCH OF CLASSIFICATION FEATURES OF THE FINANCIAL CONTROL
Directory of Open Access Journals (Sweden)
Knarik K. Arabyan
2013-01-01
Full Text Available One of the major problems in financial control theory is the improvement of classification features. There is no consensus concerning the classification of the forms and methods of financial control, and this hinders the development of the methodology and the investigation of other issues in financial control theory. In this article, the author summarizes scholars' approaches to studying the classification features of financial control.
MODEL-BASED CLUSTERING FOR CLASSIFICATION OF AQUATIC SYSTEMS AND DIAGNOSIS OF ECOLOGICAL STRESS
Clustering approaches were developed using the classification likelihood, the mixture likelihood, and also using a randomization approach with a model index. Using a clustering approach based on the mixture and classification likelihoods, we have developed an algorithm that...
Directory of Open Access Journals (Sweden)
B Ritschel
2012-10-01
Full Text Available The Semantic Web is a W3C approach that integrates the different sources of semantics within documents and services using ontology-based techniques. The main objective of this approach in the geoscience domain is the improvement of understanding, integration, and usage of Earth and space science related web content in terms of data, information, and knowledge for machines and people. The modeling and representation of semantic attributes and relations within and among documents can be realized by human readable concept maps and machine readable OWL documents. The objectives for the usage of the Semantic Web approach in the GFZ data center ISDC project are the design of an extended classification of metadata documents for product types related to instruments, platforms, and projects as well as the integration of different types of metadata related to data product providers, users, and data centers. Sources of content and semantics for the description of Earth and space science product types and related classes are standardized metadata documents (e.g., DIF documents, publications, grey literature, and Web pages. Other sources are information provided by users, such as tagging data and social navigation information. The integration of controlled vocabularies as well as folksonomies plays an important role in the design of well formed ontologies.
Two-Stage Classification Approach for Human Detection in Camera Video in Bulk Ports
Directory of Open Access Journals (Sweden)
Mi Chao
2015-09-01
Full Text Available With the development of automation in ports, video surveillance systems with automated human detection have begun to be applied in open-air handling operation areas for safety and security. The accuracy of traditional camera-based human detection is not high enough to meet the requirements of operation surveillance. One of the key reasons is that the Histograms of Oriented Gradients (HOG) features of a human body differ greatly between front-and-back standing (F&B) and side standing (Side) postures. Consequently, when HOG features are extracted directly from samples of different postures, training yields only a few specific features that contribute to classification, which is insufficient to support effective classification. This paper proposes a two-stage classification method to improve the accuracy of human detection. In the first (preprocessing) stage, images are divided into possible F&B human bodies and non-F&B bodies; the latter are then passed to a second stage that distinguishes Side human bodies from non-human objects. Experimental results from the Tianjin port show that the two-stage classifier clearly improves the classification accuracy of human detection.
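The HOG descriptor discussed above can be illustrated with a minimal single-cell sketch in plain NumPy. The 9-bin, unsigned-orientation layout follows the common HOG convention; this is an illustration of the feature itself, not the paper's detection pipeline, and the function name is ours:

```python
import numpy as np

def hog_cell(patch, n_bins=9):
    """Orientation histogram of gradients for one image patch (a single HOG cell)."""
    gy, gx = np.gradient(patch.astype(float))         # row and column gradients
    mag = np.hypot(gx, gy)                            # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180        # unsigned orientation in [0, 180)
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, 180), weights=mag)
    return hist / (np.linalg.norm(hist) + 1e-9)       # L2-normalized cell descriptor

# A patch with a pure horizontal intensity ramp concentrates all weight in the 0-degree bin.
ramp = np.tile(np.arange(8.0), (8, 1))
print(hog_cell(ramp))
```

In a full detector, such cell histograms are concatenated over a dense grid and fed to a classifier, which is where the posture-dependent feature differences described above arise.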
Metabolic responses of Eucalyptus species to different temperature regimes
Mokochinski, Joao Benhur; Mazzafera, Paulo; Sawaya, Alexandra Christine Helena Frankland; Mumm, Roland; Vos, de Ric Cornelis Hendricus; Hall, Robert David
2018-01-01
Species and hybrids of Eucalyptus are the world's most widely planted hardwood trees. They are cultivated across a wide range of latitudes and therefore environmental conditions. In this context, comprehensive metabolomics approaches have been used to assess how different temperature regimes may
A Literature Survey of Early Time Series Classification and Deep Learning
Santos, Tiago; Kern, Roman
2017-01-01
This paper provides an overview of the current literature on time series classification approaches, in particular early time series classification. A very common and effective time series classification approach is the 1-Nearest Neighbor classifier, with different distance measures such as the Euclidean or dynamic time warping distances. This paper starts by reviewing these baseline methods. More recently, with the gain in popularity of deep neural networks, their application to the field of...
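The 1-Nearest Neighbor baseline with dynamic time warping can be sketched in a few lines of NumPy. This is the O(nm) textbook DTW recurrence, without the lower-bounding and windowing speedups used in practice:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def one_nn_classify(train_X, train_y, query, dist=dtw_distance):
    """Label the query series with the class of its nearest training series."""
    dists = [dist(x, query) for x in train_X]
    return train_y[int(np.argmin(dists))]
```

Because DTW aligns series elastically in time, a phase-shifted copy of a training series is still matched to its own class, which is exactly what makes this baseline hard to beat.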
The NPT regime, present and future global security: an American view
International Nuclear Information System (INIS)
Thompson, Sam.
1987-01-01
Although not perfect, an international non-proliferation regime as set out by the IAEA and Non-Proliferation Treaty is in existence. The history of the involvement of the United States in the development of this regime is mentioned as a background to explaining the current approach of the Reagan Administration to non-proliferation. Trends and challenges which may affect future global security are then identified and discussed. The author is optimistic about the future. (U.K.)
Featureless classification of light curves
Kügler, S. D.; Gianniotis, N.; Polsterer, K. L.
2015-08-01
In the era of rapidly increasing amounts of time series data, classification of variable objects has become the main objective of time-domain astronomy. Classification of irregularly sampled time series is particularly difficult because the data cannot be represented naturally as a vector which can be directly fed into a classifier. In the literature, various statistical features serve as vector representations. In this work, we represent time series by a density model. The density model captures all the information available, including measurement errors. Hence, we view this model as a generalization of the static features, which can be derived directly from the density, e.g. as moments. Similarity between each pair of time series is quantified by the distance between their respective models. Classification is performed on the obtained distance matrix. In the numerical experiments, we use data from the OGLE (Optical Gravitational Lensing Experiment) and ASAS (All Sky Automated Survey) surveys and demonstrate that the proposed representation performs up to par with the best currently used feature-based approaches. The density representation preserves all static information present in the observational data, in contrast to a less-complete description by features. The density representation is an upper bound in terms of the information made available to the classifier. Consequently, the predictive power of the proposed classification depends only on the choice of similarity measure and classifier. Due to its principled nature, we advocate that this new approach of representing time series has potential in tasks beyond classification, e.g. unsupervised learning.
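The density idea can be sketched minimally: fit a kernel density estimate to each object's observations and compare densities with an L2 distance on a grid. This is an illustrative stand-in, not the paper's actual density model or similarity measure:

```python
import numpy as np
from scipy.stats import gaussian_kde

def density_distance(samples_a, samples_b, grid=None):
    """L2 distance between Gaussian KDEs fitted to two observation sets."""
    if grid is None:
        lo = min(samples_a.min(), samples_b.min()) - 1.0
        hi = max(samples_a.max(), samples_b.max()) + 1.0
        grid = np.linspace(lo, hi, 400)
    pa = gaussian_kde(samples_a)(grid)
    pb = gaussian_kde(samples_b)(grid)
    return float(np.sqrt(np.sum((pa - pb) ** 2) * (grid[1] - grid[0])))

rng = np.random.default_rng(0)
print(density_distance(rng.normal(0, 1, 300), rng.normal(3, 1, 300)))
```

The resulting pairwise distance matrix can then be fed to any distance-based classifier (e.g. nearest neighbors with a precomputed metric).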
A Neural-Network-Based Approach to White Blood Cell Classification
Directory of Open Access Journals (Sweden)
Mu-Chun Su
2014-01-01
Full Text Available This paper presents a new white blood cell classification system for the recognition of five types of white blood cells. We propose a new algorithm for the segmentation of white blood cells from smear images. The core idea of the proposed segmentation algorithm is to find a discriminating region of white blood cells in the HSI color space. Pixels with color lying in the discriminating region, described by an ellipsoidal region, are regarded as the nucleus and granule of the cytoplasm of a white blood cell. Then, through a further morphological process, we can segment a white blood cell from a smear image. Three kinds of features (i.e., geometrical features, color features, and LDP-based texture features) are extracted from the segmented cell. These features are fed into three different kinds of neural networks to recognize the types of the white blood cells. To test the effectiveness of the proposed white blood cell classification system, a total of 450 white blood cell images were used. The highest overall correct recognition rate reached 99.11%. Simulation results showed that the proposed white blood cell classification system was very competitive with some existing systems.
In silico prediction of ROCK II inhibitors by different classification approaches.
Cai, Chuipu; Wu, Qihui; Luo, Yunxia; Ma, Huili; Shen, Jiangang; Zhang, Yongbin; Yang, Lei; Chen, Yunbo; Wen, Zehuai; Wang, Qi
2017-11-01
ROCK II is an important pharmacological target linked to central nervous system disorders such as Alzheimer's disease. The purpose of this research is to generate ROCK II inhibitor prediction models with machine learning approaches. Firstly, four sets of descriptors were calculated with MOE 2010 and PaDEL-Descriptor, and optimized by F-score and linear forward selection methods. In addition, four classification algorithms, k-nearest neighbors [Formula: see text], naïve Bayes, random forest, and support vector machine, were used to initially build 16 classifiers. Furthermore, three sets of structural fingerprint descriptors were introduced to enhance the predictive capacity of the classifiers, which were assessed with fivefold cross-validation, test set validation and external test set validation. The best two models, MFK + MACCS and MLR + SubFP, both achieved MCC values of 0.925 on the external test set. After that, a privileged substructure analysis was performed to reveal common chemical features of ROCK II inhibitors. Finally, binding modes were analyzed to identify relationships between molecular descriptors and activity, while main interactions were revealed by comparing the docking interactions of the most potent and the weakest ROCK II inhibitors. To the best of our knowledge, this is the first report on ROCK II inhibitors utilizing machine learning approaches, and it provides a new method for discovering novel ROCK II inhibitors.
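The fivefold cross-validation with Matthews correlation coefficient (MCC) scoring used above can be sketched with scikit-learn. The synthetic descriptor table stands in for the MOE/PaDEL descriptors, and the random forest is just one of the four algorithm families named in the abstract:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import make_scorer, matthews_corrcoef
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a descriptor table (rows = molecules, columns = descriptors).
X, y = make_classification(n_samples=300, n_features=20, n_informative=8, random_state=0)

mcc = make_scorer(matthews_corrcoef)
scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5, scoring=mcc)
print(scores.mean())
```

MCC is well suited here because, unlike accuracy, it stays informative when actives and inactives are imbalanced, which is typical of inhibitor screening data.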
Nonlinear programming for classification problems in machine learning
Astorino, Annabella; Fuduli, Antonio; Gaudioso, Manlio
2016-10-01
We survey some nonlinear models for classification problems arising in machine learning. In recent years this field has become increasingly relevant owing to many practical applications, such as text and web classification, object recognition in machine vision, gene expression profile analysis, DNA and protein analysis, medical diagnosis, customer profiling, etc. Classification deals with the separation of sets by means of appropriate separation surfaces, which is generally obtained by solving a numerical optimization model. While linear separability underlies the most popular approach to classification, the Support Vector Machine (SVM), nonlinear separating surfaces have recently received some attention. The objective of this work is to recall some of these proposals, mainly in terms of the numerical optimization models. In particular we tackle the polyhedral, ellipsoidal, spherical and conical separation approaches and, for some of them, we also consider the semisupervised versions.
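The need for nonlinear (e.g., spherical) separating surfaces can be illustrated with a small scikit-learn sketch: a linear SVM fails on two concentric classes that a kernelized (nonlinear) surface separates easily. This is an illustration of the phenomenon, not one of the survey's specific optimization models:

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric rings: no straight line separates them, but a circle does.
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf").fit(X, y)       # RBF kernel yields a closed, sphere-like boundary
print(linear.score(X, y), rbf.score(X, y))
```

The RBF kernel implicitly builds exactly the kind of closed separating surface that the spherical separation models in the survey construct explicitly.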
Proshutinsky, Andrey; Dukhovskoy, Dmitry; Timmermans, Mary-Louise; Krishfield, Richard; Bamber, Jonathan L
2015-10-13
Between 1948 and 1996, mean annual environmental parameters in the Arctic experienced a well-pronounced decadal variability with two basic circulation patterns: cyclonic and anticyclonic, alternating at 5 to 7 year intervals. During cyclonic regimes, low sea-level atmospheric pressure (SLP) dominated over the Arctic Ocean, driving sea ice and the upper ocean counterclockwise; the Arctic atmosphere was relatively warm and humid, and freshwater flux from the Arctic Ocean towards the subarctic seas was intensified. By contrast, during anticyclonic circulation regimes, high SLP dominated, driving sea ice and the upper ocean clockwise. Meanwhile, the atmosphere was cold and dry and the freshwater flux from the Arctic to the subarctic seas was reduced. Since 1997, however, the Arctic system has been under the influence of an anticyclonic circulation regime (17 years) with a set of environmental parameters that are atypical for this regime. We discuss a hypothesis explaining the causes and mechanisms regulating the intensity and duration of Arctic circulation regimes, and speculate how changes in freshwater fluxes from the Arctic Ocean and Greenland impact environmental conditions and interrupt their decadal variability. © 2015 The Authors.
Structural causes and regime consequences: regimes as intervening variables
Directory of Open Access Journals (Sweden)
Stephen D. Krasner
2012-06-01
Full Text Available International regimes are defined as principles, norms, rules, and decision-making procedures around which actors' expectations converge in a given issue-area. As a starting point, regimes are conceptualized as intervening variables standing between basic causal factors and the related outcomes and behavior. There are three views on the importance of regimes: conventional structural orientations dismiss regimes as being, at best, ineffectual; Grotian orientations view regimes as intimate components of the international system; modified structuralist perspectives see regimes as significant only under certain restrictive conditions. For the Grotian and modified structuralist arguments, which agree that regimes can influence outcomes and behavior, regime development is seen as a function of five basic causal variables: egoistic self-interest; political power; diffuse norms and principles; usage and custom; and knowledge.
Lauren classification and individualized chemotherapy in gastric cancer
MA, JUNLI; SHEN, HONG; KAPESA, LINDA; ZENG, SHAN
2016-01-01
Gastric cancer is one of the most common malignancies worldwide. For the last 50 years, the histological classification of gastric carcinoma has been largely based on Lauren's criteria, in which gastric cancer is classified into two major histological subtypes, namely intestinal-type and diffuse-type adenocarcinoma. This classification was introduced in 1965 and remains widely accepted and employed today, since it constitutes a simple and robust classification approach. The two histol...
Empirical Studies On Machine Learning Based Text Classification Algorithms
Shweta C. Dharmadhikari; Maya Ingle; Parag Kulkarni
2011-01-01
Automatic classification of text documents has become an important research issue nowadays. Proper classification of text documents requires information retrieval, machine learning and natural language processing (NLP) techniques. Our aim is to focus on important approaches to automatic text classification based on machine learning techniques, viz. supervised, unsupervised and semi-supervised. In this paper we present a review of various text classification approaches under the machine learning paradig...
Pang, Kun-Jing; Meng, Hong; Hu, Sheng-Shou; Wang, Hao; Hsi, David; Hua, Zhong-Dong; Pan, Xiang-Bin; Li, Shou-Jun
2017-08-01
Selecting an appropriate surgical approach for double-outlet right ventricle (DORV), a complex congenital cardiac malformation with many anatomic variations, is difficult. Therefore, we determined the feasibility of using an echocardiographic classification system, which describes the anatomic variations in more precise terms than the current system does, to determine whether it could help direct surgical plans. Our system includes 8 DORV subtypes, categorized according to 3 factors: the relative positions of the great arteries (normal or abnormal), the relationship between the great arteries and the ventricular septal defect (committed or noncommitted), and the presence or absence of right ventricular outflow tract obstruction (RVOTO). Surgical approaches in 407 patients were based on their DORV subtype, as determined by echocardiography. We found that the optimal surgical management of patients classified as normal/committed/no RVOTO, normal/committed/RVOTO, and abnormal/committed/no RVOTO was, respectively, like that for patients with large ventricular septal defects, tetralogy of Fallot, and transposition of the great arteries without RVOTO. Patients with abnormal/committed/RVOTO anatomy and those with abnormal/noncommitted/RVOTO anatomy underwent intraventricular repair and double-root translocation. For patients with other types of DORV, choosing the appropriate surgical approach and biventricular repair techniques was more complex. We think that our classification system accurately groups DORV patients and enables surgeons to select the best approach for each patient's cardiac anatomy.
Buildings classification from airborne LiDAR point clouds through OBIA and ontology driven approach
Tomljenovic, Ivan; Belgiu, Mariana; Lampoltshammer, Thomas J.
2013-04-01
In recent years, airborne Light Detection and Ranging (LiDAR) data proved to be a valuable information resource for a vast number of applications ranging from land cover mapping to individual surface feature extraction from complex urban environments. To extract information from LiDAR data, users apply prior knowledge. Unfortunately, there is no consistent initiative for structuring this knowledge into data models that can be shared and reused across different applications and domains. The absence of such models poses great challenges to data interpretation, data fusion and integration, as well as information transferability. The intention of this work is to describe the design, development and deployment of an ontology-based system to classify buildings from airborne LiDAR data. The novelty of this approach consists of the development of a domain ontology that specifies explicitly the knowledge used to extract features from airborne LiDAR data. The overall goal of this approach is to investigate the possibility of classifying features of interest from LiDAR data by means of a domain ontology. The proposed workflow is applied to the building extraction process for the region of "Biberach an der Riss" in South Germany. Strip-adjusted and georeferenced airborne LiDAR data is processed based on geometrical and radiometric signatures stored within the point cloud. Region-growing segmentation algorithms are applied and segmented regions are exported to the GeoJSON format. Subsequently, the data is imported into the ontology-based reasoning process used to automatically classify exported features of interest. Based on the ontology it becomes possible to define domain concepts, associated properties and relations. As a consequence, the resulting specific body of knowledge restricts possible interpretation variants. Moreover, ontologies are machine-readable and thus it is possible to run reasoning on top of them. Available reasoners (FACT++, JESS, Pellet) are used to check
Determination of aerodynamic sensitivity coefficients in the transonic and supersonic regimes
Elbanna, Hesham M.; Carlson, Leland A.
1989-01-01
The quasi-analytical approach is developed to compute airfoil aerodynamic sensitivity coefficients in the transonic and supersonic flight regimes. Initial investigation verifies the feasibility of this approach as applied to the transonic small perturbation residual expression. Results are compared to those obtained by the direct (finite difference) approach and both methods are evaluated to determine their computational accuracies and efficiencies. The quasi-analytical approach is shown to be superior and worth further investigation.
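The "direct" finite-difference approach mentioned above can be sketched generically: perturb a design variable and difference the model outputs. The scalar function below is a stand-in model for illustration, not the transonic small-perturbation solver used in the paper:

```python
import numpy as np

def fd_sensitivity(f, x, h=1e-6):
    """Central finite-difference estimate of df/dx (the 'direct' approach)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Stand-in model: a smooth response to a single design variable.
f = np.sin
print(fd_sensitivity(f, 0.5), np.cos(0.5))  # finite-difference vs. analytic derivative
```

The quasi-analytical alternative instead differentiates the residual equations once and solves for the sensitivities, avoiding one full re-solve per design variable and the step-size tuning that finite differences require.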
Unsupervised classification of operator workload from brain signals
Schultze-Kraft, Matthias; Dähne, Sven; Gugler, Manfred; Curio, Gabriel; Blankertz, Benjamin
2016-06-01
Objective. In this study we aimed for the classification of operator workload as it is expected in many real-life workplace environments. We explored brain-signal based workload predictors that differ with respect to the level of label information required for training, including entirely unsupervised approaches. Approach. Subjects executed a task on a touch screen that required continuous effort of visual and motor processing with alternating difficulty. We first employed classical approaches for workload state classification that operate on the sensor space of EEG and compared those to the performance of three state-of-the-art spatial filtering methods: common spatial patterns (CSP) analysis, which requires binary label information; source power co-modulation (SPoC) analysis, which uses the subjects’ error rate as a target function; and canonical SPoC (cSPoC) analysis, which solely makes use of cross-frequency power correlations induced by different states of workload and thus represents an unsupervised approach. Finally, we investigated the effects of fusing brain signals and peripheral physiological measures (PPMs) and examined the added value for improving classification performance. Main results. Mean classification accuracies of 94%, 92% and 82% were achieved with CSP, SPoC, and cSPoC, respectively. These methods outperformed the approaches that did not use spatial filtering and they extracted physiologically plausible components. The performance of the unsupervised cSPoC is significantly increased by augmenting it with PPM features. Significance. Our analyses ensured that the signal sources used for classification were of cortical origin and not contaminated with artifacts. Our findings show that workload states can be successfully differentiated from brain signals, even when less and less information from the experimental paradigm is used, thus paving the way for real-world applications in which label information may be noisy or entirely unavailable.
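The CSP step named above reduces to a generalized eigendecomposition of the two class covariance matrices. A minimal sketch, assuming band-passed multi-channel trials and ignoring the regularization and artifact handling a real EEG pipeline needs:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b):
    """Common spatial patterns from two sets of trials, each of shape
    (n_trials, n_channels, n_samples). Returns filters and variance ratios,
    sorted so the first filter maximizes class-a variance."""
    def mean_cov(trials):
        return np.mean([np.cov(t) for t in trials], axis=0)
    Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
    ratios, filters = eigh(Ca, Ca + Cb)        # solves Ca w = r (Ca + Cb) w
    order = np.argsort(ratios)[::-1]
    return filters[:, order], ratios[order]

# Toy demo: channel 0 is active in class a, channel 1 in class b.
rng = np.random.default_rng(0)
class_a = rng.normal(size=(20, 2, 200)); class_a[:, 0] *= 5
class_b = rng.normal(size=(20, 2, 200)); class_b[:, 1] *= 5
W, ratios = csp_filters(class_a, class_b)
print(ratios)
```

Log-variances of the filtered trials then serve as features for a simple classifier, which is the standard CSP workflow.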
CLASSIFICATION AND COMPLEX STATE VALUE OF SHOPPING CENTERS: PROJECT-ORIENTED APPROACH
Directory of Open Access Journals (Sweden)
Юрій Павлович РАК
2016-02-01
Full Text Available Was done the analysis of projects objects of trade and entertainment centers from the perspective of improving the life safety and is proposed the definition of "Trade and entertainment center", "Trade and entertainment center" and "Complex value of trade and entertainment center." A classification of shopping centers on the classification criteria and the criteria are characterized by increased security status and attractiveness of their operation. The classification of trade and entertainment centers on the criteria of classification features were made. It characterizes the security situation and will increase the attractiveness of their operation. In the nearest future the most secure and modern TEC will be those buildings who will have unique qualities such as safety systems, excellent customer service, and thus by a high level of trust (the client to the mall. The important role will play those TEC, who have clearly formed value oriented project management, including communication values using innovative methods and models. Trade and entertainment centers as an organization are included in the complex process of interaction management. They being both as an enterprise that serves the public and satisfying a great range of his interests and architectural site, which is leased and increases the business attractiveness of the district of TEC location. This duality of the essence of TEC center makes difficult to assess the effectiveness of its security.
Adaptive SVM for Data Stream Classification
Directory of Open Access Journals (Sweden)
Isah A. Lawal
2017-07-01
Full Text Available In this paper, we address the problem of learning an adaptive classifier for the classification of continuous streams of data. We present a solution based on incremental extensions of the Support Vector Machine (SVM) learning paradigm that updates an existing SVM whenever new training data are acquired. To ensure that the SVM's effectiveness is maintained while exploiting the newly gathered data, we introduce an on-line model selection approach into the incremental learning process. We evaluated the proposed method on real-world applications including on-line spam email filtering and human action classification from videos. Experimental results show the effectiveness and the potential of the proposed approach.
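Stream learning of this kind can be approximated with scikit-learn's `partial_fit` interface. Note this is a hinge-loss linear model updated by SGD, a common stand-in for a true incremental SVM (which would maintain and update support vectors), and the batching scheme here is hypothetical:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

# Hypothetical stream: labelled samples arriving in chunks over time.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)

clf = SGDClassifier(loss="hinge", random_state=0)   # linear SVM objective, SGD updates
classes = np.unique(y)
for start in range(0, len(X), 200):                 # consume the stream chunk by chunk
    chunk = slice(start, start + 200)
    clf.partial_fit(X[chunk], y[chunk], classes=classes)

print(clf.score(X, y))
```

Each `partial_fit` call updates the existing model in place rather than retraining from scratch, which is the property that makes the approach viable on unbounded streams.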
A Systematic Methodology for Gearbox Health Assessment and Fault Classification
Directory of Open Access Journals (Sweden)
Jay Lee
2011-01-01
Full Text Available A systematic methodology for gearbox health assessment and fault classification is developed and evaluated on 560 data sets of gearbox vibration data provided by the Prognostics and Health Management Society for the 2009 data challenge competition. A comprehensive set of signal processing and feature extraction methods is used to extract over 200 features, including features extracted from the raw time signal, the time-synchronous signal, the wavelet decomposition signal, the frequency domain spectrum, and the envelope spectrum, among others. A regime segmentation approach using the tachometer signal, a spectrum similarity metric, and gear mesh frequency peak information is used to segment the data by gear type, input shaft speed, and braking torque load. A health assessment method that finds the minimum feature vector sum in each regime is used to classify and find the 80 baseline healthy data sets. A fault diagnosis method based on a distance calculation from normal, along with specific features correlated to different fault signatures, is used to diagnose specific faults. The fault diagnosis method is evaluated for the diagnosis of gear tooth breakage, input shaft imbalance, a bent shaft, a bearing inner race defect, and a bad key, and it could be further extended to other faults as long as a set of features can be correlated with a known fault signature. Future work looks to further refine the distance calculation algorithm for fault diagnosis, as well as to evaluate other signal processing methods, such as the empirical mode decomposition, to see whether an improved set of features can raise the fault diagnosis accuracy.
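A few of the standard time-domain condition indicators behind such feature sets can be sketched directly. These three are textbook vibration features, not necessarily the exact 200+ features of the paper:

```python
import numpy as np

def vibration_features(signal):
    """Standard time-domain condition indicators for one vibration record."""
    rms = np.sqrt(np.mean(signal ** 2))                               # energy level
    kurtosis = np.mean((signal - signal.mean()) ** 4) / np.var(signal) ** 2  # impulsiveness
    crest = np.max(np.abs(signal)) / rms                              # peakiness
    return {"rms": rms, "kurtosis": kurtosis, "crest_factor": crest}

# Demo on a pure tone: RMS = 1/sqrt(2), crest factor = sqrt(2), kurtosis = 1.5.
t = np.linspace(0.0, 1.0, 10000)
print(vibration_features(np.sin(2 * np.pi * 50 * t)))
```

Kurtosis in particular rises sharply for the impulsive signatures of localized defects such as a bearing inner race fault, which is why it features in most gearbox diagnosis pipelines.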
Classification of malignant and benign liver tumors using a radiomics approach
Starmans, Martijn P. A.; Miclea, Razvan L.; van der Voort, Sebastian R.; Niessen, Wiro J.; Thomeer, Maarten G.; Klein, Stefan
2018-03-01
Correct diagnosis of the liver tumor phenotype is crucial for treatment planning, especially the distinction between malignant and benign lesions. Clinical practice includes manual scoring of the tumors on Magnetic Resonance (MR) images by a radiologist. As this is challenging and subjective, it is often followed by a biopsy. In this study, we propose a radiomics approach as an objective and non-invasive alternative for distinguishing between malignant and benign phenotypes. T2-weighted (T2w) MR sequences of 119 patients from multiple centers were collected. We developed an efficient semi-automatic segmentation method, which was used by a radiologist to delineate the tumors. Within these regions, features quantifying tumor shape, intensity, texture, heterogeneity and orientation were extracted. Patient characteristics and semantic features were added for a total of 424 features. Classification was performed using Support Vector Machines (SVMs). The performance was evaluated using internal random-split cross-validation. Within each iteration, feature selection and hyperparameter optimization were performed on the training set; to this end, another cross-validation was performed by splitting the training set into training and validation parts. The optimal settings were evaluated on the independent test sets. Manual scoring by a radiologist was also performed. The radiomics approach resulted in 95% confidence intervals for the AUC of [0.75, 0.92], specificity of [0.76, 0.96] and sensitivity of [0.52, 0.82]. These approach the performance of the radiologist, which was an AUC of 0.93, specificity of 0.70 and sensitivity of 0.93. Hence, radiomics has the potential to predict liver tumor benignity in an objective and non-invasive manner.
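The nested evaluation described above, hyperparameter tuning on an inner split while measuring performance on an outer split, can be sketched with scikit-learn. The breast cancer dataset is a stand-in for the radiomics feature table, and the `C` grid is illustrative:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)   # stand-in for a radiomics feature table

inner = GridSearchCV(                         # inner loop: hyperparameter optimization
    make_pipeline(StandardScaler(), SVC()),
    param_grid={"svc__C": [0.1, 1, 10]},
    cv=3,
)
outer_scores = cross_val_score(inner, X, y, cv=5, scoring="roc_auc")  # outer loop: evaluation
print(outer_scores.mean())
```

Keeping tuning strictly inside the outer training folds is what prevents the optimistic bias that plagues single-loop cross-validation of tuned models.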
An algebraic approach towards the classification of 2 dimensional conformal field theories
International Nuclear Information System (INIS)
Bouwknegt, P.G.
1988-01-01
This thesis treats an algebraic method for the construction of 2-dimensional conformal field theories. The method consists of the study of the representation theory of the Virasoro algebra and suitable extensions of it. The classification of 2-dimensional conformal field theories is translated into the classification of combinations of representations which satisfy certain consistency conditions (unitarity and modular invariance). For a certain class of 2-dimensional field theories, namely those with central charge c = 1 from the theory of Kac-Moody algebras, there exist indications, but as yet mainly hope, that this construction will finally lead to a classification of 2-dimensional conformal field theories. 182 refs.; 2 figs.; 26 tabs
A Hybrid Feature Selection Approach for Arabic Documents Classification
Habib, Mena Badieh; Sarhan, Ahmed A. E.; Salem, Abdel-Badeeh M.; Fayed, Zaki T.; Gharib, Tarek F.
Text Categorization (classification) is the process of classifying documents into a predefined set of categories based on their content. Text categorization algorithms usually represent documents as bags of words and consequently have to deal with huge number of features. Feature selection tries to
Three-class classification in computer-aided diagnosis of breast cancer by support vector machine
Sun, Xuejun; Qian, Wei; Song, Dansheng
2004-05-01
Design of the classifier in a computer-aided diagnosis (CAD) scheme for breast cancer plays an important role in its overall performance in sensitivity and specificity. Classification of a detected object as malignant lesion, benign lesion, or normal tissue on a mammogram is a typical three-class pattern recognition problem. This paper presents a three-class classification approach using a two-stage classifier combined with a support vector machine (SVM) learning algorithm for classification of breast cancer on mammograms. The first classification stage separates abnormal areas from normal breast tissue, and the second stage classifies the detected abnormal objects as malignant or benign. A series of spatial, morphology and texture features have been extracted from the detected object areas. Using a genetic algorithm (GA), different feature groups for the two classification stages have been investigated. Computerized free-response receiver operating characteristic (FROC) and receiver operating characteristic (ROC) analyses have been employed for the different classification stages. Results show that an obvious performance improvement in both sensitivity and specificity was observed with the proposed classification approach compared with conventional two-class classification approaches, indicating its effectiveness in the classification of breast cancer on mammograms.
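The two-stage cascade can be sketched as two binary SVMs: stage 1 flags abnormal objects, stage 2 splits the flagged ones into benign and malignant. Synthetic data stands in for the mammogram features, and the class encoding (0 = normal, 1 = benign, 2 = malignant) is our assumption:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Synthetic 3-class stand-in: 0 = normal tissue, 1 = benign, 2 = malignant.
X, y = make_classification(n_samples=600, n_features=15, n_informative=6,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

stage1 = SVC().fit(X, y != 0)                  # stage 1: abnormal vs. normal
abnormal = y != 0
stage2 = SVC().fit(X[abnormal], y[abnormal])   # stage 2: benign vs. malignant

def predict(X_new):
    out = np.zeros(len(X_new), dtype=int)      # default: normal tissue
    flagged = stage1.predict(X_new).astype(bool)
    if flagged.any():
        out[flagged] = stage2.predict(X_new[flagged])
    return out

print((predict(X) == y).mean())
```

Splitting the problem this way lets each stage use its own feature subset and operating point, which is exactly what the GA-driven feature selection per stage exploits.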
Image Classification Based on Convolutional Denoising Sparse Autoencoder
Directory of Open Access Journals (Sweden)
Shuangshuang Chen
2017-01-01
Full Text Available Image classification aims to group images into corresponding semantic categories. Due to the difficulties of interclass similarity and intraclass variability, it is a challenging issue in computer vision. In this paper, an unsupervised feature learning approach called convolutional denoising sparse autoencoder (CDSAE) is proposed based on the theory of the visual attention mechanism and deep learning methods. Firstly, a saliency detection method is utilized to obtain training samples for unsupervised feature learning. Next, these samples are sent to the denoising sparse autoencoder (DSAE), followed by a convolutional layer and a local contrast normalization layer. Generally, prior knowledge of a specific task is helpful for the task solution. Therefore, a new pooling strategy, spatial pyramid pooling (SPP) fused with a center-bias prior, is introduced into our approach. Experimental results on two common image datasets (STL-10 and CIFAR-10) demonstrate that our approach is effective in image classification. They also demonstrate that none of these three components, local contrast normalization, SPP fused with center-prior, and l2 vector normalization, can be excluded from our proposed approach. They jointly improve image representation and classification performance.
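The denoising autoencoder principle, reconstructing clean inputs from corrupted ones, can be sketched in plain NumPy. This toy version omits the sparsity penalty, convolutional layer, and contrast normalization of the paper's CDSAE, and the architecture sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((500, 64))                        # toy "image patches" in [0, 1]
noisy = X + 0.1 * rng.standard_normal(X.shape)   # corrupted inputs

n_hidden, lr = 32, 0.1
W1 = rng.standard_normal((64, n_hidden)) * 0.1   # encoder weights
W2 = rng.standard_normal((n_hidden, 64)) * 0.1   # decoder weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(200):                             # full-batch gradient descent
    H = sigmoid(noisy @ W1)                      # encode the *corrupted* input
    out = H @ W2                                 # linear decoder
    err = out - X                                # target is the *clean* input
    losses.append(np.mean(err ** 2))
    dW2 = H.T @ err / len(X)
    dH = err @ W2.T * H * (1 - H)                # backprop through the sigmoid
    dW1 = noisy.T @ dH / len(X)
    W1 -= lr * dW1
    W2 -= lr * dW2

print(losses[0], losses[-1])
```

Forcing the network to undo the corruption is what drives the hidden layer toward robust features rather than the identity mapping.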
Adaptive Matrices for Color Texture Classification
Bunte, Kerstin; Giotis, Ioannis; Petkov, Nicolai; Biehl, Michael; Real, P; DiazPernil, D; MolinaAbril, H; Berciano, A; Kropatsch, W
2011-01-01
In this paper we introduce an integrative approach towards color texture classification learned by a supervised framework. Our approach is based on the Generalized Learning Vector Quantization (GLVQ), extended by an adaptive distance measure which is defined in the Fourier domain and 2D Gabor
Typecasting catchments: Classification, directionality, and the pursuit of universality
Smith, Tyler; Marshall, Lucy; McGlynn, Brian
2018-02-01
Catchment classification poses a significant challenge to hydrology and hydrologic modeling, restricting widespread transfer of knowledge from well-studied sites. The identification of important physical, climatological, or hydrologic attributes (to varying degrees depending on application/data availability) has traditionally been the focus for catchment classification. Classification approaches are regularly assessed with regard to their ability to provide suitable hydrologic predictions - commonly by transferring fitted hydrologic parameters at a data-rich catchment to a data-poor catchment deemed similar by the classification. While such approaches to hydrology's grand challenges are intuitive, they often ignore the most uncertain aspect of the process - the model itself. We explore catchment classification and parameter transferability and the concept of universal donor/acceptor catchments. We identify the implications of the assumption that the transfer of parameters between "similar" catchments is reciprocal (i.e., non-directional). These concepts are considered through three case studies situated across multiple gradients that include model complexity, process description, and site characteristics. Case study results highlight that some catchments are more successfully used as donor catchments and others are better suited as acceptor catchments. These results were observed for both black-box and process consistent hydrologic models, as well as for differing levels of catchment similarity. Therefore, we suggest that similarity does not adequately satisfy the underlying assumptions being made in parameter regionalization approaches regardless of model appropriateness. Furthermore, we suggest that the directionality of parameter transfer is an important factor in determining the success of parameter regionalization approaches.
An approach for leukemia classification based on cooperative game theory.
Torkaman, Atefeh; Charkari, Nasrollah Moghaddam; Aghaeipour, Mahnaz
2011-01-01
Hematological malignancies are the types of cancer that affect the blood, bone marrow and lymph nodes. As these tissues are naturally connected through the immune system, a disease affecting one of them will often affect the others as well. The hematological malignancies include leukemia, lymphoma, and multiple myeloma. Among them, leukemia is a serious malignancy that starts in blood tissues, especially the bone marrow, where the blood is made. Research shows that leukemia is one of the most common cancers in the world, so an emphasis on diagnostic techniques and the best treatments can provide better prognosis and survival for patients. In this paper, an automatic diagnosis recommender system for classifying leukemia based on cooperative game theory is presented. Throughout this research, we analyze flow cytometry data toward the classification of leukemia into eight classes. We work on a real data set of different types of leukemia collected at the Iran Blood Transfusion Organization (IBTO). In total, the data set contains 400 samples taken from human leukemic bone marrow. This study uses a cooperative game for classification according to the different weights assigned to the markers. The proposed method is versatile, as there are no constraints on what the input or output represent; it can be used to classify a population according to their contributions, and so applies equally to other groups of data. The experimental results show an accuracy rate of 93.12% for classification, compared to 90.16% for a decision tree (C4.5). The result demonstrates that the cooperative game approach is very promising for direct classification of leukemia as part of an active medical decision support system for the interpretation of flow cytometry readouts. Such a system could assist clinical hematologists to properly recognize different kinds of leukemia by providing suggestions, and this could improve the treatment of leukemic patients.
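The cooperative-game machinery behind such marker weighting can be sketched with exact Shapley values: each marker's weight is its average marginal contribution to classification accuracy over all orderings. The coalition accuracies below are invented for illustration, not the IBTO flow cytometry results; in the paper they would come from evaluating the classifier on each marker subset.

```python
from itertools import permutations

# Exact Shapley values over a tiny, hypothetical set of three markers.
def shapley(players, v):
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            # Marginal contribution of p to the growing coalition.
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: phi[p] / len(orders) for p in phi}

# Hypothetical accuracies reached by each subset of three markers A, B, C.
acc = {frozenset(): 0.0, frozenset("A"): 0.6, frozenset("B"): 0.5,
       frozenset("C"): 0.4, frozenset("AB"): 0.8, frozenset("AC"): 0.7,
       frozenset("BC"): 0.6, frozenset("ABC"): 0.9}

weights = shapley("ABC", v=lambda s: acc[frozenset(s)])
print(weights)  # efficiency: the weights sum to the full-set accuracy
```

The efficiency property (weights summing to the grand-coalition accuracy) is what makes the weights interpretable as shares of the overall classification performance.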
International Nuclear Information System (INIS)
Martin, Michael A; Meyricke, Ramona; O'Neill, Terry; Roberts, Steven
2006-01-01
A critical choice facing breast cancer patients is which surgical treatment - mastectomy or breast conserving surgery (BCS) - is most appropriate. Several studies have investigated factors that impact the type of surgery chosen, identifying features such as place of residence, age at diagnosis, tumor size, and socio-economic and racial/ethnic elements as relevant. Such assessment of 'propensity' is important in understanding issues such as the reported under-utilisation of BCS among women for whom such treatment was not contraindicated. Using Western Australian (WA) data, we further examine the factors associated with the type of surgical treatment for breast cancer using a classification tree approach. This approach deals naturally with complicated interactions between factors, and so allows flexible and interpretable models of treatment choice to be built that add to the current understanding of this complex decision process. Data were extracted from the WA Cancer Registry on women diagnosed with breast cancer in WA from 1990 to 2000. Subjects' treatment preferences were predicted from covariates using both classification trees and logistic regression. Tumor size was the primary determinant of patient choice, with subjects whose tumors were smaller than 20 mm in diameter preferring BCS. For subjects with tumors greater than 20 mm in diameter, factors such as patient age, nodal status, and tumor histology became relevant as predictors of patient choice. Classification trees perform as well as logistic regression for predicting patient choice, but are much easier to interpret for clinical use. The selected tree can inform clinicians' advice to patients.
Alom, Md. Zahangir; Awwal, Abdul A. S.; Lowe-Webb, Roger; Taha, Tarek M.
2017-08-01
Deep-learning methods are gaining popularity because of their state-of-the-art performance in image classification tasks. In this paper, we explore classification of laser-beam images from the National Ignition Facility (NIF) using a novel deep-learning approach. NIF is the world's largest, most energetic laser. It has nearly 40,000 optics that precisely guide, reflect, amplify, and focus 192 laser beams onto a fusion target. NIF utilizes four petawatt lasers called the Advanced Radiographic Capability (ARC) to produce backlighting X-ray illumination to capture implosion dynamics of NIF experiments with picosecond temporal resolution. In the current operational configuration, four independent short-pulse ARC beams are created and combined in a split-beam configuration in each of two NIF apertures at the entry of the pre-amplifier. The subaperture beams then propagate through the NIF beampath up to the ARC compressor. Each ARC beamlet is separately compressed with a dedicated set of four gratings and recombined as sub-apertures for transport to the parabola vessel, where the beams are focused using parabolic mirrors and pointed to the target. Small angular errors in the compressor gratings can cause the sub-aperture beams to diverge from one another and prevent accurate alignment through the transport section between the compressor and parabolic mirrors. This is an off-normal condition that must be detected and corrected. The goal of the off-normal check is to determine whether the ARC beamlets are sufficiently overlapped into a merged single spot or diverged into two distinct spots. Thus, the objective of the current work is three-fold: developing a simple algorithm to perform off-normal classification, exploring the use of a Convolutional Neural Network (CNN) for the same task, and understanding the inter-relationship of the two approaches. The CNN recognition results are compared with other machine-learning approaches, such as Deep Neural Network (DNN) and Support Vector Machine (SVM) classifiers.
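The "simple algorithm" side of the comparison can be sketched as threshold-and-count: binarize the far-field image and count connected bright regions, deciding between one merged spot and two distinct spots. The tiny images and threshold below are invented and stand in for real ARC diagnostics.

```python
# Minimal off-normal check sketch: count connected bright regions after
# thresholding. One region = merged (normal), two = diverged (off-normal).

def count_spots(img, thresh):
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    spots = 0
    for i in range(rows):
        for j in range(cols):
            if img[i][j] >= thresh and not seen[i][j]:
                spots += 1
                stack = [(i, j)]          # flood-fill one bright region
                seen[i][j] = True
                while stack:
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] >= thresh and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return spots

merged = [[0, 0, 0, 0, 0],
          [0, 9, 9, 9, 0],
          [0, 9, 9, 9, 0],
          [0, 0, 0, 0, 0]]
diverged = [[0, 0, 0, 0, 0],
            [0, 9, 0, 9, 0],
            [0, 9, 0, 9, 0],
            [0, 0, 0, 0, 0]]

print(count_spots(merged, 5), count_spots(diverged, 5))   # 1 2
```

A CNN replaces this hand-set threshold with learned features, which is exactly the trade-off the paper studies.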
Computational Intelligence Paradigms in Advanced Pattern Classification
Jain, Lakhmi
2012-01-01
This monograph presents selected areas of application of pattern recognition and classification approaches, including handwriting recognition, medical image analysis and interpretation, development of cognitive systems for image computer understanding, moving object detection, advanced image filtration, and intelligent multi-object labelling and classification. It is directed to scientists, application engineers, professors, and students, who will find this book useful.
Transport processes in magnetically confined plasmas in the nonlinear regime.
Sonnino, Giorgio
2006-06-01
A field theory approach to transport phenomena in magnetically confined plasmas is presented. The thermodynamic field theory (TFT), previously developed for treating generic thermodynamic systems out of equilibrium, is applied to plasma physics. Transport phenomena are treated here as the effect of the field linking the thermodynamic forces with their conjugate flows, combined with statistical mechanics. In particular, the Classical and the Pfirsch-Schluter regimes are analyzed by solving the thermodynamic field equations of the TFT in the weak-field approximation. We found that the TFT does not correct the expressions of the ionic heat fluxes evaluated by the neoclassical theory in these two regimes. On the other hand, the fluxes of matter and electronic energy (heat flow) are further enhanced in the nonlinear Classical and Pfirsch-Schluter regimes. These results seem to be in line with the experimental observations. The complete set of the electronic and ionic transport equations in the nonlinear Banana regime is also reported. A paper comparing our theoretical results with the experimental observations in the JET machine is currently in preparation.
Directory of Open Access Journals (Sweden)
Sofia Siachalou
2015-03-01
Vegetation monitoring and mapping based on multi-temporal imagery has recently received much attention due to the plethora of medium-high spatial resolution satellites and the improved classification accuracies attained compared to uni-temporal approaches. Efficient image processing strategies are needed to exploit the phenological information present in temporal image sequences and to limit data redundancy and computational complexity. Within this framework, we implement the theory of Hidden Markov Models in crop classification, based on the time-series analysis of phenological states, inferred by a sequence of remote sensing observations. More specifically, we model the dynamics of vegetation over an agricultural area of Greece, characterized by spatio-temporal heterogeneity and small-sized fields, using RapidEye and Landsat ETM+ imagery. In addition, the classification performance of image sequences with variable spatial and temporal characteristics is evaluated and compared. The classification model considering one RapidEye and four pan-sharpened Landsat ETM+ images was found superior, resulting in a conditional kappa from 0.77 to 0.94 per class and an overall accuracy of 89.7%. The results highlight the potential of the method for operational crop mapping in Euro-Mediterranean areas and provide some hints for optimal image acquisition windows regarding major crop types in Greece.
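The decoding step at the core of such an HMM classifier can be sketched with the standard Viterbi algorithm, which recovers the most likely sequence of hidden phenological states from a sequence of observations. The states, quantized-vegetation-index observations, and all probabilities below are invented, not the fitted RapidEye/Landsat model.

```python
# Viterbi decoding sketch for a tiny, hypothetical phenology HMM.
def viterbi(obs, states, start_p, trans_p, emit_p):
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = prob
            back[t][s] = prev
    last = max(V[-1], key=V[-1].get)
    path = [last]
    for t in range(len(obs) - 1, 0, -1):   # backtrack the best path
        last = back[t][last]
        path.insert(0, last)
    return path

states = ("emergence", "growth", "senescence")
start_p = {"emergence": 0.8, "growth": 0.15, "senescence": 0.05}
trans_p = {"emergence": {"emergence": 0.5, "growth": 0.45, "senescence": 0.05},
           "growth": {"emergence": 0.0, "growth": 0.6, "senescence": 0.4},
           "senescence": {"emergence": 0.0, "growth": 0.1, "senescence": 0.9}}
# Observations are quantized vegetation-index levels: "low", "mid", "high".
emit_p = {"emergence": {"low": 0.7, "mid": 0.25, "high": 0.05},
          "growth": {"low": 0.1, "mid": 0.3, "high": 0.6},
          "senescence": {"low": 0.5, "mid": 0.4, "high": 0.1}}

print(viterbi(("low", "mid", "high", "low"), states, start_p, trans_p, emit_p))
```

In the crop-mapping setting, one such model is fit per crop type and a pixel is assigned the crop whose model best explains its observation sequence.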
Mapping a classification system to architectural education
DEFF Research Database (Denmark)
Hermund, Anders; Klint, Lars; Rostrup, Nicolai
2015-01-01
This paper examines to what extent a new classification system, the Cuneco Classification System (CCS), proves useful in the education of architects, and to what degree the aim of an architectural education, based on an arts and crafts approach rather than a polytechnic approach, benefits from the distinct terminology of the classification system. The method used to examine the relationship between education, practice and the CCS bifurcates into a quantitative and a qualitative exploration: a quantitative comparison of the curriculum with the students' own descriptions of their studies through a questionnaire survey among 88 students in graduate school, and qualitative interviews with a handful of practicing architects, to cross-check the relevance of the education with the profession. The examination indicates the need for a new definition, in addition to the CCS's scale, covering the earliest...
Minimum Error Entropy Classification
Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A
2013-01-01
This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners will also find in the book a detailed presentation of practical data classifiers using MEE. These include multi-layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using an MEE-like concept is also presented. Examples, tests, evaluation experiments and comparisons with similar machines using classic approaches complement the descriptions.
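The quantity MEE machines minimize can be sketched directly: Renyi's quadratic entropy of the prediction errors, estimated with a Gaussian Parzen window. A model whose errors are tightly concentrated has lower error entropy than one whose errors are spread out, even at equal mean squared error. The error samples and bandwidth below are invented for illustration.

```python
import math

# Renyi quadratic entropy of errors via a Gaussian Parzen estimate.
# H2 = -log V, where V (the "information potential") is the average of
# Gaussian kernels of width sqrt(2)*h over all error pairs.
def quadratic_error_entropy(errors, h):
    n = len(errors)
    two_h2 = 2.0 * h * h
    norm = 1.0 / math.sqrt(2.0 * math.pi * two_h2)
    V = sum(norm * math.exp(-(a - b) ** 2 / (2.0 * two_h2))
            for a in errors for b in errors) / (n * n)
    return -math.log(V)

concentrated = [0.01, -0.02, 0.0, 0.015]   # errors tightly packed near zero
spread = [0.9, -0.8, 0.4, -0.5]            # same-size sample, widely spread

# MEE prefers the first model: concentrated errors have lower entropy.
print(quadratic_error_entropy(concentrated, 0.1),
      quadratic_error_entropy(spread, 0.1))
```

Training an MEE classifier amounts to adjusting the model weights by gradient descent on this entropy rather than on the mean squared error.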
Learning classification models with soft-label information.
Nguyen, Quang; Valizadegan, Hamed; Hauskrecht, Milos
2014-01-01
Learning of classification models in medicine often relies on data labeled by a human expert. Since labeling of clinical data may be time-consuming, finding ways of alleviating the labeling costs is critical for our ability to automatically learn such models. In this paper we propose a new machine learning approach that is able to learn improved binary classification models more efficiently by refining the binary class information in the training phase with soft labels that reflect how strongly the human expert feels about the original class labels. Two types of methods that can learn improved binary classification models from soft labels are proposed. The first relies on probabilistic/numeric labels, the other on ordinal categorical labels. We study and demonstrate the benefits of these methods for learning an alerting model for heparin-induced thrombocytopenia. The experiments are conducted on the data of 377 patient instances labeled by three different human experts. The methods are compared using the area under the receiver operating characteristic curve (AUC) score. Our AUC results show that the new approach is capable of learning classification models more efficiently compared to traditional learning methods. The improvement in AUC is most remarkable when the number of examples we learn from is small. A new classification learning framework that lets us learn from auxiliary soft-label information provided by a human expert is a promising new direction for learning classification models from expert labels, reducing the time and cost needed to label data.
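The probabilistic/numeric variant can be sketched as logistic regression trained against expert confidences in [0, 1] instead of hard 0/1 labels; conveniently, the cross-entropy gradient has exactly the same form for soft targets. The tiny one-feature dataset and learning settings below are invented, not the heparin study data.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Logistic model fit against soft labels (expert confidence of class 1).
def fit_soft_logistic(xs, soft_ys, lr=0.5, epochs=2000):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, soft_ys):
            p = sigmoid(w * x + b)
            # Cross-entropy gradient is (p - y) for soft y just as for hard y.
            w -= lr * (p - y) * x
            b -= lr * (p - y)
    return w, b

xs      = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
soft_ys = [0.05, 0.2, 0.4, 0.6, 0.8, 0.95]   # expert confidences, not 0/1

w, b = fit_soft_logistic(xs, soft_ys)
print(w, b, sigmoid(w * 1.0 + b))   # prediction tracks the expert's 0.8
```

With hard labels every borderline case would pull the decision boundary as strongly as a clear-cut case; the soft targets let uncertain labels exert proportionally less influence.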
Zimmermann, Jesko; Jones, Michael
2016-04-01
error (RMSE < RMSE95) or bias (RE < RE95). A general trend observed was that model performance declined with increased fertilisation rates. Overall, DayCent showed the best performance; however, it does not provide the possibility of modelling the addition of urease inhibitors. The results suggest that modelling changes in fertiliser regime on a large scale may require a multi-model approach to assure best performance. Ultimately, the research aims to develop a GIS-based platform to apply such an approach on a regional scale.
Automated classification of cell morphology by coherence-controlled holographic microscopy
Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim
2017-08-01
In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy, enabling quantitative phase imaging, to the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, while employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could provide valuable help in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all preconditions for the accurate automated analysis of live cell behavior while enabling noninvasive label-free imaging with sufficient contrast and high spatiotemporal phase sensitivity.
Systemic classification for a new diagnostic approach to acute abdominal pain in children.
Kim, Ji Hoi; Kang, Hyun Sik; Han, Kyung Hee; Kim, Seung Hyo; Shin, Kyung-Sue; Lee, Mu Suk; Jeong, In Ho; Kim, Young Sil; Kang, Ki-Soo
2014-12-01
With previous methods based only on age and location, there are many difficulties in identifying the etiology of acute abdominal pain in children. We sought to develop a new systematic classification of acute abdominal pain and to give some help to physicians encountering difficulties in diagnosis. From March 2005 to May 2010, clinical data were collected retrospectively from 442 children hospitalized due to acute abdominal pain with no apparent underlying disease. According to the final diagnoses, the diseases that caused acute abdominal pain were classified into nine groups. The nine groups were group I "catastrophic surgical abdomen" (7 patients, 1.6%), group II "acute appendicitis and mesenteric lymphadenitis" (56 patients, 12.7%), group III "intestinal obstruction" (57 patients, 12.9%), group IV "viral and bacterial acute gastroenteritis" (90 patients, 20.4%), group V "peptic ulcer and gastroduodenitis" (66 patients, 14.9%), group VI "hepatobiliary and pancreatic disease" (14 patients, 3.2%), group VII "febrile viral illness and extraintestinal infection" (69 patients, 15.6%), group VIII "functional gastrointestinal disorder (acute manifestation)" (20 patients, 4.5%), and group IX "unclassified acute abdominal pain" (63 patients, 14.3%). Four patients were enrolled in two disease groups each. Patients were distributed unevenly across the nine groups of acute abdominal pain. In particular, the "unclassified acute abdominal pain" group alone was not uncommon. Considering a systemic classification for acute abdominal pain may be helpful in the diagnostic approach in children.
A novel approach for classification of abnormalities in digitized ...
Indian Academy of Sciences (India)
Feature extraction is an important process for the overall system performance in classification. The objective of this article is to reveal the effectiveness of texture feature analysis for detecting the abnormalities in digitized mammograms using Self Adaptive Resource Allocation Network (SRAN) classifier. Thus, we proposed a ...
A law & economics approach to the study of integrated management regimes of estuaries
van de Griendt, W.E.
2004-01-01
In this paper it is proposed to analyse legal regimes for integrated management of estuaries with the help of institutional legal theory and the Schlager & Ostrom framework for types of ownership. Estuaries are highly valued and valuable and therefore need protection. The problem is that they
Williams, Jennifer A.; Schmitter-Edgecombe, Maureen; Cook, Diane J.
2016-01-01
Introduction Reducing the amount of testing required to accurately detect cognitive impairment is clinically relevant. The aim of this research was to determine the fewest number of clinical measures required to accurately classify participants as healthy older adult, mild cognitive impairment (MCI) or dementia using a suite of classification techniques. Methods Two variable selection machine learning models (i.e., naive Bayes, decision tree), a logistic regression, and two participant datasets (i.e., clinical diagnosis, clinical dementia rating; CDR) were explored. Participants classified using clinical diagnosis criteria included 52 individuals with dementia, 97 with MCI, and 161 cognitively healthy older adults. Participants classified using CDR included 154 individuals with CDR = 0, 93 individuals with CDR = 0.5, and 25 individuals with CDR = 1.0+. Twenty-seven demographic, psychological, and neuropsychological variables were available for variable selection. Results No significant difference was observed between naive Bayes, decision tree, and logistic regression models for classification of both clinical diagnosis and CDR datasets. Participant classification (70.0 - 99.1%), geometric mean (60.9 - 98.1%), sensitivity (44.2 - 100%), and specificity (52.7 - 100%) were generally satisfactory. Unsurprisingly, the MCI/CDR = 0.5 participant group was the most challenging to classify. Through variable selection, only 2-9 variables were required for classification, and these varied between datasets in a clinically meaningful way. Conclusions The current study results reveal that machine learning techniques can accurately classify cognitive impairment and reduce the number of measures required for diagnosis. PMID:26332171
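The variable-reduction idea can be sketched as greedy forward selection wrapped around a deliberately simple classifier (a nearest-centroid rule here, standing in for the paper's naive Bayes and decision tree models); variables are added one at a time while accuracy keeps improving. The data, columns and scores below are all invented.

```python
# Greedy forward selection around a nearest-centroid classifier.
def centroid_accuracy(rows, labels, feats):
    """Resubstitution accuracy of a nearest-centroid rule on `feats`."""
    cents = {}
    for lab in set(labels):
        pts = [r for r, l in zip(rows, labels) if l == lab]
        cents[lab] = [sum(p[f] for p in pts) / len(pts) for f in feats]
    hits = 0
    for r, l in zip(rows, labels):
        pred = min(cents, key=lambda lab: sum(
            (r[f] - c) ** 2 for f, c in zip(feats, cents[lab])))
        hits += pred == l
    return hits / len(rows)

def forward_select(rows, labels, n_feats):
    chosen, best = [], 0.0
    while len(chosen) < n_feats:
        scored = [(centroid_accuracy(rows, labels, chosen + [f]), f)
                  for f in range(n_feats) if f not in chosen]
        acc, f = max(scored)
        if acc <= best:
            break                      # no remaining variable helps
        chosen, best = chosen + [f], acc
    return chosen, best

# Columns: [informative score, noisy score, constant score]
rows = [[1.0, 5.0, 3.0], [1.2, 1.0, 3.0], [0.9, 4.0, 3.0],
        [3.0, 2.0, 3.0], [3.2, 5.0, 3.0], [2.9, 1.5, 3.0]]
labels = ["healthy", "healthy", "healthy", "MCI", "MCI", "MCI"]

chosen, acc = forward_select(rows, labels, 3)
print(chosen, acc)   # only the informative column is needed
```

The same wrapper logic scales to the 27 candidate measures in the study; in practice the inner score would be cross-validated rather than measured on the training data.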
Cheese Classification, Characterization, and Categorization: A Global Perspective.
Almena-Aliste, Montserrat; Mietton, Bernard
2014-02-01
Cheese is one of the most fascinating, complex, and diverse foods enjoyed today. Three elements constitute the cheese ecosystem: ripening agents, consisting of enzymes and microorganisms; the composition of the fresh cheese; and the environmental conditions during aging. These factors determine and define not only the sensory quality of the final cheese product but also the vast diversity of cheeses produced worldwide. How we define and categorize cheese is a complicated matter. There are various approaches to cheese classification, and a global approach for classification and characterization is needed. We review current cheese classification schemes and the limitations inherent in each of the schemes described. While some classification schemes are based on microbiological criteria, others rely on descriptions of the technologies used for cheese production. The goal of this review is to present an overview of comprehensive and practical integrative classification models in order to better describe cheese diversity and the fundamental differences within cheeses, as well as to connect fundamental technological, microbiological, chemical, and sensory characteristics to contribute to an overall characterization of the main families of cheese, including the expanding world of American artisanal cheeses.
INTRODUCTION OF A SECTORAL APPROACH TO TRANSPORT SECTOR FOR POST-2012 CLIMATE REGIME
Directory of Open Access Journals (Sweden)
Atit TIPPICHAI
2009-01-01
Recently, the concept of sectoral approaches has been discussed actively under the UNFCCC framework, as it could realize GHG mitigation for the Kyoto Protocol and beyond. However, few studies have introduced this approach to the transport sector explicitly or analyzed its impacts quantitatively. In this paper, we introduce a sectoral approach that aims to set sector-specific emission reduction targets for the transport sector for the post-2012 climate regime. We suppose that developed countries will commit to the sectoral reduction target and that key developing countries such as China and India will have sectoral no-lose targets (no penalties for failing to meet targets, but the right to sell reductions in excess of them) for the medium-term commitment, i.e. 2013-2020. Six scenarios for the total CO2 emission reduction target in the transport sector in 2020, varying from 5% to 30% reductions from the 2005 level, are established. The paper preliminarily analyzes the shares of emission reductions and the abatement costs needed to meet the targets for key developed countries, including the USA, EU-15, Russia, Japan and Canada. To analyze the impacts of the proposed approach, we generate sectoral marginal abatement cost (MAC) curves by region through extending a top-down economic model, namely the AIM/CGE model. The total emission reduction targets are analyzed against the developed MAC curves for the transport sector in order to obtain an equal marginal abatement cost, which derives the optimal emission reduction for each country and minimizes the total abatement cost. The results indicate that the USA will play a crucial role in GHG mitigation in the transport sector, as it is responsible for the largest share of emission reductions (accounting for more than 70%), while Japan will reduce the least (accounting for about 3%) for all scenarios. In the case of a 5% reduction, the total abatement is equal to 171.1 MtCO2 with a total cost of 1.61 billion USD; and in the case of a 30
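The equal-marginal-abatement-cost step can be sketched numerically for linear MAC curves MAC_i(q) = a_i * q: a common carbon price p clears the total target Q when each region abates q_i = p / a_i. The slopes below are invented and do not reproduce the AIM/CGE-derived curves or the paper's percentages; they only show why the region with the flattest curve shoulders the largest share.

```python
# Equalize marginal abatement cost across regions with linear MAC curves.
def equalize_mac(slopes, total_target):
    # Price solving sum_i(p / a_i) = Q.
    p = total_target / sum(1.0 / a for a in slopes.values())
    shares = {c: p / a for c, a in slopes.items()}
    # Total cost is the area under each MAC curve: a_i * q_i^2 / 2.
    cost = sum(a * shares[c] ** 2 / 2.0 for c, a in slopes.items())
    return p, shares, cost

# Hypothetical MAC slopes (USD per tCO2 per MtCO2 abated); a flatter curve
# means cheaper abatement, so that region abates more at the common price.
slopes = {"USA": 0.5, "EU-15": 1.0, "Japan": 4.0, "Canada": 2.0}

price, shares, cost = equalize_mac(slopes, total_target=100.0)
print(price, shares, cost)
```

At the common price the shares automatically sum to the target, and no reshuffling of abatement between regions can lower the total cost, which is the optimality condition the paper applies to its transport-sector MAC curves.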
Numerical approach to optimal portfolio in a power utility regime-switching model
Gyulov, Tihomir B.; Koleva, Miglena N.; Vulkov, Lubin G.
2017-12-01
We consider a system of weakly coupled degenerate semi-linear parabolic equations of optimal portfolio in a regime-switching with power utility function, derived by A.R. Valdez and T. Vargiolu [14]. First, we discuss some basic properties of the solution of this system. Then, we develop and analyze implicit-explicit, flux limited finite difference schemes for the differential problem. Numerical experiments are discussed.
Gallego, C.; Costa, A.; Cuerva, A.
2010-09-01
A single-ANN model (without regime classification) is adopted as a reference model. Both models are evaluated in terms of Improvement over Persistence on a Mean Square Error basis (IoP%) when predicting horizons from 1 time-step to 5. The case of a wind farm located in the complex terrain of Alaiz (north of Spain) has been considered. Three years of available power output data with an hourly resolution have been employed: two years for training and validation of the model and the last year for assessing the accuracy. Results showed that the RS-ANN overcame the single-ANN model for one-step-ahead forecasts: the overall IoP% was up to 8.66% for the RS-ANN model (depending on the gradient criterion selected to consider the ramp regime triggered) and 6.16% for the single-ANN. However, both models showed similar accuracy for larger horizons. A locally-weighted evaluation during ramp events for one step ahead was also performed. It was found that the IoP% during ramp-up events increased from 17.60% (single-ANN) to 22.25% (RS-ANN); during ramp-down events, the improvement increased from 18.55% to 19.55%. Three main conclusions are derived from this case study. First, it highlights the importance of considering statistical models capable of differentiating the several regimes shown by the output power time series, in order to improve forecasting during extreme events like ramps. Second, on-line regime classification based on available power output data did not seem to improve forecasts for horizons beyond one step ahead; taking into account other explanatory variables (local wind measurements, NWP outputs) could lead to a better understanding of ramp events, improving the regime assessment for further horizons as well. Third, the RS-ANN model slightly overcame the single-ANN during ramp-down events; if further research reinforces this effect, special attention should be addressed to understanding the underlying processes during ramp-down events.
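The headline metric can be sketched directly: Improvement over Persistence (IoP%) compares a model's mean squared error with that of the persistence forecast (next value = current value), so a positive IoP% means the model beats the naive baseline. The toy power series and forecasts below are invented.

```python
# One-step-ahead Improvement over Persistence on an MSE basis.
def mse(obs, pred):
    return sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs)

def iop_percent(series, model_forecasts):
    """Positive IoP% means the model beats the persistence forecast."""
    obs = series[1:]
    persistence = series[:-1]            # forecast y_hat(t+1) = y(t)
    return 100.0 * (1.0 - mse(obs, model_forecasts) / mse(obs, persistence))

power = [10.0, 12.0, 30.0, 28.0, 15.0]   # hourly power output with a ramp
model = [11.5, 24.0, 29.0, 20.0]         # some model's one-step forecasts

print(iop_percent(power, model))
```

Persistence is hardest to beat on smooth stretches and easiest during ramps, which is why ramp-focused, locally-weighted versions of the same score are reported separately in the study.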
Projected estimators for robust semi-supervised classification
DEFF Research Database (Denmark)
Krijthe, Jesse H.; Loog, Marco
2017-01-01
For semi-supervised techniques to be applied safely in practice we at least want methods to outperform their supervised counterparts. We study this question for classification using the well-known quadratic surrogate loss function. Unlike other approaches to semi-supervised learning, the procedure ... More specifically, we prove that, measured on the labeled and unlabeled training data, this semi-supervised procedure never gives a lower quadratic loss than the supervised alternative. To our knowledge this is the first approach that offers such strong, albeit conservative, guarantees for improvement over the supervised solution. The characteristics of our approach are explicated using benchmark datasets to further understand the similarities and differences between the quadratic loss criterion used in the theoretical results and the classification accuracy typically considered in practice.
Classifications of objects on hyperspectral images
DEFF Research Database (Denmark)
Kucheryavskiy, Sergey
In the present work a classification method that combines the classic image classification approach and MIA is proposed. The basic idea is to group all pixels and calculate spectral properties of the pixel group to be used further as a vector of predictors for calibration and class prediction. The grouping can be done with mathematical morphology methods applied to a score image where objects are well separated. In the case of small overlapping, a watershed transformation can be applied to disjoint the objects. The method has been tested on several simulated and real cases and showed good results and significant improvements in comparison with a standard MIA approach. The results as well as method details will be reported.
Raster Vs. Point Cloud LiDAR Data Classification
El-Ashmawy, N.; Shaker, A.
2014-09-01
Airborne Laser Scanning systems with light detection and ranging (LiDAR) technology are one of the fast and accurate 3D point data acquisition techniques. Generating accurate digital terrain and/or surface models (DTM/DSM) is the main application of collecting LiDAR range data. Recently, LiDAR range and intensity data have been used for land cover classification applications. Data range and intensity (the strength of the backscattered signals measured by the LiDAR systems) are affected by the flying height, the ground elevation, the scanning angle and the physical characteristics of the object surfaces. These effects may lead to an uneven distribution of the point cloud or to gaps that may affect the classification process. Researchers have investigated the conversion of LiDAR range point data to raster images for terrain modelling. Interpolation techniques have been used to achieve the best representation of surfaces, and to fill the gaps between the LiDAR footprints. Interpolation methods have also been investigated to generate LiDAR range and intensity image data for land cover classification applications. In this paper, a different approach has been followed to classify the LiDAR data (range and intensity) for land cover mapping. The methodology relies on classifying the point cloud data based on their range and intensity and then converting the classified points into a raster image. The gaps in the data are filled based on the classes of the nearest neighbour. Land cover maps are produced using two approaches: (a) the conventional raster image data based on point interpolation; and (b) the proposed point data classification. A study area covering an urban district in Burnaby, British Columbia, Canada, is selected to compare the results of the two approaches. Five different land cover classes can be distinguished in that area: buildings, roads and parking areas, trees, low vegetation (grass), and bare soil. The results show an improvement of around 10% in the
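The proposed order of operations (classify the returns first, rasterize the labels second, then fill empty cells from the nearest classified neighbour) can be sketched as follows. The points, height/intensity thresholds and class rules below are invented for illustration, not the Burnaby study's classifier.

```python
# Classify-then-rasterize sketch with nearest-neighbour gap filling.
def classify_point(height, intensity):
    """Toy rule on height above ground and return intensity."""
    if height > 3.0:
        return "building" if intensity < 50 else "tree"
    return "grass" if intensity >= 50 else "soil"

def rasterize(points, size):
    grid = [[None] * size for _ in range(size)]
    for x, y, h, inten in points:
        grid[y][x] = classify_point(h, inten)
    # Fill empty cells from the nearest labelled cell (brute-force search).
    labelled = [(i, j) for i in range(size) for j in range(size) if grid[i][j]]
    for i in range(size):
        for j in range(size):
            if grid[i][j] is None:
                ni, nj = min(labelled,
                             key=lambda c: (c[0] - i) ** 2 + (c[1] - j) ** 2)
                grid[i][j] = grid[ni][nj]
    return grid

# (x, y, height above ground, intensity) returns on a 3 x 3 cell grid;
# cell (x=2, y=1) holds no return and must be filled from its neighbours.
points = [(0, 0, 0.2, 80), (1, 0, 0.1, 75), (2, 0, 0.3, 20),
          (0, 1, 6.0, 30), (1, 1, 7.5, 90),
          (0, 2, 0.0, 10), (1, 2, 5.5, 85), (2, 2, 4.8, 40)]

grid = rasterize(points, 3)
print(grid)
```

Filling gaps with a class label rather than an interpolated intensity is the key difference from the conventional raster-first workflow: class boundaries stay crisp instead of being blurred by interpolation.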
2010-01-01
Background Theory in ecology points out the potential link between the degree of specialisation of organisms and their responses to disturbances, and suggests that this could be a key element for understanding the assembly of communities. We evaluated this question for the arable weed flora, as this group has scarcely been the focus of ecological studies so far and because weeds are restricted to habitats characterised by very high degrees of disturbance. As such, weeds offer a case study to ask how specialisation relates to the abundance and distribution of species in relation to the varying disturbance regimes occurring in arable crops. Results We used data derived from an extensive national monitoring network of approximately 700 arable fields scattered across France to quantify the degree of specialisation of 152 weed species using six different ecological methods. We then explored the impact of the level of disturbance occurring in arable fields by comparing the degree of specialisation of weed communities in contrasting field situations. The classification of species as specialist or generalist was consistent between different ecological indices. When applied to a large-scale data set across France, this classification highlighted that monocultures harbour significantly more specialists than crop rotations, suggesting that crop rotation increases the abundance of generalist species rather than sets of species that are each specialised to the individual crop types grown in the rotation. Applied to a diachronic dataset, the classification also shows that the proportion of specialist weed species has significantly decreased in cultivated fields over the last 30 years, which suggests a biotic homogenization of agricultural landscapes. Conclusions This study shows that the concept of generalist/specialist species is particularly relevant for understanding the effect of anthropogenic disturbances on the evolution of plant community composition and that ecological theories
[The importance of classifications in psychiatry].
Lempérière, T
1995-12-01
The classifications currently used in psychiatry have different aims: to facilitate communication between researchers and clinicians at national and international levels through the use of a common language, or at least a clearly and precisely defined nomenclature; to provide a nosographical reference system which can be used in practice (diagnosis, prognosis, treatment); to optimize research by ensuring that sample cases are as homogeneous as possible; to facilitate statistical records for public health institutions. A classification is of practical interest only if it is reliable, valid and acceptable to all potential users. In recent decades, there has been a considerable systematic and coordinated effort to improve the methodological approach to classification and categorization in the field of psychiatry, including attempts to create operational definitions, field trials of inter-assessor reliability, attempts to validate the selected nosological categories by analysis of correlation between progression, treatment response, family history and additional examinations. The introduction of glossaries, and particularly of diagnostic criteria, marked a decisive step in this new approach. The key problem remains that of the validity of diagnostic criteria. Ideally, these should be based on demonstrable etiologic or pathogenic data, but such information is rarely available in psychiatry. Current classifications rely on the use of extremely diverse elements in differing degrees: descriptive criteria, evolutive criteria, etiopathogenic criteria, psychopathogenic criteria, etc. Certain syndrome-based classifications such as DSM III and its successors aim to be atheoretical and pragmatic. Others, such as ICD-10, while more eclectic than the different versions of DSM, follow suit by abandoning the terms "disease" and "illness" in favor of the more consensual "disorder". The legitimacy of classifications in the field of psychiatry has been fiercely contested, being
Discriminative Bayesian Dictionary Learning for Classification.
Akhtar, Naveed; Shafait, Faisal; Mian, Ajmal
2016-12-01
We propose a Bayesian approach to learn discriminative dictionaries for sparse representation of data. The proposed approach infers probability distributions over the atoms of a discriminative dictionary using a finite approximation of the Beta Process. It also computes sets of Bernoulli distributions that associate class labels to the learned dictionary atoms. This association signifies the selection probabilities of the dictionary atoms in the expansion of class-specific data. Furthermore, the non-parametric character of the proposed approach allows it to infer the correct size of the dictionary. We exploit the aforementioned Bernoulli distributions in separately learning a linear classifier. The classifier uses the same hierarchical Bayesian model as the dictionary, which we present along with the analytical inference solution for Gibbs sampling. For classification, a test instance is first sparsely encoded over the learned dictionary and the codes are fed to the classifier. We performed experiments for face and action recognition, and object and scene-category classification, using five public datasets, and compared the results with state-of-the-art discriminative sparse representation approaches. Experiments show that the proposed Bayesian approach consistently outperforms the existing approaches.
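The classification pipeline this abstract describes — sparse-encode a test instance over the learned dictionary, then feed the codes to a linear classifier — can be sketched roughly as below. The greedy OMP-style encoder and the classifier matrix `W` are illustrative stand-ins; the paper's Beta Process dictionary inference and Gibbs sampler are not reproduced.

```python
import numpy as np

def sparse_code(D, x, k):
    """Greedy (OMP-style) sparse coding of x over dictionary D (atoms as columns)."""
    residual = x.astype(float).copy()
    support = []
    code = np.zeros(D.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))  # atom most correlated with residual
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)  # refit on support
        code[:] = 0.0
        code[support] = coef
        residual = x - D @ code
    return code

def classify(W, D, x, k=3):
    """Encode x over D, then apply a linear classifier W to the sparse code."""
    return int(np.argmax(W @ sparse_code(D, x, k)))
```

With an orthonormal dictionary the encoder recovers the generating atom exactly, which is a useful sanity check before trying learned dictionaries.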
Temporal Data Fusion Approaches to Remote Sensing-Based Wetland Classification
Montgomery, Joshua S. M.
This thesis investigates the ecology of wetlands and associated classification in prairie and boreal environments of Alberta, Canada, using remote sensing technology to enhance classification of wetlands in the province. Objectives of the thesis are divided into two case studies: 1) examining how satellite-borne Synthetic Aperture Radar (SAR) and optical (RapidEye & SPOT) imagery can be used to evaluate surface water trends in a prairie pothole environment (Shepard Slough); and 2) investigating a data fusion methodology combining SAR, optical and Lidar data to characterize wetland vegetation and surface water attributes in a boreal environment (Utikuma Regional Study Area (URSA)). Surface water extent and hydroperiod products were derived from SAR data and validated using optical imagery with high accuracies (76-97% overall) for both case studies. High-resolution Lidar Digital Elevation Model (DEM), Digital Surface Model (DSM), and Canopy Height Model (CHM) products provided the means for data fusion to extract riparian vegetation communities and surface water, producing model accuracies of R² = 0.90 for URSA, and RMSE of 0.2 m to 0.7 m at Shepard Slough when compared to field and optical validation data. Integration of the Alberta and Canadian wetland classification systems, used to classify and determine the economic value of wetlands, into the methodology produced thematic maps relevant for policy and decision makers for potential wetland monitoring and policy development.
Structural causes and regime consequences: regimes as intervening variables
Krasner, Stephen D.
2012-01-01
International regimes are defined as principles, norms, rules, and decision-making procedures around which actors' expectations converge in a given issue-area. As a starting point, regimes are conceptualized as intervening variables, standing between basic causal factors and the related outcomes and behaviors. There are three views concerning the importance of regimes: conventional structural orientations dismiss regimes as being, at best ...
Detecting spatial regimes in ecosystems | Science Inventory ...
Research on early warning indicators has generally focused on assessing temporal transitions with limited application of these methods to detecting spatial regimes. Traditional spatial boundary detection procedures that result in ecoregion maps are typically based on ecological potential (i.e. potential vegetation), and often fail to account for ongoing changes due to stressors such as land use change and climate change and their effects on plant and animal communities. We use Fisher information, an information theory based method, on both terrestrial and aquatic animal data (US Breeding Bird Survey and marine zooplankton) to identify ecological boundaries, and compare our results to traditional early warning indicators, conventional ecoregion maps, and multivariate analysis such as nMDS (non-metric Multidimensional Scaling) and cluster analysis. We successfully detect spatial regimes and transitions in both terrestrial and aquatic systems using Fisher information. Furthermore, Fisher information provided explicit spatial information about community change that is absent from other multivariate approaches. Our results suggest that defining spatial regimes based on animal communities may better reflect ecological reality than do traditional ecoregion maps, especially in our current era of rapid and unpredictable ecological change.
Deep learning for tumor classification in imaging mass spectrometry.
Behrmann, Jens; Etmann, Christian; Boskamp, Tobias; Casadonte, Rita; Kriegsmann, Jörg; Maaß, Peter
2018-04-01
Tumor classification using imaging mass spectrometry (IMS) data has a high potential for future applications in pathology. Due to the complexity and size of the data, automated feature extraction and classification steps are required to fully process the data. Since mass spectra exhibit certain structural similarities to image data, deep learning may offer a promising strategy for classification of IMS data as it has been successfully applied to image classification. Methodologically, we propose an adapted architecture based on deep convolutional networks to handle the characteristics of mass spectrometry data, as well as a strategy to interpret the learned model in the spectral domain based on a sensitivity analysis. The proposed methods are evaluated on two algorithmically challenging tumor classification tasks and compared to a baseline approach. Competitiveness of the proposed methods is shown on both tasks by studying the performance via cross-validation. Moreover, the learned models are analyzed by the proposed sensitivity analysis revealing biologically plausible effects as well as confounding factors of the considered tasks. Thus, this study may serve as a starting point for further development of deep learning approaches in IMS classification tasks. https://gitlab.informatik.uni-bremen.de/digipath/Deep_Learning_for_Tumor_Classification_in_IMS. jbehrmann@uni-bremen.de or christianetmann@uni-bremen.de. Supplementary data are available at Bioinformatics online.
The research on business rules classification and specification methods
Baltrušaitis, Egidijus
2005-01-01
The work is based on research into business rules classification and specification methods. The basics of the business rules approach are discussed. The most common business rules classification and modeling methods are analyzed. Business rules modeling techniques, and tools for supporting them in information systems, are presented. Based on the analysis results, a business rules classification method is proposed. Templates for every business rule type are presented. Business rules structuring ...
Recognition Using Classification and Segmentation Scoring
National Research Council Canada - National Science Library
Kimball, Owen; Ostendorf, Mari; Rohlicek, Robin
1992-01-01
We describe an approach to connected word recognition that allows the use of segmental information through an explicit decomposition of the recognition criterion into classification and segmentation scoring ...
Directory of Open Access Journals (Sweden)
Fan Yang
2015-07-01
Full Text Available. Normally, polarimetric SAR classification is a high-dimensional nonlinear mapping problem. In the realm of pattern recognition, sparse representation is a very efficacious and powerful approach. As classical descriptors of polarimetric SAR, covariance and coherency matrices are Hermitian semidefinite and form a Riemannian manifold. Conventional Euclidean metrics are not suitable for a Riemannian manifold, and hence normal sparse representation classification cannot be applied to polarimetric SAR directly. This paper proposes a new land cover classification approach for polarimetric SAR. There are two principal novelties in this paper. First, a Stein kernel on a Riemannian manifold, instead of Euclidean metrics, is combined with sparse representation and employed for polarimetric SAR land cover classification. This approach is named Stein-sparse representation-based classification (Stein-SRC). Second, using simultaneous sparse representation and reasonable assumptions on the correlation of representations among different frequency bands, Stein-SRC is generalized to simultaneous Stein-SRC for multi-frequency polarimetric SAR classification. These classifiers are assessed using polarimetric SAR images from the Airborne Synthetic Aperture Radar (AIRSAR) sensor of the Jet Propulsion Laboratory (JPL) and the Electromagnetics Institute Synthetic Aperture Radar (EMISAR) sensor of the Technical University of Denmark (DTU). Experiments on single-band and multi-band data both show that these approaches acquire more accurate classification results in comparison to many conventional and advanced classifiers.
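A minimal sketch of the Stein divergence that replaces Euclidean metrics on the manifold of symmetric positive-definite (SPD) matrices, and the kernel built from it. The formula used here is the standard Jensen-Bregman LogDet form, and the bandwidth θ is an assumed free parameter; the paper's full Stein-SRC classifier is not reproduced.

```python
import numpy as np

def stein_divergence(X, Y):
    """Stein (Jensen-Bregman LogDet) divergence between SPD matrices X and Y."""
    _, ld_avg = np.linalg.slogdet((X + Y) / 2.0)
    _, ld_x = np.linalg.slogdet(X)
    _, ld_y = np.linalg.slogdet(Y)
    return ld_avg - 0.5 * (ld_x + ld_y)

def stein_kernel(X, Y, theta=1.0):
    """Kernel value exp(-theta * S(X, Y)); theta is an assumed bandwidth."""
    return np.exp(-theta * stein_divergence(X, Y))
```

The divergence is symmetric, zero for identical matrices, and positive otherwise, which is what makes it a usable dissimilarity on covariance/coherency descriptors.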
An Approach for Leukemia Classification Based on Cooperative Game Theory
Directory of Open Access Journals (Sweden)
Atefeh Torkaman
2011-01-01
Hematological malignancies are the types of cancer that affect blood, bone marrow and lymph nodes. As these tissues are naturally connected through the immune system, a disease affecting one of them will often affect the others as well. The hematological malignancies include leukemia, lymphoma and multiple myeloma. Among them, leukemia is a serious malignancy that starts in blood tissues, especially the bone marrow, where the blood is made. Research shows that leukemia is one of the most common cancers in the world, so an emphasis on diagnostic techniques and the best treatments can provide better prognosis and survival for patients. In this paper, an automatic diagnosis recommender system for classifying leukemia based on cooperative game theory is presented. Throughout this research, we analyze flow cytometry data toward the classification of leukemia into eight classes. We work on a real data set of different types of leukemia collected at the Iran Blood Transfusion Organization (IBTO). In total, the data set contains 400 samples taken from human leukemic bone marrow. This study uses a cooperative game for classification according to the different weights assigned to the markers. The proposed method is versatile, as there are no constraints on what the input or output represent; this means that it can be used to classify a population according to their contributions and applies equally to other groups of data. The experimental results show an accuracy rate of 93.12% for classification, compared to 90.16% for a decision tree (C4.5). The results demonstrate that cooperative game theory is very promising for direct classification of leukemia as part of an active medical decision support system for the interpretation of flow cytometry readouts. This system could assist clinical hematologists to properly recognize different kinds of leukemia by preparing suggestions, and this could improve the treatment
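One concrete way to turn a cooperative game into per-marker weights, as the abstract suggests, is the Shapley value, which is exactly enumerable for a handful of markers. The characteristic function `v` below is a hypothetical stand-in for a marker-subset score; the paper's actual game is not reproduced.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, v):
    """Exact Shapley value of each player for characteristic function v(set) -> float."""
    n = len(players)
    phi = {p: 0.0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for r in range(n):
            for S in combinations(others, r):
                # weight |S|! (n-|S|-1)! / n! times the marginal contribution of p
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                phi[p] += w * (v(set(S) | {p}) - v(set(S)))
    return phi
```

By efficiency, the values sum to the worth of the full coalition, so they can be read directly as a weight decomposition over the markers.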
A NOVEL APPROACH TO ARRHYTHMIA CLASSIFICATION USING RR INTERVAL AND TEAGER ENERGY
Directory of Open Access Journals (Sweden)
CHANDRAKAR KAMATH
2012-12-01
It is hypothesized that a key characteristic of the electrocardiogram (ECG) signal is its nonlinear dynamic behaviour, and that the nonlinear component changes more significantly between normal and arrhythmia conditions than the linear component. The usual statistical descriptors used in RR (R-to-R) interval analysis do not capture the nonlinear disposition of RR interval variability. In this paper we explore a novel approach to extract features from the nonlinear component of the RR interval signal using the Teager energy operator (TEO). The key feature of Teager energy is that it models the energy of the source that generated the signal rather than the energy of the signal itself. Hence any deviations in the regular rhythmic activity of the heart are reflected in the Teager energy function. The classification, evaluated on the MIT-BIH database with the RR interval and the mean of the Teager energy computed over the RR interval as features, exhibits an average accuracy that exceeds 99.79%.
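The discrete Teager energy operator behind these features is ψ[n] = x[n]² − x[n−1]·x[n+1]; for a pure tone A·sin(ωn) it evaluates exactly to the constant A²·sin²(ω), so deviations from regular rhythmic activity stand out. A minimal sketch:

```python
import numpy as np

def teager_energy(x):
    """Discrete Teager energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
    x = np.asarray(x, dtype=float)
    # defined for interior samples only, so the output is 2 samples shorter
    return x[1:-1] ** 2 - x[:-2] * x[2:]
```

A feature vector in the spirit of the abstract would then pair each RR interval with the mean of `teager_energy` over that interval.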
Linda Tedrow; Wendel J. Hann
2015-01-01
The Fire Regime Condition Class (FRCC) is a composite departure measure that compares current vegetation structure and fire regime to historical reference conditions. FRCC is computed as the average of: 1) Vegetation departure (VDEP) and 2) Regime (frequency and severity) departure (RDEP). In addition to the FRCC rating, the Vegetation Condition Class (VCC) and Regime...
Which way forward : issues in developing an effective climate regime after 2012
International Nuclear Information System (INIS)
Cosbey, A.; Bell, W.; Murphy, D.; Parry, J.E.; Drexhage, J.; Hammill, A.; Van Ham, J.
2005-01-01
This book proposed that a post-2012 climate regime will need to balance the needs of all countries while aiming to prevent the potentially serious economic and social consequences of the impacts of climate change. Four elements were presented to support the emergence of an internationally acceptable approach: (1) the need to ensure sustainable economic development; (2) the effective development and penetration of clean technologies; (3) the establishment of an effective international carbon market over the long term; and (4) the integration of adaptation in development and natural resource management decision-making. A series of discussion papers were presented which reviewed options on how best to create an effective and inclusive international climate regime that will achieve large reductions in global emissions and equitably reflect the diverse circumstances of countries while promoting sustainable economic development. The first paper highlighted some of the characteristics of an international policy framework for cooperatively engaging the best tools of the scientific and policy communities to address challenges over the long and short term. The second paper examined how a post-2012 global climate regime could promote the development, deployment and diffusion of the appropriate technologies expected to play a critical role in mitigating and adapting to climate change. The third paper examined market-based approaches to enable cost-effective reductions and increase the feasibility of achieving long-term reductions as well as the promotion and development of low carbon energy technologies. The final paper examined research and policy developments relevant to determining how a future regime could support a long-term, integrated approach to addressing adaptation to climate change by all countries. refs., tabs., figs
Toward noncooperative iris recognition: a classification approach using multiple signatures.
Proença, Hugo; Alexandre, Luís A
2007-04-01
This paper focuses on noncooperative iris recognition, i.e., the capture of iris images at large distances, under less controlled lighting conditions, and without active participation of the subjects. This increases the probability of capturing very heterogeneous images (regarding focus, contrast, or brightness) and with several noise factors (iris obstructions and reflections). Current iris recognition systems are unable to deal with noisy data and substantially increase their error rates, especially the false rejections, in these conditions. We propose an iris classification method that divides the segmented and normalized iris image into six regions, makes an independent feature extraction and comparison for each region, and combines each of the dissimilarity values through a classification rule. Experiments show a substantial decrease, higher than 40 percent, of the false rejection rates in the recognition of noisy iris images.
Mapping of the Universe of Knowledge in Different Classification Schemes
Directory of Open Access Journals (Sweden)
M. P. Satija
2017-06-01
Given the variety of approaches to mapping the universe of knowledge that have been presented and discussed in the literature, the purpose of this paper is to systematize their main principles and their applications in the major general modern library classification schemes. We conducted an analysis of the literature on classification and the main classification systems, namely the Dewey/Universal Decimal Classification, Cutter's Expansive Classification, the Subject Classification of J.D. Brown, Colon Classification, Library of Congress Classification, Bibliographic Classification, Rider's International Classification, the Bibliothecal Bibliographic Klassification (BBK), and the Broad System of Ordering (BSO). We conclude that the arrangement of the main classes can be done following four principles that are not mutually exclusive: the ideological principle, the social purpose principle, scientific order, and division by discipline. The paper provides examples and analysis of each system. We also conclude that as knowledge is ever-changing, classifications also change and present a different structure of knowledge depending upon the society and time of their design.
International Nuclear Information System (INIS)
Luke, S.J.
2011-01-01
This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers when used with a measurement system allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the
Observation versus classification in supervised category learning.
Levering, Kimery R; Kurtz, Kenneth J
2015-02-01
The traditional supervised classification paradigm encourages learners to acquire only the knowledge needed to predict category membership (a discriminative approach). An alternative that aligns with important aspects of real-world concept formation is learning with a broader focus to acquire knowledge of the internal structure of each category (a generative approach). Our work addresses the impact of a particular component of the traditional classification task: the guess-and-correct cycle. We compare classification learning to a supervised observational learning task in which learners are shown labeled examples but make no classification response. The goals of this work sit at two levels: (1) testing for differences in the nature of the category representations that arise from two basic learning modes; and (2) evaluating the generative/discriminative continuum as a theoretical tool for understanding learning modes and their outcomes. Specifically, we view the guess-and-correct cycle as consistent with a more discriminative approach and therefore expected it to lead to narrower category knowledge. Across two experiments, the observational mode led to greater sensitivity to distributional properties of features and correlations between features. We conclude that a relatively subtle procedural difference in supervised category learning substantially impacts what learners come to know about the categories. The results demonstrate the value of the generative/discriminative continuum as a tool for advancing the psychology of category learning and also provide a valuable constraint for formal models and associated theories.
Hogenboom, A.C.; Ketter, W.; Dalen, van Jan; Kaymak, U.; Collins, J.; Gupta, Alok
2009-01-01
Dynamic product pricing is a vital, yet non-trivial task in complex supply chains -- especially in case of limited visibility of the market environment. We propose to differentiate product pricing strategies using economic regimes. In our approach, we use economic regimes (characterizing market
Automated classification of cell morphology by coherence-controlled holographic microscopy.
Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim
2017-08-01
In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy, enabling quantitative phase imaging, to the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, while employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could be a valuable aid in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all preconditions for the accurate automated analysis of live cell behavior while enabling noninvasive label-free imaging with sufficient contrast and high spatiotemporal phase sensitivity. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
The Treaty of Lisbon and the European Border Control Regime
Directory of Open Access Journals (Sweden)
Marianne Takle
2012-08-01
The question raised in the article is how the new provisions of the Lisbon Treaty and the Stockholm programme concerning the EU's asylum and migration policy might consolidate existing trends within the European border control regime. The regime is defined by a combination of three features: (i) a harmonisation of categories among the EU/Schengen member states, (ii) a growing use of new technology in networked databases and (iii) an increasing sorting of individuals based on security concerns. Although none of these features is new, the combination gives a new impetus to the European border control regime. The article concludes that the Lisbon Treaty and the Stockholm programme consolidate and strengthen existing trends. This implies that policies on border control, asylum, immigration, judicial cooperation and police cooperation are consolidated in a broader approach to border control, and that there is a strengthening of EU foreign policy within the European border control regime. The boundaries between previously dispersed policy areas are blurred. The combination of different aspects of security and various levels of authority requires coordination of policies with substantially different goals, and goes beyond mere border control.
A Classification-based Review Recommender
O'Mahony, Michael P.; Smyth, Barry
Many online stores encourage their users to submit product/service reviews in order to guide future purchasing decisions. These reviews are often listed alongside product recommendations but, to date, limited attention has been paid as to how best to present these reviews to the end-user. In this paper, we describe a supervised classification approach that is designed to identify and recommend the most helpful product reviews. Using the TripAdvisor service as a case study, we compare the performance of several classification techniques using a range of features derived from hotel reviews. We then describe how these classifiers can be used as the basis for a practical recommender that automatically suggests the most helpful contrasting reviews to end-users. We present an empirical evaluation which shows that our approach achieves a statistically significant improvement over alternative review ranking schemes.
Weather regimes in the South American sector and neighbouring oceans during winter
Energy Technology Data Exchange (ETDEWEB)
Solman, S.A.; Menendez, C.G. [Centro de Investigaciones del Mar y la Atmosfera (CIMA-CONICET/UBA), Ciudad Universitaria, Buenos Aires (Argentina)
2003-07-01
We classified 34 years of winter daily 500 hPa geopotential height patterns over the eastern South Pacific-South America-South Atlantic region using the K-means clustering method. We found a significant classification into five weather regimes (WRs) defined as the most frequent large-scale circulation anomalies: WR1 (trough centred downstream of the Drake Passage), WR2 (trough over the SW Pacific and ridge downstream), WR3 (ridge over the SE Pacific and NW-SE trough downstream), WR4 (trough over the SE Pacific and NW-SE ridge downstream) and WR5 (weak ridge to the west of southern South America). We also analysed their persistence and temporal evolution, including transitions between them and development around onsets and breaks of each regime. The preferred transitions, WR1→WR3→WR2→WR4→WR1 and also WR1→WR3→WR2→WR1, suggest the progression of a Rossby wave-like pattern in which each of the regimes resembles the Pacific-South America modes. A significant influence of the WRs on local climate over Argentina was found. The preferred transitions WR1→WR3 and WR3→WR2 induce sustained cold conditions over Patagonia and over northern Argentina, respectively. The most significant change in precipitation frequency is found for WR3, with wetter conditions over all the analysed regions. Finally, the significant interannual to interdecadal variations in the occurrence of these regimes were discussed. WR1 and WR3 are more frequent and WR2 is less frequent during El Niño, and WR2 and WR5 are more frequent and WR1 is less frequent during La Niña. A significant decrease in WR2 and an increase of WR4 and WR5 during the 1970s and early 1980s were found. (orig.)
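The clustering step can be sketched with a minimal, deterministic K-means (farthest-point initialisation plus Lloyd's iterations) over flattened height-anomaly fields. Real regime studies typically cluster EOF-filtered anomalies and test the significance of the classification; this sketch omits both.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Minimal K-means: each row of X is one (flattened) daily anomaly field."""
    # deterministic farthest-point initialisation
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        # assign each day to its nearest regime centroid
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        # update each centroid as the mean of its assigned days
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

Each resulting centroid is the composite circulation anomaly of one weather regime; transition statistics follow from the day-to-day label sequence.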
Tarai, Madhumita; Kumar, Keshav; Divya, O.; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar
2017-09-01
The present work compares the dissimilarity and covariance based unsupervised chemometric classification approaches by taking the total synchronous fluorescence spectroscopy data sets acquired for the cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that can explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix.
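The two decompositions compared above can be sketched side by side: eigenanalysis of the covariance matrix (conventional PCA scores) versus eigenanalysis of the double-centred pair-wise dissimilarity matrix (classical MDS). For Euclidean dissimilarities the latter reproduces the inter-sample distances exactly; the paper's specific dissimilarity measure for fluorescence spectra is not reproduced here.

```python
import numpy as np

def pca_scores(X, n=2):
    """Conventional approach: eigendecomposition of the covariance of the data."""
    Xc = X - X.mean(axis=0)
    w, V = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(w)[::-1]          # largest-variance factors first
    return Xc @ V[:, order[:n]]

def mds_scores(D, n=2):
    """Dissimilarity-based approach: classical MDS on a pair-wise dissimilarity matrix."""
    m = len(D)
    J = np.eye(m) - np.ones((m, m)) / m  # centring matrix
    B = -0.5 * J @ (D ** 2) @ J          # double-centred squared dissimilarities
    w, V = np.linalg.eigh(B)
    order = np.argsort(w)[::-1]
    return V[:, order[:n]] * np.sqrt(np.maximum(w[order[:n]], 0.0))
```

Class separation can then be compared by plotting the first two scores from each method for the grouped samples.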
Managò, Stefano; Valente, Carmen; Mirabelli, Peppino; Circolo, Diego; Basile, Filomena; Corda, Daniela; de Luca, Anna Chiara
2016-04-01
Acute lymphoblastic leukemia type B (B-ALL) is a neoplastic disorder that shows high mortality rates due to immature lymphocyte B-cell proliferation. B-ALL diagnosis requires identification and classification of the leukemia cells. Here, we demonstrate the use of Raman spectroscopy to discriminate normal lymphocytic B-cells from three different B-leukemia transformed cell lines (i.e., RS4;11, REH, MN60 cells) based on their biochemical features. In combination with immunofluorescence and Western blotting, we show that these Raman markers reflect the relative changes in the potential biological markers from cell surface antigens, cytoplasmic proteins, and DNA content and correlate with the lymphoblastic B-cell maturation/differentiation stages. Our study demonstrates the potential of this technique for classification of B-leukemia cells into the different differentiation/maturation stages, as well as for the identification of key biochemical changes under chemotherapeutic treatments. Finally, preliminary results from clinical samples indicate high consistency of, and potential applications for, this Raman spectroscopy approach.
An Active Learning Framework for Hyperspectral Image Classification Using Hierarchical Segmentation
Zhang, Zhou; Pasolli, Edoardo; Crawford, Melba M.; Tilton, James C.
2015-01-01
Augmenting spectral data with spatial information for image classification has recently gained significant attention, as classification accuracy can often be improved by extracting spatial information from neighboring pixels. In this paper, we propose a new framework in which active learning (AL) and hierarchical segmentation (HSeg) are combined for spectral-spatial classification of hyperspectral images. The spatial information is extracted from a best segmentation obtained by pruning the HSeg tree using a new supervised strategy. The best segmentation is updated at each iteration of the AL process, thus taking advantage of informative labeled samples provided by the user. The proposed strategy incorporates spatial information in two ways: 1) concatenating the extracted spatial features and the original spectral features into a stacked vector and 2) extending the training set using a self-learning-based semi-supervised learning (SSL) approach. Finally, the two strategies are combined within an AL framework. The proposed framework is validated with two benchmark hyperspectral datasets. Higher classification accuracies are obtained by the proposed framework with respect to five other state-of-the-art spectral-spatial classification approaches. Moreover, the effectiveness of the proposed pruning strategy is also demonstrated relative to the approaches based on a fixed segmentation.
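The active learning loop described above can be sketched in a toy form. The nearest-centroid classifier and the margin-based uncertainty criterion below stand in for the paper's classifier and query strategy, the synthetic two-class "pixels" are hypothetical, and the hierarchical segmentation step is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical "pixels": two spectral classes in a 5-band feature space
X = np.vstack([rng.normal(-1, 1, (200, 5)), rng.normal(1, 1, (200, 5))])
y = np.array([0] * 200 + [1] * 200)

# Small initial labeled set with both classes represented
labeled = list(rng.choice(200, 5, replace=False)) + \
          list(200 + rng.choice(200, 5, replace=False))
pool = [i for i in range(len(X)) if i not in labeled]

for _ in range(5):  # active learning iterations
    # Nearest-centroid classifier stands in for the paper's classifier
    c0 = X[[i for i in labeled if y[i] == 0]].mean(axis=0)
    c1 = X[[i for i in labeled if y[i] == 1]].mean(axis=0)
    d0 = np.linalg.norm(X[pool] - c0, axis=1)
    d1 = np.linalg.norm(X[pool] - c1, axis=1)
    # Query the most uncertain sample: smallest margin between the two classes
    query = pool[int(np.argmin(np.abs(d0 - d1)))]
    labeled.append(query)
    pool.remove(query)

print(len(labeled), len(pool))  # 15 labeled, 385 remaining
```

Each queried label would, in the full framework, also trigger an update of the best segmentation before the next iteration.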
Enterprise Potential: Essence, Classification and Types
Directory of Open Access Journals (Sweden)
Turylo Anatolii M.
2014-02-01
Full Text Available The article considers existing approaches to classification of enterprise potential as an economic notion. It offers its own classification of enterprise potential, one that reflects modern tendencies in enterprise development. Classification makes possible a wider description and assessment of enterprise potential and also allows identification of its most significant characteristics. The classification of enterprise potential is developed by different criteria: by functions, by resource support, by ability to adapt, by the level of detection, by the spectrum of possibilities taken into account, by the period of coverage of possibilities, and by the level of use. Analysis of the components of enterprise potential allows obtaining a complete and trustworthy assessment of the state of an enterprise. The adaptation potential of an enterprise is based on the principles of systemicity and dynamism; it characterises the possibilities of adjustment of an enterprise to external and internal economic conditions.
Automated Classification of Asteroids into Families at Work
Knežević, Zoran; Milani, Andrea; Cellino, Alberto; Novaković, Bojan; Spoto, Federica; Paolicchi, Paolo
2014-07-01
We have recently proposed a new approach to asteroid family classification, combining the classical HCM method with an automated procedure for adding newly discovered members to existing families. This approach is specifically intended to cope with ever-increasing asteroid data sets, and consists of several steps that segment the problem and handle the very large amount of data in an efficient and accurate manner. We briefly present all these steps and show the results from three subsequent updates making use of only the automated step of attributing the newly numbered asteroids to the known families. We describe the changes in the membership of individual families, as well as the evolution of the classification due to newly added intersections between families, resolved candidate family mergers, and the emergence of new merger candidates. We thus demonstrate how, with the new approach, the asteroid family classification becomes stable in general terms (converging towards a permanent list of confirmed families) while at the same time evolving in detail (to account for newly discovered asteroids) at each update.
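The automated attribution step can be sketched as nearest-neighbour attachment in proper-element space. The element values, family names, cutoff and the plain Euclidean metric below are hypothetical stand-ins for the weighted HCM distance metric actually used:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical proper elements (a, e, sin i) for two known family cores
families = {"F1": rng.normal([2.3, 0.10, 0.05], 0.01, (30, 3)),
            "F2": rng.normal([3.1, 0.05, 0.15], 0.01, (30, 3))}

def attribute(body, cutoff=0.05):
    """Attach a newly numbered body to the family with the nearest member,
    if that distance is below the cutoff; otherwise leave it as background."""
    best = min((np.linalg.norm(members - body, axis=1).min(), name)
               for name, members in families.items())
    return best[1] if best[0] < cutoff else "background"

print(attribute(np.array([2.3, 0.1, 0.05])), attribute(np.array([2.7, 0.2, 0.4])))
```

Attributed bodies would then be merged into the family lists before the next update, which is what lets the classification evolve in detail while staying stable overall.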
Paula Durkin; Esteban Muldavin; Mike Bradley; Stacey E. Carr
1996-01-01
The riparian wetland vegetation communities of the upper and middle Rio Grande watersheds in New Mexico were surveyed in 1992 through 1994. The communities are hierarchically classified in terms of species composition and vegetation structure. The resulting Community Types are related to soil conditions, hydrological regime, and temporal dynamics. The classification is...
Possession States: Approaches to Clinical Evaluation and Classification
Directory of Open Access Journals (Sweden)
S. McCormick
1992-01-01
Full Text Available The fields of anthropology and sociology have produced a large quantity of literature on possession states; physicians, however, rarely report on such phenomena. As a result, clinical description of possession states has suffered, even though these states may be more common and less deviant than supposed. Both ICD-10 and DSM-IV may include specific criteria for possession disorders. The authors briefly review Western notions about possession and kindred states and present guidelines for evaluation and classification.
Improvement of Classification of Enterprise Circulating Funds
Directory of Open Access Journals (Sweden)
Rohanova Hanna O.
2014-02-01
Full Text Available The goal of the article is to reveal possibilities for increasing the efficiency of managing enterprise circulating funds by improving their classification features. Having analysed the approaches of many economists to the classification of enterprise circulating funds, and having systemised and supplemented them, the article offers a grouping of classification features of enterprise circulating funds. As a result of the study, the article offers an expanded classification of circulating funds, which clearly shows the role of circulating funds in managing enterprise finance and the economy in general. The article supplements and groups classification features of enterprise circulating funds by: the level of organisation, character of functioning, sources of formation and their cost, and level of management efficiency. The article shows that the proposed grouping of classification features of circulating funds allows exerting an all-sided and purposeful influence upon indicators of the efficiency of circulating funds functioning, and facilitates their rational management in general. The prospect of further studies in this direction is identification of the level of attraction of loan resources by production enterprises for financing circulating funds.
Cloud field classification based on textural features
Sengupta, Sailes Kumar
1989-01-01
An essential component in global climate research is accurate cloud cover and type determination. Of the two approaches to texture-based classification (statistical and textural), only the former is effective in the classification of natural scenes such as land, ocean, and atmosphere. In the statistical approach that was adopted, parameters characterizing the stochastic properties of the spatial distribution of grey levels in an image are estimated and then used as features for cloud classification. Two types of textural measures were used. One is based on the distribution of the grey level difference vector (GLDV), and the other on a set of textural features derived from the MaxMin cooccurrence matrix (MMCM). The GLDV method looks at the difference D of grey levels at pixels separated by a horizontal distance d and computes several statistics based on this distribution; these are then used as features in subsequent classification. The MaxMin textural features, on the other hand, are based on the MMCM, a matrix whose (I,J)th entry gives the relative frequency of occurrences of the grey level pair (I,J) that are consecutive and thresholded local extremes separated by a given pixel distance d. Textural measures are then computed based on this matrix in much the same manner as in texture computation using the grey level cooccurrence matrix. The database consists of 37 cloud field scenes from LANDSAT imagery using a near-IR visible channel. The classification algorithm used is the well-known Stepwise Discriminant Analysis. The overall accuracy was estimated by the percentage of correct classifications in each case. It turns out that both types of classifiers, at their best combination of features and at any given spatial resolution, give approximately the same classification accuracy. A neural network based classifier with a feed-forward architecture and a back-propagation training algorithm is used to increase the classification accuracy, using these two classes
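A minimal implementation of the GLDV statistics, assuming 16 grey levels and a random test image; the three statistics computed (mean, contrast, entropy) are common choices for difference-vector features and not necessarily the exact feature set used in the study:

```python
import numpy as np

def gldv_features(img, d=1, levels=16):
    """Grey-level difference vector at horizontal pixel distance d,
    summarised by mean, contrast and entropy of the difference histogram."""
    diff = np.abs(img[:, d:].astype(int) - img[:, :-d].astype(int))
    hist = np.bincount(diff.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()                      # empirical GLDV distribution
    k = np.arange(levels)
    mean = (k * p).sum()
    contrast = (k ** 2 * p).sum()
    entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()
    return mean, contrast, entropy

rng = np.random.default_rng(2)
img = rng.integers(0, 16, (64, 64))            # hypothetical 16-level image
print(gldv_features(img))
```

In a full pipeline these per-scene statistics would be fed, alongside the MMCM features, into the discriminant or neural-network classifier.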
Classification Accuracy Increase Using Multisensor Data Fusion
Makarau, A.; Palubinskas, G.; Reinartz, P.
2011-09-01
The practical use of very high resolution visible and near-infrared (VNIR) data is still growing (IKONOS, Quickbird, GeoEye-1, etc.), but for classification purposes the number of bands is limited in comparison to full spectral imaging. These limitations may lead to confusion of materials such as different roofs, pavements, roads, etc., and therefore to wrong interpretation and use of classification products. Employing hyperspectral data is another solution, but their low spatial resolution (compared to multispectral data) restricts their usage for many applications. Further improvement can be achieved by fusion of multisensor data, since this may increase the quality of scene classification. Integration of Synthetic Aperture Radar (SAR) and optical data is widely performed for automatic classification, interpretation, and change detection. In this paper we present an approach for very high resolution SAR and multispectral data fusion for automatic classification in urban areas. Single-polarization TerraSAR-X (SpotLight mode) and multispectral data are integrated using the INFOFUSE framework, consisting of feature extraction (information fission), unsupervised clustering (data representation on a finite domain and dimensionality reduction), and data aggregation (Bayesian or neural network). This framework allows a relevant way of multisource data combination following consensus theory. The classification is not influenced by the limitations of dimensionality, and the calculation complexity primarily depends on the dimensionality reduction step. Fusion of single-polarization TerraSAR-X, WorldView-2 (VNIR or full set), and Digital Surface Model (DSM) data allows different types of urban objects to be classified into predefined classes of interest with increased accuracy. A comparison to classification results of WorldView-2 multispectral data (8 spectral bands) is provided, together with a numerical evaluation of the method in comparison to
Rational kernels for Arabic Root Extraction and Text Classification
Directory of Open Access Journals (Sweden)
Attia Nehar
2016-04-01
Full Text Available In this paper, we address the problems of Arabic text classification and root extraction using transducers and rational kernels. We introduce a new root extraction approach based on the use of Arabic patterns (Pattern Based Stemmer). Transducers are used to model these patterns, and root extraction is done without relying on any dictionary. Using transducers for extracting roots, documents are transformed into finite state transducers. This document representation allows us to use and explore rational kernels as a framework for Arabic text classification. Root extraction experiments are conducted on three word collections and yield 75.6% accuracy. Classification experiments are done on the Saudi Press Agency dataset, and N-gram kernels are tested with different values of N. Accuracy and F1 are 90.79% and 62.93%, respectively. These results show that our approach, when compared with other approaches, is promising, especially in terms of accuracy and F1.
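An N-gram spectrum kernel, the simplest member of the rational-kernel family used here, can be sketched directly on strings; the example strings are arbitrary, and the transducer machinery for pattern-based root extraction is omitted:

```python
from collections import Counter

def ngram_kernel(s, t, n=3):
    """Spectrum kernel: inner product of character n-gram count vectors."""
    cs, ct = (Counter(x[i:i + n] for i in range(len(x) - n + 1))
              for x in (s, t))
    return sum(cs[g] * ct[g] for g in cs)

print(ngram_kernel("abcabc", "abcd"))  # shared trigram "abc": 2 * 1 = 2
```

Such a kernel can be plugged into any kernel classifier (e.g. an SVM), with N varied exactly as in the classification experiments above.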
DEFF Research Database (Denmark)
Rudwaleit, M; Jurik, A G; Hermann, K-G A
2009-01-01
BACKGROUND: Magnetic resonance imaging (MRI) of sacroiliac joints has evolved as the most relevant imaging modality for diagnosis and classification of early axial spondyloarthritis (SpA), including early ankylosing spondylitis. OBJECTIVES: To identify and describe MRI findings in sacroiliitis and in conditions which may mimic SpA. Descriptions of the pathological findings and technical requirements for the appropriate acquisition were formulated, and in a consensual approach the MRI findings considered to be essential for sacroiliitis were defined. RESULTS: Active inflammatory lesions such as bone marrow oedema relevant for sacroiliitis have been defined by consensus by a group of rheumatologists and radiologists. These definitions should help in applying correctly the imaging feature "active sacroiliitis by MRI" in the new ASAS classification criteria for axial SpA.
An Authentication Technique Based on Classification
Institute of Scientific and Technical Information of China (English)
李钢; 杨杰
2004-01-01
We present a novel watermarking approach based on classification for authentication, in which a watermark is embedded into the host image. When the marked image is modified, the extracted watermark differs from the original watermark, and different kinds of modification lead to different extracted watermarks. In this paper, different kinds of modification are treated as classes, and we use a classification algorithm to recognize the modifications with high probability. Simulation results show that the proposed method is promising and effective.
Application of wavelet transform for PDZ domain classification.
Directory of Open Access Journals (Sweden)
Khaled Daqrouq
Full Text Available PDZ domains have been identified as part of an array of signaling proteins that are often unrelated, except for the well-conserved structural PDZ domain they contain. These domains have been linked to many disease processes, including common avian influenza, as well as very rare conditions such as Fraser and Usher syndromes. Historically, based on the interactions and the nature of the bonds they form, PDZ domains have most often been classified into one of three classes (class I, class II and others - class III), directly dependent on their binding partner. In this study, we report on three unique feature extraction approaches based on bigram and trigram occurrence and existence rearrangements within the domains' primary amino acid sequences to assist PDZ domain classification. Wavelet packet transform (WPT) and Shannon entropy, denoted wavelet entropy (WE), feature extraction methods were proposed. Using 115 unique human and mouse PDZ domains, the existence rearrangement approach yielded a high recognition rate (78.34%), which outperformed our occurrence rearrangement based method; with a validation technique the recognition rate was 81.41%. The method reported for PDZ domain classification from primary sequences proved to be an encouraging approach for obtaining consistent classification results. We anticipate that by increasing the database size, we can further improve feature extraction and correct classification.
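The wavelet-entropy feature can be sketched with a plain Haar transform; the three-level decomposition, the energy-based probability distribution and the numeric encoding of the sequence below are illustrative assumptions, not the exact WPT/WE pipeline of the paper:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform (approximation and detail)."""
    x = x[: len(x) // 2 * 2]
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def wavelet_entropy(x, levels=3):
    """Shannon entropy of the energy distribution across wavelet subbands."""
    energies = []
    a = np.asarray(x, float)
    for _ in range(levels):
        a, d = haar_step(a)
        energies.append(np.sum(d ** 2))     # detail energy at this level
    energies.append(np.sum(a ** 2))         # final approximation energy
    p = np.array(energies) / np.sum(energies)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Hypothetical numeric encoding of an amino-acid sequence
seq = np.array([1, 5, 3, 7, 2, 2, 9, 4, 1, 5, 3, 7, 2, 2, 9, 4], float)
print(wavelet_entropy(seq))
```

One such scalar (or a small vector of subband statistics) per domain sequence would then serve as an input feature for the classifier.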
An Incremental Classification Algorithm for Mining Data with Feature Space Heterogeneity
Directory of Open Access Journals (Sweden)
Yu Wang
2014-01-01
Full Text Available Feature space heterogeneity often exists in many real world data sets so that some features are of different importance for classification over different subsets. Moreover, the pattern of feature space heterogeneity might dynamically change over time as more and more data are accumulated. In this paper, we develop an incremental classification algorithm, Supervised Clustering for Classification with Feature Space Heterogeneity (SCCFSH, to address this problem. In our approach, supervised clustering is implemented to obtain a number of clusters such that samples in each cluster are from the same class. After the removal of outliers, relevance of features in each cluster is calculated based on their variations in this cluster. The feature relevance is incorporated into distance calculation for classification. The main advantage of SCCFSH lies in the fact that it is capable of solving a classification problem with feature space heterogeneity in an incremental way, which is favorable for online classification tasks with continuously changing data. Experimental results on a series of data sets and application to a database marketing problem show the efficiency and effectiveness of the proposed approach.
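The cluster-and-weight idea can be sketched as follows. The synthetic two-class data, the inverse-variance relevance formula and the one-cluster-per-class simplification are assumptions for illustration, not the exact SCCFSH procedure:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical heterogeneity: feature 0 is informative for class 0,
# feature 1 for class 1
X0 = np.column_stack([rng.normal(0, 0.1, 50), rng.normal(0, 2.0, 50)])
X1 = np.column_stack([rng.normal(0, 2.0, 50), rng.normal(3, 0.1, 50)])

clusters = []  # one supervised cluster per class, with per-feature relevance
for Xc, label in ((X0, 0), (X1, 1)):
    centre = Xc.mean(axis=0)
    # Low within-cluster variance -> high relevance of that feature
    relevance = 1.0 / (Xc.var(axis=0) + 1e-9)
    clusters.append((centre, relevance / relevance.sum(), label))

def classify(x):
    # Relevance-weighted squared distance to each cluster centre
    d = [(np.sum(w * (x - c) ** 2), lab) for c, w, lab in clusters]
    return min(d)[1]

print(classify(np.array([0.0, 0.0])), classify(np.array([0.0, 3.0])))
```

Incremental operation would amount to updating cluster centres and variances as new samples arrive, so the relevance weights track the changing pattern of heterogeneity.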
Weakly supervised classification in high energy physics
Energy Technology Data Exchange (ETDEWEB)
Dery, Lucio Mwinmaarong [Physics Department, Stanford University,Stanford, CA, 94305 (United States); Nachman, Benjamin [Physics Division, Lawrence Berkeley National Laboratory,1 Cyclotron Rd, Berkeley, CA, 94720 (United States); Rubbo, Francesco; Schwartzman, Ariel [SLAC National Accelerator Laboratory, Stanford University,2575 Sand Hill Rd, Menlo Park, CA, 94025 (United States)
2017-05-29
As machine learning algorithms become increasingly sophisticated to exploit subtle features of the data, they often become more dependent on simulations. This paper presents a new approach called weakly supervised classification in which class proportions are the only input into the machine learning algorithm. Using one of the most challenging binary classification tasks in high energy physics — quark versus gluon tagging — we show that weakly supervised classification can match the performance of fully supervised algorithms. Furthermore, by design, the new algorithm is insensitive to any mis-modeling of discriminating features in the data by the simulation. Weakly supervised classification is a general procedure that can be applied to a wide variety of learning problems to boost performance and robustness when detailed simulations are not reliable or not available.
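Learning from class proportions alone can be sketched with a toy logistic model trained to match known batch proportions; the synthetic data, the batch construction and the squared proportion-matching loss below are assumptions for illustration, not the authors' exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(4)
# Two hypothetical "jet" classes in a 2-feature space
X = np.vstack([rng.normal(-1, 1, (500, 2)), rng.normal(1, 1, (500, 2))])
y = np.array([0] * 500 + [1] * 500)

def make_batch(p, n=100):
    """Assemble a batch of indices whose class-1 proportion is p."""
    n1 = int(p * n)
    return np.concatenate([rng.choice(500, n - n1), 500 + rng.choice(500, n1)])

batches = [make_batch(0.2) for _ in range(10)] + \
          [make_batch(0.8) for _ in range(10)]
props = [y[b].mean() for b in batches]  # the ONLY supervision the learner sees

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
w, b0 = np.zeros(2), 0.0
for _ in range(500):
    for idx, p in zip(batches, props):
        pred = sigmoid(X[idx] @ w + b0)
        err = pred.mean() - p              # gradient of (mean pred - p)^2
        g = pred * (1 - pred) / len(idx)
        w -= 0.5 * err * (X[idx].T @ g)
        b0 -= 0.5 * err * g.sum()

# Per-sample labels were never used in training, only batch proportions
acc = ((sigmoid(X @ w + b0) > 0.5) == y).mean()
print(round(acc, 2))
```

Because the two batch types have different known proportions, matching both forces the model to discriminate individual samples, which is the core of the weak-supervision argument.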
Weakly supervised classification in high energy physics
International Nuclear Information System (INIS)
Dery, Lucio Mwinmaarong; Nachman, Benjamin; Rubbo, Francesco; Schwartzman, Ariel
2017-01-01
As machine learning algorithms become increasingly sophisticated to exploit subtle features of the data, they often become more dependent on simulations. This paper presents a new approach called weakly supervised classification in which class proportions are the only input into the machine learning algorithm. Using one of the most challenging binary classification tasks in high energy physics — quark versus gluon tagging — we show that weakly supervised classification can match the performance of fully supervised algorithms. Furthermore, by design, the new algorithm is insensitive to any mis-modeling of discriminating features in the data by the simulation. Weakly supervised classification is a general procedure that can be applied to a wide variety of learning problems to boost performance and robustness when detailed simulations are not reliable or not available.
ACCUWIND - Methods for classification of cup anemometers
DEFF Research Database (Denmark)
Dahlberg, J.-Å.; Friis Pedersen, Troels; Busche, P.
2006-01-01
...the errors associated with the use of cup anemometers, and to develop a classification system for quantification of systematic errors of cup anemometers. This classification system has now been implemented in the IEC 61400-12-1 standard on power performance measurements in annexes I and J. The classification of cup anemometers requires general external climatic operational ranges to be applied for the analysis of systematic errors. A Class A category classification is connected to reasonably flat sites, and another Class B category is connected to complex terrain; general classification indices are the result of methods developed in the CLASSCUP project and earlier. A number of approaches, including the use of two cup anemometer models, two methods of torque coefficient measurement, two angular response measurements, and inclusion and exclusion of the influence of friction, have been implemented in the classification process...
DEFF Research Database (Denmark)
Debus, Michael S.
2017-01-01
This paper critically analyzes seventeen game classifications. The classifications were chosen on the basis of diversity, ranging from pre-digital classifications (e.g. Murray 1952), over game studies classifications (e.g. Elverdam & Aarseth 2007), to classifications of drinking games (e.g. LaBrie et al. 2013). The analysis aims at three goals: the classifications' internal consistency, the abstraction of classification criteria, and the identification of differences in classification across fields and/or time. Especially the abstraction of classification criteria can be used in future endeavors into the topic of game classifications.
Classification of high resolution imagery based on fusion of multiscale texture features
International Nuclear Information System (INIS)
Liu, Jinxiu; Liu, Huiping; Lv, Ying; Xue, Xiaojuan
2014-01-01
In the classification of high resolution data, combining texture features with spectral bands can effectively improve classification accuracy. However, the window size, which is difficult to choose, is an important factor influencing overall accuracy in textural classification, and current approaches to image texture analysis depend on a single moving window, which ignores the different scale features of various land cover types. In this paper, we propose a new method based on the fusion of multiscale texture features to overcome these problems. The main steps of the new method are the classification of spectral/textural images with fixed window sizes from 3×3 to 15×15, and the comparison of all posterior probability values for every pixel: the class with the biggest probability value is assigned to the pixel automatically. The proposed approach is tested on University of Pavia ROSIS data. The results indicate that the new method improves classification accuracy compared to methods based on fixed window size textural classification.
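The fusion rule (for each pixel, keep the class whose posterior is highest across all window sizes) can be sketched with toy posteriors; the Dirichlet-random probabilities below merely stand in for the per-scale classifier outputs:

```python
import numpy as np

rng = np.random.default_rng(5)
n_pixels, n_classes = 6, 4
scales = [3, 5, 7, 9]  # hypothetical moving-window sizes

# Posterior probabilities per pixel from a classifier run at each scale
posteriors = {s: rng.dirichlet(np.ones(n_classes), n_pixels) for s in scales}

stacked = np.stack([posteriors[s] for s in scales])  # (scale, pixel, class)
best = stacked.max(axis=0)      # best posterior per class, over all scales
labels = best.argmax(axis=1)    # winning class per pixel
print(labels.shape)
```

The pixel's label therefore comes from whichever window size was most confident about it, which is how the method adapts the effective scale per land-cover type.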
Directory of Open Access Journals (Sweden)
M. Valaee
2016-09-01
Full Text Available Introduction: Soil moisture regime refers to the presence or absence either of ground water or of water held at a tension of less than 1500 kPa in the soil, or in specific horizons, during periods of the year. It is the most important factor in soil formation, evolution and fertility, affecting crop production and management, and it is widely used in soil classification and soil mapping. The soil moisture regime depends on soil properties, climatic and weather conditions, and the characteristics of natural plant formations; in cultivated soils it is also affected by the characteristics of the crops grown and by cultivation practices. Determination of the soil moisture regime at a landscape scale requires extensive information and data on the moisture balance of the soil profile over several years, according to the Soil Survey Manual (2010). This approach is expensive and labor-, time- and cost-consuming, so an alternative approach seems essential to overcome these problems. The main hypothesis of this study was that magnetic susceptibility, as a cheap and rapid technique, could determine soil moisture regimes. Magnetic properties of soils reflect the impacts of soil mineral composition, particularly the quantity of ferrimagnetic minerals such as maghemite and magnetite. Magnetic susceptibility measurements can serve a variety of applications, including tracking changes in soil forming processes and ecological services, understanding lithological effects, and providing insight into sedimentation processes and soil drainage. Materials and Methods: This study was conducted in an area located between 36°46′10″ and 37°2′28″ N latitude, and 54°29′31″ and 55°12′47″ E longitude, in Golestan province, northern Iran. In the study region mean annual temperature varies from 12.4 to 19.4 °C. The average annual rainfall and evapotranspiration vary from 230 mm and 2335 mm in the Inchebrun district (Aridic regime), to 732
Rapid Classification of Ordinary Chondrites Using Raman Spectroscopy
Fries, M.; Welzenbach, L.
2014-01-01
Classification of ordinary chondrites is typically done through measurements of the composition of olivine and pyroxenes. Historically, this measurement has usually been performed via electron microprobe, oil immersion or other methods, which can be costly through sample material lost during thin section preparation. Raman microscopy can perform the same measurements considerably faster and with much less sample preparation, allowing for faster classification. Raman spectroscopy can facilitate more rapid classification of large numbers of chondrites, such as those retrieved from North Africa and potentially Antarctica, those present in large collections, or those submitted to a curation facility by the public. With development, this approach may provide a completely automated classification method for all chondrite types.
Clever Toolbox - the Art of Automated Genre Classification
DEFF Research Database (Denmark)
2005-01-01
Automatic musical genre classification can be defined as the science of finding computer algorithms that take a digitized sound clip as input and yield a musical genre as output. The goal of automated genre classification is, of course, that the musical genre should agree with the human classification. This demo illustrates an approach to the problem that first extracts frequency-based sound features, followed by a "linear regression" classifier. The basic features are the so-called mel-frequency cepstral coefficients (MFCCs), which are extracted on a time-scale of 30 msec. From these MFCC features, ... is subsequently used for classification. This classifier is rather simple; current research investigates more advanced methods of classification.
Strengthening the nuclear non-proliferation regime
International Nuclear Information System (INIS)
Carlson, J.
2003-01-01
Although the nuclear non-proliferation regime has enjoyed considerable success, the regime has never been under greater threat than today. Three states have challenged the objectives of the NPT, and there is a technology challenge: the spread of centrifuge enrichment technology and know-how. A major issue confronting the international community is how to deal with a determined proliferator. Despite this gloomy scenario, however, the non-proliferation regime has considerable strengths, many of which can be developed further. The regime comprises complex, interacting and mutually reinforcing elements. At its centre is the NPT, with IAEA safeguards as the Treaty's verification mechanism. Important complementary elements include: restraint in the supply and acquisition of sensitive technologies; multilateral regimes such as the CTBT and the proposed FMCT; various regional and bilateral regimes; the range of security and arms control arrangements outside the nuclear area (including other WMD regimes); and the development of proliferation-resistant technologies. Especially important are political incentives and sanctions in support of non-proliferation objectives. This paper outlines some of the key issues facing the non-proliferation regime
Autonomia e relevância dos regimes (The Autonomy and Relevance of Regimes)
Directory of Open Access Journals (Sweden)
Gustavo Seignemartin de Carvalho
2005-12-01
Full Text Available Institutionalist theories in the discipline of International Relations usually define regimes as a set of formal or informal norms and rules that allow the convergence of expectations or the standardisation of the behaviour of their participants in a given issue area, with the objective of solving coordination problems that would otherwise tend toward non-Pareto-efficient outcomes. Since these definitions, based merely on the "efficiency" of regimes, do not seem sufficient to explain their effectiveness, this article proposes a different definition of regimes: political arrangements that allow the redistribution of the gains from cooperation among the participants in a given issue area, in a context of interdependence. Regimes would derive their effectiveness from their autonomy and relevance, that is, from having an objective existence autonomous from that of their participants and from influencing their behaviour and expectations in ways that cannot be reduced to the individual action of any one of them. The article begins with a brief discussion of the terminological difficulties associated with the study of regimes and the definition of the concepts of autonomy and relevance. It then classifies the various authors participating in the debate into two distinct perspectives, one that denies (non-autonomists) and one that attributes (autonomists) autonomy and relevance to regimes, and briefly analyses the authors and traditions most significant to the debate, examining in greater depth the autonomists and the arguments that reinforce the hypothesis presented here. Finally, the article proposes an analytical decomposition of regimes into the four main elements that give them autonomy and relevance: normativity, actors, specificity of the issue area, and complex interdependence with the context.
Inguinal hernia recurrence: Classification and approach
Directory of Open Access Journals (Sweden)
Campanelli Giampiero
2006-01-01
Full Text Available The authors reviewed the records of 2,468 operations for groin hernia in 2,350 patients, including 277 recurrent hernias, updated to January 2005. The data obtained - evaluating technique, results and complications - were used to propose a simple anatomo-clinical classification into three types which could be used to plan the surgical strategy:
Type R1: first recurrence "high", oblique external, reducible hernia with a small (< 2 cm) defect in non-obese patients, after pure tissue or mesh repair.
Type R2: first recurrence "low", direct, reducible hernia with a small (< 2 cm) defect in non-obese patients, after pure tissue or mesh repair.
Type R3: all other recurrences, including femoral recurrences; recurrent groin hernia with a big defect (inguinal eventration); multirecurrent hernias; non-reducible hernias; hernias linked with a contralateral primitive or recurrent hernia; and situations compromised by aggravating factors (for example obesity) or otherwise not easily included in R1 or R2, after pure tissue or mesh repair.
Quality Evaluation of Land-Cover Classification Using Convolutional Neural Network
Dang, Y.; Zhang, J.; Zhao, Y.; Luo, F.; Ma, W.; Yu, F.
2018-04-01
Land-cover classification is one of the most important products of earth observation. It focuses on profiling the physical characteristics of the land surface with temporal and distribution attributes, and contains information on both natural and man-made coverage elements, such as vegetation, soil, glaciers, rivers, lakes, marsh wetlands and various man-made structures. In recent years, the amount of high-resolution remote sensing data has increased sharply. Accordingly, the volume of land-cover classification products increases, as does the need to evaluate such frequently updated products, which is a big challenge. Conventionally, the automatic quality evaluation of land-cover classification is made through pixel-based classification algorithms, which makes the task much trickier and consequently hard to keep pace with the required updating frequency. In this paper, we propose a novel quality evaluation approach that evaluates land-cover classification with a scene classification method based on a Convolutional Neural Network (CNN) model. By learning from remote sensing data, the randomly generated kernels that serve as filter matrices evolve into operators with functions similar to hand-crafted operators, like the Sobel or Canny operator, while other kernels learned by the CNN model are much more complex and cannot be understood as existing filters. The method, using the CNN approach as its core algorithm, serves quality-evaluation tasks well, since it calculates a set of outputs which directly represent the image's membership grade in certain classes. An automatic quality evaluation approach for land-cover DLG-DOM coupling data (DLG for Digital Line Graphic, DOM for Digital Orthophoto Map) is introduced in this paper. Treating the CNN model as a robust method for image evaluation brought out the idea of an automatic quality evaluation approach for land-cover classification. Based on this experiment, new ideas of quality evaluation
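The idea of convolution kernels producing per-class membership grades can be sketched in plain numpy; the kernel count, patch size and softmax head below are hypothetical, the kernels are left randomly initialised (training would evolve them toward Sobel/Canny-like operators), and no real remote sensing data are used:

```python
import numpy as np

rng = np.random.default_rng(6)

def conv2d(img, kernel):
    """Valid 2-D convolution of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

kernels = rng.normal(size=(4, 3, 3))   # randomly initialised filter matrices
weights = rng.normal(size=(4, 4))      # toy classification head, 4 classes

patch = rng.normal(size=(16, 16))      # hypothetical image patch
# ReLU + global average pooling per kernel, then a softmax over classes
feats = np.array([np.maximum(conv2d(patch, k), 0).mean() for k in kernels])
grades = softmax(weights @ feats)      # membership grade per land-cover class
print(grades.sum())
```

The quality-evaluation step would compare these membership grades against the class claimed by the land-cover product for the same patch.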
THE PROBLEMS OF FIXED ASSETS CLASSIFICATION FOR ACCOUNTING
Directory of Open Access Journals (Sweden)
Sophiia Kafka
2016-06-01
Full Text Available This article provides a critical analysis of research in accounting for fixed assets; the basic issues of fixed assets accounting developed by Ukrainian scientists during 1999-2016 are identified. It is established that the problems of non-current assets taxation and their classification are the most noteworthy. In the dissertations, the issues of fixed assets classification are of an exclusively branch-specific nature, so its improvement is important. The purpose of the article is to develop a science-based classification of fixed assets for accounting purposes, since their composition is quite diverse. According to the results of the research, the classification of fixed assets for accounting purposes has been summarized and developed in Figure 1. The analysis of existing approaches to the classification of fixed assets has made it possible to specify its basic types and to justify the classification criteria for the main objects of fixed assets. Key words: non-current assets, fixed assets, accounting, valuation, classification of the fixed assets. JEL: G M41
Proteomic classification of breast cancer.
LENUS (Irish Health Repository)
Kamel, Dalia
2012-11-01
Being a significant health problem that affects patients in various age groups, breast cancer has been extensively studied to date. Recently, molecular breast cancer classification has advanced significantly with the availability of genomic profiling technologies. Proteomic technologies have also advanced from traditional protein assays including enzyme-linked immunosorbent assay, immunoblotting and immunohistochemistry to more comprehensive approaches including mass spectrometry and reverse phase protein lysate arrays (RPPA). The purpose of this manuscript is to review the current protein markers that influence breast cancer prediction and prognosis and to focus on novel advances in proteomic classification of breast cancer.
Decimal Classification Editions
Zenovia Niculescu
2009-01-01
The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology and of reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme in improving the informational offer, through basic index terms, revised and developed, as well as the value of the auxiliary notations.
Habitat typing versus advanced vegetation classification in western forests
Tony Kusbach; John Shaw; James Long; Helga Van Miegroet
2012-01-01
Major habitat and community types in northern Utah were compared with plant alliances and associations that were derived from fidelity- and diagnostic-species classification concepts. Each of these classification approaches was associated with important environmental factors. Within a 20,000-ha watershed, 103 forest ecosystems were described by physiographic features,...
Multi-Temporal Land Cover Classification with Long Short-Term Memory Neural Networks
Rußwurm, M.; Körner, M.
2017-05-01
Land cover classification (LCC) is a central and wide field of research in earth observation and has already put forth a variety of classification techniques. Many approaches are based on classification techniques considering observation at certain points in time. However, some land cover classes, such as crops, change their spectral characteristics due to environmental influences and can thus not be monitored effectively with classical mono-temporal approaches. Nevertheless, these temporal observations should be utilized to benefit the classification process. After extensive research has been conducted on modeling temporal dynamics by spectro-temporal profiles using vegetation indices, we propose a deep learning approach to utilize these temporal characteristics for classification tasks. In this work, we show how long short-term memory (LSTM) neural networks can be employed for crop identification purposes with SENTINEL 2A observations from large study areas and label information provided by local authorities. We compare these temporal neural network models, i.e., LSTM and recurrent neural network (RNN), with a classical non-temporal convolutional neural network (CNN) model and an additional support vector machine (SVM) baseline. With our rather straightforward LSTM variant, we exceeded state-of-the-art classification performance, thus opening promising potential for further research.
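The LSTM recurrence the authors rely on can be sketched in miniature with scalar gates and hand-picked toy weights; a real model learns vector-valued weights from labeled SENTINEL 2A time series, and the NDVI-like input profile below is invented:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM step with scalar input/state; W maps gate name -> (w_x, w_h, b)."""
    g = {}
    for name in ("i", "f", "o", "c"):
        w_x, w_h, b = W[name]
        pre = w_x * x + w_h * h_prev + b
        g[name] = math.tanh(pre) if name == "c" else sigmoid(pre)
    c = g["f"] * c_prev + g["i"] * g["c"]   # forget old state, write gated input
    h = g["o"] * math.tanh(c)               # gated hidden state / output
    return h, c

# Toy, hand-picked weights; a trained model would learn these.
W = {name: (1.0, 0.5, 0.0) for name in ("i", "f", "o", "c")}

h, c = 0.0, 0.0
for x in [0.2, 0.5, 0.9, 0.4]:   # an invented NDVI-like seasonal profile
    h, c = lstm_step(x, h, c, W)
print(round(h, 4))
```

The cell state `c` carries information across the whole season, which is what lets an LSTM exploit temporal spectral dynamics that a mono-temporal classifier cannot see.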
MULTI-TEMPORAL LAND COVER CLASSIFICATION WITH LONG SHORT-TERM MEMORY NEURAL NETWORKS
Directory of Open Access Journals (Sweden)
M. Rußwurm
2017-05-01
Full Text Available Land cover classification (LCC) is a central and wide field of research in earth observation and has already put forth a variety of classification techniques. Many approaches are based on classification techniques considering observation at certain points in time. However, some land cover classes, such as crops, change their spectral characteristics due to environmental influences and can thus not be monitored effectively with classical mono-temporal approaches. Nevertheless, these temporal observations should be utilized to benefit the classification process. After extensive research has been conducted on modeling temporal dynamics by spectro-temporal profiles using vegetation indices, we propose a deep learning approach to utilize these temporal characteristics for classification tasks. In this work, we show how long short-term memory (LSTM) neural networks can be employed for crop identification purposes with SENTINEL 2A observations from large study areas and label information provided by local authorities. We compare these temporal neural network models, i.e., LSTM and recurrent neural network (RNN), with a classical non-temporal convolutional neural network (CNN) model and an additional support vector machine (SVM) baseline. With our rather straightforward LSTM variant, we exceeded state-of-the-art classification performance, thus opening promising potential for further research.
Directory of Open Access Journals (Sweden)
Choon Sen Seah
2017-12-01
Full Text Available Microarray technology has become one of the elementary tools for researchers to study the genome of organisms. As the complexity and heterogeneity of cancer is being increasingly appreciated through genomic analysis, cancer classification is an important emerging trend. Significant directed random walk is proposed as a cancer classification approach with higher sensitivity of risk-gene prediction and higher accuracy of cancer classification. In this paper, the methodology and materials used for the experiment are presented. A tuning-parameter selection method and weight as a parameter are applied in the proposed approach. A gene expression dataset is used as the input dataset, while a pathway dataset is used as a reference dataset to build a directed graph and complete the bias process in the random walk approach. In addition, we demonstrate that our approach can improve the sensitivity of predictions with higher accuracy and biologically meaningful classification results. A comparison between significant directed random walk and directed random walk shows the improvement in terms of sensitivity of prediction and accuracy of cancer classification.
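The directed-walk step can be sketched as a random walk with restart over a small directed pathway graph. The genes, edges and restart probability below are invented for illustration, and the paper's significance weighting is not reproduced:

```python
# Random walk with restart on a directed graph: at each step, probability mass
# restarts at the seed gene with probability `restart`, otherwise it flows
# along outgoing edges; dangling nodes retain their mass.

def random_walk_with_restart(adj, seed, restart=0.3, iters=100):
    nodes = sorted(adj)
    p = {n: (1.0 if n == seed else 0.0) for n in nodes}
    p0 = dict(p)  # restart distribution concentrated on the seed
    for _ in range(iters):
        nxt = {n: restart * p0[n] for n in nodes}
        for n in nodes:
            targets = adj[n]
            if targets:
                share = (1 - restart) * p[n] / len(targets)
                for t in targets:
                    nxt[t] += share
            else:
                nxt[n] += (1 - restart) * p[n]  # dangling node keeps its mass
        p = nxt
    return p

# Hypothetical pathway: geneA regulates geneB and geneC; geneB regulates geneC.
adj = {"geneA": ["geneB", "geneC"], "geneB": ["geneC"], "geneC": []}
weights = random_walk_with_restart(adj, seed="geneA")
print(weights)
```

Downstream genes accumulate mass according to how the pathway topology channels influence from the seed, which is the kind of topology-aware gene weighting a directed random walk provides.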
Gereige, Issam
2012-09-01
Photolithography is a fundamental process in the semiconductor industry and is considered a key element towards extreme nanoscale integration. In this technique, a photosensitive polymer mask with the desired patterns is created on the substrate to be etched. Roughly speaking, the areas to be etched are not covered with polymer; thus, no residual layer should remain on these areas in order to ensure an optimal transfer of the patterns onto the substrate. In this paper, we propose a nondestructive method based on a classification approach, achieved by an artificial neural network, for automatic residual layer detection from an ellipsometric signature. Only the case of a regular defect, i.e., a homogeneous residual layer, is considered. The limitations of the method are discussed. Then, an experimental result on a 400 nm period grating manufactured with nanoimprint lithography is analyzed with our method. © 2012 Elsevier B.V. All rights reserved.
Automatic Genre Classification of Musical Signals
Barbedo, Jayme Garcia sArnal; Lopes, Amauri
2006-12-01
We present a strategy to perform automatic genre classification of musical signals. The technique divides the signals into 21.3-millisecond frames, from which 4 features are extracted. The values of each feature are treated over 1-second analysis segments. Some statistical results of the features along each analysis segment are used to determine a vector of summary features that characterizes the respective segment. Next, a classification procedure uses those vectors to differentiate between genres. The classification procedure has two main characteristics: (1) a very wide and deep taxonomy, which allows a very meticulous comparison between different genres, and (2) a wide pairwise comparison of genres, which emphasizes the differences between each pair of genres. The procedure points out the genre that best fits the characteristics of each segment. The final classification of the signal is given by the genre that appears most often across all signal segments. The approach has shown very good accuracy even for the lowest layers of the hierarchical structure.
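The two aggregation steps described above, per-segment summary features and then a majority vote over segments, can be sketched as follows; the feature values and genre labels are invented:

```python
import statistics
from collections import Counter

def summary_features(frame_values):
    """Summarize one feature's frame values over a 1-second analysis segment."""
    return (statistics.mean(frame_values), statistics.pstdev(frame_values))

def classify_signal(segment_labels):
    """Final genre = the genre assigned to the most analysis segments."""
    return Counter(segment_labels).most_common(1)[0][0]

# Hypothetical per-segment decisions for a 10-second excerpt.
segments = ["rock", "rock", "pop", "rock", "jazz",
            "rock", "pop", "rock", "rock", "pop"]
print(classify_signal(segments))  # -> "rock" (6 of 10 segments)
```

Majority voting over short segments makes the final label robust to occasional per-segment misclassifications.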
Automotive System for Remote Surface Classification.
Bystrov, Aleksandr; Hoare, Edward; Tran, Thuy-Yung; Clarke, Nigel; Gashinova, Marina; Cherniakov, Mikhail
2017-04-01
In this paper we discuss a novel approach to road surface recognition, based on the analysis of backscattered microwave and ultrasonic signals. The novelty of our method lies in sonar and polarimetric radar data fusion, extraction of features for separate swathes of the illuminated surface (segmentation), and the use of a multi-stage artificial neural network for surface classification. The developed system consists of a 24 GHz radar and a 40 kHz ultrasonic sensor. The features are extracted from the backscattered signals, and then the procedures of principal component analysis and supervised classification are applied to the feature data. Special attention is paid to the multi-stage artificial neural network, which allows an overall increase in classification accuracy. The proposed technique was tested for recognition of a large number of real surfaces in different weather conditions, with an average classification accuracy of 95%. The obtained results thereby demonstrate that the use of the proposed system architecture and statistical methods allows reliable discrimination of various road surfaces in real conditions.
Tarai, Madhumita; Kumar, Keshav; Divya, O; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar
2017-09-05
The present work compares dissimilarity-based and covariance-based unsupervised chemometric classification approaches, using total synchronous fluorescence spectroscopy data sets acquired for cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups, and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix. Copyright © 2017 Elsevier B.V. All rights reserved.
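A minimal sketch of the dissimilarity-based route, under the assumption that a classical-MDS-style double centering precedes the eigen-decomposition (the paper's exact procedure may differ); the 2-D "spectra" are toy values, and the leading eigenvector is found by power iteration:

```python
def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def dissimilarity_matrix(samples):
    n = len(samples)
    return [[euclidean(samples[i], samples[j]) for j in range(n)] for i in range(n)]

def gower_center(D):
    """Double-center the squared dissimilarities (classical MDS step)."""
    n = len(D)
    D2 = [[d * d for d in row] for row in D]
    rmean = [sum(row) / n for row in D2]
    gmean = sum(rmean) / n
    return [[-0.5 * (D2[i][j] - rmean[i] - rmean[j] + gmean) for j in range(n)]
            for i in range(n)]

def leading_eigenvector(M, iters=200):
    """Power iteration; the start vector must not be uniform (centered matrix)."""
    n = len(M)
    v = [float(i + 1) for i in range(n)]
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy 2-D "spectra": two cumin-like samples near (1, 1), two others near (5, 5).
samples = [(1.0, 1.1), (0.9, 1.0), (5.0, 5.1), (5.2, 4.9)]
v = leading_eigenvector(gower_center(dissimilarity_matrix(samples)))
print([round(x, 2) for x in v])
```

Samples within a class receive eigenvector components of the same sign, while the two classes get opposite signs: the class separation that a covariance-based decomposition of pooled data can miss.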
Classification for Inconsistent Decision Tables
Azad, Mohammad
2016-09-28
Decision trees have been widely used to discover patterns from consistent data sets. But if a data set is inconsistent, where there are groups of examples with equal values of the conditional attributes but different labels, then discovering the essential patterns or knowledge from the data set is challenging. Three approaches (generalized, most common and many-valued decision) have been considered to handle such inconsistency. The decision tree model has been used to compare the classification results among the three approaches. The many-valued decision approach outperforms the other approaches, and the M_ws_entM greedy algorithm gives faster and better prediction accuracy.
Classification for Inconsistent Decision Tables
Azad, Mohammad; Moshkov, Mikhail
2016-01-01
Decision trees have been widely used to discover patterns from consistent data sets. But if a data set is inconsistent, where there are groups of examples with equal values of the conditional attributes but different labels, then discovering the essential patterns or knowledge from the data set is challenging. Three approaches (generalized, most common and many-valued decision) have been considered to handle such inconsistency. The decision tree model has been used to compare the classification results among the three approaches. The many-valued decision approach outperforms the other approaches, and the M_ws_entM greedy algorithm gives faster and better prediction accuracy.
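Two of the inconsistency-handling approaches can be sketched on a toy decision table (condition tuples and labels invented): the many-valued decision keeps the whole set of labels seen for a condition group, while the most-common decision keeps only the modal label:

```python
from collections import Counter, defaultdict

# Toy inconsistent decision table: two rows share conditions (1, 0) but
# carry different labels.
rows = [((1, 0), "a"),
        ((1, 0), "b"),
        ((0, 1), "c"),
        ((1, 0), "a")]

# Many-valued decision: each condition tuple maps to the SET of labels seen.
many_valued = defaultdict(set)
for cond, label in rows:
    many_valued[cond].add(label)

# Most-common decision: each condition tuple maps to its modal label.
most_common = {cond: Counter(l for c, l in rows if c == cond).most_common(1)[0][0]
               for cond, _ in rows}

print(dict(many_valued))
print(most_common)
```

Under the many-valued interpretation, a prediction counts as correct if it lies in the label set, which is what makes that approach more forgiving on inconsistent tables.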
Energy Technology Data Exchange (ETDEWEB)
Pasqualini, G. [Ecole Superieure d'Electricite (France)]
1997-08-01
Direct current motors, asynchronous motors and variable speed synchronous motors are generally supplied by static converters. Speed variation is obtained by voltage variation in DC motors and by frequency variation in AC motors. In these conditions, these motors run continuously in transient regimes: the DC motor current is not direct and the AC motor current is not sinusoidal. This situation leads to pulsating torques in the shaft line and to an increase of Joule-effect losses. The aim of this paper is to present methods for studying the operation of electric motors using the shape of the supply voltages given by the converters and mathematical models of these machines. The synchronous machines are briefly described, while the asynchronous machines are studied using Ku's transformation instead of Park's transformation for simplification. For each type of machine, calculation methods allow determination of their current, additional losses and torque characteristics. The transient regimes considered are those remaining when the motor is running at a constant speed in a defined regime (the supply voltages are periodical functions of time). These transient regimes reproduce identically with a frequency which is a multiple of the converter supply frequency. Transient regimes due to changes in the operation of the motor, such as variations of the resisting torque or of the power supply frequency, are not considered in this study. (J.S.) 9 refs.
Atmospheric circulation classification comparison based on wildfires in Portugal
Pereira, M. G.; Trigo, R. M.
2009-04-01
variables. To achieve these objectives we consider the main classifications for Iberia developed within the framework of COST action 733 (Radan Huth et al., 2008). This European project aims to provide a wide range of atmospheric circulation classifications for Europe and sub-regions (http://www.cost733.org/) with an ambitious objective of assessing, comparing and classifying all relevant weather situations in Europe. Pereira et al. (2005) "Synoptic patterns associated with large summer forest fires in Portugal". Agricultural and Forest Meteorology,129, 11-25. Radan Huth et al. (2008) "Classifications of Atmospheric circulation patterns. Recent advances and applications". Trends and Directions in Climate Research: Ann. N.Y. Acad. Sci. 1146: 105-152. doi: 10.1196/annals.1446.019. Trigo R.M., DaCamara C. (2000) "Circulation Weather Types and their impact on the precipitation regime in Portugal". Int J of Climatology, 20, 1559-1581.
Classification of Franchise Networks in the Retail Trade
Directory of Open Access Journals (Sweden)
Grygorenko Tetyana M.
2016-11-01
Full Text Available The article clarifies the definitions of the concepts of «franchise network», «franchise trade network» and «franchise retail network», a clarification warranted by the lack of a unified approach to the interpretation of these concepts. A classification of franchise networks in the retail trade is developed, taking into account peculiarities in the operation of this sub-sector of the market economy; classification attributes are identified and types of franchise retail chains are characterized. The proposed classification of franchise retail networks is adapted to the economic situation in Ukraine and to the specifics of national franchise relations. It will facilitate a deeper understanding of the essence of the formation and operation of franchise retail chains, help Ukrainian entrepreneurs to justify the choice of the franchising model most suitable for them, and allow them to establish such a network with regard to various attributes using a complex approach.
Development of objective flow regime identification method using self-organizing neural network
International Nuclear Information System (INIS)
Lee, Jae Young; Kim, Nam Seok; Kwak, Nam Yee
2004-01-01
Two-phase flow shows various flow patterns according to the amount of void and its relative velocity to the liquid flow. This variation directly affects the interfacial transfer, which is the key factor in the design or analysis of phase-change systems. In particular, the safety analysis of nuclear power plants has been performed with numerical codes furnished with constitutive relations that depend strongly on the flow regime. Substantial effort has been devoted to flow regime identification, and the field now rests on a relatively stable engineering foundation compared with other research areas. However, the issues of objectiveness and of transient flow regimes are still open to study. Lee et al. and Ishii developed a method for objective and instantaneous flow regime identification based on a neural network and a new index, the probability distribution of the flow regime, which requires only one second of observation for flow regime identification. In the present paper, we develop a self-organizing neural network for a more objective approach to this problem. Kohonen's Self-Organizing Map (SOM) has been used for clustering, visualization, and abstraction. The SOM is trained through unsupervised competitive learning using a 'winner takes all' policy; this unsupervised training therefore eliminates possible interference by the regime developer in the neural network training. After developing the computer code, we evaluate its performance on vertically upward two-phase flow in pipes of 25.4 and 50.4 mm I.D. A sensitivity study of the number of clusters on the flow regime identification was also made.
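A minimal winner-takes-all competitive map in the spirit of the SOM described above, reduced for determinism: units are seeded from the first samples and the neighborhood update is omitted, so this sketch is closer to online k-means than a full Kohonen map. The (mean, std, skewness) impedance features are invented:

```python
def train_som(data, n_units=2, epochs=40, lr0=0.5):
    """Tiny competitive map: the winning unit moves toward each input
    ('winner takes all'); learning rate decays over epochs."""
    dim = len(data[0])
    units = [list(data[i]) for i in range(n_units)]  # seed from first samples
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)
        for x in data:
            winner = min(range(n_units),
                         key=lambda u: sum((units[u][k] - x[k]) ** 2
                                           for k in range(dim)))
            for k in range(dim):
                units[winner][k] += lr * (x[k] - units[winner][k])
    return units

def classify(units, x):
    """Assign a feature vector to its nearest unit (cluster)."""
    return min(range(len(units)),
               key=lambda u: sum((units[u][k] - x[k]) ** 2 for k in range(len(x))))

# Invented (mean, std, skewness) impedance features for two flow conditions.
bubbly = [(0.1, 0.05, 0.2), (0.12, 0.06, 0.25), (0.09, 0.04, 0.18)]
annular = [(0.9, 0.3, -0.5), (0.85, 0.28, -0.45), (0.92, 0.31, -0.55)]
units = train_som(bubbly + annular)
```

Because no labels enter the training, the clusters emerge from the impedance statistics alone, which is the objectivity argument the abstract makes for unsupervised training.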
On the dynamics of liquids in their viscous regime approaching the glass transition.
Chen, Z; Angell, C A; Richert, R
2012-07-01
Recently, Mallamace et al. (Eur. Phys. J. E 34, 94 (2011)) proposed a crossover temperature, T_×, and claimed that the dynamics of many supercooled liquids follow an Arrhenius-type temperature dependence between T_× and the glass transition temperature T_g. The opposite, namely super-Arrhenius behavior in this viscous regime, has been demonstrated repeatedly for molecular glass-formers, for polymers, and for the majority of the exhaustively studied inorganic glasses of technological interest. Therefore, we subject the molecular systems of the Mallamace et al. study to a "residuals" analysis and include not only viscosity data but also the more precise data available from dielectric relaxation experiments over the same temperature range. Although many viscosity data sets are inconclusive due to their noise level, we find that Arrhenius behavior is not a general feature of viscosity in the T_g to T_× range. Moreover, the residuals of dielectric relaxation times with respect to an Arrhenius law clearly reveal systematic curvature, consistent with super-Arrhenius behavior being an endemic feature of transport properties in this viscous regime. We also observe a common pattern of how dielectric relaxation times decouple slightly from viscosity.
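The "residuals" analysis can be illustrated by fitting an Arrhenius law, log τ = a + b/T, to synthetic relaxation times generated from a VFT (super-Arrhenius) form; the parameter values are invented. The residuals then show exactly the systematic curvature the authors describe: positive at the ends of the temperature range and negative in the middle.

```python
def arrhenius_fit(T, logtau):
    """Least-squares fit of log(tau) = a + b/T."""
    x = [1.0 / t for t in T]
    n = len(x)
    mx, my = sum(x) / n, sum(logtau) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, logtau))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Synthetic super-Arrhenius (VFT) data: log tau = A + B / (T - T0).
A, B, T0 = -14.0, 600.0, 150.0
T = [200.0, 220.0, 240.0, 260.0, 280.0]
logtau = [A + B / (t - T0) for t in T]

a, b = arrhenius_fit(T, logtau)
residuals = [y - (a + b / t) for t, y in zip(T, logtau)]
print([round(r, 3) for r in residuals])  # positive at the ends, negative in the middle
```

A genuinely Arrhenius data set would instead leave residuals scattered around zero with no systematic sign pattern.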
Vision-Based Perception and Classification of Mosquitoes Using Support Vector Machine
Directory of Open Access Journals (Sweden)
Masataka Fuchida
2017-01-01
Full Text Available The need for a novel automated mosquito perception and classification method has become increasingly pressing in recent years, with a steeply increasing number of mosquito-borne diseases and associated casualties. There exist remote sensing and GIS-based methods for mapping potential mosquito habitats and locations that are prone to mosquito-borne diseases, but these methods generally do not account for species-wise identification of mosquitoes in closed-perimeter regions. Traditional methods for mosquito classification involve highly manual processes requiring tedious sample collection and supervised laboratory analysis. In this research work, we present the design and experimental validation of an automated vision-based mosquito classification module that can be deployed in closed-perimeter mosquito habitats. The module is capable of distinguishing mosquitoes from other bugs such as bees and flies by extracting morphological features, followed by support vector machine-based classification. In addition, this paper presents the results of three variants of the support vector machine classifier in the context of the mosquito classification problem. This vision-based approach presents an efficient alternative to conventional methods for mosquito surveillance, mapping and sample image collection. Experimental results involving classification between mosquitoes and a predefined set of other bugs using multiple classification strategies demonstrate the efficacy and validity of the proposed approach, with a maximum recall of 98%.
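The reported figure of merit, recall, is straightforward to compute; the 100 labels below are invented so that exactly one of 50 mosquitoes is missed, reproducing a 98% recall:

```python
def recall(y_true, y_pred, positive="mosquito"):
    """Recall = true positives / (true positives + false negatives)."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    return tp / (tp + fn)

# Invented predictions: 49 of 50 mosquitoes recovered, 2 false alarms on bugs.
y_true = ["mosquito"] * 50 + ["bee"] * 50
y_pred = ["mosquito"] * 49 + ["bee"] + ["bee"] * 48 + ["mosquito"] * 2
print(recall(y_true, y_pred))  # -> 0.98
```

High recall matters here because a missed mosquito (false negative) is costlier for surveillance than a false alarm on a bee or fly.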
A kernel-based multi-feature image representation for histopathology image classification
International Nuclear Information System (INIS)
Moreno, J.; Caicedo, J.; Gonzalez, F.
2010-01-01
This paper presents a novel strategy for building a high-dimensional feature space to represent histopathology image contents. Histogram features, related to colors, textures and edges, are combined together in a unique image representation space using kernel functions. This feature space is further enhanced by the application of latent semantic analysis, to model hidden relationships among visual patterns. All that information is included in the new image representation space. Then, support vector machine classifiers are used to assign semantic labels to images. Processing and classification algorithms operate on top of kernel functions, so that the structure of the feature space is completely controlled using similarity measures and a dual representation. The proposed approach has shown a successful performance in a classification task using a dataset with 1,502 real histopathology images in 18 different classes. The results show that our approach for histological image classification obtains an improved average performance of 20.6% when compared to a conventional classification approach based on SVM directly applied to the original kernel.
A KERNEL-BASED MULTI-FEATURE IMAGE REPRESENTATION FOR HISTOPATHOLOGY IMAGE CLASSIFICATION
Directory of Open Access Journals (Sweden)
J Carlos Moreno
2010-09-01
Full Text Available This paper presents a novel strategy for building a high-dimensional feature space to represent histopathology image contents. Histogram features, related to colors, textures and edges, are combined together in a unique image representation space using kernel functions. This feature space is further enhanced by the application of Latent Semantic Analysis, to model hidden relationships among visual patterns. All that information is included in the new image representation space. Then, Support Vector Machine classifiers are used to assign semantic labels to images. Processing and classification algorithms operate on top of kernel functions, so that the structure of the feature space is completely controlled using similarity measures and a dual representation. The proposed approach has shown a successful performance in a classification task using a dataset with 1,502 real histopathology images in 18 different classes. The results show that our approach for histological image classification obtains an improved average performance of 20.6% when compared to a conventional classification approach based on SVM directly applied to the original kernel.
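The kernel-combination idea described above can be sketched as a weighted sum of per-histogram kernels. The χ² kernel choice, the feature histograms and the weights here are assumptions for illustration, not the paper's configuration:

```python
import math

def chi2_kernel(h1, h2, gamma=1.0):
    """Exponential chi-square kernel, a common choice for histogram features."""
    d = sum((a - b) ** 2 / (a + b) for a, b in zip(h1, h2) if a + b > 0)
    return math.exp(-gamma * d)

def combined_kernel(feats1, feats2, weights):
    """Weighted sum of per-feature kernels (color, texture, edge histograms)."""
    return sum(w * chi2_kernel(feats1[name], feats2[name])
               for name, w in weights.items())

# Invented histogram descriptors for two images.
img_a = {"color": [0.7, 0.2, 0.1], "texture": [0.5, 0.5], "edge": [0.9, 0.1]}
img_b = {"color": [0.6, 0.3, 0.1], "texture": [0.4, 0.6], "edge": [0.8, 0.2]}
weights = {"color": 0.5, "texture": 0.3, "edge": 0.2}

k_ab = combined_kernel(img_a, img_b, weights)
k_aa = combined_kernel(img_a, img_a, weights)
print(round(k_ab, 4), round(k_aa, 4))
```

Because a weighted sum of valid kernels is itself a valid kernel, the combined similarity can be handed directly to an SVM in its dual form, which is why the feature-space structure stays fully controlled by the similarity measures.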
Land Cover Classification in a Complex Urban-Rural Landscape with Quickbird Imagery
Moran, Emilio Federico.
2010-01-01
High spatial resolution images have been increasingly used for urban land use/cover classification, but the high spectral variation within the same land cover, the spectral confusion among different land covers, and the shadow problem often lead to poor classification performance based on the traditional per-pixel spectral-based classification methods. This paper explores approaches to improve urban land cover classification with Quickbird imagery. Traditional per-pixel spectral-based supervi...
A macromodel for squeeze-film air damping in the free-molecule regime
Hong, Gang; Ye, Wenjing
2010-01-01
A three-dimensional Monte Carlo (MC) simulation approach is developed for the accurate prediction of the squeeze-film air damping on microresonators in the free-molecule gas regime. Based on the MC simulations and the analytical traveling...
Nasir, Muhammad; Attique Khan, Muhammad; Sharif, Muhammad; Lali, Ikram Ullah; Saba, Tanzila; Iqbal, Tassawar
2018-02-21
Melanoma is the deadliest type of skin cancer, with the highest mortality rate. However, its annihilation at an early stage implies a high survival rate; therefore, it demands early diagnosis. The customary diagnosis methods are costly and cumbersome due to the involvement of experienced experts as well as the requirement for a highly equipped environment. The recent advancements in computerized solutions for these diagnoses are highly promising, with improved accuracy and efficiency. In this article, we propose a method for the classification of melanoma and benign skin lesions. Our approach integrates preprocessing, lesion segmentation, feature extraction, feature selection, and classification. Preprocessing is executed in the context of hair removal by DullRazor, whereas lesion texture and color information are utilized to enhance the lesion contrast. In lesion segmentation, a hybrid technique has been implemented and the results are fused using the additive law of probability. A serial-based method is applied subsequently that extracts and fuses traits such as color, texture, and HOG (shape). The fused features are then selected by implementing a novel Boltzmann entropy method. Finally, the selected features are classified by a support vector machine. The proposed method is evaluated on the publicly available data set PH2. Our approach has provided promising results of sensitivity 97.7%, specificity 96.7%, accuracy 97.5%, and F-score 97.5%, which are significantly better than the results of existing methods on the same data set. The proposed method detects and classifies melanoma significantly better than existing methods. © 2018 Wiley Periodicals, Inc.
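The feature-selection step can be sketched with a plain Shannon-entropy scorer as a stand-in (the paper's Boltzmann-entropy criterion is not reproduced here); the feature columns below are invented:

```python
import math

def entropy(values, bins=4):
    """Shannon entropy of a feature's histogram, a stand-in scoring rule."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0   # constant feature -> single bin
    counts = [0] * bins
    for v in values:
        counts[min(bins - 1, int((v - lo) / width))] += 1
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def select_features(feature_matrix, k):
    """Keep the indices of the k highest-entropy feature columns."""
    scores = [(entropy(col), i) for i, col in enumerate(feature_matrix)]
    return [i for _, i in sorted(scores, reverse=True)[:k]]

features = [
    [0.1, 0.4, 0.6, 0.9],   # feature 0: spread across bins -> high entropy
    [0.5, 0.5, 0.5, 0.5],   # feature 1: constant -> zero entropy
    [0.1, 0.1, 0.9, 0.9],   # feature 2: two occupied bins -> mid entropy
]
print(select_features(features, 2))  # -> [0, 2]
```

Dropping low-information columns such as the constant feature shrinks the fused color/texture/HOG vector before it reaches the SVM.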
2013-12-01
...virulent strains escaping from laboratories with inadequate biosecurity and biosafety regimes into a world with insufficient public health surveillance... of codification and legal measures to stop terrorist use, and to a lesser degree to the international double standard on beliefs regarding the...
Modeling and forecasting of wind power generation - Regime-switching approaches
DEFF Research Database (Denmark)
Trombe, Pierre-Julien
The present thesis addresses a number of challenges emerging from the increasing penetration of renewable energy sources into power systems. Focus is placed on wind energy and large-scale offshore wind farms. Indeed, offshore wind power variability is becoming a serious obstacle to the integration... of more renewable energy into power systems, since these systems are required to maintain a strict balance between electricity consumption and production at any time. For this purpose, wind power forecasts offer essential support to power system operators. In particular, there is a growing demand... case study is the Horns Rev wind farm located in the North Sea. Regime-switching aspects of offshore wind power fluctuations are investigated. Several formulations of Markov-switching models are proposed in order to better characterize the stochastic behavior of the underlying process and improve its...
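A two-regime Markov-switching process of the kind used for wind power fluctuations can be simulated in a few lines; the transition matrix, per-regime volatilities and seed below are invented:

```python
import random

# Invented two-regime model: persistent "calm" and "stormy" states with
# different fluctuation magnitudes for the normalized power output.
P = {"calm":   {"calm": 0.95, "stormy": 0.05},
     "stormy": {"calm": 0.10, "stormy": 0.90}}
sigma = {"calm": 0.02, "stormy": 0.15}

rng = random.Random(42)
state, power, path = "calm", 0.5, []
for _ in range(500):
    # Draw the next regime, then a regime-dependent Gaussian fluctuation.
    state = "calm" if rng.random() < P[state]["calm"] else "stormy"
    power = min(1.0, max(0.0, power + rng.gauss(0.0, sigma[state])))
    path.append((state, power))
```

The hidden regime modulates the volatility of the observed power series, which is the structure a Markov-switching forecast model tries to infer from the data.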
Directory of Open Access Journals (Sweden)
Makiko Tazaki
2014-01-01
Full Text Available There are two primary challenges for establishing nuclear third party liability (TPL) regimes within multilateral nuclear approaches (MNA) to nuclear fuel cycle facilities in the Asian region. The first challenge is to ensure secure and prompt compensation, especially for transboundary damages, which is also a challenge for a nation-based facility. One possible solution is that, in order to share common nuclear TPL principles, all states in the region participate in the same international nuclear TPL convention, such as the Convention on Supplementary Compensation for Nuclear Damage (CSC), with a view to its entry into force in the future. One problem with this approach is that many states in the Asian region would need to raise their amount of financial security in order to be able to participate in the CSC. The second challenge lies with the multiple MNA member states and encompasses the question of how decisions are to be made and how the responsibilities of an installation state are to be shared in case of a nuclear incident. Principally, the host state of the MNA facility takes on this responsibility. However, in certain situations and in agreement with all MNA member states, such responsibilities can be indirectly shared among all MNA member states. This can be done through internal arrangements within the MNA framework, such as reimbursement to the host state based on pre-agreed shares in accordance with investment, and/or making deposits on such reimbursements in case of an incident.
Primordial black holes in linear and non-linear regimes
Energy Technology Data Exchange (ETDEWEB)
Allahyari, Alireza; Abolhasani, Ali Akbar [Department of Physics, Sharif University of Technology, Tehran (Iran, Islamic Republic of); Firouzjaee, Javad T., E-mail: allahyari@physics.sharif.edu, E-mail: j.taghizadeh.f@ipm.ir [School of Astronomy, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5531, Tehran (Iran, Islamic Republic of)
2017-06-01
We revisit the formation of primordial black holes (PBHs) in the radiation-dominated era for both linear and non-linear regimes, elaborating on the concept of an apparent horizon. Contrary to the expectation from vacuum models, we argue that in a cosmological setting a density fluctuation with a high density does not always collapse to a black hole. To this end, we first elaborate on the perturbation theory for spherically symmetric spacetimes in the linear regime. Thereby, we introduce two gauges. This allows us to introduce a well-defined gauge-invariant quantity for the expansion of null geodesics. Using this quantity, we argue that PBHs do not form in the linear regime irrespective of the density of the background. Finally, we consider the formation of PBHs in non-linear regimes, adopting the spherical collapse picture. In this picture, over-densities are modeled by closed FRW models in the radiation-dominated era. The difference of our approach is that we start by finding an exact solution for a closed radiation-dominated universe. This yields exact results for the turn-around time and radius. It is important that we take the initial conditions from the linear perturbation theory. Additionally, instead of using the uniform Hubble gauge condition, both density and velocity perturbations are admitted in this approach. Thereby, the matching condition imposes an important constraint on the initial velocity perturbations, δ^h_0 = −δ_0/2. This can be extended to higher orders. Using this constraint, we find that the apparent horizon of a PBH forms when δ > 3 at the turn-around time. Corrections also appear from the third order. Moreover, a PBH forms when its apparent horizon is outside the sound horizon at the re-entry time. Applying this condition, we infer that the threshold value of the density perturbations at horizon re-entry should be larger than δ_th > 0.7.
Sustainable urban regime adjustments
DEFF Research Database (Denmark)
Quitzau, Maj-Britt; Jensen, Jens Stissing; Elle, Morten
2013-01-01
The endogenous agency that urban governments increasingly display by making conscious and planned efforts to adjust the regimes they operate within is currently not well captured in transition studies. There is a need to acknowledge the ambiguity of regime enactment at the urban scale. This direc...
Dynamic Modeling Strategy for Flow Regime Transition in Gas-Liquid Two-Phase Flows
Directory of Open Access Journals (Sweden)
Xia Wang
2012-12-01
Full Text Available In modeling gas-liquid two-phase flows, the concept of flow regimes has been widely used to characterize the global interfacial structure of the flows. Nearly all constitutive relations that provide closures to the interfacial transfers in two-phase flow models, such as the two-fluid model, are flow regime dependent. Current nuclear reactor safety analysis codes, such as RELAP5, classify flow regimes using flow regime maps or transition criteria that were developed for steady-state, fully-developed flows. As two-phase flows are dynamic in nature, it is important to model the flow regime transitions dynamically to more accurately predict two-phase flows. The present work aims to develop a dynamic modeling strategy for determining flow regimes in gas-liquid two-phase flows through the introduction of interfacial area transport equations (IATEs) within the framework of a two-fluid model. The IATE is a transport equation that models the interfacial area concentration by considering the creation of interfacial area, through fluid particle (bubble or liquid droplet) disintegration, boiling and evaporation, and its destruction, through fluid particle coalescence and condensation. For flow regimes beyond bubbly flows, a two-group IATE has been proposed, in which bubbles are divided into two groups based on their size and shape, namely group-1 and group-2 bubbles. A preliminary approach to dynamically identifying the flow regimes is discussed, in which discriminators are based on the predicted information, such as the void fraction and interfacial area concentration. The flow regimes predicted with this method show good agreement with the experimental observations.
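The discriminator idea described above, classifying the local flow regime from predicted quantities such as the void fraction, can be sketched as a simple rule. The cutoff values below are illustrative placeholders, not the paper's transition criteria.

```python
# Toy flow-regime discriminator keyed on a predicted void fraction.
# The cutoffs (0.25 and 0.70) are illustrative assumptions only.
def flow_regime(void_fraction):
    if not 0.0 <= void_fraction <= 1.0:
        raise ValueError("void fraction must lie in [0, 1]")
    if void_fraction < 0.25:
        return "bubbly"
    if void_fraction < 0.70:
        return "slug/churn-turbulent"
    return "annular"
```

In a dynamic scheme of this kind, the rule would be re-evaluated at every time step from the transported void fraction and interfacial area concentration rather than from a static flow regime map.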
Decimal Classification Editions
Directory of Open Access Journals (Sweden)
Zenovia Niculescu
2009-01-01
Full Text Available The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology, reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme from the point of view of improving the informational offer, through basic index terms, revised and developed, as well as valuing the auxiliary notations.
Classification of maize kernels using NIR hyperspectral imaging
DEFF Research Database (Denmark)
Williams, Paul; Kucheryavskiy, Sergey V.
2016-01-01
NIR hyperspectral imaging was evaluated to classify maize kernels of three hardness categories: hard, medium and soft. Two approaches, pixel-wise and object-wise, were investigated to group kernels according to hardness. The pixel-wise classification assigned a class to every pixel from individual...... and specificity of 0.95 and 0.93). Both feature extraction methods can be recommended for classification of maize kernels on production scale....
Adaptive Matrices and Filters for Color Texture Classification
Giotis, Ioannis; Bunte, Kerstin; Petkov, Nicolai; Biehl, Michael
In this paper we introduce an integrative approach towards color texture classification and recognition using a supervised learning framework. Our approach is based on Generalized Learning Vector Quantization (GLVQ), extended by an adaptive distance measure, which is defined in the Fourier domain,
Sun, Ziheng; Fang, Hui; Di, Liping; Yue, Peng
2016-09-01
It was an untouchable dream for remote sensing experts to realize fully automatic image classification without inputting any parameter values. Experts usually spend hours tuning the input parameters of classification algorithms in order to obtain the best results. With the rapid development of knowledge engineering and cyberinfrastructure, many data processing and knowledge reasoning capabilities have become online accessible, shareable, and interoperable. Based on these recent improvements, this paper presents an idea of parameterless automatic classification which only requires an image and automatically outputs a labeled vector. No parameters or operations are needed from endpoint consumers. An approach is proposed to realize the idea. It adopts an ontology database to store the experience of tuning values for classifiers. A sample database is used to record training samples of image segments. Geoprocessing Web services are used as functionality blocks to carry out basic classification steps. Workflow technology is involved to turn the overall image classification into a fully automatic process. A Web-based prototypical system named PACS (Parameterless Automatic Classification System) is implemented. A number of images were fed into the system for evaluation purposes. The results show that the approach can automatically classify remote sensing images with fairly good average accuracy. It is indicated that the classified results will be more accurate if the two databases are of higher quality. Once the databases accumulate as much experience and as many samples as a human expert possesses, the approach should be able to obtain results of similar quality to those a human expert can achieve. Since the approach is fully automatic and parameterless, it can not only relieve remote sensing workers from the heavy and time-consuming parameter tuning work, but also significantly shorten the waiting time for consumers and facilitate them to engage in image
Influence of nuclei segmentation on breast cancer malignancy classification
Jelen, Lukasz; Fevens, Thomas; Krzyzak, Adam
2009-02-01
Breast cancer is one of the most deadly cancers affecting middle-aged women. Accurate diagnosis and prognosis are crucial to reduce the high death rate. Nowadays there are numerous diagnostic tools for breast cancer diagnosis. In this paper we discuss the role of nuclear segmentation from fine needle aspiration biopsy (FNA) slides and its influence on malignancy classification. Classification of malignancy plays a very important role during the diagnosis process of breast cancer. Out of all cancer diagnostic tools, FNA slides provide the most valuable information about the cancer malignancy grade, which helps to choose an appropriate treatment. This process involves assessing numerous nuclear features, and therefore precise segmentation of nuclei is very important. In this work we compare three powerful segmentation approaches and test their impact on the classification of breast cancer malignancy. The studied approaches involve level set segmentation, fuzzy c-means segmentation, and textural segmentation based on the co-occurrence matrix. Segmented nuclei were used to extract nuclear features for malignancy classification. For classification purposes, four different classifiers were trained and tested with the previously extracted features. The compared classifiers are the Multilayer Perceptron (MLP), Self-Organizing Maps (SOM), Principal Component-based Neural Network (PCA) and Support Vector Machines (SVM). The presented results show that level set segmentation yields the best results of the three compared approaches and leads to good feature extraction, with the lowest average error rate of 6.51% over the four classifiers. The best single performance was recorded for the multilayer perceptron, with an error rate of 3.07% using fuzzy c-means segmentation.
Multi-label literature classification based on the Gene Ontology graph
Directory of Open Access Journals (Sweden)
Lu Xinghua
2008-12-01
Full Text Available Abstract Background The Gene Ontology is a controlled vocabulary for representing knowledge related to genes and proteins in a computable form. The current effort of manually annotating proteins with the Gene Ontology is outpaced by the rate of accumulation of biomedical knowledge in the literature, which urges the development of text mining approaches to facilitate the process by automatically extracting Gene Ontology annotations from the literature. The task is usually cast as a text classification problem, and contemporary methods are confronted with unbalanced training data and the difficulties associated with multi-label classification. Results In this research, we investigated methods of enhancing automatic multi-label classification of biomedical literature by utilizing the structure of the Gene Ontology graph. We studied three graph-based multi-label classification algorithms, including a novel stochastic algorithm and two top-down hierarchical classification methods for multi-label literature classification. We systematically evaluated and compared these graph-based classification algorithms to a conventional flat multi-label algorithm. The results indicate that, by utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods can significantly improve predictions of the Gene Ontology terms implied by the analyzed text. Furthermore, the graph-based multi-label classifiers are capable of suggesting Gene Ontology annotations (to curators) that are closely related to the true annotations even if they fail to predict the true ones directly. A software package implementing the studied algorithms is available for the research community. Conclusion By utilizing the information from the structure of the Gene Ontology graph, the graph-based multi-label classification methods have better potential than the conventional flat multi-label classification approach to facilitate
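The hierarchical structure that the abstract above exploits can be illustrated with the ontology's basic consistency rule: a predicted term implies all of its ancestor terms. The sketch below, with a made-up toy graph, shows only that rule; it is not one of the paper's graph-based algorithms.

```python
# Enforce hierarchical label consistency over a term graph:
# every predicted term implies all of its ancestors.
def ancestors(term, parents):
    """Collect all ancestors of `term` given a child -> [parents] map."""
    seen = set()
    stack = [term]
    while stack:
        t = stack.pop()
        for p in parents.get(t, ()):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def consistent_labels(predicted, parents):
    """Close a predicted label set upward so no ancestor is missing."""
    out = set(predicted)
    for t in predicted:
        out |= ancestors(t, parents)
    return out
```

For example, with the toy hierarchy `{"c": ["b"], "b": ["a"]}`, predicting only `"c"` yields the consistent set `{"a", "b", "c"}`.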
Jeanne C. Chambers; David A. Pyke; Jeremy D. Maestas; Mike Pellant; Chad S. Boyd; Steven B. Campbell; Shawn Espinosa; Douglas W. Havlina; Kenneth E. Mayer; Amarina Wuenschel
2014-01-01
This Report provides a strategic approach for conservation of sagebrush ecosystems and Greater Sage- Grouse (sage-grouse) that focuses specifically on habitat threats caused by invasive annual grasses and altered fire regimes. It uses information on factors that influence (1) sagebrush ecosystem resilience to disturbance and resistance to invasive annual grasses and (2...
Energy Technology Data Exchange (ETDEWEB)
McManamay, Ryan A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Troia, Matthew J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); DeRolph, Christopher R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Samu, Nicole M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2016-01-01
Stream classifications are an inventory of different types of streams. Classifications help us explore similarities and differences among different types of streams, make inferences regarding stream ecosystem behavior, and communicate the complexities of ecosystems. We developed a nested, layered, and spatially contiguous stream classification to characterize the biophysical settings of stream reaches within the Eastern United States (~900,000 reaches). The classification is composed of five natural characteristics (hydrology, temperature, size, confinement, and substrate) along with several disturbance regime layers, each selected because of its relevance to hydropower mitigation. We developed the classification at the stream reach level using the National Hydrography Dataset Plus Version 1 (1:100k scale). The stream classification is useful to environmental mitigation for hydropower dams in multiple ways. First, it creates efficiency in the regulatory process by providing an objective and data-rich means to address meaningful mitigation actions. Secondly, the SCT addresses data gaps, as it quickly provides an inventory of hydrology, temperature, morphology, and ecological communities for the immediate project area as well as surrounding streams. This includes identifying potential reference streams as those that are proximate to the hydropower facility and fall within the same class. These streams can potentially be used to identify ideal environmental conditions or desired ecological communities. In doing so, the classification provides some context for how streams may function and respond to dam regulation, and an overview of specific mitigation needs. Herein, we describe the methodology for developing each stream classification layer and provide a tutorial to guide applications of the classification (and associated data) in regulatory settings, such as hydropower (re)licensing.
IMPROVING CLASSIFICATIONS OF ECONOMIC SCIENCES IN A THESAURUS
Directory of Open Access Journals (Sweden)
Sergey Vladimirovich Lesnikov
2013-09-01
Full Text Available The goal is to study the thesaurus as an instrument to define the classification of economic sciences, to adapt their classification to the increased information flow, to increase the accuracy of allocation of information resources with consideration of the users' needs, and to suggest alterations to the classification of economic sciences made by the Institute of Scientific Information for Social Sciences of the Russian Academy of Sciences (INION RAN) in 2001. The authors see the classification of economic sciences as a product of social communications theory – a differentiated aspect of social research. Modern science is subdivided into various aspects with varied subjects and methods. The latter overlap and form a hierarchy of concepts in science within the same research subject. The authors stress the importance of information retrieval systems for developing scientific knowledge. Information retrieval systems can immediately deliver data from different areas of science to the user, who can then integrate the information and obtain a vivid picture of the research subject. Search engines and rubricators are becoming increasingly important, as many Internet users show a tendency toward isolated thinking. The authors have devised an approach to using the thesaurus as the means of classifying the sciences and as a hyperlanguage of science.
The suggested methodological approach to structuring terms and notions via the thesaurus has been tested at Syktyvkar State University and the Syktyvkar branch of Saint-Petersburg Economic University. Methods: deduction, induction, analysis, synthesis, abstraction technique, classification. Results: the stages and main sections of the information-retrieval thesaurus of the hyperlanguage of economic science have been defined on the basis of existing classification systems of scientific knowledge. Scope of application of results: library services, information technology, education. DOI: http://dx.doi.org/10.12731/2218-7405-2013-8-22
Musah, Rabi A.; Espinoza, Edgard O.; Cody, Robert B.; Lesiak, Ashton D.; Christensen, Earl D.; Moore, Hannah E.; Maleknia, Simin; Drijfhout, Falko P.
2015-01-01
A high-throughput method for species identification and classification through chemometric processing of direct analysis in real time (DART) mass spectrometry-derived fingerprint signatures has been developed. The method entails introduction of samples to the open air space between the DART ion source and the mass spectrometer inlet, with the entire observed mass spectral fingerprint subjected to unsupervised hierarchical clustering. A range of both polar and non-polar chemotypes is instantaneously detected. The result is identification and species-level classification based on the entire DART-MS spectrum. Here, we illustrate how the method can be used to: (1) distinguish between endangered woods regulated by the Convention for the International Trade of Endangered Flora and Fauna (CITES) treaty; (2) assess the origin and by extension the properties of biodiesel feedstocks; (3) determine insect species from analysis of puparial casings; (4) distinguish between psychoactive plant products; and (5) differentiate between Eucalyptus species. An advantage of the hierarchical clustering approach to processing of the DART-MS-derived fingerprint is that it shows both similarities and differences between species based on their chemotypes. Furthermore, full knowledge of the identities of the constituents contained within the small-molecule profile of analyzed samples is not required. PMID:26156000
A Semisupervised Cascade Classification Algorithm
Directory of Open Access Journals (Sweden)
Stamatis Karlos
2016-01-01
Full Text Available Classification is one of the most important tasks of data mining techniques, which have been adopted by several modern applications. The shortage of labeled data in the majority of these applications has shifted interest towards semisupervised methods. Under such schemes, the use of collected unlabeled data combined with a clearly smaller set of labeled examples leads to similar or even better classification accuracy compared with supervised algorithms, which use labeled examples exclusively during the training phase. A novel approach for improving semisupervised classification using the Cascade Classifier technique is presented in this paper. The main characteristic of the Cascade Classifier strategy is the use of a base classifier to augment the feature space by adding either the predicted class or the class probability distribution of the initial data. The second-level classifier is supplied with the new dataset and extracts the decision for each instance. In this work, a self-trained NB∇C4.5 classifier algorithm is presented, which combines the characteristics of Naive Bayes as a base classifier and the speed of C4.5 for the final classification. We performed an in-depth comparison with other well-known semisupervised classification methods on standard benchmark datasets and concluded that the presented technique has better accuracy in most cases.
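Self-training of the kind described above can be sketched as follows. The nearest-centroid base learner and the one-point-per-round confidence rule are simplifying assumptions for illustration, not the paper's Naive Bayes/C4.5 pair.

```python
# Minimal self-training loop: fit on labeled data, then repeatedly
# promote the single most confidently predicted unlabeled point.
from statistics import mean

def centroids(X, y):
    """Per-class mean vector from labeled points."""
    out = {}
    for label in set(y):
        pts = [x for x, lbl in zip(X, y) if lbl == label]
        out[label] = tuple(mean(col) for col in zip(*pts))
    return out

def predict(cents, x):
    """Return (label, confidence); confidence = negative squared distance."""
    scored = {lbl: -sum((a - b) ** 2 for a, b in zip(c, x))
              for lbl, c in cents.items()}
    label = max(scored, key=scored.get)
    return label, scored[label]

def self_train(X_lab, y_lab, X_unlab, rounds=3):
    X_lab, y_lab, X_unlab = list(X_lab), list(y_lab), list(X_unlab)
    for _ in range(rounds):
        if not X_unlab:
            break
        cents = centroids(X_lab, y_lab)
        preds = [predict(cents, x) for x in X_unlab]
        # promote the most confident unlabeled point into the labeled pool
        i = max(range(len(preds)), key=lambda k: preds[k][1])
        X_lab.append(X_unlab.pop(i))
        y_lab.append(preds[i][0])
    return centroids(X_lab, y_lab)
```

A cascade variant would additionally feed the base learner's class probabilities into a second-level classifier as extra features; the loop above shows only the self-labeling half of that scheme.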
A Gameplay Definition through Videogame Classification
Directory of Open Access Journals (Sweden)
Damien Djaouti
2008-01-01
Full Text Available This paper is part of an experimental approach aimed at constructing a videogame classification. Inspired by the methodology that Propp used for the classification of Russian fairy tales, we have identified recurrent diagrams within the rules of videogames, which we called “Gameplay Bricks”. The combinations of these different bricks should allow us to represent a classification of all videogames in accordance with their rules. In this article, we study the nature of these bricks, especially the link they seem to have with two types of game rules: the rules that allow the player to “manipulate” the elements of the game, and the rules defining the “goal” of the game. This study leads to a hypothesis about the nature of gameplay.
Motor Oil Classification using Color Histograms and Pattern Recognition Techniques.
Ahmadi, Shiva; Mani-Varnosfaderani, Ahmad; Habibi, Biuck
2018-04-20
Motor oil classification is important for quality control and the identification of oil adulteration. In this work, we propose a simple, rapid, inexpensive and nondestructive approach based on image analysis and pattern recognition techniques for the classification of nine different types of motor oils according to their corresponding color histograms. For this, we applied color histograms in different color spaces such as red green blue (RGB), grayscale, and hue saturation intensity (HSI) in order to extract features that can help with the classification procedure. These color histograms and their combinations were used as input for model development and were then statistically evaluated using linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and support vector machine (SVM) techniques. Here, two common solutions for solving a multiclass classification problem were applied: (1) transformation to a binary classification problem using a one-against-all (OAA) approach and (2) extension from binary classifiers to a single globally optimized multilabel classification model. In the OAA strategy, LDA, QDA, and SVM reached up to 97% in terms of accuracy, sensitivity, and specificity for both the training and test sets. In the extension from the binary case, despite good performances by the SVM classification model, QDA and LDA provided better results, up to 92% for RGB-grayscale-HSI color histograms and up to 93% for the HSI color map, respectively. In order to reduce the number of independent variables for modeling, a principal component analysis algorithm was used. Our results suggest that the proposed method is promising for the identification and classification of different types of motor oils.
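The histogram-feature step described above can be sketched by quantizing each RGB channel into a few bins and concatenating the normalized counts; the bin count and the pixel format below are assumptions for the example, not the paper's settings.

```python
# Build a concatenated, per-channel RGB color histogram feature vector.
def rgb_histogram(pixels, bins=4):
    """pixels: iterable of (r, g, b) tuples with values in 0..255.
    Returns a list of 3 * bins normalized bin frequencies."""
    counts = [0] * (3 * bins)
    n = 0
    for r, g, b in pixels:
        for ch, v in enumerate((r, g, b)):
            # map 0..255 into `bins` equal-width bins
            counts[ch * bins + min(v * bins // 256, bins - 1)] += 1
        n += 1
    return [c / n for c in counts]
```

Feature vectors of this form (one per oil image, possibly concatenated across color spaces) would then be fed to a classifier such as LDA, QDA, or an SVM.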
Vulnerable land ecosystems classification using spatial context and spectral indices
Ibarrola-Ulzurrun, Edurne; Gonzalo-Martín, Consuelo; Marcello, Javier
2017-10-01
Natural habitats are exposed to growing pressure due to intensification of land use and tourism development. Thus, obtaining information on the vegetation is necessary for conservation and management projects. In this context, remote sensing is an important tool for monitoring and managing habitats, with classification being a crucial stage. The majority of image classification techniques are based upon the pixel-based approach. An alternative is the object-based (OBIA) approach, in which a previous segmentation step merges image pixels to create objects that are then classified. Besides, improved results may be gained by incorporating additional spatial information and specific spectral indices into the classification process. The main goal of this work was to implement and assess object-based classification techniques on very-high-resolution imagery, incorporating spectral indices and contextual spatial information in the classification models. The study area was Teide National Park in the Canary Islands (Spain), using WorldView-2 orthoready imagery. In the classification model, two common indices were selected, the Normalized Difference Vegetation Index (NDVI) and the Optimized Soil Adjusted Vegetation Index (OSAVI), as well as two specific WorldView-2 sensor indices, the WorldView Vegetation Index and the WorldView Soil Index. To include the contextual information, Grey Level Co-occurrence Matrices (GLCM) were used. The classification was performed by training a Support Vector Machine with a sufficient and representative number of vegetation samples (Spartocytisus supranubius, Pterocephalus lasiospermus, Descurainia bourgaeana and Pinus canariensis) as well as urban, road and bare soil classes. Confusion matrices were computed to evaluate the results from each classification model, with the highest overall accuracy (90.07%) obtained by combining both WorldView indices with GLCM dissimilarity.
Lauren classification and individualized chemotherapy in gastric cancer.
Ma, Junli; Shen, Hong; Kapesa, Linda; Zeng, Shan
2016-05-01
Gastric cancer is one of the most common malignancies worldwide. During the last 50 years, the histological classification of gastric carcinoma has been largely based on Lauren's criteria, in which gastric cancer is classified into two major histological subtypes, namely intestinal type and diffuse type adenocarcinoma. This classification was introduced in 1965 and remains widely accepted and employed today, since it constitutes a simple and robust classification approach. The two histological subtypes of gastric cancer proposed by the Lauren classification exhibit a number of distinct clinical and molecular characteristics, including histogenesis, cell differentiation, epidemiology, etiology, carcinogenesis, biological behaviors and prognosis. Gastric cancer exhibits varied sensitivity to chemotherapy drugs and significant heterogeneity; therefore, the disease may be a target for individualized therapy. The Lauren classification may provide the basis for individualized treatment for advanced gastric cancer, which is increasingly gaining attention in the scientific field. However, few studies have investigated individualized treatment guided by pathological classification. The aim of the current review is to analyze the two major histological subtypes of gastric cancer, as proposed by the Lauren classification, and to discuss the implications for personalized chemotherapy.
Automatic Amharic text news classification: A neural networks ...
African Journals Online (AJOL)
School of Computing and Electrical Engineering, Institute of Technology, Bahir Dar University, Bahir Dar ... The study is on the automatic classification of Amharic news using a neural networks approach. Learning Vector ... INTRODUCTION.
Kachach, Redouane; Cañas, José María
2016-05-01
Using video in traffic monitoring is one of the most active research domains in the computer vision community. TrafficMonitor, a system that employs a hybrid approach for automatic vehicle tracking and classification on highways using a simple stationary calibrated camera, is presented. The proposed system consists of three modules: vehicle detection, vehicle tracking, and vehicle classification. Moving vehicles are detected by an enhanced Gaussian mixture model background estimation algorithm. The design includes a technique to resolve the occlusion problem by using a combination of a two-dimensional proximity tracking algorithm and the Kanade-Lucas-Tomasi feature tracking algorithm. The last module classifies the identified shapes into five vehicle categories: motorcycle, car, van, bus, and truck, using three-dimensional templates and an algorithm based on histograms of oriented gradients and a support vector machine classifier. Several experiments have been performed using both real and simulated traffic in order to validate the system. The experiments were conducted on the GRAM-RTM dataset and on a real video dataset that is made publicly available as part of this work.
Xu, Z.; Guan, K.; Peng, B.; Casler, N. P.; Wang, S. W.
2017-12-01
Landscape has complex three-dimensional features. These 3D features are difficult to extract using conventional methods, and small-footprint LiDAR provides an ideal way of capturing them. Existing approaches, however, have been relegated to raster or metric-based (two-dimensional) feature extraction from the upper or bottom layer, and thus are not suitable for resolving morphological and intensity features that could be important to fine-scale land cover mapping. Therefore, this research combines airborne LiDAR and multi-temporal Landsat imagery to classify land cover types of Williamson County, Illinois, which has diverse and mixed landscape features. Specifically, we applied a 3D convolutional neural network (CNN) method to extract features from LiDAR point clouds by (1) creating an occupancy grid and an intensity grid at 1-meter resolution, and then (2) normalizing and feeding the data into a 3D CNN feature extractor for many epochs of learning. The learned features (e.g., morphological features, intensity features) were combined with multi-temporal spectral data to enhance the performance of land cover classification based on a Support Vector Machine classifier. We used photo interpretation for training and testing data generation. The classification results show that our approach outperforms traditional methods using LiDAR-derived feature maps, and promises to serve as an effective methodology for creating high-quality land cover maps through fusion of complementary types of remote sensing data.
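The occupancy-grid step above can be sketched by binning LiDAR points into 1 m voxels and marking the occupied cells; grid extent handling and intensity accumulation are omitted here for brevity, and the sparse-set representation is an assumption of this sketch.

```python
# Sparse occupancy grid: bin (x, y, z) LiDAR returns into voxels of
# side `cell` meters and record which voxels contain at least one point.
def occupancy_grid(points, cell=1.0):
    """points: iterable of (x, y, z) coordinates in meters.
    Returns the set of occupied integer voxel indices."""
    return {(int(x // cell), int(y // cell), int(z // cell))
            for x, y, z in points}
```

A dense tensor for the 3D CNN would then be built by writing 1 into each occupied index of a zero-initialized array (and, analogously, mean intensity into an intensity grid).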
An evaluation of classification systems for stillbirth
Directory of Open Access Journals (Sweden)
Pattinson Robert
2009-06-01
Full Text Available Abstract Background Audit and classification of stillbirths is an essential part of clinical practice and a crucial step towards stillbirth prevention. Due to the limitations of the ICD system and the lack of an international approach to an acceptable solution, numerous disparate classification systems have emerged. We assessed the performance of six contemporary systems to inform the development of an internationally accepted approach. Methods We evaluated the following systems: Amended Aberdeen, Extended Wigglesworth, PSANZ-PDC, ReCoDe, Tulip and CODAC. Nine teams from 7 countries applied the classification systems to cohorts of stillbirths from their regions, using 857 stillbirth cases. The main outcome measures were: the ability to retain the important information about the death using the InfoKeep rating, and the ease of use according to the Ease rating (both measures used a five-point scale with a score …). Results InfoKeep scores were significantly different across the classifications (p ≤ 0.01) due to low scores for Wigglesworth and Aberdeen. CODAC received the highest mean (SD) score of 3.40 (0.73), followed by PSANZ-PDC, ReCoDe and Tulip [2.77 (1.00), 2.36 (1.21), 1.92 (1.24) respectively]. Wigglesworth and Aberdeen resulted in a high proportion of unexplained stillbirths, and CODAC and Tulip the lowest. While Ease scores were different (p ≤ 0.01), all systems received satisfactory scores; CODAC received the highest score. Aberdeen and Wigglesworth showed poor agreement with kappas of 0.35 and 0.25 respectively. Tulip performed best with a kappa of 0.74. The remainder had good to fair agreement. Conclusion The Extended Wigglesworth and Amended Aberdeen systems cannot be recommended for classification of stillbirths. Overall, CODAC performed best, with PSANZ-PDC and ReCoDe performing well. Tulip was shown to have the best agreement and a low proportion of unexplained stillbirths. The virtues of these systems need to be considered in the development of an
GPGPU Accelerated Deep Object Classification on a Heterogeneous Mobile Platform
Directory of Open Access Journals (Sweden)
Syed Tahir Hussain Rizvi
2016-12-01
Full Text Available Deep convolutional neural networks achieve state-of-the-art performance in image classification. The computational and memory requirements of such networks are, however, huge, which is an issue on embedded devices due to their constraints. Most of this complexity derives from the convolutional layers and in particular from the matrix multiplications they entail. This paper proposes a complete approach to image classification providing common layers used in neural networks. Namely, the proposed approach relies on a heterogeneous CPU-GPU scheme for performing convolutions in the transform domain. The Compute Unified Device Architecture (CUDA)-based implementation of the proposed approach is evaluated over three different image classification networks on a Tegra K1 CPU-GPU mobile processor. Experiments show that the presented heterogeneous scheme boasts a 50× speedup over the CPU-only reference and outperforms a GPU-based reference by 2×, while slashing the power consumption by nearly 30%.
A bayesian approach to classification criteria for spectacled eiders
Taylor, B.L.; Wade, P.R.; Stehn, R.A.; Cochrane, J.F.
1996-01-01
To facilitate decisions to classify species according to risk of extinction, we used Bayesian methods to analyze trend data for the Spectacled Eider, an arctic sea duck. Trend data from three independent surveys of the Yukon-Kuskokwim Delta were analyzed individually and in combination to yield posterior distributions for population growth rates. We used classification criteria developed by the recovery team for Spectacled Eiders that seek to equalize errors of under- or overprotecting the species. We conducted both a Bayesian decision analysis and a frequentist (classical statistical inference) decision analysis. Bayesian decision analyses are computationally easier, yield basically the same results, and yield results that are easier to explain to nonscientists. With the exception of the aerial survey analysis of the 10 most recent years, both Bayesian and frequentist methods indicated that an endangered classification is warranted. The discrepancy between surveys warrants further research. Although the trend data are abundance indices, we used a preliminary estimate of absolute abundance to demonstrate how to calculate extinction distributions using the joint probability distributions for population growth rate and variance in growth rate generated by the Bayesian analysis. Recent apparent increases in abundance highlight the need for models that apply to declining and then recovering species.
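A toy version of the decision rule discussed above: given an assumed normal posterior for the population growth rate, classify as endangered when the posterior probability of decline exceeds a criterion. The normal form, thresholds, and category names below are illustrative assumptions, not the recovery team's actual criteria.

```python
# Bayesian-style listing decision from a normal posterior on growth rate r.
import math

def prob_below(mu, sigma, threshold):
    """P(r < threshold) under a Normal(mu, sigma) posterior, via the
    standard normal CDF expressed with math.erf."""
    z = (threshold - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))

def classify(mu, sigma, decline_threshold=0.0, criterion=0.5):
    """Label 'endangered' when the posterior probability that the growth
    rate falls below the decline threshold exceeds the criterion."""
    if prob_below(mu, sigma, decline_threshold) > criterion:
        return "endangered"
    return "not listed"
```

One appeal of this formulation, echoed in the abstract, is that the output is a direct posterior probability ("there is a 95% chance the population is declining"), which is easier to explain to nonscientists than a frequentist test statistic.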
Christophersen, Knut-Andreas; Elstad, Eyvind; Turmo, Are
2012-01-01
This article focuses on the comparison of organizational antecedents of teachers' fostering of students' effort in two quite different accountability regimes: one management regime with an external-accountability system and one with no external accountability devices. The methodology involves cross-sectional surveys from two different management…
Failure diagnosis using deep belief learning based health state classification
International Nuclear Information System (INIS)
Tamilselvan, Prasanna; Wang, Pingfeng
2013-01-01
Effective health diagnosis provides multifarious benefits such as improved safety, improved reliability, and reduced costs for the operation and maintenance of complex engineered systems. This paper presents a novel multi-sensor health diagnosis method using a deep belief network (DBN). The DBN has recently become a popular approach in machine learning for its promising advantages, such as fast inference and the ability to encode richer and higher-order network structures. The DBN employs a hierarchical structure with multiple stacked restricted Boltzmann machines and works through a layer-by-layer successive learning process. The proposed multi-sensor health diagnosis methodology using DBN-based state classification is structured in three consecutive stages: first, defining health states and preprocessing sensory data for DBN training and testing; second, developing DBN-based classification models for the diagnosis of predefined health states; third, validating the DBN classification models with the testing sensory dataset. Health diagnosis using the DBN-based health state classification technique is compared with four existing diagnosis techniques. Benchmark classification problems and two engineering health diagnosis applications, aircraft engine health diagnosis and electric power transformer health diagnosis, are employed to demonstrate the efficacy of the proposed approach.
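The stacked restricted Boltzmann machines at the core of a DBN are typically trained greedily with contrastive divergence. A minimal CD-1 update for a binary RBM might look as follows; this is a generic textbook sketch under assumed shapes and learning rate, not the paper's diagnosis pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, b, c, v0, lr=0.1):
    """One contrastive-divergence (CD-1) update for a binary RBM.
    v0: batch of visible vectors (n, nv); W: weights (nv, nh);
    b: visible biases (nv,); c: hidden biases (nh,)."""
    ph0 = sigmoid(v0 @ W + c)                      # positive phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + b)                    # reconstruction
    ph1 = sigmoid(pv1 @ W + c)                     # negative phase
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / n       # gradient approximations
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

Stacking then amounts to training one RBM on the sensor features, feeding its hidden probabilities to the next RBM as input, and repeating layer by layer before a supervised fine-tuning stage.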
Directory of Open Access Journals (Sweden)
Angelo Del Vecchio
2007-05-01
Full Text Available This article aims to characterize the political regime in force in Brazil between 1964 and 1985. To that end, I revisit the concept of dictatorship, especially as developed in the work of Carl Schmitt, and assess its adequacy for the analysis of the military period. Keywords: State. Political regime. Dictatorship. Military regime. The present article tries to qualify the political regime in Brazil from 1964 to 1985, using the concept of dictatorship mainly developed by Carl Schmitt in his writings, and tries to apply this concept to the Brazilian military regime. Keywords: State. Political regime. Dictatorship. Military regime.
Directory of Open Access Journals (Sweden)
Gavril PANDI
2011-03-01
Full Text Available The influenced flow regimes. The presence and activities of humanity influence the environmental system as a whole and, in this context, the water resources of rivers. Accordingly, the natural runoff regime undergoes larger and deeper changes, the nature of which depends on the type and degree of water use. The multitude of uses causes different types of influence with different quantitative aspects. At the same time, the influences have qualitative connotations as well, regarding the modification of the yearly runoff volume; the natural runoff regime is thus modified. After analyzing the distribution laws of monthly runoff, four types of influenced runoff regime have been differentiated. In the excess type, the influenced runoff is larger than the natural runoff continuously throughout the year. The deficient type is characterized by the inverse relation throughout the year. In the sinusoidal type, the influenced runoff is smaller than the natural runoff in the period when water is retained in the reservoirs, and in the depletion period the situation inverts. In the irregular type, the ratio between influenced and natural runoff changes from month to month in a random manner. Recognition of the influenced regime and of the degree of influence is necessary in the evaluation and analysis of usable hydrological river resources, in flood defence activities, in the complex planning of hydrographic basins, in environmental design, and so on.
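The four regime types can be separated mechanically from the twelve monthly ratios of influenced to natural runoff. The rule below is an illustrative reading of the abstract's definitions (the sign-change criterion for the sinusoidal type is my assumption, not the author's stated algorithm):

```python
def classify_regime(ratios):
    """Classify an influenced runoff regime from 12 monthly ratios of
    influenced/natural runoff."""
    above = [r > 1.0 for r in ratios]
    if all(above):
        return "excess"            # influenced runoff larger all year
    if not any(above):
        return "deficient"         # influenced runoff smaller all year
    # Count transitions around the circular year: one retention period and
    # one depletion period give exactly two transitions.
    changes = sum(above[i] != above[i - 1] for i in range(len(above)))
    return "sinusoidal" if changes == 2 else "irregular"
```

For example, a year with six reservoir-filling months (ratio below 1) followed by six depletion months (ratio above 1) classifies as sinusoidal, while a randomly alternating pattern falls into the irregular type.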
Il futuro regime dei tassi di cambio.
Directory of Open Access Journals (Sweden)
J. WILLIAMSON
2014-08-01
Full Text Available When the Committee of Twenty decided to rename the exchange-rate regime rather than reform it, some two weeks after the system had collapsed, it immediately became clear that the ambitious attempt to write a new monetary constitution for the world had failed. If future attempts to reform world monetary relations are not to be equally doomed to failure, a primary requirement is a more realistic approach to this central issue. Accordingly, this paper is devoted to a consideration of the type of approach that would be required. The author considers what solutions are feasible, managed floating versus crawling par values, the reference rate proposal, and reference rates as the basis for a reformed system. JEL: E42, F33
Annotation and Classification of CRISPR-Cas Systems.
Makarova, Kira S; Koonin, Eugene V
2015-01-01
The clustered regularly interspaced short palindromic repeats (CRISPR)-Cas (CRISPR-associated proteins) is a prokaryotic adaptive immune system that is represented in most archaea and many bacteria. Among the currently known prokaryotic defense systems, the CRISPR-Cas genomic loci show unprecedented complexity and diversity. Classification of CRISPR-Cas variants that would capture their evolutionary relationships to the maximum possible extent is essential for comparative genomic and functional characterization of this theoretically and practically important system of adaptive immunity. To this end, a multipronged approach has been developed that combines phylogenetic analysis of the conserved Cas proteins with comparison of gene repertoires and arrangements in CRISPR-Cas loci. This approach led to the current classification of CRISPR-Cas systems into three distinct types and ten subtypes for each of which signature genes have been identified. Comparative genomic analysis of the CRISPR-Cas systems in new archaeal and bacterial genomes performed over the 3 years elapsed since the development of this classification makes it clear that new types and subtypes of CRISPR-Cas need to be introduced. Moreover, this classification system captures only part of the complexity of CRISPR-Cas organization and evolution, due to the intrinsic modularity and evolutionary mobility of these immunity systems, resulting in numerous recombinant variants. Moreover, most of the cas genes evolve rapidly, complicating the family assignment for many Cas proteins and the use of family profiles for the recognition of CRISPR-Cas subtype signatures. Further progress in the comparative analysis of CRISPR-Cas systems requires integration of the most sensitive sequence comparison tools, protein structure comparison, and refined approaches for comparison of gene neighborhoods.
Is classification necessary after Google?
DEFF Research Database (Denmark)
Hjørland, Birger
2012-01-01
believe that the activity of “classification” is not worth the effort, as search engines can be improved without the heavy cost of providing metadata. Design/methodology/approach – The basic issue in classification is seen as providing criteria for deciding whether A should be classified as X...
Unifying description of the damping regimes of a stochastic particle in a periodic potential
Directory of Open Access Journals (Sweden)
Antonio Piscitelli, Massimo Pica Ciamarra
2017-09-01
Full Text Available We analyze the classical problem of the stochastic dynamics of a particle confined in a periodic potential, through the so-called Il'in and Khasminskii model, with a novel semi-analytical approach. Our approach gives access to the transient and the asymptotic dynamics in all damping regimes, which are difficult to investigate in the usual Brownian model. We show that the crossover from the overdamped to the underdamped regime is associated with the loss of a typical time scale and of a typical length scale, as signaled by the divergence of the probability distribution of a certain dynamical event. In the underdamped regime, normal diffusion coexists with a non-Gaussian displacement probability distribution for a long transient, as recently observed in a variety of different systems. We rationalize the microscopic physical processes leading to the non-Gaussian behavior, as well as the timescale needed to recover Gaussian statistics. The theoretical results are supported by numerical calculations and are compared to those obtained for the Brownian model.
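The Brownian benchmark the paper compares against can be simulated with a few lines of Euler-Maruyama integration of the underdamped Langevin equation in a cosine potential. This sketches the standard Brownian model, not the Il'in-Khasminskii model itself; potential, damping, and temperature values are arbitrary illustrations:

```python
import math
import random

def langevin_trajectory(steps=10000, dt=1e-3, gamma=0.1, kT=0.5, seed=0):
    """Euler-Maruyama integration of an underdamped particle (unit mass)
    in the periodic potential U(x) = -cos(x), so U'(x) = sin(x)."""
    rng = random.Random(seed)
    x, v = 0.0, 0.0
    noise = math.sqrt(2.0 * gamma * kT * dt)   # fluctuation-dissipation amplitude
    xs = []
    for _ in range(steps):
        v += (-gamma * v - math.sin(x)) * dt + noise * rng.gauss(0.0, 1.0)
        x += v * dt
        xs.append(x)
    return xs
```

Small `gamma` probes the underdamped regime where the long non-Gaussian transient is observed; large `gamma` recovers overdamped hopping between potential wells.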
Indian Academy of Sciences (India)
Flux scaling: the ultimate regime. With the Nusselt number and the mixing-length scales, the Nusselt number and Reynolds number (w'd/ν) scalings are obtained; this scaling is expected to occur at extremely high Ra in Rayleigh-Benard convection.
Learning semantic histopathological representation for basal cell carcinoma classification
Gutiérrez, Ricardo; Rueda, Andrea; Romero, Eduardo
2013-03-01
Diagnosis of a histopathology glass slide is a complex process that involves accurate recognition of several structures, their function in the tissue, and their relation to other structures. The way in which the pathologist represents the image content and the relations between those objects yields a better and more accurate diagnosis. Therefore, an appropriate semantic representation of the image content will be useful in several analysis tasks such as cancer classification, tissue retrieval, and histopathological image analysis, among others. Nevertheless, automatically recognizing those structures and extracting their inner semantic meaning are still very challenging tasks. In this paper we introduce a new semantic representation that allows histopathological concepts suitable for classification to be described. The approach herein identifies local concepts using a dictionary learning approach, i.e., the algorithm learns the most representative atoms from a set of randomly sampled patches, and then models the spatial relations among them by counting the co-occurrences between atoms while penalizing the spatial distance. The proposed approach was compared with a bag-of-features representation in a tissue classification task. For this purpose, 240 histological microscopical fields of view, 24 per tissue class, were collected. Those images fed a Support Vector Machine classifier per class, using 120 images as the training set and the remaining ones for testing, maintaining the same proportion of each concept in the training and test sets. The obtained classification results, averaged over 100 random partitions of training and test sets, show that our approach is on average more sensitive than the bag-of-features representation by almost 6%.
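The distance-penalized co-occurrence step can be sketched once each patch has been assigned to a dictionary atom. The Gaussian form of the distance penalty and the parameter `sigma` are assumptions for illustration (the atom-learning step itself is omitted):

```python
import math

def cooccurrence(assignments, positions, n_atoms, sigma=2.0):
    """Distance-penalized co-occurrence of dictionary atoms: every pair of
    patches contributes exp(-d^2 / (2 sigma^2)) to cell (atom_i, atom_j),
    so nearby pairs dominate the spatial-relation descriptor."""
    C = [[0.0] * n_atoms for _ in range(n_atoms)]
    for i in range(len(assignments)):
        for j in range(i + 1, len(assignments)):
            (xi, yi), (xj, yj) = positions[i], positions[j]
            w = math.exp(-((xi - xj) ** 2 + (yi - yj) ** 2) / (2 * sigma ** 2))
            a, b = assignments[i], assignments[j]
            C[a][b] += w
            C[b][a] += w            # keep the descriptor symmetric
    return C
```

The flattened matrix `C` then serves as the semantic feature vector fed to the per-class SVM classifiers.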
Collaborative classification of hyperspectral and visible images with convolutional neural network
Zhang, Mengmeng; Li, Wei; Du, Qian
2017-10-01
Recent advances in remote sensing technology have made multisensor data available for the same area, and it is well-known that remote sensing data processing and analysis often benefit from multisource data fusion. Specifically, low spatial resolution of hyperspectral imagery (HSI) degrades the quality of the subsequent classification task while using visible (VIS) images with high spatial resolution enables high-fidelity spatial analysis. A collaborative classification framework is proposed to fuse HSI and VIS images for finer classification. First, the convolutional neural network model is employed to extract deep spectral features for HSI classification. Second, effective binarized statistical image features are learned as contextual basis vectors for the high-resolution VIS image, followed by a classifier. The proposed approach employs diversified data in a decision fusion, leading to an integration of the rich spectral information, spatial information, and statistical representation information. In particular, the proposed approach eliminates the potential problems of the curse of dimensionality and excessive computation time. The experiments evaluated on two standard data sets demonstrate better classification performance offered by this framework.
Laser Theory for Optomechanics: Limit Cycles in the Quantum Regime
Directory of Open Access Journals (Sweden)
Niels Lörch
2014-01-01
Full Text Available Optomechanical systems can exhibit self-sustained limit cycles where the quantum state of the mechanical resonator possesses nonclassical characteristics such as a strongly negative Wigner density, as was shown recently in a numerical study by Qian et al. [Phys. Rev. Lett. 109, 253601 (2012)]. Here, we derive a Fokker-Planck equation describing mechanical limit cycles in the quantum regime that correctly reproduces the numerically observed nonclassical features. The derivation starts from the standard optomechanical master equation and is based on techniques borrowed from the laser theory due to Haake and Lewenstein. We compare our analytical model with numerical solutions of the master equation based on Monte Carlo simulations and find very good agreement over a wide and so far unexplored regime of system parameters. As one main conclusion, we predict negative Wigner functions to be observable even for surprisingly classical parameters, i.e., outside the single-photon strong-coupling regime, for strong cavity drive and rather large limit-cycle amplitudes. The approach taken here provides a natural starting point for further studies of quantum effects in optomechanics.
A Weighted Block Dictionary Learning Algorithm for Classification
Shi, Zhongrong
2016-01-01
Discriminative dictionary learning, playing a critical role in sparse representation based classification, has led to state-of-the-art classification results. Among the existing discriminative dictionary learning methods, two different approaches, shared dictionary and class-specific dictionary, which associate each dictionary atom to all classes or a single class, have been studied. The shared dictionary is a compact method but with lack of discriminative information; the class-specific dict...
Integrated tracking, classification, and sensor management theory and applications
Krishnamurthy, Vikram; Vo, Ba-Ngu
2012-01-01
A unique guide to the state of the art of tracking, classification, and sensor management. This book addresses the tremendous progress made over the last few decades in algorithm development and mathematical analysis for filtering, multi-target multi-sensor tracking, sensor management and control, and target classification. It provides for the first time an integrated treatment of these advanced topics, complete with careful mathematical formulation, clear description of the theory, and real-world applications. Written by experts in the field, Integrated Tracking, Classification, and Sensor Management provides readers with easy access to key Bayesian modeling and filtering methods, multi-target tracking approaches, target classification procedures, and large scale sensor management problem-solving techniques.
Racial classification in the evolutionary sciences: a comparative analysis.
Billinger, Michael S
2007-01-01
Human racial classification has long been a problem for the discipline of anthropology, but much of the criticism of the race concept has focused on its social and political connotations. The central argument of this paper is that race is not a specifically human problem, but one that exists in evolutionary thought in general. This paper looks at various disciplinary approaches to racial or subspecies classification, extending its focus beyond the anthropological race concept by providing a comparative analysis of the use of racial classification in evolutionary biology, genetics, and anthropology.
The classification of p-compact groups for p odd
DEFF Research Database (Denmark)
Andersen, Kasper K. S.; Grodal, Jesper Kragh; Møller, Jesper Michael
2008-01-01
A p-compact group, as defined by Dwyer and Wilkerson, is a purely homotopically defined p-local analog of a compact Lie group. It has long been the hope, and later the conjecture, that these objects should have a classification similar to the classification of compact Lie groups. In this paper we...... groups are uniquely determined as p-compact groups by their Weyl groups seen as finite reflection groups over the p-adic integers. Our approach in fact gives a largely self-contained proof of the entire classification theorem for p odd....
Directory of Open Access Journals (Sweden)
Stefan Dech
2012-09-01
Full Text Available We present a novel and innovative automated processing environment for the derivation of land cover (LC) and land use (LU) information. This processing framework, named TWOPAC (TWinned Object and Pixel based Automated classification Chain), enables the standardized, independent, user-friendly, and comparable derivation of LC and LU information with minimized manual classification labor. TWOPAC allows classification of multi-spectral and multi-temporal remote sensing imagery from different sensor types. TWOPAC enables not only pixel-based classification, but also classification based on object-based characteristics. Classification is based on a Decision Tree (DT) approach, for which the well-known C5.0 code has been implemented; it builds decision trees based on the concept of information entropy. TWOPAC enables automatic generation of the decision tree classifier from a C5.0-retrieved ASCII file, as well as fully automatic validation of the classification output via sample-based accuracy assessment. Envisaging the automated generation of standardized land cover products, as well as area-wide classification of large amounts of data in a preferably short processing time, standardized interfaces for process control, Web Processing Services (WPS), as introduced by the Open Geospatial Consortium (OGC), are utilized. TWOPAC's functionality to process geospatial raster or vector data via web resources (server, network) enables its use independent of any commercial client or desktop software and allows for large-scale data processing on servers. Furthermore, the components of TWOPAC were built from open-source code components and are implemented as a plug-in for the Quantum GIS software for easy handling of the classification process from the user's perspective.
Abrupt climate-independent fire regime changes
Pausas, Juli G.; Keeley, Jon E.
2014-01-01
Wildfires have played a determining role in distribution, composition and structure of many ecosystems worldwide and climatic changes are widely considered to be a major driver of future fire regime changes. However, forecasting future climatic change induced impacts on fire regimes will require a clearer understanding of other drivers of abrupt fire regime changes. Here, we focus on evidence from different environmental and temporal settings of fire regimes changes that are not directly attributed to climatic changes. We review key cases of these abrupt fire regime changes at different spatial and temporal scales, including those directly driven (i) by fauna, (ii) by invasive plant species, and (iii) by socio-economic and policy changes. All these drivers might generate non-linear effects of landscape changes in fuel structure; that is, they generate fuel changes that can cross thresholds of landscape continuity, and thus drastically change fire activity. Although climatic changes might contribute to some of these changes, there are also many instances that are not primarily linked to climatic shifts. Understanding the mechanism driving fire regime changes should contribute to our ability to better assess future fire regimes.
Are Some Technologies Beyond Regulatory Regimes?
Energy Technology Data Exchange (ETDEWEB)
Jones, Wendell B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kusnezov, Dimitri [National Nuclear Security Administration (NNSA), Washington, DC (United States)
2017-08-01
Regulatory frameworks are a common tool in governance to incent and coerce behaviors supporting national or strategic stability. This includes domestic regulations and international agreements. Though regulation is always a challenge, the domain of fast evolving threats, like cyber, are proving much more difficult to control. Many discussions are underway searching for approaches that can provide national security in these domains. We use game theoretic learning models to explore the question of strategic stability with respect to the democratization of certain technologies (such as cyber). We suggest that such many-player games could inherently be chaotic with no corresponding (Nash) equilibria. In the absence of such equilibria, traditional approaches, as measures to achieve levels of overall security, may not be suitable approaches to support strategic stability in these domains. Altogether new paradigms may be needed for these issues. At the very least, regulatory regimes that fail to address the basic nature of the technology domains should not be pursued as a default solution, regardless of success in other domains. In addition, the very chaotic nature of these domains may hold the promise of novel approaches to regulation.
DETECTION OF MALICIOUS SOFTWARE USING CLASSICAL AND NEURAL NETWORK CLASSIFICATION METHODS
Directory of Open Access Journals (Sweden)
S. V. Zhernakov
2015-01-01
Full Text Available Problem statement: the spectrum of problems solved by modern mobile systems such as Android is constantly growing, owing on the one hand to the potential implemented in the hardware and, on the other, to its integration with modern information technologies, which together form powerful hardware-software information systems capable of performing many functions, including information protection. The growing flow of information and the increasing complexity of the processes and of the hardware and software components of Android devices force developers to create new means of protection that perform this process efficiently and with high quality. This is especially important for automated systems that perform classification (clustering) of existing software into two classes: benign and malicious software. The aim is to increase the reliability and quality of recognition achieved by modern built-in information security tools, and to justify and select the methods for carrying out these functions. Methods used: to accomplish this goal, classical classification methods, a neural network method based on standard architectures, and the support vector machine (SVM) are analyzed and applied. Novelty: the paper presents the concept of using support vectors for identifying malicious software, together with the methodological, algorithmic, and software support that implements this concept for mobile communication devices. Result: qualitative and quantitative characteristics of the security software are obtained. Practical value: a technique for the development of advanced information security systems in mobile environments such as Android. An approach to the description of malware behavior is presented, based on the virus life cycle: dormancy - awakening - analysis of weaknesses - action: normal operation or attack (threat).
Pathological Bases for a Robust Application of Cancer Molecular Classification
Directory of Open Access Journals (Sweden)
Salvador J. Diaz-Cano
2015-04-01
Full Text Available Any robust classification system depends on its purpose and must refer to accepted standards, its strength relying on predictive values and a careful consideration of known factors that can affect its reliability. In this context, a molecular classification of human cancer must refer to the current gold standard (the histological classification) and try to improve it with key prognosticators for metastatic potential, staging, and grading. Although organ-specific examples have been published based on proteomics, transcriptomics, and genomics evaluations, the most popular approach uses gene expression analysis as a direct correlate of cellular differentiation, which represents the key feature of the histological classification. RNA is a labile molecule that varies significantly according to the preservation protocol, its transcription reflects the adaptation of the tumor cells to the microenvironment, it can be passed on through mechanisms of intercellular transference of genetic information (exosomes), and it is exposed to epigenetic modifications. More robust classifications should be based on stable molecules, at the genetic level represented by DNA, to improve reliability, and their analysis must deal with the concept of intratumoral heterogeneity, which is at the origin of tumor progression and is the byproduct of the selection process during the clonal expansion and progression of neoplasms. The simultaneous analysis of multiple DNA targets and next-generation sequencing offer the best practical approach for an analytical genomic classification of tumors.
A New Regime of Nanoscale Thermal Transport: Collective Diffusion Increases Dissipation Efficiency
2015-04-21
...different regimes of thermal transport. The laser-induced thermal expansion and subsequent cooling of the nanogratings is probed using coherent extreme UV... technique compared with previously reported MFP spectroscopy techniques. First, our approach that combines nanoheaters with the phase sensitivity of...
Eurasian polities as hybrid regimes: The case of Putin's Russia
Directory of Open Access Journals (Sweden)
Henry E. Hale
2010-01-01
Full Text Available Most Eurasian countries' political systems are not accurately described as some version of either democracy or authoritarianism. Nor does it advance social science to study each of these countries' political systems as being completely unique, sharing no significant commonalities with those of other countries. Instead, it is more fruitful to understand many Eurasian countries as a type of hybrid regime, a system that combines important elements of both democracy and autocracy in some way. One of the most important features of Eurasia's hybrid regimes, one that is shared by many hybrid regimes worldwide, is that they combine contested elections with pervasive political clientelism. Political developments in these countries can thus be usefully understood as machine politics, and the development of political systems can be understood as processes of rearranging the components of the machines in different ways. The usefulness of this approach is demonstrated through an in-depth study of the Russian Federation. It is argued that Russian political development under Putin is best understood not as “authoritarianization” but as a process in which Russia transitioned from a system of “competing pyramids” of machine power to a “single-pyramid” system, a system dominated by one large political machine. It turns out that in single-pyramid systems that preserve contested elections, as does Russia, public opinion matters more than in typical authoritarian regimes.
Towards Automatic Classification of Wikipedia Content
Szymański, Julian
Wikipedia - the Free Encyclopedia - encounters the problem of proper classification of new articles every day. The process of assigning articles to categories is performed manually and is a time-consuming task. It requires knowledge about the Wikipedia structure that is beyond typical editor competence, which leads to human-caused mistakes: omitted or wrong assignments of articles to categories. The article presents the application of an SVM classifier for automatic classification of documents from The Free Encyclopedia. The classifier has been tested using two text representations: inter-document connections (hyperlinks) and word content. The results of the experiments, evaluated on hand-crafted data, show that the Wikipedia classification process can be partially automated. The proposed approach can be used for building a decision support system that suggests to editors the best categories fitting new content entered into Wikipedia.
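The word-content representation is typically a TF-IDF vector space. As a minimal stand-in for the SVM (to keep the sketch dependency-free), the toy below classifies a new article by cosine similarity to labeled documents; the corpus, smoothing formula, and nearest-neighbour rule are illustrative assumptions:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Sparse TF-IDF vectors (dicts) over a whitespace-tokenized corpus."""
    df = Counter()
    for doc in docs:
        df.update(set(doc.split()))
    n = len(docs)
    vecs = []
    for doc in docs:
        tf = Counter(doc.split())
        vecs.append({w: tf[w] * math.log((1 + n) / (1 + df[w])) for w in tf})
    return vecs

def cosine(a, b):
    dot = sum(a[w] * b.get(w, 0.0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(query, docs, labels):
    """Assign `query` the label of its most similar labeled document."""
    vecs = tfidf_vectors(docs + [query])
    q = vecs[-1]
    scores = [cosine(q, v) for v in vecs[:-1]]
    return labels[scores.index(max(scores))]
```

In the article's actual setting, the same vectors (and hyperlink-based vectors) feed a trained SVM instead of this nearest-neighbour rule.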
Ecker, Christine; Marquand, Andre; Mourão-Miranda, Janaina; Johnston, Patrick; Daly, Eileen M; Brammer, Michael J; Maltezos, Stefanos; Murphy, Clodagh M; Robertson, Dene; Williams, Steven C; Murphy, Declan G M
2010-08-11
Autism spectrum disorder (ASD) is a neurodevelopmental condition with multiple causes, comorbid conditions, and a wide range in the type and severity of symptoms expressed by different individuals. This makes the neuroanatomy of autism inherently difficult to describe. Here, we demonstrate how a multiparameter classification approach can be used to characterize the complex and subtle structural pattern of gray matter anatomy implicated in adults with ASD, and to reveal spatially distributed patterns of discriminating regions for a variety of parameters describing brain anatomy. A set of five morphological parameters including volumetric and geometric features at each spatial location on the cortical surface was used to discriminate between people with ASD and controls using a support vector machine (SVM) analytic approach, and to find a spatially distributed pattern of regions with maximal classification weights. On the basis of these patterns, SVM was able to identify individuals with ASD at a sensitivity and specificity of up to 90% and 80%, respectively. However, the ability of individual cortical features to discriminate between groups was highly variable, and the discriminating patterns of regions varied across parameters. The classification was specific to ASD rather than neurodevelopmental conditions in general (e.g., attention deficit hyperactivity disorder). Our results confirm the hypothesis that the neuroanatomy of autism is truly multidimensional, and affects multiple and most likely independent cortical features. The spatial patterns detected using SVM may help further exploration of the specific genetic and neuropathological underpinnings of ASD, and provide new insights into the most likely multifactorial etiology of the condition.
Track classification within wireless sensor network
Doumerc, Robin; Pannetier, Benjamin; Moras, Julien; Dezert, Jean; Canevet, Loic
2017-05-01
In this paper, we present our study of track classification that takes into account environmental information and target state estimates. The tracker uses several motion models adapted to different target dynamics (pedestrian, ground vehicle, and SUAV, i.e. small unmanned aerial vehicle) and operates in a centralized architecture. The main idea is to exploit both the classifications given by heterogeneous sensors and the classification obtained with our fusion module. The fusion module, presented in this paper, assigns a class to each track according to track location, velocity, and the associated uncertainty. To model the likelihood of each class, a fuzzy approach is used that considers constraints on the target's capability to move in the environment. Then an evidential reasoning approach based on Dempster-Shafer Theory (DST) is used to integrate this classifier output over time. The fusion rules are tested and compared on real data obtained with our wireless sensor network. In order to handle realistic ground target tracking scenarios, we use autonomous smart computers deposited in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of the system is evaluated in a real exercise for an intelligence operation (a "hunter hunt" scenario).
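The evidential core of such a fusion module is Dempster's rule of combination: mass functions from two sources are combined conjunctively over the frame of discernment and renormalized by the conflicting mass. The sketch below uses an invented frame {pedestrian, vehicle, SUAV} and invented masses; the paper's actual fusion rules and class model are not reproduced here.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions (dict frozenset -> mass),
    accumulating products on subset intersections and renormalizing by 1 - K,
    where K is the mass assigned to conflicting (empty) intersections."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    k = 1.0 - conflict
    if k <= 0.0:
        raise ValueError("total conflict; Dempster's rule is undefined")
    return {subset: mass / k for subset, mass in combined.items()}

# Hypothetical masses over {pedestrian, vehicle, SUAV} from two sources
P = frozenset({"pedestrian"})
V = frozenset({"vehicle"})
PV = frozenset({"pedestrian", "vehicle"})
m_sensor = {P: 0.6, PV: 0.4}   # a sensor leaning toward "pedestrian"
m_motion = {V: 0.3, PV: 0.7}   # a motion model leaning toward "vehicle"
fused = dempster_combine(m_sensor, m_motion)
```

Repeating this combination trial after trial (fusing each new classifier output into the running mass) gives the time integration mentioned in the abstract.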
The trade regime and the climate regime. Institutional evolution and adaptation
International Nuclear Information System (INIS)
Brewer, Thomas L.
2003-01-01
This article addresses concerns that the multilateral trade regime centered in the WTO and the emerging climate regime may conflict in ways that could be damaging to either or both. The article discusses the institutional and diplomatic context of these concerns, and it identifies the kinds of issues that are in question. The analysis suggests that there are opportunities for win-win outcomes in the interactions of the two regimes, for instance in the possibility of reducing fossil fuel subsidies. However, there are also problematic areas where they intersect. A core issue, and as yet an unresolved one, is whether and how emission credit trading and other activities envisioned by the Kyoto Protocol would be subject to WTO rules. The resolution of this issue will affect many other issues as well. Additional specific issues about the interactions of particular provisions in WTO agreements and the Kyoto Protocol are analyzed in a subsequent companion article in Climate Policy.
A long-memory model of motor learning in the saccadic system: a regime-switching approach.
Wong, Aaron L; Shelhamer, Mark
2013-08-01
Maintenance of movement accuracy relies on motor learning, by which prior errors guide future behavior. One aspect of this learning process involves the accurate generation of predictions of movement outcome. These predictions can, for example, drive anticipatory movements during a predictive-saccade task. Predictive saccades are rapid eye movements made to anticipated future targets based on error information from prior movements. This predictive process exhibits long-memory (fractal) behavior, as suggested by inter-trial fluctuations. Here, we model this learning process using a regime-switching approach, which avoids the computational complexities associated with true long-memory processes. The resulting model demonstrates two fundamental characteristics. First, long-memory behavior can be mimicked by a system possessing no true long-term memory, producing model outputs consistent with human-subjects performance. In contrast, the popular two-state model, which is frequently used in motor learning, cannot replicate these findings. Second, our model suggests that apparent long-term memory arises from the trade-off between correcting for the most recent movement error and maintaining consistent long-term behavior. Thus, the model surprisingly predicts that stronger long-memory behavior correlates with faster learning during adaptation (in which systematic errors drive large behavioral changes): greater apparent long-term memory indicates more effective incorporation of error from the cumulative history across trials.
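The flavor of a regime-switching error-correction model can be conveyed with a short simulation: a trial-by-trial prediction error is corrected by a gain that depends on a hidden two-state Markov regime (fast-learning vs. persistent). All parameter values below are invented for illustration; the paper's actual model and fitting procedure are not reproduced.

```python
import numpy as np

def simulate_regime_switching(n, p_stay=0.98, gains=(0.8, 0.05), noise=0.1, seed=1):
    """Prediction-error series under two error-correction regimes: a fast-learning
    regime (large gain) and a persistent regime (small gain), switched by a
    two-state Markov chain with stay probability p_stay."""
    rng = np.random.default_rng(seed)
    errors = np.zeros(n)
    states = np.zeros(n, dtype=int)
    state = 0
    for t in range(1, n):
        if rng.random() > p_stay:      # occasional regime switch
            state = 1 - state
        states[t] = state
        # each trial corrects a fraction gains[state] of the previous error,
        # then motor noise is added
        errors[t] = (1.0 - gains[state]) * errors[t - 1] + noise * rng.standard_normal()
    return errors, states

errors, states = simulate_regime_switching(5000)
```

Long stretches in the persistent regime produce slowly decaying correlations in the error series, mimicking long memory even though each regime individually is short-memory.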
Locality-preserving sparse representation-based classification in hyperspectral imagery
Gao, Lianru; Yu, Haoyang; Zhang, Bing; Li, Qingting
2016-10-01
This paper proposes to combine locality-preserving projections (LPP) and sparse representation (SR) for hyperspectral image classification. The LPP is first used to reduce the dimensionality of all the training and testing data by finding the optimal linear approximations to the eigenfunctions of the Laplace Beltrami operator on the manifold, where the high-dimensional data lies. Then, SR codes the projected testing pixels as sparse linear combinations of all the training samples to classify the testing pixels by evaluating which class leads to the minimum approximation error. The integration of LPP and SR represents an innovative contribution to the literature. The proposed approach, called locality-preserving SR-based classification, addresses the imbalance between high dimensionality of hyperspectral data and the limited number of training samples. Experimental results on three real hyperspectral data sets demonstrate that the proposed approach outperforms the original counterpart, i.e., SR-based classification.
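The classification step - assign a test pixel to the class whose training samples reconstruct it with minimum error - can be sketched as follows. For brevity this sketch uses a per-class least-squares fit rather than a true l1 sparse code, and omits the LPP projection; the spectra and class names are invented.

```python
import numpy as np

def min_residual_classify(pixel, train_by_class):
    """Assign the pixel to the class whose training spectra reconstruct it with
    the smallest least-squares residual (dictionary columns = training samples)."""
    best_label, best_err = None, np.inf
    for label, samples in train_by_class.items():
        A = np.asarray(samples, float).T             # bands x samples
        coef, *_ = np.linalg.lstsq(A, pixel, rcond=None)
        err = np.linalg.norm(pixel - A @ coef)       # reconstruction error
        if err < best_err:
            best_label, best_err = label, err
    return best_label

# Hypothetical 4-band spectra for two ground-cover classes
train = {"vegetation": [[0.1, 0.2, 0.8, 0.9], [0.1, 0.3, 0.7, 0.9]],
         "soil":       [[0.5, 0.5, 0.4, 0.3], [0.6, 0.5, 0.4, 0.4]]}
pixel = np.array([0.1, 0.25, 0.75, 0.9])   # lies in the vegetation subspace
label = min_residual_classify(pixel, train)
```

Because the test pixel is an exact linear combination of the two vegetation spectra, its vegetation residual is (numerically) zero and the classifier picks that class.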
Multifractal regime transition in a modified minority game model
International Nuclear Information System (INIS)
Crepaldi, Antonio F.; Rodrigues Neto, Camilo; Ferreira, Fernando F.; Francisco, Gerson
2009-01-01
The search for more realistic modeling of financial time series reveals several stylized facts of real markets. In this work we focus on the multifractal properties found in price and index signals. Although the usual minority game (MG) models do not exhibit multifractality, we study here one of its variants that does. We show that the nonsynchronous MG model in the nonergodic phase is multifractal and, in this sense, together with other stylized facts, constitutes a better modeling tool. Using the structure function (SF) approach we detected the stationarity and the scaling range of the time series generated by the MG model and, from the linear (nonlinear) behavior of the SF, we identified the fractal (multifractal) regimes. Finally, using the wavelet transform modulus maxima (WTMM) technique we obtained the multifractal spectrum width for different dynamical regimes.
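The structure function diagnostic mentioned above estimates scaling exponents zeta(q) from S_q(tau) = mean(|x(t+tau) - x(t)|^q): a zeta(q) that is linear in q indicates a monofractal signal, while nonlinear curvature indicates multifractality. A minimal sketch, checked here on ordinary Brownian motion (a known monofractal with H = 1/2, so zeta(q) = q/2):

```python
import numpy as np

def structure_exponent(x, q, taus):
    """zeta(q): slope of log S_q(tau) versus log tau, where
    S_q(tau) = mean(|x(t + tau) - x(t)|**q) over the series."""
    log_sq = [np.log(np.mean(np.abs(x[tau:] - x[:-tau]) ** q)) for tau in taus]
    slope, _ = np.polyfit(np.log(np.asarray(taus, float)), log_sq, 1)
    return slope

# Monofractal check: for Brownian motion zeta(1) ~ 0.5 and zeta(2) ~ 1.0,
# i.e. the exponents grow linearly in q and no multifractality is detected.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(200_000))
taus = [1, 2, 4, 8, 16, 32]
zeta1 = structure_exponent(walk, 1, taus)
zeta2 = structure_exponent(walk, 2, taus)
```

Applied to MG output, curvature of zeta(q) away from a straight line over the detected scaling range would signal the multifractal regime; the WTMM spectrum-width computation is more involved and not sketched here.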
Dense Iterative Contextual Pixel Classification using Kriging
DEFF Research Database (Denmark)
Ganz, Melanie; Loog, Marco; Brandt, Sami
2009-01-01
In medical applications, segmentation has become an ever more important task. One of the competitive schemes to perform such segmentation is by means of pixel classification. Simple pixel-based classification schemes can be improved by incorporating contextual label information. Various methods have been proposed to this end, e.g., iterative contextual pixel classification, iterated conditional modes, and other approaches related to Markov random fields. A problem of these methods, however, is their computational complexity, especially when dealing with high-resolution images in which relatively long range interactions may play a role. We propose a new method based on Kriging that makes it possible to include such long range interactions, while keeping the computations manageable when dealing with large medical images.
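The interpolation core of Kriging - solving the covariance system C w = c0 for weights and predicting as a weighted sum of observed values - is compact. The sketch below is simple kriging with a Gaussian covariance and a known zero mean, on invented 2-D sites; the paper's use of Kriging to propagate contextual label information over long ranges builds on this same machinery.

```python
import numpy as np

def simple_krige(coords, values, target, length=1.0):
    """Simple kriging with a Gaussian covariance and known zero mean: solve
    C w = c0 for the kriging weights, then predict w . values."""
    coords = np.asarray(coords, float)
    target = np.asarray(target, float)

    def cov(a, b):
        # Gaussian covariance: decays smoothly with squared distance
        return np.exp(-np.sum((a - b) ** 2) / length ** 2)

    n = len(coords)
    C = np.array([[cov(coords[i], coords[j]) for j in range(n)] for i in range(n)])
    c0 = np.array([cov(coords[i], target) for i in range(n)])
    weights = np.linalg.solve(C, c0)
    return weights @ np.asarray(values, float)

# Hypothetical pixel sites carrying soft class scores
sites = [[0, 0], [1, 0], [0, 1]]
scores = [1.0, 2.0, 3.0]
pred_at_site = simple_krige(sites, scores, [0, 0])    # kriging is exact at data
pred_between = simple_krige(sites, scores, [0.5, 0.5])
```

The exactness at observed sites (the prediction at [0, 0] returns the observed score) is the key interpolation property; between sites the covariance length scale controls how far contextual influence reaches.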