WorldWideScience

Sample records for classification schemes based

  1. A new Fourier transform based CBIR scheme for mammographic mass classification: a preliminary invariance assessment

    Science.gov (United States)

    Gundreddy, Rohith Reddy; Tan, Maxine; Qui, Yuchen; Zheng, Bin

    2015-03-01

The purpose of this study is to develop and test a new content-based image retrieval (CBIR) scheme that achieves higher reproducibility when implemented in an interactive computer-aided diagnosis (CAD) system, without significantly reducing lesion classification performance. The new Fourier transform based CBIR algorithm determines the similarity of two regions of interest (ROIs) from the difference of the average regional pixel value distributions in the two Fourier transform mapped images under comparison. A reference image database of 227 ROIs depicting verified soft-tissue breast lesions was used. For each testing ROI, the queried lesion center was systematically shifted from 10 to 50 pixels to simulate inter-user variation in querying a suspicious lesion center with an interactive CAD system. Lesion classification performance and reproducibility under these center shifts were assessed and compared among three CBIR schemes based on the Fourier transform, mutual information and Pearson correlation. Each CBIR scheme retrieved the 10 most similar reference ROIs and computed a likelihood score of the queried ROI depicting a malignant lesion. The experimental results showed that the three CBIR schemes yielded very comparable lesion classification performance as measured by the areas under ROC curves (p > 0.498). However, the CBIR scheme using the Fourier transform yielded the highest invariance to both queried lesion center shift and lesion size change. This study demonstrates the feasibility of improving the robustness of interactive CAD systems by adding a new Fourier transform based image feature to CBIR schemes.
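
The retrieval step described above might be sketched as follows: score ROI similarity by the difference of average regional magnitudes in the Fourier-mapped images, retrieve the 10 most similar references, and average their labels into a likelihood score. The 4x4 region grid, the absolute-difference distance and the label-averaging rule are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def fourier_region_features(roi, grid=4):
    """Average magnitude of the Fourier-transformed ROI over a grid of regions."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(roi)))
    h, w = mag.shape
    feats = [mag[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid].mean()
             for i in range(grid) for j in range(grid)]
    return np.array(feats)

def retrieve_and_score(query_roi, ref_rois, ref_labels, k=10):
    """Return indices of the k most similar references and a malignancy likelihood."""
    q = fourier_region_features(query_roi)
    dists = [np.abs(fourier_region_features(r) - q).sum() for r in ref_rois]
    order = np.argsort(dists)[:k]
    likelihood = np.mean([ref_labels[i] for i in order])  # fraction of malignant neighbours
    return order, likelihood
```

Because the features are regional averages of the Fourier magnitude, small shifts of the queried lesion center perturb them only mildly, which is the intuition behind the reported invariance.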

  2. Segmentation techniques evaluation based on a single compact breast mass classification scheme

    Science.gov (United States)

    Matheus, Bruno R. N.; Marcomini, Karem D.; Schiabel, Homero

    2016-03-01

In this work, several segmentation techniques are evaluated using a simple centroid-based classification system for breast mass delineation in digital mammography images, with the aim of determining the best one for future CADx development. Six techniques were tested: Otsu, SOM, EICAMM, Fuzzy C-Means, K-Means and Level-Set. All of them were applied to segment 317 mammography images from the DDSM database. A single compact set of attributes was extracted and two centroids were defined, one for malignant and one for benign cases. The final classification was based on proximity to a given centroid, and the best results were obtained by the Level-Set technique with 68.1% accuracy, which indicates this method is the most promising for breast mass segmentation aimed at more precise interpretation in CADx schemes.
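
The centroid-based classifier used for the evaluation can be sketched in a few lines: one centroid per class is computed from training feature vectors, and a test vector is assigned to the class whose centroid is closest. Euclidean distance is an assumption here; the paper does not commit to a metric in this excerpt.

```python
import numpy as np

def train_centroids(features, labels):
    """Compute one centroid per class (e.g. 0 = benign, 1 = malignant)."""
    features = np.asarray(features, dtype=float)
    labels = np.asarray(labels)
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(feature_vec, centroids):
    """Assign the class whose centroid is closest (Euclidean distance)."""
    return min(centroids, key=lambda c: np.linalg.norm(feature_vec - centroids[c]))
```

The appeal of this rule for a segmentation benchmark is that it has no tunable parameters, so accuracy differences reflect the quality of the segmented masses rather than classifier tuning.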

  3. A new classification scheme of plastic wastes based upon recycling labels

    Energy Technology Data Exchange (ETDEWEB)

    Özkan, Kemal, E-mail: kozkan@ogu.edu.tr [Computer Engineering Dept., Eskişehir Osmangazi University, 26480 Eskişehir (Turkey); Ergin, Semih, E-mail: sergin@ogu.edu.tr [Electrical Electronics Engineering Dept., Eskişehir Osmangazi University, 26480 Eskişehir (Turkey); Işık, Şahin, E-mail: sahini@ogu.edu.tr [Computer Engineering Dept., Eskişehir Osmangazi University, 26480 Eskişehir (Turkey); Işıklı, İdil, E-mail: idil.isikli@bilecik.edu.tr [Electrical Electronics Engineering Dept., Bilecik University, 11210 Bilecik (Turkey)

    2015-01-15

Highlights: • PET, HDPE and PP types of plastics are considered. • An automated classification of plastic bottles based on feature extraction and classification methods is performed. • The decision mechanism consists of PCA, Kernel PCA, FLDA, SVD and Laplacian Eigenmaps methods. • SVM is selected to achieve the classification task, and a majority voting technique is used. - Abstract: Since recycling of materials is widely assumed to be environmentally and economically beneficial, reliable sorting and processing of waste packaging materials such as plastics is very important for high-efficiency recycling. An automated system that can quickly categorize these materials is needed to obtain maximum classification accuracy while maintaining high throughput. In this paper, photographs of plastic bottles were first taken and several preprocessing steps carried out. The first preprocessing step is to extract the plastic area of a bottle from the background. Then morphological image operations are applied: edge detection, noise removal, hole filling, image enhancement and image segmentation. These morphological operations can generally be defined in terms of combinations of erosion and dilation, and they eliminate the effect of bottle color as well as the label. Secondly, the pixel-wise intensity values of the plastic bottle images are used together with the most popular subspace and statistical feature extraction methods to construct the feature vectors. Only three types of plastics are considered because they occur far more frequently than other plastic types worldwide. The decision mechanism consists of five different feature extraction methods, including Principal Component Analysis (PCA), Kernel PCA (KPCA), Fisher's Linear Discriminant Analysis (FLDA), Singular Value Decomposition (SVD) and Laplacian Eigenmaps (LEMAP), and uses a simple
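
The overall decision pipeline (several subspace feature extractions feeding parallel classifiers whose votes are fused by majority) might be sketched as below. This is a simplified stand-in: PCA via SVD represents all five subspace methods, and a nearest-centroid rule substitutes for the paper's SVM classifiers to keep the sketch self-contained.

```python
import numpy as np
from collections import Counter

def pca_project(X, k):
    """Project centred data onto the top-k principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def nearest_centroid_predict(Z_train, y_train, Z_test):
    """Per-subspace classifier: assign each test point to the nearest class centroid."""
    cents = {c: Z_train[y_train == c].mean(axis=0) for c in np.unique(y_train)}
    return [min(cents, key=lambda c: np.linalg.norm(z - cents[c])) for z in Z_test]

def majority_vote(predictions_per_method):
    """Fuse the per-method label lists by majority voting."""
    return [Counter(votes).most_common(1)[0][0] for votes in zip(*predictions_per_method)]
```

Each feature extractor produces its own subspace, a classifier votes in each, and the fused vote is the final bottle type.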

  4. A new classification scheme of plastic wastes based upon recycling labels

    International Nuclear Information System (INIS)

Highlights: • PET, HDPE and PP types of plastics are considered. • An automated classification of plastic bottles based on feature extraction and classification methods is performed. • The decision mechanism consists of PCA, Kernel PCA, FLDA, SVD and Laplacian Eigenmaps methods. • SVM is selected to achieve the classification task, and a majority voting technique is used. - Abstract: Since recycling of materials is widely assumed to be environmentally and economically beneficial, reliable sorting and processing of waste packaging materials such as plastics is very important for high-efficiency recycling. An automated system that can quickly categorize these materials is needed to obtain maximum classification accuracy while maintaining high throughput. In this paper, photographs of plastic bottles were first taken and several preprocessing steps carried out. The first preprocessing step is to extract the plastic area of a bottle from the background. Then morphological image operations are applied: edge detection, noise removal, hole filling, image enhancement and image segmentation. These morphological operations can generally be defined in terms of combinations of erosion and dilation, and they eliminate the effect of bottle color as well as the label. Secondly, the pixel-wise intensity values of the plastic bottle images are used together with the most popular subspace and statistical feature extraction methods to construct the feature vectors. Only three types of plastics are considered because they occur far more frequently than other plastic types worldwide. The decision mechanism consists of five different feature extraction methods, including Principal Component Analysis (PCA), Kernel PCA (KPCA), Fisher's Linear Discriminant Analysis (FLDA), Singular Value Decomposition (SVD) and Laplacian Eigenmaps (LEMAP), and uses a simple

  5. EEG Classification for Hybrid Brain-Computer Interface Using a Tensor Based Multiclass Multimodal Analysis Scheme.

    Science.gov (United States)

    Ji, Hongfei; Li, Jie; Lu, Rongrong; Gu, Rong; Cao, Lei; Gong, Xiaoliang

    2016-01-01

Electroencephalogram- (EEG-) based brain-computer interface (BCI) systems usually utilize one type of change in the dynamics of brain oscillations for control, such as event-related desynchronization/synchronization (ERD/ERS), steady-state visual evoked potentials (SSVEP), or P300 evoked potentials. There is a recent trend to detect more than one of these signals in one system to create a hybrid BCI. However, in this case the EEG data have typically been divided into groups and analyzed by separate processing procedures; as a result, the interactive effects are ignored when different types of BCI tasks are executed simultaneously. In this work, we propose an improved tensor based multiclass multimodal scheme especially for hybrid BCI, in which EEG signals are represented as multiway tensors, a nonredundant rank-one tensor decomposition model is proposed to obtain nonredundant tensor components, a weighted Fisher criterion is designed to select multimodal discriminative patterns without ignoring the interactive effects, and the support vector machine (SVM) is extended to multiclass classification. Experimental results suggest that the proposed scheme can not only identify the different changes in the dynamics of brain oscillations induced by different types of tasks but also capture the interactive effects of simultaneous tasks properly. Therefore, it has great potential for use in hybrid BCI. PMID:26880873
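
The building block of such tensor schemes, extracting a dominant rank-one component from a multiway EEG tensor (e.g. channel x frequency x time), can be sketched with a higher-order power iteration. The paper's nonredundant decomposition is more elaborate; this shows only the basic rank-one extraction it builds on, and the fixed iteration count is an illustrative choice.

```python
import numpy as np

def rank_one_component(T, iters=50):
    """Extract the dominant rank-one component (lam, u, v, w) of a 3-way tensor
    by alternating (higher-order power) iteration."""
    I, J, K = T.shape
    u, v, w = np.ones(I), np.ones(J), np.ones(K)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)
    return lam, u, v, w
```

Each factor (u, v, w) can then serve as a spatial, spectral or temporal signature feeding the discriminative pattern selection.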

  6. A classification scheme for chimera states

    OpenAIRE

    Kemeth, Felix P.; Haugland, Sindre W.; Schmidt, Lennart; Kevrekidis, Ioannis G.; Krischer, Katharina

    2016-01-01

    We present a universal characterization scheme for chimera states applicable to both numerical and experimental data sets. The scheme is based on two correlation measures that enable a meaningful definition of chimera states as well as their classification into three categories: stationary, turbulent and breathing. In addition, these categories can be further subdivided according to the time-stationarity of these two measures. We demonstrate that this approach both is consistent with previous...

  7. A classification scheme for chimera states

    Science.gov (United States)

    Kemeth, Felix P.; Haugland, Sindre W.; Schmidt, Lennart; Kevrekidis, Ioannis G.; Krischer, Katharina

    2016-09-01

    We present a universal characterization scheme for chimera states applicable to both numerical and experimental data sets. The scheme is based on two correlation measures that enable a meaningful definition of chimera states as well as their classification into three categories: stationary, turbulent, and breathing. In addition, these categories can be further subdivided according to the time-stationarity of these two measures. We demonstrate that this approach is both consistent with previously recognized chimera states and enables us to classify states as chimeras which have not been categorized as such before. Furthermore, the scheme allows for a qualitative and quantitative comparison of experimental chimeras with chimeras obtained through numerical simulations.

  8. A new classification scheme of plastic wastes based upon recycling labels.

    Science.gov (United States)

    Özkan, Kemal; Ergin, Semih; Işık, Şahin; Işıklı, Idil

    2015-01-01

results agree on. The proposed classification scheme provides a high accuracy rate and is able to run in real-time applications. It can automatically classify plastic bottle types with approximately 90% recognition accuracy. Beyond this, the proposed methodology yields approximately 96% classification accuracy for the separation of PET from non-PET plastic types, and 92% accuracy for the categorization of non-PET plastic types into HDPE or PP.

  9. Intelligent Video Object Classification Scheme using Offline Feature Extraction and Machine Learning based Approach

    Directory of Open Access Journals (Sweden)

    Chandra Mani Sharma

    2012-01-01

Classification of objects in video streams is important because of its application in many emerging areas such as visual surveillance, content-based video retrieval, and indexing. The task is challenging because video data is voluminous and highly variable, and must be processed in real time. This paper presents a multiclass object classification technique using a machine learning approach. Haar-like features are used for training the classifier; feature calculation is performed using the integral image representation, and the classifier is trained offline using Stage-wise Additive Modeling using a Multiclass Exponential loss function (SAMME). The validity of the method has been verified through the implementation of a real-time human-car detector. Experimental results show that the proposed method can accurately classify objects in video into their respective classes. The proposed object classifier works well outdoors under moderate lighting conditions and variable scene backgrounds. The proposed technique is compared with other object classification techniques on various performance parameters.
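
The integral image trick that makes Haar-like features cheap can be shown directly: once the summed-area table is built, any rectangle sum costs four lookups, and a two-rectangle Haar-like feature is just a difference of two such sums. The specific vertical two-rectangle layout below is one illustrative feature, not the detector's full feature set.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row and column prepended."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, r, c, h, w):
    """Sum of pixels in the h x w rectangle with top-left corner (r, c), in O(1)."""
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def haar_two_rect_vertical(ii, r, c, h, w):
    """Two-rectangle Haar-like feature: left half-sum minus right half-sum."""
    return rect_sum(ii, r, c, h, w // 2) - rect_sum(ii, r, c + w // 2, h, w // 2)
```

Boosting (SAMME in the paper) then selects and weights thousands of such features into a strong multiclass classifier.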

  10. Parallel Implementation of Morphological Profile Based Spectral-Spatial Classification Scheme for Hyperspectral Imagery

    Science.gov (United States)

    Kumar, B.; Dikshit, O.

    2016-06-01

The extended morphological profile (EMP) is a good technique for extracting spectral-spatial information from images, but the large size of hyperspectral images is an important concern when creating EMPs. With the availability of modern multi-core processors and commodity parallel processing systems such as graphics processing units (GPUs) at the desktop level, parallel computing provides a viable option to significantly accelerate such computations. In this paper, a parallel implementation of an EMP based spectral-spatial classification method for hyperspectral imagery is presented. The parallel implementation is done both on a multi-core CPU and on a GPU; for the GPU, the implementation is written in compute unified device architecture (CUDA) C. The impact of parallelization on speed-up and classification accuracy is analyzed. The experiments are carried out on two well-known hyperspectral images. It is observed from the experimental results that the GPU implementation provides a speed-up of about 7 times, while the parallel implementation on the multi-core CPU results in a speed-up of about 3 times. It is also observed that parallelization has no adverse impact on classification accuracy.
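
The data-parallel pattern behind such implementations, partitioning the pixel array into chunks that are classified independently and concatenating the results, can be sketched on CPU threads. This is only the partitioning idea: a nearest-centroid rule stands in for the paper's EMP-based classifier, and the CUDA version is not reproduced.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def classify_block(block, centroids):
    """Nearest-centroid spectral classification of one block of pixel vectors."""
    d = np.linalg.norm(block[:, None, :] - centroids[None, :, :], axis=2)
    return np.argmin(d, axis=1)

def classify_parallel(pixels, centroids, workers=4):
    """Split the pixel array into chunks and classify the chunks concurrently."""
    chunks = np.array_split(pixels, workers)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        parts = list(ex.map(lambda b: classify_block(b, centroids), chunks))
    return np.concatenate(parts)
```

Because per-pixel classification is embarrassingly parallel, the result is identical to the serial version, which matches the observation that parallelization does not affect accuracy.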

  11. A new classification scheme for deep geothermal systems based on geologic controls

    Science.gov (United States)

    Moeck, I.

    2012-04-01

A key element in the characterization, assessment and development of geothermal energy systems is resource classification. Throughout the past 30 years, many classifications and definitions have been published, mainly based on temperature and thermodynamic properties. In these past classification systems, temperature has been the essential measure of the quality of the resource, and geothermal systems have been divided into three temperature (or enthalpy) classes: low-temperature, moderate-temperature and high-temperature. There are, however, no uniform temperature ranges for these classes. It remains a key requirement of a geothermal classification that resource assessment provide logical and consistent frameworks simple enough to communicate the important aspects of geothermal energy potential to both non-experts and the general public. One possible solution is to avoid classifying geothermal resources by temperature and simply state the range of temperatures at the individual site. Due to technological development, in particular in EGS (Enhanced Geothermal Systems or Engineered Geothermal Systems; both terms are used synonymously here) technology, more geothermal systems are potentially economic today than 30 years ago. An alternative possibility is to classify geothermal energy systems by their geologic setting. Understanding and characterizing the geologic controls on geothermal systems has been an ongoing focus on different scales, from plate tectonics to local tectonics and structural geology. In fact, the geologic setting has a fundamental influence on the potential temperature, the fluid composition, the reservoir characteristics, and whether the system is predominantly convective or conductive. The key element of this new classification for geothermal systems is the recognition that a geothermal system is part of a geological system.
The structural geological and plate tectonic setting has a fundamental influence on

  12. DCT domain feature extraction scheme based on motor unit action potential of EMG signal for neuromuscular disease classification.

    Science.gov (United States)

    Doulah, Abul Barkat Mollah Sayeed Ud; Fattah, Shaikh Anowarul; Zhu, Wei-Ping; Ahmad, M Omair

    2014-01-01

A feature extraction scheme based on the discrete cosine transform (DCT) of electromyography (EMG) signals is proposed for the classification of normal events and a neuromuscular disease, namely amyotrophic lateral sclerosis. Instead of employing the DCT directly on the EMG data, it is employed on the motor unit action potentials (MUAPs) extracted from the EMG signal via a template matching-based decomposition technique. Unlike conventional MUAP-based methods, only the one MUAP with maximum dynamic range is selected for DCT-based feature extraction. Magnitude and frequency values of a few high-energy DCT coefficients corresponding to the selected MUAP are used as the desired features, which not only reduces the computational burden but also offers better feature quality, with high within-class compactness and between-class separation. For the purpose of classification, a k-nearest neighbour classifier is employed. Extensive analysis is performed on a clinical EMG database, and it is found that the proposed method provides very satisfactory performance in terms of specificity, sensitivity and overall classification accuracy.
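
The feature step, a DCT of the selected MUAP followed by picking the few highest-energy coefficients, can be sketched as below. The orthonormal DCT-II is computed directly (an O(N^2) evaluation for clarity), and the choice of k = 6 coefficients is an illustrative assumption, not the paper's setting.

```python
import numpy as np

def dct_ii(x):
    """Orthonormal DCT-II of a 1-D signal (direct evaluation for clarity)."""
    x = np.asarray(x, float)
    N = len(x)
    n = np.arange(N)
    C = np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    X = C @ x * np.sqrt(2.0 / N)
    X[0] /= np.sqrt(2.0)
    return X

def muap_dct_features(muap, k=6):
    """Magnitudes and frequency indices of the k highest-energy DCT coefficients."""
    X = dct_ii(muap)
    idx = np.argsort(np.abs(X))[::-1][:k]
    return np.abs(X[idx]), idx
```

Keeping both the magnitudes and the frequency indices preserves where in the spectrum the MUAP's energy lies, which is what separates the classes.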

  13. A classification scheme for risk assessment methods.

    Energy Technology Data Exchange (ETDEWEB)

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects: level of detail and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. This report imposes structure on the set of risk assessment methods in order to reveal their relationships and thus optimize their usage. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. Each cell in the matrix represents a different arrangement of strengths and weaknesses, and those arrangements shift gradually as one moves through the table, each cell optimal for a particular situation. The intention of this report is to enable informed use of the methods, so that the method chosen is optimal for the situation given. The matrix, with type names in the cells, is introduced in Table 2 on page 13 below. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method', though we often use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows. In

  14. Small-scale classification schemes

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2004-01-01

    effects remained largely implicit. The requirements classification contributed to constraining the requirements-engineering process by supporting the software engineers in maintaining some level of control over the process. This way, the requirements classification provided the software engineers...... requirements and provided little information about why these requirements were considered relevant. This stands in contrast to the discussions at the project meetings where the software engineers made frequent use of both abstract goal descriptions and concrete examples to make sense of the requirements...... classification inherited a lot of its structure from the existing system and rendered requirements that transcended the framework laid out by the existing system almost invisible. As a result, the requirements classification became a defining element of the requirements-engineering process, though its main...

  15. A hierarchical classification scheme of psoriasis images

    DEFF Research Database (Denmark)

    Maletti, Gabriela Mariel; Ersbøll, Bjarne Kjær

    2003-01-01

    A two-stage hierarchical classification scheme of psoriasis lesion images is proposed. These images are basically composed of three classes: normal skin, lesion and background. The scheme combines conventional tools to separate the skin from the background in the first stage, and the lesion from...

  16. A DSmT Based Combination Scheme for Multi-Class Classification

    OpenAIRE

    Nassim Abbas; Youcef Chiban; Zineb Belhadi; Mehdia Hedir

    2015-01-01

This paper presents a new combination scheme for reducing the number of focal elements to be manipulated, in order to reduce the complexity of the combination process in the multiclass framework. The basic idea consists in using p sources of information involved in the global scheme, providing p kinds of complementary information to feed each set of p one-class support vector machine classifiers independently of each other, which are designed for detecting the ou...

  17. A Classification Scheme for Glaciological AVA Responses

    Science.gov (United States)

    Booth, A.; Emir, E.

    2014-12-01

A classification scheme is proposed for amplitude vs. angle (AVA) responses as an aid to the interpretation of seismic reflectivity in glaciological research campaigns. AVA responses are a powerful tool for characterising the material properties of glacier ice and its substrate. However, before interpreting AVA data, careful true-amplitude processing is required to constrain basal reflectivity and compensate for amplitude decay mechanisms, including anelastic attenuation and spherical divergence. These fundamental processing steps can be difficult to design in cases of noisy data, e.g. where a target reflection is contaminated by surface wave energy (in the case of shallow glaciers) or by energy reflected from outside the survey plane. AVA methods are equally powerful in estimating the fluid fill of potential hydrocarbon reservoirs. However, such applications seldom use true-amplitude data and instead consider qualitative AVA responses within a well-defined classification scheme. Such schemes are often defined in terms of the characteristics of best-fit responses to the observed reflectivity, e.g. the intercept (I) and gradient (G) of a linear approximation to the AVA data. The position of the response on a cross-plot of I and G then offers a diagnostic attribute for certain fluid types. We investigate the advantages of emulating this practice in glaciology, and develop a cross-plot based on the 3-term Shuey AVA approximation (using I, G, and a curvature term C). Model AVA curves define a clear lithification trend: AVA responses to stiff (lithified) substrates fall discretely into one quadrant of the cross-plot, with positive I and negative G, whereas those to fluid-rich substrates plot diagonally opposite (in the negative I and positive G quadrant). The remaining quadrants are unoccupied by plausible single-layer responses and may therefore be diagnostic of complex thin-layer reflectivity; the magnitude and polarity of the C term serves as a further indicator.
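
Fitting the 3-term Shuey form R(theta) = I + G sin^2(theta) + C (tan^2(theta) - sin^2(theta)) to observed reflectivity, and reading off the cross-plot quadrant, might look as follows. The quadrant labels are a simplified rendering of the trend described above, not the scheme's full set of classes.

```python
import numpy as np

def fit_shuey3(theta_deg, refl):
    """Least-squares fit of R(theta) = I + G*sin^2(theta) + C*(tan^2(theta) - sin^2(theta))."""
    t = np.radians(theta_deg)
    A = np.column_stack([np.ones_like(t), np.sin(t)**2, np.tan(t)**2 - np.sin(t)**2])
    (I, G, C), *_ = np.linalg.lstsq(A, refl, rcond=None)
    return I, G, C

def quadrant(I, G):
    """Cross-plot quadrant: stiff substrate (I>0, G<0), fluid-rich (I<0, G>0), else other."""
    if I > 0 and G < 0:
        return 'stiff'
    if I < 0 and G > 0:
        return 'fluid-rich'
    return 'other'
```

Because only the signs of I and G drive the diagnosis, the classification tolerates amplitude scaling errors that would defeat a fully quantitative AVA inversion.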

  18. Environmental endocrine disruptors: A proposed classification scheme

    Energy Technology Data Exchange (ETDEWEB)

    Fur, P.L. de; Roberts, J. [Environmental Defense Fund, Washington, DC (United States)

    1995-12-31

A number of chemicals known to act on animal systems through the endocrine system have been termed environmental endocrine disruptors. This group includes some of the PCBs and TCDDs, as well as lead, mercury and a large number of pesticides. The common feature is that the chemicals interact with endogenous endocrine systems at the cellular and/or molecular level to alter normal processes that are controlled or regulated by hormones. Although the existence of artificial or environmental estrogens (e.g. chlordecone and DES) has been known for some time, recent data indicate that this phenomenon is widespread. Indeed, anti-androgens have been held responsible for reproductive dysfunction in alligator populations in Florida. The significance of endocrine disruption was recognized by pesticide manufacturers when insect growth regulators were developed to interfere with hormonal control of growth. Controlling, regulating or managing these chemicals depends in no small part on the ability to identify, screen or otherwise know that a chemical is an endocrine disruptor. Two possible classification schemes are: using the effects caused in an animal or animals as an exposure indicator, or using a known screen for the point of contact with the animal. The former would require extensive knowledge of cause and effect relationships in dozens of animal groups; the latter would require a screening tool comparable to an estrogen binding assay. The authors present a possible classification based on chemicals known to disrupt estrogenic, androgenic and ecdysone regulated hormonal systems.

  19. Development and application of a new comprehensive image-based classification scheme for coastal and benthic environments along the southeast Florida continental shelf

    Science.gov (United States)

    Makowski, Christopher

The coastal (terrestrial) and benthic environments along the southeast Florida continental shelf show a unique biophysical succession of marine features, from a highly urbanized, developed coastal region in the north (i.e. northern Miami-Dade County) to a protected marine sanctuary in the southeast (i.e. the Florida Keys National Marine Sanctuary). However, a standard bio-geomorphological classification scheme for this area of coastal and benthic environments has been lacking. The purpose of this study was to test whether new parameters integrating geomorphological components with dominant biological covers could be developed and applied across multiple remote sensing platforms as an innovative way to identify, interpret, and classify diverse coastal and benthic environments along the southeast Florida continental shelf. An ordered, manageable hierarchical classification scheme was developed, incorporating the categories of Physiographic Realm, Morphodynamic Zone, Geoform, Landform, Dominant Surface Sediment, and Dominant Biological Cover. Six different remote sensing platforms (five multi-spectral satellite image sensors and one high-resolution aerial orthoimagery source) were acquired, delineated according to the new classification scheme, and compared to determine the optimal formats for classifying the study area. Cognitive digital classification at a nominal scale of 1:6000 proved more accurate than autoclassification programs and was therefore used to differentiate coastal marine environments based on spectral reflectance characteristics such as color, tone, saturation, pattern, and texture of the seafloor. In addition, attribute tables were created in conjunction with the interpretations to quantify and compare the spatial relationships between classificatory units. IKONOS-2 satellite imagery was determined to be the optimal platform for applying the hierarchical classification scheme

  20. Cross-ontological analytics for alignment of different classification schemes

    Science.gov (United States)

    Posse, Christian; Sanfilippo, Antonio P; Gopalan, Banu; Riensche, Roderick M; Baddeley, Robert L

    2010-09-28

    Quantification of the similarity between nodes in multiple electronic classification schemes is provided by automatically identifying relationships and similarities between nodes within and across the electronic classification schemes. Quantifying the similarity between a first node in a first electronic classification scheme and a second node in a second electronic classification scheme involves finding a third node in the first electronic classification scheme, wherein a first product value of an inter-scheme similarity value between the second and third nodes and an intra-scheme similarity value between the first and third nodes is a maximum. A fourth node in the second electronic classification scheme can be found, wherein a second product value of an inter-scheme similarity value between the first and fourth nodes and an intra-scheme similarity value between the second and fourth nodes is a maximum. The maximum between the first and second product values represents a measure of similarity between the first and second nodes.
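
The similarity computation described above reduces to maximizing a product over a "bridge" node in each scheme and taking the larger of the two maxima. A minimal sketch, with similarity values held in plain dictionaries keyed by node pairs (the toy node names are illustrative only):

```python
def node_similarity(n1, n2, intra1, intra2, inter, nodes1, nodes2):
    """Similarity between node n1 (scheme A) and node n2 (scheme B).

    p1: best third node n3 in scheme A maximizing inter(n3, n2) * intra_A(n1, n3).
    p2: best fourth node n4 in scheme B maximizing inter(n1, n4) * intra_B(n2, n4).
    The overall similarity is the larger of the two product values.
    """
    p1 = max(inter[(n3, n2)] * intra1[(n1, n3)] for n3 in nodes1)
    p2 = max(inter[(n1, n4)] * intra2[(n2, n4)] for n4 in nodes2)
    return max(p1, p2)
```

Intuitively, two nodes are similar if either of them has a close relative in its own scheme that maps strongly onto the other node.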

  1. An Efficient Machine Learning Based Classification Scheme for Detecting Distributed Command & Control Traffic of P2P Botnets

    Directory of Open Access Journals (Sweden)

    Pijush Barthakur

    2013-11-01

The biggest internet security threat is the rise of botnets with modular and flexible structures. The combined power of thousands of remotely controlled computers increases the speed and severity of attacks. In this paper, we provide a comparative analysis of machine-learning based classification of botnet command & control (C&C) traffic for proactive detection of peer-to-peer (P2P) botnets. We combine selected botnet C&C traffic flow features with carefully selected botnet behavioral characteristic features for better classification using machine learning algorithms. Our simulation results show that our method is very effective, achieving very good test accuracy with very little training time. We compare the performance of Decision Tree (C4.5), Bayesian Network and linear Support Vector Machine classifiers using performance metrics such as accuracy, sensitivity, positive predictive value (PPV) and F-measure. We also provide a comparative analysis of our predictive models using the AUC (area under the ROC curve). Finally, we propose a rule induction algorithm derived from Quinlan's original C4.5 algorithm. Our proposed algorithm produces better accuracy than the original decision tree classifier.

  2. The Impact of Industry Classification Schemes on Financial Research

    OpenAIRE

    Weiner, Christian

    2005-01-01

This paper investigates industry classification systems. During the last 50 years there has been considerable discussion of problems regarding the classification of economic data by industries. From my perspective, the central point of any classification is to strike a balance between aggregation of similar firms and differentiation between industries. This paper examines the structure and content of industrial classification schemes and how they affect financial research. I use classif...

  3. Enriching User-Oriented Class Associations for Library Classification Schemes.

    Science.gov (United States)

    Pu, Hsiao-Tieh; Yang, Chyan

    2003-01-01

    Explores the possibility of adding user-oriented class associations to hierarchical library classification schemes. Analyses a log of book circulation records from a university library in Taiwan and shows that classification schemes can be made more adaptable by analyzing circulation patterns of similar users. (Author/LRW)

  4. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 false Biogeographic Classification Scheme I... RESOURCE MANAGEMENT, NATIONAL ESTUARINE RESEARCH RESERVE SYSTEM REGULATIONS, Pt. 921, App. I. Appendix I to Part 921—Biogeographic Classification Scheme: Acadian 1. Northern Gulf of Maine (Eastport to the...

  5. An improved fault detection classification and location scheme based on wavelet transform and artificial neural network for six phase transmission line using single end data only.

    Science.gov (United States)

    Koley, Ebha; Verma, Khushaboo; Ghosh, Subhojit

    2015-01-01

Restrictions on right of way and increasing power demand have boosted the development of six phase transmission. It offers a viable alternative for transmitting more power without major modification of the existing structure of the three phase double circuit transmission system. In spite of these advantages, the low acceptance of the six phase system is attributed to the unavailability of a proper protection scheme. The complexity arising from the large number of possible faults in six phase lines makes protection quite challenging. The proposed work presents a hybrid wavelet transform and modular artificial neural network based fault detector, classifier and locator for six phase lines using single end data only. The standard deviations of the approximate coefficients of the voltage and current signals obtained using the discrete wavelet transform are applied as input to the modular artificial neural network for fault classification and location. The proposed scheme has been tested for all 120 types of shunt faults with variation in location, fault resistance and fault inception angle. The variation in power system parameters, viz. short circuit capacity of the source and its X/R ratio, voltage, frequency and CT saturation, has also been investigated. The results confirm the effectiveness and reliability of the proposed protection scheme, which makes it ideal for real time implementation. PMID:26435897
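The feature named in the abstract, the standard deviation of DWT approximation coefficients, can be sketched with a one-level Haar decomposition (a simplification: the paper does not specify the mother wavelet, and the synthetic "fault" signal below is purely illustrative):

```python
import numpy as np

def haar_approx(signal):
    """One level of the Haar DWT: approximation coefficients."""
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:            # drop the last sample if the length is odd
        x = x[:-1]
    return (x[0::2] + x[1::2]) / np.sqrt(2.0)

def fault_feature(signal, levels=3):
    """Std of the approximation coefficients after `levels` Haar decompositions,
    the kind of scalar feature fed to a modular ANN in such a scheme."""
    a = np.asarray(signal, dtype=float)
    for _ in range(levels):
        a = haar_approx(a)
    return float(np.std(a))

fs = 1000
t = np.arange(0, 1, 1 / fs)
healthy = np.sin(2 * np.pi * 50 * t)                            # steady 50 Hz waveform
faulty = healthy + (t > 0.5) * 3 * np.sin(2 * np.pi * 50 * t)   # amplitude jump mid-record
print(fault_feature(healthy), fault_feature(faulty))            # the fault raises the std
```

The ANN would then receive one such scalar per phase for classification and location.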

  6. Sound classification of dwellings - Comparison of schemes in Europe

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2009-01-01

National sound classification schemes for dwellings exist in nine countries in Europe, and proposals are under preparation in more countries. The schemes specify class criteria concerning several acoustic aspects, the main criteria being about airborne and impact sound insulation between dwellings. However, there are significant discrepancies: descriptors, number of quality classes, class intervals, class levels and the status of the classification schemes in relation to legal requirements vary. In some countries the building code and the classification standard are incoherent; in other countries they are strongly "integrated", implying that the building code refers to a specific class in a classification standard rather than describing requirements. Although the schemes prove useful on a national basis, the diversity in Europe is an obstacle for the exchange of experience and for further development of design......

  7. A Classification Scheme for Phenomenological Universalities in Growth Problems

    CERN Document Server

    Castorina, P; Guiot, C

    2006-01-01

A classification in universality classes of broad categories of phenomenologies, belonging to different disciplines, may be very useful for cross-fertilization among them and for the purpose of pattern recognition. We present here a simple scheme for the classification of nonlinear growth problems. The success of the scheme in predicting and characterizing the well known Gompertz, West and logistic models suggests the study of a hitherto unexplored class of nonlinear growth problems.
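Two of the growth laws named above can be compared directly by integrating their defining ODEs; a small illustrative sketch (forward Euler, unit rate and carrying capacity chosen for convenience):

```python
from math import log

def grow(rhs, y0=0.01, dt=1e-3, steps=20000):
    """Forward-Euler integration of a scalar growth law dy/dt = rhs(y)."""
    y = y0
    for _ in range(steps):
        y += dt * rhs(y)
    return y

K, r = 1.0, 1.0
logistic = grow(lambda y: r * y * (1 - y / K))      # dy/dt = r y (1 - y/K)
gompertz = grow(lambda y: r * y * log(K / y))       # dy/dt = r y ln(K/y)
print(round(logistic, 3), round(gompertz, 3))       # both saturate at K = 1
```

Both curves share the same sigmoidal saturation, which is what a universality-class scheme groups together despite the different right-hand sides.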

  8. On the Classification of Psychology in General Library Classification Schemes.

    Science.gov (United States)

    Soudek, Miluse

    1980-01-01

    Holds that traditional library classification systems are inadequate to handle psychological literature, and advocates the establishment of new theoretical approaches to bibliographic organization. (FM)

  9. Towards a Collaborative Intelligent Tutoring System Classification Scheme

    Science.gov (United States)

    Harsley, Rachel

    2014-01-01

    This paper presents a novel classification scheme for Collaborative Intelligent Tutoring Systems (CITS), an emergent research field. The three emergent classifications of CITS are unstructured, semi-structured, and fully structured. While all three types of CITS offer opportunities to improve student learning gains, the full extent to which these…

  10. Classification and prioritization of usability problems using an augmented classification scheme.

    Science.gov (United States)

    Khajouei, R; Peute, L W P; Hasman, A; Jaspers, M W M

    2011-12-01

Various methods exist for conducting usability evaluation studies in health care. Although the methodology is clear, no usability evaluation method provides a framework by which the usability reporting activities are fully standardized. Despite the frequent use of forms to report usability problems and their context information, this reporting is often hindered by information loss. This is because evaluators' problem descriptions are based on individual judgments of what they find salient about a usability problem at a certain moment in time. Moreover, usability problems are typically classified in terms of their type, number, and severity. These classes are usually devised by the evaluator for the purpose at hand, and the problem types used are often not mutually exclusive, complete and distinct. The impact of usability problems on the task outcome is also usually not taken into account. Consequently, problem descriptions are often vague and, even when combined with a classification by type or severity, leave room for multiple interpretations when discussed with system designers afterwards. Correct interpretation of these problem descriptions is then highly dependent upon the extent to which the evaluators can retrieve relevant details from memory. To remedy this situation, a framework is needed that guides usability evaluators in high quality reporting and unique classification of usability problems. Such a framework should allow the disclosure of the underlying essence of problem causes, severity rating, and classification of the impact of usability problems on the task outcome. The User Action Framework (UAF) is an existing validated classification framework that allows the unique classification of usability problems, but it does not include a severity rating, nor does it contain an assessment of the potential impact of usability flaws on the final task outcomes. We therefore augmented the UAF with a severity rating based on Nielsen

  11. Transporter taxonomy - a comparison of different transport protein classification schemes.

    Science.gov (United States)

    Viereck, Michael; Gaulton, Anna; Digles, Daniela; Ecker, Gerhard F

    2014-06-01

    Currently, there are more than 800 well characterized human membrane transport proteins (including channels and transporters) and there are estimates that about 10% (approx. 2000) of all human genes are related to transport. Membrane transport proteins are of interest as potential drug targets, for drug delivery, and as a cause of side effects and drug–drug interactions. In light of the development of Open PHACTS, which provides an open pharmacological space, we analyzed selected membrane transport protein classification schemes (Transporter Classification Database, ChEMBL, IUPHAR/BPS Guide to Pharmacology, and Gene Ontology) for their ability to serve as a basis for pharmacology driven protein classification. A comparison of these membrane transport protein classification schemes by using a set of clinically relevant transporters as use-case reveals the strengths and weaknesses of the different taxonomy approaches.

  12. New Course Design: Classification Schemes and Information Architecture.

    Science.gov (United States)

    Weinberg, Bella Hass

    2002-01-01

    Describes a course developed at St. John's University (New York) in the Division of Library and Information Science that relates traditional classification schemes to information architecture and Web sites. Highlights include functional aspects of information architecture, that is, the way content is structured; assignments; student reactions; and…

  13. International proposal for an acoustic classification scheme for dwellings

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2014-01-01

countries, had as one of the main objectives the preparation of a proposal for a harmonized acoustic classification scheme. The proposal developed has been approved as an ISO/TC43/SC2 work item, and a working group has been established. This paper describes the proposal, the background and the perspectives.

  14. The interplay of descriptor-based computational analysis with pharmacophore modeling builds the basis for a novel classification scheme for feruloyl esterases

    DEFF Research Database (Denmark)

    Udatha, D.B.R.K. Gupta; Kouskoumvekaki, Irene; Olsson, Lisbeth;

    2011-01-01

classification studies on FAEs were restricted to sequence similarity and substrate specificity on just four model substrates, and considered only a handful of FAEs belonging to the fungal kingdom. This study centers on the descriptor-based classification and structural analysis of experimentally verified...... on amino acid composition and physico-chemical composition descriptors derived from the respective amino acid sequences. A Support Vector Machine model was subsequently constructed for the classification of new FAEs into the pre-assigned clusters. The model successfully recognized 98.2% of the training...... sequences and all the sequences of the blind test. The underlying functionality of the 12 proposed FAE families was validated against a combination of prediction tools and published experimental data. Another important aspect of the present work involves the development of pharmacophore models for the new...

  15. Pitch Based Sound Classification

    OpenAIRE

    Nielsen, Andreas Brinch; Hansen, Lars Kai; Kjems, U.

    2006-01-01

    A sound classification model is presented that can classify signals into music, noise and speech. The model extracts the pitch of the signal using the harmonic product spectrum. Based on the pitch estimate and a pitch error measure, features are created and used in a probabilistic model with soft-max output function. Both linear and quadratic inputs are used. The model is trained on 2 hours of sound and tested on publicly available data. A test classification error below 0.05 with 1 s classif...
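The harmonic product spectrum used for pitch extraction in this model multiplies the magnitude spectrum with downsampled copies of itself, so the fundamental bin reinforces while spurious peaks cancel. A minimal sketch on a synthetic harmonic tone (parameters chosen for illustration, not taken from the paper):

```python
import numpy as np

def hps_pitch(x, fs, n_harmonics=3):
    """Pitch estimate via the harmonic product spectrum: multiply the
    magnitude spectrum with copies downsampled by 2..n_harmonics and
    take the strongest resulting bin."""
    spec = np.abs(np.fft.rfft(x))
    n = len(spec) // n_harmonics
    hps = spec[:n].copy()
    for r in range(2, n_harmonics + 1):
        hps *= spec[::r][:n]
    k = int(np.argmax(hps[1:])) + 1     # skip the DC bin
    return k * fs / len(x)

fs = 8192
t = np.arange(fs) / fs                  # one second of signal → 1 Hz bins
x = sum(a * np.sin(2 * np.pi * 200 * (i + 1) * t)
        for i, a in enumerate([1.0, 0.5, 0.3]))   # 200 Hz tone with harmonics
print(hps_pitch(x, fs))  # → 200.0
```

A pitch error measure, as in the abstract, could then compare `spec` against an ideal harmonic comb at the estimated pitch.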

  16. Pitch Based Sound Classification

    DEFF Research Database (Denmark)

    Nielsen, Andreas Brinch; Hansen, Lars Kai; Kjems, U

    2006-01-01

A sound classification model is presented that can classify signals into music, noise and speech. The model extracts the pitch of the signal using the harmonic product spectrum. Based on the pitch estimate and a pitch error measure, features are created and used in a probabilistic model with soft-max output function. Both linear and quadratic inputs are used. The model is trained on 2 hours of sound and tested on publicly available data. A test classification error below 0.05 with 1 s classification windows is achieved. Furthermore, it is shown that linear input performs as well as a quadratic, and that even though classification gets marginally better, not much is achieved by increasing the window size beyond 1 s.

  17. Discovery of User-Oriented Class Associations for Enriching Library Classification Schemes.

    Science.gov (United States)

    Pu, Hsiao-Tieh

    2002-01-01

    Presents a user-based approach to exploring the possibility of adding user-oriented class associations to hierarchical library classification schemes. Classes not grouped in the same subject hierarchies yet relevant to users' knowledge are obtained by analyzing a log book of a university library's circulation records, using collaborative filtering…

  18. A novel pattern classification scheme using the Baker's map

    OpenAIRE

    Rogers, Alan; Keating, John; Shorten, Robert

    2003-01-01

    We demonstrate a novel application of nonlinear systems in the design of pattern classification systems. We show that pattern classification systems can be designed based upon training algorithms designed to control the qualitative behaviour of a nonlinear system. Our paradigm is illustrated by means of a simple chaotic system-the Baker's map. Algorithms for training the system are presented and examples are given to illustrate the operation and learning of the system for pattern classificati...

  19. Adaptive codebook selection schemes for image classification in correlated channels

    Science.gov (United States)

    Hu, Chia Chang; Liu, Xiang Lian; Liu, Kuan-Fu

    2015-09-01

The multiple-input multiple-output (MIMO) system with the use of transmit and receive antenna arrays achieves diversity and array gains via transmit beamforming. In the absence of full channel state information (CSI) at the transmitter, the transmit beamforming vector can be quantized at the receiver and sent back to the transmitter over a low-rate feedback channel, called limited feedback beamforming. One of the key roles of Vector Quantization (VQ) is to generate a good codebook such that the distortion between the original image and the reconstructed image is minimized. In this paper, a novel adaptive codebook selection scheme for image classification is proposed that takes both the spatial and temporal correlation inherent in the channel into consideration. A new codebook selection algorithm is developed to select two codebooks from among the discrete Fourier transform (DFT) codebook, the generalized Lloyd algorithm (GLA) codebook and the Grassmannian codebook, to be combined and used as candidates of the original image and the reconstructed image for image transmission. The channel is estimated and divided into four regions based on its spatial and temporal correlation, and an appropriate codebook is adaptively assigned to each region. The proposed method can efficiently reduce the required feedback information under spatially and temporally correlated channels. Simulation results show that, in the case of temporally and spatially correlated channels, the bit-error-rate (BER) performance can be improved substantially by the proposed algorithm compared to a scheme with only a single codebook.
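The generalized Lloyd algorithm named among the candidate codebooks alternates nearest-codeword assignment with centroid updates. A minimal sketch on synthetic 2-D training vectors (illustrative data, not beamforming vectors):

```python
import numpy as np

def gla_codebook(vectors, k, iters=20, seed=0):
    """Generalized Lloyd algorithm: alternate nearest-codeword assignment
    and centroid update to refine a k-entry codebook."""
    rng = np.random.default_rng(seed)
    codebook = vectors[rng.choice(len(vectors), k, replace=False)]
    for _ in range(iters):
        # assign each training vector to its nearest codeword
        d = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
        nearest = d.argmin(axis=1)
        # move each codeword to the mean of its cell (empty cells stay put)
        for j in range(k):
            cell = vectors[nearest == j]
            if len(cell):
                codebook[j] = cell.mean(axis=0)
    return codebook

rng = np.random.default_rng(1)
vectors = np.concatenate([rng.normal(0, 0.1, (100, 2)),
                          rng.normal(5, 0.1, (100, 2))])
cb = gla_codebook(vectors, k=2)
print(np.sort(cb[:, 0]))   # codewords settle near the two cluster centres (0 and 5)
```

In the scheme above such a trained codebook competes with fixed DFT and Grassmannian codebooks per channel region.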

  20. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases......, the classifier is trained on each cluster having reduced dimensionality and fewer examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups...... datasets. Our model also outperforms A Decision Cluster Classification (ADCC) and the Decision Cluster Forest Classification (DCFC) models on the Reuters-21578 dataset....

  1. CANDELS Visual Classifications: Scheme, Data Release, and First Results

    CERN Document Server

    Kartaltepe, Jeyhan S; Kocevski, Dale; McIntosh, Daniel H; Lotz, Jennifer; Bell, Eric F; Faber, Sandra; Ferguson, Henry; Koo, David; Bassett, Robert; Bernyk, Maksym; Blancato, Kirsten; Bournaud, Frederic; Cassata, Paolo; Castellano, Marco; Cheung, Edmond; Conselice, Christopher J; Croton, Darren; Dahlen, Tomas; de Mello, Duilia F; DeGroot, Laura; Donley, Jennifer; Guedes, Javiera; Grogin, Norman; Hathi, Nimish; Hilton, Matt; Hollon, Brett; Inami, Hanae; Kassin, Susan; Koekemoer, Anton; Lani, Caterina; Liu, Nick; Lucas, Ray A; Martig, Marie; McGrath, Elizabeth; McPartland, Conor; Mobasher, Bahram; Morlock, Alice; Mutch, Simon; O'Leary, Erin; Peth, Mike; Pforr, Janine; Pillepich, Annalisa; Poole, Gregory B; Rizer, Zachary; Rosario, David; Soto, Emmaris; Straughn, Amber; Telford, Olivia; Sunnquist, Ben; Weiner, Benjamin; Wuyts, Stijn

    2014-01-01

    We have undertaken an ambitious program to visually classify all galaxies in the five CANDELS fields down to H<24.5 involving the dedicated efforts of 65 individual classifiers. Once completed, we expect to have detailed morphological classifications for over 50,000 galaxies up to z<4 over all the fields. Here, we present our detailed visual classification scheme, which was designed to cover a wide range of CANDELS science goals. This scheme includes the basic Hubble sequence types, but also includes a detailed look at mergers and interactions, the clumpiness of galaxies, $k$-corrections, and a variety of other structural properties. In this paper, we focus on the first field to be completed -- GOODS-S. The wide area coverage spanning the full field includes 7634 galaxies that have been classified by at least three different people. In the deep area of the field, 2534 galaxies have been classified by at least five different people at three different depths. With this paper, we release to the public all ...

  2. CANDELS Visual Classifications: Scheme, Data Release, and First Results

    Science.gov (United States)

    Kartaltepe, Jeyhan S.; Mozena, Mark; Kocevski, Dale; McIntosh, Daniel H.; Lotz, Jennifer; Bell, Eric F.; Faber, Sandy; Ferguson, Henry; Koo, David; Bassett, Robert; Bernyk, Maksym; Blancato, Kirsten; Bournaud, Frederic; Cassata, Paolo; Castellano, Marco; Cheung, Edmond; Conselice, Christopher J.; Croton, Darren; Dahlen, Tomas; deMello, Duilia F.; DeGroot, Laura; Donley, Jennifer; Guedes, Javiera; Grogin, Norman; Hathi, Nimish; Hilton, Matt; Hollon, Brett; Inami, Hanae; Kassin, Susan; Koekemoer, Anton; Lani, Caterina; Liu, Nick; Lucas, Ray A.; Martig, Marie; McGrath, Elizabeth; McPartland, Conor; Mobasher, Bahram; Morlock, Alice; O'Leary, Erin; Peth, Mike; Pforr, Janine; Pillepich, Annalisa; Rizer, Zachary; Rosario, David; Soto, Emmaris; Straughn, Amber; Telford, Olivia; Sunnquist, Ben; Weiner, Benjamin; Wuyts, Stijn

    2014-01-01

    We have undertaken an ambitious program to visually classify all galaxies in the five CANDELS fields down to H <24.5 involving the dedicated efforts of 65 individual classifiers. Once completed, we expect to have detailed morphological classifications for over 50,000 galaxies spanning 0 < z < 4 over all the fields. Here, we present our detailed visual classification scheme, which was designed to cover a wide range of CANDELS science goals. This scheme includes the basic Hubble sequence types, but also includes a detailed look at mergers and interactions, the clumpiness of galaxies, k-corrections, and a variety of other structural properties. In this paper, we focus on the first field to be completed - GOODS-S, which has been classified at various depths. The wide area coverage spanning the full field (wide+deep+ERS) includes 7634 galaxies that have been classified by at least three different people. In the deep area of the field, 2534 galaxies have been classified by at least five different people at three different depths. With this paper, we release to the public all of the visual classifications in GOODS-S along with the Perl/Tk GUI that we developed to classify galaxies. We present our initial results here, including an analysis of our internal consistency and comparisons among multiple classifiers as well as a comparison to the Sersic index. We find that the level of agreement among classifiers is quite good and depends on both the galaxy magnitude and the galaxy type, with disks showing the highest level of agreement and irregulars the lowest. A comparison of our classifications with the Sersic index and restframe colors shows a clear separation between disk and spheroid populations. Finally, we explore morphological k-corrections between the V-band and H-band observations and find that a small fraction (84 galaxies in total) are classified as being very different between these two bands. These galaxies typically have very clumpy and extended morphology or

  3. "Quark Confinement" and Evolution of Covariant Hadron-Classification Scheme

    CERN Document Server

    Ishida, Shin

    2014-01-01

    The extension of Non-Relativistic-to-Covariant classification scheme seems to be an urgent problem in the Hadron Spectroscopy. Here are given the recent results of our research. 1) Brief history of our way of the extension on Kinematical Frameworks: from SU(2)_{sigma} $\\otimes$ O(3)_{L} to U (4)_{DS,m} (Tensor-space of Dirac Spinor with the static unitary symm. SU(2)_{m}, which is new Mass-Reversal symm. reflecting the situation of Q.C.) $\\otimes O(2)_{r\\perp v}$ (2-dim. internal spatial-vector r vertical to Boost-velocity v, embedded in the O(3,1)_{Lorentz}). Also is brought in Cov. scheme the thus far Overlooked Chirality Symm. of QCD/Stand. Gauge Model. 2) Propertime tau-Quantum Mechanics for Conf. Q. System and Quantization of Comp. Hadron field is developed. The similar to conventional procedures are performed in Lorentz-Inv. Particle Frame (Galilean Inertial Frame with v=const) which becomes Lorentz-Cov. Observer F., when v=0. The one notable feature of the tau-Q.M. is concerned only future-development,...

  4. Dissimilarity-based classification of anatomical tree structures

    DEFF Research Database (Denmark)

    Sørensen, Lauge Emil Borch Laurs; Lo, Pechin Chien Pau; Dirksen, Asger;

    2011-01-01

    A novel method for classification of abnormality in anatomical tree structures is presented. A tree is classified based on direct comparisons with other trees in a dissimilarity-based classification scheme. The pair-wise dissimilarity measure between two trees is based on a linear assignment betw...

  5. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Science.gov (United States)

    2010-01-01

    ... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... biogeographic classification scheme is used to ensure that the National Estuarine Research Reserve System... 15 Commerce and Foreign Trade 3 2010-01-01 false National Estuarine Research...

  6. Texture Classification based on Gabor Wavelet

    Directory of Open Access Journals (Sweden)

    Amandeep Kaur

    2012-07-01

Full Text Available This paper presents a comparison of texture classification algorithms based on Gabor wavelets. The focus of this paper is the feature extraction scheme for texture classification. The texture features of an image can be extracted using texture descriptors. In this paper we have used the Homogeneous Texture Descriptor, which uses the Gabor wavelet concept. For texture classification, we have used an online texture database, Brodatz's database, and three well known classifiers: Support Vector Machine, the K-nearest neighbor method and the decision tree induction method. The results show that classification using Support Vector Machines gives better results compared to the other classifiers. It can accurately discriminate between testing image data and training data.
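A Gabor-based texture feature of the kind described above is the mean and standard deviation of the filter response per orientation. A minimal sketch (kernel parameters and the striped test image are illustrative choices, not the descriptor's standard settings):

```python
import numpy as np

def gabor_kernel(size, theta, lam=4.0, sigma=2.0, gamma=0.5):
    """Real (cosine) Gabor kernel: a Gaussian envelope times a plane wave."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def gabor_features(img, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean and std of the filter response per orientation (circular FFT convolution)."""
    feats = []
    for theta in thetas:
        k = gabor_kernel(9, theta)
        pad = np.zeros_like(img, dtype=float)
        pad[:k.shape[0], :k.shape[1]] = k
        resp = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))
        feats += [resp.mean(), resp.std()]
    return np.array(feats)

# horizontal stripes: intensity varies down the rows with period 4
img = np.tile(np.sin(2 * np.pi * np.arange(32) / 4), (32, 1)).T
f = gabor_features(img)
print(f.shape)  # (8,)
```

The matched orientation dominates: the response std for the theta = pi/2 filter far exceeds the theta = 0 one, which is exactly the directionality a classifier exploits.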

  7. An Evaluation of Different Training Sample Allocation Schemes for Discrete and Continuous Land Cover Classification Using Decision Tree-Based Algorithms

    Directory of Open Access Journals (Sweden)

    René Roland Colditz

    2015-07-01

Full Text Available Land cover mapping for large regions often employs satellite images of medium to coarse spatial resolution, which complicates the mapping of discrete classes. Class memberships, which estimate the proportion of each class for every pixel, have been suggested as an alternative. This paper compares different strategies of training data allocation for discrete and continuous land cover mapping using classification and regression tree algorithms. In addition to measures of discrete and continuous map accuracy, the correct estimation of the area is another important criterion. A subset of the 30 m national land cover dataset of 2006 (NLCD2006 of the United States was used as the reference set to classify NADIR BRDF-adjusted surface reflectance time series of MODIS at 900 m spatial resolution. Results show that sampling of heterogeneous pixels and sample allocation according to the expected area of each class is best for classification trees. Regression trees for continuous land cover mapping should be trained with random allocation, and predictions should be normalized with a linear scaling function to correctly estimate the total area. Of the tested algorithms, random forest classification yields lower errors than boosted trees of C5.0, and Cubist shows higher accuracies than random forest regression.
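The two allocation/normalization ideas above, sampling in proportion to expected class area and linearly rescaling continuous predictions to match the expected total, can be sketched as follows (simple rounding without a largest-remainder correction; class names and numbers are made up):

```python
def allocate_by_area(class_areas, total_samples):
    """Allocate training samples to classes in proportion to expected area,
    guaranteeing at least one sample per class."""
    total_area = sum(class_areas.values())
    return {c: max(1, round(total_samples * a / total_area))
            for c, a in class_areas.items()}

def scale_to_area(predictions, expected_total):
    """Linear scaling of predicted class proportions so the mapped total
    matches the expected class area."""
    s = expected_total / sum(predictions)
    return [p * s for p in predictions]

areas = {"forest": 600.0, "crops": 300.0, "urban": 100.0}
print(allocate_by_area(areas, 100))   # → {'forest': 60, 'crops': 30, 'urban': 10}
scaled = scale_to_area([0.2, 0.4, 0.1], 1.4)
print(sum(scaled))                    # → 1.4 up to float rounding
```

Random allocation for regression trees would instead draw pixels uniformly and rely on the scaling step to fix the area totals.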

  8. Fair Electronic Payment Scheme Based on DSA

    Institute of Scientific and Technical Information of China (English)

    WANG Shao-bin; HONG Fan; ZHU Xian

    2005-01-01

We present a multi-signature scheme based on DSA and describe a fair electronic payment scheme based on improved DSA signatures. The scheme puts both sides in equal positions during the course of an electronic transaction. A Trusted Third Party (TTP) is involved in the scheme to guarantee fairness for both sides. However, the TTP is needed only during registration and dispute resolution; it is not needed during the normal payment stage.

  9. Acoustic classification schemes in Europe – Applicability for new, existing and renovated housing

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2016-01-01

The first acoustic classification schemes for dwellings were published in the 1990s as national standards, with the main purpose of making it easy to specify stricter acoustic criteria for new-build housing than the minimum requirements found in building regulations. Since then, more...... countries have introduced acoustic classification schemes, the first countries have updated theirs several times, and some countries have introduced acoustic classification for other building categories as well. However, the classification schemes have continued to focus on new buildings and in general have limited applicability for...... The international scheme for classification of dwellings under development in ISO/TC43/SC2 will be explained. One of several key characteristics of the proposal is a wide range of classes, implying applicability to a major part of the existing housing stock in Europe, thus enabling acoustic labelling like energy...

  10. A novel adaptive classification scheme for digital modulations in satellite communication

    Institute of Scientific and Technical Information of China (English)

    Wu Dan; Gu Xuemai; Guo Qing

    2007-01-01

To make the modulation classification system more suitable for signals over a wide range of signal to noise ratios (SNRs), a novel adaptive modulation classification scheme is presented in this paper. Different from traditional schemes, the proposed scheme employs a new SNR estimation algorithm for small samples before modulation classification, which makes the modulation classifier work adaptively according to the estimated SNR. Furthermore, it uses three efficient features and support vector machines (SVM) in modulation classification. Computer simulation shows that the scheme can adaptively classify ten digital modulation types (i.e. 2ASK, 4ASK, 2FSK, 4FSK, 2PSK, 4PSK, 16QAM, TFM, π/4QPSK and OQPSK) at SNRs ranging from 0 dB to 25 dB, with success rates over 95% when the SNR is not lower than 3 dB. The accuracy, efficiency and simplicity of the proposed scheme are obviously improved, making it better suited to engineering applications.

  11. Malware Detection, Supportive Software Agents and Its Classification Schemes

    Directory of Open Access Journals (Sweden)

    Adebayo, Olawale Surajudeen

    2012-12-01

Full Text Available Over time, the task of curbing the emergence of malware and its activities has been addressed in terms of analysis, detection and containment of malware. Malware is a general term used to describe the category of malicious software that is part of the security threats to computer and internet systems. It is a malignant program designed to hamper the effectiveness of computer and internet systems. This paper aims at identifying malware as one of the most dreaded threats to emerging computer and communication technology. The paper identifies the categories of malware, malware classification algorithms and malware activities, as well as ways of preventing and removing malware if it eventually infects a system. The research also describes tools that classify a malware dataset using a rule-based classification scheme and machine learning algorithms to detect malicious programs from normal programs through pattern recognition.

  12. Optimal Timer Based Selection Schemes

    CERN Document Server

    Shah, Virag; Yim, Raymond

    2009-01-01

    Timer-based mechanisms are often used to help a given (sink) node select the best helper node among many available nodes. Specifically, a node transmits a packet when its timer expires, and the timer value is a monotone non-increasing function of its local suitability metric. The best node is selected successfully if no other node's timer expires within a 'vulnerability' window after its timer expiry, and so long as the sink can hear the available nodes. In this paper, we show that the optimal metric-to-timer mapping that (i) maximizes the probability of success or (ii) minimizes the average selection time subject to a minimum constraint on the probability of success, maps the metric into a set of discrete timer values. We specify, in closed-form, the optimal scheme as a function of the maximum selection duration, the vulnerability window, and the number of nodes. An asymptotic characterization of the optimal scheme turns out to be elegant and insightful. For any probability distribution function of the metri...
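The mechanism above, discrete timer slots and a vulnerability window, can be illustrated with a Monte-Carlo estimate of the success probability (this is just the selection mechanism with a uniform metric-to-slot mapping, not the paper's optimal closed-form mapping):

```python
import random

def simulate_selection(n_nodes, n_slots, max_duration, vulnerability,
                       trials=20000, seed=7):
    """Estimate the probability that timer-based selection succeeds:
    each node maps its metric (uniform on [0,1)) to one of n_slots
    discrete timer values; selection succeeds when the earliest timer
    beats every other timer by at least the vulnerability window."""
    rng = random.Random(seed)
    slot = max_duration / n_slots
    wins = 0
    for _ in range(trials):
        timers = sorted(int(rng.random() * n_slots) * slot
                        for _ in range(n_nodes))
        if timers[1] - timers[0] >= vulnerability:
            wins += 1
    return wins / trials

# more, coarser slots trade collision risk against selection delay
print(simulate_selection(n_nodes=5, n_slots=8, max_duration=1.0,
                         vulnerability=0.125))
```

With the vulnerability window equal to one slot width, success here reduces to the earliest occupied slot holding exactly one node, which is the event the optimal mapping is designed to maximize.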

  13. A New Classification Approach Based on Multiple Classification Rules

    OpenAIRE

    Zhongmei Zhou

    2014-01-01

A good classifier can correctly predict new data for which the class label is unknown, so it is important to construct a high accuracy classifier. Hence, classification techniques are very useful in ubiquitous computing. Associative classification achieves higher classification accuracy than some traditional rule-based classification approaches. However, the approach also has two major deficiencies. First, it generates a very large number of association classification rules, especially when t...

  14. A/T/N: An unbiased descriptive classification scheme for Alzheimer disease biomarkers

    Science.gov (United States)

    Bennett, David A.; Blennow, Kaj; Carrillo, Maria C.; Feldman, Howard H.; Frisoni, Giovanni B.; Hampel, Harald; Jagust, William J.; Johnson, Keith A.; Knopman, David S.; Petersen, Ronald C.; Scheltens, Philip; Sperling, Reisa A.; Dubois, Bruno

    2016-01-01

    Biomarkers have become an essential component of Alzheimer disease (AD) research and because of the pervasiveness of AD pathology in the elderly, the same biomarkers are used in cognitive aging research. A number of current issues suggest that an unbiased descriptive classification scheme for these biomarkers would be useful. We propose the “A/T/N” system in which 7 major AD biomarkers are divided into 3 binary categories based on the nature of the pathophysiology that each measures. “A” refers to the value of a β-amyloid biomarker (amyloid PET or CSF Aβ42); “T,” the value of a tau biomarker (CSF phospho tau, or tau PET); and “N,” biomarkers of neurodegeneration or neuronal injury ([18F]-fluorodeoxyglucose–PET, structural MRI, or CSF total tau). Each biomarker category is rated as positive or negative. An individual score might appear as A+/T+/N−, or A+/T−/N−, etc. The A/T/N system includes the new modality tau PET. It is agnostic to the temporal ordering of mechanisms underlying AD pathogenesis. It includes all individuals in any population regardless of the mix of biomarker findings and therefore is suited to population studies of cognitive aging. It does not specify disease labels and thus is not a diagnostic classification system. It is a descriptive system for categorizing multidomain biomarker findings at the individual person level in a format that is easy to understand and use. Given the present lack of consensus among AD specialists on terminology across the clinically normal to dementia spectrum, a biomarker classification scheme will have broadest acceptance if it is independent from any one clinically defined diagnostic scheme. PMID:27371494

  15. A/T/N: An unbiased descriptive classification scheme for Alzheimer disease biomarkers.

    Science.gov (United States)

    Jack, Clifford R; Bennett, David A; Blennow, Kaj; Carrillo, Maria C; Feldman, Howard H; Frisoni, Giovanni B; Hampel, Harald; Jagust, William J; Johnson, Keith A; Knopman, David S; Petersen, Ronald C; Scheltens, Philip; Sperling, Reisa A; Dubois, Bruno

    2016-08-01

    Biomarkers have become an essential component of Alzheimer disease (AD) research and because of the pervasiveness of AD pathology in the elderly, the same biomarkers are used in cognitive aging research. A number of current issues suggest that an unbiased descriptive classification scheme for these biomarkers would be useful. We propose the "A/T/N" system in which 7 major AD biomarkers are divided into 3 binary categories based on the nature of the pathophysiology that each measures. "A" refers to the value of a β-amyloid biomarker (amyloid PET or CSF Aβ42); "T," the value of a tau biomarker (CSF phospho tau, or tau PET); and "N," biomarkers of neurodegeneration or neuronal injury ([(18)F]-fluorodeoxyglucose-PET, structural MRI, or CSF total tau). Each biomarker category is rated as positive or negative. An individual score might appear as A+/T+/N-, or A+/T-/N-, etc. The A/T/N system includes the new modality tau PET. It is agnostic to the temporal ordering of mechanisms underlying AD pathogenesis. It includes all individuals in any population regardless of the mix of biomarker findings and therefore is suited to population studies of cognitive aging. It does not specify disease labels and thus is not a diagnostic classification system. It is a descriptive system for categorizing multidomain biomarker findings at the individual person level in a format that is easy to understand and use. Given the present lack of consensus among AD specialists on terminology across the clinically normal to dementia spectrum, a biomarker classification scheme will have broadest acceptance if it is independent from any one clinically defined diagnostic scheme. PMID:27371494
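The A/T/N notation itself is mechanical enough to express in a few lines. The helper below is a trivial illustration (not from the paper) that formats the three binary biomarker categories into the profile strings used in the abstract:

```python
def atn_profile(amyloid_positive, tau_positive, neurodegeneration_positive):
    """Format the three binary A/T/N biomarker categories as a profile string."""
    sign = lambda flag: "+" if flag else "-"
    return "A{}/T{}/N{}".format(sign(amyloid_positive),
                                sign(tau_positive),
                                sign(neurodegeneration_positive))
```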

  16. Development of the EarthCARE aerosol classification scheme

    Science.gov (United States)

    Wandinger, Ulla; Baars, Holger; Hünerbein, Anja; Donovan, Dave; van Zadelhoff, Gerd-Jan; Fischer, Jürgen; von Bismarck, Jonas; Eisinger, Michael; Lajas, Dulce; Wehr, Tobias

    2015-04-01

    To ensure the consistency of EarthCARE retrievals, to support aerosol description in the EarthCARE simulator ECSIM, and to facilitate a uniform specification of broad-band aerosol optical properties, a hybrid end-to-end aerosol classification model (HETEAC) has been developed, which serves as a baseline for EarthCARE algorithm development and evaluation procedures. The model's theoretical description of aerosol microphysics (bi-modal size distribution, spectral refractive index, and particle shape distribution) is adjusted to experimental data of aerosol optical properties, i.e. lidar ratio, depolarization ratio, and Ångström exponents (hybrid approach). The experimental basis is provided by ground-based observations with sophisticated multi-wavelength, polarization lidars applied in the European Aerosol Research Lidar Network (EARLINET) and in dedicated field campaigns in the Sahara (SAMUM-1), Cape Verde (SAMUM-2), Barbados (SALTRACE), the Atlantic Ocean (Polarstern and Meteor cruises), and Amazonia. The model is designed such that it covers the entire loop from aerosol microphysics via aerosol classification to optical and radiative properties of the respective types, and allows consistency checks of modeled and measured parameters (end-to-end approach). Optical modeling considers scattering properties of spherical and non-spherical particles. A suitable set of aerosol types is defined which includes dust, clean marine, clean continental, pollution, smoke, and stratospheric aerosol. Mixtures of these types are included as well. The definition is consistent with CALIPSO approaches and will thus enable the establishment of a long-term global four-dimensional aerosol dataset.

  17. Sound classification of dwellings – A diversity of national schemes in Europe

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2011-01-01

    Sound classification schemes for dwellings exist in ten countries in Europe, typically prepared and published as national standards. The schemes define quality classes intended to reflect different levels of acoustical comfort. The main criteria concern airborne and impact sound insulation betwee...

  18. Biogeography based Satellite Image Classification

    CERN Document Server

    Panchal, V K; Kaur, Navdeep; Kundra, Harish

    2009-01-01

    Biogeography is the study of the geographical distribution of biological organisms. The mindset of the engineer is that we can learn from nature. Biogeography Based Optimization (BBO) is a burgeoning nature-inspired technique for finding the optimal solution of a problem. Satellite image classification is an important task because it is the only way we can learn about the land cover of inaccessible areas. Though satellite images have been classified in the past using various techniques, researchers are always seeking alternative strategies for satellite image classification so that they may be prepared to select the most appropriate technique for the feature extraction task at hand. This paper is focused on classification of the satellite image of a particular land cover using the theory of Biogeography Based Optimization. The original BBO algorithm does not have the inbuilt property of clustering which is required during image classification. Hence modifications have been proposed to the original algorithm and...

  19. Classification method based on KCCA

    Science.gov (United States)

    Wang, Zhanqing; Zhang, Guilin; Zhao, Guangzhou

    2007-11-01

    Nonlinear CCA extends linear CCA in that it operates in the kernel space and thus implies nonlinear combinations in the original space. This paper presents a classification method based on kernel canonical correlation analysis (KCCA). We introduce probabilistic label vectors (PLV) for a given pattern, which extend the conventional concept of a class label, and investigate the correlation between feature variables and PLV variables. A PLV predictor is presented based on KCCA, and classification is then performed on the predicted PLV. We formulate a framework for classification by integrating class information through PLV. Experimental results on Iris data set classification and facial expression recognition show the efficiency of the proposed method.

  20. Sound classification of dwellings – A diversity of national schemes in Europe

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2011-01-01

    Sound classification schemes for dwellings exist in ten countries in Europe, typically prepared and published as national standards. The schemes define quality classes intended to reflect different levels of acoustical comfort. The main criteria concern airborne and impact sound insulation between dwellings, facade sound insulation and installation noise. This paper presents the sound classification schemes in Europe and compares the class criteria for sound insulation between dwellings. The schemes have been implemented and revised gradually since the early 1990s. However, due to lack of […] Housing Constructions", has been established and runs 2009-2013. The main objectives of TU0901 are to prepare proposals for harmonized sound insulation descriptors and for a European sound classification scheme with a number of quality classes for dwellings.

  1. Classification-based reasoning

    Science.gov (United States)

    Gomez, Fernando; Segami, Carlos

    1991-01-01

    A representation formalism for N-ary relations, quantification, and definition of concepts is described. Three types of conditions are associated with the concepts: (1) necessary and sufficient properties, (2) contingent properties, and (3) necessary properties. Also explained is how complex chains of inferences can be accomplished by representing existentially quantified sentences, and concepts denoted by restrictive relative clauses as classification hierarchies. The representation structures that make possible the inferences are explained first, followed by the reasoning algorithms that draw the inferences from the knowledge structures. All the ideas explained have been implemented and are part of the information retrieval component of a program called Snowy. An appendix contains a brief session with the program.

  2. Texture classification based on EMD and FFT

    Institute of Scientific and Technical Information of China (English)

    XIONG Chang-zhen; XU Jun-yi; ZOU Jian-cheng; QI Dong-xu

    2006-01-01

    Empirical mode decomposition (EMD) is an adaptive and approximately orthogonal filtering process that reflects the human visual mechanism of differentiating textures. In this paper, we present a modified 2D EMD algorithm using the FastRBF and an appropriate number of iterations in the sifting process (SP), then apply it to texture classification. Rotation-invariant texture feature vectors are extracted using auto-registration and circular regions of the magnitude spectra of the 2D fast Fourier transform (FFT). In the experiments, we employ a Bayesian classifier to classify a set of 15 distinct natural textures selected from the Brodatz album. The experimental results, based on different testing datasets for images with different orientations, show the effectiveness of the proposed classification scheme.
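Rotating an image rotates the magnitude of its 2D Fourier spectrum by the same angle, which is why averaging the spectrum over circular regions yields rotation-invariant features. The sketch below illustrates that idea in isolation; it is not the authors' full EMD pipeline, and the ring count is an arbitrary choice.

```python
import numpy as np

def ring_features(image, n_rings=8):
    """Average the 2D FFT magnitude over concentric rings about the
    spectrum centre, giving rotation-tolerant texture features."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    h, w = spec.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - cy, xx - cx)
    # Assign each frequency bin to one of n_rings equal-width rings.
    ring = np.minimum((r / (r.max() + 1e-9) * n_rings).astype(int),
                      n_rings - 1)
    return np.array([spec[ring == k].mean() for k in range(n_rings)])
```

For odd-sized images the shifted DC bin sits exactly at the geometric centre, so a 90° rotation permutes bins within each ring and the features match exactly.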

  3. Quantum Authentication Scheme Based on Entanglement Swapping

    Science.gov (United States)

    Penghao, Niu; Yuan, Chen; Chong, Li

    2016-01-01

    Based on entanglement swapping, a quantum authentication scheme with a trusted party is proposed in this paper. With this scheme, two users can perform mutual identity authentication to confirm each other's validity. In addition, the scheme is proved to be secure under circumstances where a malicious attacker is capable of monitoring the classical and quantum channels and has the power to forge all information on the public channel.

  4. Classification of base sequences

    CERN Document Server

    Djokovic, Dragomir Z

    2010-01-01

    Base sequences BS(n+1,n) are quadruples of {1,-1}-sequences (A;B;C;D), with A and B of length n+1 and C and D of length n, such that the sum of their nonperiodic autocorrelation functions is a delta-function. The base sequence conjecture, asserting that BS(n+1,n) exist for all n, is stronger than the famous Hadamard matrix conjecture. We introduce a new definition of equivalence for base sequences BS(n+1,n) and construct a canonical form. By using this canonical form, we have enumerated the equivalence classes of BS(n+1,n) for n <= 30. Due to excessive size of the equivalence classes, the tables in the paper cover only the cases n <= 12.
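The defining property is easy to check numerically. The sketch below computes nonperiodic autocorrelation functions (NPAF) and tests whether four candidate ±1 sequences form a member of BS(n+1, n); the small examples in the test are constructed by hand, not taken from the paper's tables.

```python
def npaf(seq, s):
    """Nonperiodic autocorrelation of a +/-1 sequence at shift s."""
    return sum(seq[i] * seq[i + s] for i in range(len(seq) - s))

def is_base_sequences(A, B, C, D):
    """Check BS(n+1, n): the NPAFs of A, B, C, D must sum to zero at
    every nonzero shift (a delta-function total autocorrelation)."""
    n = len(C)
    assert len(A) == len(B) == n + 1 and len(D) == n
    return all(npaf(A, s) + npaf(B, s) + npaf(C, s) + npaf(D, s) == 0
               for s in range(1, n + 1))
```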

  5. Classification of basic facilities for high-rise residential: A survey from 100 housing scheme in Kajang area

    Science.gov (United States)

    Ani, Adi Irfan Che; Sairi, Ahmad; Tawil, Norngainy Mohd; Wahab, Siti Rashidah Hanum Abd; Razak, Muhd Zulhanif Abd

    2016-08-01

    High demand for housing and limited land in town areas have increased the provision of high-rise residential schemes. This type of housing has different owners but shares the same land lot and common facilities. Thus, maintenance works on the buildings and common facilities must be well organized. The purpose of this paper is to identify and classify basic facilities for high-rise residential buildings, in the hope of improving the management of such schemes. The method adopted is a survey of 100 high-rise residential schemes, ranging from affordable to high-cost housing, selected by snowball sampling. The scope of this research is the Kajang area, which is being rapidly developed with high-rise housing. The objective of the survey is to list all facilities in every sampled scheme. The results confirmed that the 11 pre-determined classifications hold true and provide a realistic classification for high-rise residential schemes. This paper proposes a redefinition of the facilities provided, to create a better management system and to give a clear definition of the types of high-rise residential buildings based on their facilities.

  6. A High Dimensional Clustering Scheme for Data Classification

    Directory of Open Access Journals (Sweden)

    Tejalal Choudhary

    2015-09-01

    Data mining is the extraction of knowledge, or the discovery of hidden patterns, from large volumes of data that may come in different forms and from different sources. Data mining systems are used in many research domains, such as health care, stock market analysis, supermarkets, and weather forecasting. These systems rely on computer-based algorithms, which can be categorized as supervised or unsupervised: classification and prediction algorithms belong to the supervised category, while clustering algorithms are unsupervised. Clustering groups similar data objects into the same cluster; the key objective is that the distance between objects within a cluster should be as small as possible, while the distance between objects in different clusters should be large. k-means is one of the most common clustering algorithms. It is easy to use and efficient, but it has a weakness: random or inappropriate selection of the initial centroids can degrade results, so k-means needs improvement. The proposed work is an attempt to improve k-means by using a genetic algorithm for the selection of the initial cluster centroids.
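The idea of seeding k-means with centroids chosen by a genetic algorithm can be sketched compactly. The version below is a simplified illustration: the fitness function, population size, and mutation operator are ad hoc choices, not the paper's design.

```python
import random

def sse(points, centroids):
    """Within-cluster sum of squared distances (lower is better)."""
    return sum(min(sum((p - c) ** 2 for p, c in zip(pt, ct)) for ct in centroids)
               for pt in points)

def kmeans(points, centroids, iters=20):
    """Standard Lloyd iterations from the given initial centroids."""
    k = len(centroids)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for pt in points:
            j = min(range(k),
                    key=lambda i: sum((p - c) ** 2
                                      for p, c in zip(pt, centroids[i])))
            clusters[j].append(pt)
        centroids = [tuple(sum(d) / len(cl) for d in zip(*cl)) if cl
                     else centroids[i] for i, cl in enumerate(clusters)]
    return centroids

def ga_init(points, k, pop=12, gens=15, rng=None):
    """Tiny GA over index sets: evolve which k data points seed the centroids."""
    rng = rng or random.Random(0)
    n = len(points)
    population = [rng.sample(range(n), k) for _ in range(pop)]
    fitness = lambda idx: sse(points, [points[i] for i in idx])
    for _ in range(gens):
        population.sort(key=fitness)
        survivors = population[:pop // 2]
        children = []
        for parent in survivors:
            child = list(parent)
            child[rng.randrange(k)] = rng.randrange(n)   # point mutation
            children.append(child if len(set(child)) == k else parent)
        population = survivors + children
    return [points[i] for i in min(population, key=fitness)]
```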

  7. Dose classification scheme for digital imaging techniques in diagnostic radiology

    International Nuclear Information System (INIS)

    … CT all clinical questions can be answered with certainty and regardless of the clinical experience of the involved physician. They are often recommended by the equipment manufacturers and should be reviewed critically because of their high radiation exposure. Conclusion: The classification of applicable doses into three classes can generally be considered a practicable way of dose reduction. (author)

  8. A Novel Broadcast Scheme DSR-based Mobile Adhoc Networks

    Directory of Open Access Journals (Sweden)

    Muneer Bani Yassein

    2016-04-01

    Traffic classification seeks to assign packet flows to an appropriate quality of service (QoS) class. Despite many studies that have placed a lot of emphasis on broadcast communication, broadcasting in MANETs is still a problematic issue. Because MANETs have no fixed infrastructure, broadcast is an essential operation for all network nodes. Although blind flooding is the simplest broadcasting technique, it is inefficient and wastes resources. One scheme proposed to mitigate this deficiency is counter-based broadcasting, which relies on the number of duplicate packets a node receives from its neighbors: a node counts each duplicate of a packet that a neighboring node has already re-broadcast. Because existing counter-based schemes are mainly based on a fixed counter threshold, they are not efficient under different operating conditions. Thus, unlike existing studies, this paper proposes a dynamic counter-based threshold and examines its effectiveness under the Dynamic Source Routing (DSR) protocol, one of the well-known on-demand routing protocols. Specifically, we develop a new counter-based broadcast algorithm under the umbrella of DSR, namely Inspired Counter Based Broadcasting (DSR-ICB). In various simulation experiments, DSR-ICB has shown good performance, especially in terms of delay and the number of redundant packets.
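A dynamic counter-based rebroadcast decision can be sketched as follows. The neighbour-count bands and threshold values here are illustrative placeholders, not the DSR-ICB parameters from the paper.

```python
def counter_threshold(n_neighbors, sparse=6, dense=12, low=2, mid=4, high=6):
    """Pick a duplicate-counter threshold from the local neighbour count:
    sparse neighbourhoods get a high threshold (favour reachability),
    dense ones a low threshold (suppress redundant rebroadcasts)."""
    if n_neighbors < sparse:
        return high
    if n_neighbors < dense:
        return mid
    return low

def should_rebroadcast(duplicates_heard, n_neighbors):
    """Rebroadcast only if fewer duplicates than the dynamic threshold
    were overheard during the random assessment delay."""
    return duplicates_heard < counter_threshold(n_neighbors)
```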

  9. Hybrid Support Vector Machines-Based Multi-fault Classification

    Institute of Scientific and Technical Information of China (English)

    GAO Guo-hua; ZHANG Yong-zhong; ZHU Yu; DUAN Guang-huang

    2007-01-01

    Support Vector Machines (SVM) is a general machine-learning tool based on the structural risk minimization principle. This characteristic is very significant for fault diagnostics when the number of fault samples is limited. Considering that SVM theory was originally designed for two-class classification, a hybrid SVM scheme is proposed in this paper for multi-fault classification of rotating machinery. Two SVM strategies, 1-v-1 (one versus one) and 1-v-r (one versus rest), are adopted at different classification levels. At the parallel classification level, using the 1-v-1 strategy, the fault features extracted by various signal analysis methods are fed into multiple parallel SVMs and local classification results are obtained. At the serial classification level, these local results are fused by one serial SVM based on the 1-v-r strategy. The hybrid SVM scheme not only generalizes the performance of single binary SVMs but also improves the precision and reliability of the fault classification results. Test results show the availability and suitability of this new method.
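At the parallel level, 1-v-1 results are typically fused by voting across all class pairs. The snippet below sketches that majority-vote step only; the paper's serial 1-v-r SVM fusion stage is not reproduced here, and the fault-class names are hypothetical.

```python
from itertools import combinations
from collections import Counter

def one_vs_one_vote(pairwise_winner, classes):
    """Fuse the binary decisions of all pairwise SVMs by majority vote.

    pairwise_winner : dict mapping each sorted class pair (a, b) to the
                      class chosen by that pair's binary classifier
    """
    votes = Counter(pairwise_winner[pair]
                    for pair in combinations(sorted(classes), 2))
    return votes.most_common(1)[0][0]
```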

  10. Modulation classification based on spectrogram

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The aim of modulation classification (MC) is to identify the modulation type of a communication signal. It plays an important role in many cooperative and noncooperative communication applications. Three spectrogram-based modulation classification methods are proposed. Their recognition scope and performance are investigated and evaluated by theoretical analysis and extensive simulation studies. The method using moment-like features is robust to frequency offset, while the other two, which apply principal component analysis (PCA) to different transformation inputs, can achieve satisfactory accuracy even at low SNR (as low as 2 dB). Due to the properties of the spectrogram, the statistical pattern recognition techniques, and the image preprocessing steps, all of the methods are insensitive to unknown phase and frequency offsets, timing errors, and the arriving sequence of symbols.
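A minimal spectrogram front end with one moment-like feature pair can be sketched as follows. The window length, hop size, and the choice of spectral centroid and spread as features are illustrative assumptions, not the paper's exact design; centred (central) moments are one way features tolerate a constant frequency offset.

```python
import numpy as np

def spectrogram(x, win=64, hop=32):
    """Magnitude STFT: sliding Hann-windowed FFTs of a real signal x."""
    frames = [x[i:i + win] * np.hanning(win)
              for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))

def moment_features(spec):
    """Per-frame spectral centroid and centroid-centred spread,
    averaged over time."""
    freqs = np.arange(spec.shape[1])
    p = spec / spec.sum(axis=1, keepdims=True)       # normalise each frame
    centroid = (p * freqs).sum(axis=1)
    spread = (p * (freqs - centroid[:, None]) ** 2).sum(axis=1)
    return float(centroid.mean()), float(spread.mean())
```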

  11. A comparison between national scheme for the acoustic classification of dwellings in Europe and in the U.S

    DEFF Research Database (Denmark)

    Berardi, Umberto; Rasmussen, Birgit

    2015-01-01

    …, focusing on sound insulation performance, national schemes for sound classification of dwellings have been developed in several European countries. These schemes define acoustic classes according to different levels of sound insulation. Due to the lack of coordination among countries, a significant diversity in terms of descriptors, number of classes, and class intervals occurred between national schemes. However, a proposal for an "acoustic classification scheme for dwellings" has been developed recently in the European COST Action TU0901 with 32 member countries. This proposal has been accepted as an ISO work item. This paper compares sound classification schemes in Europe with the current situation in the United States. Economic evaluations related to the technological choices necessary to achieve different sound classification classes are also discussed. The hope is that a common sound classification…

  12. An unsupervised classification scheme for improving predictions of prokaryotic TIS

    Directory of Open Access Journals (Sweden)

    Meinicke Peter

    2006-03-01

    Background: Although it is not difficult for state-of-the-art gene finders to identify coding regions in prokaryotic genomes, exact prediction of the corresponding translation initiation sites (TIS) is still a challenging problem. Recently a number of post-processing tools have been proposed for improving the annotation of prokaryotic TIS. However, inherent difficulties of these approaches arise from the considerable variation of TIS characteristics across different species. Therefore prior assumptions about the properties of prokaryotic gene starts may cause suboptimal predictions for newly sequenced genomes with TIS signals differing from those of well-investigated genomes. Results: We introduce a clustering algorithm for completely unsupervised scoring of potential TIS, based on positionally smoothed probability matrices. The algorithm requires an initial gene prediction and the genomic sequence of the organism to perform the reannotation. Compared with other methods for improving predictions of gene starts in bacterial genomes, our approach is not based on any specific assumptions about prokaryotic TIS. Despite the generality of the underlying algorithm, the prediction rate of our method is competitive on experimentally verified test data from E. coli and B. subtilis. Regarding genomes with high G+C content, in contrast to some previously proposed methods, our algorithm also provides good performance on P. aeruginosa, B. pseudomallei and R. solanacearum. Conclusion: On reliable test data we showed that our method provides good results in post-processing the predictions of the widely-used program GLIMMER. The underlying clustering algorithm is robust with respect to variations in the initial TIS annotation and does not require specific assumptions about prokaryotic gene starts. These features are particularly useful on genomes with high G+C content. The algorithm has been implemented in the tool »TICO« (TIs COrrector), which is
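Scoring candidate TIS contexts with positional probability matrices can be sketched as below. Pseudocounts stand in here for the paper's positional smoothing, which is a simplification; the toy site sequences in the test are invented for illustration.

```python
import math
from collections import Counter

def positional_matrix(sites, alphabet="ACGT", pseudo=0.5):
    """Positional probability matrix from equal-length aligned site
    sequences, with a pseudocount as a crude stand-in for smoothing."""
    length = len(sites[0])
    matrix = []
    for pos in range(length):
        counts = Counter(site[pos] for site in sites)
        total = len(sites) + pseudo * len(alphabet)
        matrix.append({b: (counts[b] + pseudo) / total for b in alphabet})
    return matrix

def score(seq, matrix):
    """Log-probability of a candidate site under the matrix; higher
    scores mark candidates resembling the training sites."""
    return sum(math.log(col[base]) for base, col in zip(seq, matrix))
```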

  13. Threshold Ring Signature Scheme Based on TPM

    Institute of Scientific and Technical Information of China (English)

    Gong Bei; Jiang Wei; Lin Li; Li Yu; Zhang Xing

    2012-01-01

    Conventional ring signature schemes cannot address scenarios where the rank of members of the ring needs to be distinguished, for example in electronic commerce applications. To solve this problem, we present a Trusted Platform Module (TPM)-based threshold ring signature scheme. Employing a reliable secret Share Distribution Center (SDC), the proposed approach can authenticate the TPM-based identity rank of ring members without tracking a specific member's identity. A subset including t members with the same identity rank is built, and with the signing cooperation of the t members of the subset, a ring signature based on the Chinese remainder theorem is generated. We prove the anonymity and unforgeability of the proposed scheme and compare it with a threshold ring signature based on Lagrange interpolation polynomials. Our scheme is relatively simpler to compute.

  14. LEARNING WITH ERROR BASED SEARCHABLE ENCRYPTION SCHEME

    Institute of Scientific and Technical Information of China (English)

    Zhang Jiuling; Deng Beixing; Li Xing

    2012-01-01

    A learning-with-errors (LWE) based encryption scheme that allows secure searching over the ciphertext is proposed. Both the generation of the ciphertext and the trapdoor of the query are based on the learning-with-errors problem. By performing an operation over the trapdoor and the ciphertext, it is possible to tell whether the ciphertext is the encryption of a given plaintext. The secure searchable encryption scheme is both ciphertext and trapdoor indistinguishable. The probabilities of missed and false matches occurring during search are both exponentially small.
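The underlying primitive can be illustrated with a toy single-bit LWE encryption. The parameters below are far too small to be secure, and the paper's searchable-trapdoor machinery is not reproduced; this only shows how a noisy inner product hides a bit.

```python
import random

def lwe_keygen(n=8, q=257, rng=None):
    """Toy secret key: a random vector over Z_q (insecure toy sizes)."""
    rng = rng or random.Random(1)
    return [rng.randrange(q) for _ in range(n)], q

def lwe_encrypt(bit, s, q, rng=None, noise=2):
    """Encrypt one bit as (a, b) with b = <a, s> + e + bit*(q//2) mod q."""
    rng = rng or random.Random(2)
    a = [rng.randrange(q) for _ in s]
    e = rng.randint(-noise, noise)
    b = (sum(ai * si for ai, si in zip(a, s)) + e + bit * (q // 2)) % q
    return a, b

def lwe_decrypt(ct, s, q):
    """Recover the bit: subtract <a, s> and round to 0 or q//2."""
    a, b = ct
    d = (b - sum(ai * si for ai, si in zip(a, s))) % q
    return int(min(d, q - d) > q // 4)   # closer to q//2 than to 0 -> 1
```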

  15. Evaluation of rock mass classification schemes: a case study from the Bowen Basin, Australia

    Science.gov (United States)

    Brook, Martin; Hebblewhite, Bruce; Mitra, Rudrajit

    2016-04-01

    The development of an accurate engineering geological model and adequate knowledge of spatial variation in rock mass conditions are important prerequisites for slope stability analyses, tunnel design, mine planning and risk management. Rock mass classification schemes such as Rock Mass Rating (RMR), Coal Mine Roof Rating (CMRR), Q-system and Roof Strength Index (RSI) have been used for a range of engineering geological applications, including transport tunnels, "hard rock" mining and underground and open-cut coal mines. Often, rock mass classification schemes have been evaluated on subaerial exposures, where weathering has affected joint characteristics and intact strength. In contrast, the focus of this evaluation of the above classification schemes is an underground coal mine in the Bowen Basin, central Queensland, Australia, 15 km east of the town of Moranbah. Rock mass classification was undertaken at 68 sites across the mine. Both the target coal seam and overlying rock show marked spatial variability in terms of RMR, CMRR and Q, but RSI showed limited sensitivity to changes in rock mass condition. Relationships were developed between different parameters with varying degrees of success. A mine-wide analysis of faulting was undertaken, and compared with in situ stress field and local-scale measurements of joint and cleat. While there are no unequivocal relationships between rock mass classification parameters and faulting, a central graben zone shows heterogeneous rock mass properties. The corollary is that if geological features can be accurately defined by remote sensing technologies, then this can assist in predicting rock mass conditions and risk management ahead of development and construction.

  16. An ECC-Based Blind Signature Scheme

    Directory of Open Access Journals (Sweden)

    Fuh-Gwo Jeng

    2010-08-01

    Cryptography is increasingly applied in the e-commerce world, especially in untraceable payment systems and electronic voting systems. Protocols for these systems strongly require the anonymous digital signature property, and a blind signature strategy is the answer. Chaum stated that every blind signature protocol should satisfy two fundamental properties: blindness and intractability. Almost all previously proposed blind signature schemes are based on integer factorization problems, discrete logarithm problems, or quadratic residues, and Lee et al. showed that none of these schemes is able to meet the two fundamental properties above. Therefore, an ECC-based blind signature scheme that possesses both properties is proposed in this paper.
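For contrast with the ECC construction proposed here, Chaum's classic RSA-based blind signature, the lineage this abstract alludes to, fits in a few lines. Toy parameters only; this is an illustration of the blinding idea, not the paper's scheme.

```python
# Toy RSA parameters (illustration only; real keys are far larger).
p, q = 61, 53
n = p * q                              # modulus 3233
e = 17                                 # public exponent
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent

def blind(m, r):
    """Requester hides message m with a blinding factor r coprime to n."""
    return (m * pow(r, e, n)) % n

def sign_blinded(mb):
    """Signer signs the blinded value without ever learning m."""
    return pow(mb, d, n)

def unblind(sb, r):
    """Requester strips the blinding factor, leaving m^d mod n."""
    return (sb * pow(r, -1, n)) % n
```

Blindness comes from r^e being a uniformly random mask, so the signer cannot link the blinded value it signed to the final signature.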

  17. Identity-based signature scheme based on quadratic residues

    Institute of Scientific and Technical Information of China (English)

    CHAI ZhenChuan; CAO ZhenFu; DONG XiaoLei

    2007-01-01

    Identity-based (ID-based) cryptography has drawn great attention in recent years, and most ID-based schemes are constructed from bilinear pairings. Therefore, ID-based schemes without pairings are of great interest in the field of cryptography. Up to now, it has remained a challenge to construct an ID-based signature scheme from quadratic residues. We aim to meet this challenge by proposing a concrete scheme. In this paper, we first introduce a technique for calculating a 2^l-th root of a quadratic residue, and then give a concrete ID-based signature scheme using this technique. We also prove that our scheme is chosen-message and chosen-ID secure in the random oracle model, assuming the hardness of factoring.
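For a prime p ≡ 3 (mod 4), a square root of a quadratic residue c is c^((p+1)/4) mod p, and a 2^l-th root can be obtained by iterating while staying on the quadratic-residue branch (exactly one of r and p−r is a residue, since −1 is a non-residue). This sketch illustrates only that number-theoretic step, not the paper's signature scheme, which works with a composite modulus.

```python
def sqrt_mod(c, p):
    """Square root of a quadratic residue c modulo a prime p ≡ 3 (mod 4)."""
    assert p % 4 == 3 and pow(c, (p - 1) // 2, p) == 1
    return pow(c, (p + 1) // 4, p)

def iterated_root(c, p, l):
    """A 2^l-th root of the quadratic residue c mod p: take square roots
    l times, each time picking the branch that is again a residue so the
    next square root exists."""
    for _ in range(l):
        r = sqrt_mod(c, p)
        c = r if pow(r, (p - 1) // 2, p) == 1 else p - r
    return c
```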

  18. Feasibility analysis of two identity- based proxy ring signature schemes

    Institute of Scientific and Technical Information of China (English)

    Wang Huaqun; Zhang Lijun; Zhao Junxi

    2007-01-01

    Recently, proxy ring signature schemes have been shown to be useful in various applications, such as electronic polling and electronic payment. Although many proxy ring signature schemes have been proposed, only two identity-based proxy ring signature schemes have been proposed so far, i.e., Cheng's scheme and Lang's scheme. Unfortunately, both identity-based proxy ring signature schemes are infeasible. This paper points out the reasons why the two schemes are infeasible. In order to design feasible and efficient identity-based proxy ring signature schemes from bilinear pairings, other methods must be sought.

  19. Classification schemes for carbon phases and nanostructures

    Institute of Scientific and Technical Information of China (English)

    Evgeny A Belenkov; Vladimir A Greshnyakov

    2013-01-01

    New schemes of structural classification for carbon phases and nanostructures have been proposed, which are based on the types of chemical bonds formed and the numbers of nearest neighbors with which each atom forms covalent bonds. The classification schemes can describe not only the known phases, but also new phases and nanostructures. New phases can be derived by linking, superpositioning or cutting precursor structures. The classification scheme has been used to predict diamond polymorphs, yielding thirty diamond-like phases that consist of atoms in equivalent crystallographic positions, eighteen of which were predicted for the first time.

  20. Password Authentication Based on Fractal Coding Scheme

    OpenAIRE

    Al-Saidi, Nadia M. G.; Mohamad Rushdan Md. Said; Othman, Wan Ainun M.

    2012-01-01

    Password authentication is a mechanism used to authenticate user identity over an insecure communication channel. In this paper, a new method to improve the security of password authentication is proposed. It is based on the compression capability of fractal image coding, providing an authorized user with secure access to the registration and login process. In the proposed scheme, a hashed password string is generated and encrypted to be captured together with the user identity using text to image ...

  1. Arabic Text Mining Using Rule Based Classification

    OpenAIRE

    Fadi Thabtah; Omar Gharaibeh; Rashid Al-Zubaidy

    2012-01-01

    A well-known problem in the domain of text mining is text classification, which concerns mapping textual documents into one or more predefined categories based on their content. Text classification has recently attracted many researchers because of the massive amounts of online documents and text archives which hold essential information for decision-making processes. In this field, most research focuses on classifying English documents, while there are limited studi...

  2. Identity-based ring signature scheme based on quadratic residues

    Institute of Scientific and Technical Information of China (English)

    Xiong Hu; Qin Zhiguang; Li Fagen

    2009-01-01

    Identity-based (ID-based) ring signatures have drawn great attention in recent years, and many ID-based ring signature schemes have been proposed. Unfortunately, all of these ID-based ring signatures are constructed from bilinear pairings, a powerful but computationally expensive primitive. Hence, an ID-based ring signature without pairing is of great interest in the field of cryptography. In this paper, the authors first propose an ID-based ring signature scheme based on quadratic residues. The proposed scheme is proved to be existentially unforgeable against adaptive chosen message-and-identity attacks in the random oracle model, assuming the hardness of factoring. The proposed scheme is more efficient than those constructed from bilinear pairings.
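
    The number-theoretic primitive underlying such factoring-based schemes can be illustrated with Euler's criterion; this is a hedged sketch of the building block, not the paper's signature algorithm:

```python
# Euler's criterion: x is a quadratic residue mod an odd prime p iff
# x^((p-1)/2) = 1 (mod p). Deciding residuosity mod N = p*q without
# knowing p and q is believed hard, which is what factoring-based
# schemes rely on. Toy parameters for illustration only.

def is_qr_mod_prime(x: int, p: int) -> bool:
    """Euler's criterion for an odd prime p and gcd(x, p) == 1."""
    return pow(x, (p - 1) // 2, p) == 1

p, q = 11, 19
n = p * q  # 209
x = 5 * 5 % n  # 25: a square mod n, hence a QR mod both p and q
assert is_qr_mod_prime(x % p, p) and is_qr_mod_prime(x % q, q)

# A holder of the factorization can even recover a square root
# directly when p = 3 (mod 4):
root = pow(x % p, (p + 1) // 4, p)
print(root * root % p == x % p)  # True
```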

  3. A Dynamic Multimedia User-Weight Classification Scheme for IEEE_802.11 WLANs

    CERN Document Server

    Rebai, Ahmed Riadh; 10.5121/ijcnc.2011.3214

    2011-01-01

    In this paper we present a dynamic traffic-classification scheme to support multimedia applications such as voice and broadband video transmission over IEEE 802.11 Wireless Local Area Networks (WLANs). Over a Wi-Fi link, to better serve these applications - which normally have a strict bounded transmission delay or a minimum link rate requirement - a service differentiation technique can be applied to the media traffic transmitted by the same mobile node using the well-known 802.11e Enhanced Distributed Channel Access (EDCA) protocol. However, the EDCA mode does not offer user differentiation, which can be viewed as a deficiency in multi-access wireless networks. Accordingly, we propose a new inter-node priority access scheme for IEEE 802.11e networks which is compatible with the EDCA scheme. The proposed scheme assigns a dynamic user weight to each mobile station depending on its outgoing data, and thereby deploys inter-node priority for channel access to complement the existing EDCA inte...

  4. DIFFERENCE SCHEMES BASING ON COEFFICIENT APPROXIMATION

    Institute of Scientific and Technical Information of China (English)

    MOU Zong-ze; LONG Yong-xing; QU Wen-xiao

    2005-01-01

    For variable-coefficient differential equations, approximating the coefficient by a function is more accurate than freezing it as a constant on every discrete subinterval. Difference schemes constructed from a Taylor expansion approximation of the solution usually do not suit solutions with sharp behavior. By introducing local bases combined with coefficient function approximation, the resulting difference schemes can depict more complex physical phenomena, such as boundary layers and high oscillation with sharp behavior. Numerical tests show the method is more effective than the traditional one.

  5. Head/tail Breaks: A New Classification Scheme for Data with a Heavy-tailed Distribution

    CERN Document Server

    Jiang, Bin

    2012-01-01

    This paper introduces a new classification scheme - head/tail breaks - in order to find groupings or hierarchy for data with a heavy-tailed distribution. The heavy-tailed distributions are heavily right skewed, with a minority of large values in the head and a majority of small values in the tail, commonly characterized by a power law, a lognormal or an exponential function. For example, a country's population is often distributed in such a heavy-tailed manner, with a minority of people (e.g., 20 percent) in the countryside and the vast majority (e.g., 80 percent) in urban areas. This heavy-tailed distribution is also called scaling, hierarchy or scaling hierarchy. This new classification scheme partitions all of the data values around the mean into two parts and continues the process iteratively for the values (above the mean) in the head until the head part values are no longer heavy-tailed distributed. Thus, the number of classes and the class intervals are both naturally determined. We therefore claim tha...
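
    The iterative partitioning described above can be sketched in a few lines. The 40% head-size stopping threshold used here is a common practical choice and an assumption on our part; the paper itself stops when the head is no longer heavy-tailed:

```python
# Minimal sketch of head/tail breaks: split the values at the mean,
# keep the head (values above the mean), and recurse while the head
# remains a minority (here: under 40% of the current values).

def head_tail_breaks(values, head_limit=0.4):
    """Return the list of class-interval break points (the means)."""
    breaks = []
    current = list(values)
    while len(current) > 1:
        mean = sum(current) / len(current)
        head = [v for v in current if v > mean]
        if not head or len(head) / len(current) >= head_limit:
            break
        breaks.append(mean)
        current = head
    return breaks

# A power-law-like toy dataset: many small values, few large ones.
data = [1, 1, 1, 1, 2, 2, 3, 5, 8, 20]
print(head_tail_breaks(data))  # [4.4, 11.0] -> three classes
```

    The number of classes falls out of the data: two breaks here partition the values into three naturally determined classes.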

  6. Climatological characteristics of the tropics in China: climate classification schemes between German scientists and Huang Bingwei

    Institute of Scientific and Technical Information of China (English)

    Manfred Domroes

    2003-01-01

    Reviewing some important German scientists who have developed climatic regionalization schemes on either a global or a Chinese scale, their various definitions of the tropical climate characteristics in China are discussed and compared with Huang Bingwei's climate classification scheme and the identification of the tropical climate therein. Due to the different methodological approaches of the climatic regionalization schemes, the definitions of the tropics vary, and hence also their spatial distribution in China. However, it is found that the tropical climate type occupies only a peripheral part of southern China, though it firmly represents a distinctive type of climate that is associated with great economic importance for China. As such, the tropical climate type was mostly identified by its agro-climatological significance, that is, by giving favourable year-round growing conditions for perennial crops with a great heat demand. Tropical climate is hence conventionally regarded as governed by year-round summer conditions "where winter never comes".

  7. A proper Land Cover and Forest Type Classification Scheme for Mexico

    Science.gov (United States)

    Gebhardt, S.; Maeda, P.; Wehrmann, T.; Argumedo Espinoza, J.; Schmidt, M.

    2015-04-01

    The imminent implementation of a REDD+ MRV system in Mexico in 2015, demanding operational annual land cover change reporting, requires highly accurate, annual, high-resolution forest type maps; not only for monitoring but also to establish the historical baseline from the 1990s onwards. The employment of any supervised classifier demands an exhaustive definition of land cover classes and the representation of all classes in the training stage. This paper reports the process of a data-driven class separability analysis and the definition and application of a national land cover classification scheme. All Landsat data recorded over Mexico in the year 2000 with cloud coverage below 10 percent and a national digital elevation model have been used. Automatic wall-2-wall image classification has been performed, trained on national reference data on land use and vegetation types with 66 classes. Validation has been performed against field plots of the national forest inventory. Groups of non-separable classes have subsequently been discerned by automatic iterative class aggregation. Class aggregations have finally been manually revised and modified towards a proposed national land cover classification scheme at 4 levels with 35 classes at the highest level, including 13 classes for primary temperate and tropical forests, 2 classes for secondary temperate and tropical forests, 1 for induced or cultivated forest, as well as 8 different scrubland classes. The remaining 11 classes cover agriculture, grassland, wetland, water bodies, urban and other vegetation land cover classes. The remaining 3 levels provide further hierarchic aggregations with 14, 10, and 8 classes, respectively. Trained on the relabeled training dataset, wall-2-wall classification towards the 35 classes has been performed. The final national land cover dataset has been validated against more than 200,000 polygons randomly distributed all over the country with class labels derived by manual interpretation. The

  8. Password Authentication Based on Fractal Coding Scheme

    Directory of Open Access Journals (Sweden)

    Nadia M. G. Al-Saidi

    2012-01-01

    Password authentication is a mechanism used to authenticate user identity over an insecure communication channel. In this paper, a new method to improve the security of password authentication is proposed. It is based on the compression capability of fractal image coding to provide an authorized user secure access to the registration and login process. In the proposed scheme, a hashed password string is generated and encrypted to be captured together with the user identity using text-to-image mechanisms. Fractal image coding is used to securely send the compressed image data through a nonsecure communication channel to the server. The verification of client information against the database system is performed at the server to authenticate the legal user. The encrypted hashed password in the decoded fractal image is recognized using optical character recognition. The authentication process is performed after successful verification of the client identity by comparing the decrypted hashed password with the one stored in the database system. The system is analyzed and discussed from the attacker's viewpoint. A security comparison is performed to show that the proposed scheme provides essential security requirements, while its efficiency makes it easy to apply alone or in hybrid with other security methods. Computer simulation and statistical analysis are presented.
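
    A minimal sketch of only the generic salted-hash registration/login step that such a scheme builds on; the fractal image coding, text-to-image, and OCR stages from the abstract are not reproduced:

```python
# Hedged sketch, assuming a simple in-memory user store: registration
# stores a salted PBKDF2 hash, login re-derives and compares in
# constant time. This is the generic primitive, not the paper's
# fractal-coding transport.
import hashlib
import hmac
import os

DB = {}  # user_id -> (salt, password_hash); stand-in for the server DB

def register(user_id: str, password: str) -> None:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    DB[user_id] = (salt, digest)

def login(user_id: str, password: str) -> bool:
    if user_id not in DB:
        return False
    salt, stored = DB[user_id]
    attempt = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(attempt, stored)  # constant-time compare

register("alice", "s3cret")
print(login("alice", "s3cret"), login("alice", "wrong"))  # True False
```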

  9. Sound classification of dwellings in the Nordic countries – Differences and similarities between the five national schemes

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2012-01-01

    In all five Nordic countries, sound classification schemes for dwellings have been published in national standards, implemented and revised gradually since the late 1990s. The national classification criteria for dwellings originate from a common Nordic INSTA-B proposal from the 1990s, thus having several similarities. In 2012, the status is that the number and denotations of classes for dwellings are identical in the Nordic countries, but the structures of the standards and several details are quite different. The issues dealt with also differ. Examples of differences are sound insulation ... for classification of such buildings. This paper presents and compares the main class criteria for sound insulation of dwellings and summarizes differences and similarities in criteria and in the structures of the standards. Classification schemes for dwellings also exist in several other countries in Europe...

  10. Secure Order-Specified Multisignature Scheme Based on DSA

    Institute of Scientific and Technical Information of China (English)

    YANG Muxiang; SU Li; LI Jun; HONG Fan

    2006-01-01

    In multisignature schemes, signers can sign either in a linear order or in no specified order, but neither option is adequate in scenarios that require mixed use of orderless and ordered multisignatures. Most order-specified multisignature schemes specify the order as a linear one. In this paper, we propose an order-specified multisignature scheme based on DSA that is secure against active insider attacks. To our knowledge, it is the first order-specified multisignature scheme based on the DSA signature scheme in which signers can sign in a flexible order represented by series-parallel graphs. In the scheme, verification of both the signers and the signing order is available. The security of the scheme is proved by reduction to an identification scheme with proven concrete security. The running time of verifying a signature is comparable to previous schemes, while the running time of multisignature generation and the space needed are less than in those schemes.

  11. Joint efforts to harmonize sound insulation descriptors and classification schemes in Europe (COST TU0901)

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2010-01-01

    ... new housing must meet the needs of the people and offer comfort. Also for existing housing, sound insulation aspects should be taken into account when renovating; otherwise the renovation is not "sustainable". A joint European Action, COST TU0901 "Integrating and Harmonizing Sound Insulation Aspects in Sustainable Urban Housing Constructions", has been approved and runs 2009-2013. The main objectives are to prepare proposals for harmonized sound insulation descriptors and for a European sound classification scheme. Other goals are e.g. to establish a catalogue of sound insulation data ... and an on-line compendium on good workmanship. The paper summarizes the background, discusses the present situation in Europe and describes the joint efforts to reduce the diversity in Europe.

  12. A Classification-based Review Recommender

    Science.gov (United States)

    O'Mahony, Michael P.; Smyth, Barry

    Many online stores encourage their users to submit product/service reviews in order to guide future purchasing decisions. These reviews are often listed alongside product recommendations but, to date, limited attention has been paid to how best to present these reviews to the end-user. In this paper, we describe a supervised classification approach that is designed to identify and recommend the most helpful product reviews. Using the TripAdvisor service as a case study, we compare the performance of several classification techniques using a range of features derived from hotel reviews. We then describe how these classifiers can be used as the basis for a practical recommender that automatically suggests the most helpful contrasting reviews to end-users. We present an empirical evaluation which shows that our approach achieves a statistically significant improvement over alternative review ranking schemes.

  13. A Malware Detection Scheme Based on Mining Format Information

    Science.gov (United States)

    Bai, Jinrong; Wang, Junfeng; Zou, Guozhong

    2014-01-01

    Malware has become one of the most serious threats to computer information systems, and current malware detection technology still has significant limitations. In this paper, we propose a malware detection approach that mines format information of PE (portable executable) files. Based on an in-depth analysis of the static format information of PE files, we extracted 197 features from the format information and applied feature selection methods to reduce the dimensionality of the features while achieving acceptably high performance. When the selected features were trained using classification algorithms, the results of our experiments indicate that the accuracy of the top classification algorithm is 99.1% and the value of the AUC is 0.998. We designed three experiments to evaluate the performance of our detection scheme and its ability to detect unknown and new malware. Although the experimental results of identifying new malware are not perfect, our method is still able to identify 97.6% of new malware with a 1.3% false positive rate. PMID:24991639
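
    A hedged sketch of the kind of static PE-format feature extraction such an approach relies on. The fields read here (machine type, section count, and timestamp from the COFF header) are standard PE fields, not the authors' 197-feature list:

```python
# Illustrative sketch: parse a PE byte blob and extract a few static
# header fields as classifier features. Layout facts used: the DOS
# header starts with "MZ", the 4-byte PE-header offset (e_lfanew)
# sits at 0x3C, and the COFF header (Machine, NumberOfSections,
# TimeDateStamp) follows the "PE\0\0" signature.
import struct

def pe_header_features(data: bytes) -> dict:
    """Return a small feature dict from a PE file's header bytes."""
    if data[:2] != b"MZ":
        raise ValueError("not a DOS/PE file")
    (e_lfanew,) = struct.unpack_from("<I", data, 0x3C)
    if data[e_lfanew:e_lfanew + 4] != b"PE\0\0":
        raise ValueError("missing PE signature")
    machine, num_sections, timestamp = struct.unpack_from(
        "<HHI", data, e_lfanew + 4)
    return {"machine": machine,
            "num_sections": num_sections,
            "timestamp": timestamp}

# Build a minimal synthetic header just to exercise the parser:
blob = bytearray(0x80)
blob[:2] = b"MZ"
struct.pack_into("<I", blob, 0x3C, 0x40)            # e_lfanew -> 0x40
blob[0x40:0x44] = b"PE\0\0"
struct.pack_into("<HHI", blob, 0x44, 0x8664, 3, 0)  # x86-64, 3 sections
print(pe_header_features(bytes(blob)))
```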

  14. Clinical presentation and outcome prediction of clinical, serological, and histopathological classification schemes in ANCA-associated vasculitis with renal involvement.

    Science.gov (United States)

    Córdova-Sánchez, Bertha M; Mejía-Vilet, Juan M; Morales-Buenrostro, Luis E; Loyola-Rodríguez, Georgina; Uribe-Uribe, Norma O; Correa-Rotter, Ricardo

    2016-07-01

    Several classification schemes have been developed for anti-neutrophil cytoplasmic antibody (ANCA)-associated vasculitis (AAV), with current debate focusing on their clinical and prognostic performance. Sixty-two patients with renal biopsy-proven AAV from a single center in Mexico City, diagnosed between 2004 and 2013, were analyzed and classified under clinical (granulomatosis with polyangiitis [GPA], microscopic polyangiitis [MPA], renal limited vasculitis [RLV]), serological (proteinase 3 anti-neutrophil cytoplasmic antibodies [PR3-ANCA], myeloperoxidase anti-neutrophil cytoplasmic antibodies [MPO-ANCA], ANCA negative), and histopathological (focal, crescentic, mixed-type, sclerosing) categories. Clinical presentation parameters were compared at baseline between classification groups, and the predictive value of the different classification categories for disease and renal remission, relapse, and renal and patient survival was analyzed. Serological classification predicted relapse rate (PR3-ANCA hazard ratio for relapse 2.93, 1.20-7.17, p = 0.019). There were no differences in disease or renal remission, renal, or patient survival between clinical and serological categories. Histopathological classification predicted response to therapy, with a poorer renal remission rate for the sclerosing group and those with less than 25 % normal glomeruli; in addition, it adequately delimited 24-month estimated glomerular filtration rate (eGFR) evolution, but it predicted neither renal nor patient survival. In multivariate models, renal replacement therapy (RRT) requirement (HR 8.07, CI 1.75-37.4, p = 0.008) and proteinuria (HR 1.49, CI 1.03-2.14, p = 0.034) at presentation predicted renal survival, while age (HR 1.10, CI 1.01-1.21, p = 0.041) and infective events during the induction phase (HR 4.72, 1.01-22.1, p = 0.049) negatively influenced patient survival. At present, ANCA-based serological classification may predict AAV relapses, but neither clinical nor serological

  15. An Authentication Technique Based on Classification

    Institute of Scientific and Technical Information of China (English)

    李钢; 杨杰

    2004-01-01

    We present a novel watermarking approach based on classification for authentication, in which a watermark is embedded into the host image. When the marked image is modified, the extracted watermark differs from the original watermark, and different kinds of modification lead to different extracted watermarks. In this paper, different kinds of modification are treated as classes, and we use a classification algorithm to recognize the modifications with high probability. Simulation results show that the proposed method is promising and effective.

  16. A network condition classification scheme for supporting video delivery over wireless Internet

    Institute of Scientific and Technical Information of China (English)

    CHAN Siu-ping; SUN Ming-ting

    2006-01-01

    Real-time video transport over wireless Internet faces many challenges due to the heterogeneous environment including wireline and wireless networks. A robust network condition classification algorithm using multiple end-to-end metrics and Support Vector Machine (SVM) is proposed to classify different network events and model the transition pattern of network conditions.End-to-end Quality-of-Service (QoS) mechanisms like congestion control, error control, and power control can benefit from the network condition information and react to different network situations appropriately. The proposed network condition classification algorithm uses SVM as a classifier to cluster different end-to-end metrics such as end-to-end delay, delay jitter, throughput and packet loss-rate for the UDP traffic with TCP-friendly Rate Control (TFRC), which is used for video transport. The algorithm is also flexible for classifying different numbers of states representing different levels of network events such as wireline congestion and wireless channel loss. Simulation results using network simulator 2 (ns2) showed the effectiveness of the proposed scheme.
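
    The end-to-end metrics listed above (delay, jitter, throughput, loss rate) can be computed from a packet trace as in this sketch; the SVM classification stage itself is omitted, and the trace format is a hypothetical one chosen for illustration:

```python
# Sketch of the feature-vector construction feeding the classifier.
# Each packet record is (send_time, recv_time or None if lost,
# size_bytes) -- an assumed format, not the paper's.

def link_features(packets, window_s: float) -> dict:
    """Compute per-window end-to-end metrics from a packet trace."""
    received = [(s, r, b) for (s, r, b) in packets if r is not None]
    delays = [r - s for (s, r, _) in received]
    # Jitter as mean absolute difference of successive delays:
    jitter = (sum(abs(a - b) for a, b in zip(delays, delays[1:]))
              / max(len(delays) - 1, 1))
    return {
        "mean_delay": sum(delays) / len(delays),
        "jitter": jitter,
        "throughput_Bps": sum(b for (_, _, b) in received) / window_s,
        "loss_rate": 1 - len(received) / len(packets),
    }

# Toy trace: four packets sent over 0.4 s, one lost.
trace = [(0.00, 0.05, 1000), (0.10, 0.16, 1000),
         (0.20, None, 1000), (0.30, 0.34, 1000)]
print(link_features(trace, window_s=0.4))
```

    Any classifier (the paper uses an SVM) could then consume these feature vectors to label the current network condition.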

  17. Blind Signature Scheme Based on Chebyshev Polynomials

    Directory of Open Access Journals (Sweden)

    Maheswara Rao Valluri

    2011-12-01

    A blind signature scheme is a cryptographic protocol for obtaining a valid signature for a message from a signer such that the signer's view of the protocol cannot be linked to the resulting message-signature pair. This paper presents a blind signature scheme using Chebyshev polynomials. The security of the given scheme depends upon the intractability of the integer factorization problem and discrete logarithms of Chebyshev polynomials.
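
    The property Chebyshev-polynomial cryptosystems exploit is the semigroup law T_r(T_s(x)) = T_rs(x) = T_s(T_r(x)). A sketch over Z_p with toy parameters (the blind-signature protocol itself is not reproduced; working modulo a prime follows the usual hardening of real-valued Chebyshev maps):

```python
# T_n(x) mod p via the recurrence T_0 = 1, T_1 = x,
# T_n = 2x*T_{n-1} - T_{n-2}; the semigroup law is a polynomial
# identity over the integers, so it survives reduction mod p.

def chebyshev(n: int, x: int, p: int) -> int:
    """Evaluate the Chebyshev polynomial T_n at x, modulo p."""
    t0, t1 = 1, x % p
    if n == 0:
        return t0
    for _ in range(n - 1):
        t0, t1 = t1, (2 * x * t1 - t0) % p
    return t1

p, x = 1009, 7   # toy-sized parameters for illustration
r, s = 12, 35
lhs = chebyshev(r, chebyshev(s, x, p), p)
rhs = chebyshev(s, chebyshev(r, x, p), p)
print(lhs == rhs == chebyshev(r * s, x, p))  # True
```

    This commutativity is what lets two parties combine secret exponents in either order, analogously to discrete-log-based protocols.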

  18. Polarimetric radar observations during an orographic rain event and the performance of a hydrometeor classification scheme

    Directory of Open Access Journals (Sweden)

    M. Frech

    2014-07-01

    An intense orographic precipitation event is analysed using two polarimetric C-band radars situated north of the Alps on 5 January 2013. One radar is operated at DWD's meteorological observatory Hohenpeißenberg (MHP, 1006 m above sea level), and the Memmingen radar (MEM, 65 km west of MHP, 600 m above sea level) is part of DWD's operational radar network. The event lasted about 1.5 days, and in total 44 mm of precipitation was measured at Hohenpeißenberg. Detailed high-resolution observations of the vertical structure of this event are obtained through a birdbath scan at 90° elevation, which is part of the operational scanning. This scan is acquired every 5 min and provides meteorological profiles at high spatial resolution. In the course of this event, the melting layer (ML) descends until the transition from rain into snow is observed at ground level. This transition is well documented by local weather observers and a present-weather sensor. The orographic precipitation event reveals mesoscale variability above the melting layer which is unexpected from a meteorological point of view. It corresponds to a substantial increase in rain rate at the surface. The performance of the newly developed hydrometeor classification scheme "Hymec" using Memmingen radar data over Hohenpeißenberg is analyzed. The detected location and timing of the ML agree well with the Hohenpeißenberg radar data. Considering the size of the Memmingen radar sensing volume, the detected hydrometeor (HM) types are consistent for measurements at or in a ML, even though surface observations may indicate, for example, rain while the predominant HM is classified as wet snow. To better link the HM classification with the surface observations, either better thermodynamic input is needed for Hymec, or a statistical correction of the HM classification similar to a model output statistics (MOS) approach may be needed.

  19. Review of habitat classification schemes appropriate to streams, rivers, and connecting channels in the Great Lakes drainage system

    Science.gov (United States)

    Hudson, Patrick L.; Griffiths, R.W.; Wheaton, T.J.

    1992-01-01

    Studies of lotic classification, zonation, and distribution carried out since the turn of the century were reviewed for their use in developing a habitat classification scheme for flowing water in the Great Lakes drainage basin. Seventy papers, dealing mainly with fish but including benthos, were organized into four somewhat distinct groups. A hierarchical scale of habitat measurements is suggested, and sources of data and inventory methods, including statistical treatment, are reviewed. An outline is also provided for developing a classification system for riverine habitat in the Great Lakes drainage basin.

  20. A NEW SCHEME BASED ON THE MI SCHEME AND ITS ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Jiao Luyao; Li Yifa; Qiao Shuaiting

    2013-01-01

    This article aims at designing a new Multivariate Quadratic (MQ) public-key scheme to avoid the linearization attack and the differential attack against the Matsumoto-Imai (MI) scheme. Based on the original scheme, our new scheme, named the Multi-layer MI (MMI) scheme, has a multi-layer central map structure. This article first introduces the MI scheme and describes the linearization and differential attacks; it then presents the design of MMI in detail and proves that MMI can resist both the linearization attack and the differential attack. Besides, this article also proves that MMI can resist recent eXtended Linearization (XL)-like methods. In the end, this article concludes that MMI also maintains the efficiency of MI.

  1. A new threshold signature scheme based on fuzzy biometric identity

    Institute of Scientific and Technical Information of China (English)

    Yongquan Cai; Ke Zhang

    2009-01-01

    The focus of this paper is to present the first threshold signature scheme based on biometric identity, which is derived from a recently proposed fuzzy identity-based encryption scheme. An important feature of this scheme, which differs from other previous ID-based threshold signature schemes, is that it can be applied to situations using not only average personal attributes in social contact but also people's noisy biometric inputs as identities. The security of our scheme in the selective-ID model reduces to the hardness of the Decisional BDH Assumption.

  2. Evaluating arguments based on Toulmin's scheme

    NARCIS (Netherlands)

    Verheij, Bart; Hitchcock, D; Verheij, B

    2006-01-01

    Toulmin's scheme for the layout of arguments (1958) represents an influential tool for the analysis of arguments. The scheme enriches the traditional premises-conclusion model of arguments by distinguishing additional elements, like warrant, backing and rebuttal. The present paper contains a formal

  3. Cryptanalysis and Improvement of Digital Multisignature Scheme Based on RSA

    Institute of Scientific and Technical Information of China (English)

    SU Li; CUI Guo-hua; CHEN Jing; YUAN Jun

    2007-01-01

    Zhang et al. proposed a sequential multisignature scheme based on RSA. The scheme has the advantages of low computation and communication costs, among others. However, we find a problem in their scheme: the verifier cannot distinguish whether the multisignature is signed by all the signers of the group or only by the last signer. Thus, any single signature created by the last signer can be used as a multisignature created by the whole group. This paper proposes an improved scheme that overcomes this defect. In the new scheme, the identity messages of all the signers are added to the multisignature and used in the verification phase, so that the verifier can determine which signers generated the signature. Performance analysis shows that the proposed scheme costs less computation than the original scheme in both the signature and verification phases. Furthermore, each partial signature is based on the signer's identity certificate, which makes the scheme more secure.

  4. A NEW EFFICIENT ID-BASED PROXY BLIND SIGNATURE SCHEME

    Institute of Scientific and Technical Information of China (English)

    Ming Yang; Wang Yumin

    2008-01-01

    In a proxy blind signature scheme, the proxy signer is allowed to generate a blind signature on behalf of the original signer. The proxy blind signature scheme is useful in several applications such as e-voting, e-payment, etc. Recently, Zheng, et al. presented an IDentity (ID)-based proxy blind signature. In this paper, a new efficient ID-based proxy blind signature scheme from bilinear pairings is proposed, which can satisfy the security properties of both the proxy signatures and the blind signature schemes. Analysis of the scheme efficiency shows that the new scheme is more efficient than Zheng, et al.'s scheme. The proposed scheme is more practical in the real world.

  5. A quantum identification scheme based on polarization modulation

    Institute of Scientific and Technical Information of China (English)

    He Guang-Qiang; Zeng Gui-Hua

    2005-01-01

    A quantum identification scheme including registration and identification phases is proposed. The users' passwords are transmitted by qubit strings and recorded as sets of quantum operators. The security of the proposed scheme is guaranteed by the no-cloning theorem. Based on photon polarization modulation, an experimental approach is also designed to implement the proposed scheme.

  6. An Identity-Based Strong Designated Verifier Proxy Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    WANG Qin; CAO Zhenfu

    2006-01-01

    In a strong designated verifier proxy signature scheme, a proxy signer can generate proxy signature on behalf of an original signer, but only the designated verifier can verify the validity of the proxy signature. In this paper, we first define the security requirements for strong designated verifier proxy signature schemes. And then we construct an identity-based strong designated verifier proxy signature scheme. We argue that the proposed scheme satisfies all of the security requirements.

  7. Verifiable (t, n) Threshold Signature Scheme Based on Elliptic Curve

    Institute of Scientific and Technical Information of China (English)

    WANG Hua-qun; ZHAO Jun-xi; ZHANG Li-jun

    2005-01-01

    Based on the difficulty of solving the ECDLP (elliptic curve discrete logarithm problem) over a finite field, we present a (t, n) threshold signature scheme and a verifiable key agreement scheme without a trusted party. Applying a modified elliptic curve signature equation, we obtain a more efficient signature scheme than the existing ECDSA (elliptic curve digital signature algorithm) from the computability and security points of view. Our scheme has a shorter key, faster computation, and better security.

  8. A New Efficient Certificate-Based Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yichen; LI Jiguo; WANG Zhiwei; YAO Wei

    2015-01-01

    Certificate-based cryptography is a new kind of public key algorithm, which combines the merits of traditional Public key infrastructure (PKI) and identity-based cryptography. It removes the inherent key escrow problem in identity-based cryptography and eliminates the certificate revocation problem and third-party queries in traditional PKI. In this paper, we propose an efficient certificate-based signature scheme based on bilinear pairings. Under the strong security model of certificate-based signature schemes, we prove that our scheme is existentially unforgeable against adaptive chosen message and identity attacks in the random oracle model. In our scheme, only two pairing operations are needed in the signing and verification processes. Compared with some certificate-based signature schemes from bilinear pairings, our scheme enjoys more advantage in computational cost and communication cost.

  9. Belief Function Based Decision Fusion for Decentralized Target Classification in Wireless Sensor Networks

    OpenAIRE

    Wenyu Zhang; Zhenjiang Zhang

    2015-01-01

    Decision fusion in sensor networks enables sensors to improve classification accuracy while reducing the energy consumption and bandwidth demand for data transmission. In this paper, we focus on the decentralized multi-class classification fusion problem in wireless sensor networks (WSNs) and a new simple but effective decision fusion rule based on belief function theory is proposed. Unlike existing belief function based decision fusion schemes, the proposed approach is compatible with any ty...

  10. Threshold Signature Scheme Based on Discrete Logarithm and Quadratic Residue

    Institute of Scientific and Technical Information of China (English)

    FEI Ru-chun; WANG Li-na

    2004-01-01

    Digital signature schemes are a very important research field in computer security and modern cryptography. A (k, n) threshold digital signature scheme is proposed by integrating a digital signature scheme with Shamir's secret sharing scheme. It realizes group-oriented digital signatures, and its security is based on the difficulty of computing discrete logarithms and quadratic residues under certain conditions. In this scheme, a valid digital signature cannot be generated by any k-1 or fewer legal users, nor by the signature executive alone. In addition, this scheme can identify any legal user who presents an incorrect partial digital signature to disrupt the correct signature, or any illegal user who forges a digital signature. A method of extending this scheme to an Abelian group, such as an elliptic curve group, is also discussed. The extended scheme can provide faster computing speed and stronger security when using a shorter key.
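
    The Shamir (k, n) secret-sharing building block mentioned above can be sketched as follows; the DSA-style signing layer and the discrete-log/quadratic-residue security conditions are omitted, and the field modulus is a toy choice:

```python
# Shamir (k, n) secret sharing over a prime field: the secret is the
# constant term of a random degree k-1 polynomial; any k shares
# recover it by Lagrange interpolation at x = 0, while k-1 shares
# reveal nothing about it.
import random

P = 2**127 - 1  # a Mersenne prime as the field modulus (toy choice)

def make_shares(secret: int, k: int, n: int):
    """Split `secret` into n shares, any k of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange-interpolate the polynomial at x = 0 to get the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # Fermat inverse of den, since P is prime:
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(123456789, k=3, n=5)
print(reconstruct(shares[:3]) == 123456789)   # True: any 3 of 5 suffice
print(reconstruct(shares[1:4]) == 123456789)  # True
```

    A threshold signature scheme replaces the raw secret with a signing key, so that any k of the n users can jointly produce a signature without ever reassembling the key at a single party.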

  11. An improved identity-based proxy ring signature scheme

    Institute of Scientific and Technical Information of China (English)

    Lang Weimin; Yang Zongkai; Cheng Wenqing; Tan Yunmeng

    2005-01-01

    Proxy ring signature schemes have been shown to be useful in various applications, such as electronic polling, electronic payment, etc. In this paper, we point out that Lang's scheme is unreasonable and propose an improved identity-based proxy ring signature scheme from bilinear pairings which overcomes the deficiencies of Lang's scheme. Our scheme can prevent the original signer from generating the proxy ring signature, thus the interests of the proxy signer are guaranteed. In addition, our scheme satisfies all the security requirements of proxy ring signatures, i.e., signer-ambiguity, non-forgeability, verifiability, non-deniability and distinguishability. Compared with Zhang's scheme, our scheme improves the computational efficiency of signature verification because the number of bilinear pairing computations required is reduced from O(n) to O(1).

  12. Broadcast encryption schemes based on RSA

    Institute of Scientific and Technical Information of China (English)

    MU Ning-bo; HU Yu-pu; OU Hai-wen

    2009-01-01

    Three broadcast schemes for a small receiver set, using properties of the RSA modulus, are presented. They solve the problem of data redundancy when the receiver set is small. In the proposed schemes, the center uses one key to encrypt the message and can revoke authorization conveniently. Every authorized user only needs to store one decryption key of constant size. Among these three schemes, the first is secure against adaptive chosen ciphertext attack (IND-CCA2), and no collusion of authorized users can produce a new decryption key, but the sizes of the encryption modulus and ciphertext are linear in the number of receivers. In the second scheme, the ciphertext is half the size of the first one, and any two authorized users can produce a new decryption key, but the center can identify them using the traitor tracing algorithm. The third is the most efficient, but the center cannot identify the traitors exactly.

  13. Classifying Obstructive and Nonobstructive Code Clones of Type I Using Simplified Classification Scheme: A Case Study

    Directory of Open Access Journals (Sweden)

    Miroslaw Staron

    2015-01-01

    Full Text Available Code cloning is a part of many commercial and open source development products. Multiple methods for detecting code clones have been developed, and clone detection is often used in modern quality assurance tools in industry. There is no consensus on whether the detected clones are negative for the product, and therefore the detected clones are often left unmanaged in the product code base. In this paper we investigate how obstructive code clones of Type I (duplicated exact code fragments) are in large software systems from the perspective of post-release product quality. We conduct a case study at Ericsson on three of its large products, which handle mobile data traffic. We show how to use automated analogy-based classification to decrease the classification effort required to determine whether a clone pair should be refactored or remain untouched. The automated method classifies 96% of Type I clones (both algorithms and data declarations), leaving the remaining 4% for manual classification. The results show that cloning is common in the studied commercial software, but that only 1% of these clones are potentially obstructive and can jeopardize the quality of the product if left unmanaged.

  14. The Performance-based Funding Scheme of Universities

    Directory of Open Access Journals (Sweden)

    Juha KETTUNEN

    2016-05-01

    Full Text Available The purpose of this study is to analyse the effectiveness of the performance-based funding scheme of the Finnish universities that was adopted at the beginning of 2013. The political decision-makers expect that the funding scheme will create incentives for the universities to improve performance, but such funding schemes have largely failed in many other countries, primarily because public funding is only a small share of the total funding of universities. This study is interesting because Finnish universities have no tuition fees, unlike in many other countries, and the state allocates funding based on the objectives achieved. The empirical evidence indicates that graduation rates increased when the new scheme was adopted, especially among male students, who have more room for improvement than female students. The new performance-based funding scheme allocates funding according to output-based indicators and limits the scope of strategic planning and the autonomy of the university. The performance-based funding scheme is translated into the strategy map of the balanced scorecard. The new funding scheme steers universities in many respects but leaves research and teaching skills to the discretion of the universities. The new scheme has also diminished the importance of the performance agreements between the university and the Ministry. The scheme increases the incentives for universities to improve their processes and structures in order to attain as much public funding as possible. It is optimal for the central administration of the university to allocate resources to faculties and other organisational units following the criteria of the performance-based funding scheme. The new funding scheme has made the universities compete with each other, because the total funding to the universities is allocated to each university according to the funding scheme.
There is a tendency that the funding schemes are occasionally

  15. Adaptive Image Transmission Scheme over Wavelet-Based OFDM System

    Institute of Scientific and Technical Information of China (English)

    GAO Xinying; YUAN Dongfeng; ZHANG Haixia

    2005-01-01

    In this paper an adaptive image transmission scheme is proposed over a Wavelet-based OFDM (WOFDM) system with Unequal error protection (UEP), achieved by the design of a non-uniform signal constellation in MLC. Two different data division schemes, byte-based and bit-based, are analyzed and compared. In the bit-based data division scheme, different bits are protected unequally according to their different contributions to image quality, which makes UEP combined with this scheme more powerful than with the byte-based scheme. Simulation results demonstrate that image transmission by UEP with the bit-based data division scheme yields much higher PSNR values and noticeably better image quality. Furthermore, considering the tradeoff between complexity and BER performance, the Haar wavelet, with the shortest compactly supported filter length, is the most suitable one among the orthogonal Daubechies wavelet series for the proposed system.

  16. ID-Based Signature Scheme with Weil Pairing

    Directory of Open Access Journals (Sweden)

    Neetu Sharma

    2013-09-01

    Full Text Available A digital signature is an essential component in cryptography. Digital signatures guarantee end-to-end message integrity and provide authentication information about the origin of a message. In this paper we propose a new identity-based digital signature scheme with the Weil pairing. We also analyze the security and efficiency of our scheme. The security of our scheme is based on expressing a torsion point of the curve as a linear combination of its basis points, which is harder than solving the ECDLP (Elliptic Curve Discrete Logarithm Problem). We claim that our new identity-based digital signature scheme is more secure and efficient than the existing scheme of Islam et al (S. K. Hafizul Islam, G. P. Biswas, An Efficient and Provably-secure Digital Signature Scheme based on Elliptic Curve Bilinear Pairings, Theoretical and Applied Informatics, ISSN 1896-5334, Vol. 24, no. 2, 2012, pp. 109-118) based on bilinear pairing.

  17. Task Classification Based Energy-Aware Consolidation in Clouds

    Directory of Open Access Journals (Sweden)

    HeeSeok Choi

    2016-01-01

    Full Text Available We consider a cloud data center in which the service provider supplies virtual machines (VMs) on hosts or physical machines (PMs) to its subscribers for computation in an on-demand fashion. For the cloud data center, we propose a task consolidation algorithm based on task classification (i.e., computation-intensive and data-intensive) and resource utilization (e.g., CPU and RAM). Furthermore, we design a VM consolidation algorithm to balance task execution time and energy consumption without violating a predefined service level agreement (SLA). Unlike the existing research on VM consolidation or scheduling that applies no threshold or only a single threshold, we focus on a double-threshold (upper and lower) scheme for VM consolidation. More specifically, when a host operates with resource utilization below the lower threshold, all the VMs on the host are scheduled to be migrated to other hosts and the host is then powered down, while when a host operates with resource utilization above the upper threshold, a VM is migrated to avoid using 100% of the resource. Based on experimental performance evaluations with real-world traces, we show that our task classification based energy-aware consolidation algorithm (TCEA) achieves a significant energy reduction without incurring predefined SLA violations.
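
    The double-threshold migration rule described above can be sketched in a few lines. The threshold values and action names below are invented for illustration; the paper derives its own thresholds.

```python
# Hypothetical sketch of a double-threshold consolidation rule.
LOWER, UPPER = 0.2, 0.8   # illustrative utilization thresholds in [0, 1]

def consolidation_action(utilization):
    """Decide what to do with a host given its resource utilization."""
    if utilization < LOWER:
        return "migrate_all_and_power_down"   # under-utilized host
    if utilization > UPPER:
        return "migrate_one_vm"               # avoid saturating the host
    return "keep"                             # within the acceptable band

assert consolidation_action(0.1) == "migrate_all_and_power_down"
assert consolidation_action(0.5) == "keep"
assert consolidation_action(0.9) == "migrate_one_vm"
```

    The lower threshold saves energy by emptying lightly loaded hosts; the upper threshold protects the SLA by keeping headroom on busy ones.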

  18. Quantum election scheme based on anonymous quantum key distribution

    International Nuclear Information System (INIS)

    An unconditionally secure authority-certified anonymous quantum key distribution scheme using conjugate coding is presented, based on which we construct a quantum election scheme without the help of an entanglement state. We show that this election scheme ensures the completeness, soundness, privacy, eligibility, unreusability, fairness, and verifiability of a large-scale election in which the administrator and counter are semi-honest. This election scheme can work even if there exist loss and errors in quantum channels. In addition, any irregularity in this scheme is detectable. (general)

  19. An Improved Scalar Costa Scheme Based on Watson Perceptual Model

    Institute of Scientific and Technical Information of China (English)

    QI Kai-yue; CHEN Jian-bo; ZHOU Yi

    2008-01-01

    An improved scalar Costa scheme (SCS) was proposed, using an improved Watson perceptual model to adaptively decide the quantization step size and scaling factor. The improved scheme is equivalent to embedding the hidden data based on the actual image. In order to withstand amplitude scaling attacks, the Watson perceptual model was redefined, and the improved scheme using the new definition ensures that the quantization step size at the decoder is proportional to the amplitude scaling attack factor. The improved scheme outperforms SCS with a fixed quantization step size. The improved scheme combines information theory with a visual model.

  20. Quantum election scheme based on anonymous quantum key distribution

    Institute of Scientific and Technical Information of China (English)

    Zhou Rui-Rui; Yang Li

    2012-01-01

    An unconditionally secure authority-certified anonymous quantum key distribution scheme using conjugate coding is presented, based on which we construct a quantum election scheme without the help of an entanglement state. We show that this election scheme ensures the completeness, soundness, privacy, eligibility, unreusability, fairness, and verifiability of a large-scale election in which the administrator and counter are semi-honest. This election scheme can work even if there exist loss and errors in quantum channels. In addition, any irregularity in this scheme is detectable.

  1. A Proxy Blind Signature Scheme Based on ECDLP

    Institute of Scientific and Technical Information of China (English)

    WANG Haiyan; WANG Ruchuan

    2005-01-01

    While a proxy signature scheme enables an original signer to fully authorize a proxy to sign a message on his or her behalf legally and undeniably, a blind signature scheme keeps the message blind from the signer so that the signer cannot link the signature to the identity of the requester (receiver). Both schemes have been widely applied in electronic business. A new proxy blind signature scheme based on the ECDLP (Elliptic Curve Discrete Logarithm Problem) is proposed in this paper by integrating the security properties of both schemes.

  2. Design Scheme of Remote Monitoring System Based on Qt

    Directory of Open Access Journals (Sweden)

    Xu Dawei

    2015-01-01

    Full Text Available This paper introduces a design scheme for a remote monitoring system based on Qt. The remote monitoring system is built on the S3C2410 ARM platform and is designed and implemented with the aid of the cross-platform development framework Qt. The development of a remote video surveillance system based on an embedded terminal has practical significance and value.

  3. Texture Image Classification Based on Gabor Wavelet

    Institute of Scientific and Technical Information of China (English)

    DENG Wei-bing; LI Hai-fei; SHI Ya-li; YANG Xiao-hui

    2014-01-01

    For a texture image, by recognizing the class of every pixel of the image, it can be partitioned into disjoint regions of uniform texture. This paper proposes a texture image classification algorithm based on Gabor wavelets. In this algorithm, the characteristics of every image are obtained through every pixel and its neighborhood, and the algorithm can exchange information between neighborhoods of different sizes. Experiments on the standard Brodatz texture image dataset show that the proposed algorithm achieves good classification rates.
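
    As a rough illustration of the kind of feature extraction involved, the sketch below builds a single real Gabor kernel and a per-patch response statistic in plain NumPy. All parameter values (size, sigma, orientation, wavelength) and function names are assumptions of this example, not the paper's settings.

```python
import numpy as np

def gabor_kernel(size=9, sigma=2.0, theta=0.0, wavelength=4.0):
    """Real (cosine) Gabor kernel; all parameter defaults are illustrative."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate into the filter frame
    yr = -x * np.sin(theta) + y * np.cos(theta)
    gauss = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    return gauss * np.cos(2 * np.pi * xr / wavelength)

def texture_feature(patch, kernel):
    """One scalar texture feature per orientation: mean absolute response."""
    return float(np.mean(np.abs(patch * kernel)))

k = gabor_kernel()
assert k.shape == (9, 9)
```

    A real classifier would compute such responses over a bank of orientations and scales around each pixel and feed the resulting feature vector to a classifier.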

  4. An Encoder/Decoder Scheme of OCDMA Based on Waveguide

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A new encoder/decoder scheme for OCDMA based on waveguides is proposed in this paper. The principle as well as the structure of the waveguide encoder/decoder is given. It can be seen that an all-optical OCDMA encoder/decoder can be realized by the proposed waveguide scheme, which also makes the OCDMA encoder/decoder easy to integrate and access easy to control. A system based on this scheme can work under entirely asynchronous conditions.

  5. Classification of Base Sequences BS(n+1, n)

    Directory of Open Access Journals (Sweden)

    Dragomir Ž. Ðoković

    2010-01-01

    Full Text Available Base sequences BS(n+1, n) are quadruples of {±1}-sequences (A; B; C; D), with A and B of length n+1 and C and D of length n, such that the sum of their nonperiodic autocorrelation functions is a δ-function. The base sequence conjecture, asserting that BS(n+1, n) exist for all n, is stronger than the famous Hadamard matrix conjecture. We introduce a new definition of equivalence for base sequences BS(n+1, n) and construct a canonical form. By using this canonical form, we have enumerated the equivalence classes of BS(n+1, n) for n ≤ 30. As the number of equivalence classes grows rapidly (but not monotonically) with n, the tables in the paper cover only the cases n ≤ 13.
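
    The defining property is easy to state in code: the four nonperiodic autocorrelation functions must sum to zero at every nonzero shift. The sketch below checks this for a tiny hand-picked BS(2, 1) quadruple; the helper names are mine.

```python
def npaf(seq, s):
    """Nonperiodic autocorrelation of seq at shift s."""
    return sum(seq[i] * seq[i + s] for i in range(len(seq) - s))

def is_base_sequence(A, B, C, D):
    """BS(n+1, n) test: the four NPAFs must sum to 0 at every shift s >= 1."""
    n = len(C)
    return all(npaf(A, s) + npaf(B, s) + npaf(C, s) + npaf(D, s) == 0
               for s in range(1, n + 1))

# A tiny hand-picked BS(2, 1) quadruple: NPAF(A,1) + NPAF(B,1) = 1 - 1 = 0.
assert is_base_sequence([1, 1], [1, -1], [1], [1])
```

    Enumeration for larger n is the hard part; the check itself is this simple.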

  6. A Dynamic ID-based Remote User Authentication Scheme

    CERN Document Server

    Das, Manik Lal; Gulati, Ved P

    2004-01-01

    Password-based authentication schemes are the most widely used techniques for remote user authentication. Many static ID-based remote user authentication schemes, both with and without smart cards, have been proposed. Most of these schemes do not allow users to choose and change their passwords, and they maintain a verifier table to check the validity of user logins. In this paper we present a dynamic ID-based remote user authentication scheme using smart cards. Our scheme allows users to choose and change their passwords freely and does not maintain any verifier table. The scheme is secure against ID-theft and can resist replay attacks, forgery attacks, guessing attacks, insider attacks and stolen-verifier attacks.
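
    To convey the general flavor of a dynamic ID (a login identifier that changes every session, so no static ID can be stolen or linked), here is a hedged sketch; it is not the exact protocol of the paper, and all field names are invented.

```python
import hashlib
import secrets

def h(*parts: bytes) -> str:
    """Hash helper over '|'-joined byte fields (an assumption of this sketch)."""
    return hashlib.sha256(b"|".join(parts)).hexdigest()

def login_message(identity: bytes, password: bytes):
    """Build a session-specific (dynamic) login ID plus the nonce behind it."""
    nonce = secrets.token_bytes(16)      # fresh randomness per login
    cid = h(identity, password, nonce)   # transmitted ID changes every session
    return cid, nonce

cid1, _ = login_message(b"alice", b"secret")
cid2, _ = login_message(b"alice", b"secret")
assert cid1 != cid2   # an eavesdropper cannot link sessions by a static ID
```

    The real scheme additionally binds the smart card's secrets and a timestamp into the computation so the server can verify the login without any verifier table.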

  7. An Agent Based Classification Model

    CERN Document Server

    Gu, Feng; Greensmith, Julie

    2009-01-01

    The major function of this model is to access the UCI Wisconsin Breast Cancer data-set[1] and classify the data items into two categories, normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, principles and models which are applied to problem solving. The Dendritic Cell Algorithm (DCA)[2] is an AIS algorithm that is developed specifically for anomaly detection. It has been successfully applied to intrusion detection in computer security. It is believed that agent-based modelling is an ideal approach for implementing AIS, as intelligent agents could be the perfect representations of immune entities in AIS. This model evaluates the feasibility of re-implementing the DCA in an agent-based simulation environment ...

  8. Improved ID-Based Signature Scheme Solving Key Escrow

    Institute of Scientific and Technical Information of China (English)

    LIAO Jian; QI Ying-hao; HUANG Pei-wei; RONG Men-tian; LI Sheng-hong

    2006-01-01

    Key escrow is an inherent disadvantage of traditional ID-based cryptosystems, i.e., a dishonest private key generator (PKG) can forge the signature of any user, while the user can deny a signature actually signed by him/herself. To avoid the key escrow problem, an ID-based signature scheme without a trusted PKG is presented. An exact proof of security is given to demonstrate that our scheme is secure against existential forgery on adaptively chosen message and ID attacks, assuming the hardness of the computational Diffie-Hellman (CDH) problem. Compared with other signature schemes, the proposed scheme is more efficient.

  9. Image-based Vehicle Classification System

    CERN Document Server

    Ng, Jun Yee

    2012-01-01

    Electronic toll collection (ETC) systems have become a common trend for toll collection on toll roads nowadays. The implementation of electronic toll collection allows vehicles to travel at low or full speed during toll payment, which helps to avoid traffic delays at toll plazas. One of the major components of an electronic toll collection system is the automatic vehicle detection and classification (AVDC) system, which is important for classifying the vehicle so that the toll is charged according to the vehicle class. A vision-based vehicle classification system is one type of vehicle classification system which adopts a camera as the input sensing device. This type of system has an advantage over the rest in that it is cost-efficient, as a low-cost camera is used. The implementation of a vision-based vehicle classification system requires a lower initial investment and is very suitable for the toll collection trend migration in Malaysia from a single ETC system to full-scale multi-lane free flow (MLFF). This project ...

  10. An Image Fusion Algorithm Based on Lifting Scheme

    Institute of Scientific and Technical Information of China (English)

    BAI Di; FAN Qibin; SHU Qian

    2005-01-01

    Taking advantage of the lifting scheme's ability to build wavelet transforms that map integers to integers, and the fact that the quality of the reconstructed image is independent of the boundary-handling approach adopted, an image fusion algorithm based on the lifting scheme is proposed. This paper first discusses the fundamental theory of the lifting scheme and then performs the transform analysis on the kinds of images that need to be fused.

  11. Gemstones and geosciences in space and time. Digital maps to the "Chessboard classification scheme of mineral deposits"

    Science.gov (United States)

    Dill, Harald G.; Weber, Berthold

    2013-12-01

    The gemstones, covering the spectrum from jeweler's to showcase quality, have been presented in a tripartite subdivision, by country, geology and geomorphology, realized in 99 digital maps with more than 2600 mineralized sites. The various maps were designed based on the "Chessboard classification scheme of mineral deposits" proposed by Dill (2010a, 2010b) to reveal the interrelations between gemstone deposits and mineral deposits of other commodities and direct our thoughts to potential new target areas for exploration. A total of 33 categories was used for these digital maps: chromium, nickel, titanium, iron, manganese, copper, tin-tungsten, beryllium, lithium, zinc, calcium, boron, fluorine, strontium, phosphorus, zirconium, silica, feldspar, feldspathoids, zeolite, amphibole (tiger's eye), olivine, pyroxenoid, garnet, epidote, sillimanite-andalusite, corundum-spinel - diaspore, diamond, vermiculite-pagodite, prehnite, sepiolite, jet, and amber. Besides the political base map (gems by country), the mineral deposits are drawn on a geological map, illustrating the main lithologies, stratigraphic units and tectonic structure to unravel the evolution of primary gemstone deposits in time and space. The geomorphological map shows the control of climate and subaerial and submarine hydrography on the deposition of secondary gemstone deposits. The digital maps are designed so as to be plotted as a paper version at different scales and to be upgraded for interactive use and linked to gemological databases.

  12. Cost Based Droop Schemes for Economic Dispatch in Islanded Microgrids

    DEFF Research Database (Denmark)

    Chen, Feixiong; Chen, Minyou; Li, Qiang;

    2016-01-01

    In this paper, cost based droop schemes are proposed to minimize the total active power generation cost in an islanded microgrid (MG), while the simplicity and decentralized nature of droop control are retained. In cost based droop schemes, the incremental costs of distributed generators (DGs) are embedded into the droop schemes, where the incremental cost is the derivative of the DG cost function with respect to output power. In the steady state, DGs share a single common frequency, and cost based droop schemes equate the incremental costs of DGs, thus minimizing the total active power generation cost in terms of the equal incremental cost principle. Finally, simulation results in an islanded MG with high penetration of intermittent renewable energy sources are presented to demonstrate the effectiveness, as well as the plug-and-play capability, of the cost based droop schemes.
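
    The equal-incremental-cost idea above can be made concrete with a quadratic cost function C(P) = aP² + bP + c, whose incremental cost is dC/dP = 2aP + b. The coefficients and the value of λ below are invented for illustration.

```python
def incremental_cost(P, a, b):
    """dC/dP for the quadratic generation cost C(P) = a*P**2 + b*P + c."""
    return 2 * a * P + b

def dispatch(lmbda, a, b):
    """Output power at which a DG's incremental cost equals the system lambda."""
    return (lmbda - b) / (2 * a)

# Two hypothetical DGs settle at the same incremental cost lambda = 10:
P1 = dispatch(10.0, a=0.05, b=4.0)   # -> 60.0
P2 = dispatch(10.0, a=0.10, b=2.0)   # -> 40.0
assert abs(incremental_cost(P1, 0.05, 4.0) - incremental_cost(P2, 0.10, 2.0)) < 1e-9
```

    In the droop scheme this equalization happens implicitly: because all DGs converge to one frequency, and the droop curves are built from the incremental costs, equal frequency forces equal incremental cost.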

  13. Mechanism-based drug exposure classification in pharmacoepidemiological studies

    NARCIS (Netherlands)

    Verdel, B.M.

    2010-01-01

    Mechanism-based classification of drug exposure in pharmacoepidemiological studies In pharmacoepidemiology and pharmacovigilance, the relation between drug exposure and clinical outcomes is crucial. Exposure classification in pharmacoepidemiological studies is traditionally based on pharmacotherapeu

  14. Scheme optimization of AT shifting element based on genetic algorithm

    Institute of Scientific and Technical Information of China (English)

    岳会军; 刘艳芳; 马明月; 徐向阳; 王书翰

    2015-01-01

    In order to realize the computer-aided design of AT shifting element schemes, a mathematical model of shifting element schemes that can be easily processed by computers was built. Taking the transmission ratio sequence as the optimization objective, and simple shifting logic between adjacent gears (operating only one shifting element) as a constraint condition, a fitness function for shifting element schemes was proposed. ZF-8AT shifting element schemes were optimized based on the GA toolbox of MATLAB, and the feasibility of the optimization algorithm was verified.

  15. MIXED SCHEME FOR IMAGE EDGE DETECTION BASED ON WAVELET TRANSFORM

    Institute of Scientific and Technical Information of China (English)

    Xie Hongmei; Yu Bianzhang; Zhao Jian

    2004-01-01

    A mixed scheme based on the Wavelet Transform (WT) is proposed for image edge detection. The scheme combines the wavelet transform with the traditional Sobel and LoG (Laplacian of Gaussian) edge-detection operators. A precise theoretical analysis is given to show the advantage of the wavelet transform for signal processing. Simulation results show that the new scheme is better than using only the Sobel or LoG methods. A complexity analysis is also given, with acceptable conclusions; therefore the proposed scheme is effective for edge detection.
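
    The classical Sobel stage of such a mixed scheme is easy to sketch in plain NumPy (the wavelet stage is omitted here); the kernels are the standard textbook Sobel masks, and the test image is an invented vertical step edge.

```python
import numpy as np

KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])  # horizontal gradient kernel
KY = KX.T                                            # vertical gradient kernel

def sobel_magnitude(img):
    """Gradient magnitude via the 3x3 Sobel operator (valid region only)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            out[i, j] = np.hypot(np.sum(KX * patch), np.sum(KY * patch))
    return out

# A vertical step edge yields strong responses along the step and zero elsewhere.
img = np.zeros((5, 6))
img[:, 3:] = 1.0
mag = sobel_magnitude(img)
assert mag.max() > 0 and mag[:, 0].max() == 0
```

    In the mixed scheme, such gradient maps are combined with wavelet-domain edge evidence rather than thresholded on their own.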

  16. A novel chaotic encryption scheme based on pseudorandom bit padding

    CERN Document Server

    Sadra, Yaser; Fard, Zahra Arasteh

    2012-01-01

    Cryptography is very important for data origin authentication, entity authentication, data integrity and confidentiality. In recent years, a variety of chaotic cryptographic schemes have been proposed. These schemes have a typical structure which performs the permutation and the diffusion stages alternately. Random number generators are integral to cryptographic schemes and are used in the diffusion functions of image encryption to diffuse the pixels of the plain image. In this paper, we propose a chaotic encryption scheme based on pseudorandom bit padding, in which the bits are generated by a novel logistic pseudorandom image algorithm. To evaluate the security of the cipher image of this scheme, a key space analysis, the correlation of two adjacent pixels and a differential attack were performed. This scheme tries to improve on encryption failures such as a small key space and a low level of security.
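
    The general idea of deriving a bit stream from the logistic map can be sketched as below. The thresholding rule and parameter choices are assumptions of this example; the paper's "logistic pseudorandom image algorithm" is more elaborate.

```python
def logistic_bits(x0, n, r=3.99):
    """Iterate x -> r*x*(1-x) in the chaotic regime; emit one bit per iterate."""
    bits, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        bits.append(1 if x >= 0.5 else 0)  # thresholding rule (an assumption)
    return bits

stream = logistic_bits(0.3141, 64)
assert len(stream) == 64 and set(stream) <= {0, 1}
```

    The seed x0 and parameter r act as the key material; sensitivity to initial conditions is what makes the stream hard to predict, though such generators need careful analysis before cryptographic use.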

  17. A Neuro-Fuzzy based System for Classification of Natural Textures

    Science.gov (United States)

    Jiji, G. Wiselin

    2016-06-01

    A statistical approach based on the coordinated clusters representation of images is used for classification and recognition of textured images. In this paper, two issues are addressed: the first is the extraction of texture features from the fuzzy texture spectrum in the chromatic and achromatic domains from each colour component histogram of natural texture images, and the second is the fusion of multiple classifiers. An advanced neuro-fuzzy learning scheme has also been adopted in this paper. The classification test results show the high performance of the proposed method, which may have industrial application for texture classification, compared with other works.

  18. Optimized entanglement purification schemes for modular based quantum computers

    Science.gov (United States)

    Krastanov, Stefan; Jiang, Liang

    The choice of entanglement purification scheme strongly depends on the fidelities of quantum gates and measurements, as well as the imperfection of the initial entanglement. For instance, the purification scheme optimal at low gate fidelities may not necessarily be the optimal scheme at higher gate fidelities. We employ an evolutionary algorithm that efficiently optimizes the entanglement purification circuit for given system parameters. Such optimized purification schemes will boost the performance of entanglement purification, and consequently enhance the fidelity of teleportation-based non-local coupling gates, which are an indispensable building block for modular-based quantum computers. In addition, we study how these optimized purification schemes affect the resource overhead caused by error correction in modular based quantum computers.

  19. Evaluation of a 5-tier scheme proposed for classification of sequence variants using bioinformatic and splicing assay data

    DEFF Research Database (Denmark)

    Walker, Logan C; Whiley, Phillip J; Houdayer, Claude;

    2013-01-01

    Splicing assays are commonly undertaken in the clinical setting to assess the clinical relevance of sequence variants in disease predisposition genes. A 5-tier classification system incorporating both bioinformatic and splicing assay information was previously proposed as a method to provide consistent clinical classification of such variants. Members of the ENIGMA Consortium Splicing Working Group undertook a study to assess the applicability of the scheme to published assay results, and the consistency of classifications across multiple reviewers. Splicing assay data were identified for 235 BRCA1 and 176 BRCA2 unique variants, from 77 publications. At least six independent reviewers from research and/or clinical settings comprehensively examined the splicing assay methods and data reported for 22 variant assays of 21 variants in four publications, and classified the variants using the 5-tier...

  20. A FRACTAL-BASED STOCHASTIC INTERPOLATION SCHEME IN SUBSURFACE HYDROLOGY

    Science.gov (United States)

    The need for a realistic and rational method for interpolating sparse data sets is widespread. Real porosity and hydraulic conductivity data do not vary smoothly over space, so an interpolation scheme that preserves irregularity is desirable. Such a scheme based on the properties...
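
    One classic fractal interpolation technique in this spirit is midpoint displacement: recursively interpolate between measured points while injecting noise whose amplitude shrinks at finer scales, so irregularity is preserved rather than smoothed away. The sketch below is illustrative only; the roughness parameter H, noise scale, and function name are arbitrary choices of this example.

```python
import random

def midpoint_displace(left, right, levels, H=0.5, scale=1.0, seed=0):
    """Recursive midpoint interpolation; noise amplitude shrinks by 2**H per level."""
    rng = random.Random(seed)
    pts = [left, right]
    for _ in range(levels):
        new = [pts[0]]
        for a, b in zip(pts, pts[1:]):
            new += [(a + b) / 2 + rng.gauss(0, scale), b]
        pts = new
        scale /= 2 ** H  # roughness decreases at finer scales
    return pts

series = midpoint_displace(1.0, 2.0, levels=4)
assert len(series) == 2**4 + 1          # 17 points between the two measurements
assert series[0] == 1.0 and series[-1] == 2.0
```

    The measured endpoints are honored exactly, while the interior acquires the rough, fractal character observed in real porosity and conductivity fields.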

  1. PILOT-BASED FREQUENCY OFFSET DETECTION SCHEME IN OFDM SYSTEM

    Institute of Scientific and Technical Information of China (English)

    Du Zheng; Zhu Jinkang

    2003-01-01

    The frequency offset information is extracted from local pilot amplitude characteristics, which suffer much less distortion in frequency-selective fading channels than frequency domain correlation techniques do. Simulation shows that this scheme performs better than the existing frequency domain pilot-based frequency offset detection scheme.

  2. An ID-based Blind Signature Scheme from Bilinear Pairings

    Directory of Open Access Journals (Sweden)

    B.Umaprasada Rao

    2010-03-01

    Full Text Available Blind signatures, introduced by Chaum, allow a user to obtain a signature on a message without revealing anything about the message to the signer. Blind signatures play an important role in many applications such as e-voting and e-cash systems, where anonymity is of great concern. Identity-based (ID-based) public key cryptography can be a good alternative to the certificate-based public key setting, especially when efficient key management and moderate security are required. In this paper, we propose an ID-based blind signature scheme from bilinear pairings. The proposed scheme is based on the Hess ID-based digital signature scheme. We also analyze the security and efficiency of the proposed scheme.

  3. Ship Classification with High Resolution TerraSAR-X Imagery Based on Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Zhi Zhao

    2013-01-01

    Full Text Available Ship surveillance using space-borne synthetic aperture radar (SAR), taking advantage of high resolution over wide swaths and all-weather working capability, has attracted worldwide attention. Recent activity in this field has concentrated mainly on the study of ship detection, while classification remains largely open. In this paper, we propose a novel ship classification scheme based on the analytic hierarchy process (AHP) in order to achieve better performance. The main idea is to apply AHP to both feature selection and the classification decision. On one hand, the AHP based feature selection constructs a selection decision problem based on several feature evaluation measures (e.g., discriminability, stability, and information measure) and provides objective criteria to make comprehensive decisions on their combinations quantitatively. On the other hand, we take the selected feature sets as the input of KNN classifiers and fuse the multiple classification results based on AHP, in which the feature sets' confidence is taken into account when the AHP based classification decision is made. We analyze the proposed classification scheme and demonstrate its results on a ship dataset derived from TerraSAR-X SAR images.
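
    The core AHP step (turning a pairwise-comparison matrix of criteria into priority weights via the principal eigenvector) can be sketched as follows. The 3x3 matrix entries are invented for the three measures named above (discriminability, stability, information measure), not taken from the paper.

```python
import numpy as np

# Reciprocal pairwise comparisons of three criteria (values invented):
# criterion 0 is strongly preferred to 1 and 2; criterion 1 mildly to 2.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
principal = np.real(vecs[:, np.argmax(np.real(vals))])  # Perron eigenvector
weights = principal / principal.sum()                   # normalized priorities

assert abs(weights.sum() - 1.0) < 1e-9
assert weights[0] > weights[1] > weights[2]  # matches the stated preferences
```

    A full AHP analysis would also compute the consistency ratio of the matrix before trusting the weights; the same machinery is reused at the fusion stage to weight the KNN classifiers.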

  4. Identity-based Verifiably Committed Signature Scheme without Random Oracles

    Institute of Scientific and Technical Information of China (English)

    SUN Xun; LI Jian-hua; CHEN Gong-liang

    2008-01-01

    An identity-based verifiably committed signature scheme (IB-VCS) was proposed, which is proved secure in the standard model (i.e., without random oracles). It enjoys the setup-free property and the stand-alone property, both of which make an exchange protocol more practical. The scheme is unconditionally secure against a cheating signer; its security against a cheating verifier is reduced to the computational Diffie-Hellman (CDH) problem in the underlying group; and it is secure against a cheating trusted third party if the underlying Paterson-Schuldt identity-based signature (IBS) scheme is secure, which is proven true under the CDH assumption in the standard model.

  5. An Identity-Based Encryption Scheme with Compact Ciphertexts

    Institute of Scientific and Technical Information of China (English)

    LIU Sheng-li; GUO Bao-an; ZHANG Qing-sheng

    2009-01-01

    This paper proposes an identity-based encryption scheme with the help of bilinear pairings, where the identity information of a user functions as the user's public key. The advantage of an identity-based public key system is that it can avoid public key certificates and certificate management. Our identity-based encryption scheme enjoys short ciphertexts and provable security against chosen-ciphertext attack (CCA).

  6. A New Proxy Blind Signature Scheme based on ECDLP

    Directory of Open Access Journals (Sweden)

    Daniyal M Alghazzawi

    2011-05-01

    Full Text Available A proxy blind signature scheme is a special form of blind signature which allows a designated person, called the proxy signer, to sign on behalf of two or more original signers without knowing the content of the message or document. It combines the advantages of the proxy signature, blind signature and multi-signature schemes and satisfies the security properties of both proxy and blind signature schemes. Most of the existing proxy blind signature schemes were developed based on the mathematical hard problems of integer factorization (IFP) and the discrete logarithm problem (DLP), which take sub-exponential time to solve. This paper describes a secure, simple proxy blind signature scheme based on the Elliptic Curve Discrete Logarithm Problem (ECDLP), which takes fully-exponential time to solve. It can therefore be implemented in low-power, small-processor mobile devices such as smart cards and PDAs. We also describe implementation issues of various scalar multiplication methods for ECDLP.
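
The scalar multiplication such schemes depend on can be sketched with the textbook double-and-add method; the tiny curve y² = x³ + 7 over GF(17) with base point (15, 13) is a toy parameter choice for illustration only, nowhere near cryptographic strength.

```python
# Toy short-Weierstrass curve y^2 = x^3 + 7 over GF(17); illustration only.
P_MOD, A_COEF = 17, 0

def ec_add(P, Q):
    """Add two curve points; None stands for the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                   # P + (-P) = infinity
    if P == Q:                        # tangent slope for doubling
        lam = (3 * x1 * x1 + A_COEF) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                             # chord slope for addition
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return x3, (lam * (x1 - x3) - y1) % P_MOD

def scalar_mult(k, P):
    """Double-and-add: O(log k) group operations."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

G = (15, 13)
print(scalar_mult(2, G), scalar_mult(3, G))   # (2, 10) (8, 3)
```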

  7. Collaborative Representation based Classification for Face Recognition

    CERN Document Server

    Zhang, Lei; Feng, Xiangchu; Ma, Yi; Zhang, David

    2012-01-01

    By coding a query sample as a sparse linear combination of all training samples and then classifying it by evaluating which class leads to the minimal coding residual, sparse representation based classification (SRC) leads to interesting results for robust face recognition. It is widely believed that the l1-norm sparsity constraint on coding coefficients plays a key role in the success of SRC, while its use of all training samples to collaboratively represent the query sample is rather ignored. In this paper we discuss how SRC works, and show that the collaborative representation mechanism used in SRC is much more crucial to its success in face classification. SRC is a special case of collaborative representation based classification (CRC), which has various instantiations obtained by applying different norms to the coding residual and coding coefficient. More specifically, the l1 or l2 norm characterization of the coding residual is related to the robustness of CRC to outlier facial pixels, while the l1 or l2 norm c...

  8. Feature-Based Classification of Networks

    CERN Document Server

    Barnett, Ian; Kuijjer, Marieke L; Mucha, Peter J; Onnela, Jukka-Pekka

    2016-01-01

    Network representations of systems from various scientific and societal domains are neither completely random nor fully regular, but instead appear to contain recurring structural building blocks. These features tend to be shared by networks belonging to the same broad class, such as the class of social networks or the class of biological networks. At a finer scale of classification within each such class, networks describing more similar systems tend to have more similar features. This occurs presumably because networks representing similar purposes or constructions would be expected to be generated by a shared set of domain specific mechanisms, and it should therefore be possible to classify these networks into categories based on their features at various structural levels. Here we describe and demonstrate a new, hybrid approach that combines manual selection of features of potential interest with existing automated classification methods. In particular, selecting well-known and well-studied features that ...

  9. Dihedral-based segment identification and classification of biopolymers II: polynucleotides.

    Science.gov (United States)

    Nagy, Gabor; Oostenbrink, Chris

    2014-01-27

    In an accompanying paper (Nagy, G.; Oostenbrink, C. Dihedral-based segment identification and classification of biopolymers I: Proteins. J. Chem. Inf. Model. 2013, DOI: 10.1021/ci400541d), we introduce a new algorithm for structure classification of biopolymeric structures based on main-chain dihedral angles. The DISICL algorithm (short for DIhedral-based Segment Identification and CLassification) classifies segments of structures containing two central residues. Here, we introduce the DISICL library for polynucleotides, which is based on the dihedral angles ε, ζ, and χ for the two central residues of a three-nucleotide segment of a single strand. Seventeen distinct structural classes are defined for nucleotide structures, some of which--to our knowledge--were not described previously in other structure classification algorithms. In particular, DISICL also classifies noncanonical single-stranded structural elements. DISICL is applied to databases of DNA and RNA structures containing 80,000 and 180,000 segments, respectively. The classifications according to DISICL are compared to those of another popular classification scheme in terms of the amount of classified nucleotides, average occurrence and length of structural elements, and pairwise matches of the classifications. While the detailed classification of DISICL adds sensitivity to a structure analysis, it can be readily reduced to eight simplified classes providing a more general overview of the secondary structure in polynucleotides.
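
The core of such a classifier is a lookup from dihedral-angle windows to class labels, which can be sketched as follows; the class names and angle windows here are hypothetical placeholders, not the actual DISICL library definitions.

```python
# ILLUSTRATIVE dihedral-window classifier: each segment is assigned the
# first class whose (epsilon, zeta) window contains its angles. The names
# and windows below are invented placeholders, not the DISICL library.
CLASSES = [
    ("A-form-like", (-180, -120), (-100, -40)),
    ("B-form-like", (-120, -60), (-60, 0)),
]

def classify_segment(eps, zeta):
    """Return the label of the first matching angle window."""
    for name, (e_lo, e_hi), (z_lo, z_hi) in CLASSES:
        if e_lo <= eps < e_hi and z_lo <= zeta < z_hi:
            return name
    return "unclassified"

print(classify_segment(-150, -70))   # falls in the first window
print(classify_segment(10, 10))      # outside every window
```

A real implementation would also include the chi angle and the seventeen (or eight simplified) classes described in the record.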

  10. Dihedral-Based Segment Identification and Classification of Biopolymers II: Polynucleotides

    Science.gov (United States)

    2013-01-01

    In an accompanying paper (Nagy, G.; Oostenbrink, C. Dihedral-based segment identification and classification of biopolymers I: Proteins. J. Chem. Inf. Model. 2013, DOI: 10.1021/ci400541d), we introduce a new algorithm for structure classification of biopolymeric structures based on main-chain dihedral angles. The DISICL algorithm (short for DIhedral-based Segment Identification and CLassification) classifies segments of structures containing two central residues. Here, we introduce the DISICL library for polynucleotides, which is based on the dihedral angles ε, ζ, and χ for the two central residues of a three-nucleotide segment of a single strand. Seventeen distinct structural classes are defined for nucleotide structures, some of which—to our knowledge—were not described previously in other structure classification algorithms. In particular, DISICL also classifies noncanonical single-stranded structural elements. DISICL is applied to databases of DNA and RNA structures containing 80,000 and 180,000 segments, respectively. The classifications according to DISICL are compared to those of another popular classification scheme in terms of the amount of classified nucleotides, average occurrence and length of structural elements, and pairwise matches of the classifications. While the detailed classification of DISICL adds sensitivity to a structure analysis, it can be readily reduced to eight simplified classes providing a more general overview of the secondary structure in polynucleotides. PMID:24364355

  11. Triangle based TVD schemes for hyperbolic conservation laws

    Science.gov (United States)

    Durlofsky, Louis J.; Osher, Stanley; Engquist, Bjorn

    1990-01-01

    A triangle based total variation diminishing (TVD) scheme for the numerical approximation of hyperbolic conservation laws in two space dimensions is constructed. The novelty of the scheme lies in the nature of the preprocessing of the cell averaged data, which is accomplished via a nearest neighbor linear interpolation followed by a slope limiting procedure. Two such limiting procedures are suggested. The resulting method is considerably simpler than other triangle based non-oscillatory approximations which, like this scheme, approximate the flux up to second order accuracy. Numerical results for linear advection and Burgers' equation are presented.
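
The slope-limiting idea can be illustrated in one dimension with the classic minmod limiter; this is a sketch of the general principle, not the paper's triangle-based construction.

```python
# 1-D analogue of the limiting step: cell-average slopes are limited with
# the minmod function so the reconstruction stays total variation diminishing.
def minmod(a, b):
    """Smaller-magnitude argument if signs agree, zero otherwise."""
    if a * b <= 0:
        return 0.0
    return a if abs(a) < abs(b) else b

def limited_slopes(u, dx=1.0):
    """Limited slope in each interior cell from neighbour differences."""
    return [minmod((u[i] - u[i - 1]) / dx, (u[i + 1] - u[i]) / dx)
            for i in range(1, len(u) - 1)]

u = [0.0, 0.2, 1.0, 1.6, 2.0]
print(limited_slopes(u))        # smooth data keeps the smaller slope
print(minmod(1.0, -0.5))        # 0.0: slope is zeroed at a local extremum
```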

  12. Triangle based TVD schemes for hyperbolic conservation laws. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Durlofsky, L.J.; Osher, S.; Engquist, B.

    1990-01-01

    A triangle based total variation diminishing (TVD) scheme for the numerical approximation of hyperbolic conservation laws in two space dimensions is constructed. The novelty of the scheme lies in the nature of the preprocessing of the cell averaged data, which is accomplished via a nearest neighbor linear interpolation followed by a slope limiting procedure. Two such limiting procedures are suggested. The resulting method is considerably simpler than other triangle based non-oscillatory approximations which, like this scheme, approximate the flux up to second order accuracy. Numerical results for linear advection and Burgers' equation are presented.

  13. Digital image-based classification of biodiesel.

    Science.gov (United States)

    Costa, Gean Bezerra; Fernandes, David Douglas Sousa; Almeida, Valber Elias; Araújo, Thomas Souto Policarpo; Melo, Jessica Priscila; Diniz, Paulo Henrique Gonçalves Dias; Véras, Germano

    2015-07-01

    This work proposes a simple, rapid, inexpensive, and non-destructive methodology based on digital images and pattern recognition techniques for the classification of biodiesel according to oil type (cottonseed, sunflower, corn, or soybean). For this, color histograms in the RGB (extracted from digital images), HSI, and grayscale channels, and their combinations, were used as analytical information, which was then statistically evaluated using Soft Independent Modeling by Class Analogy (SIMCA), Partial Least Squares Discriminant Analysis (PLS-DA), and variable selection using the Successive Projections Algorithm associated with Linear Discriminant Analysis (SPA-LDA). Despite the good performance of the SIMCA and PLS-DA classification models, SPA-LDA provided better results (up to 95% for all approaches) in terms of accuracy, sensitivity, and specificity for both the training and test sets. The variables selected by the Successive Projections Algorithm clearly contained the information necessary for biodiesel type classification. This is important since a product may exhibit different properties depending on the feedstock used; such variations directly influence the quality, and consequently the price. Moreover, intrinsic advantages such as quick analysis, requiring no reagents, and a noteworthy reduction in waste generation (through the avoidance of chemical characterization) all contribute towards the primary objective of green chemistry.
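
The first step of such a pipeline, turning an image into color-histogram features, might look like the following sketch; the bin count and the pure-Python pixel list are illustrative assumptions (a real implementation would read images with an imaging library such as Pillow).

```python
# Sketch of the feature-extraction step: an 8-bin-per-channel RGB histogram
# computed from a list of (r, g, b) pixels. Bin count is an assumption.
def rgb_histogram(pixels, bins=8):
    width = 256 // bins
    hist = [[0] * bins for _ in range(3)]       # one histogram per channel
    for px in pixels:
        for ch in range(3):
            hist[ch][min(px[ch] // width, bins - 1)] += 1
    n = len(pixels)
    # Flatten and normalise so each channel's bins sum to 1.
    return [count / n for channel in hist for count in channel]

pixels = [(255, 0, 10), (250, 5, 12), (128, 128, 128)]
features = rgb_histogram(pixels)
print(len(features))    # 24 features: 3 channels x 8 bins
```

The resulting feature vector is what a classifier such as SPA-LDA would then consume.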

  14. Nominated Texture Based Cervical Cancer Classification

    Directory of Open Access Journals (Sweden)

    Edwin Jayasingh Mariarputham

    2015-01-01

    Full Text Available Accurate classification of Pap smear images is a challenging task in medical image processing. It can be improved in two ways: by selecting suitable, well-defined specific features, and by selecting the best classifier. This paper presents a nominated texture based cervical cancer (NTCC) classification system which classifies Pap smear images into one of seven classes by extracting well-defined texture features and selecting the best classifier. Seven sets of texture features (24 features) are extracted, including the relative size of nucleus and cytoplasm, the dynamic range and first four moments of the intensities of nucleus and cytoplasm, the relative displacement of the nucleus within the cytoplasm, the gray level co-occurrence matrix, the local binary pattern histogram, Tamura features, and the edge orientation histogram. Several types of support vector machine (SVM) and neural network (NN) classifiers are used for the classification. The performance of the NTCC algorithm is tested and compared to other algorithms on the public image database of Herlev University Hospital, Denmark, with 917 Pap smear images. The output of the SVM is found to be best for most of the classes, with better results for the remaining classes.

  15. BROAD PHONEME CLASSIFICATION USING SIGNAL BASED FEATURES

    Directory of Open Access Journals (Sweden)

    Deekshitha G

    2014-12-01

    Full Text Available Speech is the most efficient and popular means of human communication. Speech is produced as a sequence of phonemes, and phoneme recognition is the first step performed by an automatic speech recognition system. State-of-the-art recognizers use mel-frequency cepstral coefficient (MFCC) features derived through short time analysis, for which the recognition accuracy is limited. Here, instead, broad phoneme classification is achieved using features derived directly from the speech at the signal level itself. The broad phoneme classes are vowels, nasals, fricatives, stops, approximants and silence. The features identified as useful for broad phoneme classification are the voiced/unvoiced decision, zero crossing rate (ZCR), short time energy, most dominant frequency, energy in the most dominant frequency, spectral flatness measure and the first three formants. Features derived from short time frames of training speech are used to train a multilayer feedforward neural network classifier, with manually marked class labels as output, and classification accuracy is then tested. This broad phoneme classifier is later used for broad syllable structure prediction, which is useful for applications such as automatic speech recognition and automatic language identification.
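
Two of the signal-level features listed, zero crossing rate and short-time energy, are simple to compute per analysis frame; the sampling rate and the sine-wave test frame below are illustrative assumptions.

```python
import math

def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs whose signs differ."""
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0)
    return crossings / (len(frame) - 1)

def short_time_energy(frame):
    """Mean squared amplitude over the frame."""
    return sum(x * x for x in frame) / len(frame)

# One frame of a 100 Hz sine at an assumed 8 kHz sampling rate: 80 samples
# = exactly one period; the small phase offset keeps samples off exact zero.
frame = [math.sin(2 * math.pi * 100 * t / 8000 + 0.1) for t in range(80)]
print(zero_crossing_rate(frame))   # 2 sign changes over 79 sample pairs
print(short_time_energy(frame))    # ~0.5, the mean power of a unit sine
```

High ZCR with low energy suggests unvoiced fricatives; low ZCR with high energy suggests voiced sounds such as vowels.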

  16. Hyperspectral remote sensing image classification based on decision level fusion

    Institute of Scientific and Technical Information of China (English)

    Peijun Du; Wei Zhang; Junshi Xia

    2011-01-01

    To apply decision level fusion to hyperspectral remote sensing (HRS) image classification, three decision level fusion strategies are experimented on and compared, namely, the linear consensus algorithm, improved evidence theory, and the proposed support vector machine (SVM) combiner. To evaluate the effects of the input features on classification performance, four schemes are used to organize the input features for the member classifiers. In the experiment, using the operational modular imaging spectrometer (OMIS) II HRS image, decision level fusion is shown to be an effective way of improving the classification accuracy of the HRS image, and the proposed SVM combiner is especially suitable for decision level fusion. The results also indicate that optimization of the input features can improve classification performance.
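
As a minimal illustration of decision level fusion (the paper's SVM combiner and evidence-theory strategies are more sophisticated than this baseline), a majority vote over member-classifier outputs can be sketched as:

```python
from collections import Counter

def majority_fuse(decisions):
    """decisions: one label list per member classifier, aligned by sample."""
    return [Counter(votes).most_common(1)[0][0] for votes in zip(*decisions)]

# Hypothetical per-pixel labels from three member classifiers.
clf_a = ["water", "crop", "urban", "crop"]
clf_b = ["water", "crop", "crop", "crop"]
clf_c = ["urban", "crop", "urban", "water"]
print(majority_fuse([clf_a, clf_b, clf_c]))
# ['water', 'crop', 'urban', 'crop']
```

A trainable combiner replaces the vote with a classifier that takes all member decisions (or scores) as its input vector.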

  17. Towards biological plausibility of electronic noses: A spiking neural network based approach for tea odour classification.

    Science.gov (United States)

    Sarkar, Sankho Turjo; Bhondekar, Amol P; Macaš, Martin; Kumar, Ritesh; Kaur, Rishemjit; Sharma, Anupma; Gulati, Ashu; Kumar, Amod

    2015-11-01

    The paper presents a novel encoding scheme for neuronal code generation for odour recognition using an electronic nose (EN). This scheme is based on channel encoding using multiple Gaussian receptive fields superimposed over the temporal EN responses. The encoded data is further applied to a spiking neural network (SNN) for pattern classification. Two forms of SNN, a back-propagation based SpikeProp and a dynamic evolving SNN are used to learn the encoded responses. The effects of information encoding on the performance of SNNs have been investigated. Statistical tests have been performed to determine the contribution of the SNN and the encoding scheme to overall odour discrimination. The approach has been implemented in odour classification of orthodox black tea (Kangra-Himachal Pradesh Region) thereby demonstrating a biomimetic approach for EN data analysis. PMID:26356597
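
The channel-encoding step described above can be sketched as follows; the number of receptive fields, their range, and the overlap factor are illustrative assumptions, and the subsequent conversion of activations to spike times is omitted.

```python
import math

def receptive_field_encode(x, n_fields=6, lo=0.0, hi=1.0):
    """Population-encode a scalar reading with overlapping Gaussians."""
    sigma = (hi - lo) / (n_fields - 1) / 1.5      # controls field overlap
    centers = [lo + i * (hi - lo) / (n_fields - 1) for i in range(n_fields)]
    return [math.exp(-((x - c) ** 2) / (2 * sigma ** 2)) for c in centers]

# A normalised sensor response of 0.4 activates the nearby fields most.
acts = receptive_field_encode(0.4)
print([round(a, 2) for a in acts])    # peak activation at the 0.4 center
```

In an SNN front end, each activation would typically be mapped to a spike time (stronger activation firing earlier).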

  19. An expert system based intelligent control scheme for space bioreactors

    Science.gov (United States)

    San, Ka-Yiu

    1988-01-01

    An expert system based intelligent control scheme is being developed for the effective control and full automation of bioreactor systems in space. The scheme developed will have the capability to capture information from various resources including heuristic information from process researchers and operators. The knowledge base of the expert system should contain enough expertise to perform on-line system identification and thus be able to adapt the controllers accordingly with minimal human supervision.

  20. A threshold key escrow scheme based on public key cryptosystem

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In the key escrow field it is important to solve the problem that the user's secret key completely depends on the trusted escrow agency. In 1995, some methods of solving the problem were presented, but these methods are no better than directly using threshold cryptography. In this paper, we present a common pattern for threshold key escrow schemes based on public key cryptosystems, and a detailed design based on the improved RSA algorithm is given. The above problem is solved by this scheme.
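
The threshold idea underlying such escrow schemes can be illustrated with Shamir's (t, n) secret sharing; this is a generic sketch of threshold reconstruction, not the paper's RSA-based design.

```python
import secrets

PRIME = 2**127 - 1    # a Mersenne prime, large enough for this demo

def make_shares(secret, t, n):
    """Split secret into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the sharing polynomial at x = 0."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        secret = (secret + yj * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(123456789, t=3, n=5)
print(reconstruct(shares[:3]) == 123456789)   # any 3 of 5 shares suffice
```

With the secret split across n escrow agencies, no coalition smaller than t learns anything about the user's key.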

  1. A Secure Code-Based Authentication Scheme for RFID Systems

    Directory of Open Access Journals (Sweden)

    Noureddine Chikouche

    2015-08-01

    Full Text Available Two essential problems are still posed by Radio Frequency Identification (RFID) systems: security and the limitation of resources. In 2014, Li et al. proposed a mutual authentication scheme for RFID systems based on the Quasi Cyclic-Moderate Density Parity Check (QC-MDPC) McEliece cryptosystem, which is designed to reduce key sizes. In this paper, we show that this scheme does not provide the untraceability and forward secrecy properties. Furthermore, we propose an improved version of the scheme to eliminate its existing vulnerabilities. It is based on the QC-MDPC McEliece cryptosystem with the plaintext padded by a random bit-string. Our work also includes a security comparison between our improved scheme and different code-based RFID authentication schemes. We prove the secrecy and mutual authentication properties with the AVISPA (Automated Validation of Internet Security Protocols and Applications) tools. Concerning performance, our scheme is suitable for low-cost tags with resource limitations.

  2. Evaluation of the NS1 Rapid Test and the WHO Dengue Classification Schemes for Use as Bedside Diagnosis of Acute Dengue Fever in Adults

    OpenAIRE

    Chaterji, Shera; Allen, John Carson; Chow, Angelia; Leo, Yee-Sin; Ooi, Eng-Eong

    2011-01-01

    Because healthcare facilities in many dengue endemic countries lack laboratory support, early dengue diagnosis must rely on either clinical recognition or a bedside diagnostic test. We evaluated the sensitivity and specificity of the 1997 and 2009 World Health Organization (WHO) dengue classification schemes and the NS1 strip test in acute sera from 154 virologically confirmed dengue patients and 200 patients with other febrile illnesses. Both WHO classification schemes had high sensitivity b...

  3. Pixel classification based color image segmentation using quaternion exponent moments.

    Science.gov (United States)

    Wang, Xiang-Yang; Wu, Zhi-Fang; Chen, Liang; Zheng, Hong-Liang; Yang, Hong-Ying

    2016-02-01

    Image segmentation remains an important but hard-to-solve problem, since it is application dependent and usually no a priori information is available regarding the image structure. In recent years, many image segmentation algorithms have been developed, but they are often very complex and undesired results occur frequently. In this paper, we propose a pixel classification based color image segmentation method using quaternion exponent moments. Firstly, the pixel-level image feature is extracted based on quaternion exponent moments (QEMs), which can effectively capture image pixel content by considering the correlation between different color channels. Then, the pixel-level image feature is used as the input of a twin support vector machines (TSVM) classifier, and the TSVM model is trained by selecting training samples with Arimoto entropy thresholding. Finally, the color image is segmented with the trained TSVM model. The proposed scheme has the following advantages: (1) the effective QEMs are introduced to describe color image pixel content, considering the correlation between different color channels; (2) the excellent TSVM classifier is utilized, which has lower computation time and higher classification accuracy. Experimental results show that our proposed method has very promising segmentation performance compared with the state-of-the-art segmentation approaches recently proposed in the literature. PMID:26618250

  4. An Efficient Signature Scheme based on Factoring and Discrete Logarithm

    OpenAIRE

    Ciss, Abdoul Aziz; Cheikh, Ahmed Youssef Ould

    2012-01-01

    This paper proposes a new signature scheme based on two hard problems: the cube root extraction modulo a composite modulus (which is equivalent to the factorization of the modulus, IFP) and the discrete logarithm problem (DLP). By combining these two cryptographic assumptions, we introduce an efficient and strongly secure signature scheme. We show that if an adversary can break the new scheme with an algorithm $\mathcal{A},$ then $\mathcal{A}$ can be used to solve both the DLP and the IFP. The k...

  5. A New ID-Based Proxy Blind Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    LANG Wei-min; YANG Zong-kai; CHENG Wen-qing; TAN Yun-meng

    2005-01-01

    An identity-based proxy blind signature scheme from bilinear pairings is introduced, which combines the advantages of proxy signature and blind signature. Furthermore, our scheme can prevent the original signer from generating the proxy blind signature, thus the profits of the proxy signer are guaranteed. We introduce bilinear pairings to minimize computational overhead and to improve the related performance of our scheme. In addition, the proxy blind signature presented is non-repudiable and it fulfills perfectly the security requirements of a proxy blind signature.

  6. Cirrhosis Classification Based on Texture Classification of Random Features

    Directory of Open Access Journals (Sweden)

    Hui Liu

    2014-01-01

    Full Text Available Accurate staging of hepatic cirrhosis is important in investigating the cause and slowing down the effects of cirrhosis. Computer-aided diagnosis (CAD) can provide doctors with an alternative second opinion and assist them in choosing a specific treatment based on an accurate cirrhosis stage. MRI has many advantages, including high resolution for soft tissue, no radiation, and multiparameter imaging modalities. So in this paper, multisequence MRIs, including T1-weighted, T2-weighted, arterial, portal venous, and equilibrium phases, are applied. However, CAD does not yet meet the clinical needs for cirrhosis, and few researchers are concerned with it at present. Cirrhosis is characterized by the presence of widespread fibrosis and regenerative nodules in the liver, leading to different texture patterns at different stages. Extracting texture features is therefore the primary task. Compared with typical gray level co-occurrence matrix (GLCM) features, texture classification from random features provides an effective alternative; we adopt it and propose CCTCRF for triple classification (normal; early; and middle and advanced stage). CCTCRF needs no strong assumptions except the sparse character of the image, contains sufficient texture information, involves a concise and effective process, and makes case decisions with high accuracy. Experimental results illustrate its satisfying performance, which is also compared with a typical NN using GLCM features.

  7. Cost-based droop scheme for DC microgrid

    DEFF Research Database (Denmark)

    Nutkani, Inam Ullah; Wang, Peng; Loh, Poh Chiang;

    2014-01-01

    DC microgrids are gaining interest due to the higher efficiencies of DC distribution compared with AC. The benefits of DC systems have been widely researched for data centers, IT facilities and residential applications. The research focus, however, has been more on system architecture and optimal ... voltage level, less on optimized operation and control of generation sources. The latter theme is pursued in this paper, where a cost-based droop scheme is proposed for distributed generators (DGs) in DC microgrids. Unlike the traditional proportional power sharing based droop scheme, the proposed scheme ... considers the generation costs of the DGs and dynamically tunes their droop gradients to produce more power from less costly DGs and vice versa. The proposed scheme is fully autonomous, simple to implement in dispatchable and non-dispatchable sources coupled with storage, and supports islanded and grid...
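
The core idea, scaling each generator's droop gradient by its cost, can be sketched as below; the linear cost-to-gradient mapping, the nominal voltage, and all numbers are illustrative assumptions, not values from the paper.

```python
# ILLUSTRATIVE cost-based droop: the droop gradient is scaled by the DG's
# generation cost, so cheaper sources droop less and pick up more load.
V_NOM = 400.0     # assumed nominal DC bus voltage (V)
K_BASE = 0.05     # assumed base droop gradient (V per kW per cost unit)

def droop_voltage(p_kw, cost_per_kwh):
    k = K_BASE * cost_per_kwh     # costlier source -> steeper droop
    return V_NOM - k * p_kw

# At equal output power, the costlier DG's set-point droops further, so on
# a shared bus the cheaper DG ends up supplying the larger share.
print(droop_voltage(10, cost_per_kwh=0.10))   # cheap DG
print(droop_voltage(10, cost_per_kwh=0.30))   # expensive DG
```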

  8. Generic-Model-Based Description Scheme for MPEG-7

    Institute of Scientific and Technical Information of China (English)

    Deng Juan; Tan Hut; Chen Xin-meng

    2004-01-01

    We propose a new description scheme for MPEG-7: a Generic-Model-based Description Scheme to describe the content of audio, video, text and other sorts of multimedia. It uses a generic model as the description framework, which provides a simple but useful object-based structure. The main components of the description scheme are the generic model, objects and object features. The proposed description scheme is illustrated and exemplified in Extensible Markup Language. It aims at clarity and flexibility to support MPEG-7 applications such as query and edit. We demonstrate its feasibility and efficiency by presenting applications, the Digital Broadcasting and Edit System (DEBS) and the Non-linear Edit System (NLES), that already use the generic structure or will greatly benefit from it.

  9. The polarimetric entropy classification of SAR based on clustering and signal-to-noise ratio

    Science.gov (United States)

    Shi, Lei; Yang, Jie; Lang, Fengkai

    2009-10-01

    Usually, Wishart H/α/A classification is an effective unsupervised classification method. However, the anisotropy parameter (A) is unstable in areas of low signal-to-noise ratio (SNR), and many clusters are useless for manual recognition. To avoid an excessive number of clusters affecting manual recognition and the convergence of the iteration, and to address this drawback of Wishart classification, this paper introduces an enhanced unsupervised Wishart classification scheme for POLSAR data sets. The anisotropy parameter A is used to subdivide the target after H/α classification; this parameter can subdivide homogeneous areas under high-SNR conditions that cannot be separated using H/α alone, which is very useful for enhancing adaptability in difficult areas. However, the target polarimetric decomposition is affected by SNR before classification, so an SNR evaluation of local homogeneous areas is necessary. After using directional edge detection templates to examine the direction of the POL-SAR image, the results can be processed to estimate the SNR, which then guides the H/α/A classification. This scheme is able to correct misjudgments arising from the A parameter, such as eliminating insignificant spots on roads and in urban aggregations, and it performs well even in complex forest. To facilitate manual recognition, an agglomerative clustering algorithm based on the deviation-class method is used to consolidate clusters whose 3×3 polarimetric coherency matrices are similar. This classification scheme is applied to a full polarimetric L-band SAR image of the Foulum area, Denmark.
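
The entropy parameter H used in such schemes is computed from the eigenvalues of the 3×3 polarimetric coherency matrix; a minimal sketch of that step (the eigenvalue decomposition itself is omitted):

```python
import math

def polarimetric_entropy(eigvals):
    """H = -sum p_i * log3(p_i), with p_i = lambda_i / sum(lambda)."""
    total = sum(eigvals)
    ps = [lv / total for lv in eigvals if lv > 0]
    return -sum(p * math.log(p, 3) for p in ps)

h_random = polarimetric_entropy([1.0, 1.0, 1.0])   # equal eigenvalues
h_mixed = polarimetric_entropy([1.0, 0.2, 0.1])
print(round(h_random, 3))   # 1.0: fully depolarised, random scattering
print(round(h_mixed, 3))    # lower H: one scattering mechanism dominates
```

Pixels are then placed in the H/α (and, after subdivision, A) plane for clustering.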

  10. Markov chaotic sequences for correlation based watermarking schemes

    Energy Technology Data Exchange (ETDEWEB)

    Tefas, A.; Nikolaidis, A.; Nikolaidis, N.; Solachidis, V.; Tsekeridou, S.; Pitas, I. E-mail: pitas@zeus.csd.auth.gr

    2003-07-01

    In this paper, statistical analysis of watermarking schemes based on correlation detection is presented. Statistical properties of watermark sequences generated by piecewise-linear Markov maps are exploited, resulting in superior watermark detection reliability. Correlation/spectral properties of such sequences are easily controllable, a fact that affects the watermarking system performance. A family of chaotic maps, namely the skew tent map family, is proposed for use in watermarking schemes.
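
A skew tent map watermark sequence and the correlation detector it feeds can be sketched as follows; the map parameter, seed values, and the ±1 binarisation rule are illustrative choices.

```python
def skew_tent_sequence(x0, a, n):
    """Binary (+/-1) watermark sequence from a skew tent map orbit."""
    seq, x = [], x0
    for _ in range(n):
        x = x / a if x < a else (1 - x) / (1 - a)   # skew tent map step
        seq.append(1.0 if x > 0.5 else -1.0)
    return seq

def correlation(u, v):
    """Normalised correlation used by the detector."""
    return sum(p * q for p, q in zip(u, v)) / len(u)

w = skew_tent_sequence(0.3, a=0.49, n=1000)
other = skew_tent_sequence(0.31, a=0.49, n=1000)
print(correlation(w, w))        # 1.0: the embedded watermark is detected
print(correlation(w, other))    # typically near zero for another seed
```

The parameter `a` skews the map, which shapes the sequence's correlation/spectral properties as discussed in the record.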

  11. Kinetic regimes and limiting cases of gas uptake and heterogeneous reactions in atmospheric aerosols and clouds: a general classification scheme

    Directory of Open Access Journals (Sweden)

    T. Berkemeier

    2013-01-01

    Full Text Available Heterogeneous reactions are important to atmospheric chemistry and are therefore an area of intense research. In multiphase systems such as aerosols and clouds, chemical reactions are usually strongly coupled to a complex sequence of mass transport processes and results are often not easy to interpret.

    Here we present a systematic classification scheme for gas uptake by aerosol or cloud particles which distinguishes two major regimes: a reaction-diffusion regime and a mass-transfer regime. Each of these regimes includes four distinct limiting cases, characterized by a dominant reaction location (surface or bulk) and a single rate-limiting process: chemical reaction, bulk diffusion, gas-phase diffusion or mass accommodation.

    The conceptual framework enables efficient comparison of different studies and reaction systems, going beyond the scope of previous classification schemes by explicitly resolving interfacial transport processes and surface reactions limited by mass transfer from the gas phase. The use of kinetic multi-layer models instead of resistor model approaches increases the flexibility and enables a broader treatment of the subject, including cases which do not fit into the strict limiting cases typical of most resistor model formulations. The relative importance of different kinetic parameters such as diffusion, reaction rate and accommodation coefficients in this system is evaluated by a quantitative global sensitivity analysis. We outline the characteristic features of each limiting case and discuss the potential relevance of different regimes and limiting cases for various reaction systems. In particular, the classification scheme is applied to three different data sets for the benchmark system of oleic acid reacting with ozone. In light of these results, future directions of research needed to elucidate the multiphase chemical kinetics in this and other reaction systems are discussed.
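The decision logic of the scheme can be caricatured as picking the slowest of four characteristic process rates and reading off its regime; this toy version ignores the reaction-location dimension and everything the kinetic multi-layer models add (all names and rate values are hypothetical):

```python
# Schematic only: the real scheme uses kinetic multi-layer models and also
# distinguishes the dominant reaction location (surface vs. bulk).
REGIME = {
    "chemical reaction":   "reaction-diffusion regime",
    "bulk diffusion":      "reaction-diffusion regime",
    "gas-phase diffusion": "mass-transfer regime",
    "mass accommodation":  "mass-transfer regime",
}

def limiting_case(rates):
    """rates: process -> characteristic first-order rate (1/s);
    the slowest process is taken as rate-limiting."""
    process = min(rates, key=rates.get)
    return process, REGIME[process]

# hypothetical numbers for a slowly reacting, well-mixed particle
rates = {"chemical reaction": 1e-3, "bulk diffusion": 5e-2,
         "gas-phase diffusion": 1e2, "mass accommodation": 1e4}
case = limiting_case(rates)
```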

  12. Kinetic regimes and limiting cases of gas uptake and heterogeneous reactions in atmospheric aerosols and clouds: a general classification scheme

    Directory of Open Access Journals (Sweden)

    T. Berkemeier

    2013-07-01

    Full Text Available Heterogeneous reactions are important to atmospheric chemistry and are therefore an area of intense research. In multiphase systems such as aerosols and clouds, chemical reactions are usually strongly coupled to a complex sequence of mass transport processes and results are often not easy to interpret. Here we present a systematic classification scheme for gas uptake by aerosol or cloud particles which distinguishes two major regimes: a reaction-diffusion regime and a mass transfer regime. Each of these regimes includes four distinct limiting cases, characterised by a dominant reaction location (surface or bulk) and a single rate-limiting process: chemical reaction, bulk diffusion, gas-phase diffusion or mass accommodation. The conceptual framework enables efficient comparison of different studies and reaction systems, going beyond the scope of previous classification schemes by explicitly resolving interfacial transport processes and surface reactions limited by mass transfer from the gas phase. The use of kinetic multi-layer models instead of resistor model approaches increases the flexibility and enables a broader treatment of the subject, including cases which do not fit into the strict limiting cases typical of most resistor model formulations. The relative importance of different kinetic parameters such as diffusion, reaction rate and accommodation coefficients in this system is evaluated by a quantitative global sensitivity analysis. We outline the characteristic features of each limiting case and discuss the potential relevance of different regimes and limiting cases for various reaction systems. In particular, the classification scheme is applied to three different datasets for the benchmark system of oleic acid reacting with ozone in order to demonstrate utility and highlight potential issues.
In light of these results, future directions of research needed to elucidate the multiphase chemical kinetics in this and other reaction systems are discussed.

  13. A Chemistry-Based Classification for Peridotite Xenoliths

    Science.gov (United States)

    Block, K. A.; Ducea, M.; Raye, U.; Stern, R. J.; Anthony, E. Y.; Lehnert, K. A.

    2007-12-01

    The development of a petrological and geochemical database for mantle xenoliths is important for interpreting EarthScope geophysical results. Interpretation of compositional characteristics of xenoliths requires a sound basis for comparing geochemical results, even when no petrographic modes are available. Peridotite xenoliths are generally classified on the basis of mineralogy (Streckeisen, 1973) derived from point-counting methods. Modal estimates, particularly on heterogeneous samples, are conducted using various methodologies and are therefore subject to large statistical error. Also, many studies simply do not report the modes. Other classifications for peridotite xenoliths based on host matrix or tectonic setting (cratonic vs. non-cratonic) are poorly defined and provide little information on where samples from transitional settings fit within a classification scheme (e.g., xenoliths from circum-cratonic locations). We present here a classification for peridotite xenoliths based on bulk rock major element chemistry, which is one of the most common types of data reported in the literature. A chemical dataset of over 1150 peridotite xenoliths is compiled from two online geochemistry databases, the EarthChem Deep Lithosphere Dataset and GEOROC (http://www.earthchem.org), and is downloaded with the rock names reported in the original publications. Ternary plots of combinations of the SiO2-CaO-Al2O3-MgO (SCAM) components display sharp boundaries that define the dunite, harzburgite, lherzolite, or wehrlite-pyroxenite fields and provide a graphical basis for classification. In addition, for the CaO-Al2O3-MgO (CAM) diagram, a boundary between harzburgite and lherzolite at approximately 19% CaO is defined by a plot of over 160 abyssal peridotite compositions calculated from observed modes using the methods of Asimow (1999) and Baker and Beckett (1999). We anticipate that our SCAM classification is a first step in the development of a uniform basis for
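A toy rendering of the CAM portion of such a classification, using only the roughly 19% CaO harzburgite/lherzolite boundary mentioned in the abstract (the full SCAM fields are not reproduced here, and the sample compositions are illustrative):

```python
def cam_coordinates(cao, al2o3, mgo):
    """Normalise CaO-Al2O3-MgO (wt%) to ternary percentages."""
    total = cao + al2o3 + mgo
    return tuple(100.0 * x / total for x in (cao, al2o3, mgo))

def classify_cam(cao, al2o3, mgo, cao_boundary=19.0):
    """Toy classification: the abstract places the harzburgite/lherzolite
    boundary near 19% CaO in the CAM ternary; the other fields (dunite,
    wehrlite-pyroxenite) are omitted in this sketch."""
    cao_pct, _, _ = cam_coordinates(cao, al2o3, mgo)
    return "lherzolite" if cao_pct >= cao_boundary else "harzburgite"

# depleted composition: very low CaO relative to MgO
rock = classify_cam(cao=1.0, al2o3=1.2, mgo=43.0)
```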

  14. Fuzzy Rule Base System for Software Classification

    Directory of Open Access Journals (Sweden)

    Adnan Shaout

    2013-07-01

    Full Text Available Given the central role that software development plays in the delivery and application of information technology, managers have been focusing on process improvement in the software development area. This improvement has increased the demand for software measures, or metrics, to manage the process. These metrics provide a quantitative basis for the development and validation of models during the software development process. In this paper a fuzzy rule-based system is developed to classify Java applications using object-oriented (OO) metrics. The system contains the following features: an automated method to extract the OO metrics from the source code; a default/base set of rules that can be easily configured via an XML file so that companies, developers, team leaders, etc. can modify the set of rules according to their needs; a framework so that new metrics, fuzzy sets and fuzzy rules can be added or removed depending on the needs of the end user; general classification of the software application and fine-grained classification of the Java classes based on OO metrics; and two interfaces to the system, a GUI and a command line.
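A minimal sketch of a fuzzy rule base over two common OO metrics; the membership functions, thresholds and rules below are invented for illustration and are not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(wmc, cbo):
    """Hypothetical rules over weighted methods per class (WMC) and
    coupling between objects (CBO), min-conjunction Mamdani style."""
    good = min(tri(wmc, -1, 0, 15), tri(cbo, -1, 0, 8))    # low WMC and low CBO
    poor = min(tri(wmc, 10, 40, 80), tri(cbo, 5, 20, 50))  # high WMC and high CBO
    return "good" if good >= poor else "needs review"

verdict = classify(3, 2)
```

In the system described above, such rules would live in the XML configuration so teams can re-tune the fuzzy sets without touching code.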

  15. How does the selection of landscape classification schemes affect the spatial pattern of natural landscapes? An assessment on a coastal wetland site in southern Italy.

    Science.gov (United States)

    Tomaselli, V; Veronico, G; Sciandrello, S; Blonda, P

    2016-06-01

    It is widely known that thematic resolution affects spatial pattern and the performance of landscape metrics. In the literature, data dealing with this issue usually refer to a specific class scheme with its thematic levels. In this paper, the effects of different land cover (LC) and habitat classification schemes on the spatial pattern of a coastal landscape were compared. One of the largest components of the Mediterranean wetland system was considered as the study site, and different schemes widely used in the EU were selected and harmonized with a common thematic resolution, suitable for habitat discrimination and monitoring. For each scheme, a thematic map was produced and, for each map, 28 landscape metrics were calculated. The landscape composition changes significantly among classification schemes, even in terms of basic properties such as the number of classes, class area, and number of patches. Landscape complexity varies according to the class scheme considered and its underlying semantics, depending on how the different types aggregate or split when changing class scheme. Results confirm that the selection of a specific class scheme affects the spatial pattern of the derived landscapes and consequently the landscape metrics, especially at class level. Moreover, among the classification schemes considered, EUNIS seems to be the best choice for a comprehensive representation of both natural and anthropogenic classes.
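The effect described above can be demonstrated on a toy raster: merging classes under a coarser scheme changes both the number of classes and the patch count, two of the composition metrics mentioned (the grid and the class mapping are hypothetical):

```python
def patches(grid):
    """Count patches (4-connected regions of equal class) per class."""
    rows, cols = len(grid), len(grid[0])
    seen, counts = set(), {}
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen:
                continue
            cls, stack = grid[r][c], [(r, c)]
            seen.add((r, c))
            while stack:                      # flood fill one patch
                y, x = stack.pop()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < rows and 0 <= nx < cols \
                            and (ny, nx) not in seen and grid[ny][nx] == cls:
                        seen.add((ny, nx))
                        stack.append((ny, nx))
            counts[cls] = counts.get(cls, 0) + 1
    return counts

fine = ["AABB",
        "AABB",
        "CCDD"]
# a coarser scheme merges A/B and C/D into single classes
merge = {"A": "X", "B": "X", "C": "Y", "D": "Y"}
coarse = ["".join(merge[ch] for ch in row) for row in fine]
```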

  16. Automatic web services classification based on rough set theory

    Institute of Scientific and Technical Information of China (English)

    陈立; 张英; 宋自林; 苗壮

    2013-01-01

    With the development of web services technology, the number of services on the internet is growing day by day. In order to achieve automatic and accurate services classification, which can be beneficial for service-related tasks, a rough set theory based method for services classification was proposed. First, the service descriptions were preprocessed and represented as vectors. Inspired by the discernibility-matrix-based attribute reduction in rough set theory, and taking into account the characteristics of the decision table for services classification, a method based on continuous discernibility matrices was proposed for dimensionality reduction. Finally, services were classified automatically. In the experiment, the proposed method achieves satisfactory classification results in all five testing categories. The experimental results show that the proposed method is accurate and could be used in practical web services classification.
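The classical discernibility-matrix idea underlying the reduction step can be sketched on a toy decision table (the paper's continuous variant and the service-description preprocessing are not reproduced; attribute names are placeholders):

```python
def discernibility_matrix(objects, attrs):
    """objects: list of (attribute-value tuple, decision). Entry M[i][j]
    holds the attributes that discern two objects with different decisions;
    attribute reducts are built from these entries."""
    n = len(objects)
    M = [[set() for _ in range(n)] for _ in range(n)]
    for i in range(n):
        for j in range(i):
            (vi, di), (vj, dj) = objects[i], objects[j]
            if di != dj:
                M[i][j] = {attrs[k] for k in range(len(attrs)) if vi[k] != vj[k]}
    return M

# toy decision table: binary term features -> service category
data = [((1, 0, 1), "web"), ((1, 1, 1), "web"), ((0, 0, 0), "other")]
M = discernibility_matrix(data, ["a1", "a2", "a3"])
```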

  17. An Efficient Content Based Image Retrieval Scheme

    Directory of Open Access Journals (Sweden)

    Zukuan WEI

    2013-11-01

    Full Text Available Due to the recent improvements in digital photography and storage capacity, storing large amounts of images has become possible, and consequently efficient means to retrieve images matching a user’s query are needed. In this paper, we propose a framework based on a bipartite graph model (BGM) for semantic image retrieval. BGM is a scalable data structure that aids semantic indexing in an efficient manner, and it can also be incrementally updated. First, all the images are segmented into several regions with an image segmentation algorithm; pre-trained SVMs are used to annotate each region, and the final label is obtained by merging all the region labels. Then we use the set of images and the set of region labels to build a bipartite graph. When a query is given, a query node, initially containing a fixed number of labels, is created and attached to the bipartite graph. The node then distributes the labels based on the edge weight between the node and its neighbors. Image nodes receiving the most labels represent the most relevant images. Experimental results demonstrate that our proposed technique is promising.
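A schematic of the label-distribution step: each query label spreads a fixed budget to neighbouring image nodes in proportion to edge weight, and images accumulating the most label mass rank highest (edge weights and node names are invented):

```python
def distribute_labels(query_labels, edges, budget=100.0):
    """edges: (image, label) -> weight. Each query label sends a share of
    'budget' to its neighbouring images proportional to edge weight."""
    scores = {}
    for label in query_labels:
        nbrs = {img: w for (img, lbl), w in edges.items() if lbl == label}
        total = sum(nbrs.values())
        for img, w in nbrs.items():
            scores[img] = scores.get(img, 0.0) + budget * w / total
    # images receiving the most label mass are the most relevant
    return sorted(scores, key=scores.get, reverse=True)

edges = {("img1", "sky"): 3.0, ("img2", "sky"): 1.0,
         ("img2", "car"): 3.0, ("img3", "car"): 1.0}
ranking = distribute_labels(["sky", "car"], edges)
```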

  18. Dihedral-based segment identification and classification of biopolymers I: proteins.

    Science.gov (United States)

    Nagy, Gabor; Oostenbrink, Chris

    2014-01-27

    A new structure classification scheme for biopolymers is introduced, which is solely based on main-chain dihedral angles. It is shown that by dividing a biopolymer into segments containing two central residues, a local classification can be performed. The method is referred to as DISICL, short for Dihedral-based Segment Identification and Classification. Compared to other popular secondary structure classification programs, DISICL is more detailed as it offers 18 distinct structural classes, which may be simplified into a classification in terms of seven more general classes. It was designed with an eye to analyzing subtle structural changes as observed in molecular dynamics simulations of biomolecular systems. Here, the DISICL algorithm is used to classify two databases of protein structures, jointly containing more than 10 million segments. The data is compared to two alternative approaches in terms of the amount of classified residues, average occurrence and length of structural elements, and pairwise matches of the classifications by the different programs. In an accompanying paper (Nagy, G.; Oostenbrink, C. Dihedral-based segment identification and classification of biopolymers II: Polynucleotides. J. Chem. Inf. Model. 2013, DOI: 10.1021/ci400542n), the analysis of polynucleotides is described and applied. Overall, DISICL represents a potentially useful tool to analyze biopolymer structures at a high level of detail.
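The raw input to such a scheme, the main-chain dihedral angle, follows from four consecutive atom positions; a standard computation is sketched below (DISICL's actual class boundaries are not reproduced here, and the coordinates are toy values):

```python
import math

def dihedral(p0, p1, p2, p3):
    """Dihedral angle in degrees defined by four atoms, via atan2 so the
    sign of the angle is preserved."""
    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0]]
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    b0, b1, b2 = sub(p1, p0), sub(p2, p1), sub(p3, p2)
    n1, n2 = cross(b0, b1), cross(b1, b2)      # plane normals
    b1_hat = [x / math.sqrt(dot(b1, b1)) for x in b1]
    m1 = cross(n1, b1_hat)
    return math.degrees(math.atan2(dot(m1, n2), dot(n1, n2)))

# planar cis arrangement -> 0 degrees; trans -> 180 degrees
phi = dihedral((1, 1, 0), (1, 0, 0), (0, 0, 0), (0, 1, 0))
```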

  19. Dihedral-Based Segment Identification and Classification of Biopolymers I: Proteins

    Science.gov (United States)

    2013-01-01

    A new structure classification scheme for biopolymers is introduced, which is solely based on main-chain dihedral angles. It is shown that by dividing a biopolymer into segments containing two central residues, a local classification can be performed. The method is referred to as DISICL, short for Dihedral-based Segment Identification and Classification. Compared to other popular secondary structure classification programs, DISICL is more detailed as it offers 18 distinct structural classes, which may be simplified into a classification in terms of seven more general classes. It was designed with an eye to analyzing subtle structural changes as observed in molecular dynamics simulations of biomolecular systems. Here, the DISICL algorithm is used to classify two databases of protein structures, jointly containing more than 10 million segments. The data is compared to two alternative approaches in terms of the amount of classified residues, average occurrence and length of structural elements, and pairwise matches of the classifications by the different programs. In an accompanying paper (Nagy, G.; Oostenbrink, C. Dihedral-based segment identification and classification of biopolymers II: Polynucleotides. J. Chem. Inf. Model. 2013, DOI: 10.1021/ci400542n), the analysis of polynucleotides is described and applied. Overall, DISICL represents a potentially useful tool to analyze biopolymer structures at a high level of detail. PMID:24364820

  20. Optical tomographic detection of rheumatoid arthritis with computer-aided classification schemes

    Science.gov (United States)

    Klose, Christian D.; Klose, Alexander D.; Netz, Uwe; Beuthan, Jürgen; Hielscher, Andreas H.

    2009-02-01

    A recent research study has shown that combining multiple parameters drawn from optical tomographic images leads to better classification results for identifying human finger joints that are or are not affected by rheumatoid arthritis (RA). Building on the findings of that study, this article presents an advanced computer-aided classification approach for interpreting optical image data to detect RA in finger joints. Additional data are used including, for example, maximum and minimum values of the absorption coefficient as well as their ratios and image variances. Classification performances obtained by the proposed method were evaluated in terms of sensitivity, specificity, Youden index and area under the curve (AUC). Results were compared to different benchmarks ("gold standard"): magnetic resonance, ultrasound and clinical evaluation. Maximum accuracies (AUC=0.88) were reached when combining minimum/maximum ratios and image variances and using ultrasound as the gold standard.
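The Youden index used for evaluation is simply J = sensitivity + specificity − 1; a sketch of ranking feature combinations by it (the sensitivity/specificity numbers are hypothetical, not the paper's results):

```python
def youden_index(sensitivity, specificity):
    """Youden's J = sensitivity + specificity - 1, ranging over [-1, 1]."""
    return sensitivity + specificity - 1.0

# hypothetical (sensitivity, specificity) per feature combination
candidates = {
    "min/max ratio":             (0.82, 0.79),
    "image variance":            (0.78, 0.81),
    "ratio + variance combined": (0.88, 0.85),
}
best = max(candidates, key=lambda k: youden_index(*candidates[k]))
```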

  1. Models of Marine Fish Biodiversity: Assessing Predictors from Three Habitat Classification Schemes.

    Science.gov (United States)

    Yates, Katherine L; Mellin, Camille; Caley, M Julian; Radford, Ben T; Meeuwig, Jessica J

    2016-01-01

    Prioritising biodiversity conservation requires knowledge of where biodiversity occurs. Such knowledge, however, is often lacking. New technologies for collecting biological and physical data coupled with advances in modelling techniques could help address these gaps and facilitate improved management outcomes. Here we examined the utility of environmental data, obtained using different methods, for developing models of both uni- and multivariate biodiversity metrics. We tested which biodiversity metrics could be predicted best and evaluated the performance of predictor variables generated from three types of habitat data: acoustic multibeam sonar imagery, predicted habitat classification, and direct observer habitat classification. We used boosted regression trees (BRT) to model metrics of fish species richness, abundance and biomass, and multivariate regression trees (MRT) to model biomass and abundance of fish functional groups. We compared model performance using different sets of predictors and estimated the relative influence of individual predictors. Models of total species richness and total abundance performed best; those developed for endemic species performed worst. Abundance models performed substantially better than corresponding biomass models. In general, BRT and MRTs developed using predicted habitat classifications performed less well than those using multibeam data. The most influential individual predictor was the abiotic categorical variable from direct observer habitat classification and models that incorporated predictors from direct observer habitat classification consistently outperformed those that did not. Our results show that while remotely sensed data can offer considerable utility for predictive modelling, the addition of direct observer habitat classification data can substantially improve model performance. 
Thus it appears that there are aspects of marine habitats that are important for modelling metrics of fish biodiversity that are

  2. Movie Popularity Classification based on Inherent Movie Attributes using C4.5, PART and Correlation Coefficient

    DEFF Research Database (Denmark)

    Ibnal Asad, Khalid; Ahmed, Tanvir; Rahman, Md. Saiedur

    2012-01-01

    Abundance of movie data across the internet makes it an obvious candidate for machine learning and knowledge discovery. But most researches are directed towards bi-polar classification of movie or generation of a movie recommendation system based on reviews given by viewers on various internet si...... propose classification scheme of pre-release movie popularity based on inherent attributes using C4.5 and PART classifier algorithm and define the relation between attributes of post release movies using correlation coefficient....

  3. A Nominative Multi- Proxy Signature Scheme Based on ECC

    Institute of Scientific and Technical Information of China (English)

    MA Chuan-gui; GAO Feng-xiu; WANG Yan

    2005-01-01

    A nominative multi-proxy signature in which the original signer authorizes a group of proxy signers is presented. Meanwhile, our proposed scheme is based on elliptic curve cryptosystem which is more efficient than the corresponding one based on traditional discrete logarithm.

  4. Classification of High-Rise Residential Building Facilities: A Descriptive Survey on 170 Housing Scheme in Klang Valley

    Directory of Open Access Journals (Sweden)

    Abd Wahab Siti Rashidah Hanum

    2016-01-01

    Full Text Available High-rise residential building is a type of housing that has multi-dwelling units built on the same land. This type of housing has become more popular each year in urban areas due to the increasing cost of land. There are several common facilities provided in high-rise residential buildings, for example a playground, swimming pool, gymnasium, and 24-hour security systems such as CCTV and access cards. Thus, maintenance work on the common facilities must be well organised. The purpose of this paper is to identify the classification of facilities provided at high-rise residential buildings. The survey covered 170 high-rise residential schemes, selected using a stratified random sampling technique. The scope of this research is the Klang Valley area, which is being rapidly developed with high-rise residential buildings. The objective of this survey is to list all the facilities provided in each sampled scheme. As a result, nine classifications of facilities provided for high-rise residential buildings were identified.

  5. Graph-based Methods for Orbit Classification

    Energy Technology Data Exchange (ETDEWEB)

    Bagherjeiran, A; Kamath, C

    2005-09-29

    An important step in the quest for low-cost fusion power is the ability to perform and analyze experiments in prototype fusion reactors. One of the tasks in the analysis of experimental data is the classification of orbits in Poincare plots. These plots are generated by the particles in a fusion reactor as they move within the toroidal device. In this paper, we describe the use of graph-based methods to extract features from orbits. These features are then used to classify the orbits into several categories. Our results show that existing machine learning algorithms are successful in classifying orbits with few points, a situation which can arise in data from experiments.

  6. Sentiment classification technology based on Markov logic networks

    Science.gov (United States)

    He, Hui; Li, Zhigang; Yao, Chongchong; Zhang, Weizhe

    2016-07-01

    With diverse online media emerging, there is a growing concern with the sentiment classification problem. At present, text sentiment classification mainly utilizes supervised machine learning methods, which feature a certain domain dependency. On the basis of Markov logic networks (MLNs), this study proposed a cross-domain multi-task text sentiment classification method rooted in transfer learning. Through many-to-one knowledge transfer, labeled text sentiment classification knowledge was successfully transferred into other domains, and the precision of the sentiment classification analysis in the text tendency domain was improved. The experimental results revealed the following: (1) the model based on an MLN demonstrated higher precision than the single individual learning plan model; (2) multi-task transfer learning based on Markov logic networks could acquire more knowledge than self-domain learning. The cross-domain text sentiment classification model could significantly improve the precision and efficiency of text sentiment classification.

  7. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    Science.gov (United States)

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

    The increasing technology of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is oriented towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used for the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989

  8. Wittgenstein's philosophy and a dimensional approach to the classification of mental disorders -- a preliminary scheme.

    Science.gov (United States)

    Mackinejad, Kioumars; Sharifi, Vandad

    2006-01-01

    In this paper the importance of Wittgenstein's philosophical ideas for the justification of a dimensional approach to the classification of mental disorders is discussed. Some of his basic concepts in his Philosophical Investigations, such as 'family resemblances', 'grammar' and 'language-game' and their relations to the concept of mental disorder are explored.

  9. Judgement of Design Scheme Based on Flexible Constraint in ICAD

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The concept of a flexible constraint is proposed in this paper. The solution of a flexible constraint lies within a special range and may differ between instances of the same design scheme. The paper emphasizes how to evaluate and optimize a design scheme with flexible constraints, based on a satisfaction degree function defined on the flexible constraints. The concept of flexible constraints is used to resolve constraint conflicts and to optimize designs in complicated constraint-based assembly design within the PFM parametric assembly design system. An instance of gear-box design is used to verify the optimization method.
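One plausible form of a satisfaction degree function for a flexible constraint, with weakest-link aggregation over a scheme; the ranges, soft margins and gear-box values below are invented for illustration, not taken from the paper:

```python
def satisfaction(value, ideal_low, ideal_high, soft):
    """Satisfaction degree in [0, 1] for a flexible constraint: 1 inside
    the ideal range, falling linearly to 0 over a soft margin."""
    if ideal_low <= value <= ideal_high:
        return 1.0
    d = (ideal_low - value) if value < ideal_low else (value - ideal_high)
    return max(0.0, 1.0 - d / soft)

def scheme_score(degrees):
    """Aggregate satisfaction of a design scheme (weakest link wins)."""
    return min(degrees)

# hypothetical gear-box style instance: two flexible constraints
score = scheme_score([
    satisfaction(102.0, 95.0, 100.0, 10.0),  # centre distance, mm
    satisfaction(3.9, 3.5, 4.2, 0.5),        # transmission ratio
])
```

Comparing such scores across instances gives one direct way to judge and rank competing design schemes.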

  10. Pathway-based classification of cancer subtypes

    Directory of Open Access Journals (Sweden)

    Kim Shinuk

    2012-07-01

    Full Text Available Abstract Background Molecular markers based on gene expression profiles have been used in experimental and clinical settings to distinguish cancerous tumors in stage, grade, survival time, metastasis, and drug sensitivity. However, most significant gene markers are unstable (not reproducible) among data sets. We introduce a standardized method for representing cancer markers as 2-level hierarchical feature vectors, with a basic gene level as well as a second level of (more stable) pathway markers, for the purpose of discriminating cancer subtypes. This extends standard gene expression arrays with new pathway-level activation features obtained directly from off-the-shelf gene set enrichment algorithms such as GSEA. Such so-called pathway-based expression arrays are significantly more reproducible across datasets. Such reproducibility will be important for clinical usefulness of genomic markers, and augment currently accepted cancer classification protocols. Results The present method produced more stable (reproducible) pathway-based markers for discriminating breast cancer metastasis and ovarian cancer survival time. Between two datasets for breast cancer metastasis, the intersection of standard significant gene biomarkers totaled 7.47% of selected genes, compared to 17.65% using pathway-based markers; the corresponding percentages for ovarian cancer datasets were 20.65% and 33.33% respectively. Three pathways, consisting of Type_1_diabetes_mellitus, Cytokine-cytokine_receptor_interaction and Hedgehog_signaling (all previously implicated in cancer), are enriched in both the ovarian long survival and breast non-metastasis groups. In addition, integrating pathway and gene information, we identified five (ID4, ANXA4, CXCL9, MYLK, FBXL7) and six (SQLE, E2F1, PTTG1, TSTA3, BUB1B, MAD2L1) known cancer genes significant for ovarian and breast cancer respectively. Conclusions Standardizing the analysis of genomic data in the process of cancer staging
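The pathway-level feature idea can be sketched as collapsing a gene-level expression profile into per-pathway scores; a real pipeline would use a GSEA-style rank-based enrichment statistic rather than the plain mean used here, and the expression values and pathway memberships below are invented:

```python
def pathway_features(expression, pathways):
    """Collapse a gene-level expression profile to pathway-level
    activation scores (simple mean of member genes; GSEA would use an
    enrichment statistic instead). Genes absent from the profile are
    skipped."""
    feats = {}
    for name, genes in pathways.items():
        members = [expression[g] for g in genes if g in expression]
        feats[name] = sum(members) / len(members) if members else 0.0
    return feats

# toy profile using gene symbols mentioned in the abstract
expr = {"ID4": 2.0, "ANXA4": 1.0, "CXCL9": 3.0, "MYLK": 0.5}
pathways = {"hypothetical_pathway_A": ["ID4", "ANXA4"],
            "hypothetical_pathway_B": ["CXCL9", "MYLK", "MISSING_GENE"]}
feats = pathway_features(expr, pathways)
```

Because pathway scores average over many genes, they tend to vary less between datasets than individual gene markers, which is the reproducibility gain the abstract reports.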

  11. A Detection Scheme for Cavity-based Dark Matter Searches

    CERN Document Server

    Bukhari, M H S

    2016-01-01

    We present here a proposal of a scheme and some useful ideas for resonant cavity-based detection of cold dark matter axions, with the hope of improving the existing endeavors. The scheme is based upon our idea of a detector which incorporates an integrated tunnel diode and a GaAs HEMT or HFET (High Electron Mobility Transistor or Heterogeneous FET) for resonance detection and amplification from a resonant cavity (in a strong transverse magnetic field from a cylindrical array of Halbach magnets). The idea of a TD-oscillator-amplifier combination could possibly serve as a more sensitive and viable resonance detection regime while maintaining an excellent performance with low noise temperature, whereas the Halbach magnet array may offer a compact and permanent solution replacing the conventional electromagnet scheme. We believe that all these factors could possibly increase the sensitivity and accuracy of axion detection searches and reduce complications (and associated costs) in the experiments, in addition to help re...

  12. Knowledge-based sea ice classification by polarimetric SAR

    DEFF Research Database (Denmark)

    Skriver, Henning; Dierking, Wolfgang

    2004-01-01

    Polarimetric SAR images acquired at C- and L-band over sea ice in the Greenland Sea, Baltic Sea, and Beaufort Sea have been analysed with respect to their potential for ice type classification. The polarimetric data were gathered by the Danish EMISAR and the US AIRSAR which both are airborne...... systems. A hierarchical classification scheme was chosen for sea ice because our knowledge about magnitudes, variations, and dependences of sea ice signatures can be directly considered. The optimal sequence of classification rules and the rules themselves depend on the ice conditions/regimes. The use...... of the polarimetric phase information improves the classification only in the case of thin ice types but is not necessary for thicker ice (above about 30 cm thickness)...
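A hierarchical rule sequence of the kind described might be structured as below; all thresholds and class names are invented placeholders, since the abstract stresses that the actual rules depend on the ice conditions/regimes, and the polarimetric phase is consulted only for candidate thin ice:

```python
def classify_ice(backscatter_db, cross_pol_ratio, phase_diff_deg):
    """Hypothetical hierarchical rule sequence: coarse splits first, the
    polarimetric phase only where it adds information (thin ice)."""
    if backscatter_db < -20.0:              # very low radar return
        return "open water / new ice"
    if cross_pol_ratio > 0.15:              # strong depolarisation
        return "deformed multi-year ice"
    if abs(phase_diff_deg) > 20.0:          # phase useful only here
        return "thin ice"
    return "level first-year ice"
```

Encoding the knowledge as an ordered rule sequence, rather than a statistical classifier, is what lets known magnitudes and dependences of the ice signatures be applied directly.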

  13. A MapReduce based Parallel SVM for Email Classification

    Directory of Open Access Journals (Sweden)

    Ke Xu

    2014-06-01

    Full Text Available Support Vector Machine (SVM) is a powerful classification and regression tool. Varying approaches, including SVM-based techniques, have been proposed for email classification. Automated email classification according to messages or user-specific folders and information extraction from chronologically ordered email streams have become interesting areas in text machine learning research. This paper presents a parallel SVM based on MapReduce (PSMR) algorithm for email classification. We discuss the challenges that arise from differences between email foldering and traditional document classification. We show experimental results from an array of automated classification methods and evaluation methodologies, including Naive Bayes, SVM and the PSMR method, on foldering results for the Enron datasets based on the timeline. By distributing, processing and optimizing the subsets of the training data across multiple participating nodes, the parallel SVM based on MapReduce algorithm reduces the training time significantly.
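The map/reduce structure of such a parallel SVM can be sketched in miniature: train on chunks ("map"), pool each chunk's support vectors, and retrain on the pool ("reduce"). The tiny sub-gradient linear SVM below stands in for a real solver, and the "email" features are toy values, not the paper's Enron setup:

```python
def train_svm(data, epochs=200, lr=0.01, lam=0.01):
    """Tiny linear SVM trained by sub-gradient descent on the hinge loss."""
    dim = len(data[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in data:
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            grad = [lam * wi for wi in w]          # weight decay
            if margin < 1:                         # hinge loss is active
                grad = [g - y * xi for g, xi in zip(grad, x)]
                b += lr * y
            w = [wi - lr * g for wi, g in zip(w, grad)]
    return w, b

def support_vectors(data, model, slack=1.1):
    """Points on or near the margin of a trained model."""
    w, b = model
    return [(x, y) for x, y in data
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= slack]

def map_reduce_svm(dataset, n_chunks=4):
    """'Map': train one SVM per chunk. 'Reduce': retrain on the union of
    the chunk support vectors (cascade-style combination)."""
    chunks = [dataset[i::n_chunks] for i in range(n_chunks)]
    pooled = []
    for chunk in chunks:
        pooled.extend(support_vectors(chunk, train_svm(chunk)))
    return train_svm(pooled if pooled else dataset)

# toy 'emails': feature = (spam-word score, normalised length), label +/-1
data = [((2.0, 0.5), 1), ((1.8, 0.2), 1), ((2.2, 0.1), 1), ((2.5, 0.6), 1),
        ((-1.5, 0.4), -1), ((-2.0, 0.3), -1), ((-1.8, 0.1), -1), ((-2.2, 0.5), -1)]
w, b = map_reduce_svm(data)
```

The payoff of the cascade is that the reduce step sees only support vectors, a set typically much smaller than the full training data.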

  14. Towards the creation of a flexible classification scheme for voluntarily reported transfusion and laboratory safety events

    Directory of Open Access Journals (Sweden)

    Whitehurst Julie M

    2012-05-01

    Full Text Available Abstract Background Transfusion and clinical laboratory services are high-volume activities involving complicated workflows across both ambulatory and inpatient environments. As a result, there are many opportunities for safety lapses, leading to patient harm and increased costs. Organizational techniques such as voluntary safety event reporting are commonly used to identify and prioritize risk areas across care settings. Creation of functional, standardized safety data structures that facilitate effective exploratory examination is therefore essential to drive quality improvement interventions. Unfortunately, voluntarily reported adverse event data can often be unstructured or ambiguously defined. Results To address this problem, we sought to create a “best-of-breed” patient safety classification for data contained in the Duke University Health System Safety Reporting System (SRS). Our approach was to implement the internationally recognized World Health Organization International Classification for Patient Safety Framework, supplemented with additional data points relevant to our organization. Data selection and integration into the hierarchical framework is discussed, as well as placement of the classification into the SRS. We evaluated the impact of the new SRS classification on system usage through comparisons of monthly average report rates and completion times before and after implementation. Monthly average inpatient transfusion reports decreased from 102.1 ± 14.3 to 91.6 ± 11.2, with the proportion of transfusion reports in our system remaining consistent before and after implementation. Monthly average transfusion report rates in the outpatient and homecare environments were not significantly different. Significant increases in clinical lab report rates were present across inpatient and outpatient environments, with the proportion of lab reports increasing after implementation. Report completion times increased modestly

  15. Gender Classification Based on Geometry Features of Palm Image

    OpenAIRE

    Ming Wu; Yubo Yuan

    2014-01-01

    This paper presents a novel gender classification method based on geometry features of palm images which is simple, fast, and easy to apply. The method comprises two main components. The first is feature extraction by image processing. The other is a classification system using a polynomial smooth support vector machine (PSSVM). A total of 180 palm images were collected from 30 persons to verify the validity of the proposed gender classi...
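
    The two-stage pipeline described above (geometry features, then an SVM) can be sketched as follows. This is a minimal illustration with synthetic measurements, and a standard polynomial-kernel SVC stands in for the paper's PSSVM; the feature names and values are invented for the example.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical geometry features per palm (pixels): palm width, palm length,
# and four finger-length measurements. The class means and spreads are invented.
male = rng.normal([320, 340, 180, 200, 195, 150], 12, size=(90, 6))
female = rng.normal([290, 320, 170, 190, 185, 140], 12, size=(90, 6))
X = np.vstack([male, female])
y = np.repeat([1, 0], 90)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
# Polynomial-kernel SVC as a stand-in for the paper's PSSVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3))
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```
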

  16. A classification scheme of Amino Acids in the Genetic Code by Group Theory

    CERN Document Server

    Sachse, Sebastian

    2012-01-01

    We derive the amino acid assignment to one codon representation (typical 64-dimensional irreducible representation) of the basic classical Lie superalgebra osp(5|2) from biochemical arguments. We motivate the approach of mathematical symmetries to the classification of the building constituents of the biosphere by analogy with its success in particle physics and chemistry. The model makes it possible to calculate the polarity and molecular volume of amino acids to a good approximation.

  17. Auction-based schemes for multipath routing in selfish networks

    OpenAIRE

    Zhou, H; Leung, KC; Li, VOK

    2013-01-01

    We study multipath routing with traffic assignment in selfish networks. Based on the Vickrey-Clarke-Groves (VCG) auction, an optimal and strategy-proof scheme, known as optimal auction-based multipath routing (OAMR), is developed. However, OAMR is computationally expensive and cannot run in real time when the network size is large. Therefore, we propose sequential auction-based multipath routing (SAMR). SAMR handles routing requests sequentially using some greedy strategies. In particular, wi...

  18. DNA sequence analysis using hierarchical ART-based classification networks

    Energy Technology Data Exchange (ETDEWEB)

    LeBlanc, C.; Hruska, S.I. [Florida State Univ., Tallahassee, FL (United States); Katholi, C.R.; Unnasch, T.R. [Univ. of Alabama, Birmingham, AL (United States)

    1994-12-31

    Adaptive resonance theory (ART) describes a class of artificial neural network architectures that act as classification tools which self-organize, work in real-time, and require no retraining to classify novel sequences. We have adapted ART networks to provide support to scientists attempting to categorize tandem repeat DNA fragments from Onchocerca volvulus. In this approach, sequences of DNA fragments are presented to multiple ART-based networks which are linked together into two (or more) tiers; the first provides coarse sequence classification while the subsequent tiers refine the classifications as needed. The overall rating of the resulting classification of fragments is measured using statistical techniques based on those introduced to validate results from traditional phylogenetic analysis. Tests of the Hierarchical ART-based Classification Network, or HABclass network, indicate its value as a fast, easy-to-use classification tool which adapts to new data without retraining on previously classified data.

  19. Security problems with a chaos-based deniable authentication scheme

    International Nuclear Information System (INIS)

    Recently, a new scheme was proposed for deniable authentication. Its main originality lay in applying a chaos-based encryption-hash parallel algorithm and the semi-group property of the Chebyshev chaotic map. Although original and practicable, its insecurity and inefficiency are shown in this paper, thus rendering it inadequate for adoption in e-commerce.

  20. WEAKNESS ON CRYPTOGRAPHIC SCHEMES BASED ON REGULAR LDPC CODES

    Directory of Open Access Journals (Sweden)

    Omessaad Hamdi

    2010-01-01

    Full Text Available We propose a method to recover the structure of a randomly permuted chained code and show how to cryptanalyse cryptographic schemes based on this kind of error-correcting code. As an application of these methods, we consider a cryptographic scheme using regular Low Density Parity Check (LDPC) codes. This result prohibits the use of chained codes, and particularly of regular LDPC codes, in cryptography.

  1. WEAKNESS ON CRYPTOGRAPHIC SCHEMES BASED ON REGULAR LDPC CODES

    OpenAIRE

    Omessaad Hamdi; Manel abdelhedi; Ammar Bouallegue; Sami Harari

    2010-01-01

    We propose a method to recover the structure of a randomly permuted chained code and show how to cryptanalyse cryptographic schemes based on this kind of error-correcting code. As an application of these methods, we consider a cryptographic scheme using regular Low Density Parity Check (LDPC) codes. This result prohibits the use of chained codes, and particularly of regular LDPC codes, in cryptography.

  2. A ROBUST WAVELET BASED WATERMARKING SCHEME FOR DIGITAL AUDIO

    Directory of Open Access Journals (Sweden)

    Ayad Ibrahim Abdulsada

    2012-06-01

    Full Text Available In this paper, a robust wavelet-based watermarking scheme is proposed for digital audio. A single bit is embedded in the approximation part of each frame. The watermark bits are embedded in two subsets of indexes randomly generated using two keys for security purposes. The embedding is done in an adaptive fashion according to the mean of each approximation part. Watermark detection does not depend on the original audio. To measure the robustness of the algorithm, different signal processing operations were applied to the watermarked audio. Several experiments were conducted to illustrate the robustness and efficiency of the proposed audio watermarking scheme.
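
    The mean-based embedding in the approximation coefficients can be sketched with PyWavelets. This is a simplified, hypothetical variant: it quantises the mean of each frame's approximation part to encode one bit and omits the paper's key-driven index selection; the wavelet, quantisation step, and frame length are illustrative choices.

```python
import numpy as np
import pywt

DELTA = 0.1  # quantisation step; illustrative, not the paper's value

def embed_bit(frame, bit):
    """Embed one bit by quantising the mean of the frame's DWT
    approximation coefficients (simplified stand-in for the paper's
    adaptive mean-based embedding)."""
    # 'periodization' mode makes dwt/idwt exactly invertible for db4.
    cA, cD = pywt.dwt(frame, "db4", mode="periodization")
    m = cA.mean()
    # Shift all approximation coefficients so their mean lands on a
    # dither point encoding the bit (0 -> .25*DELTA, 1 -> .75*DELTA).
    target = (np.floor(m / DELTA) + (0.75 if bit else 0.25)) * DELTA
    return pywt.idwt(cA + (target - m), cD, "db4", mode="periodization")

def extract_bit(frame):
    cA, _ = pywt.dwt(frame, "db4", mode="periodization")
    return 1 if (cA.mean() / DELTA) % 1.0 >= 0.5 else 0

rng = np.random.default_rng(1)
frame = rng.normal(0, 0.3, 1024)  # stand-in for one audio frame
print([extract_bit(embed_bit(frame, b)) for b in (0, 1)])
```

Blind detection works here because the quantisation grid, not the original audio, carries the bit.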

  3. A Fair E-Cash Payment Scheme Based on Credit

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A new fair e-cash payment scheme based on credit is presented in this paper. In the scheme, an overdraft credit certificate is issued to the user by the bank. Using the overdraft credit certificate, the user can produce e-cash himself to pay in exchanges. The merchant can verify the e-cash received from the user. The bank can make a fair dispute resolution when there is a dissension between the user and the merchant. The scheme avoids the problem of partitioning e-cash for change, and prevents e-cash from being reused or forged. It satisfies fairness, anonymity, non-repudiation and impartiality.

  4. Design Validations for Discrete Logarithm Based Signature Schemes

    OpenAIRE

    Brickell, Ernest; Pointcheval, David; Vaudenay, Serge

    2000-01-01

    A number of signature schemes and standards have recently been designed, based on the discrete logarithm problem. Examples of standards are the DSA and the KCDSA. Very few formal design/security validations have been conducted so far for either the KCDSA or the DSA, and only in the "full" so-called random oracle model. In this paper we try to minimize the use of ideal hash functions for several Discrete Logarithm (DSS-like) signatures (abstracted as generic schemes). Namely, we show that the follo...

  5. DWT-Based Watermarking Scheme for Digital Images

    Institute of Scientific and Technical Information of China (English)

    何泉; 苏广川

    2003-01-01

    A watermarking scheme for digital images is introduced. This method is based on the discrete wavelet transform and the spread spectrum technique. A discrete wavelet transformed binary signature image is expanded by an m-sequence and added to the large wavelet coefficients of a host image with a scale factor. A good balance between transparency and robustness is achieved by the selection of the scale factor. In addition, the spread spectrum technique is adopted to increase the robustness of this watermarking scheme. The experimental results show that the proposed method performs well and is robust to common image operations such as JPEG lossy compression.

  6. A novel current-sharing scheme based on magamp

    Institute of Scientific and Technical Information of China (English)

    Wen-xi YAO; Xiao-yuan HONG; Zheng-yu LU

    2008-01-01

    The magamp (magnetic amplifier) is widely used in power supplies due to its low cost, simplicity and other advantages. This paper discusses a novel application of the magamp in switching power supplies, where the magamp is used to regulate the pulse width modulation (PWM) signal instead of the power signal in the main circuit. This method extends the application of the magamp in power supplies, and makes it possible to further regulate the control signal after the PWMs have been generated. Based on this application, a new current-sharing (CS) scheme using the magamp is proposed, which uses a modified inner-loop CS structure. In this scheme PWMs are generated by one main controller, and CS is achieved by regulating the PWMs with a magamp in each module. Compared with the traditional application of the magamp, the new CS scheme can be used in most topologies and only requires magamps of low power capacity. A test circuit of a parallel power supply was developed, in which CS is achieved by a PWM regulator with the magamp. The proposed scheme was also used to upgrade an electroplating power supply to make it capable of operating supplies in parallel. Experimental results show that the proposed scheme has good CS performance.

  7. An Efficient Provable Secure ID-Based Proxy Signature Scheme Based on CDH Assumption

    Institute of Scientific and Technical Information of China (English)

    CHAI Zhen-chuan; CAO Zhen-fu; LU Rong-xing

    2006-01-01

    Identity-based proxy signature enables an entity to delegate its signing rights to another entity in identity-based cryptosystem settings. However, few existing schemes have been proved secure in a formalized model or achieve optimized performance. To achieve the goals of both provable security and high efficiency, this paper proposes an efficient identity-based proxy signature scheme. The scheme is constructed from bilinear pairings and proved secure in the random oracle model, using the oracle replay attack technique introduced by Pointcheval and Stern. The analysis shows that the scheme needs less computation cost and has a shorter signature than the other schemes.

  8. Structure-Based Algorithms for Microvessel Classification

    KAUST Repository

    Smith, Amy F.

    2015-02-01

    © 2014 The Authors. Microcirculation published by John Wiley & Sons Ltd. Objective: Recent developments in high-resolution imaging techniques have enabled digital reconstruction of three-dimensional sections of microvascular networks down to the capillary scale. To better interpret these large data sets, our goal is to distinguish branching trees of arterioles and venules from capillaries. Methods: Two novel algorithms are presented for classifying vessels in microvascular anatomical data sets without requiring flow information. The algorithms are compared with a classification based on observed flow directions (considered the gold standard), and with an existing resistance-based method that relies only on structural data. Results: The first algorithm, developed for networks with one arteriolar and one venular tree, performs well in identifying arterioles and venules and is robust to parameter changes, but incorrectly labels a significant number of capillaries as arterioles or venules. The second algorithm, developed for networks with multiple inlets and outlets, correctly identifies more arterioles and venules, but is more sensitive to parameter changes. Conclusions: The algorithms presented here can be used to classify microvessels in large microvascular data sets lacking flow information. This provides a basis for analyzing the distinct geometrical properties and modelling the functional behavior of arterioles, capillaries, and venules.

  9. Classification of CMEs Based on Their Dynamics

    Science.gov (United States)

    Nicewicz, J.; Michalek, G.

    2016-05-01

    A large set of coronal mass ejections (CMEs; 6621 events) was selected to study their dynamics in the field of view (LFOV) of the Large Angle and Spectroscopic Coronagraph (LASCO) onboard the Solar and Heliospheric Observatory (SOHO). These events were selected based on having at least six height-time measurements so that their dynamic properties, in the LFOV, can be evaluated with reasonable accuracy. Height-time measurements (in the SOHO/LASCO catalog) were used to determine the velocities and accelerations of individual CMEs at successive distances from the Sun. Linear and quadratic functions were fitted to these data points. On the basis of the best fits to the velocity data points, we were able to classify CMEs into four groups. The groups not only show different dynamic behaviors but also different masses, widths, velocities, and accelerations. We also show that these groups of events are initiated by different onset mechanisms. The results of our study allow us to present a consistent classification of CMEs based on their dynamics.
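
    The core fitting step, estimating velocity and acceleration from height-time points via a quadratic fit, can be sketched as below. The measurements, units, and classification threshold are invented for illustration; the paper derives its four groups from best fits to the velocity data points.

```python
import numpy as np

# Hypothetical height-time measurements for one CME (solar radii, hours),
# mimicking the study's requirement of at least six points per event.
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
h = np.array([3.0, 4.1, 5.4, 6.9, 8.6, 10.5, 12.6])

# Quadratic fit h(t) = a*t^2 + v0*t + h0; speed is dh/dt, acceleration 2a.
a2, v0, h0 = np.polyfit(t, h, 2)
acceleration = 2 * a2
print(f"v0 = {v0:.2f} Rsun/h, acceleration = {acceleration:.2f} Rsun/h^2")

# A simple grouping rule in the spirit of the paper (thresholds illustrative):
label = ("accelerating" if acceleration > 0.05
         else "decelerating" if acceleration < -0.05
         else "constant-speed")
print("class:", label)
```
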

  10. Video segmentation and classification for content-based storage and retrieval using motion vectors

    Science.gov (United States)

    Fernando, W. A. C.; Canagarajah, Cedric N.; Bull, David R.

    1998-12-01

    Video parsing is an important step in content-based indexing techniques, where the input video is decomposed into segments with uniform content. In video parsing detection of scene changes is one of the approaches widely used for extracting key frames from the video sequence. In this paper, an algorithm, based on motion vectors, is proposed to detect sudden scene changes and gradual scene changes (camera movements such as panning, tilting and zooming). Unlike some of the existing schemes, the proposed scheme is capable of detecting both sudden and gradual changes in uncompressed, as well as, compressed domain video. It is shown that the resultant motion vector can be used to identify and classify gradual changes due to camera movements. Results show that algorithm performed as well as the histogram-based schemes, with uncompressed video. The performance of the algorithm was also investigated with H.263 compressed video. The detection and classification of both sudden and gradual scene changes was successfully demonstrated.
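
    A toy version of the motion-vector test can be sketched as follows. The thresholds and the coherence heuristic (resultant vector length relative to total magnitude, which is high when all blocks move the same way, as in panning) are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def classify_transition(mv, sudden_thresh=8.0, coherence_thresh=0.8):
    """Classify one frame's block motion vectors (shape [N, 2]).
    A large mean magnitude suggests a sudden cut; coherent direction
    across blocks suggests a gradual camera move (pan/tilt/zoom)."""
    mag = np.linalg.norm(mv, axis=1)
    if mag.mean() > sudden_thresh:
        return "sudden cut"
    resultant = np.linalg.norm(mv.sum(axis=0))   # length of summed vector
    coherence = resultant / (mag.sum() + 1e-9)   # 1.0 = perfectly aligned
    if coherence > coherence_thresh and mag.mean() > 1.0:
        return "gradual (camera movement)"
    return "no change"

rng = np.random.default_rng(0)
print(classify_transition(rng.normal(0, 0.4, (64, 2))))   # static scene
print(classify_transition(np.tile([3.0, 0.5], (64, 1))))  # panning
print(classify_transition(rng.normal(0, 12.0, (64, 2))))  # scene cut
```
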

  11. MEDICAL DIAGNOSIS CLASSIFICATION USING MIGRATION BASED DIFFERENTIAL EVOLUTION ALGORITHM

    Directory of Open Access Journals (Sweden)

    Htet Thazin Tike Thein

    2014-12-01

    Full Text Available Constructing a classification model is important in machine learning for a particular task. A classification process involves assigning objects into predefined groups or classes based on a number of observed attributes related to those objects. The artificial neural network is one classification algorithm which can be used in many application areas. This paper investigates the potential of applying the feed-forward neural network architecture to the classification of medical datasets. A migration-based differential evolution algorithm (MBDE) is chosen and applied to the feed-forward neural network to enhance the learning process, and the network learning is validated in terms of convergence rate and classification accuracy. In this paper, the MBDE algorithm with various migration policies is proposed for classification problems using medical diagnosis.

  12. Development of a classification scheme for disease-related enzyme information

    Directory of Open Access Journals (Sweden)

    Söhngen Carola

    2011-08-01

    Full Text Available Abstract Background BRENDA (BRaunschweig ENzyme DAtabase, http://www.brenda-enzymes.org) is a major resource for enzyme related information. First and foremost, it provides data which are manually curated from the primary literature. DRENDA (Disease RElated ENzyme information DAtabase) complements BRENDA with a focus on the automatic search and categorization of enzyme and disease related information from titles and abstracts of primary publications. In a two-step procedure DRENDA makes use of text mining and machine learning methods. Results Currently enzyme and disease related references are biannually updated as part of the standard BRENDA update. 910,897 relations of EC-numbers and diseases were extracted from titles or abstracts and are included in the second release in 2010. The enzyme and disease entity recognition has been successfully enhanced by a further relation classification via machine learning. The classification step has been evaluated by a 5-fold cross validation and achieves an F1 score between 0.738 ± 0.033 and 0.802 ± 0.032 depending on the categories and pre-processing procedures. In the eventual DRENDA content every category reaches a classification specificity of at least 96.7% and a precision that ranges from 86-98% in the highest confidence level, and 64-83% for the smallest confidence level associated with higher recall. Conclusions The DRENDA processing chain analyses PubMed, locates references with disease-related information on enzymes and categorises their focus according to the categories causal interaction, therapeutic application, diagnostic usage and ongoing research. The categorisation gives an impression of the focus of the located references. Thus, the relation categorisation can facilitate orientation within the rapidly growing number of references with impact on diseases and enzymes. The DRENDA information is available as additional information in BRENDA.
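
    The evaluation protocol described, a supervised relation classifier scored by 5-fold cross-validation with F1, can be sketched generically with scikit-learn. The data here are synthetic stand-ins; DRENDA's actual features come from PubMed titles and abstracts, and the classifier choice below is not the paper's.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the relation-classification step.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=8, random_state=0)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5, scoring="f1")
print(f"5-fold F1: {scores.mean():.3f} +/- {scores.std():.3f}")
```
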

  13. A New Wavelet-Based Document Image Segmentation Scheme

    Institute of Scientific and Technical Information of China (English)

    赵健; 李道京; 俞卞章; 耿军平

    2002-01-01

    Document image segmentation is very useful for printing, faxing and data processing. An algorithm is developed for segmenting and classifying document images. The feature used for classification is based on the histogram distribution pattern of the different image classes. An important attribute of the algorithm is the use of a wavelet correlation image to enhance the raw image's pattern, which improves classification accuracy. In this paper the document image is divided into four types: background, photo, text and graph. Firstly, the document image background is separated by a conventional method; secondly, the remaining three image types are distinguished by their typical histograms. To make the histogram features clearer, the HH wavelet subimage at each resolution is added to the raw image at that resolution. Finally, the photo, text and graph regions are separated according to how well the feature fits a Laplacian distribution, measured by χ² and L. Simulations show that classification accuracy is significantly improved. Comparison with related work shows that our algorithm provides both lower classification error rates and better visual results.

  14. A new classification algorithm based on RGH-tree search

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    In this paper, we put forward a new classification algorithm based on RGH-tree search and perform a classification analysis and comparison study. This algorithm can save computing resources and increase classification efficiency. Experiments show that the algorithm performs well on three-dimensional multi-class data. We find that the algorithm has better generalization ability for small training sets and large test sets.

  15. A New Symbol Synchronization Scheme for Cyclic Prefix Based Systems

    Institute of Scientific and Technical Information of China (English)

    KUANG Yu-jun; TENG Yong; YIN Chang-chuan; HAO Jian-jun; YUE Guang-xin

    2003-01-01

    This contribution proposes a new symbol synchronization scheme for cyclic prefix based modulation systems, which is disclosed in Ref.[16]. The proposed algorithm involves two steps. By using short-time Fourier transform, ISI-free intervals are estimated from time-frequency spectrum of the received signal, and then an optimum symbol start time is obtained. Computer simulation results show that the algorithm is very robust, and outperforms those based upon time-domain correlations.

  16. Enhanced ensemble-based 4DVar scheme for data assimilation

    OpenAIRE

    Yang, Yin; Robinson, Cordelia; Heitz, Dominique; Mémin, Etienne

    2015-01-01

    Ensemble-based optimal control schemes combine the components of ensemble Kalman filters and variational data assimilation (4DVar). They are attractive because they are easier to implement than 4DVar. In this paper, we evaluate a modified version of an ensemble-based optimal control strategy for image data assimilation. This modified method is assessed with a Shallow Water model combined with synthetic data and original incomplete experimental depth sensor observations. ...

  17. An Extensible, Kinematically-Based Gesture Annotation Scheme

    OpenAIRE

    Martell, Craig H.

    2002-01-01

    Chapter 1 in the book: Advances in Natural Multimodal Dialogue Systems Annotated corpora have played a critical role in speech and natural language research; and, there is an increasing interest in corpora-based research in sign language and gesture as well. We present a non-semantic, geometrically-based annotation scheme, FORM, which allows an annotator to capture the kinematic information in a gesture just from videos of speakers. In addition, FORM stores this gestural in...

  18. Deflection routing scheme for GMPLS-based OBS networks

    DEFF Research Database (Denmark)

    Eid, Arafat; Mahmood, Waqar; Alomar, Anwar;

    2010-01-01

    Integrating the Generalized Multi-Protocol Label Switching (GMPLS) framework into an Optical Burst Switching (OBS) Control Plane is a promising solution to alleviating most OBS performance and design issues. However, implementing the already proposed OBS deflection routing schemes is not applicable in such an integrated solution. This is due to the existence of already established Label Switched Paths (LSPs) between edge nodes in a GMPLS-based OBS network, which guide the Data Burst Headers (DBHs) through the network. In this paper we propose a novel deflection routing scheme which can be implemented in a GMPLS-based OBS Control Plane. In this scheme, deflection routes or LSPs are designed and pre-established for the whole network. The ingress nodes are responsible for enabling DBHs for deflection at contending core ports prior to DBH transmission. Moreover, we propose an object extension...

  19. Classification of types of stuttering symptoms based on brain activity.

    Directory of Open Access Journals (Sweden)

    Jing Jiang

    Full Text Available Among the non-fluencies seen in speech, some are more typical (MT) of stuttering speakers, whereas others are less typical (LT) and are common to both stuttering and fluent speakers. No neuroimaging work has evaluated the neural basis for grouping these symptom types. Another long-debated issue is which type (LT or MT) whole-word repetitions (WWR) should be placed in. In this study, a sentence completion task was performed by twenty stuttering patients who were scanned using an event-related design. This task elicited stuttering in these patients. Each stuttered trial from each patient was sorted into the MT or LT types, with WWR put aside. Pattern classification was employed to train a patient-specific single-trial model to automatically classify each trial as MT or LT using the corresponding fMRI data. This model was then validated by using test data that were independent of the training data. In a subsequent analysis, the classification model, just established, was used to determine which type the WWR should be placed in. The results showed that the LT and the MT could be separated with high accuracy based on their brain activity. The brain regions that made the most contribution to the separation of the types were: the left inferior frontal cortex and bilateral precuneus, both of which showed higher activity in the MT than in the LT; and the left putamen and right cerebellum, which showed the opposite activity pattern. The results also showed that the brain activity for WWR was more similar to that of the LT and fluent speech than to that of the MT. These findings provide a neurological basis for separating the MT and the LT types, and support the widely-used MT/LT symptom grouping scheme. In addition, WWR play a similar role as the LT, and thus should be placed in the LT type.

  20. Opposition-Based Discrete PSO Using Natural Encoding for Classification Rule Discovery

    Directory of Open Access Journals (Sweden)

    Naveed Kazim Khan

    2012-11-01

    Full Text Available In this paper we present a new Discrete Particle Swarm Optimization approach to induce rules from discrete data. The proposed algorithm, called Opposition-based Natural Discrete PSO (ONDPSO), initializes its population by taking into account the discrete nature of the data. Particles are encoded using a Natural Encoding scheme. Each member of the population updates its position iteratively on the basis of a newly designed position update rule. Opposition-based learning is implemented in the optimization process. The encoding scheme and position update rule used by the algorithm allow individual terms corresponding to different attributes within the rule's antecedent to be a disjunction of the values of those attributes. The performance of the proposed algorithm is evaluated against seven different datasets using a tenfold testing scheme. The achieved median accuracy is compared against various evolutionary and non-evolutionary classification techniques. The algorithm produces promising results by creating highly accurate and precise rules for each dataset.

  1. WekaPyScript: Classification, Regression, and Filter Schemes for WEKA Implemented in Python

    Directory of Open Access Journals (Sweden)

    Christopher Beckham

    2016-08-01

    Full Text Available WekaPyScript is a package for the machine learning software WEKA that allows learning algorithms and preprocessing methods for classification and regression to be written in Python, as opposed to WEKA’s implementation language, Java. This opens up WEKA to Python's machine learning and scientific computing ecosystem. Furthermore, due to Python’s minimalist syntax, learning algorithms and preprocessing methods can be prototyped easily and utilised from within WEKA. WekaPyScript works by running a local Python server using the host’s installation of Python; as a result, any libraries installed in the host installation can be leveraged when writing a script for WekaPyScript. Three example scripts (two learning algorithms and one preprocessing method) are presented.

  2. Preliminary Research on Grassland Fine-classification Based on MODIS

    International Nuclear Information System (INIS)

    Grassland ecosystems are important for climate regulation and for soil and water conservation. Research on grassland monitoring methods can provide an effective reference for grassland resource investigation. In this study, we used a vegetation index method for grassland classification. Because China spans several climate types, we used China's Main Climate Zone Maps to divide the study region into four climate zones. Based on the grassland classification system of the first nation-wide grass resource survey in China, we established a new grassland classification system suitable for this research. We used MODIS images as the basic data resource and applied an expert-classifier method to perform the grassland classification. Based on the 1:1,000,000 Grassland Resource Map of China, we obtained the basic distribution of all grassland types and selected 20 samples evenly distributed in each type, then used NDVI/EVI products to summarize the spectral features of the different grassland types. Finally, we introduced auxiliary classification data such as elevation, accumulated temperature (AT), humidity index (HI) and rainfall. A nation-wide grassland classification map of China was produced by merging the grassland maps of the different climate zones. The overall classification accuracy is 60.4%. The result indicates that an expert classifier is appropriate for nation-wide grassland classification, but the classification accuracy needs to be improved.
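
    The expert-classifier idea, rule-based labelling over NDVI plus auxiliary layers such as elevation and accumulated temperature, can be sketched as below. All class names and thresholds are invented placeholders, not the study's rules.

```python
import numpy as np

def classify_grassland(ndvi, elevation, accum_temp):
    """Toy expert-classifier rules over per-pixel layers (NDVI, elevation
    in metres, accumulated temperature in degree-days). Class names and
    thresholds are hypothetical."""
    out = np.full(ndvi.shape, "other", dtype=object)
    grass = ndvi > 0.2                     # vegetation-index screening
    out[grass & (elevation > 3500)] = "alpine meadow"
    out[grass & (elevation <= 3500) & (accum_temp < 2000)] = "temperate steppe"
    out[grass & (elevation <= 3500) & (accum_temp >= 2000)] = "warm grassland"
    return out

ndvi = np.array([0.1, 0.5, 0.4, 0.6])
elevation = np.array([500, 4000, 1000, 800])
accum_temp = np.array([3000, 1500, 1800, 2500])
print(classify_grassland(ndvi, elevation, accum_temp))
```
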

  3. A novel transferable individual tree crown delineation model based on Fishing Net Dragging and boundary classification

    Science.gov (United States)

    Liu, Tao; Im, Jungho; Quackenbush, Lindi J.

    2015-12-01

    This study provides a novel approach to individual tree crown delineation (ITCD) using airborne Light Detection and Ranging (LiDAR) data in dense natural forests using two main steps: crown boundary refinement based on a proposed Fishing Net Dragging (FiND) method, and segment merging based on boundary classification. FiND starts with approximate tree crown boundaries derived using a traditional watershed method with Gaussian filtering and refines these boundaries using an algorithm that mimics how a fisherman drags a fishing net. Random forest machine learning is then used to classify boundary segments into two classes: boundaries between trees and boundaries between branches that belong to a single tree. Three groups of LiDAR-derived features (two from the pseudo waveform generated along the crown boundaries and one from a canopy height model (CHM)) were used in the classification. The proposed ITCD approach was tested using LiDAR data collected over a mountainous region in the Adirondack Park, NY, USA. Overall accuracy of boundary classification was 82.4%. Features derived from the CHM were generally more important in the classification than the features extracted from the pseudo waveform. A comprehensive accuracy assessment scheme for ITCD was also introduced by considering both the area of crown overlap and crown centroids. Accuracy assessment using this new scheme shows the proposed ITCD achieved overall accuracies of 74% and 78% for deciduous and mixed forest, respectively.
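
    The boundary-classification step (a random forest over per-segment features) can be sketched with scikit-learn. The three synthetic features loosely mirror the paper's feature groups, two pseudo-waveform statistics and one CHM-derived statistic, but the distributions and values are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200  # segments per class
# Hypothetical features per boundary segment: two pseudo-waveform
# statistics and one CHM statistic (e.g. mean height drop across it).
between_trees = np.column_stack([rng.normal(0.3, 0.1, n),
                                 rng.normal(0.4, 0.1, n),
                                 rng.normal(5.0, 1.0, n)])
between_branches = np.column_stack([rng.normal(0.5, 0.1, n),
                                    rng.normal(0.6, 0.1, n),
                                    rng.normal(1.0, 1.0, n)])
X = np.vstack([between_trees, between_branches])
y = np.repeat([1, 0], n)  # 1 = boundary between trees

rf = RandomForestClassifier(n_estimators=200, oob_score=True,
                            random_state=0).fit(X, y)
print("OOB accuracy:", round(rf.oob_score_, 3))
print("feature importances:", rf.feature_importances_.round(3))
```

The out-of-bag score gives a quick internal accuracy estimate, and the importances indicate which feature group drives the separation, mirroring the paper's observation that CHM features dominated.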

  4. AN OBJECT-BASED METHOD FOR CHINESE LANDFORM TYPES CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Ding

    2016-06-01

    Full Text Available Landform classification is a necessary task for various fields of landscape and regional planning, for example landscape evaluation, erosion studies and hazard prediction. This study proposes an improved object-based classification for Chinese landform types using the factor importance analysis of random forest and the gray-level co-occurrence matrix (GLCM). In this research, based on the 1 km DEM of China, the combination of terrain factors extracted from the DEM is selected by correlation analysis and Sheffield's entropy method. A random forest classification tree is applied to evaluate the importance of the terrain factors, which are used as multi-scale segmentation thresholds. Then the GLCM is computed for the knowledge base of the classification. The classification result was checked using the 1:4,000,000 Chinese Geomorphological Map as reference. The overall classification accuracy of the proposed method is 5.7% higher than ISODATA unsupervised classification, and 15.7% higher than the traditional object-based classification method.

  5. A Curriculum-Based Classification System for Community Colleges.

    Science.gov (United States)

    Schuyler, Gwyer

    2003-01-01

    Proposes and tests a community college classification system based on curricular characteristics and their association with institutional characteristics. Seeks readily available data correlates to represent percentage of a college's course offerings that are in the liberal arts. A simple two-category classification system using total enrollment…

  6. An Object-Based Method for Chinese Landform Types Classification

    Science.gov (United States)

    Ding, Hu; Tao, Fei; Zhao, Wufan; Na, Jiaming; Tang, Guo'an

    2016-06-01

    Landform classification is a necessary task for various fields of landscape and regional planning, for example for landscape evaluation, erosion studies, hazard prediction, et al. This study proposes an improved object-based classification for Chinese landform types using the factor importance analysis of random forest and the gray-level co-occurrence matrix (GLCM). In this research, based on 1km DEM of China, the combination of the terrain factors extracted from DEM are selected by correlation analysis and Sheffield's entropy method. Random forest classification tree is applied to evaluate the importance of the terrain factors, which are used as multi-scale segmentation thresholds. Then the GLCM is conducted for the knowledge base of classification. The classification result was checked by using the 1:4,000,000 Chinese Geomorphological Map as reference. And the overall classification accuracy of the proposed method is 5.7% higher than ISODATA unsupervised classification, and 15.7% higher than the traditional object-based classification method.

  7. A Color Image Digital Watermarking Scheme Based on SOFM

    CERN Document Server

    Anitha, J

    2011-01-01

    Digital watermarking techniques have been widely researched to solve some important issues in the digital world, such as copyright protection, copy protection and content authentication. Several robust watermarking schemes based on vector quantization (VQ) have been presented. In this paper, we present a new digital image watermarking method based on an SOFM vector quantizer for color images. This method utilizes the codebook partition technique, in which the watermark bit is embedded into the selected VQ encoded block. The main feature of this scheme is that the watermark exists both in the VQ compressed image and in the reconstructed image. The watermark extraction can be performed without the original image. The watermark is hidden inside the compressed image, so much transmission time and storage space can be saved when the compressed data are transmitted over the Internet. Simulation results demonstrate that the proposed method is robust to various image processing operations without sacrificing compression performance or computational speed.

  8. A Color Image Digital Watermarking Scheme Based on SOFM

    Directory of Open Access Journals (Sweden)

    J. Anitha

    2010-09-01

    Full Text Available Digital watermarking techniques have been widely researched to solve some important issues in the digital world, such as copyright protection, copy protection and content authentication. Several robust watermarking schemes based on vector quantization (VQ) have been presented. In this paper, we present a new digital image watermarking method based on an SOFM vector quantizer for color images. This method utilizes the codebook partition technique, in which the watermark bit is embedded into the selected VQ encoded block. The main feature of this scheme is that the watermark exists both in the VQ compressed image and in the reconstructed image. The watermark extraction can be performed without the original image. The watermark is hidden inside the compressed image, so much transmission time and storage space can be saved when the compressed data are transmitted over the Internet. Simulation results demonstrate that the proposed method is robust to various image processing operations without sacrificing compression performance or computational speed.
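The codebook-partition idea can be sketched as follows: to embed bit b in an image block, quantize it using only codewords from partition S_b, so the bit survives both in the VQ index stream and in the reconstructed image and can be extracted blindly. The codebook and partition below are toy stand-ins, not a trained SOFM codebook.

```python
# Hedged sketch of watermark embedding via codebook partition in VQ.
import numpy as np

codebook = np.array([[10., 10.], [12., 12.], [50., 50.], [52., 52.]])
partition = {0: [0, 2], 1: [1, 3]}   # index sets S0 and S1 (illustrative split)

def embed(block, bit):
    """Quantize block with the nearest codeword drawn only from S_bit."""
    cands = partition[bit]
    d = [np.sum((codebook[i] - block) ** 2) for i in cands]
    return cands[int(np.argmin(d))]

def extract(index):
    """Blind extraction: the partition the index falls in IS the bit."""
    return 0 if index in partition[0] else 1

idx = embed(np.array([11., 11.]), bit=1)
print(idx, extract(idx))   # bit recovered without the original image
```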

  9. Enhancing Community Detection By Affinity-based Edge Weighting Scheme

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Andy [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sanders, Geoffrey [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Henson, Van [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Vassilevski, Panayot [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-05

    Community detection refers to an important graph analytics problem of finding a set of densely-connected subgraphs in a graph and has gained a great deal of interest recently. The performance of current community detection algorithms is limited by an inherent constraint of unweighted graphs that offer very little information on their internal community structures. In this paper, we propose a new scheme to address this issue that weights the edges in a given graph based on recently proposed vertex affinity. The vertex affinity quantifies the proximity between two vertices in terms of their clustering strength, and therefore, it is ideal for graph analytics applications such as community detection. We also demonstrate that the affinity-based edge weighting scheme can improve the performance of community detection algorithms significantly.
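The edge-weighting idea can be sketched by scoring each edge by how strongly its endpoints cluster together. As a stand-in for the paper's vertex affinity measure, the sketch uses Jaccard similarity of closed neighborhoods; a community detection algorithm would then consume the weighted graph.

```python
# Hedged sketch: affinity-style edge weighting on a toy adjacency structure.
# Jaccard similarity of closed neighborhoods approximates "clustering strength".
adj = {
    'a': {'b', 'c'}, 'b': {'a', 'c'}, 'c': {'a', 'b', 'd'},
    'd': {'c', 'e'}, 'e': {'d', 'f'}, 'f': {'e'},
}

def affinity(u, v):
    nu, nv = adj[u] | {u}, adj[v] | {v}   # closed neighborhoods
    return len(nu & nv) / len(nu | nv)

weights = {frozenset((u, v)): affinity(u, v)
           for u in adj for v in adj[u] if u < v}
# The intra-community edge (a, b) outweighs the bridge edge (c, d).
print(weights[frozenset(('a', 'b'))], weights[frozenset(('c', 'd'))])
```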

  10. Functions and Design Scheme of Tibet High Altitude Test Base

    Institute of Scientific and Technical Information of China (English)

    Yu Yongqing; Guo Jian; Yin Yu; Mao Yan; Li Guangfan; Fan Jianbin; Lu Jiayu; Su Zhiyi; Li Peng; Li Qingfeng; Liao Weiming; Zhou Jun

    2010-01-01

    The functional orientation of the Tibet High Altitude Test Base, subordinated to the State Grid Corporation of China (SGCC), is to serve power transmission projects in high altitude areas, especially to provide technical support for southwestern hydropower delivery projects by UHVDC transmission and the Qinghai-Tibet grid interconnection project. This paper presents the matters considered during siting and planning, the functions, design scheme, main performances and parameters of the test facilities, as well as the tests and research tasks already carried out.

  11. Kinetic energy decomposition scheme based on information theory.

    Science.gov (United States)

    Imamura, Yutaka; Suzuki, Jun; Nakai, Hiromi

    2013-12-15

    We propose a novel kinetic energy decomposition analysis based on information theory. Since the Hirshfeld partitioning for electron densities can be formulated in terms of the Kullback-Leibler information deficiency, a similar partitioning for kinetic energy densities is proposed here. Numerical assessments confirm that the current kinetic energy decomposition scheme provides reasonable chemical pictures for ionic and covalent molecules, and can also estimate atomic energies using a correction with virial ratios.
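The Hirshfeld (stockholder) partitioning underlying this scheme can be sketched numerically on a 1-D grid: weights w_A = rho_A0 / (rho_A0 + rho_B0) built from free-atom densities split the molecular density (and, with the same weights, a kinetic-energy density) between atoms. Gaussians stand in for the free-atom densities; this is an illustration of the partitioning idea, not the paper's implementation.

```python
# Hedged sketch: Hirshfeld stockholder partitioning of a toy diatomic density.
import numpy as np

x = np.linspace(-5, 5, 2001)
dx = x[1] - x[0]
rho_A0 = np.exp(-(x + 1.0) ** 2)    # stand-in free-atom density of A
rho_B0 = np.exp(-(x - 1.0) ** 2)    # stand-in free-atom density of B
rho    = rho_A0 + rho_B0            # toy "molecular" density

w_A = rho_A0 / (rho_A0 + rho_B0)    # stockholder weights
N_A = np.sum(w_A * rho) * dx        # population assigned to atom A
N   = np.sum(rho) * dx
print(N_A, N)                       # symmetry gives N_A = N / 2
```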

  12. Functions and Design Scheme of Tibet High Altitude Test Base

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The functional orientation of the Tibet High Altitude Test Base, subordinated to the State Grid Corporation of China (SGCC), is to serve power transmission projects in high altitude areas, especially to provide technical support for southwestern hydropower delivery projects by UHVDC transmission and the Qinghai-Tibet grid interconnection project. This paper presents the matters considered during siting and planning, the functions, design scheme, main performances and parameters of the test facilities, as well as the tests and research tasks already carried out.

  13. Adaptive Mesh Redistribution Method Based on Godunov's Scheme

    OpenAIRE

    Azarenok, Boris N.; Ivanenko, Sergey A.; Tang, Tao

    2003-01-01

    In this work, a detailed description of an efficient adaptive mesh redistribution algorithm based on Godunov's scheme is presented. After each mesh iteration a second-order finite-volume flow solver is used to update the flow parameters at the new time level directly, without using interpolation. Numerical experiments are performed to demonstrate the efficiency and robustness of the proposed adaptive mesh algorithm in one and two dimensions.

  14. Revisiting Quantum Authentication Scheme Based on Entanglement Swapping

    Science.gov (United States)

    Naseri, Mosayeb

    2016-05-01

    The crucial issue for a quantum communication protocol is its security. In this paper, the security of the Quantum Authentication Scheme Based on Entanglement Swapping proposed by Penghao et al. (Int J Theor Phys., doi: 10.1007/s10773-015-2662-7) is reanalyzed. It is shown that the original protocol does not accomplish the task of quantum authentication and communication securely. Furthermore, a simple improvement on the protocol is proposed.

  15. Fast Wavelet-Based Visual Classification

    CERN Document Server

    Yu, Guoshen

    2008-01-01

    We investigate a biologically motivated approach to fast visual classification, directly inspired by the recent work of Serre et al. Specifically, trading off biological accuracy for computational efficiency, we explore using wavelet and grouplet-like transforms to parallel the tuning of visual cortex V1 and V2 cells, alternated with max operations to achieve scale and translation invariance. A feature selection procedure is applied during learning to accelerate recognition. We introduce a simple attention-like feedback mechanism, significantly improving recognition and robustness in multiple-object scenes. In experiments, the proposed algorithm achieves or exceeds state-of-the-art success rate on object recognition, texture and satellite image classification, language identification and sound classification.

  16. Cell morphology-based classification of red blood cells using holographic imaging informatics.

    Science.gov (United States)

    Yi, Faliu; Moon, Inkyu; Javidi, Bahram

    2016-06-01

    We present methods that automatically select a linear or nonlinear classifier for red blood cell (RBC) classification by analyzing the equality of the covariance matrices in Gabor-filtered holographic images. First, the phase images of the RBCs are numerically reconstructed from their holograms, which are recorded using off-axis digital holographic microscopy (DHM). Second, each RBC is segmented using a marker-controlled watershed transform algorithm and the inner part of the RBC is identified and analyzed. Third, the Gabor wavelet transform is applied to the segmented cells to extract a series of features, which then undergo a multivariate statistical test to evaluate the equality of the covariance matrices of the different shapes of the RBCs using selected features. When these covariance matrices are not equal, a nonlinear classification scheme based on quadratic functions is applied; otherwise, a linear classification is applied. We used the stomatocyte, discocyte, and echinocyte RBC for classifier training and testing. Simulation results demonstrated that 10 of the 14 RBC features are useful in RBC classification. Experimental results also revealed that the covariance matrices of the three main RBC groups are not equal and that a nonlinear classification method has a much lower misclassification rate. The proposed automated RBC classification method has the potential for use in drug testing and the diagnosis of RBC-related diseases. PMID:27375953
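The classifier-selection rule can be sketched as follows: estimate each class's covariance matrix and fall back to a linear rule only when the covariances look equal. A simplified Box's M statistic stands in for the paper's multivariate test, and the threshold, classes, and synthetic data are illustrative only.

```python
# Hedged sketch: choose linear vs quadratic classification from covariance equality.
import numpy as np

rng = np.random.default_rng(1)
disco  = rng.multivariate_normal([0, 0], [[1.0, 0.0], [0.0, 1.0]], 300)
echino = rng.multivariate_normal([3, 3], [[4.0, 1.5], [1.5, 2.0]], 300)

def box_m(groups):
    """Simplified Box's M: compares log-dets of class vs pooled covariance."""
    ns = [len(g) - 1 for g in groups]
    covs = [np.cov(g.T) for g in groups]
    pooled = sum(n * c for n, c in zip(ns, covs)) / sum(ns)
    return sum(n * (np.log(np.linalg.det(pooled)) - np.log(np.linalg.det(c)))
               for n, c in zip(ns, covs))

M = box_m([disco, echino])
rule = 'quadratic' if M > 20.0 else 'linear'   # threshold illustrative only
print(M, rule)
```

With clearly unequal class covariances, the statistic is large and the quadratic rule is selected, mirroring the paper's finding for the three RBC groups.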

  17. Knowledge-Based Classification in Automated Soil Mapping

    Institute of Scientific and Technical Information of China (English)

    ZHOU BIN; WANG RENCHAO

    2003-01-01

    A machine-learning approach was developed for automated building of knowledge bases for soil resources mapping, using a classification tree to generate knowledge from training data. With this method, building a knowledge base for automated soil mapping was easier than using the conventional knowledge acquisition approach. The knowledge base built by the classification tree was used by the knowledge classifier to perform the soil type classification of Longyou County, Zhejiang Province, China using Landsat TM bi-temporal images and GIS data. To evaluate the performance of the resultant knowledge bases, the classification results were compared to an existing soil map based on a field survey. The accuracy assessment and analysis of the resultant soil maps suggested that the knowledge bases built by the machine-learning method were of good quality for mapping the distribution of soil classes over the study area.

  18. Shape classification based on singular value decomposition transform

    Institute of Scientific and Technical Information of China (English)

    SHAABAN Zyad; ARIF Thawar; BABA Sami; KREKOR Lala

    2009-01-01

    In this paper, a new shape classification system based on the singular value decomposition (SVD) transform using a nearest neighbour classifier is proposed. The gray scale image of the shape object was converted into a black and white image. The squared Euclidean distance transform was applied to the binary image to extract the boundary image of the shape. SVD transform features were extracted from the boundary of the object shapes. The proposed classification system based on SVD feature extraction was compared with a classifier based on moment invariants, using the nearest neighbour classifier in both cases. The experimental results showed the advantage of the proposed classification system.
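The pipeline can be sketched as follows: the leading singular values of a binary shape image act as a compact feature vector, and a 1-nearest-neighbour rule classifies a query shape against templates. The 8x8 toy masks below stand in for real boundary images, and the normalization choice is illustrative.

```python
# Hedged sketch: SVD singular values as shape features + 1-NN classification.
import numpy as np

def svd_features(img, k=4):
    s = np.linalg.svd(img.astype(float), compute_uv=False)
    return s[:k] / (s.sum() + 1e-12)      # scale-normalised leading values

square = np.zeros((8, 8)); square[2:6, 2:6] = 1   # rank-1 filled block
diag   = np.eye(8)                                # full-rank diagonal shape
templates = {'square': svd_features(square), 'diag': svd_features(diag)}

def classify(img):
    f = svd_features(img)
    return min(templates, key=lambda n: np.linalg.norm(templates[n] - f))

query = np.zeros((8, 8)); query[1:6, 1:6] = 1     # another square-like shape
print(classify(query))
```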

  19. Multiclass Classification Based on the Analytical Center of Version Space

    Institute of Scientific and Technical Information of China (English)

    ZENG Fanzi; QIU Zhengding; YUE Jianhai; LI Xiangqian

    2005-01-01

    The analytical center machine, based on the analytical center of version space, outperforms the support vector machine, especially when the version space is elongated or asymmetric. While the analytical center machine for binary classification is well understood, little is known about the corresponding multiclass classification. Moreover, the current multiclass method, "one versus all," needs to repeatedly construct classifiers to separate a single class from all the others, which leads to daunting computation and low classification efficiency; and though the multiclass support vector machine corresponds to a simple quadratic optimization, it is not very effective when the version space is asymmetric or elongated. Thus, a multiclass classification approach based on the analytical center of version space is proposed to address the above problems. Experiments on the wine recognition and glass identification datasets demonstrate the validity of the proposed approach.

  20. Behavior Based Social Dimensions Extraction for Multi-Label Classification.

    Directory of Open Access Journals (Sweden)

    Le Li

    Full Text Available Classification based on social dimensions is commonly used to handle the multi-label classification task in heterogeneous networks. However, traditional methods, which mostly rely on the community detection algorithms to extract the latent social dimensions, produce unsatisfactory performance when community detection algorithms fail. In this paper, we propose a novel behavior based social dimensions extraction method to improve the classification performance in multi-label heterogeneous networks. In our method, nodes' behavior features, instead of community memberships, are used to extract social dimensions. By introducing Latent Dirichlet Allocation (LDA) to model the network generation process, nodes' connection behaviors with different communities can be extracted accurately, which are applied as latent social dimensions for classification. Experiments on various public datasets reveal that the proposed method can obtain satisfactory classification results in comparison to other state-of-the-art methods on smaller social dimensions.

  1. Behavior Based Social Dimensions Extraction for Multi-Label Classification.

    Science.gov (United States)

    Li, Le; Xu, Junyi; Xiao, Weidong; Ge, Bin

    2016-01-01

    Classification based on social dimensions is commonly used to handle the multi-label classification task in heterogeneous networks. However, traditional methods, which mostly rely on the community detection algorithms to extract the latent social dimensions, produce unsatisfactory performance when community detection algorithms fail. In this paper, we propose a novel behavior based social dimensions extraction method to improve the classification performance in multi-label heterogeneous networks. In our method, nodes' behavior features, instead of community memberships, are used to extract social dimensions. By introducing Latent Dirichlet Allocation (LDA) to model the network generation process, nodes' connection behaviors with different communities can be extracted accurately, which are applied as latent social dimensions for classification. Experiments on various public datasets reveal that the proposed method can obtain satisfactory classification results in comparison to other state-of-the-art methods on smaller social dimensions. PMID:27049849

  2. Behavior Based Social Dimensions Extraction for Multi-Label Classification

    Science.gov (United States)

    Li, Le; Xu, Junyi; Xiao, Weidong; Ge, Bin

    2016-01-01

    Classification based on social dimensions is commonly used to handle the multi-label classification task in heterogeneous networks. However, traditional methods, which mostly rely on the community detection algorithms to extract the latent social dimensions, produce unsatisfactory performance when community detection algorithms fail. In this paper, we propose a novel behavior based social dimensions extraction method to improve the classification performance in multi-label heterogeneous networks. In our method, nodes’ behavior features, instead of community memberships, are used to extract social dimensions. By introducing Latent Dirichlet Allocation (LDA) to model the network generation process, nodes’ connection behaviors with different communities can be extracted accurately, which are applied as latent social dimensions for classification. Experiments on various public datasets reveal that the proposed method can obtain satisfactory classification results in comparison to other state-of-the-art methods on smaller social dimensions. PMID:27049849

  3. A Fuzzy Logic Based Sentiment Classification

    Directory of Open Access Journals (Sweden)

    J.I.Sheeba

    2014-07-01

    Full Text Available Sentiment classification aims to detect information such as opinions and explicit or implicit feelings expressed in text. Most existing approaches are able to detect either explicit or implicit expressions of sentiment in the text, but not both. The proposed framework detects both implicit and explicit expressions available in meeting transcripts. It classifies positive, negative and neutral words and also identifies the topic of the particular meeting transcript by using fuzzy logic. This paper aims to add some additional features for improving the classification method. The quality of the sentiment classification is improved using the proposed fuzzy logic framework, which includes features such as fuzzy rules and the fuzzy C-means algorithm. The quality of the output is evaluated using precision, recall and f-measure, and the fuzzy C-means clustering is measured in terms of purity and entropy. The data set was validated using 10-fold cross validation, and a 95% confidence interval between the accuracy values was observed. Finally, the proposed fuzzy logic method produced more than 85% accurate results, with a very low error rate compared to existing sentiment classification techniques.
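The fuzzy C-means step used in the framework can be sketched on 1-D sentiment scores: alternating updates of soft memberships and cluster centres with fuzzifier m = 2. The scores, cluster count, and iteration budget below are illustrative.

```python
# Hedged sketch: fuzzy C-means (m = 2) on toy 1-D sentiment scores.
import numpy as np

x = np.array([-0.9, -0.8, -0.7, 0.6, 0.7, 0.9])   # negative vs positive scores
c = np.array([-1.0, 1.0])                          # initial cluster centres

for _ in range(20):
    d = np.abs(x[:, None] - c[None, :]) + 1e-9                       # distances
    u = (1.0 / d ** 2) / (1.0 / d ** 2).sum(axis=1, keepdims=True)   # memberships
    c = (u ** 2 * x[:, None]).sum(axis=0) / (u ** 2).sum(axis=0)     # new centres

print(c)   # centres settle near the negative and positive score groups
```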

  4. A Novel User Authentication Scheme Based on QR-Code

    Directory of Open Access Journals (Sweden)

    Kuan-Chieh Liao

    2010-08-01

    Full Text Available User authentication is one of the fundamental procedures for ensuring secure communications and sharing system resources over an insecure public network channel. Thus, a simple and efficient authentication mechanism is required for securing the network system in the real environment. In general, the password-based authentication mechanism provides the basic capability to prevent unauthorized access. In particular, the purpose of the one-time password is to make it more difficult to gain unauthorized access to restricted resources. Instead of using a password file as in conventional authentication systems, many researchers have devoted effort to implementing various one-time password schemes using smart cards, time-synchronized tokens or short message service in order to reduce the risk of tampering and maintenance cost. However, these schemes are impractical because the hardware devices are far from ubiquitous or because of the infrastructure requirements. To remedy these weaknesses, the QR-code technique can be introduced into our one-time password authentication protocol. Unlike earlier approaches, the proposed scheme based on QR codes not only eliminates the usage of the password verification table, but is also a cost-effective solution, since most internet users already have mobile phones. For this reason, instead of carrying around a separate hardware token for each security domain, the handiness of the mobile phone makes our approach more practical and convenient.
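For concreteness, the kind of one-time password such schemes deliver can be sketched with an HOTP-style generator (RFC 4226), using only the standard library; encoding the resulting value as a QR code for the phone would be a separate step (e.g. with a QR library, omitted here), and the paper's actual OTP construction may differ.

```python
# Hedged sketch: HOTP one-time password per RFC 4226 (HMAC-SHA1 + dynamic truncation).
import hmac, hashlib, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    mac = hmac.new(secret, struct.pack('>Q', counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                               # dynamic truncation
    code = struct.unpack('>I', mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(hotp(b'12345678901234567890', 0))   # RFC 4226 test vector: 755224
```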

  5. An image encryption scheme based on quantum logistic map

    Science.gov (United States)

    Akhshani, A.; Akhavan, A.; Lim, S.-C.; Hassan, Z.

    2012-12-01

    The topic of quantum chaos has begun to draw increasing attention in recent years, although a satisfactory definition that differentiates it from its classical counterpart is not yet settled. Dissipative quantum maps can be characterized by sensitive dependence on initial conditions, like classical maps. Exploiting this property, an implementation of an image encryption scheme based on the quantum logistic map is proposed. The security and performance of the proposed image encryption scheme are analyzed using well-known methods. The results of the reliability analysis are encouraging, and it can be concluded that the proposed scheme is efficient and secure. The results of this study also suggest applying other quantum maps, such as the quantum standard map and the quantum baker map, in cryptography and other aspects of security and privacy.
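The chaotic-keystream idea can be sketched with the classical logistic map standing in for the paper's quantum logistic map (which has a more involved three-variable form): iterate the map from a secret seed, quantise the iterates into bytes, and XOR them with the pixel stream. Seed, parameter, and burn-in length are illustrative.

```python
# Hedged sketch: XOR image encryption with a logistic-map keystream.
import numpy as np

def keystream(seed, r, n, burn=100):
    x, out = seed, []
    for i in range(n + burn):
        x = r * x * (1.0 - x)               # logistic map iterate, stays in (0, 1)
        if i >= burn:
            out.append(int(x * 256) % 256)  # quantise iterate to a byte
    return np.array(out, dtype=np.uint8)

pixels = np.array([10, 200, 33, 90, 150], dtype=np.uint8)
ks = keystream(seed=0.3731, r=3.99, n=pixels.size)
cipher = pixels ^ ks
plain  = cipher ^ ks                         # XOR is its own inverse
print(plain)
```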

  6. About the Key Escrow Properties of Identity Based Encryption Schemes

    Directory of Open Access Journals (Sweden)

    Ruxandra Olimid

    2012-09-01

    Full Text Available IBE (Identity-Based Encryption) represents a type of public key encryption that allows a party to encrypt a message using the recipient's identity as the public key. The private keys needed for decryption are generated and distributed to each party by a KGC (Key Generation Center). The existence of such an entity in an IBE scheme allows, by construction, access to the encrypted information for parties other than the intended recipient: the KGC or any other entity that receives the cryptographic keys from the KGC may perform decryption. A system that permits other parties to have access to the private keys of the users is said to have key escrow abilities. The paper performs a brief analysis of the key escrow properties of IBE schemes and gives a practical example of a communication protocol that improves the key escrow capabilities.

  7. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  8. Multiresolution image fusion scheme based on fuzzy region feature

    Institute of Scientific and Technical Information of China (English)

    LIU Gang; JING Zhong-liang; SUN Shao-yuan

    2006-01-01

    This paper proposes a novel region-based image fusion scheme built on multiresolution analysis. The low-frequency band of the image's multiresolution representation is segmented into important regions, sub-important regions and background regions. The features of each region are used to determine the region's degree of membership in the multiresolution representation, and then to achieve a multiresolution representation of the fusion result. The final fused image is obtained by applying the inverse multiresolution transform. Experiments showed that the proposed image fusion method performs better than existing image fusion methods.

  9. Target searching based on modified implicit ROI encoding scheme

    Institute of Scientific and Technical Information of China (English)

    Bai Xu; Zhang Zhongzhao

    2008-01-01

    An EBCOT-based method is proposed to reduce the priority of background coefficients in the ROI code block without increasing algorithm complexity. The region of interest is encoded to a higher quality level than the background, so the target searching time in a video-guided penetrating missile can be shortened. Three kinds of coding schemes based on EBCOT are discussed. Experimental results demonstrate that the proposed method achieves higher compression efficiency, lower complexity, and good reconstructed ROI image quality at lower channel capacities.

  10. GSM-MRF based classification approach for real-time moving object detection

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Statistical and contextual information is typically used to detect moving regions in image sequences from a fixed camera. In this paper, we propose a fast and stable linear discriminant approach based on a Gaussian Single Model (GSM) and a Markov Random Field (MRF). The performance of GSM is analyzed first, and then two main improvements addressing the drawbacks of GSM are proposed: a background-model update scheme based on the latest filtered data, and a linear classification judgment rule based on the spatial-temporal features specified by the MRF. Experimental results show that the proposed method runs more rapidly and accurately than other methods.
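The Gaussian Single Model step can be sketched as one running Gaussian per pixel, updated from recent frames, with a deviation threshold flagging moving pixels; the MRF spatial smoothing of the paper is omitted, and the learning rate, threshold, and synthetic frames are illustrative.

```python
# Hedged sketch: per-pixel single-Gaussian background model + deviation test.
import numpy as np

rng = np.random.default_rng(0)
frames = rng.normal(100.0, 2.0, size=(50, 4, 4))    # static background clips
mu  = frames[0].copy()
var = np.full((4, 4), 4.0)
alpha = 0.05                                        # learning rate (illustrative)

for f in frames[1:]:
    mu  = (1 - alpha) * mu + alpha * f              # background mean update
    var = (1 - alpha) * var + alpha * (f - mu) ** 2 # background variance update

frame = frames[-1].copy()
frame[1, 1] = 180.0                                 # inject a "moving object" pixel
fg = np.abs(frame - mu) > 4.0 * np.sqrt(var)        # foreground test
print(fg.astype(int))
```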

  11. TENSOR MODELING BASED FOR AIRBORNE LiDAR DATA CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    N. Li

    2016-06-01

    Full Text Available Feature selection and description is a key factor in the classification of Earth observation data. In this paper a classification method based on tensor decomposition is proposed. First, multiple features are extracted from the raw LiDAR point cloud, and raster LiDAR images are derived by accumulating features or the “raw” data attributes. Then, the feature rasters of the LiDAR data are stored as a tensor, and tensor decomposition is used to select component features. This tensor representation keeps the initial spatial structure and ensures that the neighborhood is taken into account. Based on a small number of component features, a k-nearest-neighbour classification is applied.

  12. Tensor Modeling Based for Airborne LiDAR Data Classification

    Science.gov (United States)

    Li, N.; Liu, C.; Pfeifer, N.; Yin, J. F.; Liao, Z. Y.; Zhou, Y.

    2016-06-01

    Feature selection and description is a key factor in the classification of Earth observation data. In this paper a classification method based on tensor decomposition is proposed. First, multiple features are extracted from the raw LiDAR point cloud, and raster LiDAR images are derived by accumulating features or the "raw" data attributes. Then, the feature rasters of the LiDAR data are stored as a tensor, and tensor decomposition is used to select component features. This tensor representation keeps the initial spatial structure and ensures that the neighborhood is taken into account. Based on a small number of component features, a k-nearest-neighbour classification is applied.

  13. Cardiac Arrhythmias Classification Method Based on MUSIC, Morphological Descriptors, and Neural Network

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available An electrocardiogram (ECG) beat classification scheme based on the multiple signal classification (MUSIC) algorithm, morphological descriptors, and neural networks is proposed for discriminating nine ECG beat types. These are normal, fusion of ventricular and normal, fusion of paced and normal, left bundle branch block, right bundle branch block, premature ventricular contraction, atrial premature contraction, paced beat, and ventricular flutter. ECG signal samples from the MIT-BIH arrhythmia database are used to evaluate the scheme. The MUSIC algorithm is used to calculate the pseudospectrum of the ECG signals. The low-frequency samples are picked to have the most valuable heartbeat information. These samples, along with two morphological descriptors, which deliver the characteristics and features of all parts of the heart, form an input feature vector. This vector is used for the initial training of a classifier neural network. The neural network is designed to have nine sample outputs which constitute the nine beat types. Two neural network schemes, namely the multilayered perceptron (MLP) neural network and a probabilistic neural network (PNN), are employed. The experimental results achieved a promising accuracy of 99.03% for classifying the beat types using the MLP neural network. In addition, our scheme recognizes the NORMAL class with 100% accuracy and never misclassifies any other classes as NORMAL.

  14. Cardiac Arrhythmias Classification Method Based on MUSIC, Morphological Descriptors, and Neural Network

    Science.gov (United States)

    Naghsh-Nilchi, Ahmad R.; Kadkhodamohammadi, A. Rahim

    2009-12-01

    An electrocardiogram (ECG) beat classification scheme based on multiple signal classification (MUSIC) algorithm, morphological descriptors, and neural networks is proposed for discriminating nine ECG beat types. These are normal, fusion of ventricular and normal, fusion of paced and normal, left bundle branch block, right bundle branch block, premature ventricular contraction, atrial premature contraction, paced beat, and ventricular flutter. ECG signal samples from MIT-BIH arrhythmia database are used to evaluate the scheme. MUSIC algorithm is used to calculate pseudospectrum of ECG signals. The low-frequency samples are picked to have the most valuable heartbeat information. These samples along with two morphological descriptors, which deliver the characteristics and features of all parts of the heart, form an input feature vector. This vector is used for the initial training of a classifier neural network. The neural network is designed to have nine sample outputs which constitute the nine beat types. Two neural network schemes, namely multilayered perceptron (MLP) neural network and a probabilistic neural network (PNN), are employed. The experimental results achieved a promising accuracy of 99.03% for classifying the beat types using MLP neural network. In addition, our scheme recognizes NORMAL class with 100% accuracy and never misclassifies any other classes as NORMAL.

  15. Dynamic Symmetric Key Mobile Commerce Scheme Based on Self-Verified Mechanism

    Directory of Open Access Journals (Sweden)

    Jiachen Yang

    2014-01-01

    Full Text Available Considering the security and efficiency of mobile e-commerce, the authors summarize the advantages and disadvantages of several related schemes, especially the self-verified mobile payment scheme based on the elliptic curve cryptosystem (ECC), and then propose a new type of dynamic symmetric key mobile commerce scheme based on a self-verified mechanism. The authors analyze the basic algorithm based on self-verified mechanisms, detail the complete transaction process of the proposed scheme, and analyze the payment scheme with respect to security and efficiency. The analysis shows that the proposed scheme not only meets the efficiency requirements of mobile electronic payment, but also takes security into account. The user confirmation mechanism at the end of the proposed scheme further strengthens its security. In brief, the proposed scheme is more efficient and practical than most existing schemes.

  16. A generalized procedure for constructing an upwind based TVD scheme

    Science.gov (United States)

    Liou, Meng-Sing

    1987-01-01

    A generalized formulation for constructing second- and higher-order accurate TVD (total variation diminishing) schemes is presented. A given scheme is made TVD by limiting antidiffusive flux differences with certain linear functions, so-called limiters. The general idea of the formulation, together with a mathematical proof that it satisfies Harten's TVD conditions, is shown by applying the Lax-Wendroff method to scalar nonlinear equations and to a constant-coefficient system of conservation laws. For the system of equations, several definitions of the argument used in the limiter function are derived and their performance in numerical experiments is presented. The formulation is then extended to nonlinear systems. It is demonstrated that the present procedure can easily convert existing central or upwind, second- or higher-order differencing schemes so that they preserve monotonicity and yield physically admissible solutions. The formulation is simple both mathematically and numerically; matrix-vector multiplication and a Riemann solver are both avoided. Although the notion of TVD is based on the initial value problem, the formulation is also applied to the steady Euler equations.
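
    As a minimal illustration of the limiting idea described above (not the paper's generalized formulation), here is a minmod-limited Lax-Wendroff step for scalar linear advection; the grid, CFL number, and initial pulse are arbitrary choices:

```python
import numpy as np

def minmod(a, b):
    """Minmod limiter: zero at extrema, otherwise the smaller-magnitude slope."""
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def tvd_step(u, nu):
    """One step of a minmod-limited Lax-Wendroff scheme for u_t + a u_x = 0.

    nu = a*dt/dx (CFL number, 0 < nu <= 1, a > 0 assumed), periodic boundaries.
    """
    du = np.roll(u, -1) - u                 # forward differences u_{i+1} - u_i
    s = minmod(du, np.roll(du, 1))          # limited anti-diffusive difference
    # Upwind flux plus limited Lax-Wendroff correction (flux at i+1/2, scaled by a).
    f = u + 0.5 * (1.0 - nu) * s
    return u - nu * (f - np.roll(f, 1))

# Advect a square pulse once around a periodic domain: no new extrema appear.
x = np.linspace(0, 1, 100, endpoint=False)
u = np.where((x > 0.2) & (x < 0.4), 1.0, 0.0)
v = u.copy()
for _ in range(200):                        # nu = 0.5 -> 200 steps = one full period
    v = tvd_step(v, 0.5)
print(v.min(), v.max())                     # stays within [0, 1]
```

    With the limiter set to zero the step reduces to first-order upwind; with s equal to the unlimited forward difference it reduces to Lax-Wendroff, which is the sense in which the limiter blends the two.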

  17. Demand response scheme based on lottery-like rebates

    KAUST Repository

    Schwartz, Galina A.

    2014-08-24

    In this paper, we develop a novel mechanism for reducing the volatility of residential electricity demand. We construct a reward-based (rebate) mechanism that gives consumers incentives to shift their demand to off-peak times. In contrast to most other mechanisms proposed in the literature, the key feature of our mechanism is its modest requirements on user preferences: it does not require exact knowledge of user responsiveness to rewards for shifting demand from peak to off-peak times. Specifically, our mechanism uses a probabilistic reward structure for users who shift their demand to off-peak times, and is robust to incomplete information about user demand and/or risk preferences. We approach the problem from the public good perspective and demonstrate that the mechanism can be implemented via lottery-like schemes. Our mechanism makes it possible to reduce distribution losses and thus improve the efficiency of electricity distribution. Finally, the mechanism can be readily incorporated into emerging demand response schemes (e.g., time-of-day pricing and critical peak pricing) and has security- and privacy-preserving properties.
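
    The probabilistic-reward idea can be illustrated with a toy simulation (the rebate rate and prize size are invented numbers, not values from the paper): a lottery whose win probability is chosen so that the expected payout matches a fixed per-kWh rebate.

```python
import random

def deterministic_rebate(shifted_kwh, rate=0.05):
    """Fixed rebate: every off-peak-shifted kWh earns `rate` dollars."""
    return shifted_kwh * rate

def lottery_rebate(shifted_kwh, rate=0.05, prize=50.0, rng=random):
    """Lottery-like rebate: each shifted kWh buys a chance at a large prize,
    with the win probability chosen so the *expected* payout per kWh is `rate`."""
    p_win = rate / prize
    wins = sum(1 for _ in range(int(shifted_kwh)) if rng.random() < p_win)
    return wins * prize

rng = random.Random(42)
n_users, kwh = 10_000, 20
total = sum(lottery_rebate(kwh, rng=rng) for _ in range(n_users))
# Average lottery payout per user converges to the fixed rebate (= 1.0 here),
# even though individual users face a high-variance, prize-like reward.
print(total / n_users, deterministic_rebate(kwh))
```

    The utility's expected cost is the same under both designs; the lottery only changes how the reward is distributed across users, which is why no exact knowledge of individual responsiveness is needed.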

  18. Rule Based Classification to Detect Malnutrition in Children

    Directory of Open Access Journals (Sweden)

    Xu Dezhi

    2011-01-01

    Full Text Available Data mining is used across a vast range of fields, and rule-based classification is one of its sub-areas. This paper describes how rule-based classification is used along with agent technology to detect malnutrition in children. The proposed system is implemented as an e-government system. It also investigates whether there is a connection between the number of rules used and the optimality of the final decision.

  19. Iris Image Classification Based on Hierarchical Visual Codebook.

    Science.gov (United States)

    Zhenan Sun; Hui Zhang; Tieniu Tan; Jianyu Wang

    2014-06-01

    Iris recognition as a reliable method for personal identification has been well studied, with the objective of assigning the class label of each iris image to a unique subject. In contrast, iris image classification aims to classify an iris image into an application-specific category, e.g., iris liveness detection (classification of genuine and fake iris images), race classification (e.g., classification of iris images of Asian and non-Asian subjects), or coarse-to-fine iris identification (classification of all iris images in a central database into multiple categories). This paper proposes a general framework for iris image classification based on texture analysis. A novel texture pattern representation method called the Hierarchical Visual Codebook (HVC) is proposed to encode the texture primitives of iris images. The proposed HVC method is an integration of two existing bag-of-words models, namely the Vocabulary Tree (VT) and Locality-constrained Linear Coding (LLC). The HVC adopts a coarse-to-fine visual coding strategy and takes advantage of both VT and LLC for accurate and sparse representation of iris texture. Extensive experimental results demonstrate that the proposed iris image classification method achieves state-of-the-art performance for iris liveness detection, race classification, and coarse-to-fine iris identification. A comprehensive fake iris image database simulating four types of iris spoof attacks is developed as a benchmark for research on iris liveness detection. PMID:26353275

  20. Object-Based Crop Species Classification Based on the Combination of Airborne Hyperspectral Images and LiDAR Data

    Directory of Open Access Journals (Sweden)

    Xiaolong Liu

    2015-01-01

    Full Text Available Identification of crop species is an important issue in agricultural management. In recent years, many studies have explored this topic using multi-spectral and hyperspectral remote sensing data. In this study, we propose a framework for mapping crop species by combining hyperspectral and Light Detection and Ranging (LiDAR) data in an object-based image analysis (OBIA) paradigm. The aims of this work were the following: (i) to understand the performance of different spectral dimension-reduced features from hyperspectral data, and their combination with LiDAR-derived height information, in image segmentation; (ii) to understand what classification accuracies of crop species can be achieved by combining hyperspectral and LiDAR data in an OBIA paradigm, especially in regions with a fragmented agricultural landscape and a complicated crop planting structure; and (iii) to understand the contributions of the crop height derived from LiDAR data, as well as the geometric and textural features of image objects, to the separability of crop species. The study region was an irrigated agricultural area in the central Heihe river basin, which is characterized by many crop species, complicated crop planting structures, and a fragmented landscape. The airborne hyperspectral data acquired by the Compact Airborne Spectrographic Imager (CASI) with a 1 m spatial resolution and the Canopy Height Model (CHM) derived from data acquired by the airborne Leica ALS70 LiDAR system were used for this study. The image segmentation accuracies of different feature combination schemes (very high-resolution imagery (VHR), VHR/CHM, and minimum noise fraction transformed data (MNF)/CHM) were evaluated and analyzed. The results showed that VHR/CHM outperformed the other two combination schemes with a segmentation accuracy of 84.8%. The object-based crop species classification results of different feature integrations indicated that

  1. SVM-based spectrum mobility prediction scheme in mobile cognitive radio networks.

    Science.gov (United States)

    Wang, Yao; Zhang, Zhongzhao; Ma, Lin; Chen, Jiamei

    2014-01-01

    Spectrum mobility, an essential issue, has not been fully investigated in mobile cognitive radio networks (CRNs). In this paper, a novel support vector machine based spectrum mobility prediction (SVM-SMP) scheme is presented that considers time-varying and space-varying characteristics simultaneously in mobile CRNs. The mobility of cognitive users (CUs) and the working activities of primary users (PUs) are analyzed theoretically, and a joint feature vector extraction (JFVE) method is proposed based on this analysis. Spectrum mobility prediction is then executed through SVM classification with a fast convergence speed. Numerical results validate that SVM-SMP achieves a better short-time prediction accuracy rate and miss-prediction rate than two algorithms that rely only on location and speed information. Additionally, a rational parameter design can remedy the prediction performance degradation caused by high-speed CUs with strongly random movements. PMID:25143975

  2. A classification-based review recommender

    OpenAIRE

    O'Mahony, Michael P.; Smyth, Barry

    2010-01-01

    Many online stores encourage their users to submit product or service reviews in order to guide future purchasing decisions. These reviews are often listed alongside product recommendations but, to date, limited attention has been paid as to how best to present these reviews to the end-user. In this paper, we describe a supervised classification approach that is designed to identify and recommend the most helpful product reviews. Using the TripAdvisor service as a case study, we compare...

  3. Arbitrated quantum signature scheme based on reusable key

    Science.gov (United States)

    Yu, ChaoHua; Guo, GongDe; Lin, Song

    2014-11-01

    An arbitrated quantum signature scheme without using entangled states is proposed. In the scheme, by employing a classical hash function and random numbers, the secret keys of signer and receiver can be reused. It is shown that the proposed scheme is secure against several well-known attacks. Specifically, it can stand against the receiver's disavowal attack. Moreover, compared with previous relevant arbitrated quantum signature schemes, the scheme proposed has the advantage of less transmission complexity.

  4. Arbitrated quantum signature scheme based on reusable key

    Institute of Scientific and Technical Information of China (English)

    YU ChaoHua; GUO GongDe; LIN Song

    2014-01-01

    An arbitrated quantum signature scheme without using entangled states is proposed. In the scheme, by employing a classical hash function and random numbers, the secret keys of signer and receiver can be reused. It is shown that the proposed scheme is secure against several well-known attacks. Specifically, it can stand against the receiver's disavowal attack. Moreover, compared with previous relevant arbitrated quantum signature schemes, the scheme proposed has the advantage of less transmission complexity.

  5. Intelligent Hybrid Cluster Based Classification Algorithm for Social Network Analysis

    Directory of Open Access Journals (Sweden)

    S. Muthurajkumar

    2014-05-01

    Full Text Available In this paper, we propose a hybrid clustering-based classification algorithm, based on a mean approach, to effectively mine and classify the ordered sequences (paths) from weblog data in order to perform social network analysis. In the system proposed in this work for social pattern analysis, the sequences of human activities are typically analyzed through switching behaviors, which are likely to produce overlapping clusters. A robust modified boosting algorithm is proposed within the hybrid clustering-based classification for clustering the data. This work helps connect the aggregated features from the network data with traditional indices used in social network analysis. Experimental results show that the proposed algorithm improves the decision results from data clustering when combined with the proposed classification algorithm, and that it provides better classification accuracy when tested with the weblog dataset. In addition, the algorithm improves predictive performance, especially for multiclass datasets, where it increases accuracy.

  6. A novel fully automatic scheme for fiducial marker-based alignment in electron tomography.

    Science.gov (United States)

    Han, Renmin; Wang, Liansan; Liu, Zhiyong; Sun, Fei; Zhang, Fa

    2015-12-01

    Although the topic of fiducial marker-based alignment in electron tomography (ET) has been widely discussed for decades, alignment without human intervention remains a difficult problem. Specifically, the emergence of subtomogram averaging has increased the demand for batch processing during tomographic reconstruction, in which fully automatic fiducial marker-based alignment is the main technique. However, the lack of an accurate method for detecting and tracking fiducial markers precludes fully automatic alignment. In this paper, we present a novel, fully automatic alignment scheme for ET. Our scheme has two main contributions. First, we present a series of algorithms to ensure a high recognition rate and precise localization during the detection of fiducial markers. Our proposed solution reduces fiducial marker detection to a sampling and classification problem and further introduces an algorithm to resolve the parameter dependence on marker diameter and marker number. Second, we propose a novel algorithm that reduces the tracking of fiducial markers to an incomplete point set registration problem. Because a global optimization of the point set registration is performed, the result of our tracking is independent of the initial image position in the tilt series, allowing for robust tracking of fiducial markers without pre-alignment. The experimental results indicate that our method achieves tracking accuracy almost identical to that of the best current semi-automatic scheme in IMOD. Furthermore, our scheme is fully automatic, depends on fewer parameters (requiring only a rough value of the marker diameter), and does not require any manual interaction, opening the possibility of automatic batch processing of electron tomographic reconstruction.

  7. An Industrial Model Based Disturbance Feedback Control Scheme

    DEFF Research Database (Denmark)

    Kawai, Fukiko; Nakazawa, Chikashi; Vinther, Kasper;

    2014-01-01

    This paper presents a model based disturbance feedback control scheme. Industrial process systems have traditionally been controlled by using relay and PID controllers. However, these controllers are affected by disturbances and model errors, and these effects degrade control performance. The authors propose a new control method that can decrease the negative impact of disturbances and model errors. The control method is motivated by industrial practice at Fuji Electric. Simulation tests are examined with a conventional PID controller and the disturbance feedback control. The simulation results...

  8. A New Images Hiding Scheme Based on Chaotic Sequences

    Institute of Scientific and Technical Information of China (English)

    LIU Nian-sheng; GUO Dong-hui; WU Bo-xi; Parr G

    2005-01-01

    We propose a data hiding technique in a still image. The technique is based on chaotic sequences in the transform domain of the covert image. We use different chaotic random sequences, multiplied by multiple sensitive images respectively, to spread the spectrum of the sensitive images. Multiple sensitive images are hidden in a covert image in the form of noise. The results of theoretical analysis and computer simulation show that the new hiding technique has better properties, with high security, imperceptibility, and capacity for hidden information, in comparison with conventional schemes such as LSB (Least Significant Bit).
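
    A minimal sketch of the spread-spectrum idea, assuming a logistic-map chaotic generator and a simple non-blind correlation detector; the paper's transform-domain processing and multi-image embedding are omitted, and all parameter values here are illustrative:

```python
import numpy as np

def logistic_sequence(n, x0=0.7, r=3.99):
    """Chaotic logistic-map sequence, thresholded to +/-1 spreading chips."""
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return np.where(seq > 0.5, 1.0, -1.0)

def embed_bit(cover, bit, key=0.7, alpha=2.0):
    """Spread one hidden bit over the whole cover block as low-amplitude noise."""
    chips = logistic_sequence(cover.size, x0=key)
    return cover + alpha * (1.0 if bit else -1.0) * chips

def extract_bit(stego, cover, key=0.7):
    """A receiver with the same chaotic key correlates out the hidden bit."""
    chips = logistic_sequence(stego.size, x0=key)
    return float(np.dot(stego - cover, chips)) > 0.0

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, 1024).astype(float)   # stand-in for image pixels
stego = embed_bit(cover, True)
print(extract_bit(stego, cover))                   # recovers the embedded bit
```

    The security rests on the chaotic key (the initial condition x0): without it, the spreading sequence, and hence the correlation detector, cannot be reproduced.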

  9. A new iterative speech enhancement scheme based on Kalman filtering

    DEFF Research Database (Denmark)

    Li, Chunjian; Andersen, Søren Vang

    2005-01-01

    A new iterative speech enhancement scheme that can be seen as an approximation to the Expectation-Maximization (EM) algorithm is proposed. The algorithm employs a Kalman filter that models the excitation source as a spectrally white process with a rapidly time-varying variance, which calls for a high temporal resolution estimation of this variance. A Local Variance Estimator based on a Prediction Error Kalman Filter is designed for this high temporal resolution variance estimation. To achieve fast convergence and avoid local maxima of the likelihood function, a Weighted Power Spectral...

  10. Words semantic orientation classification based on HowNet

    Institute of Scientific and Technical Information of China (English)

    LI Dun; MA Yong-tao; GUO Jian-li

    2009-01-01

    Based on text orientation classification, a new measurement approach to the semantic orientation of words was proposed. According to the integrated and detailed definitions of words in HowNet, seed sets including words with intense orientations were built. The orientation similarity between the seed words and a given word was then calculated using the sentiment weight priority to recognize the semantic orientation of common words. Finally, the word's semantic orientation and its context were combined to recognize the given word's orientation. The experiments show that the measurement approach achieves better results for the orientation classification of common words and contributes particularly to the text orientation classification of large granularities.
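
    A toy sketch of the seed-set approach described above. The HowNet sememe similarity is replaced here by a crude character-overlap stand-in, so only the scoring structure (maximum similarity to positive seeds minus maximum similarity to negative seeds) reflects the described method; the seed lists are invented:

```python
def similarity(w1, w2):
    """Toy stand-in for HowNet sememe similarity: character-set Dice overlap."""
    a, b = set(w1), set(w2)
    return 2 * len(a & b) / (len(a) + len(b))

def orientation(word, pos_seeds, neg_seeds):
    """Positive score => positive orientation, negative score => negative."""
    pos = max(similarity(word, s) for s in pos_seeds)
    neg = max(similarity(word, s) for s in neg_seeds)
    return pos - neg

pos_seeds = ["excellent", "good", "happy"]
neg_seeds = ["terrible", "bad", "sad"]
print(orientation("goodness", pos_seeds, neg_seeds) > 0)   # classified positive
```

    In the real scheme, the word-to-seed similarity comes from HowNet's sememe definitions rather than surface form, which is what lets semantically related but lexically unrelated words be scored correctly.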

  11. Support vector classification algorithm based on variable parameter linear programming

    Institute of Scientific and Technical Information of China (English)

    Xiao Jianhua; Lin Jian

    2007-01-01

    To solve the problems of SVM in dealing with large sample sizes and asymmetrically distributed samples, a support vector classification algorithm based on variable-parameter linear programming is proposed. In the proposed algorithm, linear programming is employed to solve the classification optimization problem, decreasing the computation time and reducing the complexity compared with the original model. The adjusted punishment parameter greatly reduces the classification error resulting from asymmetrically distributed samples, and the detailed procedure of the proposed algorithm is given. An experiment is conducted to verify whether the proposed algorithm is suitable for asymmetrically distributed samples.

  12. Radar Target Classification using Recursive Knowledge-Based Methods

    DEFF Research Database (Denmark)

    Jochumsen, Lars Wurtz

    The topic of this thesis is target classification of radar tracks from a 2D mechanically scanning coastal surveillance radar. The measurements provided by the radar are position data and therefore the classification is mainly based on kinematic data, which is deduced from the position. The target...... been terminated. Therefore, an update of the classification results must be made for each measurement of the target. The data for this work are collected throughout the PhD and are both collected from radars and other sensors such as GPS....

  13. Cancer classification based on gene expression using neural networks.

    Science.gov (United States)

    Hu, H P; Niu, Z J; Bai, Y P; Tan, X H

    2015-12-21

    Based on gene expression, we classified 53 colon cancer patients with UICC stage II into two groups: relapse and no relapse. Samples were taken from each patient, and gene information was extracted. Of the 53 samples examined, 500 genes were considered informative through analyses by S-Kohonen, BP, and SVM neural networks. The classification accuracy obtained by the S-Kohonen neural network reaches 91%, which is more accurate than classification by the BP and SVM neural networks. The results show that the S-Kohonen neural network is more plausible for classification and has a certain feasibility and validity compared with the BP and SVM neural networks.

  14. Analysis of Kernel Approach in Fuzzy-Based Image Classifications

    Directory of Open Access Journals (Sweden)

    Mragank Singhal

    2013-03-01

    Full Text Available This paper presents a framework for the kernel approach in fuzzy-based image classification in remote sensing. The goal of image classification is to separate images according to their visual content into two or more disjoint classes. Fuzzy logic is a relatively young theory; its major advantage is that it allows problems to be described naturally, in linguistic terms, rather than in terms of relationships between precise numerical values. This paper describes how remote sensing data with uncertainty are handled by fuzzy-based classification using a kernel approach for generating land use/land cover maps. The introduction of fuzzification using a kernel approach provides the basis for the development of more robust approaches to the remote sensing classification problem. The kernel explicitly defines a similarity measure between two samples and implicitly represents the mapping of the input space to the feature space.
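
    The kernel idea in the last sentence, computing similarity directly in input space while implicitly working in a feature space, can be sketched with an RBF kernel and toy kernel-based fuzzy memberships; the class centres and gamma below are invented examples, not values from the paper:

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian (RBF) kernel: an explicit similarity measure in input space
    that corresponds to an inner product in an implicit feature space."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_fuzzy_memberships(x, centers, gamma=0.5):
    """Toy kernel-based fuzzy memberships: kernel similarity to each class
    centre, normalised so the memberships sum to one."""
    sims = np.array([rbf_kernel(x, c, gamma) for c in centers])
    return sims / sims.sum()

centers = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]   # e.g. water vs. urban
m = kernel_fuzzy_memberships(np.array([0.5, 0.2]), centers)
print(m)   # higher membership in the first class
```

    The fuzzy membership vector, rather than a hard label, is what lets mixed pixels carry their uncertainty into the land use/land cover map.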

  15. A Syntactic Classification based Web Page Ranking Algorithm

    CERN Document Server

    Mukhopadhyay, Debajyoti; Kim, Young-Chon

    2011-01-01

    Existing search engines sometimes give unsatisfactory search results for lack of any categorization of the results. If there were some means to know a user's preference about the search results and rank pages according to that preference, the results would be more useful and accurate for the user. In the present paper a web page ranking algorithm is proposed based on syntactic classification of web pages; syntactic classification does not consider the meaning of the content of a web page. The proposed approach mainly consists of three steps: select some properties of web pages based on the user's demand, measure them, and give a different weight to each property during ranking for different types of pages. The existence of syntactic classification is supported by running the fuzzy c-means algorithm and neural network classification on a set of web pages. The change in ranking for different types of pages but the same query string is also demonstrated.

  16. Feature Extraction based Face Recognition, Gender and Age Classification

    Directory of Open Access Journals (Sweden)

    Venugopal K R

    2010-01-01

    Full Text Available A face recognition system with large training sets for personal identification normally attains good accuracy. In this paper, we propose a Feature Extraction based Face Recognition, Gender and Age Classification (FEBFRGAC) algorithm that uses only small training sets and yields good results even with one image per person. The process involves three stages: pre-processing, feature extraction, and classification. The geometric features of facial images, such as the eyes, nose, and mouth, are located using the Canny edge operator, and face recognition is performed. Based on texture and shape information, gender and age classification is done using posteriori class probability and an artificial neural network, respectively. It is observed that face recognition accuracy is 100%, while gender and age classification accuracies are around 98% and 94%, respectively.

  17. A NOVEL RULE-BASED FINGERPRINT CLASSIFICATION APPROACH

    Directory of Open Access Journals (Sweden)

    Faezeh Mirzaei

    2014-03-01

    Full Text Available Fingerprint classification is an important phase in increasing the speed of a fingerprint verification system and narrowing down the search of the fingerprint database. Fingerprint verification is still a challenging problem due to poor-quality images and the need for faster response. Classification gets even harder when just one core has been detected in the input image. This paper proposes a new classification approach that handles images with one core. The algorithm extracts singular points (cores and deltas) from the input image and performs classification based on the number, locations, and surrounding areas of the detected singular points. The classifier is rule-based, where the rules are generated independently of any given data set. Moreover, the shortcomings of a related paper have been reported in detail. The experimental results and comparisons on the FVC2002 database show the effectiveness and efficiency of the proposed method.

  18. Geometrically Invariant Watermarking Scheme Based on Local Feature Points

    Directory of Open Access Journals (Sweden)

    Jing Li

    2012-06-01

    Full Text Available Based on local invariant feature points and the cross-ratio principle, this paper presents a feature-point-based image watermarking scheme that is robust to geometric attacks and some signal processing operations. It extracts local invariant feature points from the image using an improved scale-invariant feature transform algorithm. Using these points as vertices, it constructs quadrilaterals to serve as local feature regions, and the watermark is inserted into these regions repeatedly. To obtain stable local regions, the number and distribution of the extracted feature points are adjusted. In every chosen local feature region, the locations for embedding watermark bits are decided based on the cross ratio of four collinear points, which is invariant to projective transformation. Watermark bits are embedded by quantization modulation, in which the quantization step is computed from a given PSNR. Experimental results show that the proposed method can resist many geometric attacks as well as compound geometric attacks.
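
    The quantization modulation step mentioned above can be sketched with scalar quantization index modulation: a value is snapped onto one of two interleaved lattices according to the hidden bit. The step size here is an arbitrary example rather than one derived from a PSNR target as in the paper:

```python
def qim_embed(value, bit, step=4.0):
    """Quantization index modulation: snap the value onto one of two
    interleaved lattices, offset by step/2 according to the hidden bit."""
    offset = 0.0 if bit == 0 else step / 2.0
    return step * round((value - offset) / step) + offset

def qim_extract(value, step=4.0):
    """Decode by finding which of the two lattices is nearer."""
    d0 = abs(value - qim_embed(value, 0, step))
    d1 = abs(value - qim_embed(value, 1, step))
    return 0 if d0 <= d1 else 1

marked = qim_embed(103.7, 1)           # e.g. a pixel or transform coefficient
print(marked, qim_extract(marked))     # decodes back to 1
print(qim_extract(marked + 0.9))       # robust to noise smaller than step/4
```

    A larger step gives more robustness but more embedding distortion, which is the trade-off the paper controls via the target PSNR.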

  19. Three-dimensional shapelets and an automated classification scheme for dark matter haloes

    OpenAIRE

    Fluke, C. J.; Malec, A.L.; Lasky, P. D.; B. R. Barsdell

    2011-01-01

    We extend the two-dimensional Cartesian shapelet formalism to d-dimensions. Concentrating on the three-dimensional case, we derive shapelet-based equations for the mass, centroid, root-mean-square radius, and components of the quadrupole moment and moment of inertia tensors. Using cosmological N-body simulations as an application domain, we show that three-dimensional shapelets can be used to replicate the complex sub-structure of dark matter halos and demonstrate the basis of an automated cl...

  20. Towards functional classification of neuronal types

    OpenAIRE

    Sharpee, Tatyana O.

    2014-01-01

    How many types of neurons are there in the brain? This basic neuroscience question remains unsettled despite many decades of research. Classification schemes have been proposed based on anatomical, electrophysiological or molecular properties. However, different schemes do not always agree with each other. This raises the question of whether one can classify neurons based on their function directly. For example, among sensory neurons, can a classification scheme be devised that is based on th...

  1. Feature Extraction based Face Recognition, Gender and Age Classification

    OpenAIRE

    Venugopal K R2; L M Patnaik; Ramesha K; K B Raja

    2010-01-01

    The face recognition system with large sets of training sets for personal identification normally attains good accuracy. In this paper, we proposed Feature Extraction based Face Recognition, Gender and Age Classification (FEBFRGAC) algorithm with only small training sets and it yields good results even with one image per person. This process involves three stages: Pre-processing, Feature Extraction and Classification. The geometric features of facial images like eyes, nose, mouth etc. are loc...

  2. Malicious attacks on media authentication schemes based on invertible watermarks

    Science.gov (United States)

    Katzenbeisser, Stefan; Dittmann, Jana

    2004-06-01

    The increasing availability and distribution of multimedia technology has made the manipulation of digital images, videos or audio files easy. While this enables numerous new applications, a certain loss of trust in digital media can be observed. In general, there is no guarantee that a digital image "does not lie", i.e., that the image content was not altered. To counteract this risk, fragile watermarks were proposed to protect the integrity of digital multimedia objects. In high security applications, it is necessary to be able to reconstruct the original object out of the watermarked version. This can be achieved by the use of invertible watermarks. While traditional watermarking schemes introduce some small non-invertible distortion in the digital content, invertible watermarks can be completely removed from a watermarked work. In the past, the security of proposed image authentication schemes based on invertible watermarks was only analyzed using ad-hoc methods and neglected the possibility of malicious attacks, which aim at engineering a fake mark so that the attacked object appears to be genuine. In this paper, we characterize and analyze possible malicious attacks against watermark-based image authentication systems and explore the theoretical limits of previous constructions with respect to their security.

  3. Motion feature extraction scheme for content-based video retrieval

    Science.gov (United States)

    Wu, Chuan; He, Yuwen; Zhao, Li; Zhong, Yuzhuo

    2001-12-01

    This paper proposes a scheme for extracting global motion and object trajectories from a video shot for content-based video retrieval. Motion is the key feature representing the temporal information of video, and it is more objective and consistent than other features such as color and texture. Efficient motion feature extraction is therefore an important step for content-based video retrieval. Some approaches have been taken to extract camera motion and motion activity from video sequences; when dealing with object tracking, algorithms are usually proposed on the basis of a known object region in the frames. In this paper, a complete picture of the motion information in a video shot is obtained by analyzing the motion of the background and foreground separately and automatically. A 6-parameter affine model is used as the motion model for the background, and a fast and robust global motion estimation algorithm is developed to estimate its parameters. The object region is obtained by global motion compensation between two consecutive frames. The center of the object region is then calculated and tracked to obtain the object motion trajectory over the sequence. Global motion and object trajectories are described with the MPEG-7 parametric motion and motion trajectory descriptors, and valid similarity measures are defined for the two descriptors. Experimental results indicate that the proposed scheme is reliable and efficient.
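
    The 6-parameter affine fit at the core of the background motion model can be sketched as a linear least-squares problem over point correspondences; the paper's fast, robust estimator is omitted, and the test motion below is an invented example:

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares fit of the 6-parameter affine model dst ~ A @ src + t
    from point correspondences (src, dst are (N, 2) arrays, N >= 3)."""
    n = src.shape[0]
    # Each correspondence gives two rows of the linear system M p = b,
    # with p = (a11, a12, tx, a21, a22, ty).
    M = np.zeros((2 * n, 6))
    M[0::2, 0:2] = src
    M[0::2, 2] = 1.0
    M[1::2, 3:5] = src
    M[1::2, 5] = 1.0
    b = dst.reshape(-1)                      # interleaved (x'0, y'0, x'1, ...)
    p, *_ = np.linalg.lstsq(M, b, rcond=None)
    A = np.array([[p[0], p[1]], [p[3], p[4]]])
    t = np.array([p[2], p[5]])
    return A, t

# Recover a known global motion (slight zoom plus shift) from noiseless matches.
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, (20, 2))
A_true, t_true = np.array([[1.05, 0.0], [0.0, 1.05]]), np.array([3.0, -2.0])
dst = src @ A_true.T + t_true
A, t = estimate_affine(src, dst)
print(np.allclose(A, A_true), np.allclose(t, t_true))
```

    Compensating each frame with the fitted (A, t) removes the background (camera) motion, so the residual difference between consecutive frames isolates the foreground object region.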

  4. Iris image recognition wavelet filter-banks based iris feature extraction schemes

    CERN Document Server

    Rahulkar, Amol D

    2014-01-01

    This book provides new results on wavelet filter-bank-based feature extraction and classifier design in the field of iris image recognition. It gives a broad treatment of the design of separable and non-separable wavelet filter banks and of the classifier. The design techniques presented in the book are applied to iris image analysis for person authentication. The book also brings together three strands of research (wavelets, iris image analysis, and classifiers) and compares the performance of the presented techniques with state-of-the-art schemes. It compiles the basic material on wavelet design, avoiding the need to consult many different books, and therefore provides an easier path for newcomers and researchers to master the contents. In addition, the designed filter banks and classifier can be used more effectively than existing filter banks in many signal processing applications, such as pattern classification, data compression, watermarking, and denoising.

  5. Fuzzy-Wavelet Based Double Line Transmission System Protection Scheme in the Presence of SVC

    Science.gov (United States)

    Goli, Ravikumar; Shaik, Abdul Gafoor; Tulasi Ram, Sankara S.

    2014-07-01

    The need to increase power transfer capability, utilize available transmission lines efficiently, improve power system controllability and stability, damp power oscillations, and compensate voltage has driven the development of Flexible AC Transmission System (FACTS) devices in recent decades. Shunt FACTS devices can adversely affect distance protection in both steady-state and transient periods. Severe under-reaching, caused by current injection at the point of connection to the system, is the most important relay problem; current absorption by the compensator leads to relay over-reach. This work presents an efficient method based on wavelet transforms for fault detection, classification, and location using a fuzzy logic technique that is almost independent of fault impedance, fault distance, and fault inception angle. The proposed protection scheme is found to be fast, reliable, and accurate for various types of faults on transmission lines, with and without a static VAR compensator at different locations and with various inception angles.

  6. ID-based authentication scheme combined with identity-based encryption with fingerprint hashing

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Current identity-based (ID) cryptosystems lack mechanisms for two-party authentication and distribution of the user's private key. Some ID-based signcryption schemes and ID-based authenticated key agreement protocols have been presented, but they cannot solve the problem completely. A novel ID-based authentication scheme based on ID-based encryption (IBE) and a fingerprint hashing method is proposed to solve these difficulties in the IBE scheme: the message receiver authenticates the sender, and the trusted authority (TA) authenticates the users and transmits the private key to them. Furthermore, the scheme extends the application of fingerprint authentication from the terminal to the network and protects against fingerprint data fabrication. The fingerprint authentication method consists of two factors: it combines a token key, for example a USB key, with the user's fingerprint hash by mixing a pseudo-random number with the fingerprint feature. The security and experimental efficiency meet the requirements of practical applications.

  7. Classification of normal and pathological aging processes based on brain MRI morphology measures

    Science.gov (United States)

    Perez-Gonzalez, J. L.; Yanez-Suarez, O.; Medina-Bañuelos, V.

    2014-03-01

    Reported studies describing normal and abnormal aging based on anatomical MRI analysis do not consider morphological brain changes, but only volumetric measures, to distinguish among these processes. This work presents a classification scheme, based on both size and shape features extracted from brain volumes, to determine different aging stages: healthy control (HC) adults, mild cognitive impairment (MCI), and Alzheimer's disease (AD). Three support vector machines were optimized and validated for the pair-wise separation of these three classes, using selected features from a set of 3D discrete compactness measures and normalized volumes of several global and local anatomical structures. Our analysis shows classification rates of up to 98.3% between HC and AD, 85% between HC and MCI, and 93.3% for MCI and AD separation. These results outperform those reported in the literature and demonstrate the viability of the proposed morphological indexes for classifying different aging stages.

  8. Audio Classification from Time-Frequency Texture

    CERN Document Server

    Yu, Guoshen

    2008-01-01

    Time-frequency representations of audio signals often resemble texture images. This paper derives a simple audio classification algorithm based on treating sound spectrograms as texture images. The algorithm is inspired by an earlier visual classification scheme particularly efficient at classifying textures. While solely based on time-frequency texture features, the algorithm achieves surprisingly good performance in musical instrument classification experiments.

  9. A Lattice-Based Identity-Based Proxy Blind Signature Scheme in the Standard Model

    Directory of Open Access Journals (Sweden)

    Lili Zhang

    2014-01-01

    Full Text Available A proxy blind signature scheme is a special form of blind signature which allows a designated person, called the proxy signer, to sign on behalf of the original signer without knowing the content of the message. It combines the advantages of proxy signatures and blind signatures. To date, most proxy blind signature schemes rely on hard number-theoretic problems, the discrete logarithm, and bilinear pairings. Unfortunately, these underlying number-theoretic problems will be solvable in the postquantum era. Lattice-based cryptography is enjoying great interest these days, due to implementation simplicity and provable security reductions. Moreover, lattice-based problems are believed to be hard even for quantum computers. In this paper, we present a new identity-based proxy blind signature scheme from lattices without random oracles. The new scheme is proven to be strongly unforgeable under the standard hardness assumptions of the short integer solution (SIS) problem and the inhomogeneous small integer solution (ISIS) problem. Furthermore, the secret key size and the signature length of our scheme are invariant and much shorter than those of previous lattice-based proxy blind signature schemes. To the best of our knowledge, our construction is the first short lattice-based identity-based proxy blind signature scheme in the standard model.

  10. Visual words based approach for tissue classification in mammograms

    Science.gov (United States)

    Diamant, Idit; Goldberger, Jacob; Greenspan, Hayit

    2013-02-01

    The presence of microcalcifications (MC) is an important indicator for developing breast cancer. Additional indicators for cancer risk exist, such as breast tissue density type. Different methods have been developed for breast tissue classification for use in computer-aided diagnosis systems. Recently, the visual words (VW) model has been successfully applied to different classification tasks. The goal of our work is to explore VW-based methodologies for various mammography classification tasks. We start with the challenge of classifying breast density and then focus on classification of normal tissue versus microcalcifications. The presented methodology is based on a patch-based visual words model, which includes building a dictionary for a training set using local descriptors and representing the image by a visual-word histogram. Classification is then performed using k-nearest-neighbour (KNN) and support vector machine (SVM) classifiers. We tested our algorithm on the MIAS and DDSM publicly available datasets. The input is a representative region-of-interest per mammography image, manually selected and labelled by an expert. In the tissue density task, classification accuracy reached 85% using KNN and 88% using SVM, which competes with state-of-the-art results. For MC vs. normal tissue, accuracy reached 95.6% using SVM. The results demonstrate the feasibility of classifying breast tissue using our model. Currently, we are improving the results further while also investigating VW capability to classify additional important mammogram classification problems. We expect that the methodology presented will enable high levels of classification, suggesting new means for automated tools for mammography diagnosis support.
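
    As a sketch of the patch-based visual-words representation described above, the following toy example assigns 2-D patch descriptors to their nearest dictionary word and builds the normalized visual-word histogram (the dictionary, descriptors and Euclidean distance here are illustrative assumptions, not the paper's actual local descriptors or dictionary size):

```python
def nearest_word(patch, dictionary):
    """Index of the closest visual word (squared Euclidean distance)."""
    return min(range(len(dictionary)),
               key=lambda k: sum((p - w) ** 2 for p, w in zip(patch, dictionary[k])))

def vw_histogram(patches, dictionary):
    """Represent an ROI as a normalized histogram of visual-word assignments."""
    hist = [0] * len(dictionary)
    for patch in patches:
        hist[nearest_word(patch, dictionary)] += 1
    return [h / len(patches) for h in hist]

# Toy 2-D patch descriptors and a hypothetical 3-word dictionary (e.g. from k-means):
dictionary = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
patches = [[0.1, 0.1], [0.9, 0.1], [0.1, 0.9], [0.0, 0.8]]
hist = vw_histogram(patches, dictionary)
assert hist == [0.25, 0.25, 0.5]
```

    The resulting histogram vector is what a KNN or SVM classifier would consume.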

  11. Classification of LiDAR Data with Point Based Classification Methods

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2016-06-01

    LiDAR is one of the most effective systems for three-dimensional (3D) data collection over wide areas. Nowadays, airborne LiDAR data are used frequently in various applications such as object extraction, 3D modelling, change detection and map revision, with increasing point density and accuracy. Classification of the LiDAR points is the first step of the LiDAR data processing chain and should be handled properly, since applications such as 3D city modelling, building extraction and DEM generation directly use the classified point clouds. Different classification methods can be seen in recent research, and most studies work with a gridded LiDAR point cloud. In grid-based processing of LiDAR data, the loss of characteristic points in the point cloud, especially for vegetation and buildings, or the loss of height accuracy during the interpolation stage, is inevitable. In this case, the possible solution is to use the raw point cloud data for classification to avoid data and accuracy loss in the gridding process. In this study, the point-based classification possibilities of the LiDAR point cloud are investigated to obtain more accurate classes. Automatic point-based approaches, based on hierarchical rules, have been proposed to achieve ground, building and vegetation classes using the raw LiDAR point cloud data. In the proposed approaches, every single LiDAR point is analyzed according to features such as height, multi-return, etc., then automatically assigned to the class to which it belongs. The use of the un-gridded point cloud in the proposed point-based classification process helped in the determination of more realistic rule sets. Detailed parameter analyses have been performed to obtain the most appropriate parameters in the rule sets to achieve accurate classes. Hierarchical rule sets were created for the proposed Approach 1 (using selected spatial-based and echo-based features) and Approach 2 (using only selected spatial-based features)...
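
    A minimal sketch of a hierarchical, point-based rule set of the kind described, applied per raw point (the thresholds, attribute names and local-ground estimate are illustrative assumptions, not the tuned parameters from the study):

```python
def classify_point(z, ground_z, n_returns):
    """Hierarchical rules on raw (un-gridded) point attributes.
    Thresholds here are illustrative, not the ones tuned in the paper."""
    height = z - ground_z              # height above a local ground estimate
    if height < 0.5:
        return "ground"
    if n_returns > 1:                  # multiple echoes: laser penetrated canopy
        return "vegetation"
    if height >= 2.5:                  # single solid return from a tall object
        return "building"
    return "vegetation"                # low single-return clutter (shrubs etc.)

# (elevation z, local ground elevation, number of returns) per point:
points = [(101.2, 101.0, 1), (115.8, 101.1, 3), (109.4, 101.0, 1)]
labels = [classify_point(z, g, n) for z, g, n in points]
assert labels == ["ground", "vegetation", "building"]
```

    Because each point is classified from its own attributes, no interpolation onto a grid is needed, which is the accuracy argument made above.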

  12. Improving Classification Performance by Integrating Multiple Classifiers Based on Landsat TM Images: A Primary Study

    Science.gov (United States)

    Li, Xuecao; Liu, Xiaoping; Yu, Le; Gong, Peng

    2014-11-01

    Land use/cover change is crucial to many ecological and environmental issues. In this article, we present a new approach to improve the classification performance of remotely sensed images based on a classifier ensemble scheme, which can be delineated as two procedures, namely ensemble learning and prediction combination. The Bagging algorithm, a widely used ensemble approach, was employed in the first procedure through a bootstrapped sampling scheme to stabilize and improve the performance of a single classifier. Then, in the second stage, the predictions of the different classifiers are combined through the Behaviour Knowledge Space (BKS) scheme. This classifier ensemble scheme was examined using a Landsat Thematic Mapper (TM) image acquired on 2 January 2009 in Dongguan (China). The experimental results illustrate that the final output (BKS, OA=90.83% and Kappa=0.881) outperforms not only the best single classifier (SVM, OA=88.83% and Kappa=0.8624) but also the Bagging CART classifier (OA=90.26% and Kappa=0.8808), although the improvements vary among them. We think the classifier ensemble scheme can mitigate the limitations of single models.
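
    The BKS combination stage can be sketched as a lookup table that maps each observed tuple of individual classifier decisions to the most frequent true class seen with that tuple during training, with a fallback for unseen cells (the class names and the majority-vote fallback are illustrative assumptions):

```python
from collections import Counter, defaultdict

def train_bks(classifier_outputs, truths):
    """Behaviour Knowledge Space: for every observed tuple of individual
    classifier decisions, remember the most frequent true class."""
    table = defaultdict(Counter)
    for outputs, truth in zip(classifier_outputs, truths):
        table[tuple(outputs)][truth] += 1
    return {cell: counts.most_common(1)[0][0] for cell, counts in table.items()}

def bks_combine(table, outputs, fallback):
    """Look the decision tuple up; fall back (e.g. to majority vote) on unseen cells."""
    return table.get(tuple(outputs), fallback(outputs))

def majority(outputs):
    return Counter(outputs).most_common(1)[0][0]

# Three classifiers' training-set decisions and the ground truth:
train_out = [["water", "water", "urban"], ["water", "water", "urban"],
             ["forest", "urban", "urban"], ["forest", "urban", "urban"]]
truths = ["water", "water", "urban", "forest"]
table = train_bks(train_out, truths)
assert bks_combine(table, ["water", "water", "urban"], majority) == "water"
```

    Unlike simple voting, BKS can overrule a majority when the training data shows that a particular decision pattern usually corresponds to a different true class.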

  13. Triangle-based key management scheme for wireless sensor networks

    Institute of Scientific and Technical Information of China (English)

    Hangyang DAI; Hongbing XU

    2009-01-01

    For security services in wireless sensor networks, key management is a fundamental building block. In this article, we propose a triangle-based key predistribution approach and show that it can improve the effectiveness of key management in wireless sensor networks. This is achieved by using a bivariate polynomial in a triangle deployment system based on deployment information about the expected locations of the sensor nodes. The analysis indicates that this scheme can achieve a higher probability of both direct key establishment and indirect key establishment. On the other hand, the security analysis shows that its security against node capture increases with a decrease of the sensor node deployment density and the size of the deployment model, and an increase of the polynomial degree.
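
    The bivariate-polynomial mechanism underlying such predistribution schemes can be sketched as follows: a symmetric polynomial f(x, y) = f(y, x) over a finite field lets any two nodes u and v derive the same pairwise key from their own shares. The parameters below are toy values, and the triangle deployment logic of the paper is not modeled:

```python
import random

def make_symmetric_poly(t, p, seed=0):
    """Random symmetric bivariate polynomial f(x,y) = sum a[i][j] x^i y^j mod p,
    with a[i][j] == a[j][i] so that f(x, y) == f(y, x)."""
    rng = random.Random(seed)
    a = [[0] * (t + 1) for _ in range(t + 1)]
    for i in range(t + 1):
        for j in range(i, t + 1):
            a[i][j] = a[j][i] = rng.randrange(p)
    return a

def share(a, u, p):
    """Node u's predistributed share g_u(y) = f(u, y): coefficients in y."""
    t = len(a) - 1
    return [sum(a[i][j] * pow(u, i, p) for i in range(t + 1)) % p
            for j in range(t + 1)]

def pairwise_key(g, v, p):
    """Evaluate a node's share at the peer's id: g_u(v) = f(u, v)."""
    return sum(c * pow(v, j, p) for j, c in enumerate(g)) % p

p = 2**13 - 1                      # small prime, for illustration only
a = make_symmetric_poly(t=3, p=p)
g_u, g_v = share(a, 17, p), share(a, 42, p)
assert pairwise_key(g_u, 42, p) == pairwise_key(g_v, 17, p)  # keys agree
```

    The scheme is t-collusion resistant: capturing up to t shares reveals nothing about the keys of uncaptured node pairs, which is why security grows with the polynomial degree, as noted above.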

  14. VARIABLE LENGTH KEY BASED VISUAL CRYPTOGRAPHY SCHEME FOR COLOR IMAGE

    Directory of Open Access Journals (Sweden)

    Akhil Anjikar

    2014-11-01

    Full Text Available Visual cryptography is a special encryption technique that encrypts a secret image into n shares to hide information in images in such a way that it can be decrypted by the human visual system. The secret information cannot be revealed unless a certain number of shares (k or more) are superimposed. Simple visual cryptography is very insecure. Variable-length-key-based visual cryptography for color images uses a variable-length symmetric-key-based visual cryptographic scheme in which a secret key is used to encrypt the image, and division of the encrypted image is done using a random number. Without the secret key, the original image cannot be decrypted. Here the secret key ensures the security of the image.
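
    As a simplified illustration of splitting a secret image into random-looking shares, here is a two-share XOR sketch. Note the assumptions: classic visual cryptography uses subpixel expansion and OR-stacking of printed transparencies, and the paper's variable-length symmetric key and random-number division are not modeled here.

```python
import secrets

def make_shares(secret_bits):
    """Split a binary image (flat list of 0/1 pixels) into two shares,
    each individually indistinguishable from random noise."""
    share1 = [secrets.randbelow(2) for _ in secret_bits]
    share2 = [s ^ r for s, r in zip(secret_bits, share1)]  # XOR mask
    return share1, share2

def reconstruct(share1, share2):
    """Combining both shares recovers the secret exactly; one share alone
    carries no information about it."""
    return [a ^ b for a, b in zip(share1, share2)]

secret = [1, 0, 1, 1, 0, 0, 1, 0]
s1, s2 = make_shares(secret)
assert reconstruct(s1, s2) == secret
```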

  15. Adaptive SPC monitoring scheme for DOE-based APC

    Institute of Scientific and Technical Information of China (English)

    Ye Liang; Pan Ershun; Xi Lifeng

    2008-01-01

    Automatic process control (APC) based on design of experiments (DOE) is a cost-efficient approach to variation reduction. The process changes in both mean and variance owing to online parameter adjustment make it hard to apply traditional SPC charts in such DOE-based APC-applied processes. An adaptive SPC scheme is developed, which can better track the process transitions and achieve a possible reduction in SPC running cost when the process is stable. The control law of the SPC parameters is designed by fully utilizing the estimation properties of the process model instead of, as is traditional, the data collected from the production line. An example is provided to illustrate the proposed adaptive SPC design approach.

  16. Ensemble polarimetric SAR image classification based on contextual sparse representation

    Science.gov (United States)

    Zhang, Lamei; Wang, Xiao; Zou, Bin; Qiao, Zhijun

    2016-05-01

    Polarimetric SAR image interpretation has become one of the most interesting topics, in which the construction of a reasonable and effective image classification technique is of key importance. Sparse representation represents the data using the most succinct sparse atoms of an over-complete dictionary, and its advantages have also been confirmed in the field of PolSAR classification. However, like any ordinary classifier, it is imperfect in several respects. Ensemble learning is therefore introduced to address this issue: it trains a set of different learners and obtains integrated results by combining the individual learners, yielding more accurate and ideal learning results. This paper presents a polarimetric SAR image classification method based on ensemble learning of sparse representations to achieve optimal classification.

  17. Classification approach based on association rules mining for unbalanced data

    CERN Document Server

    Ndour, Cheikh

    2012-01-01

    This paper deals with supervised classification when the response variable is binary and its class distribution is unbalanced. In such a situation, it is not possible to build a powerful classifier by using standard methods such as logistic regression, classification trees, discriminant analysis, etc. To overcome this shortcoming of these methods, which provide classifiers with low sensitivity, we tackle the classification problem through an approach based on association rules learning, because this approach has the advantage of allowing the identification of the patterns that are well correlated with the target class. Association rules learning is a well-known method in the area of data mining, used with large databases for unsupervised discovery of local patterns that express hidden relationships between variables. In considering association rules from a supervised learning point of view, a relevant set of weak classifiers is obtained, from which one derives a classification rule...
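
    The support and confidence measures on which association rules learning is built can be sketched as follows (the transactions and the minority class label "pos" are illustrative, not data from the paper):

```python
def support(itemset, transactions):
    """Fraction of transactions containing every item of `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Conf(A -> B) = supp(A ∪ B) / supp(A): how often the pattern A
    co-occurs with the target B when A is present."""
    return (support(antecedent | consequent, transactions)
            / support(antecedent, transactions))

# Transactions: feature patterns plus the (rare) target class label "pos".
T = [frozenset(t) for t in (
    {"a", "b", "pos"}, {"a", "b", "pos"}, {"a", "c"}, {"b", "c"},
    {"a", "b", "pos"}, {"c"}, {"a", "c"}, {"b"},
)]
rule_conf = confidence(frozenset({"a", "b"}), frozenset({"pos"}), T)
assert rule_conf == 1.0    # pattern {a, b} perfectly predicts the minority class
```

    A rule with high confidence toward the minority class is exactly the kind of weak classifier the approach above collects, even when the class is too rare for a global model to fit well.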

  18. Cluster-based Multihop Synchronization Scheme for Femtocell Network

    Directory of Open Access Journals (Sweden)

    Aisha H. Abdalla

    2012-10-01

    Full Text Available ABSTRACT: Femtocell technology has been drawing considerable attention as a cost-effective means of improving cellular coverage and capacity. It is connected to the core network through an IP backhaul and can only use timing protocols such as IEEE 1588 or the Network Time Protocol (NTP). Furthermore, the femtocell is installed indoors and cannot use a GPS antenna for time synchronization. High-precision crystal oscillators can solve the timing problem, but they are often too expensive for consumer-grade devices. Therefore, femtocell Base Station (fBS) synchronization is one of the principal technical trends in femtocell deployment. Since the fBS and macrocell Base Station (mBS) networks operate on the same frequency under a licensed spectrum, the fBS network can interfere with the macrocell network. In addition, fBSs can also interfere with each other if multiple units are in close proximity. Furthermore, in a flat fBS-structured network, using the IEEE 1588 synchronization algorithm with an fBS-fBS synchronization scheme creates offset and frequency error, which results in inaccurate synchronization. In order to reduce offset and frequency error (skew), this paper proposes a cluster-based multihop synchronization scheme to achieve precise synchronization among fBS neighbor nodes. The proposed scheme is able to reduce the offset and skew significantly.

  19. Pathological Bases for a Robust Application of Cancer Molecular Classification

    Directory of Open Access Journals (Sweden)

    Salvador J. Diaz-Cano

    2015-04-01

    Full Text Available Any robust classification system depends on its purpose and must refer to accepted standards, its strength relying on predictive values and a careful consideration of known factors that can affect its reliability. In this context, a molecular classification of human cancer must refer to the current gold standard (histological classification) and try to improve it with key prognosticators for metastatic potential, staging and grading. Although organ-specific examples have been published based on proteomics, transcriptomics and genomics evaluations, the most popular approach uses gene expression analysis as a direct correlate of cellular differentiation, which represents the key feature of the histological classification. RNA is a labile molecule that varies significantly according to the preservation protocol, its transcription reflects the adaptation of the tumor cells to the microenvironment, it can be passed on through mechanisms of intercellular transfer of genetic information (exosomes), and it is exposed to epigenetic modifications. More robust classifications should be based on stable molecules, represented at the genetic level by DNA, to improve reliability, and their analysis must deal with the concept of intratumoral heterogeneity, which is at the origin of tumor progression and is the byproduct of the selection process during the clonal expansion and progression of neoplasms. The simultaneous analysis of multiple DNA targets and next-generation sequencing offer the best practical approach for an analytical genomic classification of tumors.

  20. A Novel Feature Selection Based on One-Way ANOVA F-Test for E-Mail Spam Classification

    Directory of Open Access Journals (Sweden)

    Nadir Omer Fadl Elssied

    2014-01-01

    Full Text Available Spam is commonly defined as unwanted e-mail, and it has become a global threat to e-mail users. Although the Support Vector Machine (SVM) has been commonly used in e-mail spam classification, the problem of the high dimensionality of the feature space, due to the massive number of e-mails and features, still exists. To address this limitation of SVM, reduce the computational complexity (efficiency) and enhance the classification accuracy (effectiveness), in this study a feature selection scheme based on the one-way ANOVA F-test statistic was applied to determine the most important features contributing to e-mail spam classification. This feature selection is used to reduce the high dimensionality of the feature space before the classification process. The experiment on the proposed scheme was carried out using the well-known Spambase benchmark dataset to evaluate the feasibility of the proposed method. The comparison covers different datasets, categorization algorithms and success measures. In addition, experimental results on the Spambase English dataset showed that the enhanced SVM (FSSVM) significantly outperforms SVM and many other recent spam classification methods for English datasets in terms of computational complexity and dimension reduction.
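
    A minimal sketch of one-way ANOVA F-test feature ranking of the kind described, on two-class toy data (pure Python; the paper's SVM stage and the real Spambase features are not reproduced):

```python
def f_statistic(groups):
    """One-way ANOVA F-statistic for a single feature, given its values
    grouped by class: between-group variance over within-group variance."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def select_top_k(X, y, k):
    """Rank feature indices by F-statistic (largest first) and keep the top k."""
    classes = sorted(set(y))
    def groups(j):
        return [[x[j] for x, label in zip(X, y) if label == c] for c in classes]
    scores = [(f_statistic(groups(j)), j) for j in range(len(X[0]))]
    return [j for _, j in sorted(scores, reverse=True)[:k]]

# Feature 0 separates the classes cleanly; feature 1 is noise.
X = [[0.1, 5.0], [0.2, 1.0], [0.15, 3.0], [5.1, 4.0], [5.2, 2.0], [5.0, 3.5]]
y = [0, 0, 0, 1, 1, 1]
assert select_top_k(X, y, 1) == [0]
```

    Only the top-ranked features would then be fed to the SVM, which is how the scheme cuts dimensionality before classification.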

  1. Energy Aware Cluster Based Routing Scheme For Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Roy Sohini

    2015-09-01

    Full Text Available Wireless Sensor Networks (WSNs) have emerged as an important supplement to modern wireless communication systems due to their wide range of applications. Recent research addresses the various challenges of sensor networks more gracefully; however, energy efficiency has remained a matter of concern for researchers. Meeting countless security needs, timely data delivery and quick action, efficient route selection, multi-path routing, etc. can only be achieved at the cost of energy. Hierarchical routing is more useful in this regard. The proposed algorithm, Energy Aware Cluster Based Routing Scheme (EACBRS), aims at conserving energy with the help of hierarchical routing by calculating the optimum number of cluster heads for the network, selecting energy-efficient routes to the sink and offering congestion control. Simulation results prove that EACBRS performs better than existing hierarchical routing algorithms like the Distributed Energy-Efficient Clustering (DEEC) algorithm for heterogeneous wireless sensor networks and the Energy Efficient Heterogeneous Clustered scheme for Wireless Sensor Networks (EEHC).

  2. The CLIC positron source based on compton schemes

    CERN Document Server

    Rinolfi, L; Braun, H; Papaphilippou, Y; Schulte, D; Vivoli, A; Zimmermann, F; Dadoun, O; Lepercq, P; Roux, R; Variola, A; Zomer, F; Pogorelski, I; Yakimenko, V; Gai, W; Liu, W; Kamitani, T; Omori, T; Urakawa, J; Kuriki, M; Takahasi, TM; Bulyak, E; Gladkikh, P; Chehab, R; Clarke, J

    2010-01-01

    The CLIC polarized positron source is based on a positron production scheme in which polarized photons are produced by a Compton process. In one option, Compton backscattering takes place in a so-called "Compton ring", where an electron beam of 1 GeV interacts with circularly-polarized photons in an optical resonator. The resulting circularly-polarized gamma photons are sent onto an amorphous target, producing pairs of longitudinally polarized electrons and positrons. The nominal CLIC bunch population is 4.2x10^9 particles per bunch at the exit of the Pre-Damping Ring (PDR). Since the photon flux coming out of a "Compton ring" is not sufficient to obtain the requested charge, a stacking process is required in the PDR. Another option is to use a Compton Energy Recovery Linac (ERL), where quasi-continual stacking in the PDR could be achieved. A third option is to use a "Compton Linac", which would not require stacking. We describe the overall scheme as well as advantages and constraints of the three options...

  3. Normal Vector Based Subdivision Scheme to Generate Fractal Curves

    Directory of Open Access Journals (Sweden)

    Yi Li

    2013-08-01

    Full Text Available In this paper, we first devise a new and general p-ary subdivision scheme based on normal vectors with multiple parameters to generate fractals. Rich and colorful fractals, including some known fractals and many unknown ones, can be generated directly and conveniently by using it uniformly. The method is easy to use and effective in generating fractals, since the values of the parameters and the directions of the normal vectors can be designed freely to control the shape of the generated fractals. Secondly, we illustrate the technique with some design results of fractal generation and the corresponding fractal examples from the point of view of visualization, including the classical Lévy curves, Dragon curves, Sierpiński gasket, Koch curve, Koch-type curves and other fractals. Finally, some fractal properties of the limit of the presented subdivision scheme, including existence, self-similarity, non-rectifiability, and continuity but nowhere differentiability, are described from the point of view of theoretical analysis.

  4. Saturation Detection-Based Blocking Scheme for Transformer Differential Protection

    Directory of Open Access Journals (Sweden)

    Byung Eun Lee

    2014-07-01

    Full Text Available This paper describes a current differential relay for transformer protection that operates in conjunction with a core saturation detection-based blocking algorithm. The differential current for the magnetic inrush or over-excitation has a point of inflection at the start and end of each saturation period of the transformer core. At these instants, discontinuities arise in the first-difference function of the differential current. The second- and third-difference functions convert the points of inflection into pulses, the magnitudes of which are large enough to detect core saturation. The blocking signal is activated if the third-difference of the differential current is larger than the threshold and is maintained for one cycle. In addition, a method to discriminate between transformer saturation and current transformer (CT saturation is included. The performance of the proposed blocking scheme was compared with that of a conventional harmonic blocking method. The test results indicate that the proposed scheme successfully discriminates internal faults even with CT saturation from the magnetic inrush, over-excitation, and external faults with CT saturation, and can significantly reduce the operating time delay of the relay.
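
    The difference-function detection idea can be sketched as follows: successive first, second and third differences convert a point of inflection in the differential-current waveform into a pulse large enough to compare against a threshold (the waveform and threshold below are illustrative, not the relay settings from the paper):

```python
def blocking_pulses(i_d, threshold):
    """Detect points of inflection in a differential-current sample sequence i_d
    via its third-difference function; values above threshold flag saturation."""
    d1 = [b - a for a, b in zip(i_d, i_d[1:])]
    d2 = [b - a for a, b in zip(d1, d1[1:])]
    d3 = [b - a for a, b in zip(d2, d2[1:])]
    # d3[n] spans samples i_d[n .. n+3]; report the last sample of the window.
    return [n + 3 for n, v in enumerate(d3) if abs(v) > threshold]

# Piecewise-linear signal with a sharp slope change starting at index 5:
i_d = [0, 0, 0, 0, 0, 0, 2, 4, 6, 8]
pulses = blocking_pulses(i_d, threshold=1.0)
assert pulses == [6, 7]   # pulses straddle the inflection; a smooth ramp gives none
```

    In the actual relay, such a pulse would activate the blocking signal and hold it for one cycle, as described above.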

  5. ELABORATION OF A VECTOR BASED SEMANTIC CLASSIFICATION OVER THE WORDS AND NOTIONS OF THE NATURAL LANGUAGE

    OpenAIRE

    Safonov, K.; Lichargin, D.

    2009-01-01

    The problem of vector-based semantic classification over the words and notions of the natural language is discussed. A set of generative grammar rules is offered for generating the semantic classification vector. Examples of the classification application and a theorem of optional formal classification incompleteness are presented. The principles of assigning the meaningful phrases functions over the classification word groups are analyzed.

  6. SAR Imagery Simulation of Ship Based on Electromagnetic Calculations and Sea Clutter Modelling for Classification Applications

    International Nuclear Information System (INIS)

    Ship detection and classification with space-borne SAR have many potential applications within maritime surveillance, fishery activity management, ship traffic monitoring, and military security. While ship detection techniques with SAR imagery are well established, ship classification is still an open issue. One of the main reasons may be ascribed to the difficulty of acquiring the required quantities of real data of vessels under different observation and environmental conditions with precise ground truth. Therefore, simulation of SAR images with high scenario flexibility and reasonable computation costs is compulsory for the development of ship classification algorithms. However, the simulation of SAR imagery of a ship over the sea surface is challenging. Though great efforts have been devoted to tackling this difficult problem, it is far from being conquered. This paper proposes a novel scheme for SAR imagery simulation of a ship over the sea surface. The simulation is implemented based on the high-frequency electromagnetic calculation methods of PO, MEC, PTD and GO. SAR imagery of sea clutter is modelled by the representative K-distribution clutter model. Then, the simulated SAR imagery of the ship can be produced by inserting the simulated SAR imagery chips of the ship into the SAR imagery of sea clutter. The proposed scheme has been validated with canonical and complex ship targets over a typical sea scene.

  7. Dynamic frequency feature selection based approach for classification of motor imageries.

    Science.gov (United States)

    Luo, Jing; Feng, Zuren; Zhang, Jun; Lu, Na

    2016-08-01

    Electroencephalography (EEG) is one of the most popular techniques to record brain activities such as motor imagery, which has a low signal-to-noise ratio and can lead to high classification error. Therefore, selection of the most discriminative features can be crucial to improving the classification performance. However, the traditional feature selection methods employed in the brain-computer interface (BCI) field (e.g. Mutual Information-based Best Individual Feature (MIBIF), Mutual Information-based Rough Set Reduction (MIRSR) and cross-validation) mainly focus on the overall performance on all the trials in the training set, and thus may perform very poorly on some specific samples, which is not acceptable. To address this problem, a novel sequential forward feature selection approach called Dynamic Frequency Feature Selection (DFFS) is proposed in this paper. The DFFS method emphasizes the importance of misclassified samples instead of only pursuing high overall classification performance. In the DFFS-based classification scheme, the EEG data is first transformed to the frequency domain using Wavelet Packet Decomposition (WPD), which is then employed as the candidate set for further discriminative feature selection. The features are selected one by one in a boosting manner. After a feature is selected, the importance of the samples correctly classified by that feature is decreased, which is equivalent to increasing the importance of the misclassified samples. Therefore, a feature complementary to the current features can be selected in the next run. The selected features are then fed to a classifier trained by the random forest algorithm. Finally, a time-series voting-based method is utilized to improve the classification performance. Comparisons between the DFFS-based approach and state-of-the-art methods on BCI competition IV data set 2b have been conducted, which show the superiority of the proposed algorithm. PMID:27253616
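
    A heavily simplified sketch of the boosting-style forward selection idea: each round adds the candidate feature with the best weighted accuracy, then up-weights the samples the current subset still misclassifies so the next feature complements it. A nearest-class-mean classifier stands in for the paper's random-forest classifier, and all names, the weighting factor and the toy data are assumptions:

```python
def nearest_mean_predict(X, y, feats):
    """Nearest-class-mean classifier restricted to the feature subset `feats`."""
    classes = sorted(set(y))
    means = {c: [sum(x[f] for x, l in zip(X, y) if l == c) / y.count(c)
                 for f in feats] for c in classes}
    def dist(x, c):
        return sum((x[f] - m) ** 2 for f, m in zip(feats, means[c]))
    return [min(classes, key=lambda c: dist(x, c)) for x in X]

def weighted_accuracy(X, y, w, feats):
    preds = nearest_mean_predict(X, y, feats)
    return sum(wi for wi, p, l in zip(w, preds, y) if p == l) / sum(w)

def dffs_like(X, y, n_select):
    """Greedy forward selection with sample re-weighting (DFFS-style sketch)."""
    w = [1.0] * len(X)                       # per-sample importance weights
    selected = []
    for _ in range(n_select):
        candidates = [f for f in range(len(X[0])) if f not in selected]
        best = max(candidates,
                   key=lambda f: weighted_accuracy(X, y, w, selected + [f]))
        selected.append(best)
        preds = nearest_mean_predict(X, y, selected)
        # double the weight of samples the chosen subset still misclassifies:
        w = [wi * (2.0 if p != l else 1.0) for wi, p, l in zip(w, preds, y)]
    return selected

# Feature 1 separates the classes; feature 0 does not.
X = [[5, 0], [6, 0], [5, 1], [6, 1]]
y = [0, 0, 1, 1]
assert dffs_like(X, y, 1) == [1]
```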

  8. An Improved Timestamp-Based Password Authentication Scheme Using Smart Cards

    CERN Document Server

    Pathan, Al-Sakib Khan

    2007-01-01

    With the recent proliferation of distributed systems and networking, remote authentication has become a crucial task in many networking applications. Various schemes have been proposed so far for two-party remote authentication; however, some of them have been proved insecure. In this paper, we propose an efficient timestamp-based password authentication scheme using smart cards. We show various types of forgery attacks against a previously proposed timestamp-based password authentication scheme and improve that scheme to ensure robust security for the remote authentication process, keeping all the advantages that were present in that scheme. Our scheme successfully defends against the attacks that could be launched against other related previous schemes. We present a detailed cryptanalysis of the previously proposed Shen et al. scheme and an analysis of the improved scheme to show its improvements and efficiency.
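
    As an illustration of the timestamp-binding idea behind such schemes, here is a sketch using a shared-key HMAC tag. This is an assumption-laden simplification: Shen et al.-style schemes use public-key operations with smart cards, whereas this symmetric version only shows how a timestamp inside the authenticated message lets the server reject stale (replayed) and forged requests:

```python
import hashlib
import hmac

def make_login_request(user_id, shared_key, now):
    """Client side: bind identity and current timestamp with an HMAC tag."""
    msg = f"{user_id}|{int(now)}".encode()
    tag = hmac.new(shared_key, msg, hashlib.sha256).hexdigest()
    return user_id, int(now), tag

def verify_login(user_id, t, tag, shared_key, now, window=60):
    """Server side: recompute the tag and reject stale timestamps (replay defence)."""
    if abs(int(now) - t) > window:
        return False
    msg = f"{user_id}|{t}".encode()
    expected = hmac.new(shared_key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

key = b"card-stored-secret"
uid, t, tag = make_login_request("alice", key, now=1000)
assert verify_login(uid, t, tag, key, now=1030)
assert not verify_login(uid, t, tag, key, now=2000)      # stale: replay rejected
assert not verify_login("mallory", t, tag, key, now=1030)  # forged identity fails
```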

  9. An ensemble training scheme for machine-learning classification of Hyperion satellite imagery with independent hyperspectral libraries

    Science.gov (United States)

    Friedel, Michael; Buscema, Massimo

    2016-04-01

    A training scheme is proposed for the real-time classification of soil and vegetation (landscape) components in EO-1 Hyperion hyperspectral images. First, an auto-contractive map is used to compute connectivity of reflectance values for spectral bands (N=200) from independent laboratory spectral library components. Second, a minimum spanning tree is used to identify optimal grouping of training components from connectivity values. Third, the reflectance values for optimal landscape component signatures are sorted. Fourth, empirical distribution functions (EDF) are computed for each landscape component. Fifth, the Monte-Carlo technique is used to generate realizations (N=30) for each landscape EDF. The correspondence of component realizations to original signatures validates the stochastic procedure. Presentation of the realizations to the self-organizing map (SOM) is done using three different map sizes: 14x10, 28x20, and 40x30. In each case, the SOM training proceeds first with a rough phase (20 iterations using a Gaussian neighborhood with an initial and final radius of 11 units and 3 units) and then a fine phase (400 iterations using a Gaussian neighborhood with an initial and final radius of 3 units and 1 unit). The initial and final learning rates of 0.5 and 0.05 decay linearly down to 10^-5, and the Gaussian neighborhood function decreases exponentially (decay rate of 10^-3 iteration^-1), providing reasonable convergence. Following training of the three networks, each corresponding SOM is used to independently classify the original spectral library signatures. In comparing the different SOM networks, the 28x20 map size is chosen for independent reproducibility and processing speed. The corresponding universal distance matrix reveals separation of the seven component classes for this map size, thereby supporting its use as a Hyperion classifier.
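
    The EDF-plus-Monte-Carlo step can be sketched as inverse-transform sampling from an empirical distribution function: draw u ~ U(0,1) and return the sample at that quantile (the reflectance values below are illustrative, and the auto-contractive map and SOM stages are not reproduced):

```python
import random

def edf_sampler(samples, seed=0):
    """Return a draw() function that samples from the empirical distribution
    of `samples` by inverse transform: u ~ U(0,1) maps to the u-quantile."""
    xs = sorted(samples)
    rng = random.Random(seed)
    def draw():
        u = rng.random()
        return xs[min(int(u * len(xs)), len(xs) - 1)]
    return draw

# Toy reflectance values for one landscape component:
reflectance = [0.12, 0.15, 0.15, 0.18, 0.22, 0.25, 0.31]
draw = edf_sampler(reflectance)
realization = [draw() for _ in range(30)]   # one Monte-Carlo realization (N=30)
assert all(min(reflectance) <= v <= max(reflectance) for v in realization)
```

    Repeating this per component yields the stochastic realizations that are then presented to the SOM for training.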

  10. Semi-automatic classification of glaciovolcanic landforms: An object-based mapping approach based on geomorphometry

    Science.gov (United States)

    Pedersen, G. B. M.

    2016-02-01

    A new object-oriented approach is developed to classify glaciovolcanic landforms (Procedure A) and their landform element boundaries (Procedure B). It utilizes the principle that glaciovolcanic edifices are geomorphometrically distinct from lava shields and plains (Pedersen and Grosse, 2014), and the approach is tested on data from the Reykjanes Peninsula, Iceland. The outlined procedures utilize slope and profile curvature attribute maps (20 m/pixel) and the classified results are evaluated quantitatively through error matrix maps (Procedure A) and visual inspection (Procedure B). In Procedure A, the highest obtained accuracy is 94.1%, but even simple mapping procedures provide good results (> 90% accuracy). Successful classification of glaciovolcanic landform element boundaries (Procedure B) is also achieved, and this technique has the potential to delineate the transition from intraglacial to subaerial volcanic activity in orthographic view. This object-oriented approach based on geomorphometry overcomes issues with vegetation cover, which has typically been problematic for classification schemes utilizing spectral data. Furthermore, it handles complex edifice outlines well and is easily incorporated into a GIS environment, where results can be edited or fused with other mapping results. The approach outlined here is designed to map glaciovolcanic edifices within the Icelandic neovolcanic zone but may also be applied to similar subaerial or submarine volcanic settings, where steep volcanic edifices are surrounded by flat plains.
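The quantitative evaluation via error matrices mentioned above reduces to a confusion matrix whose diagonal carries the correctly classified samples; a minimal sketch (the class names are hypothetical):

```python
from collections import Counter

def error_matrix(reference, predicted, classes):
    """Square error (confusion) matrix: rows = reference class, cols = predicted class."""
    counts = Counter(zip(reference, predicted))
    return [[counts[(r, p)] for p in classes] for r in classes]

def overall_accuracy(matrix):
    """Fraction of samples on the diagonal (correctly classified)."""
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    total = sum(sum(row) for row in matrix)
    return correct / total
```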

  11. MIMO transmit scheme based on morphological perceptron with competitive learning.

    Science.gov (United States)

    Valente, Raul Ambrozio; Abrão, Taufik

    2016-08-01

    This paper proposes a new multi-input multi-output (MIMO) transmit scheme aided by an artificial neural network (ANN). The morphological perceptron with competitive learning (MP/CL) concept is deployed as the decision rule in the MIMO detection stage. The proposed MIMO transmission scheme is able to achieve double spectral efficiency; hence, in each time-slot the receiver decodes two symbols at a time instead of one as in the Alamouti scheme. Another advantage of the proposed transmit scheme with the MP/CL-aided detector is its polynomial complexity in the modulation order, which becomes linear when the data stream length is greater than the modulation order. The performance of the proposed scheme is compared to traditional MIMO schemes, namely the Alamouti scheme and the maximum-likelihood MIMO (ML-MIMO) detector. The proposed scheme is also evaluated in a scenario with variable channel information along the frame. Numerical results show that the diversity gain of the space-time-coded Alamouti scheme is partially lost, which slightly degrades the bit-error-rate (BER) performance of the proposed MP/CL-NN MIMO scheme. PMID:27135805
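For reference, the baseline Alamouti scheme that the proposal is compared against transmits two symbols over two antennas in two time slots and recovers them by linear combining; a noiseless 2x1 sketch (not the paper's MP/CL detector):

```python
def alamouti_encode(s1, s2):
    """Two time slots x two antennas: [[t1_ant1, t1_ant2], [t2_ant1, t2_ant2]]."""
    return [[s1, s2], [-s2.conjugate(), s1.conjugate()]]

def alamouti_decode(r1, r2, h1, h2):
    """Linear combining of the two received samples over a flat channel (h1, h2).
    Noiseless here, so the symbols are recovered exactly."""
    g = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (h1.conjugate() * r1 + h2 * r2.conjugate()) / g
    s2_hat = (h2.conjugate() * r1 - h1 * r2.conjugate()) / g
    return s1_hat, s2_hat
```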

  12. Super pixel density based clustering automatic image classification method

    Science.gov (United States)

    Xu, Mingxing; Zhang, Chuan; Zhang, Tianxu

    2015-12-01

    Image classification is an important means of image segmentation and data mining, and achieving rapid automated image classification has been a focus of research. This paper proposes an automatic image classification and outlier identification method based on the density of cluster centers computed over superpixels. Pixel location coordinates and gray values are used to compute density and distance, from which images are classified automatically and outliers are extracted. Because a large number of pixels dramatically increases the computational complexity, the image is preprocessed into a small number of superpixel sub-blocks before the density and distance calculations. A normalized density-distance discrimination rule is then designed to select cluster centers automatically, whereby images are classified and outliers identified. Extensive experiments show that the method requires no human intervention, classifies images faster than the density clustering algorithm, and performs automated classification and outlier extraction effectively.
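The density-and-distance computation described above follows the density-peak clustering idea: cluster centers have high local density and lie far from any denser point, while outliers have low density but large distance. A minimal sketch (the cutoff distance and sample points are illustrative):

```python
import math

def density_and_distance(points, d_c):
    """Local density rho (neighbours within cutoff d_c) and delta (distance to the
    nearest point of strictly higher density), as in density-peak clustering."""
    n = len(points)
    dist = [[math.dist(points[i], points[j]) for j in range(n)] for i in range(n)]
    rho = [sum(1 for j in range(n) if j != i and dist[i][j] < d_c) for i in range(n)]
    delta = []
    for i in range(n):
        higher = [dist[i][j] for j in range(n) if rho[j] > rho[i]]
        # densest points get the maximum distance as delta by convention
        delta.append(min(higher) if higher else max(dist[i]))
    return rho, delta
```

Cluster centers then stand out as points with both large rho and large delta, while an outlier shows small rho but large delta.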

  13. A new circulation type classification based upon Lagrangian air trajectories

    Directory of Open Access Journals (Sweden)

    Alexandre M. Ramos

    2014-10-01

    A new classification method of the large-scale circulation characteristic for a specific target area (the NW Iberian Peninsula) is presented, based on the analysis of 90-h backward trajectories arriving in this area, calculated with the 3-D Lagrangian particle dispersion model FLEXPART. A cluster analysis is applied to separate the backward trajectories into up to five representative air streams for each day. Specific measures are then used to characterise the distinct air streams (e.g., curvature of the trajectories, cyclonic or anticyclonic flow, moisture evolution, origin and length of the trajectories). The robustness of the presented method is demonstrated in comparison with the Eulerian Lamb weather type classification. A case study of the 2003 heatwave is discussed in terms of the new Lagrangian circulation and the Lamb weather type classifications. It is shown that the new classification method adds valuable information about the pertinent meteorological conditions, which is missing in an Eulerian approach. The new method is climatologically evaluated for the five-year period from December 1999 to November 2004. The ability of the method to capture the inter-seasonal circulation variability in the target region is shown. Furthermore, the multi-dimensional character of the classification is briefly discussed, in particular with respect to inter-seasonal differences. Finally, the relationship between the new Lagrangian classification and precipitation in the target area is studied.

  14. 3D Land Cover Classification Based on Multispectral LIDAR Point Clouds

    Science.gov (United States)

    Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong

    2016-06-01

    A multispectral LiDAR system can emit simultaneous laser pulses at different wavelengths. The reflected multispectral energy is captured through a receiver of the sensor, and the return signal together with the position and orientation information of the sensor is recorded. These recorded data are solved with GNSS/IMU data for further post-processing, forming high-density multispectral 3D point clouds. As the first commercial multispectral airborne LiDAR sensor, the Optech Titan system is capable of collecting point cloud data from all three channels: at 532 nm visible (green), at 1064 nm near infrared (NIR) and at 1550 nm intermediate infrared (IR). It has become a new source of data for 3D land cover classification. The paper presents an Object Based Image Analysis (OBIA) approach that uses only multispectral LiDAR point cloud datasets for 3D land cover classification. The approach consists of three steps. Firstly, multispectral intensity images are segmented into image objects on the basis of multi-resolution segmentation integrating different scale parameters. Secondly, intensity objects are classified into nine categories using customized classification-index features and a combination of the multispectral reflectance with the vertical distribution of object features. Finally, accuracy assessment is conducted by comparing random reference sample points from Google imagery tiles with the classification results. The classification results show high overall accuracy for most of the land cover types. Over 90% overall accuracy is achieved using multispectral LiDAR point clouds for 3D land cover classification.
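Combining multispectral reflectance with the vertical distribution of objects, as in the second step above, can be illustrated with a toy rule: an NDVI-like index from the green (532 nm) and NIR (1064 nm) channel intensities plus height above ground. The thresholds and labels below are hypothetical, not the paper's classification indexes:

```python
def pseudo_ndvi(nir, green):
    """NDVI-like index from NIR (1064 nm) and green (532 nm) intensities."""
    return (nir - green) / (nir + green) if (nir + green) else 0.0

def classify_cell(nir, green, height):
    """Toy rule-based labelling: spectral index separates vegetated from bare
    surfaces, height above ground separates tall from low objects."""
    idx = pseudo_ndvi(nir, green)
    if idx > 0.3:
        return "tree" if height > 2.0 else "grass"
    return "building" if height > 2.0 else "ground"
```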

  15. A LAGUERRE VORONOI BASED SCHEME FOR MESHING PARTICLE SYSTEMS.

    Science.gov (United States)

    Bajaj, Chandrajit

    2005-06-01

    We present Laguerre Voronoi based subdivision algorithms for the quadrilateral and hexahedral meshing of particle systems within a bounded region in two and three dimensions, respectively. Particles are smooth functions over circular or spherical domains. The algorithm first breaks the bounded region containing the particles into Voronoi cells that are then decomposed into an initial quadrilateral or hexahedral scaffold conforming to individual particles. The scaffolds are subsequently refined via recursive subdivision (splitting and averaging rules). Our choice of averaging rules yields a particle-conforming quadrilateral/hexahedral mesh of good quality that is smooth and differentiable in the limit. Extensions of the basic scheme to dynamic re-meshing in the case of addition, deletion, and moving of particles are also discussed. Motivating applications of these static and dynamic meshes for particle systems include the mechanics of epoxy/glass composite materials, bio-molecular force field calculations, and gas hydrodynamics simulations in cosmology.
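The Laguerre (power) diagram underlying the first decomposition step assigns each point of the bounded region to the particle minimising the power distance |p - c|^2 - r^2, so larger particles claim proportionally larger cells; a discrete membership sketch (the sampled points and particle radii are illustrative):

```python
def laguerre_cells(region_pts, particles):
    """Assign each sampled point of a bounded region to the particle (center, radius)
    minimising the power distance |p - c|^2 - r^2 (Laguerre/power diagram membership)."""
    def power(p, c, r):
        return (p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 - r * r
    return [min(range(len(particles)),
                key=lambda k: power(p, particles[k][0], particles[k][1]))
            for p in region_pts]
```

With equal radii this reduces to the ordinary Voronoi diagram; with unequal radii the cell boundary shifts away from the larger particle, past the Euclidean midpoint.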

  16. Optimization algorithm based characterization scheme for tunable semiconductor lasers.

    Science.gov (United States)

    Chen, Quanan; Liu, Gonghai; Lu, Qiaoyin; Guo, Weihua

    2016-09-01

    In this paper, an optimization algorithm based characterization scheme for tunable semiconductor lasers is proposed and demonstrated. In the process of optimization, the ratio between the power at the desired frequency and the power outside the desired frequency is used as the figure of merit, which approximately represents the side-mode suppression ratio. In practice, we use tunable optical band-pass and band-stop filters to obtain the power at the desired frequency and the power outside it separately. With the assistance of optimization algorithms such as the particle swarm optimization (PSO) algorithm, we can obtain stable operating conditions for tunable lasers at designated frequencies directly and efficiently. PMID:27607701
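A minimal PSO loop of the kind the scheme relies on, maximising a 1-D figure of merit over an operating-parameter range (the swarm parameters are conventional defaults, not the paper's settings):

```python
import random

def pso_maximise(f, lo, hi, n=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimisation of a 1-D figure of merit f on [lo, hi]."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n)]
    v = [0.0] * n
    pbest = x[:]                        # personal best positions
    pval = [f(p) for p in x]
    g = max(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[g], pval[g]     # global best
    for _ in range(iters):
        for i in range(n):
            # velocity update: inertia + pull toward personal and global bests
            v[i] = (w * v[i] + c1 * rng.random() * (pbest[i] - x[i])
                             + c2 * rng.random() * (gbest - x[i]))
            x[i] = min(max(x[i] + v[i], lo), hi)   # clamp to the parameter range
            fx = f(x[i])
            if fx > pval[i]:
                pbest[i], pval[i] = x[i], fx
                if fx > gval:
                    gbest, gval = x[i], fx
    return gbest, gval
```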

  17. Watermarking scheme of colour image based on chaotic sequences

    Institute of Scientific and Technical Information of China (English)

    LIU Nian-sheng; GUO Dong-hui

    2009-01-01

    The proposed perceptual mask is based on the singularity of the cover image and matches the properties of the human visual system very well. The cover colour image is decomposed into several subbands by the wavelet transform. The watermark, composed of a chaotic sequence and the covert image, is embedded into the subband with the largest energy. The chaos system plays an important role in the security, invisibility and robustness of the proposed scheme. The parameters and initial state of the chaos system act as a key, directly influencing the generation of the watermark information. Moreover, the chaotic sequence gives the watermark information the properties of a spread-spectrum signal, improving the invisibility and security of the watermarked image. Experimental results and comparisons with other watermarking techniques prove that the proposed algorithm is effective and feasible, and improves the security, invisibility and robustness of the watermark information.
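One common way to realise such a keyed chaotic sequence (an assumption here, since the record does not name the map used) is the logistic map, whose initial state and parameter serve as the key and whose binarised orbit yields spread-spectrum-like chips:

```python
def chaotic_sequence(x0, mu, n, skip=100):
    """Logistic-map sequence x <- mu*x*(1-x); (x0, mu) act as the key.
    The orbit is binarised to +/-1 chips after discarding a transient."""
    x = x0
    for _ in range(skip):          # discard transient so the orbit is well mixed
        x = mu * x * (1 - x)
    chips = []
    for _ in range(n):
        x = mu * x * (1 - x)
        chips.append(1 if x >= 0.5 else -1)
    return chips
```

Sensitive dependence on initial conditions means even a tiny change to the key produces an uncorrelated chip sequence, which is what makes the key essential for watermark extraction.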

  18. A Brief Summary of Dictionary Learning Based Approach for Classification

    CERN Document Server

    Shu, Kong

    2012-01-01

    This note presents some representative methods which are based on dictionary learning (DL) for classification. We do not review sophisticated methods or frameworks that involve DL for classification, such as online DL and spatial pyramid matching (SPM); rather, we concentrate on direct DL-based classification methods. Here, a "direct DL-based method" is an approach that deals directly with the DL framework by adding meaningful penalty terms. By listing some representative methods, we can roughly divide them into two categories: (1) directly making the dictionary discriminative and (2) forcing the sparse coefficients to be discriminative in order to push the discrimination power onto the dictionary. From this taxonomy, we can expect some extensions of these methods as future research.

  19. FINGERPRINT CLASSIFICATION BASED ON RECURSIVE NEURAL NETWORK WITH SUPPORT VECTOR MACHINE

    Directory of Open Access Journals (Sweden)

    T. Chakravarthy

    2011-01-01

    Fingerprint classification based on a combined statistical and structural (RNN and SVM) approach. RNNs are trained on a structured representation of the fingerprint image. They are also used to extract a set of distributed features of the fingerprint which can be integrated into the support vector machine. SVMs are combined with a new error-correcting-codes scheme. This approach has two main advantages: (a) it can tolerate the presence of ambiguous fingerprint images in the training set and (b) it can effectively identify the most difficult fingerprint images in the test set. In experiments on the NIST-4 fingerprint database (National Institute of Standards and Technology), our best classification accuracy of 94.7% is obtained by training the SVM on both FingerCode and RNN-extracted features from a segmentation algorithm that uses a sophisticated region-growing process.

  20. Improving PLS-RFE based gene selection for microarray data classification.

    Science.gov (United States)

    Wang, Aiguo; An, Ning; Chen, Guilin; Li, Lian; Alterovitz, Gil

    2015-07-01

    Gene selection plays a crucial role in constructing efficient classifiers for microarray data classification, since microarray data is characterized by high dimensionality and small sample sizes and contains irrelevant and redundant genes. In practical use, partial least squares-based gene selection approaches can obtain gene subsets of good quality, but are considerably time-consuming. In this paper, we propose to integrate partial least squares based recursive feature elimination (PLS-RFE) with two feature elimination schemes, simulated annealing and square root, to speed up the feature selection process. Inspired by the strategy of an annealing schedule, the two proposed approaches eliminate a number of features rather than one least informative feature during each iteration, and the number of removed features decreases as the iterations proceed. To verify the effectiveness and efficiency of the proposed approaches, we perform extensive experiments on six publicly available microarray datasets with three typical classifiers, including Naïve Bayes, K-Nearest-Neighbor and Support Vector Machine, and compare our approaches with the ReliefF, PLS and PLS-RFE feature selectors in terms of classification accuracy and running time. Experimental results demonstrate that the two proposed approaches accelerate the feature selection process impressively without degrading the classification accuracy and obtain more compact feature subsets for both two-category and multi-category problems. Further experimental comparisons of feature subset consistency show that the proposed approach with the simulated annealing scheme not only has better time performance, but also obtains slightly better feature subset consistency than the one with the square root scheme. PMID:25912984
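The annealing-inspired idea of removing many features early and fewer as iterations proceed can be sketched as an elimination schedule; the square-root variant below is an illustrative reading of the scheme (drop the square root of the remaining count each round), not the paper's exact rule:

```python
import math

def rfe_schedule(n_features, scheme="sqrt"):
    """Number of features to drop per RFE iteration, shrinking as iterations
    proceed. 'sqrt': drop sqrt(remaining) each round; otherwise drop one."""
    remaining, drops = n_features, []
    while remaining > 1:
        k = max(int(math.sqrt(remaining)), 1) if scheme == "sqrt" else 1
        k = min(k, remaining - 1)      # always keep at least one feature
        drops.append(k)
        remaining -= k
    return drops
```

For 100 features this yields a schedule of only 18 RFE iterations instead of 99 one-at-a-time eliminations.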

  1. A robust anonymous biometric-based remote user authentication scheme using smart cards

    Directory of Open Access Journals (Sweden)

    Ashok Kumar Das

    2015-04-01

    Several biometric-based remote user authentication schemes using smart cards have been proposed in the literature in order to address security weaknesses in user authentication systems. In 2012, An proposed an enhanced biometric-based remote user authentication scheme using smart cards. It was claimed that the proposed scheme is secure against the user impersonation attack, the server masquerading attack, the password guessing attack, and the insider attack, and provides mutual authentication between the user and the server. In this paper, we first analyze the security of An’s scheme and show that it has three serious security flaws in its design: (i) a flaw in the user’s biometric verification during the login phase, (ii) a flaw in the user’s password verification during the login and authentication phases, and (iii) a flaw in the user’s ability to change the password locally at any time. Due to these security flaws, An’s scheme cannot support mutual authentication between the user and the server. Further, we show that An’s scheme cannot prevent the insider attack. In order to remedy the security weaknesses found in An’s scheme, we propose a new robust and secure anonymous biometric-based remote user authentication scheme using smart cards. Through informal and formal security analysis, we show that our scheme is secure against all possible known attacks, including the attacks found in An’s scheme. The simulation results of our scheme using the widely-accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool ensure that our scheme is secure against passive and active attacks. In addition, our scheme is comparable in terms of communication and computational overheads with An’s scheme and other related existing schemes. As a result, our scheme is more appropriate for practical applications compared to other approaches.

  2. Cryptanalysis And Further Improvement Of A Biometric-Based Remote User Authentication Scheme Using Smart Cards

    CERN Document Server

    Das, Ashok Kumar

    2011-01-01

    Recently, Li et al. proposed a secure biometric-based remote user authentication scheme using smart cards to withstand the security flaws of Li-Hwang's efficient biometric-based remote user authentication scheme using smart cards. Li et al.'s scheme is based on biometrics verification, smart card and one-way hash function, and it also uses the random nonce rather than a synchronized clock, and thus it is efficient in computational cost and more secure than Li-Hwang's scheme. Unfortunately, in this paper we show that Li et al.'s scheme still has some security weaknesses in their design. In order to withstand those weaknesses in their scheme, we further propose an improvement of their scheme so that the improved scheme always provides proper authentication and as a result, it establishes a session key between the user and the server at the end of successful user authentication.

  3. An Efficient Semantic Model For Concept Based Clustering And Classification

    Directory of Open Access Journals (Sweden)

    SaiSindhu Bandaru

    2012-03-01

    Usually in text mining techniques, basic measures like the term frequency of a term (word or phrase) are computed to determine the importance of the term in the document. But with statistical analysis alone, the original semantics of the term may not carry its exact meaning. To overcome this problem, a new framework has been introduced which relies on a concept-based model and a synonym-based approach. The proposed model can efficiently find significant matching and related concepts between documents according to the concept-based and synonym-based approaches. Large sets of experiments using the proposed model on different datasets in clustering and classification are conducted. Experimental results demonstrate the substantial enhancement of the clustering quality using sentence-based, document-based, corpus-based and combined-approach concept analysis. A new similarity measure has been proposed to find the similarity between a document and the existing clusters, which can be used in classification of the document with existing clusters.
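A similarity measure between a document and existing clusters, of the kind proposed above, can be illustrated with cosine similarity over concept-weight vectors (the weighting itself is an assumption here, standing in for the paper's concept analysis):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two concept-weight vectors (dicts: concept -> weight)."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def assign_to_cluster(doc, clusters):
    """Assign a document vector to the most similar existing cluster centroid."""
    return max(clusters, key=lambda name: cosine_similarity(doc, clusters[name]))
```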

  4. New Cryptanalysis of an ID-based Password Authentication Scheme using Smart Cards and Fingerprints

    Directory of Open Access Journals (Sweden)

    MING-JHENG LI

    2010-11-01

    In 2002, Lee, Ryu and Yoo proposed a fingerprint-based remote user authentication scheme using a smart card. Their scheme strengthens system security by verifying the smart card owner’s fingerprint. In 2003, Kim, Lee and Yoo proposed two ID-based password authentication schemes without passwords or verification tables, using smart cards and fingerprints. The proposed nonce-based and timestamp-based schemes can withstand message replay attacks. In addition, the schemes can withstand impersonation attacks. In this paper, we first review Kim et al.’s ID-based password authentication schemes. Next, we show that Kim et al.’s scheme still has the disadvantage of being vulnerable to impersonation attacks.

  5. An Extended Energy Consumption Analysis of Reputation-based Trust Management Schemes of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Riaz Ahmed Shaikh

    2010-03-01

    Energy consumption is one of the most important parameters for the evaluation of a scheme proposed for wireless sensor networks (WSNs) because of their resource-constrained nature. A comprehensive comparative analysis of proposed reputation-based trust management schemes for WSNs from this perspective is currently not available in the literature. In this paper, we present a theoretical and simulation-based energy consumption analysis and evaluation of three state-of-the-art reputation-based trust management schemes for WSNs. Results show that the GTMS scheme consumes less energy than the RFSN and PLUS schemes.

  6. An efficient entire chaos-based scheme for deniable authentication

    International Nuclear Information System (INIS)

    By using a chaotic encryption-hash parallel algorithm and the semi-group property of the Chebyshev chaotic map, we propose a secure and efficient scheme for deniable authentication. The scheme is efficient, practicable and reliable, with high potential to be adopted for e-commerce.
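The semi-group property exploited here, T_r(T_s(x)) = T_{rs}(x), can be checked numerically via the trigonometric definition of Chebyshev polynomials on [-1, 1]:

```python
import math

def chebyshev(n, x):
    """Chebyshev polynomial T_n(x) = cos(n * arccos(x)) for x in [-1, 1]."""
    return math.cos(n * math.acos(x))
```

Because composition commutes, T_r(T_s(x)) = T_s(T_r(x)) = T_{rs}(x), two parties can each apply their secret degree to a public value and arrive at the same shared result, which is what chaos-based key agreement builds on.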

  7. UPF based autonomous navigation scheme for deep space probe

    Institute of Scientific and Technical Information of China (English)

    Li Peng; Cui Hutao; Cui Pingyuan

    2008-01-01

    An autonomous celestial navigation scheme for a deep space probe departing from the Earth and an autonomous optical navigation scheme for encountering a target celestial body are presented. Then, aiming at the conditions that large initial estimation errors and non-Gaussian distributions of state or measurement errors may exist in the orbit determination process of the two phases, the UPF (unscented particle filter) is introduced into the navigation schemes. By handling nonlinear and non-Gaussian problems, the UPF overcomes the accuracy loss incurred by the traditional EKF (extended Kalman filter), UKF (unscented Kalman filter), and PF (particle filter) schemes, which treat nonlinear and non-Gaussian state and measurement models only approximately. The numerical simulations demonstrate the feasibility and higher accuracy of the UPF navigation scheme.
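A bootstrap particle filter, the PF component that the UPF builds on, can be sketched for a toy 1-D state-space model; the model, noise levels and systematic resampling below are illustrative assumptions, not the navigation scheme's dynamics:

```python
import math, random

def particle_filter(observations, n=500, q=0.5, r=0.5, seed=0):
    """Bootstrap particle filter for a toy 1-D model:
    x_t = 0.9 * x_{t-1} + process noise (std q), y_t = x_t + measurement noise (std r)."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for y in observations:
        # propagate particles through the state model
        parts = [0.9 * x + rng.gauss(0.0, q) for x in parts]
        # weight by Gaussian measurement likelihood
        w = [math.exp(-0.5 * ((y - x) / r) ** 2) for x in parts]
        s = sum(w) or 1.0
        w = [wi / s for wi in w]
        estimates.append(sum(wi * x for wi, x in zip(w, parts)))
        # systematic resampling to avoid weight degeneracy
        u = rng.random() / n
        new, i, c = [], 0, w[0]
        for k in range(n):
            uk = u + k / n
            while uk > c and i < n - 1:
                i += 1
                c += w[i]
            new.append(parts[i])
        parts = new
    return estimates
```

The UPF replaces the blind propagation step with an unscented-Kalman proposal per particle, which is what recovers accuracy under large initial errors.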

  8. A Quantum Multi-proxy Blind Signature Scheme Based on Genuine Four-Qubit Entangled State

    Science.gov (United States)

    Tian, Juan-Hong; Zhang, Jian-Zhong; Li, Yan-Ping

    2016-02-01

    In this paper, we propose a multi-proxy blind signature scheme based on controlled teleportation. Genuine four-qubit entangled state functions as quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. The security analysis shows the scheme satisfies the security features of multi-proxy signature, unforgeability, undeniability, blindness and unconditional security.

  9. A GOST-like Blind Signature Scheme Based on Elliptic Curve Discrete Logarithm Problem

    OpenAIRE

    HOSSEINI, Hossein; Bahrak, Behnam; Hessar, Farzad

    2013-01-01

    In this paper, we propose a blind signature scheme and three practical educed schemes based on elliptic curve discrete logarithm problem. The proposed schemes impart the GOST signature structure and utilize the inherent advantage of elliptic curve cryptosystems in terms of smaller key size and lower computational overhead to its counterpart public key cryptosystems such as RSA and ElGamal. The proposed schemes are proved to be secure and have less time complexity in comparison with the existi...

  10. Classification and Target Group Selection Based Upon Frequent Patterns

    NARCIS (Netherlands)

    W.H.L.M. Pijls (Wim); R. Potharst (Rob)

    2000-01-01

    In this technical report, two new algorithms based upon frequent patterns are proposed. One algorithm is a classification method. The other one is an algorithm for target group selection. In both algorithms, first of all, the collection of frequent patterns in the training set is constr

  11. Hierarchical Real-time Network Traffic Classification Based on ECOC

    Directory of Open Access Journals (Sweden)

    Yaou Zhao

    2013-09-01

    Classification of network traffic is basic and essential for many network research and management tasks. With the rapid development of peer-to-peer (P2P) applications using dynamic port disguising techniques and encryption to avoid detection, port-based and simple payload-based network traffic classification methods have diminished in effectiveness. An alternative method based on statistics and machine learning has attracted researchers' attention in recent years. However, most of the proposed algorithms are off-line and usually use a single classifier. In this paper a new hierarchical real-time model is proposed which comprises a three-tuple (source IP, destination IP and destination port) look-up table (TT-LUT) part and a layered milestone part. The TT-LUT is used to quickly classify short flows, which need not pass through the layered milestone part, and the milestones in the layered milestone part classify the other flows in real time with real-time feature selection and statistics. Every milestone is an ECOC (error-correcting output codes) based model used to improve classification performance. Experiments showed that the proposed model can improve real-time efficiency to 80%, and multi-class classification accuracy, encouragingly, to 91.4%, on datasets captured from the backbone router of our campus over a week.
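The fast-path/slow-path split described above can be sketched as a cache keyed by the three tuple, with the layered milestone classifier (stubbed here as a plain callable) handling lookup misses:

```python
def classify_flow(flow, lut, milestone_classify):
    """Fast path: three-tuple look-up table; slow path: layered milestone classifier.
    `flow` is a dict with at least src_ip, dst_ip and dst_port."""
    key = (flow["src_ip"], flow["dst_ip"], flow["dst_port"])
    if key in lut:
        return lut[key]                  # short flow resolved instantly
    label = milestone_classify(flow)     # e.g. an ECOC-based model per milestone
    lut[key] = label                     # cache so later packets take the fast path
    return label
```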

  12. Impact of Information based Classification on Network Epidemics

    Science.gov (United States)

    Mishra, Bimal Kumar; Haldar, Kaushik; Sinha, Durgesh Nandini

    2016-06-01

    Formulating mathematical models for accurate approximation of malicious propagation in a network is a difficult process because of our inherent lack of understanding of several underlying physical processes that intrinsically characterize the broader picture. The aim of this paper is to understand the impact of available information in the control of malicious network epidemics. A 1-n-n-1 type differential epidemic model is proposed, where the differentiality allows a symptom-based classification. This is the first attempt to add such a classification into the existing epidemic framework. The model is incorporated into a five-class system called the DifEpGoss architecture. Analysis reveals an epidemic threshold, based on which the long-term behavior of the system is analyzed. In this work three real network datasets with 22002, 22469 and 22607 undirected edges, respectively, are used. The datasets show that the classification-based prevention given in the model can play a good role in containing network epidemics. Further simulation-based experiments use a three-category classification of attack and defense strengths, which allows us to consider 27 different possibilities. These experiments further corroborate the utility of the proposed model. The paper concludes with several interesting results.

  13. Classification-Based Method of Linear Multicriteria Optimization

    OpenAIRE

    Vassilev, Vassil; Genova, Krassimira; Vassileva, Mariyana; Narula, Subhash

    2003-01-01

    The paper describes a classification-based, learning-oriented interactive method for solving linear multicriteria optimization problems. The method allows the decision makers to describe their preferences with greater flexibility, accuracy and reliability. The method is realized in an experimental software system supporting the solution of multicriteria optimization problems.

  14. Classification of CT-brain slices based on local histograms

    Science.gov (United States)

    Avrunin, Oleg G.; Tymkovych, Maksym Y.; Pavlov, Sergii V.; Timchik, Sergii V.; Kisała, Piotr; Orakbaev, Yerbol

    2015-12-01

    Neurosurgical intervention is a very complicated process. Modern operating procedures are based on data such as CT, MRI, etc., and automated analysis of these data is an important task for researchers. Some modern methods of brain-slice segmentation use additional information to process these images, and classification can be used to obtain it. To classify CT images of the brain, we suggest using local histograms and features extracted from them. The paper shows the process of feature extraction and classification of CT slices of the brain. The feature extraction process is specialized for axial cross-sections of the brain. The work can be applied to medical neurosurgical systems.
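Local histogram features of the kind suggested above can be computed by tiling the slice into blocks and normalising a per-block intensity histogram; the block size and bin count below are illustrative choices:

```python
def local_histograms(image, block, bins=8, max_val=256):
    """Split a 2-D intensity image into block x block tiles and compute a
    normalised histogram per tile; concatenated, these form the feature vector."""
    h, w = len(image), len(image[0])
    feats = []
    for r0 in range(0, h, block):
        for c0 in range(0, w, block):
            hist = [0] * bins
            count = 0
            for r in range(r0, min(r0 + block, h)):
                for c in range(c0, min(c0 + block, w)):
                    hist[image[r][c] * bins // max_val] += 1
                    count += 1
            feats.extend(v / count for v in hist)
    return feats
```

The resulting fixed-length vector can be fed to any standard classifier to label the slice.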

  15. Pulse frequency classification based on BP neural network

    Institute of Scientific and Technical Information of China (English)

    WANG Rui; WANG Xu; YANG Dan; FU Rong

    2006-01-01

    In Traditional Chinese Medicine (TCM), analysis of the pulse frequency is an important parameter in clinical disease diagnosis. This article uses the eight major essentials of the pulse to identify pulse types through pulse frequency classification based on back-propagation neural networks (BPNN). The pulse frequency classes include the slow pulse, moderate pulse, rapid pulse, etc. Feature parameters of the pulse frequency are analysed in order to establish an identification system of pulse frequency features. The detection system extracts feature parameters such as period and frequency from the pulse signal and compares them with the standard feature values of each pulse type. The results show that the identification rate reaches 92.5% or above.
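Extracting the period/frequency feature parameters mentioned above can be sketched with a simple autocorrelation-based period estimator; the lag range stands in for a physiologically plausible pulse-rate band and is an illustrative assumption:

```python
def pulse_period(signal, min_lag, max_lag):
    """Estimate the dominant period (in samples) of a pulse waveform by finding
    the lag with maximal autocorrelation within a plausible lag range."""
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]          # remove the DC offset first

    def ac(lag):
        return sum(x[i] * x[i + lag] for i in range(n - lag))

    return max(range(min_lag, max_lag + 1), key=ac)
```

The pulse frequency then follows as the sampling rate divided by the estimated period.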

  16. TENSOR MODELING BASED FOR AIRBORNE LiDAR DATA CLASSIFICATION

    OpenAIRE

    Li, N.; Liu, C; Pfeifer, N; Yin, J. F.; Liao, Z.Y.; Zhou, Y.

    2016-01-01

    Feature selection and description is a key factor in classification of Earth observation data. In this paper a classification method based on tensor decomposition is proposed. First, multiple features are extracted from raw LiDAR point cloud, and raster LiDAR images are derived by accumulating features or the “raw” data attributes. Then, the feature rasters of LiDAR data are stored as a tensor, and tensor decomposition is used to select component features. This tensor representation could kee...

  17. Optimizing Mining Association Rules for Artificial Immune System based Classification

    Directory of Open Access Journals (Sweden)

    SAMEER DIXIT

    2011-08-01

    The primary function of a biological immune system is to protect the body from foreign molecules known as antigens. It has great pattern recognition capability that may be used to distinguish between foreign cells entering the body (non-self, or antigens) and the body's own cells (self). Immune systems have many characteristics such as uniqueness, autonomy, recognition of foreigners, distributed detection, and noise tolerance. Inspired by biological immune systems, Artificial Immune Systems have emerged during the last decade and have incited many researchers to design and build immune-based models for a variety of application domains. Artificial immune systems can be defined as a computational paradigm that is inspired by theoretical immunology, observed immune functions, principles and mechanisms. Association rule mining is one of the most important and well-researched techniques of data mining. The goal of association rules is to extract interesting correlations, frequent patterns, associations or causal structures among sets of items in transaction databases or other data repositories. Association rules are widely used in various areas such as inventory control, telecommunication networks, intelligent decision making, market analysis and risk management. Apriori is the most widely used algorithm for mining association rules. Other popular association rule mining algorithms are frequent pattern (FP) growth, Eclat, dynamic itemset counting (DIC), etc. Associative classification uses association rule mining in the rule discovery process to predict the class labels of the data. This technique has shown great promise over many other classification techniques. Associative classification also integrates the process of rule discovery and classification to build the classifier for the purpose of prediction. The main problem with the associative classification approach is the discovery of high-quality association rules in a very large space of
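The Apriori algorithm named above mines frequent itemsets level by level, generating size-(k+1) candidates from frequent size-k sets and pruning by minimum support; a compact sketch on toy transactions:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Frequent itemset mining: level-wise candidate generation and support pruning.
    `transactions` is a list of sets; returns {frozenset: support count}."""
    items = sorted({i for t in transactions for i in t})
    frequent, k_sets = {}, [frozenset([i]) for i in items]
    k = 1
    while k_sets:
        counts = {c: sum(1 for t in transactions if c <= t) for c in k_sets}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # join step: size-(k+1) candidates whose every k-subset is frequent
        cands = {a | b for a, b in combinations(list(level), 2) if len(a | b) == k + 1}
        k_sets = [c for c in cands
                  if all(frozenset(s) in level for s in combinations(c, k))]
        k += 1
    return frequent
```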

  18. Online Network Traffic Classification Algorithm Based on RVM

    Directory of Open Access Journals (Sweden)

    Zhang Qunhui

    2013-06-01

    Full Text Available Compared with the Support Vector Machine (SVM), the Relevance Vector Machine (RVM) avoids the over-learning to which the SVM is prone, greatly reduces the amount of kernel-function computation, and escapes several drawbacks of the SVM: its sparsity is not strong, its computational load is large, its kernel function must satisfy Mercer's condition, and its parameters must be determined empirically. We therefore propose a new online traffic classification algorithm based on the RVM. After analyzing the basic principles of the RVM and the steps of modeling, we use the trained RVM traffic classification model to identify network traffic in real time, together with "port number + DPI". When the RVM predicts a probability that falls in the query interval, we jointly use the "port number" and "DPI" results. Finally, a detailed experimental validation shows that, compared with the SVM-based network traffic classification algorithm, this algorithm achieves online network traffic classification and greatly improves the classification prediction probability.
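
    The fallback logic described above, where the "port number + DPI" result is consulted only when the RVM's predicted probability lands in the query interval, can be sketched as follows (the interval bounds and the label inputs are assumptions for illustration; the abstract gives no concrete values):

```python
def classify_flow(rvm_prob, rvm_label, port_label, dpi_label,
                  low=0.4, high=0.6):
    """Trust the RVM prediction outright when its probability is decisive;
    otherwise fall back to agreement between port-number and DPI labels.
    The [low, high] query interval is an illustrative assumption."""
    if rvm_prob < low or rvm_prob > high:
        return rvm_label                       # RVM is confident enough
    if port_label is not None and port_label == dpi_label:
        return port_label                      # both auxiliary methods agree
    return rvm_label                           # no better evidence available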

  19. Torrent classification - Base of rational management of erosive regions

    Energy Technology Data Exchange (ETDEWEB)

    Gavrilovic, Zoran; Stefanovic, Milutin; Milovanovic, Irina; Cotric, Jelena; Milojevic, Mileta [Institute for the Development of Water Resources 'Jaroslav Cerni', 11226 Beograd (Pinosava), Jaroslava Cernog 80 (Serbia)], E-mail: gavrilovicz@sbb.rs

    2008-11-01

    A complex methodology for torrents and erosion and the associated calculations was developed during the second half of the twentieth century in Serbia: the 'Erosion Potential Method'. One module of that complex method is focused on torrent classification. The module enables the identification of hydrographic, climate and erosion characteristics, and makes it possible for each torrent, regardless of its magnitude, to be simply and recognizably described by the 'Formula of torrentiality'. This torrent classification is the base on which a set of optimisation calculations is developed for the required scope of erosion-control works and measures, the application of which enables the management of significantly larger erosion and torrential regions compared to the previous period. This paper presents the procedure and the method of torrent classification.

  20. Torrent classification - Base of rational management of erosive regions

    International Nuclear Information System (INIS)

    A complex methodology for torrents and erosion and the associated calculations was developed during the second half of the twentieth century in Serbia: the 'Erosion Potential Method'. One module of that complex method is focused on torrent classification. The module enables the identification of hydrographic, climate and erosion characteristics, and makes it possible for each torrent, regardless of its magnitude, to be simply and recognizably described by the 'Formula of torrentiality'. This torrent classification is the base on which a set of optimisation calculations is developed for the required scope of erosion-control works and measures, the application of which enables the management of significantly larger erosion and torrential regions compared to the previous period. This paper presents the procedure and the method of torrent classification.

  1. Signatures of Polarimetric Parameters and their Implications on Land Cover Classification

    DEFF Research Database (Denmark)

    Skriver, Henning

    2007-01-01

    Knowledge-based or rule-based classification schemes provide robust classification of normally a few major classes. In order to determine optimum polarimetric parameters for such classification schemes, a study has been performed, where the separability between different sets of major classes usi...

  2. A scalable admission control scheme based on time label

    Institute of Scientific and Technical Information of China (English)

    杨松岸; 杨华; 杨宇航

    2004-01-01

    Resource reservation protocols allow communicating hosts to reserve resources such as bandwidth in order to offer guaranteed service. However, current resource reservation architectures do not scale well to a large number of flows. In this paper, we present a simple reservation protocol and a scalable admission control algorithm that can provide QoS guarantees to individual flows without per-flow management in the network core. By mapping each flow to a definite time, this scheme addresses the problems that limit the effectiveness of current endpoint admission control schemes. The overall admission control process is described. Analysis explains the rationale of our scheme, and simulation validates its performance.
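
    A minimal sketch of the time-label idea, assuming each flow is deterministically mapped to a time slot and admitted only while its slot's reserved bandwidth stays under capacity. The slot mapping, capacity, and admission rule are all illustrative assumptions, not the paper's protocol:

```python
class TimeLabelAdmission:
    """Toy admission controller: flows carry a time label (slot) derived
    deterministically from the flow id, so the core keeps per-slot state
    instead of per-flow state."""
    def __init__(self, slots, capacity):
        self.slots = slots
        self.capacity = capacity            # per-slot bandwidth budget
        self.reserved = [0.0] * slots       # aggregate, not per-flow, state
    def label_for(self, flow_id):
        return flow_id % self.slots         # assumed deterministic mapping
    def admit(self, flow_id, bandwidth):
        s = self.label_for(flow_id)
        if self.reserved[s] + bandwidth <= self.capacity:
            self.reserved[s] += bandwidth
            return True
        return False
```
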

  3. An Efficient ECDSA-Based Signature Scheme for Wireless Networks

    Institute of Scientific and Technical Information of China (English)

    XU Zhong; DAI Guanzhong; YANG Deming

    2006-01-01

    Wired-equivalent security is difficult to provide in wireless networks due to high dynamics, wireless link vulnerability, and decentralization. The Elliptic Curve Digital Signature Algorithm (ECDSA) has been applied to wireless networks because of its low computational cost and short key size, which reduce the overheads in a wireless environment. This study improves the ECDSA scheme by reducing its time complexity. The significant advantage of the algorithm is that the new scheme need not calculate a modular inverse in the signature generation and signature verification phases. This improvement makes the proposed scheme more efficient and secure.
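
    For context, the modular inverse that the improved scheme avoids appears in standard ECDSA signing, s = k^-1 * (h + r*d) mod n. The toy computation below isolates just that step with invented small integers (no actual curve arithmetic; r is taken as given rather than derived from a curve point):

```python
# Standard ECDSA signing computes s = k^-1 * (h + r*d) mod n; the modular
# inverse k^-1 is exactly the operation the proposed scheme eliminates.
# All numbers below are illustrative stand-ins, not real curve parameters.
n = 2**13 - 1          # 8191, a small prime standing in for the group order
d, k = 1234, 5678      # private key and per-signature nonce (invented)
h, r = 4321, 99        # message hash and r-value (invented)

k_inv = pow(k, -1, n)              # the modular inverse operation (Python 3.8+)
s = (k_inv * (h + r * d)) % n      # the signing equation
```

    Multiplying back by k recovers h + r*d mod n, which is what signature verification implicitly relies on.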

  4. Hardware Accelerators Targeting a Novel Group Based Packet Classification Algorithm

    Directory of Open Access Journals (Sweden)

    O. Ahmed

    2013-01-01

    Full Text Available Packet classification is a ubiquitous and key building block for many critical network devices. However, it remains one of the main bottlenecks faced when designing fast network devices. In this paper, we propose a novel Group Based Search packet classification Algorithm (GBSA) that is scalable, fast, and efficient. GBSA consumes an average of 0.4 megabytes of memory for a 10 k rule set. The worst-case classification time per packet is 2 microseconds, and the preprocessing speed is 3 M rules/second on a Xeon processor operating at 3.4 GHz. When compared with other state-of-the-art classification techniques, the results showed that GBSA outperforms the competition with respect to speed, memory usage, and processing time. Moreover, GBSA is amenable to implementation in hardware. Three different hardware implementations are also presented in this paper, including an Application Specific Instruction Set Processor (ASIP) implementation and two pure Register-Transfer Level (RTL) implementations based on Impulse-C and Handel-C flows, respectively. Speedups achieved with these hardware accelerators ranged from 9x to 18x compared with a pure software implementation running on a Xeon processor.
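
    The grouping idea behind group-based classification can be illustrated by bucketing rules on one field (protocol here) so that a packet is matched only against its own group rather than the whole rule set. This is a simplified sketch with invented rule fields and a linear in-group scan, not the GBSA algorithm itself:

```python
from collections import defaultdict

# invented rules: protocol plus string prefixes on source/destination address
rules = [
    {"proto": "tcp", "src": "10.",  "dst": "192.168.", "action": "allow"},
    {"proto": "tcp", "src": "172.", "dst": "",         "action": "deny"},
    {"proto": "udp", "src": "",     "dst": "10.",      "action": "allow"},
]

def build_groups(rules):
    """Preprocessing: bucket rules by their protocol field."""
    groups = defaultdict(list)
    for r in rules:
        groups[r["proto"]].append(r)
    return groups

def classify(groups, pkt):
    """Match a packet only against its protocol group (empty prefix = wildcard)."""
    for r in groups.get(pkt["proto"], []):
        if pkt["src"].startswith(r["src"]) and pkt["dst"].startswith(r["dst"]):
            return r["action"]
    return "default"

groups = build_groups(rules)
```
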

  5. The normalization of citation counts based on classification systems

    CERN Document Server

    Bornmann, Lutz; Barth, Andreas

    2013-01-01

    If we want to assess whether the paper in question has had a particularly high or low citation impact compared to other papers, the standard practice in bibliometrics is to normalize citations with respect to the subject category and publication year. A number of proposals for an improved procedure in the normalization of citation impact have been put forward in recent years. Against the background of these proposals, this study describes an ideal solution for the normalization of citation impact: in a first step, the reference set for the publication in question is collated by means of a classification scheme, where every publication is associated with a single principal research field or subfield entry (e.g. via Chemical Abstracts sections) and a publication year. In a second step, percentiles of citation counts are calculated for this set and used to assign the normalized citation impact score to the publications (and also to the publication in question).
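
    The second step above, assigning a percentile-based score within the reference set, can be sketched as follows. The mid-rank tie convention used here is one common choice among several percentile variants; the abstract does not fix a particular one:

```python
def percentile_rank(reference_counts, value):
    """Percentile of `value` within its reference set (all papers of the
    same field and publication year), using the mid-rank tie convention."""
    below = sum(1 for c in reference_counts if c < value)
    ties = sum(1 for c in reference_counts if c == value)
    return 100.0 * (below + 0.5 * ties) / len(reference_counts)
```

    A paper cited 3 times in a field-year set with counts [1, 2, 3, 4, 5] thus lands at the 50th percentile regardless of the field's absolute citation levels, which is what makes scores comparable across fields.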

  6. Content-based image classification with circular harmonic wavelets

    Science.gov (United States)

    Jacovitti, Giovanni; Neri, Alessandro

    1998-07-01

    Classification of an image on the basis of contained patterns is considered in a context of detection and estimation theory. To simplify mathematical derivations, image and reference patterns are represented on a complex support. This makes it possible to convert the four positional parameters into two complex numbers: a complex displacement and a complex scale factor. The latter represents isotropic dilations with its magnitude, and rotations with its phase. In this context, evaluation of the likelihood function under an additive Gaussian noise assumption makes it possible to relate the basic template matching strategy to wavelet theory. It is shown that using circular harmonic wavelets simplifies the problem from a computational viewpoint. A general-purpose pattern detection/estimation scheme is introduced by decomposing the images on an orthogonal basis formed by complex Laguerre-Gauss harmonic wavelets.

  7. The Normalization of Citation Counts Based on Classification Systems

    Directory of Open Access Journals (Sweden)

    Andreas Barth

    2013-08-01

    Full Text Available If we want to assess whether the paper in question has had a particularly high or low citation impact compared to other papers, the standard practice in bibliometrics is to normalize citations with respect to the subject category and publication year. A number of proposals for an improved procedure in the normalization of citation impact have been put forward in recent years. Against the background of these proposals, this study describes an ideal solution for the normalization of citation impact: in a first step, the reference set for the publication in question is collated by means of a classification scheme, where every publication is associated with a single principal research field or subfield entry (e.g., via Chemical Abstracts sections) and a publication year. In a second step, percentiles of citation counts are calculated for this set and used to assign the normalized citation impact score to the publications (and also to the publication in question).

  8. A Chaos-Based Encryption Scheme for DCT Precoded OFDM-Based Visible Light Communication Systems

    Directory of Open Access Journals (Sweden)

    Zhongpeng Wang

    2016-01-01

    Full Text Available This paper proposes a physical encryption scheme for discrete cosine transform (DCT) precoded OFDM-based visible light communication systems employing chaos scrambling. In the proposed encryption scheme, the Logistic map is adopted for the chaos mapping. The chaos scrambling strategy allocates the two scrambling sequences to the real (I) and imaginary (Q) parts of OFDM frames according to the initial condition, which enhances the confidentiality of the physical layer. Simulation results prove the efficiency of the proposed encryption method for DCT precoded OFDM-based VLC systems. The experimental results show that the proposed security scheme can protect the DCT precoded OFDM-based VLC from eavesdroppers, while keeping the advantage of the DCT precoding technique, which can reduce the PAPR and improve the BER performance of OFDM-based VLC.
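
    The logistic-map scrambling can be sketched as a key-driven permutation: iterate x_{k+1} = r*x_k*(1 - x_k) from an initial condition that acts as the key, then reorder symbols by the resulting chaotic values. The parameter r, the burn-in length, and the single-sequence setup below are illustrative assumptions; the actual scheme applies two such sequences to the I and Q parts of the OFDM frames:

```python
def logistic_map(x0, n, r=3.9999, burn=100):
    """Iterate the logistic map, discard a burn-in, return n chaotic values."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return out

def permutation(n, key_x0):
    """Scrambling order: symbol positions sorted by their chaotic value."""
    seq = logistic_map(key_x0, n)
    return sorted(range(n), key=seq.__getitem__)

def scramble(frame, key_x0):
    return [frame[p] for p in permutation(len(frame), key_x0)]

def descramble(scrambled, key_x0):
    out = [None] * len(scrambled)
    for i, p in enumerate(permutation(len(scrambled), key_x0)):
        out[p] = scrambled[i]
    return out
```

    Only a receiver sharing the initial condition (the key) can regenerate the permutation and invert the scrambling.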

  9. Automatic classification of eclipsing binaries light curves using neural networks

    CERN Document Server

    Sarro, L M; Giménez, A

    2005-01-01

    In this work we present a system for the automatic classification of the light curves of eclipsing binaries. This system is based on a classification scheme that aims to separate eclipsing binary systems according to their geometrical configuration, in a modified version of the traditional classification scheme. The classification is performed by a Bayesian ensemble of neural networks trained with Hipparcos data of seven different categories, including eccentric binary systems and two types of pulsating light curve morphologies.

  10. A rhythm-based authentication scheme for smart media devices.

    Science.gov (United States)

    Lee, Jae Dong; Jeong, Young-Sik; Park, Jong Hyuk

    2014-01-01

    In recent years, ubiquitous computing has rapidly emerged in our lives, and extensive studies have been conducted in a variety of areas related to smart devices, such as tablets, smartphones, smart TVs, smart refrigerators, and smart media devices, as a means of realizing ubiquitous computing. In particular, smartphones have evolved significantly from the traditional feature phones. Increasingly high-end smartphone models that can perform a range of functions are now available. Smart devices have become widely popular since they provide high efficiency and great convenience not only for private daily activities but also for business endeavors. Rapid advancements have been achieved in smart device technologies to improve the end users' convenience. Consequently, many people increasingly rely on smart devices to store their valuable and important data. With this increasing dependence, an important aspect that must be addressed is security. Leaking of private information or sensitive business data due to loss or theft of smart devices could result in exorbitant damage. To mitigate these security threats, basic embedded locking features are provided in smart devices. However, these locking features are vulnerable. In this paper, an original security-locking scheme using a rhythm-based locking system (RLS) is proposed to overcome the existing security problems of smart devices. RLS is a user-authenticated system that addresses vulnerability issues in the existing locking features and provides secure confidentiality in addition to convenience. PMID:25110743

  11. A Rhythm-Based Authentication Scheme for Smart Media Devices

    Directory of Open Access Journals (Sweden)

    Jae Dong Lee

    2014-01-01

    Full Text Available In recent years, ubiquitous computing has rapidly emerged in our lives, and extensive studies have been conducted in a variety of areas related to smart devices, such as tablets, smartphones, smart TVs, smart refrigerators, and smart media devices, as a means of realizing ubiquitous computing. In particular, smartphones have evolved significantly from the traditional feature phones. Increasingly high-end smartphone models that can perform a range of functions are now available. Smart devices have become widely popular since they provide high efficiency and great convenience not only for private daily activities but also for business endeavors. Rapid advancements have been achieved in smart device technologies to improve the end users' convenience. Consequently, many people increasingly rely on smart devices to store their valuable and important data. With this increasing dependence, an important aspect that must be addressed is security. Leaking of private information or sensitive business data due to loss or theft of smart devices could result in exorbitant damage. To mitigate these security threats, basic embedded locking features are provided in smart devices. However, these locking features are vulnerable. In this paper, an original security-locking scheme using a rhythm-based locking system (RLS) is proposed to overcome the existing security problems of smart devices. RLS is a user-authenticated system that addresses vulnerability issues in the existing locking features and provides secure confidentiality in addition to convenience.
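
    A minimal sketch of rhythm-based locking, assuming the template is the vector of inter-tap intervals and authentication requires every interval to match within a tolerance. The abstract does not specify RLS's actual matching rule; the interval representation and tolerance here are invented for illustration:

```python
def enroll(taps):
    """Store the rhythm template as inter-tap intervals (taps in seconds)."""
    return [b - a for a, b in zip(taps, taps[1:])]

def authenticate(template, taps, tol=0.15):
    """Accept when the attempt has the same number of intervals and each
    interval deviates from the template by less than tol seconds."""
    attempt = [b - a for a, b in zip(taps, taps[1:])]
    if len(attempt) != len(template):
        return False
    return all(abs(x - y) < tol for x, y in zip(template, attempt))
```

    Using intervals rather than absolute timestamps makes the check invariant to when the user starts tapping.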

  12. Digital Signature Scheme Based on a New Hard Problem

    Directory of Open Access Journals (Sweden)

    Nikolay A. Moldovyan

    2008-07-01

    Full Text Available Factorizing a composite number n = qr, where q and r are two large primes, and finding the discrete logarithm modulo a large prime number p are two difficult computational problems which are usually put at the base of different digital signature schemes (DSSes). This paper introduces a new hard computational problem that consists in finding the k-th roots modulo a large prime p = N·k² + 1, where N is an even number and k is a prime with length |k| ≥ 160. The difficulty of the last problem is estimated as O(√k). A new DSS is proposed with the public key y = x^k mod p, where x is the private key. The signature corresponding to some message M is a pair of |p|-bit numbers S and R calculated as follows: R = t^k mod p and S = t·x^f(R,M) mod p, where f(R,M) is a compression function. The verification equation is S^k mod p = y^f(R,M)·R mod p. The DSS is used to implement an efficient protocol for generating collective digital signatures.
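
    The signing and verification equations above can be checked with toy parameters (k = 5, N = 6, p = 151; a real deployment requires |k| ≥ 160, and the compression function f below is a stand-in since the paper leaves f abstract). Verification works identically because S^k = t^k · x^(k·f) = R · y^f (mod p):

```python
# Toy parameters of the required form p = N*k^2 + 1 (illustrative only)
k, N = 5, 6
p = N * k * k + 1          # 151, prime
x = 21                     # private key (invented)
y = pow(x, k, p)           # public key y = x^k mod p

def f(R, M):
    # stand-in compression function; the scheme only needs some f(R, M)
    return (R * 31 + sum(M.encode())) % (p - 1) or 1

def sign(M, t=17):         # t: one-time random value (fixed here for the toy)
    R = pow(t, k, p)
    S = (t * pow(x, f(R, M), p)) % p
    return R, S

def verify(M, R, S):
    # verification equation: S^k == y^f(R,M) * R (mod p)
    return pow(S, k, p) == (pow(y, f(R, M), p) * R) % p
```
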

  13. Arbitrated quantum signature scheme based on cluster states

    Science.gov (United States)

    Yang, Yu-Guang; Lei, He; Liu, Zhi-Chao; Zhou, Yi-Hua; Shi, Wei-Min

    2016-06-01

    Cluster states can be exploited for some tasks such as topological one-way computation, quantum error correction, teleportation and dense coding. In this paper, we investigate and propose an arbitrated quantum signature scheme with cluster states. The cluster states are used for quantum key distribution and quantum signature. The proposed scheme can achieve an efficiency of 100 %. Finally, we also discuss its security against various attacks.

  14. A Class of Key Predistribution Schemes Based on Orthogonal Arrays

    Institute of Scientific and Technical Information of China (English)

    Jun-Wu Dong; Ding-Yi Pei; Xue-Li Wang

    2008-01-01

    Pairwise key establishment is a fundamental security service in sensor networks; it enables sensor nodes to communicate securely with each other using cryptographic techniques. Many approaches have been proposed recently to ensure this security; one of them is to use key predistribution schemes (KPSs) for distributed sensor networks. This paper constructs a class of KPSs based on orthogonal arrays, and the secure connectivity and resilience of the resulting sensor network are analyzed. The KPS constructed in this paper has better properties than those of the existing schemes.

  15. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    Science.gov (United States)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.

  16. Vertical diffuse attenuation coefficient (Kd) based optical classification of IRS-P3 MOS-B satellite ocean colour data

    Indian Academy of Sciences (India)

    R K Sarangi; Prakash Chauhan; S R Nayak

    2002-09-01

    The optical classification of different water types provides vital input for studies related to primary productivity, water clarity and determination of euphotic depth. Image data of the IRS-P3 MOS-B, for Path 90 of 27th February, 1998, was used for deriving the vertical diffuse attenuation coefficient (Kd), and an optical classification based on Kd values was performed. An atmospheric correction scheme was used for retrieving water-leaving radiances in the blue and green channels of 412, 443, 490 and 550 nm. The upwelling radiances from the 443 nm and 550 nm spectral channels were used for computation of the vertical diffuse attenuation coefficient at 490 nm. The waters off the Gujarat coast were classified into different water types based on the Jerlov classification scheme. The oceanic water type IA (Kd range 0.035-0.040 m^-1), type IB (0.042-0.065 m^-1), type II (0.07-0.1 m^-1) and type III (0.115-0.14 m^-1) were identified. For the coastal waters along the Gujarat coast and the Gulf of Kachchh, Kd(490) values ranged between 0.15 m^-1 and 0.35 m^-1. The depth of 1% of surface light for water types IA, IB, II and III corresponds to 88, 68, 58 and 34 meters respectively. Classification of oceanic and coastal waters based on Kd is useful in understanding light transmission characteristics for submarine navigation and underwater imaging.
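
    The Jerlov-type assignment from the 490 nm attenuation coefficient described above reduces to simple thresholding with the ranges quoted in the abstract. Values falling in the small gaps between the quoted ranges are left unclassified here, since the abstract does not assign them:

```python
def jerlov_type(kd490):
    """Assign a Jerlov water type from Kd(490) in 1/m, using the ranges
    quoted in the abstract for the Gujarat coast study."""
    if 0.035 <= kd490 <= 0.040:
        return "IA"
    if 0.042 <= kd490 <= 0.065:
        return "IB"
    if 0.07 <= kd490 <= 0.1:
        return "II"
    if 0.115 <= kd490 <= 0.14:
        return "III"
    if 0.15 <= kd490 <= 0.35:
        return "coastal"
    return "unclassified"
```
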

  17. Classification of Mental Disorders Based on Temperament

    Directory of Open Access Journals (Sweden)

    Nadi Sakhvidi

    2015-08-01

    Full Text Available Context Different paradoxical theories are available regarding psychiatric disorders. The current study aimed to establish a more comprehensive overall approach. Evidence Acquisition This basic study examined ancient medical books. "The Canon" by Avicenna and "Comprehensive Textbook of Psychiatry" by Kaplan and Sadock were the most important and most frequently consulted books in this study. Results Four groups of temperaments were identified: high active, high flexible; high active, low flexible; low active, low flexible; and low active, high flexible. When temperament deteriorates, personality, non-psychotic, and psychotic psychiatric disorders can develop. Conclusions Temperaments can provide a basis for classifying psychiatric disorders. Psychiatric disorders can be placed on a spectrum based on temperaments.

  18. IDENTITY-BASED MULTISIGNATURE AND AGGREGATE SIGNATURE SCHEMES FROM M-TORSION GROUPS

    Institute of Scientific and Technical Information of China (English)

    Cheng Xiangguo; Liu Jingmei; Guo Lifeng; Wang Xinmei

    2006-01-01

    An identity-based multisignature scheme and an identity-based aggregate signature scheme are proposed in this paper. They are both built from m-torsion groups on supersingular elliptic curves or hyperelliptic curves and based on the recently proposed identity-based signature scheme of Cha and Cheon. Due to the sound properties of m-torsion groups and the base scheme, our schemes turn out to be very simple and efficient. Both schemes are proven secure against adaptive chosen-message attack in the random oracle model under the normal security notions, with the assumption that the Computational Diffie-Hellman problem is hard in the m-torsion groups.

  19. DESIGN OF A DIGITAL SIGNATURE SCHEME BASED ON FACTORING AND DISCRETE LOGARITHMS

    Institute of Scientific and Technical Information of China (English)

    杨利英; 覃征; 胡广伍; 王志敏

    2004-01-01

    Objective Focusing on the security problems of authentication and confidentiality in computer networks, a digital signature scheme based on the public key cryptosystem was proposed. Methods First, the course of digital signature based on the public key cryptosystem was given. Then, the RSA and ElGamal schemes, which form the basis of the proposed scheme, were described, and generalized ElGamal-type signature schemes were listed. After comparison, the scheme whose signature equation is (m+r)x ≡ j+s (mod Φ(p)) was adopted in the design. Results Based on two well-known cryptographic assumptions, factorization and discrete logarithms, a digital signature scheme was presented. It must be required that s' is not equal to p'q' in the signing procedure, because attackers could forge signatures with high probability if discrete logarithms modulo a large prime were solvable. The variable public key e is used instead of the invariable parameter 3 in Harn's signature scheme to enhance security. One generalized ElGamal-type scheme lets the proposed scheme avoid one multiplicative inverse operation in the signing procedure and one modular exponentiation in the verification procedure. Conclusion The presented scheme attains the security that Harn's scheme originally claimed. It is secure if factorization and discrete logarithms are simultaneously unsolvable.

  20. CONSTRUCTION OF PROXY BLIND SIGNATURE SCHEME BASED ON MULTI-LINEAR TRANSFORM

    Institute of Scientific and Technical Information of China (English)

    Zhao Zemao; Liu Fengyu

    2004-01-01

    A general method of constructing proxy blind signatures based on multi-linear transforms is proposed. Based on this method, four proxy blind signature schemes are generated with four different signature equations, and each of them has four variant forms of signs; hence there are sixteen signatures in all, and all of them are proxy strongly-blind signature schemes. Furthermore, the two degenerate situations of the multi-linear transform are discussed and their corresponding proxy blind signature schemes are shown; however, some schemes derived from one of these degenerate situations are proxy weakly-blind signature schemes. The security of the proposed schemes is analyzed in detail. The results indicate that these signature schemes have many good properties, such as unforgeability, distinguishability of proxy signature, non-repudiation, and a wide range of applications.

  1. Upper limit for context based crop classification

    DEFF Research Database (Denmark)

    Midtiby, Henrik; Åstrand, Björn; Jørgensen, Rasmus Nyholm;

    2012-01-01

    Mechanical in-row weed control of crops like sugar beet requires precise knowledge of where individual crop plants are located. If crop plants are placed in a known pattern, information about plant locations can be used to discriminate between crop and weed plants. The success rate of such a classifier... depends on the weed pressure, the position uncertainty of the crop plants and the crop upgrowth percentage. The first two measures can be combined into a normalized weed pressure, λ. Given the normalized weed pressure, an upper bound on the positive predictive value is shown to be 1/(1+λ). If the... weed pressure is ρ = 400/m² and the crop position uncertainty is σ_x = 0.0148 m along the row and σ_y = 0.0108 m perpendicular to the row, the normalized weed pressure is λ ≈ 0.40; the upper bound on the positive predictive value is then 0.71. This means that when a position based...
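
    The abstract's numbers are reproduced by combining the first two measures as λ = 2πρσ_xσ_y. Note that this combination rule is inferred here solely because it matches the quoted λ ≈ 0.40 and bound 0.71, and should be treated as an assumption rather than the paper's stated formula:

```python
import math

rho = 400.0                        # weed pressure, weeds per m^2
sigma_x, sigma_y = 0.0148, 0.0108  # crop position uncertainty in m

# assumed combination rule (chosen because it reproduces the quoted numbers)
lam = 2 * math.pi * rho * sigma_x * sigma_y
ppv_upper_bound = 1.0 / (1.0 + lam)   # bound from the abstract: 1/(1+lambda)
```
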

  2. Classification of Regional Ionospheric Disturbances Based on Support Vector Machines

    Science.gov (United States)

    Begüm Terzi, Merve; Arikan, Feza; Arikan, Orhan; Karatay, Secil

    2016-07-01

    Ionosphere is an anisotropic, inhomogeneous, time-varying and spatio-temporally dispersive medium whose parameters can almost always be estimated only by indirect measurements. Geomagnetic, gravitational, solar or seismic activities cause variations of the ionosphere at various spatial and temporal scales. This complex spatio-temporal variability is challenging to identify due to the extensive scales in period, duration, amplitude and frequency of disturbances. Since geomagnetic and solar indices such as Disturbance storm time (Dst), F10.7 solar flux, Sun Spot Number (SSN), Auroral Electrojet (AE), Kp and W-index provide information about variability on a global scale, identification and classification of regional disturbances poses a challenge. The main aim of this study is to classify the regional effects of global geomagnetic storms according to their risk levels. For this purpose, Total Electron Content (TEC) estimated from GPS receivers, which is one of the major parameters of the ionosphere, is used to model the regional and local variability that differs from global activity, along with solar and geomagnetic indices. In this work, for the automated classification of regional disturbances, a classification technique is proposed based on the Support Vector Machine (SVM), a robust machine learning technique that has found widespread use. SVM is a supervised learning model used for classification, with associated learning algorithms that analyze the data and recognize patterns. In addition to performing linear classification, SVM can efficiently perform nonlinear classification by embedding data into higher-dimensional feature spaces. Performance of the developed classification technique is demonstrated for the midlatitude ionosphere over Anatolia using TEC estimates generated from the GPS data provided by the Turkish National Permanent GPS Network (TNPGN-Active) for the solar maximum year of 2011. As a result of implementing the developed classification
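
    As a minimal illustration of the linear case of SVM classification (not the study's method, data, or kernel), here is a Pegasos-style sub-gradient solver for the hinge loss on a toy separable set; the data, regularization constant, and epoch count are all invented:

```python
def train_svm(X, Y, lam=0.01, epochs=400):
    """Pegasos-style stochastic sub-gradient descent for a linear SVM
    (no bias term): minimize lam/2*||w||^2 + mean hinge loss."""
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        for x, y in zip(X, Y):
            t += 1
            eta = 1.0 / (lam * t)                      # decaying step size
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            w = [(1 - eta * lam) * wi for wi in w]     # regularization shrink
            if margin < 1:                             # hinge sub-gradient step
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

def predict(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1

# toy linearly separable data (invented)
X = [(2, 1), (1, 2), (2, 2), (-2, -1), (-1, -2), (-2, -2)]
Y = [1, 1, 1, -1, -1, -1]
w = train_svm(X, Y)
```

    Nonlinear classification, as used in the study, replaces the inner products with a kernel function.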

  3. Object-Based Classification and Change Detection of Hokkaido, Japan

    Science.gov (United States)

    Park, J. G.; Harada, I.; Kwak, Y.

    2016-06-01

    Topography and geology are factors that characterize the distribution of natural vegetation. Topographic contour particularly influences the living conditions of plants, such as soil moisture, sunlight, and windiness. Vegetation associations with similar characteristics are present in locations with similar topographic conditions, unless natural disturbances such as landslides and forest fires or artificial disturbances such as deforestation and man-made plantation bring about changes in those conditions. We developed a vegetation map of Japan using an object-based segmentation approach with topographic information (elevation, slope, slope direction) that is closely related to the distribution of vegetation. The results showed that object-based classification is more effective than pixel-based classification for producing a vegetation map.

  4. An arbitrated quantum signature scheme based on entanglement swapping with signer anonymity

    Science.gov (United States)

    Li, Wei; Fan, Ming-Yu; Wang, Guang-Wei

    2012-12-01

    In this paper an arbitrated quantum signature scheme based on entanglement swapping is proposed. In this scheme a message to be signed is coded with unitary operators. Combining quantum measurement with quantum encryption, the signer can generate the signature for a given message. Combining the entangled states generated by the TTP's Bell measurement with the signature information, the verifier can verify the authentication of a signature through a single quantum state measurement. Compared with previous schemes, our scheme is more efficient and less complex, furthermore, our scheme can ensure the anonymity of the signer.

  5. Classification data mining method based on dynamic RBF neural networks

    Science.gov (United States)

    Zhou, Lijuan; Xu, Min; Zhang, Zhang; Duan, Luping

    2009-04-01

    With the wide application of databases and the sharp development of the Internet, the capacity to use information technology to manufacture and collect data has improved greatly. It is an urgent problem to mine useful information or knowledge from large databases or data warehouses, and data mining technology has therefore developed rapidly to meet this need. But DM (data mining) often faces data that are noisy, disordered and nonlinear. Fortunately, ANNs (Artificial Neural Networks) are suitable for solving these problems of DM because ANNs have such merits as good robustness, adaptability, parallel processing, distributed memory and high error tolerance. This paper gives a detailed discussion of the application of the ANN method in DM, based on an analysis of the various kinds of data mining technology, and especially lays stress on classification data mining based on RBF neural networks. Pattern classification is an important part of RBF neural network applications. In an on-line environment, the training dataset is variable, so batch learning algorithms (e.g. OLS), which generate plenty of unnecessary retraining, have low efficiency. This paper deduces an incremental learning algorithm (ILA) from the gradient descent algorithm to remove this bottleneck. ILA can adaptively adjust the parameters of RBF networks by minimizing the error cost, without any redundant retraining. Using the method proposed in this paper, an on-line classification system was constructed to solve the IRIS classification problem. Experimental results show the algorithm has a fast convergence rate and excellent on-line classification performance.
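
    The incremental idea behind ILA, adjusting parameters one sample at a time by descending the error cost instead of batch retraining, can be sketched for the output weights of an RBF network with fixed Gaussian centers. The centers, width, learning rate, and 1-D toy data below are all invented for illustration, not the paper's algorithm or the IRIS setup:

```python
import math

def rbf_features(x, centers, width=1.0):
    """Gaussian basis responses for a scalar input."""
    return [math.exp(-((x - c) ** 2) / (2 * width ** 2)) for c in centers]

class IncrementalRBF:
    """RBF net with fixed centers; output weights adapt per sample by
    gradient descent on the squared error (no batch retraining)."""
    def __init__(self, centers, lr=0.5):
        self.centers = centers
        self.w = [0.0] * len(centers)
        self.lr = lr
    def output(self, x):
        h = rbf_features(x, self.centers)
        return sum(wi * hi for wi, hi in zip(self.w, h))
    def update(self, x, target):
        h = rbf_features(x, self.centers)
        err = target - sum(wi * hi for wi, hi in zip(self.w, h))
        self.w = [wi + self.lr * err * hi for wi, hi in zip(self.w, h)]

# toy two-class data: negatives near 0, positives near 5
net = IncrementalRBF(centers=[0.0, 5.0])
for _ in range(20):
    for x, t in [(0.0, -1.0), (0.5, -1.0), (4.5, 1.0), (5.0, 1.0)]:
        net.update(x, t)     # one-sample update, no retraining pass
```
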

  6. Linear Models Based on Noisy Data and the Frisch Scheme*

    Science.gov (United States)

    Ning, Lipeng; Georgiou, Tryphon T.; Tannenbaum, Allen; Boyd, Stephen P.

    2016-01-01

    We address the problem of identifying linear relations among variables based on noisy measurements. This is a central question in the search for structure in large data sets. Often a key assumption is that measurement errors in each variable are independent. This basic formulation has its roots in the work of Charles Spearman in 1904 and of Ragnar Frisch in the 1930s. Various topics such as errors-in-variables, factor analysis, and instrumental variables all refer to alternative viewpoints on this problem and on ways to account for the anticipated way that noise enters the data. In the present paper we begin by describing certain fundamental contributions by the founders of the field and provide alternative modern proofs to certain key results. We then go on to consider a modern viewpoint and novel numerical techniques to the problem. The central theme is expressed by the Frisch–Kalman dictum, which calls for identifying a noise contribution that allows a maximal number of simultaneous linear relations among the noise-free variables—a rank minimization problem. In the years since Frisch’s original formulation, there have been several insights, including trace minimization as a convenient heuristic to replace rank minimization. We discuss convex relaxations and theoretical bounds on the rank that, when met, provide guarantees for global optimality. A complementary point of view to this minimum-rank dictum is presented in which models are sought leading to a uniformly optimal quadratic estimation error for the error-free variables. Points of contact between these formalisms are discussed, and alternative regularization schemes are presented. PMID:27168672
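
    In symbols, writing $\Sigma$ for the covariance of the noisy data and $D$ for the diagonal noise covariance to be identified, the dictum and its trace heuristic can be sketched as follows (notation assumed, consistent with the description above):

```latex
\begin{aligned}
\text{Frisch--Kalman:}\quad & \min_{D}\ \operatorname{rank}(\Sigma - D)
  && \text{s.t. } D \succeq 0 \text{ diagonal},\ \ \Sigma - D \succeq 0,\\
\text{trace heuristic:}\quad & \min_{D}\ \operatorname{trace}(\Sigma - D)
  && \text{s.t. the same constraints (now a convex problem).}
\end{aligned}
```

A low rank of $\Sigma - D$ corresponds to many simultaneous linear relations among the noise-free variables, which is exactly what the dictum asks for.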

  7. Land Cover and Land Use Classification with TWOPAC: towards Automated Processing for Pixel- and Object-Based Image Classification

    Directory of Open Access Journals (Sweden)

    Stefan Dech

    2012-09-01

    Full Text Available We present a novel and innovative automated processing environment for the derivation of land cover (LC) and land use (LU) information. This processing framework, named TWOPAC (TWinned Object and Pixel based Automated classification Chain), enables the standardized, independent, user-friendly, and comparable derivation of LC and LU information, with minimized manual classification labor. TWOPAC allows classification of multi-spectral and multi-temporal remote sensing imagery from different sensor types. TWOPAC enables not only pixel-based classification, but also classification based on object-based characteristics. Classification is based on a Decision Tree (DT) approach, for which the well-known C5.0 code has been implemented; it builds decision trees based on the concept of information entropy. TWOPAC enables automatic generation of the decision tree classifier from a C5.0-retrieved ASCII file, as well as fully automatic validation of the classification output via sample-based accuracy assessment. Envisaging the automated generation of standardized land cover products, as well as area-wide classification of large amounts of data in preferably a short processing time, standardized interfaces for process control, Web Processing Services (WPS), as introduced by the Open Geospatial Consortium (OGC), are utilized. TWOPAC's functionality to process geospatial raster or vector data via web resources (server, network) enables TWOPAC's usability independent of any commercial client or desktop software and allows for large-scale data processing on servers. Furthermore, the components of TWOPAC were built using open-source code components and are implemented as a plug-in for Quantum GIS software for easy handling of the classification process from the user's perspective.

  8. A New Digital Signature Scheme Based on Factoring and Discrete Logarithms

    Directory of Open Access Journals (Sweden)

    E. S. Ismail

    2008-01-01

    Full Text Available Problem statement: A digital signature scheme allows one to sign an electronic message, and later the produced signature can be validated by the owner of the message or by any verifier. Most existing digital signature schemes were developed based on a single hard problem, such as the factoring, discrete logarithm, residuosity or elliptic curve discrete logarithm problems. Although these schemes appear secure, they may one day be broken if a solution to the underlying single hard problem is found. Approach: To overcome this problem, in this study we proposed a new signature scheme based on two hard problems, namely factoring and discrete logarithms. We combined the two problems into both the signing and verifying equations, such that the former depends on two secret keys whereas the latter depends on the two corresponding public keys. Results: The new scheme was shown to be secure against the five attacks most commonly considered for signature schemes. The scheme requires only 1203Tmul+Th time complexity for signature generation and 1202Tmul+Th time complexity for verification, and this magnitude of complexity is considered minimal for signature schemes based on multiple hard problems. Conclusions: The new signature scheme based on multiple hard problems provides a longer-lasting and higher security level than schemes based on a single problem, because no adversary can solve the two hard problems simultaneously.

  9. Novel Sequence Number Based Secure Authentication Scheme for Wireless LANs

    Institute of Scientific and Technical Information of China (English)

    Rajeev Singh; Teek Parval Sharma

    2015-01-01

    Authentication per frame is an implicit necessity for security in wireless local area networks (WLANs). We propose a novel per-frame secure authentication scheme which provides authentication to data frames in WLANs. The scheme involves no cryptographic overhead for authentication of frames. It utilizes the sequence number of the frame along with authentication stream generators for authentication. Hence, it requires no extra bits or messages for the authentication purpose, and no change in the existing frame format is required. The scheme provides authentication by modifying the sequence number of the frame at the sender; the modification is then verified at the receiver. The modified sequence number is protected using an XOR operation with a random number selected from the random stream. The authentication is lightweight, as it requires only trivial arithmetic operations such as subtraction and XOR.
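
    The masking step can be sketched in a few lines. This is a toy illustration only: the stream derivation, its width, and the frame layout are invented here, and the abstract does not specify the actual generators.

```python
import random

# Toy sketch (not the authors' exact protocol): both ends derive the same
# pseudo-random authentication stream from a shared seed, and the sender
# masks each frame's sequence number with a stream value via XOR.
def auth_stream(seed, n, bits=12):
    """Shared pseudo-random stream; seed and width are illustrative."""
    rng = random.Random(seed)
    return [rng.getrandbits(bits) for _ in range(n)]

def send_frame(seq, stream):
    """Sender: transmit the modified (masked) sequence number."""
    return seq ^ stream[seq % len(stream)]

def verify_frame(masked_seq, expected_seq, stream):
    """Receiver: unmask with the same stream value and check the sequence."""
    return masked_seq ^ stream[expected_seq % len(stream)] == expected_seq
```

A frame whose masked sequence number was forged or altered fails the XOR check, with no cryptographic computation beyond the XOR itself.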

  10. A scalable admission control scheme based on time label

    Institute of Scientific and Technical Information of China (English)

    杨松岸; 杨华; 杨宇航

    2004-01-01

    Resource reservation protocols allow communicating hosts to reserve resources such as bandwidth to offer guaranteed service. However, current resource reservation architectures do not scale well to a large number of flows. In this paper, we present a simple reservation protocol and a scalable admission control algorithm, which can provide QoS guarantees to individual flows without per-flow management in the network core. By mapping each flow to a definite time, this scheme addresses the problems that limit the effectiveness of current endpoint admission control schemes. The overall admission control process is described. Analysis explains the rationale of our scheme, and simulation validates its performance.

  11. One-step discrimination scheme on N-particle Greenberger-Horne-Zeilinger bases

    Institute of Scientific and Technical Information of China (English)

    Wang Xin-Wen; Liu Xiang; Fang Mao-Fa

    2007-01-01

    We present an experimentally feasible one-step discrimination scheme for Bell bases with trapped ions, and then generalize it to the case of N-ion Greenberger-Horne-Zeilinger (GHZ) bases. In the scheme, all the orthogonal and complete N-ion GHZ internal states can be exactly discriminated in only one step, and thus discrimination takes a very short time. Moreover, the scheme is insensitive to thermal motion and does not require individual addressing of the ions. The Bell-state and GHZ-state one-step discrimination scheme can be widely used in quantum information processing based on ion-trap setups.

  12. A training-based scheme for communicating over unknown channels with feedback

    CERN Document Server

    Mahajan, Aditya

    2009-01-01

    We consider communication with noiseless feedback over a channel that is either BSC(p) or BSC(1-p); neither the transmitter nor the receiver knows which one. The parameter $p \in [0, 1/2]$ is known to both. We propose a variable-length training-based scheme for this channel. The error exponent of this scheme is within a constant fraction of the best possible error exponent. Thus, contrary to popular belief, variable-length training-based schemes need not have poor error exponents. Moreover, training-based schemes can preserve the main advantage of feedback: an error exponent with non-zero slope at rates close to capacity.

  13. A New Loss-Tolerant Image Encryption Scheme Based on Secret Sharing and Two Chaotic Systems

    OpenAIRE

    Li Li; Ahmed A. Abd El-Latif; Zhenfeng Shi and Xiamu Niu

    2012-01-01

    In this study, we propose an efficient loss-tolerant image encryption scheme that protects both confidentiality and loss-tolerance simultaneously in shadow images. In this scheme, we generate the key sequence based on two chaotic maps and then encrypt the image during the sharing phase based on Shamir’s method. Experimental results show better performance of the proposed scheme for different images than other methods in terms of human visual perception. Security analysis confirms a high probability to resist...

  14. An AERONET-based aerosol classification using the Mahalanobis distance

    Science.gov (United States)

    Hamill, Patrick; Giordano, Marco; Ward, Carolyne; Giles, David; Holben, Brent

    2016-09-01

    We present an aerosol classification based on AERONET aerosol data from 1993 to 2012. We used the AERONET Level 2.0 almucantar aerosol retrieval products to define several reference aerosol clusters which are characteristic of the following general aerosol types: Urban-Industrial, Biomass Burning, Mixed Aerosol, Dust, and Maritime. The classification of a particular aerosol observation as one of these aerosol types is determined by its five-dimensional Mahalanobis distance to each reference cluster. We have calculated the fractional aerosol type distribution at 190 AERONET sites, as well as the monthly variation in aerosol type at those locations. The results are presented on a global map and individually in the supplementary material. Our aerosol typing is based on recognizing that different geographic regions exhibit characteristic aerosol types. To generate reference clusters we only keep data points that lie within a Mahalanobis distance of 2 from the centroid. Our aerosol characterization is based on the AERONET retrieved quantities, therefore it does not include low optical depth values. The analysis is based on "point sources" (the AERONET sites) rather than globally distributed values. The classifications obtained will be useful in interpreting aerosol retrievals from satellite borne instruments.
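
    The nearest-cluster rule described above is straightforward to sketch. The cluster statistics below are invented two-dimensional placeholders (the paper uses five-dimensional AERONET retrievals); only the Mahalanobis-distance assignment is the point.

```python
import numpy as np

# Minimal sketch of Mahalanobis-distance classification: each reference
# aerosol type is summarized by a centroid and a covariance matrix, and an
# observation is assigned to the nearest reference cluster.
def mahalanobis(x, mean, cov):
    d = x - mean
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

def classify(x, clusters):
    """clusters: {name: (mean, cov)} -> name of the nearest reference cluster."""
    return min(clusters, key=lambda name: mahalanobis(x, *clusters[name]))
```

The same distance also supports the cleaning step the authors describe: reference points farther than a Mahalanobis distance of 2 from the centroid are dropped before the cluster statistics are finalized.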

  15. New Steganographic Scheme Based on Reed-Solomon Codes

    Directory of Open Access Journals (Sweden)

    DIOP

    2012-04-01

    Full Text Available Modern steganography [1] is a new science that enables secret communication. Using the technique of matrix embedding in steganographic schemes tends to reduce distortion during insertion. Recently, Fontaine and Galand [2] showed that Reed-Solomon codes can be good tools for the design of a steganographic scheme. In this paper, we present an implementation of the matrix embedding technique using Reed-Solomon codes. The advantage of these codes is that they allow an easy solution to the problem of bounded syndrome decoding, a problem which is the basis of the matrix embedding technique.
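
    Matrix embedding itself can be illustrated with the small binary [7,4] Hamming code in place of a Reed-Solomon code. The principle is the same as in the paper, though the code is different: steer the syndrome H·x to equal the message while changing as few cover symbols as possible.

```python
import numpy as np

# Matrix embedding with the [7,4] Hamming code: embed 3 message bits into
# 7 cover bits by flipping at most one bit. Column j of H is the binary
# representation of j, so a needed syndrome change names the bit to flip.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def embed(cover, msg):
    """Flip at most one of the 7 cover bits so the syndrome H @ x equals msg."""
    x = cover.copy()
    syn = (H @ x + np.array(msg)) % 2                 # XOR of syndrome and msg
    pos = int("".join(str(int(b)) for b in syn), 2)   # column index; 0 = no change
    if pos:
        x[pos - 1] ^= 1                               # flipping changes syndrome by syn
    return x

def extract(stego):
    """The recipient recovers the message as the syndrome of the stego bits."""
    return [int(b) for b in H @ stego % 2]
```

Here the "bounded syndrome decoding" step is trivial (look up one column); Reed-Solomon codes give the same embed-by-syndrome structure over larger alphabets.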

  16. A secure quantum group signature scheme based on Bell states

    Science.gov (United States)

    Zhang, Kejia; Song, Tingting; Zuo, Huijuan; Zhang, Weiwei

    2013-04-01

    In this paper, we propose a new secure quantum group signature scheme with Bell states, which may have applications in e-payment systems, e-government, e-business, etc. Compared with recent quantum group signature protocols, our scheme is focused on the most general situation in practice, i.e. only the arbitrator is trusted and no intermediate information needs to be stored in the signing phase to ensure the security. Furthermore, our scheme achieves all the characteristics of a group signature: anonymity, verifiability, traceability, unforgeability and undeniability, by using currently developed quantum and classical technologies. Finally, a feasible security analysis model for quantum group signature is presented.

  17. Vessel-guided airway segmentation based on voxel classification

    DEFF Research Database (Denmark)

    Lo, Pechin Chien Pau; Sporring, Jon; Ashraf, Haseem;

    2008-01-01

    This paper presents a method for improving airway tree segmentation using vessel orientation information. We use the fact that an airway branch is always accompanied by an artery, with both structures having similar orientations. This work is based on a voxel classification airway segmentation method proposed previously. The probability of a voxel belonging to the airway, from the voxel classification method, is augmented with an orientation similarity measure as a criterion for region growing. The orientation similarity measure of a voxel indicates how similar the orientation of its surroundings, estimated based on a tube model, is to that of a neighboring vessel. The proposed method is tested on 20 CT images from different subjects, selected randomly from a lung cancer screening study. Lengths of the airway branches from the results of the proposed method are significantly...

  18. Vascular bone tumors: a proposal of a classification based on clinicopathological, radiographic and genetic features

    Energy Technology Data Exchange (ETDEWEB)

    Errani, Costantino [Istituto Ortopedico Rizzoli, Ortopedia Generale, Orthopaedic Service, Bagheria (Italy); Struttura Complessa Ortopedia Generale, Dipartimento Rizzoli-Sicilia, Bagheria, PA (Italy); Vanel, Daniel; Gambarotti, Marco; Alberghini, Marco [Istituto Ortopedico Rizzoli, Pathology Service, Bologna (Italy); Picci, Piero [Istituto Ortopedico Rizzoli, Laboratory for Cancer Research, Bologna (Italy); Faldini, Cesare [Istituto Ortopedico Rizzoli, Ortopedia Generale, Orthopaedic Service, Bagheria (Italy)

    2012-12-15

    The classification of vascular bone tumors remains challenging, with considerable morphological overlap spanning across benign to malignant categories. The vast majority of both benign and malignant vascular tumors are readily diagnosed based on their characteristic histological features, such as the formation of vascular spaces and the expression of endothelial markers. However, some vascular tumors have atypical histological features, such as a solid growth pattern, epithelioid change, or spindle cell morphology, which complicates their diagnosis. Pathologically, these tumors are remarkably similar, which makes differentiating them from each other very difficult. For this rare subset of vascular bone tumors, there remains considerable controversy with regard to the terminology and the classification that should be used. Moreover, one of the most confusing issues related to vascular bone tumors is the myriad of names that are used to describe them. Because the clinical behavior and, consequently, treatment and prognosis of vascular bone tumors can vary significantly, it is important to effectively and accurately distinguish them from each other. Upon review of the nomenclature and the characteristic clinicopathological, radiographic and genetic features of vascular bone tumors, we propose a classification scheme that includes hemangioma, hemangioendothelioma, angiosarcoma, and their epithelioid variants. (orig.)

  19. Vascular bone tumors: a proposal of a classification based on clinicopathological, radiographic and genetic features

    International Nuclear Information System (INIS)

    The classification of vascular bone tumors remains challenging, with considerable morphological overlap spanning across benign to malignant categories. The vast majority of both benign and malignant vascular tumors are readily diagnosed based on their characteristic histological features, such as the formation of vascular spaces and the expression of endothelial markers. However, some vascular tumors have atypical histological features, such as a solid growth pattern, epithelioid change, or spindle cell morphology, which complicates their diagnosis. Pathologically, these tumors are remarkably similar, which makes differentiating them from each other very difficult. For this rare subset of vascular bone tumors, there remains considerable controversy with regard to the terminology and the classification that should be used. Moreover, one of the most confusing issues related to vascular bone tumors is the myriad of names that are used to describe them. Because the clinical behavior and, consequently, treatment and prognosis of vascular bone tumors can vary significantly, it is important to effectively and accurately distinguish them from each other. Upon review of the nomenclature and the characteristic clinicopathological, radiographic and genetic features of vascular bone tumors, we propose a classification scheme that includes hemangioma, hemangioendothelioma, angiosarcoma, and their epithelioid variants. (orig.)

  20. A New Digital Signature Scheme Based on Mandelbrot and Julia Fractal Sets

    Directory of Open Access Journals (Sweden)

    M. A. Alia

    2007-01-01

    Full Text Available This paper describes a new cryptographic digital signature scheme based on the Mandelbrot and Julia fractal sets. A fractal-based digital signature scheme is possible due to the strong connection between the Mandelbrot and Julia fractal sets. The link between the two fractal sets is used for the conversion of the private key to the public key. The Mandelbrot fractal function takes the chosen private key as the input parameter and generates the corresponding public key. The Julia fractal function is then used to sign the message with the receiver's public key and to verify the received message based on the receiver's private key. The proposed scheme is resistant against attacks, utilizes a small key size and performs comparatively faster than the existing DSA and RSA digital signature schemes. The fractal digital signature scheme is an attractive alternative to traditional number-theoretic digital signature schemes.
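
    For orientation only, the Mandelbrot iteration that such fractal functions build on is shown below. This is purely illustrative of the underlying map z ← z² + c; the scheme's actual key-generation and signing functions are not reproduced here.

```python
# Purely illustrative Mandelbrot membership iteration; the signature scheme's
# key-generation functions are not specified in the abstract, so nothing
# beyond the basic map is sketched.
def mandelbrot_iterations(c, max_iter=100):
    """Iterations of z <- z*z + c before |z| escapes 2, or max_iter if never."""
    z = 0j
    for k in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return k
    return max_iter
```

The sensitivity of the escape behavior to tiny changes in c is the kind of one-way-looking structure the scheme exploits.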

  1. Cryptanalysis And Further Improvement Of A Biometric-Based Remote User Authentication Scheme Using Smart Cards

    Directory of Open Access Journals (Sweden)

    Ashok Kumar Das

    2011-03-01

    Full Text Available Recently, Li et al. proposed a secure biometric-based remote user authentication scheme using smart cards to withstand the security flaws of Li-Hwang’s efficient biometric-based remote user authentication scheme using smart cards. Li et al.’s scheme is based on biometrics verification, smart card and one-way hash function, and it also uses the random nonce rather than a synchronized clock, and thus it is efficient in computational cost and more secure than Li-Hwang’s scheme. Unfortunately, in this paper we show that Li et al.’s scheme still has some security weaknesses in their design. In order to withstand those weaknesses in their scheme, we further propose an improvement of their scheme so that the improved scheme always provides proper authentication and as a result, it establishes a session key between the user and the server at the end of successful user authentication.

  2. Hierarchical Classification of Chinese Documents Based on N-grams

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We explore techniques of utilizing N-gram information to categorize Chinese text documents hierarchically, so that the classifier can shake off the burden of large dictionaries and complex segmentation processing, and subsequently be domain- and time-independent. A hierarchical Chinese text classifier is implemented. Experimental results show that hierarchically classifying Chinese text documents based on N-grams can achieve satisfactory performance and outperforms the other traditional Chinese text classifiers.
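
    The segmentation-free representation at the heart of this approach is simply overlapping character N-grams, which can be sketched in one line:

```python
def char_ngrams(text, n=2):
    """Overlapping character n-grams: dictionary- and segmentation-free
    features for (e.g. Chinese) text classification."""
    return [text[i:i + n] for i in range(len(text) - n + 1)]
```

For example, `char_ngrams("中文文本", 2)` yields `["中文", "文文", "文本"]` without any word segmentation step.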

  3. Understanding Acupuncture Based on ZHENG Classification from System Perspective

    OpenAIRE

    Junwei Fang; Ningning Zheng; Yang Wang; Huijuan Cao; Shujun Sun; Jianye Dai; Qianhua Li; Yongyu Zhang

    2013-01-01

    Acupuncture is an effective therapy method that originated in ancient China; its study based on ZHENG classification is a systematic research effort toward understanding its complexity. The system perspective helps in understanding the essence of phenomena, and, with the coming of the systems biology era, broader technology platforms such as omics technologies were established for the objective study of traditional Chinese medicine (TCM). Omics technologies could dynamically determine molecular c...

  4. Active Dictionary Learning in Sparse Representation Based Classification

    OpenAIRE

    Xu, Jin; He, Haibo; Man, Hong

    2014-01-01

    Sparse representation, which uses dictionary atoms to reconstruct input vectors, has been studied intensively in recent years. A proper dictionary is key to the success of sparse representation. In this paper, an active dictionary learning (ADL) method is introduced, in which classification error and reconstruction error are considered as the active learning criteria in selecting atoms for dictionary construction. The learned dictionaries are calculated in sparse representation based...

  5. Label-Embedding for Attribute-Based Classification

    OpenAIRE

    Akata, Zeynep; Perronnin, Florent; Harchaoui, Zaid; Schmid, Cordelia

    2013-01-01

    Attributes are an intermediate representation, which enables parameter sharing between classes, a must when training data is scarce. We propose to view attribute-based image classification as a label-embedding problem: each class is embedded in the space of attribute vectors. We introduce a function which measures the compatibility between an image and a label embedding. The parameters of this function are learned on a training set of labeled samples to ensure that, gi...

  6. DATA MINING BASED TECHNIQUE FOR IDS ALERT CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    Hany Nashat Gabra

    2015-06-01

    Full Text Available Intrusion detection systems (IDSs have become a widely used measure for security systems. The main problem for such systems is the irrelevant alerts. We propose a data mining based method for classification to distinguish serious and irrelevant alerts with a performance of 99.9%, which is better in comparison with the other recent data mining methods that achieved 97%. A ranked alerts list is also created according to the alert’s importance to minimize human interventions.

  7. Simple-Random-Sampling-Based Multiclass Text Classification Algorithm

    OpenAIRE

    Wuying Liu; Lin Wang; Mianzhu Yi

    2014-01-01

    Multiclass text classification (MTC) is a challenging issue, and the corresponding MTC algorithms can be used in many applications. The space-time overhead of these algorithms is a serious concern in the era of big data. Through an investigation of the token frequency distribution in a Chinese web document collection, this paper reexamines the power law and proposes a simple-random-sampling-based MTC (SRSMTC) algorithm. Supported by a token-level memory to store labeled documents, the SRSMTC al...

  8. Expected energy-based restricted Boltzmann machine for classification.

    Science.gov (United States)

    Elfwing, S; Uchibe, E; Doya, K

    2015-04-01

    In classification tasks, restricted Boltzmann machines (RBMs) have predominantly been used in the first stage, either as feature extractors or to provide initialization of neural networks. In this study, we propose a discriminative learning approach to provide a self-contained RBM method for classification, inspired by free-energy based function approximation (FE-RBM), originally proposed for reinforcement learning. For classification, the FE-RBM method computes the output for an input vector and a class vector by the negative free energy of an RBM. Learning is achieved by stochastic gradient descent using a mean-squared error training objective. In an earlier study, we demonstrated that the performance and the robustness of FE-RBM function approximation can be improved by scaling the free energy by a constant that is related to the size of the network. In this study, we propose that the learning performance of RBM function approximation can be further improved by computing the output by the negative expected energy (EE-RBM), instead of the negative free energy. To create a deep learning architecture, we stack several RBMs on top of each other. We also connect the class nodes to all hidden layers to try to improve the performance even further. We validate the classification performance of EE-RBM using the MNIST data set and the NORB data set, achieving competitive performance compared with other classifiers such as standard neural networks, deep belief networks, classification RBMs, and support vector machines. The purpose of using the NORB data set is to demonstrate that EE-RBM with binary input nodes can achieve high performance in the continuous input domain. PMID:25318375
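
    The two outputs contrasted in this abstract differ only in how the hidden-unit inputs are pooled. Writing z_j for the total input to hidden unit j given an (input, class) pair, the free-energy output sums softplus(z_j) while the expected-energy output sums z_j·sigmoid(z_j). A sketch with placeholder weights (visible and class biases are omitted for brevity, an assumption of this sketch):

```python
import numpy as np

# z_j = total input to hidden unit j for input x and class vector y, with
# input weights W, class weights U and hidden biases b (all placeholders).
def hidden_inputs(x, y, W, U, b):
    return W @ x + U @ y + b

def negative_free_energy(z):
    """FE-RBM output: sum_j softplus(z_j)."""
    return float(np.sum(np.log1p(np.exp(z))))

def negative_expected_energy(z):
    """EE-RBM output: sum_j z_j * sigmoid(z_j)."""
    return float(np.sum(z / (1.0 + np.exp(-z))))
```

Classification then picks the class vector y whose output value is largest, whichever of the two pooling rules is used.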

  9. A Provably-Secure ECC-Based Authentication Scheme for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Junghyun Nam

    2014-11-01

    Full Text Available A smart-card-based user authentication scheme for wireless sensor networks (in short, a SUA-WSN scheme) is designed to restrict access to the sensor data only to users who are in possession of both a smart card and the corresponding password. While a significant number of SUA-WSN schemes have been suggested in recent years, their intended security properties lack formal definitions and proofs in a widely-accepted model. One consequence is that SUA-WSN schemes insecure against various attacks have proliferated. In this paper, we devise a security model for the analysis of SUA-WSN schemes by extending the widely-accepted model of Bellare, Pointcheval and Rogaway (2000). Our model provides formal definitions of authenticated key exchange and user anonymity while capturing side-channel attacks, as well as other common attacks. We also propose a new SUA-WSN scheme based on elliptic curve cryptography (ECC), and prove its security properties in our extended model. To the best of our knowledge, our proposed scheme is the first SUA-WSN scheme that provably achieves both authenticated key exchange and user anonymity. Our scheme is also computationally competitive with other ECC-based (non-provably secure) schemes.

  10. A soft-hard combination-based cooperative spectrum sensing scheme for cognitive radio networks.

    Science.gov (United States)

    Do, Nhu Tri; An, Beongku

    2015-01-01

    In this paper we propose a soft-hard combination scheme, called the SHC scheme, for cooperative spectrum sensing in cognitive radio networks. The SHC scheme deploys a cluster-based network in which Likelihood Ratio Test (LRT)-based soft combination is applied at each cluster, and weighted decision fusion rule-based hard combination is utilized at the fusion center. The novelties of the SHC scheme are as follows: the structure of the SHC scheme reduces the complexity of cooperative detection, which is an inherent limitation of soft combination schemes. By using the LRT, we can detect primary signals in a low signal-to-noise ratio regime (around an average of -15 dB). In addition, the computational complexity of the LRT is reduced since we derive the closed-form expression of the probability density function of the LRT value. The SHC scheme also takes into account the different effects of large-scale fading on different users in a wide area network. The simulation results show that the SHC scheme not only provides better sensing performance than the conventional hard combination schemes, but also reduces sensing overhead in terms of reporting time compared to the conventional soft combination scheme using the LRT. PMID:25688589
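
    The two stages can be sketched under a simple Gaussian assumption (which is this sketch's assumption, not necessarily the paper's exact model): each cluster pools its members' samples into one log-likelihood-ratio statistic (soft combination), and the fusion center takes a weighted vote over cluster decisions (hard combination). Thresholds, variances and weights below are illustrative.

```python
import math

# Soft stage: LRT between H0 (noise only, variance var0) and H1 (signal
# present, variance var1 > var0) for zero-mean Gaussian samples.
def log_lrt(samples, var0, var1):
    n = len(samples)
    energy = sum(s * s for s in samples)
    return 0.5 * n * math.log(var0 / var1) + 0.5 * (1 / var0 - 1 / var1) * energy

def cluster_decision(samples, var0, var1, thresh=0.0):
    """One cluster's hard decision from its pooled soft statistic."""
    return 1 if log_lrt(samples, var0, var1) > thresh else 0

# Hard stage: weighted decision fusion over the per-cluster decisions.
def fusion(decisions, weights):
    score = sum(w * (1 if d else -1) for d, w in zip(decisions, weights))
    return 1 if score > 0 else 0
```

Only one bit per cluster reaches the fusion center, which is how this structure trims the reporting overhead of a fully soft scheme.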

  12. Tree-based disease classification using protein data.

    Science.gov (United States)

    Zhu, Hongtu; Yu, Chang-Yung; Zhang, Heping

    2003-09-01

    A reliable and precise classification of diseases is essential for successful diagnosis and treatment. Using mass spectrometry of clinical specimens, scientists may find protein variations among diseases and use this information to improve diagnosis. In this paper, we propose a novel procedure to classify disease status based on protein data from mass spectrometry. Our new tree-based algorithm consists of three steps: projection, selection and classification tree. The projection step aims to project all observations from specimens onto the same basis so that the projected data have fixed coordinates. Thus, for each specimen, we obtain a large vector of 'coefficients' on the same basis. The purpose of the selection step is data reduction, condensing the large vector from the projection step into a much lower-order informative vector. Finally, using these reduced vectors, we apply recursive partitioning to construct an informative classification tree. This method has been successfully applied to protein data provided by the Department of Radiology and Chemistry at Duke University.

  13. A unified classification of alien species based on the magnitude of their environmental impacts.

    Directory of Open Access Journals (Sweden)

    Tim M Blackburn

    2014-05-01

    Full Text Available Species moved by human activities beyond the limits of their native geographic ranges into areas in which they do not naturally occur (termed aliens) can cause a broad range of significant changes to recipient ecosystems; however, their impacts vary greatly across species and the ecosystems into which they are introduced. There is therefore a critical need for a standardised method to evaluate, compare, and eventually predict the magnitudes of these different impacts. Here, we propose a straightforward system for classifying alien species according to the magnitude of their environmental impacts, based on the mechanisms of impact used to code species in the International Union for Conservation of Nature (IUCN) Global Invasive Species Database, which are presented here for the first time. The classification system uses five semi-quantitative scenarios describing impacts under each mechanism to assign species to different levels of impact, ranging from Minimal to Massive, with assignment corresponding to the highest level of deleterious impact associated with any of the mechanisms. The scheme also includes categories for species that are Not Evaluated, have No Alien Population, or are Data Deficient, and a method for assigning uncertainty to all the classifications. We show how this classification system is applicable at different levels of ecological complexity and different spatial and temporal scales, and embraces existing impact metrics. In fact, the scheme is analogous to the already widely adopted and accepted Red List approach to categorising extinction risk, and so could conceivably be readily integrated with existing practices and policies in many regions.
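
    The assignment rule is a maximum over mechanisms. In the sketch below, the endpoint level names Minimal and Massive come from the abstract; the intermediate names and the example mechanism data are invented for illustration.

```python
# Overall category = highest (most deleterious) impact level reached under
# any mechanism; intermediate level names here are assumed, not sourced.
LEVELS = ["Minimal", "Minor", "Moderate", "Major", "Massive"]

def classify_species(mechanism_impacts):
    """mechanism_impacts: {mechanism: level}; empty dict -> 'Data Deficient'."""
    if not mechanism_impacts:
        return "Data Deficient"
    return max(mechanism_impacts.values(), key=LEVELS.index)
```

This mirrors the Red List style of categorisation: one worst-case label per species, with explicit categories for unassessable cases.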

  14. A Topic Space Oriented User Group Discovering Scheme in Social Network: A Trust Chain Based Interest Measuring Perspective

    Directory of Open Access Journals (Sweden)

    Wang Dong

    2016-01-01

    Currently, user groups have become an effective platform for information sharing and communication among users in social network sites. In the present work, we propose a single-topic user group discovering scheme, which includes three phases: topic impact evaluation, interest degree measurement, and trust chain based discovering, to enable selecting an influential topic and discovering users into a topic oriented group. Our main contributions include (1) an overview of the proposed scheme and its related definitions; (2) a topic space construction method based on topic relatedness clustering and its impact (influence degree and popularity degree) evaluation; (3) a trust chain model that takes the topological information of the user relation network into account with a strength classification perspective; (4) an interest degree (user explicit and implicit interest degree) evaluation method based on trust chains among users; and (5) a topic space oriented user group discovering method that groups core users according to their explicit interest degrees and predicts ordinary users from implicit interest and user trust chains. Finally, experimental results are given to demonstrate the effectiveness and feasibility of our scheme.

  15. Intra-generational Redistribution under Public Pension Planning Based on Generation-based Funding Scheme

    Science.gov (United States)

    Banjo, Daisuke; Tamura, Hiroyuki; Murata, Tadahiko

    In this paper, we propose a method of determining the pension in the generation-based funding scheme. In this proposal, we include two types of pensions in the scheme. One is the payment-amount related pension and the other is the payment-frequency related pension. We set the ratio of the total amount of payment-amount related pension to the total amount of both pensions, and simulate income gaps and the relationship between contributions and benefits for each individual when the proposed method is applied.

  16. ECC Based Threshold Decryption Scheme and Its Application in Web Security

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xian-feng; ZHANG Feng; QIN Zhi-guang; LIU Jin-de

    2004-01-01

    Threshold cryptography provides a new approach to building intrusion-tolerant applications. In this paper, a threshold decryption scheme based on elliptic curve cryptography is presented. A zero-knowledge test approach based on elliptic curve cryptography is designed. The application of these techniques in Web security is studied. Performance analysis shows that our scheme is characterized by excellent security as well as high efficiency.
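    The abstract does not detail the construction, but threshold decryption schemes are generally built on (t, n) secret sharing of the private key, so that any t parties can jointly decrypt while fewer learn nothing. A minimal Shamir sharing over a prime field, shown as a generic illustration rather than the paper's elliptic-curve scheme:

```python
import random

# Minimal Shamir (t, n) secret sharing over a prime field GF(P): the
# generic building block behind threshold decryption. Illustrative only;
# the paper's construction works over an elliptic curve group instead.
P = 2**127 - 1  # a Mersenne prime used as the field modulus

def split(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the sharing polynomial at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789  # any 3 shares suffice
```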

  17. UNIFIED COMPUTATION OF FLOW WITH COMPRESSIBLE AND INCOMPRESSIBLE FLUID BASED ON ROE'S SCHEME

    Institute of Scientific and Technical Information of China (English)

    HUANG Dian-gui

    2006-01-01

    A unified numerical scheme for the solutions of the compressible and incompressible Navier-Stokes equations is investigated based on a time-derivative preconditioning algorithm. The primitive variables are pressure, velocities and temperature. The time integration scheme is used in conjunction with a finite volume discretization. The preconditioning is coupled with a high-order implicit upwind scheme based on the definition of a Roe-type matrix. Computational capabilities are demonstrated through computations of high Mach number, middle Mach number, very low Mach number, and incompressible flow. It has also been demonstrated that discontinuous surfaces in the flow field can be captured by the implemented Roe scheme.

  18. Performance Analysis of Virtual MIMO Relaying Schemes Based on Detect–Split–Forward

    KAUST Repository

    Al-Basit, Suhaib M.

    2014-10-29

    Virtual multi-input multi-output (vMIMO) schemes in wireless communication systems improve coverage, throughput, capacity, and quality of service. In this paper, we propose three uplink vMIMO relaying schemes based on detect–split–forward (DSF). In addition, we investigate the effect of several physical parameters such as distance, modulation type and number of relays. Furthermore, an adaptive vMIMO DSF scheme based on VBLAST and STBC is proposed. Analytical tools are provided to evaluate the performance of the proposed vMIMO relaying schemes.

  19. Identity based Encryption and Biometric Authentication Scheme for Secure Data Access in Cloud Computing

    DEFF Research Database (Denmark)

    Cheng, Hongbing; Rong, Chunming; Tan, Zheng-Hua;

    2012-01-01

    Cloud computing will be a main information infrastructure in the future; it consists of many large datacenters which are usually geographically distributed and heterogeneous. How to design a secure data access scheme for a cloud computing platform is a big challenge. In this paper, we propose a secure data access scheme based on identity-based encryption and biometric authentication for cloud computing. Firstly, we describe the security concerns of cloud computing and then propose an integrated data access scheme for cloud computing; the procedure of the proposed scheme includes parameter setup, key distribution, feature template creation, cloud data processing and secure data access control. Finally, we compare the proposed scheme with other schemes through comprehensive analysis and simulation. The results show that the proposed data access scheme is feasible and secure for cloud computing.

  20. The DTW-based representation space for seismic pattern classification

    Science.gov (United States)

    Orozco-Alzate, Mauricio; Castro-Cabrera, Paola Alexandra; Bicego, Manuele; Londoño-Bonilla, John Makario

    2015-12-01

    Distinguishing among the different seismic volcanic patterns is still one of the most important and labor-intensive tasks for volcano monitoring. This task could be lightened and made free from subjective bias by using automatic classification techniques. In this context, a core but often overlooked issue is the choice of an appropriate representation of the data to be classified. Recently, it has been suggested that using a relative representation (i.e. proximities, namely dissimilarities on pairs of objects) instead of an absolute one (i.e. features, namely measurements on single objects) is advantageous to exploit the relational information contained in the dissimilarities to derive highly discriminant vector spaces, where any classifier can be used. According to that motivation, this paper investigates the suitability of a dynamic time warping (DTW) dissimilarity-based vector representation for the classification of seismic patterns. Results show the usefulness of such a representation in the seismic pattern classification scenario, including analyses of potential benefits from recent advances in the dissimilarity-based paradigm such as the proper selection of representation sets and the combination of different dissimilarity representations that might be available for the same data.
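    The dissimilarity-space idea sketched above can be made concrete: each signal is represented by its vector of DTW distances to a small representation set, and that vector feeds any ordinary classifier. A minimal sketch with toy sequences (real seismic traces would be far longer, and the representation set would be chosen carefully):

```python
# Classic O(n*m) dynamic time warping distance plus the dissimilarity-space
# representation: one DTW distance per prototype in the representation set.
def dtw(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def to_dissimilarity_vector(signal, representation_set):
    """Represent `signal` by its DTW distances to each prototype."""
    return [dtw(signal, r) for r in representation_set]

prototypes = [[0, 1, 2, 1, 0], [5, 5, 5, 5]]  # hypothetical representation set
print(to_dissimilarity_vector([0, 1, 2, 2, 1, 0], prototypes))
```

Any vector-space classifier (nearest neighbour, SVM, etc.) can then be trained on these fixed-length vectors.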

  1. Data Classification Based on Confidentiality in Virtual Cloud Environment

    Directory of Open Access Journals (Sweden)

    Munwar Ali Zardari

    2014-10-01

    The aim of this study is to provide suitable security to data based on the security needs of the data. It is very difficult to decide (in the cloud) which data need what security and which data do not need security. However, it becomes easy to decide the security level for data after classifying the data according to their security needs, based on the characteristics of the data. In this study, we propose a data classification cloud model to solve the data confidentiality issue in cloud computing environments. The data are classified into two major classes: sensitive and non-sensitive. The K-Nearest Neighbour (K-NN) classifier is used for data classification and the Rivest, Shamir and Adleman (RSA) algorithm is used to encrypt sensitive data. After implementing the proposed model, it is found that the confidentiality level of the data is increased, and the model is proved to be more cost and memory friendly for the users as well as for the cloud service providers. The data storage service is one of the cloud services where data servers are virtualized across all users. In a cloud server, the data are stored in two ways: first, encrypt the received data and store them on cloud servers; second, store the data on the cloud servers without encryption. Both of these data storage methods can face the data confidentiality issue, because the data have different values and characteristics that must be identified before being sent to cloud servers.
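    The two-stage idea above, where a K-NN vote labels each record as sensitive or non-sensitive and only sensitive records are then RSA-encrypted before upload, can be sketched minimally. The features, labels and training points here are invented, and a real deployment would use a vetted cryptography library for the RSA step:

```python
import math

# Toy k-NN gate in front of the (omitted) RSA encryption step.
# Hypothetical 2-D feature vectors with sensitivity labels.
train = [((1.0, 0.9), "sensitive"), ((0.9, 1.0), "sensitive"),
         ((0.1, 0.2), "non-sensitive"), ((0.2, 0.1), "non-sensitive")]

def knn_label(x, k=3):
    """Majority vote among the k nearest training points."""
    nearest = sorted(train, key=lambda p: math.dist(x, p[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

record = (0.95, 0.85)
if knn_label(record) == "sensitive":
    print("encrypt before storing")   # RSA encryption would happen here
else:
    print("store in plaintext")
```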

  2. Changing Histopathological Diagnostics by Genome-Based Tumor Classification

    Directory of Open Access Journals (Sweden)

    Michael Kloth

    2014-05-01

    Traditionally, tumors are classified by histopathological criteria, i.e., based on their specific morphological appearances. Consequently, current therapeutic decisions in oncology are strongly influenced by histology rather than underlying molecular or genomic aberrations. The increase of information on molecular changes, however, enabled by the Human Genome Project and the International Cancer Genome Consortium as well as the manifold advances in molecular biology and high-throughput sequencing techniques, inaugurated the integration of genomic information into disease classification. Furthermore, in some cases it became evident that former classifications needed major revision and adaptation. Such adaptations are often required by understanding the pathogenesis of a disease from a specific molecular alteration, using this molecular driver for targeted and highly effective therapies. Altogether, reclassifications should lead to higher information content of the underlying diagnoses, reflecting their molecular pathogenesis and resulting in optimized and individual therapeutic decisions. The objective of this article is to summarize some particularly important examples of genome-based classification approaches and associated therapeutic concepts. In addition to reviewing disease-specific markers, we focus on potentially therapeutic or predictive markers and the relevance of molecular diagnostics in disease monitoring.

  3. Simple-Random-Sampling-Based Multiclass Text Classification Algorithm

    Directory of Open Access Journals (Sweden)

    Wuying Liu

    2014-01-01

    Multiclass text classification (MTC) is a challenging issue and the corresponding MTC algorithms can be used in many applications. The space-time overhead of such algorithms is a serious concern in the era of big data. Through an investigation of the token frequency distribution in a Chinese web document collection, this paper reexamines the power law and proposes a simple-random-sampling-based MTC (SRSMTC) algorithm. Supported by a token-level memory to store labeled documents, the SRSMTC algorithm uses a text retrieval approach to solve text classification problems. The experimental results on the TanCorp data set show that the SRSMTC algorithm can achieve state-of-the-art performance at greatly reduced space-time requirements.

  4. A Fuzzy Similarity Based Concept Mining Model for Text Classification

    CERN Document Server

    Puri, Shalini

    2012-01-01

    Text classification is a challenging and very active field with great importance in text categorization applications. A lot of research work has been done in this field, but there is still a need to categorize a collection of text documents into mutually exclusive categories by extracting the concepts or features using a supervised learning paradigm and different classification algorithms. In this paper, a new Fuzzy Similarity Based Concept Mining Model (FSCMM) is proposed to classify a set of text documents into pre-defined Category Groups (CG) by training and preparing them on the sentence, document and integrated corpora levels, along with feature reduction and ambiguity removal on each level, to achieve high system performance. A Fuzzy Feature Category Similarity Analyzer (FFCSA) is used to analyze each extracted feature of the Integrated Corpora Feature Vector (ICFV) with the corresponding categories or classes. This model uses a Support Vector Machine Classifier (SVMC) to classify correct...

  5. Semantic analysis based forms information retrieval and classification

    Science.gov (United States)

    Saba, Tanzila; Alqahtani, Fatimah Ayidh

    2013-09-01

    Data entry forms are employed in all types of enterprises to collect hundreds of customers' information on a daily basis. The information is filled in manually by the customers. Hence, it is laborious and time consuming for a human operator to transfer this customer information into computers manually. Additionally, it is expensive, and human errors might cause serious flaws. The automatic interpretation of scanned forms has facilitated many real applications from the speed and accuracy point of view, such as keyword spotting, sorting of postal addresses, script matching and writer identification. This research deals with different strategies to extract customers' information from these scanned forms, its interpretation and classification. Accordingly, extracted information is segmented into characters for classification and finally stored as records in databases for further processing. This paper presents a detailed discussion of these semantic-analysis-based strategies for forms processing. Finally, new directions are also recommended for future research.

  6. Entropy coders for image compression based on binary forward classification

    Science.gov (United States)

    Yoo, Hoon; Jeong, Jechang

    2000-12-01

    Entropy coders, as a noiseless compression method, are widely used as the final compression step for images, and there have been many contributions to increasing entropy coder performance and reducing entropy coder complexity. In this paper, we propose entropy coders based on binary forward classification (BFC). The BFC requires classification overhead, but there is no change between the amount of input information and the total amount of classified output information, a property we prove in this paper. Using the proved property, we propose entropy coders consisting of the BFC followed by Golomb-Rice coders (BFC+GR) and the BFC followed by arithmetic coders (BFC+A). The proposed entropy coders introduce negligible additional complexity due to the BFC. Simulation results also show better performance than other entropy coders of similar complexity.
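    For reference, a Golomb-Rice coder of the kind the BFC output would feed is only a few lines: a power-of-two parameter m = 2**k splits each value into a unary quotient and a k-bit binary remainder. This is a generic sketch (for k >= 1), not the paper's exact configuration:

```python
# Minimal Golomb-Rice coder with parameter m = 2**k, operating on
# non-negative integers and bit strings; k >= 1 assumed.
def rice_encode(n, k):
    """Unary-coded quotient, a '0' terminator, then a k-bit remainder."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def rice_decode(bits, k):
    """Inverse of rice_encode for a single codeword."""
    q = bits.index("0")                 # count leading '1's
    r = int(bits[q + 1:q + 1 + k], 2)   # k-bit remainder
    return (q << k) | r

assert rice_encode(5, 2) == "1001"      # q=1 -> "10", r=1 -> "01"
for n in range(20):
    assert rice_decode(rice_encode(n, 2), 2) == n
```

Small values get short codewords, which is why Rice codes suit the geometrically distributed residuals that classification tends to produce.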

  7. An ellipse detection algorithm based on edge classification

    Science.gov (United States)

    Yu, Liu; Chen, Feng; Huang, Jianming; Wei, Xiangquan

    2015-12-01

    In order to enhance the speed and accuracy of ellipse detection, an ellipse detection algorithm based on edge classification is proposed. Redundant edge points are removed by serializing edges into point sequences and applying a distance constraint between edge points. Effective classification is achieved using an angle criterion between edge points, which greatly increases the probability that randomly selected edge points fall on the same ellipse. Ellipse fitting accuracy is significantly improved by optimization of the RED algorithm, which uses the Euclidean distance to measure the distance from an edge point to the elliptical boundary. Experimental results show that the algorithm detects ellipses well in cases of edge interference or mutually occluding edges, and that it has higher detection precision and lower time consumption than the RED algorithm.

  8. Efficient Identity Based Signcryption Scheme with Public Verifiability and Forward Security

    Institute of Scientific and Technical Information of China (English)

    LEI Fei-yu; CHEN Wen; CHEN Ke-fei; MA Chang-she

    2005-01-01

    In this paper, we point out that Libert and Quisquater's signcryption scheme cannot provide public verifiability. We then present a new identity-based signcryption scheme using quadratic residues and pairings over elliptic curves. It combines the functionalities of public verifiability and forward security at the same time. Under the Bilinear Diffie-Hellman and quadratic residue assumptions, we show that the new scheme is more secure and can be somewhat more efficient than Libert and Quisquater's.

  9. Error Robust H.264 Video Transmission Schemes Based on Multi-frame

    Institute of Scientific and Technical Information of China (English)

    余红斌; 余松煜; 王慈

    2004-01-01

    Multi-frame coding is supported by the emerging H.264 standard. It is important for enhancing both coding efficiency and error robustness. In this paper, error-resilient schemes for H.264 based on multi-frame coding were investigated. Error-robust H.264 video transmission schemes are introduced for applications with and without a feedback channel. The experimental results demonstrate the effectiveness of the proposed schemes.

  10. DWT-Based Robust Color Image Watermarking Scheme

    Institute of Scientific and Technical Information of China (English)

    Liu Lianshan; Li Renhou; Gao Qi

    2005-01-01

    A scheme for embedding an encrypted watermark into the green component of a color image is proposed. The embedding process is implemented in the discrete wavelet transform (DWT) domain. The original binary watermark image is first encrypted through a scrambling technique, then spread with two orthogonal pseudo-random sequences whose mean values are equal to zero, and finally embedded into the DWT low-frequency sub-band of the green component. The coefficients whose energies are larger than the others are selected to hide the watermark, and the hidden watermark strength is determined by the energy ratio between the selected coefficients' energies and the mean energy of the sub-band. The experimental results demonstrate that the proposed watermarking scheme is very robust against attacks such as additive noise, low-pass filtering, scaling, image cropping, row (or column) deletion, and JPEG compression.

  11. A continuous and prognostic convection scheme based on buoyancy, PCMT

    Science.gov (United States)

    Guérémy, Jean-François; Piriou, Jean-Marcel

    2016-04-01

    A new and consistent convection scheme (PCMT: Prognostic Condensates Microphysics and Transport), providing a continuous and prognostic treatment of this atmospheric process, is described. The main concept ensuring the consistency of the whole system is buoyancy, the key element of any vertical motion. The buoyancy constitutes the forcing term of the convective vertical velocity, which is then used to define the triggering condition, the mass flux, and the rates of entrainment-detrainment. The buoyancy is also used in its vertically integrated form (CAPE) to determine the closure condition. The continuous treatment of convection, from dry thermals to deep precipitating convection, is achieved with the help of a continuous formulation of the entrainment-detrainment rates (depending on the convective vertical velocity) and of the CAPE relaxation time (depending on the convective overturning time). The convective tendencies are directly expressed in terms of condensation and transport. Finally, the convective vertical velocity and condensates are fully prognostic, the latter being treated using the same microphysics scheme as for the resolved condensates but considering the convective environment. A Single Column Model (SCM) validation of this scheme is shown, allowing detailed comparisons with observed and explicitly simulated data. Four cases covering the convective spectrum are considered: over ocean, sensitivity to environmental moisture (S. Derbyshire case) from non-precipitating shallow convection to deep precipitating convection; trade-wind shallow convection (BOMEX); stratocumulus (FIRE); together with an entire continental diurnal cycle of convection (ARM). The emphasis is put on the characteristics of the scheme which enable a continuous treatment of convection. Then, a 3D LAM validation is presented considering an AMMA case with both observations and a CRM simulation using the same initial and lateral conditions as for the parameterized one. Finally, global

  12. Local fractal dimension based approaches for colonic polyp classification.

    Science.gov (United States)

    Häfner, Michael; Tamaki, Toru; Tanaka, Shinji; Uhl, Andreas; Wimmer, Georg; Yoshida, Shigeto

    2015-12-01

    This work introduces texture analysis methods that are based on computing the local fractal dimension (LFD; also called the local density function) and applies them to colonic polyp classification. The methods are tested on 8 HD-endoscopic image databases, where each database is acquired using a different imaging modality (Pentax's i-Scan technology combined with or without staining the mucosa), and on a zoom-endoscopic image database using narrow-band imaging. In this paper, we present three novel extensions to an LFD based approach. These extensions additionally extract shape and/or gradient information of the image to enhance the discriminativity of the original approach. To compare the results of the LFD based approaches with the results of other approaches, five state-of-the-art approaches for colonic polyp classification are applied to the employed databases. Experiments show that LFD based approaches are well suited for colonic polyp classification, especially the three proposed extensions, which are the best performing methods, or at least among the best performing methods, for each of the employed databases. The methods are additionally tested by means of a public texture image database, the UIUCtex database. With this database, the viewpoint invariance of the methods is assessed, an important feature for the employed endoscopic image databases. Results imply that most of the LFD based methods are more viewpoint invariant than the other methods. However, the shape, size and orientation adapted LFD approaches (which are especially designed to enhance viewpoint invariance) are in general not more viewpoint invariant than the other LFD based approaches.
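    To make the fractal-dimension idea concrete: the dimension of a point set can be estimated as the slope of log(box count) versus log(1/box size). The paper's LFD is computed per pixel neighborhood; the following global box-count on a toy point set is only a sketch of the underlying log-log slope:

```python
import math

# Box-counting estimate of fractal dimension: count occupied boxes at
# several scales, then take the least-squares slope of
# log(count) vs log(1/size). Toy illustration, not the paper's LFD.
def box_count_dimension(points, sizes=(1, 2, 4, 8)):
    logs = []
    for s in sizes:
        boxes = {(x // s, y // s) for x, y in points}
        logs.append((math.log(1.0 / s), math.log(len(boxes))))
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    num = sum((x - mx) * (y - my) for x, y in logs)
    den = sum((x - mx) ** 2 for x, _ in logs)
    return num / den

# A filled 16x16 square is two-dimensional, so the estimate is 2.
square = [(x, y) for x in range(16) for y in range(16)]
print(round(box_count_dimension(square), 2))  # -> 2.0
```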

  13. Rule based fuzzy logic approach for classification of fibromyalgia syndrome.

    Science.gov (United States)

    Arslan, Evren; Yildiz, Sedat; Albayrak, Yalcin; Koklukaya, Etem

    2016-06-01

    Fibromyalgia syndrome (FMS) is a chronic muscle and skeletal system disease observed generally in women, manifesting itself with widespread pain and impairing the individual's quality of life. FMS diagnosis is made based on the American College of Rheumatology (ACR) criteria. However, recently the employability and sufficiency of the ACR criteria have been under debate. In this context, several evaluation methods, including clinical evaluation methods, were proposed by researchers. Accordingly, ACR had to update their criteria, announced in 1990, 2010 and 2011. The proposed rule-based fuzzy logic method aims to evaluate FMS from a different angle as well. This method contains a rule base derived from the 1990 ACR criteria and the individual experiences of specialists. The study was conducted using data collected from 60 inpatients and 30 healthy volunteers. Several tests and physical examinations were administered to the participants. The fuzzy logic rule base was structured using the parameters of tender point count, chronic widespread pain period, pain severity, fatigue severity and sleep disturbance level, which were deemed important in FMS diagnosis. It has been observed that the fuzzy predictor was generally 95.56 % consistent with at least one of the specialists, who were not creators of the fuzzy rule base. Thus, in diagnosis classification, where the severity of FMS was classified as well, consistent findings were obtained from the comparison of the interpretations and experiences of specialists and the fuzzy logic approach. The study proposes a rule base which could eliminate the shortcomings of the 1990 ACR criteria during the FMS evaluation process. Furthermore, the proposed method presents a classification of the severity of the disease, which was not available with the ACR criteria. The study was not limited to disease classification alone; at the same time, the probability of occurrence and severity was classified. In addition, those who were not suffering from FMS were

  15. Gene function classification using Bayesian models with hierarchy-based priors

    Directory of Open Access Journals (Sweden)

    Neal Radford M

    2006-10-01

    Background We investigate whether annotation of gene function can be improved using a classification scheme that is aware that functional classes are organized in a hierarchy. The classifiers look at phylogenic descriptors, sequence-based attributes, and predicted secondary structure. We discuss three Bayesian models and compare their performance in terms of predictive accuracy. These models are the ordinary multinomial logit (MNL) model, a hierarchical model based on a set of nested MNL models, and an MNL model with a prior that introduces correlations between the parameters for classes that are nearby in the hierarchy. We also provide a new scheme for combining different sources of information. We use these models to predict the functional class of Open Reading Frames (ORFs) from the E. coli genome. Results The results from all three models show substantial improvement over previous methods, which were based on the C5 decision tree algorithm. The MNL model using a prior based on the hierarchy outperforms both the non-hierarchical MNL model and the nested MNL model. In contrast to previous attempts at combining the three sources of information in this dataset, our new approach to combining data sources produces a higher accuracy rate than applying our models to each data source alone. Conclusion Together, these results show that gene function can be predicted with higher accuracy than previously achieved, using Bayesian models that incorporate suitable prior information.
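    The baseline model in the abstract, the multinomial logit, computes class probabilities as a softmax over linear scores. A bare-bones sketch with invented weights; the paper's hierarchy-based priors are a Bayesian refinement on top of this likelihood and are not shown:

```python
import math

# Multinomial logit (MNL): class probabilities are a softmax over
# per-class linear scores w_c . x. Weights here are hypothetical.
def mnl_probs(x, weights):
    scores = [sum(w * xi for w, xi in zip(wc, x)) for wc in weights]
    mx = max(scores)                      # subtract max for stability
    exps = [math.exp(s - mx) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

W = [[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]]  # one weight row per class
p = mnl_probs([2.0, 0.5], W)
assert abs(sum(p) - 1.0) < 1e-9
print(max(range(3), key=lambda c: p[c]))  # predicted class index -> 0
```

A hierarchy-aware prior would tie together the weight rows of classes that are close in the functional hierarchy, rather than treating each row independently.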

  16. Network Traffic Anomalies Identification Based on Classification Methods

    Directory of Open Access Journals (Sweden)

    Donatas Račys

    2015-07-01

    The problem of detecting network traffic anomalies in computer networks is analyzed. An overview of anomaly detection methods is given, and the advantages and disadvantages of the different methods are analyzed. A model for traffic anomaly detection was developed based on IBM SPSS Modeler and used to analyze SNMP data from a router. The investigation of traffic anomalies was carried out using three classification methods and different sets of learning data. Based on the results of the investigation, it was determined that the C5.1 decision tree method has the highest accuracy and performance and can be successfully used for identification of network traffic anomalies.

  17. Land Cover - Minnesota Land Cover Classification System

    Data.gov (United States)

    Minnesota Department of Natural Resources — Land cover data set based on the Minnesota Land Cover Classification System (MLCCS) coding scheme. This data was produced using a combination of aerial photograph...

  18. Spectral classification of stars based on LAMOST spectra

    CERN Document Server

    Liu, Chao; Zhang, Bo; Wan, Jun-Chen; Deng, Li-Cai; Hou, Yonghui; Wang, Yuefei; Yang, Ming; Zhang, Yong

    2015-01-01

    In this work, we select high signal-to-noise ratio spectra of stars from the LAMOST data and map their MK classes to the spectral features. The equivalent widths of the prominent spectral lines, playing a similar role to multi-color photometry, form a clean stellar locus well ordered by MK class. The advantage of the stellar locus in line indices is that it gives a natural and continuous classification of stars consistent with either the broadly used MK classes or the stellar astrophysical parameters. We also employ an SVM-based classification algorithm to assign MK classes to the LAMOST stellar spectra. We find that the completeness of the classification is up to 90% for A and G type stars, while it is down to about 50% for OB and K type stars. About 40% of the OB and K type stars are mis-classified as A and G type stars, respectively. This is likely owing to the differences in spectral features between late B type and early A type stars, or between late G and early K type stars, being very we...

  19. Risk Classification and Risk-based Safety and Mission Assurance

    Science.gov (United States)

    Leitner, Jesse A.

    2014-01-01

    Recent activities to revamp and emphasize the need to streamline processes and activities for Class D missions across the agency have led to various interpretations of Class D, including the lumping of a variety of low-cost projects into Class D. Sometimes terms such as Class D minus are used. In this presentation, mission risk classifications will be traced to official requirements and definitions as a measure to ensure that projects and programs align with the guidance and requirements that are commensurate with their defined risk posture. As part of this, the full suite of risk classifications, formal and informal, will be defined, followed by an introduction to the new GPR 8705.4 that is currently under review. GPR 8705.4 lays out guidance for the mission success activities performed at Classes A-D for NPR 7120.5 projects as well as for projects not under NPR 7120.5. Furthermore, the trends in stepping from Class A into higher risk posture classifications will be discussed. The talk will conclude with a discussion about risk-based safety and mission assurance at GSFC.

  20. Classification of body movements based on posturographic data.

    Science.gov (United States)

    Saripalle, Sashi K; Paiva, Gavin C; Cliett, Thomas C; Derakhshani, Reza R; King, Gregory W; Lovelace, Christopher T

    2014-02-01

    The human body, standing on two feet, produces a continuous sway pattern. Intended movements, sensory cues, emotional states, and illnesses can all lead to subtle changes in sway appearing as alterations in ground reaction forces and the body's center of pressure (COP). The purpose of this study is to demonstrate that carefully selected COP parameters and classification methods can differentiate among specific body movements while standing, providing new prospects in camera-free motion identification. Force platform data were collected from participants performing 11 choreographed postural and gestural movements. Twenty-three different displacement- and frequency-based features were extracted from COP time series, and supplied to classification-guided feature extraction modules. For identification of movement type, several linear and nonlinear classifiers were explored; including linear discriminants, nearest neighbor classifiers, and support vector machines. The average classification rates on previously unseen test sets ranged from 67% to 100%. Within the context of this experiment, no single method was able to uniformly outperform the others for all movement types, and therefore a set of movement-specific features and classifiers is recommended.
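    As a rough illustration of the pipeline this abstract describes, the sketch below computes a few common displacement-based COP features (RMS sway, path length, mean velocity) and classifies with a nearest-neighbor rule. The feature choice and the 1-NN stand-in are assumptions for illustration, not the authors' exact 23-feature set or classifiers.

```python
import math

def cop_features(cop_x, cop_y, fs=100.0):
    """Extract simple displacement-based sway features from a
    center-of-pressure (COP) trace sampled at fs Hz."""
    n = len(cop_x)
    mx = sum(cop_x) / n
    my = sum(cop_y) / n
    # RMS displacement about the mean stance position
    rms = math.sqrt(sum((x - mx) ** 2 + (y - my) ** 2
                        for x, y in zip(cop_x, cop_y)) / n)
    # Total sway path length
    path = sum(math.hypot(cop_x[i] - cop_x[i - 1], cop_y[i] - cop_y[i - 1])
               for i in range(1, n))
    # Mean sway velocity
    velocity = path * fs / (n - 1)
    return [rms, path, velocity]

def nearest_neighbor(train, labels, feat):
    """1-NN over feature vectors (a stand-in for the paper's
    discriminant/SVM classifiers)."""
    dists = [math.dist(t, feat) for t in train]
    return labels[dists.index(min(dists))]
```

    Frequency-domain features, as used in the study, would be extracted from the same trace before classification.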

  1. Genetic comparison of breeding schemes based on semen importation and local breeding schemes: framework and application to Costa Rica.

    Science.gov (United States)

    Vargas, B; van Arendonk, J A M

    2004-05-01

    Local breeding schemes for Holstein cattle of Costa Rica were compared with the current practice based on continuous semen importation (SI) by deterministic simulation. Comparison was made on the basis of genetic response and the correlation between breeding goals. A local breeding goal was defined on the basis of prevailing production circumstances and compared against a typical breeding goal for an exporting country. Differences in genetic response between strategies arose because of the reduction in the rate of genetic response when SI was used. When the genetic correlation was assumed equal to 0.75, the genetic response achieved with SI was reduced to the same level as local progeny testing. When an initial difference in average genetic merit of the populations was assumed, this had only a temporary effect on the relative ranking of strategies, which was reversed after some years of selection because the rates of genetic response remained unchanged. Given that the actual levels of genetic correlation between countries may be around 0.60, it was concluded that a local breeding scheme based on a nucleus herd could provide better results than the current strategy based on SI. PMID:15290999
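    The ranking-reversal argument in this abstract can be sketched numerically: with illustrative response rates (not the paper's figures), an initial merit advantage of the importation path only delays the year in which a faster-responding local scheme overtakes it.

```python
def cumulative_merit(initial, annual_response, years):
    """Genetic merit over time under a constant annual response rate."""
    return [initial + annual_response * t for t in range(years + 1)]

# Illustrative figures only: the imported response is discounted by an
# assumed genetic correlation rho between foreign and local breeding goals.
rho = 0.60
import_path = cumulative_merit(3.0, 1.0 * rho, 20)   # starts ahead
local_path = cumulative_merit(0.0, 0.9, 20)          # responds faster

# First year in which the local scheme overtakes semen importation
crossover = next(t for t in range(21) if local_path[t] > import_path[t])
```

    Because both trajectories are linear with unchanged slopes, the initial gap shifts the crossover year but never prevents it.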

  2. An Approach for Leukemia Classification Based on Cooperative Game Theory

    Directory of Open Access Journals (Sweden)

    Atefeh Torkaman

    2011-01-01

    Full Text Available Hematological malignancies are the types of cancer that affect blood, bone marrow and lymph nodes. As these tissues are naturally connected through the immune system, a disease affecting one of them will often affect the others as well. The hematological malignancies include leukemia, lymphoma, and multiple myeloma. Among them, leukemia is a serious malignancy that starts in blood tissues, especially the bone marrow, where blood is made. Research shows that leukemia is one of the common cancers in the world, so emphasis on diagnostic techniques and the best treatments would provide better prognosis and survival for patients. In this paper, an automatic diagnosis recommender system for classifying leukemia based on a cooperative game is presented. Throughout this research, we analyze flow cytometry data toward the classification of leukemia into eight classes. We work on a real data set from different types of leukemia collected at the Iran Blood Transfusion Organization (IBTO). In total, the data set contains 400 samples taken from human leukemic bone marrow. This study uses a cooperative game for classification according to the different weights assigned to the markers. The proposed method is versatile, as there are no constraints on what the input or output represent; this means it can be used to classify a population according to their contributions, and it applies equally to other groups of data. The experimental results show a classification accuracy of 93.12%, compared with 90.16% for a decision tree (C4.5). The results demonstrate that the cooperative game is very promising for direct use in the classification of leukemia as part of an active medical decision support system for interpreting flow cytometry readouts. This system could assist clinical hematologists to properly recognize different kinds of leukemia by preparing suggestions, and this could improve the treatment
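    The abstract does not spell out how the cooperative game weights the markers; a standard choice for such weighting is the Shapley value, sketched below on a hypothetical toy "marker game" (the game definition here is illustrative, not the paper's).

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values by averaging marginal contributions over all
    orderings of the players; value() maps a frozenset coalition to a payoff."""
    totals = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            totals[p] += value(with_p) - value(coalition)
            coalition = with_p
    return {p: t / len(orderings) for p, t in totals.items()}

# Toy game: the payoff is 1 whenever the coalition contains marker 'a'.
weights = shapley_values(['a', 'b', 'c'], lambda s: 1.0 if 'a' in s else 0.0)
```

    In a classifier, such weights would scale each marker's contribution before assigning a sample to a class.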

  3. A Novel Filter Scheme of Data Processing for SQUID-Based Magnetocardiogram

    Institute of Scientific and Technical Information of China (English)

    LIU Dang-Ting; TIAN Ye; REN Yu-Feng; YU Hong-wei; ZHANG Li-Hua; YANG Qian-Sheng; CHEN Geng-Hua

    2008-01-01

    We present a new filter scheme for magnetocardiogram (MCG) signal processing based on the quasi-periodic characteristic of the signals. The key points of this scheme are to determine the exact numbers of data points in each cardiac cycle by using electrocardiogram (ECG) data acquired simultaneously with the MCG signal and to normalize the MCG data sequence in each cycle into an identical length. Compared with conventional filters, the scheme has the advantage of more powerful noise suppression with less signal distortion. The desire for having high quality output signals from raw MCG data acquired in a simple shielded room or even in unshielded environment may be realized with the scheme.
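    The key normalization step, resampling each cardiac cycle onto an identical number of points before averaging, can be sketched as follows. Plain linear interpolation is assumed here; the paper does not fix the exact resampling method.

```python
def resample_cycle(cycle, target_len):
    """Linearly resample one cardiac cycle of MCG samples onto a fixed
    number of points (target_len >= 2), so cycles align index-by-index."""
    n = len(cycle)
    out = []
    for k in range(target_len):
        pos = k * (n - 1) / (target_len - 1)  # position in the source cycle
        i = int(pos)
        frac = pos - i
        if i + 1 < n:
            out.append(cycle[i] * (1 - frac) + cycle[i + 1] * frac)
        else:
            out.append(cycle[-1])
    return out

def average_cycles(cycles, target_len):
    """Average several length-normalized cycles to suppress
    uncorrelated noise while preserving the repeating waveform."""
    resampled = [resample_cycle(c, target_len) for c in cycles]
    return [sum(vals) / len(vals) for vals in zip(*resampled)]
```

    The cycle boundaries would come from R-peaks in the simultaneously acquired ECG, as the abstract describes.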

  4. Cost-based droop scheme with lower generation costs for microgrids

    DEFF Research Database (Denmark)

    Nutkani, I. U.; Loh, Poh Chiang; Blaabjerg, Frede

    2013-01-01

    on the DG kVA ratings. Other operating characteristics like generation costs, efficiencies and emission penalties at different loadings have not been considered. This makes existing droop schemes not well suited for standalone microgrids without a central management system, where different types of DGs usually exist. As an alternative, this paper proposes a cost-based droop scheme, whose objective is to reduce the overall generation cost with the various DG operating characteristics taken into consideration. The proposed droop scheme therefore retains all advantages of the traditional droop schemes, while at the same time keeping the generation cost low. These findings have been validated through simulation and a scaled-down lab experiment.

  5. MULTIMEDIA DATA TRANSMISSION THROUGH TCP/IP USING HASH BASED FEC WITH AUTO-XOR SCHEME

    Directory of Open Access Journals (Sweden)

    R. Shalin

    2012-09-01

    Full Text Available The most preferred mode for communication of multimedia data is through the TCP/IP protocol. On the other hand, the TCP/IP protocol suffers unavoidable packet loss due to network traffic and congestion. In order to provide efficient communication, it is necessary to recover the lost packets. The proposed scheme implements hash-based FEC with an auto-XOR scheme for this purpose. The scheme is implemented through forward error correction, MD5 and XOR to provide efficient transmission of multimedia data. The proposed scheme provides high transmission accuracy and throughput with low latency and loss.
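    The recovery idea behind an XOR-based FEC can be shown in a few lines: one parity packet per group lets the receiver rebuild any single lost packet, and an MD5 digest (which the abstract pairs with the scheme) can confirm the repair. This is a generic sketch, not the paper's exact hash-based FEC.

```python
import hashlib

def xor_parity(packets):
    """Byte-wise XOR of equal-length packets; used as the parity packet."""
    parity = bytes(len(packets[0]))
    for p in packets:
        parity = bytes(a ^ b for a, b in zip(parity, p))
    return parity

def recover_lost(survivors, parity):
    """XOR of the parity with all surviving packets yields the lost one."""
    return xor_parity(survivors + [parity])

group = [b'abcd', b'efgh', b'ijkl']
parity = xor_parity(group)
restored = recover_lost([group[0], group[2]], parity)   # middle packet lost
ok = hashlib.md5(restored).digest() == hashlib.md5(group[1]).digest()
```

    XOR parity can only repair one loss per group; recovering more losses requires stronger codes such as Reed-Solomon.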

  6. PSO Based Optimized Security Scheme for Image Authentication and Tamper Proofing

    Directory of Open Access Journals (Sweden)

    K. Kuppusamy

    2013-05-01

    Full Text Available The hash function offers authentication and integrity to digital images. In this paper an innovative optimized security scheme based on particle swarm optimization (PSO) for image authentication and tamper proofing is proposed. This scheme provides solutions to issues such as robustness, security and tamper detection with precise localization. The features are extracted in the Daubechies-4 wavelet transform domain with the help of PSO to generate the image hash. The scheme is moderately robust against attacks and can detect and locate tampered areas in an image. Experimental results are presented to exhibit the effectiveness of the proposed scheme.

  7. Immunophenotype Discovery, Hierarchical Organization, and Template-Based Classification of Flow Cytometry Samples

    Science.gov (United States)

    Azad, Ariful; Rajwa, Bartek; Pothen, Alex

    2016-01-01

    We describe algorithms for discovering immunophenotypes from large collections of flow cytometry samples and using them to organize the samples into a hierarchy based on phenotypic similarity. The hierarchical organization is helpful for effective and robust cytometry data mining, including the creation of collections of cell populations characteristic of different classes of samples, robust classification, and anomaly detection. We summarize a set of samples belonging to a biological class or category with a statistically derived template for the class. Whereas individual samples are represented in terms of their cell populations (clusters), a template consists of generic meta-populations (groups of homogeneous cell populations obtained from the samples in a class) that describe key phenotypes shared among all those samples. We organize an FC data collection in a hierarchical data structure that supports the identification of immunophenotypes relevant to clinical diagnosis. A robust template-based classification scheme is also developed, but our primary focus is on the discovery of phenotypic signatures and inter-sample relationships in an FC data collection. This collective analysis approach is more efficient and robust, since templates describe phenotypic signatures common to cell populations in several samples while ignoring noise and small sample-specific variations. We have applied the template-based scheme to analyze several datasets, including one representing a healthy immune system and one of acute myeloid leukemia (AML) samples. The last task is challenging due to the phenotypic heterogeneity of the several subtypes of AML. However, we identified thirteen immunophenotypes corresponding to subtypes of AML and were able to distinguish acute promyelocytic leukemia (APL) samples with the markers provided. Clinically, this is helpful since APL has a different treatment regimen from other subtypes of AML. Core algorithms used in our data analysis are
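    A minimal sketch of the template idea, assuming each sample is already summarized as a feature vector: a class template is the coordinate-wise mean of its samples, and a new sample is assigned to the nearest template. The paper's templates are richer meta-population summaries; this is only the skeleton of the classification step.

```python
import math

def build_template(samples):
    """Class template as the coordinate-wise mean of sample feature vectors."""
    return [sum(vals) / len(vals) for vals in zip(*samples)]

def template_classify(templates, x):
    """Assign x to the class whose template is nearest in Euclidean distance."""
    return min(templates, key=lambda c: math.dist(templates[c], x))

# Hypothetical two-class collection with 2-D summaries per sample
templates = {
    'healthy': build_template([[0.0, 0.0], [0.0, 2.0]]),
    'aml': build_template([[5.0, 5.0], [7.0, 5.0]]),
}
```

    Averaging over a class suppresses sample-specific variation, which is the robustness argument the abstract makes.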

  8. Content-based image retrieval applied to BI-RADS tissue classification in screening mammography

    OpenAIRE

    2011-01-01

    AIM: To present a content-based image retrieval (CBIR) system that supports the classification of breast tissue density and can be used in the processing chain to adapt parameters for lesion segmentation and classification.

  9. Transonic inviscid/turbulent airfoil flow simulations using a pressure based method with high order schemes

    Science.gov (United States)

    Zhou, Gang; Davidson, Lars; Olsson, Erik

    This paper presents transonic aerodynamic flow simulations using a pressure-based Euler/Navier-Stokes solver. In this work emphasis is placed on the implementation of higher-order schemes such as QUICK, LUDS and MUSCL. A new scheme, CHARM, is proposed for convection approximation. Inviscid flow simulations are carried out for the airfoil NACA 0012; the CHARM scheme gives better resolution for the present inviscid case. Turbulent flow computations are carried out for the airfoil RAE 2822. Good results were obtained using the QUICK scheme for the mean motion equations combined with the MUSCL scheme for the k and ɛ equations. No unphysical oscillations were observed. The results also show that the second-order and third-order schemes yielded comparable accuracy with respect to the experimental data.
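    For reference, the QUICK interpolation compared in this abstract can be written down directly: on a uniform grid with flow from node U to node D, the face value is a quadratic fit through the two upstream nodes and one downstream node. This is the textbook stencil, not the paper's full solver.

```python
def quick_face(phi_uu, phi_u, phi_d):
    """QUICK face value on a uniform grid with flow from the U side:
    phi_f = 3/8 * phi_D + 6/8 * phi_U - 1/8 * phi_UU."""
    return 0.375 * phi_d + 0.75 * phi_u - 0.125 * phi_uu
```

    On linear data the quadratic fit reduces to the exact midpoint value, which is one way to check the coefficients.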

  10. A Data Gathering Scheme in Wireless Sensor Networks Based on Synchronization of Chaotic Spiking Oscillator Networks

    International Nuclear Information System (INIS)

    This paper studies a chaos-based data gathering scheme in multiple-sink wireless sensor networks. In the proposed scheme, each wireless sensor node has a simple chaotic oscillator. The oscillators generate spike signals with chaotic interspike intervals, and are impulsively coupled by the signals via wireless communication. Each wireless sensor node transmits and receives sensor information only at the timing of the couplings. The proposed scheme can exhibit various chaos synchronous phenomena and their breakdown phenomena, and can effectively gather sensor information with a significantly smaller number of transmissions and receptions than the conventional scheme. Also, the proposed scheme can flexibly adapt to various wireless sensor networks, not only with a single sink node but also with multiple sink nodes. This paper introduces our previous works. Through simulation experiments, we show the effectiveness of the proposed scheme and discuss its development potential.

  11. Improvement of Identity-Based Threshold Proxy Signature Scheme with Known Signers

    Institute of Scientific and Technical Information of China (English)

    LI Fagen; HU Yupu; CHEN Jie

    2006-01-01

    In 2006, Bao et al. proposed an identity-based threshold proxy signature scheme with known signers. In this paper, we show that Bao et al.'s scheme is vulnerable to a forgery attack: an adversary can forge a valid threshold proxy signature for any message given knowledge of a previously valid threshold proxy signature. In addition, their scheme suffers from the weakness that the proxy signers might change the threshold value. That is, the proxy signers can arbitrarily modify the threshold strategy without being detected by the original signer or verifiers, which might violate the original signer's intent. Furthermore, we propose an improved scheme that remedies the weaknesses of Bao et al.'s scheme. The improved scheme satisfies all security requirements for threshold proxy signatures.

  12. A Novel Digital Certificate Based Remote Data Access Control Scheme in WSN

    Directory of Open Access Journals (Sweden)

    Wei Liang

    2015-01-01

    Full Text Available A digital certificate based remote data access control scheme is proposed for secure authentication of accessors in a wireless sensor network (WSN). The scheme builds on the characteristic-expression-based (CEB) access control scheme: data are partitioned by characteristics, and the encryption key is tied to a characteristic expression, so only a key matching the expression can decrypt the data. Meanwhile, three distributed certificate detection methods are designed to prevent certificates from being misappropriated by hostile anonymous users. When a user starts a query, the key access control method can judge whether the query is valid. In this way, the scheme achieves public certification of users and effectively protects query privacy as well. The security analysis and experiments show that the proposed scheme is superior in communication overhead, storage overhead, and detection probability.

  13. An efficient authentication scheme based on one-way key chain for sensor network

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    To strike a tradeoff between security and the consumption of energy, computing and communication resources in the nodes, this paper presents an efficient authentication scheme based on a one-way key chain for sensor networks. The scheme can provide immediate authentication to fulfill the latency and storage requirements, and defends against various attacks such as replay, impersonation and denial of service. Meanwhile, our scheme possesses low overhead and scalability to large networks. Furthermore, the simple related protocols and algorithms in the scheme, together with the inexpensive public-key operations required in view of resource-starved sensor nodes, minimize the storage, computation and communication overhead and improve the efficiency of our scheme. In addition, the proposed scheme also supports source authentication without precluding in-network processing and passive participation.
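    The one-way key chain at the heart of such schemes is easy to sketch with a standard hash: keys are generated by repeated hashing and disclosed in reverse order, so a receiver holding only the last chain value (the commitment) can authenticate each disclosed key. SHA-256 is an assumption here; the abstract does not fix the hash function.

```python
import hashlib

def make_key_chain(seed: bytes, length: int):
    """Generate a one-way key chain by repeated hashing; keys are later
    disclosed in reverse order of generation."""
    chain = [hashlib.sha256(seed).digest()]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain  # chain[-1] is the commitment distributed in advance

def verify_key(disclosed_key: bytes, commitment: bytes, max_steps: int) -> bool:
    """A receiver holding the commitment authenticates a disclosed key by
    hashing it at most max_steps times and comparing to the commitment."""
    k = disclosed_key
    for _ in range(max_steps):
        if k == commitment:
            return True
        k = hashlib.sha256(k).digest()
    return k == commitment
```

    Because the hash is one-way, an attacker who sees a disclosed key still cannot compute the keys that remain undisclosed.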

  14. Evaluation of Scheme Design of Blast Furnace Based on Artificial Neural Network

    Institute of Scientific and Technical Information of China (English)

    TANG Hong; LI Jing-min; YAO Bi-qiang; LIAO Hong-fu; YAO Jin

    2008-01-01

    Blast furnace scheme design is very important, since it directly affects the performance, cost and configuration of the blast furnace. An evaluation approach to furnace scheme design based on an artificial neural network was brought forward. Ten independent parameters which determine a scheme design were proposed. An improved three-layer BP network algorithm was used to build the evaluation model, in which the 10 independent parameters were taken as input evaluation indexes and the degree to which the scheme design satisfies the requirements of the blast furnace as output. The model was trained on existing samples of scheme designs and the experts' experience, and then tested on other samples so as to develop the evaluation model. As an example, it is found that a good blast furnace scheme design can be chosen by using the proposed evaluation model.
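    A minimal sketch of the evaluation model's shape, with illustrative weights: ten design parameters go in, a sigmoid hidden layer and a sigmoid output produce a suitability score in (0, 1). Training by backpropagation, as in the paper, is omitted here.

```python
import math

def mlp_score(x, w_hidden, w_out):
    """Forward pass of a minimal three-layer perceptron: len(x) design
    parameters in, one suitability score in (0, 1) out."""
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    # Hidden layer: one sigmoid unit per weight row
    hidden = [sig(sum(wi * xi for wi, xi in zip(row, x))) for row in w_hidden]
    # Output layer: single sigmoid unit over the hidden activations
    return sig(sum(wo * h for wo, h in zip(w_out, hidden)))
```

    In the paper's setting, the weights would be learned from scored example designs rather than chosen by hand.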

  15. A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks.

    Science.gov (United States)

    Chen, Huifang; Ge, Linlin; Xie, Lei

    2015-07-14

    The feature of non-infrastructure support in a wireless ad hoc network (WANET) makes it suffer from various attacks. Moreover, user authentication is the first safety barrier in a network. Mutual trust is achieved by a protocol which enables communicating parties to authenticate each other at the same time and to exchange session keys. For the resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme based on the self-certified public key system and elliptic curve cryptography for a WANET. Using the proposed scheme, efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that our proposed scheme is resilient to common known attacks. In addition, the performance analysis shows that our proposed scheme performs similarly to or better than some existing user authentication schemes.

  16. Comment Fail-Stop Blind Signature Scheme Design Based on Pairings

    Institute of Scientific and Technical Information of China (English)

    HU Xiaoming; HUANG Shangteng

    2006-01-01

    Fail-stop signature schemes provide security for a signer against forgeries by an enemy with unlimited computational power, by enabling the signer to provide a proof of forgery when a forgery happens. Chang et al. proposed a robust fail-stop blind signature scheme based on bilinear pairings. However, in this paper, it will be shown that there are several mistakes in Chang et al.'s fail-stop blind signature scheme. Moreover, it will be pointed out that this scheme does not meet the defining property of a fail-stop signature: unconditional security for the signer. In Chang et al.'s scheme, a forger can forge a valid signature that cannot be proved a forgery by the signer using the "proof of forgery". The scheme also does not possess the unlinkability property of a blind signature.

  17. Intrusion Awareness Based on Data Fusion and SVM Classification

    Directory of Open Access Journals (Sweden)

    Ramnaresh Sharma

    2012-06-01

    Full Text Available Network intrusion awareness is an important factor in risk analysis of network security. In the current decade, various methods and frameworks have become available for intrusion detection and security awareness: some methods are based on the knowledge discovery process, and some frameworks on neural networks. These models take rule-based decisions for the generation of security alerts. In this paper we propose a novel method for intrusion awareness using data fusion and SVM classification. Data fusion works on the basis of feature gathering from events, and the support vector machine is a powerful classifier of data; here we use an SVM for the detection of closed items of the rule-based technique. Our proposed method is simulated on the KDD1999 DARPA data set and obtains better empirical evaluation results in comparison with the rule-based technique and a neural network model.

  19. Content Based Image Retrieval: Classification Using Neural Networks

    Directory of Open Access Journals (Sweden)

    Shereena V.B

    2014-10-01

    Full Text Available In a content-based image retrieval (CBIR) system, the main issue is to extract the image features that effectively represent the image contents in a database. Such an extraction requires a detailed evaluation of the retrieval performance of image features. This paper presents a review of fundamental aspects of content-based image retrieval, including extraction of color and texture features. Commonly used color features, including color moments, color histograms and color correlograms, and the Gabor texture feature are compared. The paper reviews the increase in efficiency of image retrieval when the color and texture features are combined. The similarity measures, based on which matches are made and images are retrieved, are also discussed. For effective indexing and fast searching of images based on visual features, neural-network-based pattern learning can be used to achieve effective classification.
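    The color-histogram feature and one common similarity measure discussed in such reviews, histogram intersection, can be sketched in a few lines. The 4-bins-per-channel quantization is an arbitrary choice for illustration.

```python
def color_histogram(pixels, bins_per_channel=4):
    """Quantize RGB pixels (0-255 per channel) into a joint histogram,
    normalized to sum to 1."""
    step = 256 // bins_per_channel
    hist = [0.0] * bins_per_channel ** 3
    for r, g, b in pixels:
        idx = ((r // step) * bins_per_channel + g // step) * bins_per_channel + b // step
        hist[idx] += 1
    total = len(pixels)
    return [h / total for h in hist]

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical color distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))
```

    In a CBIR pipeline, texture features (e.g. Gabor responses) would be concatenated with this color vector before matching.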

  1. Enrollment in community based health insurance schemes in rural Bihar and Uttar Pradesh, India

    NARCIS (Netherlands)

    P. Panda (Pradeep ); A. Chakraborty (Arpita); D.M. Dror (David); A.S. Bedi (Arjun Singh)

    2013-01-01

    This paper assesses insurance uptake in three community based health insurance (CBHI) schemes located in rural parts of two of India’s poorest states and offered through women’s self-help groups (SHGs). We examine what drives uptake, the degree of inclusive practices of the schemes, and

  2. A dispersion minimizing scheme for the 3-D Helmholtz equation based on ray theory

    NARCIS (Netherlands)

    C.C. Stolk

    2016-01-01

    We develop a new dispersion minimizing compact finite difference scheme for the Helmholtz equation in 2 and 3 dimensions. The scheme is based on a newly developed ray theory for difference equations. A discrete Helmholtz operator and a discrete operator to be applied to the source and the wavefields

  3. Remodulation scheme based on a two-section reflective SOA

    Science.gov (United States)

    Guiying, Jiang; Lirong, Huang

    2014-05-01

    A simple and cost-effective remodulation scheme using a two-section reflective semiconductor optical amplifier (RSOA) is proposed for a colorless optical network unit (ONU). Under proper injection currents, the front section functions as a modulator to upload the upstream signal while the rear section serves as a data eraser for efficient suppression of the downstream data. The dependences of the upstream transmission performance on the lengths and driven currents of the RSOA, the injection optical power and extinction ratio of the downstream are investigated. By optimizing these parameters, the downstream data can be more completely suppressed and the upstream transmission performance can be greatly improved.

  4. Remodulation scheme based on a two-section reflective SOA

    International Nuclear Information System (INIS)

    A simple and cost-effective remodulation scheme using a two-section reflective semiconductor optical amplifier (RSOA) is proposed for a colorless optical network unit (ONU). Under proper injection currents, the front section functions as a modulator to upload the upstream signal while the rear section serves as a data eraser for efficient suppression of the downstream data. The dependences of the upstream transmission performance on the lengths and driven currents of the RSOA, the injection optical power and extinction ratio of the downstream are investigated. By optimizing these parameters, the downstream data can be more completely suppressed and the upstream transmission performance can be greatly improved. (semiconductor devices)

  5. Scheme of adaptive polarization filtering based on Kalman model

    Institute of Scientific and Technical Information of China (English)

    Song Lizhong; Qi Haiming; Qiao Xiaolin; Meng Xiande

    2006-01-01

    A new adaptive polarization filtering algorithm is presented to suppress angle-cheating interference in active guidance radar. The polarization characteristic of the interference is dynamically tracked by a Kalman estimator under time-varying environments. The polarization filter parameters are designed according to the polarization characteristic of the interference, and the polarization filtering is performed in the target cell. The system scheme of the adaptive polarization filter is studied, and the tracking performance of the polarization filter and the improvement in angle measurement precision are simulated. The research results demonstrate that this technique can effectively suppress angle-cheating interference in guidance radar and is feasible in engineering.
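    The tracking step can be illustrated with a scalar Kalman estimator under a random-walk model for the interference's polarization parameter. The noise variances q and r below are illustrative, not values from the paper.

```python
def kalman_track(measurements, q=1e-3, r=0.1):
    """Scalar Kalman estimator tracking a slowly drifting parameter
    (e.g. an interference polarization angle) from noisy measurements."""
    x, p = measurements[0], 1.0      # initial state estimate and variance
    estimates = [x]
    for z in measurements[1:]:
        p = p + q                    # predict: random-walk state model
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update with the innovation
        p = (1 - k) * p
        estimates.append(x)
    return estimates
```

    The filter parameters for interference suppression would then be redesigned each step from the tracked estimate.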

  6. Content-based and Algorithmic Classifications of Journals: Perspectives on the Dynamics of Scientific Communication and Indexer Effects

    CERN Document Server

    Rafols, Ismael

    2008-01-01

    The aggregated journal-journal citation matrix (based on the Journal Citation Reports (JCR) of the Science Citation Index) can be decomposed by indexers and/or algorithmically. In this study, we test the results of two recently available algorithms for the decomposition of large matrices against two content-based classifications of journals: the ISI Subject Categories and the field/subfield classification of Glaenzel & Schubert (2003). The content-based schemes allow for the attribution of more than a single category to a journal, whereas the algorithms maximize the ratio of within-category citations over between-category citations in the aggregated category-category citation matrix. By adding categories, indexers generate between-category citations, which may enrich the database, for example, in the case of inter-disciplinary developments. The consequent indexer effects are significant in sparse areas of the matrix more than in denser ones. Algorithmic decompositions, on the other hand, are more heavily ...

  7. Texton Based Shape Features on Local Binary Pattern for Age Classification

    OpenAIRE

    V. Vijaya Kumar; B. Eswara Reddy; P. Chandra Sekhar Reddy

    2012-01-01

    Classification and recognition of objects is of interest to many researchers. Shape is a significant feature of objects and it plays a crucial role in image classification and recognition. The present paper assumes that the features that drastically affect the adulthood classification system are the shape features (SF) of the face. Based on this, the present paper proposes a new technique of adulthood classification by extracting feature parameters of the face on Integrated Texton based LBP (IT-LBP) ima...

  8. A generalized representation-based approach for hyperspectral image classification

    Science.gov (United States)

    Li, Jiaojiao; Li, Wei; Du, Qian; Li, Yunsong

    2016-05-01

    The sparse representation-based classifier (SRC) has attracted great interest recently for hyperspectral image classification. It is assumed that a testing pixel can be represented as a linear combination of atoms of a dictionary; under this circumstance, the dictionary includes all the training samples. The objective is to find a weight vector that yields a minimum L2 representation error, with the constraint that the weight vector is sparse with a minimum L1 norm. The pixel is assigned to the class whose training samples yield the minimum error. In addition, the collaborative representation-based classifier (CRC) has also been proposed, where the weight vector has a minimum L2 norm. The CRC has a closed-form solution; when using class-specific representation it can yield even better performance than the SRC. Compared to traditional classifiers such as the support vector machine (SVM), SRC and CRC do not have a traditional training-testing fashion as in supervised learning, while their performance is similar to or even better than that of SVM. In this paper, we investigate a generalized representation-based classifier which uses an Lq representation error, an Lp weight norm, and adaptive regularization. The classification performance of Lq and Lp combinations is evaluated with several real hyperspectral datasets. Based on these experiments, recommendations are provided for practical implementation.
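    The collaborative-representation special case (L2 weight norm with ridge regularization) has the closed form the abstract mentions, and fits in a few lines. The two-class toy dictionary below is hypothetical, purely for illustration.

```python
import numpy as np

def crc_classify(dictionary, labels, y, lam=0.01):
    """Collaborative-representation classifier (CRC) sketch: solve the ridge
    problem w = (A^T A + lam*I)^{-1} A^T y once, then assign y to the class
    whose own atoms give the smallest reconstruction residual."""
    A = np.asarray(dictionary, dtype=float)   # columns are training atoms
    w = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)
    classes = sorted(set(labels))
    residuals = []
    for c in classes:
        mask = np.array([l == c for l in labels])
        residuals.append(np.linalg.norm(y - A[:, mask] @ w[mask]))
    return classes[int(np.argmin(residuals))]
```

    Replacing the L2 norms by general Lq/Lp norms, as the paper studies, loses the closed form and requires iterative solvers.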

  9. Decommissioning technology development for research reactors; establishment on the classification scheme of the decommissioning information and data

    Energy Technology Data Exchange (ETDEWEB)

    Ser, J. S.; Jang, Se Kyu; Kim, Young Do [Chungchong Institute of Regional Information System, Taejeon (Korea)

    2002-04-01

    The establishment of the decommissioning DB is a first in Korea. No standard for decommissioning databases has yet been decided anywhere in the world, and many countries have constructed decommissioning DBs that serve their own purposes. To obtain the classification of the decommissioning information and data, a prototype design was used; it is needed as basic data for the DB construction and will be applied to nuclear facilities in the future. 10 refs. (Author)

  10. PERFORMANCE COMPARISON OF CELL-BASED AND PACKET-BASED SWITCHING SCHEMES FOR SHARED MEMORY SWITCHES

    Institute of Scientific and Technical Information of China (English)

    Xi Kang; Ge Ning; Feng Chongxi

    2004-01-01

    Shared Memory (SM) switches are widely used for their high throughput, low delay and efficient use of memory. This paper compares the performance of two prominent switching schemes of SM packet switches: Cell-Based Switching (CBS) and Packet-Based Switching (PBS). Theoretical analysis is carried out to draw qualitative conclusions on the memory requirement, throughput and packet delay of the two schemes. Furthermore, simulations are carried out to get quantitative results of the performance comparison under various system loads, traffic patterns, and memory sizes. Simulation results show that PBS has the advantage of shorter time delay, while CBS has a lower memory requirement and outperforms in throughput when the memory size is limited. The comparison can be used for trade-offs between performance and complexity in switch design.

  11. Generalization performance of graph-based semisupervised classification

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Semi-supervised learning has been of growing interest over the past few years and many methods have been proposed. Although various algorithms are provided to implement semi-supervised learning, there are still gaps in our understanding of the dependence of generalization error on the numbers of labeled and unlabeled data. In this paper, we consider a graph-based semi-supervised classification algorithm and establish its generalization error bounds. Our results show the close relations between the generalization performance and the structural invariants of the data graph.

  12. Classification Based on Hierarchical Linear Models: The Need for Incorporation of Social Contexts in Classification Analysis

    Science.gov (United States)

    Vaughn, Brandon K.; Wang, Qui

    2009-01-01

    Many areas in educational and psychological research involve the use of classification statistical analysis. For example, school districts might be interested in attaining variables that provide optimal prediction of school dropouts. In psychology, a researcher might be interested in the classification of a subject into a particular psychological…

  13. A Method for Data Classification Based on Discernibility Matrix and Discernibility Function

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The method used for data classification influences the efficiency of classification. Attribute reduction based on the discernibility matrix and discernibility function in rough sets can be used in data classification, so we put forward such a method. Namely, firstly, we use the discernibility matrix and discernibility function to delete superfluous attributes in an information system and get a necessary attribute set. Secondly, we delete superfluous attribute values and get decision rules. Finally, we classify data by means of the decision rules. The experiments show that data classification using this method is simpler in structure and can improve the efficiency of classification.
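
    The matrix-then-reduction pipeline above can be sketched as follows; the greedy set-cover step is an illustrative assumption standing in for the paper's exact reduction procedure:

```python
from itertools import combinations

def discernibility_matrix(objects, decisions):
    """objects: list of attribute tuples; decisions: list of decision values.
    Entry (i, j) holds the indices of attributes that discern a pair of
    objects with different decisions."""
    m = {}
    for i, j in combinations(range(len(objects)), 2):
        if decisions[i] != decisions[j]:
            m[(i, j)] = {k for k, (a, b)
                         in enumerate(zip(objects[i], objects[j])) if a != b}
    return m

def greedy_reduct(objects, decisions):
    """Greedy cover of the discernibility matrix: keeps attributes until every
    discernible pair is covered (a possibly non-minimal reduct)."""
    m = discernibility_matrix(objects, decisions)
    reduct, uncovered = set(), set(m)
    while uncovered:
        counts = {}
        for pair in uncovered:            # count how many pairs each
            for k in m[pair]:             # attribute would still discern
                counts[k] = counts.get(k, 0) + 1
        best = max(counts, key=counts.get)
        reduct.add(best)
        uncovered = {p for p in uncovered if best not in m[p]}
    return reduct
```

    The superfluous attributes are exactly those never needed to cover the matrix; decision rules are then read off the reduced table.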

  14. Semi-Supervised Classification based on Gaussian Mixture Model for remote imagery

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Semi-Supervised Classification (SSC), which makes use of both labeled and unlabeled data to determine classification borders in feature space, has great advantages in extracting classification information from mass data. In this paper, a novel SSC method based on the Gaussian Mixture Model (GMM) is proposed, in which each class's feature space is described by one GMM. Experiments show the proposed method can achieve high classification accuracy with a small amount of labeled data. However, to reach the same accuracy, supervised classification methods such as Support Vector Machine and Object-Oriented Classification must be provided with much more labeled data.
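
    A minimal sketch of the per-class generative idea, with two simplifying assumptions: one diagonal Gaussian per class stands in for the paper's per-class GMMs, and a crude pseudo-labeling loop stands in for its exact use of unlabeled data:

```python
import numpy as np

def fit_class_gaussians(X, y):
    """One diagonal Gaussian per class (a one-component GMM)."""
    models = {}
    for c in np.unique(y):
        Xc = X[y == c]
        models[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-3)  # variance floor
    return models

def log_gauss(x, mu, var):
    """Log-density of a diagonal Gaussian."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def classify(models, x):
    """Assign x to the class whose Gaussian gives the highest likelihood."""
    return max(models, key=lambda c: log_gauss(x, *models[c]))

def self_train(models, X_unlab, X, y, rounds=3):
    """Pseudo-label the unlabeled pool and refit (an assumption, not the
    paper's exact semi-supervised procedure)."""
    for _ in range(rounds):
        pseudo = np.array([classify(models, x) for x in X_unlab])
        models = fit_class_gaussians(np.vstack([X, X_unlab]),
                                     np.concatenate([y, pseudo]))
    return models
```

    With well-separated classes, a handful of labeled points suffices to place the class densities, which is the advantage the abstract reports.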

  15. Pairing-Free ID-Based Key-Insulated Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    Guo-Bin Zhu; Hu Xiong; Zhi-Guang Qin

    2015-01-01

    Without the assumption that private keys are kept perfectly secure, cryptographic primitives cannot be deployed in insecure environments where key leakage is inevitable. In order to efficiently reduce the damage caused by key exposure in identity-based (ID-based) signature scenarios, we propose an ID-based key-insulated signature scheme in this paper, which eliminates the expensive bilinear pairing operations. Compared with previous work, our scheme minimizes the computation cost without any extra overhead. Under the discrete logarithm (DL) assumption, a security proof of our scheme in the random oracle model has also been given.

  16. Efficient enhancing scheme for TCP performance over satellite-based internet

    Institute of Scientific and Technical Information of China (English)

    Wang Lina; Gu Xuemai

    2007-01-01

    Satellite link characteristics drastically degrade transport control protocol (TCP) performance. An efficient performance enhancing scheme is proposed. The improvement of TCP performance over the satellite-based Internet is accomplished by protocol transition gateways at each end of a satellite link. The protocol which runs over the satellite link executes receiver-driven flow control and acknowledgement- and timeout-based error control strategies. The validity of this TCP performance enhancing scheme is verified by a series of simulation experiments. Results show that the proposed scheme can efficiently enhance TCP performance over the satellite-based Internet and ensure that the available bandwidth resources of the satellite link are fully utilized.

  17. Decision tree algorithm based on classification matrix

    Institute of Scientific and Technical Information of China (English)

    陶道强; 马良荔; 彭超

    2012-01-01

    To improve the classification speed and accuracy of the decision tree algorithm, a new scheme is proposed based on a classification matrix. Firstly, the basic theory of the ID3 algorithm is introduced and a classification matrix is defined. Then the bias of the ID3 algorithm toward attributes with many values is pointed out and proved using the classification matrix. On this basis, a weighting factor is introduced to suppress this bias, with a corresponding proof. According to the characteristics of the gain based on the classification matrix, a new decision tree classification scheme is proposed, aiming to optimize computing speed. Finally, the scheme is compared with the ID3 algorithm through experiments. Experimental results show that the optimized scheme is obviously better than the original one in performance.
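
    The weighting-factor idea can be illustrated with the C4.5-style correction, in which information gain is divided by split information to penalize many-valued attributes; the paper's own classification-matrix weight is analogous but not reproduced here:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def weighted_gain(rows, labels, attr):
    """Information gain of splitting on attr, divided by the split
    information (a stand-in for the paper's weighting factor)."""
    n = len(rows)
    parts = {}
    for r, l in zip(rows, labels):
        parts.setdefault(r[attr], []).append(l)
    gain = entropy(labels) - sum(len(p) / n * entropy(p) for p in parts.values())
    split_info = entropy([r[attr] for r in rows])
    return gain / split_info if split_info else 0.0
```

    An ID-like attribute with a distinct value per row gets maximal raw gain, but its large split information pushes the weighted score down, which is exactly the bias suppression the abstract describes.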

  18. A multihop key agreement scheme for wireless ad hoc networks based on channel characteristics.

    Science.gov (United States)

    Hao, Zhuo; Zhong, Sheng; Yu, Nenghai

    2013-01-01

    A number of key agreement schemes based on wireless channel characteristics have been proposed recently. However, previous key agreement schemes require that two nodes which need to agree on a key are within the communication range of each other. Hence, they are not suitable for multihop wireless networks, in which nodes do not always have direct connections with each other. In this paper, we first propose a basic multihop key agreement scheme for wireless ad hoc networks. The proposed basic scheme is resistant to external eavesdroppers. Nevertheless, this basic scheme is not secure when there exist internal eavesdroppers or Man-in-the-Middle (MITM) adversaries. In order to cope with these adversaries, we propose an improved multihop key agreement scheme. We show that the improved scheme is secure against internal eavesdroppers and MITM adversaries in a single path. Both performance analysis and simulation results demonstrate that the improved scheme is efficient. Consequently, the improved key agreement scheme is suitable for multihop wireless ad hoc networks. PMID:23766725

  19. Implementation of an energy-efficient scheduling scheme based on pipeline flux leak monitoring networks

    Institute of Scientific and Technical Information of China (English)

    ZHOU Peng; YAO JiangHe; PEI JiuLing

    2009-01-01

    Taking pipeline leakage flow and sudden pipe bursts in pipe networks as the application objects, an energy-efficient real-time scheduling scheme is designed for pipeline leak monitoring. The proposed scheme can adaptively adjust the network rate in real time and reduce the cell loss rate, so that it can efficiently avoid traffic congestion. The recent evolution of wireless sensor networks has yielded a demand for improved energy-efficient scheduling algorithms and energy-efficient medium access protocols. This paper proposes an energy-efficient real-time scheduling scheme that reduces power consumption and network errors on pipeline flux leak monitoring networks. The proposed scheme is based on a dynamic modulation scaling scheme which can scale the number of bits per symbol and a switching scheme which can swap the polling schedule between channels. Built on top of the EDF scheduling policy, the proposed scheme enhances the power performance without violating the constraints of real-time streams. The simulation results show that the proposed scheme enhances fault-tolerance and reduces power consumption. Furthermore, the network congestion avoidance strategy with an energy-efficient real-time scheduling scheme can efficiently improve the bandwidth utilization and TCP friendliness and reduce the packet drop rate in pipeline flux leak monitoring networks.

  20. Cryptanalysis and Performance Evaluation of Enhanced Threshold Proxy Signature Scheme Based on RSA for Known Signers

    Directory of Open Access Journals (Sweden)

    Raman Kumar

    2013-01-01

    Full Text Available These days there are plenty of signature schemes, such as the threshold proxy signature scheme (Kumar and Verma 2010). The network is a shared medium, so communications are vulnerable to security attacks such as eavesdropping, replay attacks, and modification attacks. Thus, we have to establish a common key for encrypting/decrypting our communications over an insecure network. In this scheme, a threshold proxy signature scheme based on RSA, a threshold number of proxy signers or more can cooperatively generate a proxy signature while fewer of them cannot do it. The threshold proxy signature scheme uses the RSA cryptosystem to generate the private and the public keys of the signers (Rivest et al., 1978). Comparison is done on the basis of time complexity, space complexity, and communication overhead. We compare the performance of four schemes (Hwang et al. (2003), Kuo and Chen (2005), Yong-Jun et al. (2007), and Li et al. (2007)) with the performance of a scheme that has been proposed earlier by the authors of this paper. In the proposed scheme, both the combiner and the secret share holder can verify the correctness of the information that they are receiving from each other. Therefore, the enhanced threshold proxy signature scheme is secure and efficient against notorious conspiracy attacks.

  1. Security enhancement of a biometric based authentication scheme for telecare medicine information systems with nonce.

    Science.gov (United States)

    Mishra, Dheerendra; Mukhopadhyay, Sourav; Kumari, Saru; Khan, Muhammad Khurram; Chaturvedi, Ankita

    2014-05-01

    Telecare medicine information systems (TMIS) present a platform to deliver clinical services door to door. The technological advances in mobile computing are enhancing the quality of healthcare, and a user can access these services using a mobile device. However, the user and the Telecare system communicate via public channels in these online services, which increases the security risk. Therefore, it is required to ensure that only an authorized user is accessing the system and that the user is interacting with the correct system. Mutual authentication provides the way to achieve this. However, existing schemes are either vulnerable to attacks or have higher computational cost, while a scalable authentication scheme for mobile devices should be both secure and efficient. Recently, Awasthi and Srivastava presented a biometric-based authentication scheme for TMIS with nonce. Their scheme only requires the computation of hash and XOR functions, and thus fits TMIS. However, we observe that Awasthi and Srivastava's scheme does not achieve an efficient password change phase. Moreover, their scheme does not resist off-line password guessing attacks. Further, we propose an improvement of Awasthi and Srivastava's scheme with the aim of removing the drawbacks of their scheme. PMID:24771484

  2. Feature selection gait-based gender classification under different circumstances

    Science.gov (United States)

    Sabir, Azhin; Al-Jawad, Naseer; Jassim, Sabah

    2014-05-01

    This paper proposes a gender classification based on human gait features and investigates the problem of two variations: clothing (wearing coats) and a carrying-bag condition, in addition to the normal gait sequence. The feature vectors in the proposed system are constructed after applying the wavelet transform. Three different sets of features are proposed in this method. First, a spatio-temporal distance set deals with the distance between different parts of the human body (like feet, knees, hands, human height and shoulders) during one gait cycle. The second and third feature sets are constructed from the approximation and non-approximation coefficients of the human body, respectively. To extract these two sets of features we divided the human body into two parts, the upper and lower body, based on the golden-ratio proportion. In this paper, we have adopted a statistical method for constructing the feature vector from the above sets. The dimension of the constructed feature vector is reduced based on the Fisher score as a feature selection method to optimize its discriminating significance. Finally, k-Nearest Neighbor is applied as the classification method. Experimental results demonstrate that our approach provides a more realistic scenario and relatively better performance compared with existing approaches.
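
    A minimal sketch of the Fisher-score feature ranking used above for dimension reduction, assuming one common form of the score (ratio of between-class to within-class variance per feature):

```python
import numpy as np

def fisher_score(X, y):
    """Per-feature Fisher score: between-class over within-class variance."""
    mu = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - mu) ** 2  # between-class spread
        den += len(Xc) * Xc.var(axis=0)               # within-class spread
    return num / (den + 1e-12)

def select_top_k(X, y, k):
    """Indices of the k most discriminative features, sorted ascending."""
    idx = np.argsort(fisher_score(X, y))[::-1][:k]
    return np.sort(idx)
```

    The retained columns would then be fed to a k-Nearest Neighbor classifier, as in the paper.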

  3. A Fair Off-Line Electronic Cash Scheme Based on Restrictive Partially Blind Signature

    Institute of Scientific and Technical Information of China (English)

    王常吉; 吴建平; 段海新

    2004-01-01

    A fair off-line electronic cash scheme was presented based on a provably secure restrictive partially blind signature. The scheme is more efficient than those of previous works: the expiry date and denomination information are embedded in the electronic cash, which alleviates the storage pressure on the bank when checking double spending, and the bank need not use different public keys for different coin values, so shops and users need not carry a list of the bank's public keys in their electronic wallets. The modular exponentiations are reduced for both the user and the bank by letting the trustee publish public values with a different structure from those of previous electronic cash schemes. The scheme's security is based on the random oracle model and the decisional Diffie-Hellman assumption. The scheme can be easily extended to multiple trustees and multiple banks using threshold cryptography.

  4. A kind of signature scheme based on class groups of quadratic fields

    Institute of Scientific and Technical Information of China (English)

    董晓蕾; 曹珍富

    2004-01-01

    A quadratic-field cryptosystem is a cryptosystem built from the discrete logarithm problem in ideal class groups of quadratic fields (CL-DLP). The problem of a digital signature scheme based on ideal class groups of quadratic fields remained open because of the difficulty of computing class numbers of quadratic fields. In this paper, according to our research on quadratic fields, we construct the first digital signature scheme in ideal class groups of quadratic fields, using q as the modulus, which denotes the prime divisors of ideal class numbers of quadratic fields. Security of the new signature scheme is based fully on CL-DLP. This paper also investigates the realization of the scheme and proposes a concrete technique. In addition, the technique introduced in the paper can be utilized to realize signature schemes of other kinds.

  5. Joint application of feature extraction based on EMD-AR strategy and multi-class classifier based on LS-SVM in EMG motion classification

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper presents an effective and efficient combination of feature extraction and a multi-class classifier for motion classification by analyzing surface electromyographic (sEMG) signals. In contrast to existing methods, and considering the non-stationary and nonlinear characteristics of EMG signals, to obtain a more separable feature set we introduce empirical mode decomposition (EMD) to decompose the original EMG signals into several intrinsic mode functions (IMFs) and then compute the coefficients of autoregressive models of each IMF to form the feature set. Based on least squares support vector machines (LS-SVMs), a multi-class classifier is designed and constructed to classify various motions. The results of contrastive experiments showed that the accuracy of motion recognition is improved with the described classification scheme. Furthermore, compared with other classifiers using different features, the excellent performance indicated the potential of SVM techniques embedding the EMD-AR kernel in motion classification.

  6. Joint Probability-Based Neuronal Spike Train Classification

    Directory of Open Access Journals (Sweden)

    Yan Chen

    2009-01-01

    Full Text Available Neuronal spike trains are used by the nervous system to encode and transmit information. Euclidean distance-based methods (EDBMs) have been applied to quantify the similarity between temporally-discretized spike trains and model responses. In this study, using the same discretization procedure, we developed and applied a joint probability-based method (JPBM) to classify individual spike trains of slowly adapting pulmonary stretch receptors (SARs). The activity of individual SARs was recorded in anaesthetized, paralysed adult male rabbits, which were artificially ventilated at a constant rate and one of three different volumes. Two-thirds of the responses to the 600 stimuli presented at each volume were used to construct three response models (one for each stimulus volume) consisting of a series of time bins, each with spike probabilities. The remaining one-third of the responses were used as test responses to be classified into one of the three model responses. This was done by computing the joint probability of observing the same series of events (spikes or no spikes), dictated by the test response, in a given model and determining which probability of the three was highest. The JPBM generally produced better classification accuracy than the EDBM, and both performed well above chance. Both methods were similarly affected by variations in discretization parameters, response epoch duration, and two different response alignment strategies. Increasing bin widths increased classification accuracy, which also improved with increased observation time, but primarily during periods of increasing lung inflation. Thus, the JPBM is a simple and effective method for performing spike train classification.
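
    The bin-wise joint-probability classification described above can be sketched as follows; the probability clipping (eps) is an illustrative assumption for handling bins in which the model saw no spikes:

```python
import math

def joint_log_prob(test_bins, model_probs, eps=1e-3):
    """Log joint probability of the test spike/no-spike sequence under one
    model, with each bin treated as an independent Bernoulli event."""
    lp = 0.0
    for spike, p in zip(test_bins, model_probs):
        p = min(max(p, eps), 1 - eps)  # clip so empty bins don't zero the product
        lp += math.log(p if spike else 1 - p)
    return lp

def classify_spike_train(test_bins, models):
    """Assign the test response to the model with the highest joint probability."""
    return max(models, key=lambda name: joint_log_prob(test_bins, models[name]))
```

    Here `models` maps each stimulus volume to its per-bin spike probabilities, and `test_bins` is the discretized test response (1 for a spike in the bin, 0 otherwise).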

  7. Scene classification of infrared images based on texture feature

    Science.gov (United States)

    Zhang, Xiao; Bai, Tingzhu; Shang, Fei

    2008-12-01

    Scene classification refers to assigning a physical scene to one of a set of predefined categories. Texture features provide a useful approach to classifying scenes. Texture can be considered to be repeating patterns of local variation of pixel intensities, and texture analysis is important in many applications of computer image analysis for classification or segmentation of images based on local spatial variations of intensity. Texture describes the structural information of images, so it provides data for classification beyond the spectrum. Nowadays, infrared thermal imagers are used in many different fields. Since infrared images of objects reflect their own thermal radiation, infrared images have some shortcomings: poor contrast between objects and background, blurred edges, much noise and so on. Because of these shortcomings, it is difficult to extract texture features from infrared images. In this paper we have developed an infrared image texture feature-based algorithm to classify scenes of infrared images. This paper researches texture extraction using the Gabor wavelet transform, which has excellent capability for analyzing the frequency and direction of local regions. Gabor wavelets are chosen for their biological relevance and technical properties. In the first place, after introducing the Gabor wavelet transform and texture analysis methods, texture features are extracted from the infrared images by the Gabor wavelet transform, utilizing the multi-scale property of the Gabor filter. In the second place, we take multi-dimensional means and standard deviations with different scales and directions as texture parameters. The last stage is classification of scene texture parameters with the least squares support vector machine (LS-SVM) algorithm. SVM is based on the principle of structural risk minimization (SRM). Compared with SVM, LS-SVM has overcome the shortcoming of
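
    A rough sketch of a Gabor-bank texture descriptor of the kind described (means and standard deviations of filter responses per scale and orientation); the kernel size and parameter choices are illustrative assumptions, and circular FFT convolution keeps the sketch short:

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, lam):
    """Real-valued Gabor kernel at one scale (lam) and orientation (theta)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2)) \
        * np.cos(2 * np.pi * xr / lam)

def gabor_texture_features(img, lams=(4, 8), n_orient=4, ksize=15):
    """Mean and standard deviation of each filter response, concatenated
    into the texture parameter vector."""
    feats = []
    for lam in lams:
        for k in range(n_orient):
            ker = gabor_kernel(ksize, lam / 2.0, k * np.pi / n_orient, lam)
            pad = np.zeros_like(img, dtype=float)
            pad[:ksize, :ksize] = ker
            # Circular convolution via the FFT.
            resp = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))
            feats.extend([resp.mean(), resp.std()])
    return np.array(feats)
```

    With 2 scales and 4 orientations this yields a 16-dimensional parameter vector per image, which would then go to the LS-SVM classifier.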

  8. Curve interpolation based on Catmull-Clark subdivision scheme

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    An efficient algorithm for curve interpolation is proposed. The algorithm can produce a subdivision surface that interpolates predefined cubic B-spline curves by applying the Catmull-Clark scheme to a polygonal mesh containing "symmetric zonal meshes", which possess some special properties. Many kinds of curve interpolation problems can be dealt with by this algorithm, such as interpolating a single open or closed curve, or a mesh of nonintersecting or intersecting curves. The interpolating surface is C2 everywhere except at a finite number of points. At the same time, sharp creases can also be modeled on the limit subdivision surface by duplicating the vertices of the tagged edges of the initial mesh, i.e. the surface is only C0 along the cubic B-spline curve that is defined by the tagged edges. Being simple and easy to implement, this method can be used for product shape design and graphic software development.

  9. Sparse Parallel MRI Based on Accelerated Operator Splitting Schemes

    Science.gov (United States)

    Xie, Weisi; Su, Zhenghang

    2016-01-01

    Recently, the sparsity which is implicit in MR images has been successfully exploited for fast MR imaging with incomplete acquisitions. In this paper, two novel algorithms are proposed to solve the sparse parallel MR imaging problem, which consists of l1 regularization and fidelity terms. The two algorithms combine forward-backward operator splitting and Barzilai-Borwein schemes. Theoretically, the presented algorithms overcome the nondifferentiable property in l1 regularization term. Meanwhile, they are able to treat a general matrix operator that may not be diagonalized by fast Fourier transform and to ensure that a well-conditioned optimization system of equations is simply solved. In addition, we build connections between the proposed algorithms and the state-of-the-art existing methods and prove their convergence with a constant stepsize in Appendix. Numerical results and comparisons with the advanced methods demonstrate the efficiency of proposed algorithms. PMID:27746824
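
    The forward-backward split for the l1-regularized problem can be sketched with a fixed 1/L stepsize; the Barzilai-Borwein step selection and the actual parallel-MRI sensing operator are omitted here, and the identity-like matrix operator is an illustrative assumption:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fb_splitting(A, b, lam=0.1, iters=200):
    """Forward-backward splitting (ISTA) for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                    # forward (gradient) step
        x = soft_threshold(x - grad / L, lam / L)   # backward (prox) step
    return x
```

    The prox step is exactly where the nondifferentiability of the l1 term is handled in closed form, which is the property the paper's algorithms exploit.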

  10. Security Encryption Scheme for Communication of Web Based Control Systems

    Science.gov (United States)

    Robles, Rosslin John; Kim, Tai-Hoon

    A control system is a device or set of devices to manage, command, direct or regulate the behavior of other devices or systems. The trend in most systems is that they are connected through the Internet, whereas traditional Supervisory Control and Data Acquisition (SCADA) systems are connected only to a limited private network. Since the Internet-connected SCADA facility has brought a lot of advantages in terms of control, data viewing and generation, operators are pushed to connect control systems through the Internet. Along with these advantages, however, many issues regarding the security of web SCADA have surfaced. In this paper, we discuss web SCADA and the issues regarding its security. As a countermeasure, a web SCADA security solution using a crossed-crypto-scheme is proposed for use in the communication of SCADA components.

  11. AN AGENT BASED TRANSACTION PROCESSING SCHEME FOR DISCONNECTED MOBILE NODES

    Directory of Open Access Journals (Sweden)

    J.L. Walter Jeyakumar

    2010-12-01

    Full Text Available We present a mobile transaction framework in which mobile users can share data stored in the cache of a mobile agent. This mobile agent is a special mobile node which coordinates the sharing process. The proposed framework allows mobile affiliation work groups to be formed dynamically with a mobile agent and mobile hosts. Using short-range wireless communication technology, mobile users can simultaneously access data from the cache of the mobile agent. The Data Access Manager module at the mobile agent enforces concurrency control using a cache invalidation technique. This model supports disconnected mobile computing, allowing the mobile agent to move along with the mobile hosts. The proposed transaction framework has been simulated in Java 2 and the performance of this scheme is compared with existing frameworks.

  12. A novel dynamical community detection algorithm based on weighting scheme

    Science.gov (United States)

    Li, Ju; Yu, Kai; Hu, Ke

    2015-12-01

    Network dynamics plays an important role in analyzing the correlation between function properties and topological structure. In this paper, we propose a novel dynamical iteration (DI) algorithm, which incorporates the iterative process of the membership vector with a weighting scheme, i.e. weighting W and tightness T. These new elements can be used to adjust the link strength and the node compactness, improving the speed and accuracy of community structure detection. To estimate the optimal stop time of the iteration, we utilize a new stability measure defined as the Markov random walk auto-covariance. We do not need to specify the number of communities in advance. The algorithm naturally supports overlapping communities by associating each node with a membership vector describing the node's involvement in each community. Theoretical analysis and experiments show that the algorithm can uncover communities effectively and efficiently.

  13. AN IMPROVED DOS-RESISTANT ID-BASED PASSWORD AUTHENTICATION SCHEME WITHOUT USING SMART CARD

    Institute of Scientific and Technical Information of China (English)

    Wen Fengtong; Li Xuelei; Cui Shenjun

    2011-01-01

    In 2010, Hwang, et al. proposed a 'DoS-resistant ID-based password authentication scheme using smart cards' as an improvement of Kim-Lee-Yoo's 'ID-based password authentication scheme'. In this paper, we cryptanalyze Hwang, et al.'s scheme and point out that the revealed session key could threaten the security of the scheme. We demonstrate that extracting information from the smart cards is equivalent to knowing the session key. Thus known session key attacks are also effective under the assumption that the adversary could obtain the information stored in the smart cards. We propose an improved scheme with security analysis to remedy the weaknesses of Hwang, et al.'s scheme. The new scheme not only keeps all the merits of the original, but also provides several additional phases to improve flexibility. Finally, the improved scheme is more secure, efficient, practical, and convenient, because an elliptic curve cryptosystem is introduced, and the expensive smart cards and synchronized clock system are replaced by mobile devices and nonces.

  14. Soft computing based feature selection for environmental sound classification

    NARCIS (Netherlands)

    Shakoor, A.; May, T.M.; Van Schijndel, N.H.

    2010-01-01

    Environmental sound classification has a wide range of applications, like hearing aids, mobile communication devices, portable media players, and auditory protection devices. Sound classification systems typically extract features from the input sound. Using too many features increases complexity unnecessarily.

  15. A physiologically-inspired model of numerical classification based on graded stimulus coding

    Directory of Open Access Journals (Sweden)

    John Pearson

    2010-01-01

    Full Text Available In most natural decision contexts, the process of selecting among competing actions takes place in the presence of informative, but potentially ambiguous, stimuli. Decisions about magnitudes—quantities like time, length, and brightness that are linearly ordered—constitute an important subclass of such decisions. It has long been known that perceptual judgments about such quantities obey Weber’s Law, wherein the just-noticeable difference in a magnitude is proportional to the magnitude itself. Current physiologically inspired models of numerical classification assume discriminations are made via a labeled line code of neurons selectively tuned for numerosity, a pattern observed in the firing rates of neurons in the ventral intraparietal area (VIP of the macaque. By contrast, neurons in the contiguous lateral intraparietal area (LIP signal numerosity in a graded fashion, suggesting the possibility that numerical classification could be achieved in the absence of neurons tuned for number. Here, we consider the performance of a decision model based on this analog coding scheme in a paradigmatic discrimination task—numerosity bisection. We demonstrate that a basic two-neuron classifier model, derived from experimentally measured monotonic responses of LIP neurons, is sufficient to reproduce the numerosity bisection behavior of monkeys, and that the threshold of the classifier can be set by reward maximization via a simple learning rule. In addition, our model predicts deviations from Weber Law scaling of choice behavior at high numerosity. Together, these results suggest both a generic neuronal framework for magnitude-based decisions and a role for reward contingency in the classification of such stimuli.
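
    A toy sketch of the two-neuron analog-code decision rule the model describes; the monotone response functions and the reward-driven threshold update below are illustrative assumptions, not the paper's fitted LIP responses or its exact learning rule:

```python
def two_neuron_classify(n, theta):
    """One response grows with numerosity, one decays; choose 'large' when
    their difference exceeds the learned threshold theta."""
    up, down = float(n), 1.0 / n
    return 'large' if up - down > theta else 'small'

def learn_threshold(trials, theta=0.0, lr=0.1):
    """Nudge theta after each unrewarded (incorrect) choice: a crude
    stand-in for reward-maximizing threshold learning."""
    for n, correct in trials:
        choice = two_neuron_classify(n, theta)
        if choice != correct:
            theta += lr if choice == 'large' else -lr
    return theta
```

    After repeated rewarded trials the threshold settles between the two graded response differences, classifying numerosities without any neuron tuned to a specific number.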

  16. Classification in context

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper surveys the classification research literature, discusses various classification theories, and shows that the focus has traditionally been on establishing a scientific foundation for classification research. This paper argues that a shift has taken place, and suggests that contemporary classification research focuses on contextual information as the guide for the design and construction of classification schemes.

  17. Understanding Acupuncture Based on ZHENG Classification from System Perspective

    Directory of Open Access Journals (Sweden)

    Junwei Fang

    2013-01-01

    Full Text Available Acupuncture is an efficient therapy method that originated in ancient China; studying it on the basis of ZHENG classification is a systematic way of approaching its complexity. A systems perspective contributes to understanding the essence of phenomena, and, with the coming of the systems biology era, broader technology platforms such as omics technologies have been established for the objective study of traditional Chinese medicine (TCM). Omics technologies can dynamically determine molecular components at various levels, which makes a systematic understanding of acupuncture achievable by uncovering the relationships among its various responses. After reviewing the literature on acupuncture studied by omics approaches, the following points were found. First, with the help of omics approaches, acupuncture was found to treat diseases by regulating the neuroendocrine immune (NEI) network, and changes in this network can reflect the global effect of acupuncture. Second, the global effect of acupuncture can reflect ZHENG information at certain structural and functional levels, which might reveal the mechanism of meridian and acupoint specificity. Furthermore, based on comprehensive ZHENG classification, omics research can help us understand the action characteristics of acupoints and the molecular mechanisms of their synergistic effect.

  18. ECG-based heartbeat classification for arrhythmia detection: A survey.

    Science.gov (United States)

    Luz, Eduardo José da S; Schwartz, William Robson; Cámara-Chávez, Guillermo; Menotti, David

    2016-04-01

    An electrocardiogram (ECG) measures the electric activity of the heart and has been widely used for detecting heart diseases due to its simplicity and non-invasive nature. By analyzing the electrical signal of each heartbeat, i.e., the combination of action impulse waveforms produced by different specialized cardiac tissues found in the heart, it is possible to detect some of its abnormalities. Over the last decades, many works have been developed to produce automatic ECG-based heartbeat classification methods. In this work, we survey the current state-of-the-art methods for automated, ECG-based classification of heartbeat abnormalities, presenting the ECG signal preprocessing, the heartbeat segmentation techniques, the feature description methods and the learning algorithms used. In addition, we describe some of the databases used for evaluating methods, as indicated by a well-known standard developed by the Association for the Advancement of Medical Instrumentation (AAMI) and described in ANSI/AAMI EC57:1998/(R)2008 (ANSI/AAMI, 2008). Finally, we discuss limitations and drawbacks of the methods in the literature, present concluding remarks and future challenges, and propose an evaluation process workflow to guide authors in future works.
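    The four-stage pipeline the survey is organized around (preprocessing, heartbeat segmentation, feature description, learning) can be sketched on a synthetic trace. Every signal value, threshold, and the toy classification rule below are illustrative assumptions, not methods from the surveyed papers.

```python
def moving_average(signal, w=3):
    """Preprocessing: simple smoothing filter."""
    half = w // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def detect_r_peaks(signal, thresh=0.2):
    """Segmentation: local maxima above a threshold mark heartbeat positions."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > thresh
            and signal[i] >= signal[i - 1]
            and signal[i] > signal[i + 1]]

def rr_intervals(peaks, fs=100):
    """Feature description: RR intervals in seconds."""
    return [(b - a) / fs for a, b in zip(peaks, peaks[1:])]

def classify_beat(rr):
    """Stand-in for the learning algorithm: flag abnormally short RR."""
    return 'abnormal' if rr < 0.6 else 'normal'

# Synthetic trace at fs = 100 Hz: a beat every second, plus one
# premature beat at sample 450.
ecg = [0.0] * 600
for p in (100, 200, 300, 400, 450, 550):
    ecg[p] = 1.0
peaks = detect_r_peaks(moving_average(ecg))
labels = [classify_beat(rr) for rr in rr_intervals(peaks)]
```

Real systems replace each stage with the far more elaborate techniques the survey catalogs (band-pass filtering, Pan-Tompkins-style QRS detection, morphological features, trained classifiers), but the data flow is the same.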

  19. Robust Pedestrian Classification Based on Hierarchical Kernel Sparse Representation

    Directory of Open Access Journals (Sweden)

    Rui Sun

    2016-08-01

    Full Text Available Vision-based pedestrian detection has become an active topic in computer vision and autonomous vehicles. It aims at detecting pedestrians appearing ahead of the vehicle using a camera so that autonomous vehicles can assess the danger and take action. Due to varied illumination and appearance, complex backgrounds and occlusion, pedestrian detection in outdoor environments is a difficult problem. In this paper, we propose a novel hierarchical feature extraction and weighted kernel sparse representation model for pedestrian classification. Initially, hierarchical feature extraction based on a CENTRIST descriptor is used to capture discriminative structures, and a max pooling operation is used to enhance invariance to varying appearance. Then, a kernel sparse representation model is proposed to fully exploit the discriminative information embedded in the hierarchical local features, and a Gaussian weight function is used as the measure to effectively handle occlusion in pedestrian images. Extensive experiments are conducted on benchmark databases, including INRIA, Daimler, an artificially generated dataset and a real occluded dataset, demonstrating the more robust performance of the proposed method compared to state-of-the-art pedestrian classification methods.

  20. A Cluster Based Approach for Classification of Web Results

    Directory of Open Access Journals (Sweden)

    Apeksha Khabia

    2014-12-01

    Full Text Available Nowadays a significant amount of information on the web is present in the form of text, e.g., reviews, forum postings, blogs, news articles, email messages, and web pages. It becomes difficult to classify documents into predefined categories as the number of documents grows. Clustering is the partitioning of data into clusters so that the data in each cluster share some common trait, often proximity according to some defined measure. The underlying distribution of a data set can be partly depicted by the learned clusters under the guidance of the initial data set. Thus, clusters of documents can be employed to train a classifier by using defined features of those clusters. An important issue is also to classify text data from the web into different clusters by mining the knowledge. Accordingly, this paper presents a review of most document clustering techniques and cluster-based classification techniques used so far. Pre-processing of text data sets and document clustering methods are also explained in brief.
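    The cluster-then-classify idea can be sketched with nearest-centroid classification over term-frequency vectors; the tiny corpus, cluster labels, and names below are invented for illustration.

```python
import math
from collections import Counter

def tf(doc):
    """Term-frequency vector of a document."""
    return Counter(doc.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def centroid(docs):
    """Summed term frequencies act as the cluster's representative."""
    c = Counter()
    for d in docs:
        c.update(tf(d))
    return c

# Invented mini-corpus already grouped into clusters:
clusters = {
    'sports': ["football match score", "tennis match result"],
    'tech':   ["python code review", "web code release"],
}
centroids = {label: centroid(docs) for label, docs in clusters.items()}

def classify(doc):
    """Label a new page by its most similar cluster centroid."""
    v = tf(doc)
    return max(centroids, key=lambda label: cosine(v, centroids[label]))
```

Here the clusters play the role of training data: each centroid summarizes the defined features of its cluster, and new web results are assigned to the nearest one.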

  3. Gear Crack Level Classification Based on EMD and EDT

    Directory of Open Access Journals (Sweden)

    Haiping Li

    2015-01-01

    Full Text Available Gears are among the most essential parts of rotating machinery, and crack faults are one of the damage modes that occur most frequently in gears. This paper therefore deals with the problem of classifying different crack levels. The proposed method is mainly based on empirical mode decomposition (EMD) and the Euclidean distance technique (EDT). First, the vibration signal acquired by an accelerometer is processed by EMD and intrinsic mode functions (IMFs) are obtained. Then, a correlation-coefficient-based method is proposed to select the sensitive IMFs which contain the main gear fault information, and the energy of these IMFs is chosen as the fault feature after comparison with kurtosis and skewness. Finally, Euclidean distances between the test sample and the trained samples of the four classes are calculated, and on this basis the fault level of the test sample is classified. The proposed approach is tested and validated through a gearbox experiment in which four crack levels and three kinds of loads are utilized. The results show that the proposed method has high accuracy in classifying different crack levels and may be adaptable to different conditions.
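    The final EDT step amounts to nearest-template matching on IMF energy features. The IMF samples and class templates below are illustrative placeholders, not gearbox measurements, and the EMD and IMF-selection steps are assumed to have been done already.

```python
import math

def energy(imf):
    """Energy of one intrinsic mode function (sum of squared samples)."""
    return sum(x * x for x in imf)

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Trained energy templates for four crack levels (illustrative numbers):
templates = {
    'healthy': [1.0, 0.5, 0.2],
    'slight':  [1.4, 0.9, 0.3],
    'medium':  [2.0, 1.5, 0.6],
    'severe':  [3.1, 2.4, 1.1],
}

def classify(feature):
    """Assign the class whose template is nearest in Euclidean distance."""
    return min(templates, key=lambda c: euclidean(feature, templates[c]))

# Energies of the (placeholder) sensitive IMFs of a test sample:
test_imfs = [[1.2, 0.8], [1.0, 0.6], [0.7]]
feature = [energy(imf) for imf in test_imfs]
```

The distance comparison is what makes the method cheap at run time: once templates are trained, classification is a handful of subtractions per class.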

  4. Delaunay triangulation-based pit density estimation for the classification of polyps in high-magnification chromo-colonoscopy.

    Science.gov (United States)

    Häfner, M; Liedlgruber, M; Uhl, A; Vécsei, A; Wrba, F

    2012-09-01

    In this work we propose a method to extract shape-based features from endoscopic images for the automated classification of colonic polyps. The method is based on the density of pits as used in the pit pattern classification scheme, which is commonly used for the classification of colonic polyps. For the detection of pits we employ a noise-robust variant of the LBP operator, which we extend with adaptive thresholding to be robust against local texture variations. Based on the detected pit candidates we compute a Delaunay triangulation and use the edge lengths of the resulting triangles to construct histograms. These are then used in conjunction with the k-NN classifier to classify images. We show that, compared to a previously developed method, the proposed method not only almost always achieves higher classification results in our application scenario, but also significantly outperforms the previous method in terms of computational demand.

  5. Knowledge-Based Trajectory Error Pattern Method Applied to an Active Force Control Scheme

    Directory of Open Access Journals (Sweden)

    Endra Pitowarno, Musa Mailah, Hishamuddin Jamaluddin

    2012-08-01

    Full Text Available The active force control (AFC) method is known as a robust control scheme that dramatically enhances the performance of a robot arm, particularly in compensating for disturbance effects. The main task of the AFC method is to estimate the inertia matrix in the feedback loop in order to provide the correct motor torque required to cancel out these disturbances. Several intelligent control schemes have already been introduced to enhance the methods of estimating the inertia matrix, such as those using neural networks, iterative learning and fuzzy logic. In this paper, we propose an alternative scheme called the Knowledge-Based Trajectory Error Pattern Method (KBTEPM) to suppress the trajectory tracking error of the AFC scheme. The knowledge is developed from the trajectory tracking error characteristic based on previous experimental results of the crude approximation method; it produces a unique, new and desirable error pattern when a trajectory command is forced. An experimental study was performed using simulation of the AFC scheme with KBTEPM applied to a two-link planar manipulator, for which a set of rule-based algorithms is derived. A number of previous AFC schemes are also reviewed as benchmarks. The simulation results show that the AFC-KBTEPM scheme successfully reduces the trajectory tracking error significantly, even in the presence of the introduced disturbances. Key Words: active force control, estimated inertia matrix, robot arm, trajectory error pattern, knowledge-based.

  6. FINGERPRINT-BASED KEY BINDING/RECOVERING SCHEME BASED ON FUZZY VAULT

    Institute of Scientific and Technical Information of China (English)

    Feng Quan; Su Fei; Cai Anni

    2008-01-01

    This letter proposes a fingerprint-based key binding/recovering scheme built on the fuzzy vault. Fingerprint minutiae data and the cryptographic key are merged together by a multivariable linear function. First, the minutiae data are bound to a set of random data through the linear function; the number of the function's variables is determined by the required number of matched minutiae. Then, a new key derived from the random data is used to encrypt the cryptographic key. Lastly, the binding data are protected using the fuzzy vault scheme. The proposed scheme gives the system the flexibility to use a changeable number of minutiae to bind/recover the protected key, and a unified method regardless of the length of the key.
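    The binding idea alone (not the full fuzzy-vault construction) can be sketched as a linear combination of minutiae values with random data, from which the protecting key is derived; the modulus, hash choice, and all numeric values below are illustrative assumptions.

```python
import hashlib

P = 2**31 - 1  # a prime modulus (illustrative choice)

def bind(minutiae, random_coeffs):
    """Multivariable linear function combining minutiae with random data."""
    return sum(r * m for r, m in zip(random_coeffs, minutiae)) % P

def derive_key(value):
    """Derive the key that would encrypt the real cryptographic key."""
    return hashlib.sha256(str(value).encode()).digest()

enroll_minutiae = [1021, 877, 1543, 662]   # quantized minutiae (made up)
coeffs = [5, 11, 23, 42]                   # random data (fixed here for demo)

locked = bind(enroll_minutiae, coeffs)
key = derive_key(locked)

# Recovery: only presenting the same matched minutiae re-derives the key.
genuine = derive_key(bind(enroll_minutiae, coeffs))
impostor = derive_key(bind([1021, 877, 1543, 999], coeffs))
```

The number of variables in `bind` plays the role described in the abstract: it sets how many minutiae must match before the key can be re-derived.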

  7. Rainfall Prediction using Data-Core Based Fuzzy Min-Max Neural Network for Classification

    OpenAIRE

    Rajendra Palange; Nishikant Pachpute

    2015-01-01

    This paper proposes a rainfall prediction system using a classification technique. An advanced and modified neural network called the Data-Core Based Fuzzy Min-Max Neural Network (DCFMNN) is used for pattern classification, and this classification method is applied to predict rainfall. The fuzzy min-max neural network (FMNN), which creates hyperboxes for classification and prediction, has a problem of overlapping neurons that is resolved in DCFMNN to give greater accu...
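    The hyperbox idea behind fuzzy min-max networks can be sketched with a simplified membership function; the decay rule, the sensitivity parameter, and the feature boxes below are illustrative, not the exact FMNN/DCFMNN formulation.

```python
def membership(x, v, w, gamma=4.0):
    """1 inside the hyperbox [v, w]; decays linearly with distance outside."""
    m = 1.0
    for xi, vi, wi in zip(x, v, w):
        outside = max(0.0, vi - xi, xi - wi)   # distance outside [vi, wi]
        m = min(m, max(0.0, 1.0 - gamma * outside))
    return m

def classify(x, boxes):
    """boxes: (label, min_point, max_point); pick the most activated box."""
    return max(boxes, key=lambda b: membership(x, b[1], b[2]))[0]

# Invented 2-D feature hyperboxes, e.g. (humidity, cloud cover):
boxes = [
    ('rain',    [0.6, 0.7], [0.9, 1.0]),
    ('no_rain', [0.0, 0.0], [0.4, 0.5]),
]
pred_wet = classify([0.8, 0.9], boxes)
pred_dry = classify([0.1, 0.2], boxes)
```

The overlap problem the abstract mentions arises when boxes of different classes intersect, so a point can fully activate two labels at once; DCFMNN's contribution is resolving that ambiguity.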

  8. DISTRIBUTED CERTIFICATE AUTHORITY IN CLUSTER-BASED MANET USING MULTI SECRET SHARING SCHEME

    Directory of Open Access Journals (Sweden)

    Mohammed Azza

    2015-11-01

    Full Text Available Providing secure communications in mobile ad hoc networks (MANETs) is an important and difficult problem, due to the lack of a key management infrastructure. Authentication is an important security service in MANETs. To provide a node authentication service we use fully distributed certificate authorities (FDCA) based on threshold cryptography. In this paper we propose an efficient and verifiable multi-secret sharing scheme for cluster-based MANETs with low computational cost. Our scheme is based on overdetermined systems of linear equations in the Galois field GF(2^r). We have analyzed our scheme against security and performance criteria and compared it with existing approaches. The efficiency of the proposed scheme was verified and evaluated by simulation; simulation results show that this approach is scalable.

  9. An Improved MAC Scheme of HORNET Based on Node Structure with Variable Optical Buffer

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    An improved unslotted CSMA/CA MAC scheme for HORNET, based on a node structure with a variable optical buffer, is reported. It can be used to transmit all variable-length IP packets efficiently in the WDM network.

  10. A New Loss-Tolerant Image Encryption Scheme Based on Secret Sharing and Two Chaotic Systems

    Directory of Open Access Journals (Sweden)

    Li Li

    2012-04-01

    Full Text Available In this study, we propose an efficient loss-tolerant image encryption scheme that protects both confidentiality and loss-tolerance simultaneously in shadow images. In this scheme, we generate the key sequence based on two chaotic maps and then encrypt the image during the sharing phase based on Shamir’s method. Experimental results show better performance of the proposed scheme for different images than other methods from the standpoint of human vision. Security analysis confirms a high probability of resisting both brute-force and collusion attacks.
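    The scheme's two ingredients, a chaotic keystream and Shamir secret sharing, can be sketched as follows. This demo uses a single logistic map and deterministic stand-in polynomial coefficients, whereas the paper uses two chaotic maps and shares the image data itself; all parameters are illustrative.

```python
P = 257  # prime > 255, so pixel byte values fit in the field

def logistic_keystream(x0, r, length):
    """Chaotic keystream bytes from the logistic map x -> r*x*(1-x)."""
    x, out = x0, []
    for _ in range(length):
        x = r * x * (1 - x)
        out.append(int(x * 255) & 0xFF)
    return out

def encrypt(pixels, x0=0.3141, r=3.99):
    """XOR each pixel with the chaotic keystream (its own inverse)."""
    return [p ^ k for p, k in zip(pixels, logistic_keystream(x0, r, len(pixels)))]

def shamir_share(secret, k, n):
    """Split 'secret' (< P) into n shadows; any k of them recover it."""
    # Deterministic demo coefficients stand in for random ones:
    coeffs = [secret] + [(secret * 7 + i * 13) % P for i in range(1, k)]
    return [(x, sum(c * pow(x, e, P) for e, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def shamir_recover(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

pixels = [12, 200, 34, 99]
cipher = encrypt(pixels)
restored = encrypt(cipher)              # XOR keystream is its own inverse
shares = shamir_share(123, k=3, n=5)
```

Loss tolerance comes from the threshold property: with (k, n) = (3, 5), any two shadows may be lost and the secret is still recoverable.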

  11. Utilizing ECG-Based Heartbeat Classification for Hypertrophic Cardiomyopathy Identification.

    Science.gov (United States)

    Rahman, Quazi Abidur; Tereshchenko, Larisa G; Kongkatong, Matthew; Abraham, Theodore; Abraham, M Roselle; Shatkay, Hagit

    2015-07-01

    Hypertrophic cardiomyopathy (HCM) is a cardiovascular disease where the heart muscle is partially thickened and blood flow is (potentially fatally) obstructed. A test based on electrocardiograms (ECG) that record the heart electrical activity can help in early detection of HCM patients. This paper presents a cardiovascular-patient classifier we developed to identify HCM patients using standard 10-second, 12-lead ECG signals. Patients are classified as having HCM if the majority of their recorded heartbeats are recognized as characteristic of HCM. Thus, the classifier's underlying task is to recognize individual heartbeats segmented from 12-lead ECG signals as HCM beats, where heartbeats from non-HCM cardiovascular patients are used as controls. We extracted 504 morphological and temporal features—both commonly used and newly-developed ones—from ECG signals for heartbeat classification. To assess classification performance, we trained and tested a random forest classifier and a support vector machine classifier using 5-fold cross validation. We also compared the performance of these two classifiers to that obtained by a logistic regression classifier, and the first two methods performed better than logistic regression. The patient-classification precision of random forests and of support vector machine classifiers is close to 0.85. Recall (sensitivity) and specificity are approximately 0.90. We also conducted feature selection experiments by gradually removing the least informative features; the results show that a relatively small subset of 264 highly informative features can achieve performance measures comparable to those achieved by using the complete set of features. PMID:25915962

  12. A RBF Based Local Gridfree Scheme for Unsteady Convection-Diffusion Problems

    Directory of Open Access Journals (Sweden)

    Sanyasiraju VSS Yedida

    2009-12-01

    Full Text Available In this work a Radial Basis Function (RBF) based local gridfree scheme is presented for unsteady convection-diffusion equations. Numerical studies have been made using the multiquadric (MQ) radial function. Euler and three-stage Runge-Kutta schemes have been used for temporal discretization. The developed scheme is compared with its finite difference (FD) counterpart, and the solutions obtained with the former are found to be superior. As expected, for a fixed time step and large nodal densities, though the Runge-Kutta scheme maintains a higher order of accuracy than the Euler method, the improvement in the solution is independent of the temporal discretization; in the developed scheme it is achieved by optimizing the shape parameter of the RBF.
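    A minimal sketch of the building block of such local gridfree schemes: derivative weights on a small stencil obtained by multiquadric RBF collocation. The 3-node stencil and the shape parameter c are illustrative choices, not the paper's optimized values.

```python
import math

def phi(r, c):
    """Multiquadric radial function."""
    return math.sqrt(r * r + c * c)

def dphi(dx, c):
    """d/dx of phi(|x - xj|), expressed through the signed offset dx."""
    return dx / math.sqrt(dx * dx + c * c)

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting (small dense systems)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def derivative_weights(nodes, xc, c=1.0):
    """Weights w with sum_i w_i * u(x_i) ~ u'(xc), from RBF collocation:
    sum_i w_i * phi(|x_i - x_j|) = d/dx phi(|x - x_j|) at x = xc, for all j."""
    A = [[phi(abs(xi - xj), c) for xi in nodes] for xj in nodes]
    b = [dphi(xc - xj, c) for xj in nodes]
    return solve(A, b)

h = 0.1
nodes = [-h, 0.0, h]
w = derivative_weights(nodes, 0.0)
# Apply the stencil to u(x) = sin(x) at x = 0 (exact derivative: cos(0) = 1)
approx = sum(wi * math.sin(xi) for wi, xi in zip(w, nodes))
```

Because the weights depend on the shape parameter c, tuning c tunes the spatial accuracy, which is the optimization the abstract refers to.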

  13. Optimal Scheme Selection of Agricultural Production Structure Adjustment - Based on DEA Model; Punjab (Pakistan)

    Institute of Scientific and Technical Information of China (English)

    Zeeshan Ahmad; Meng Jun; Muhammad Abdullah; Mazhar Nadeem Ishaq; Majid Lateef; Imran Khan

    2015-01-01

    This paper used the modern evaluation method of DEA (Data Envelopment Analysis) to assess comparative efficiency and, on that basis, to choose the optimal scheme of agricultural production structure adjustment from among multiple candidates. Based on the results of the DEA model, we analyzed the scale advantages of each candidate scheme, examined in depth the underlying reasons why some schemes were not DEA-efficient, and elucidated the approach and methodology to improve those candidate plans. Finally, another method was proposed to rank the schemes and select the optimal one. The research is valuable for guiding practice when adjustment of the agricultural production industrial structure is carried out.

  14. A Sentiment Delivering Estimate Scheme Based on Trust Chain in Mobile Social Network

    Directory of Open Access Journals (Sweden)

    Meizi Li

    2015-01-01

    Full Text Available User sentiment analysis has become a flourishing frontier in data mining on mobile social network platforms, since mobile social networks play a significant role in users’ daily communication and sentiment interaction. This study examines a scheme for sentiment estimation that uses users’ trust relationships to evaluate sentiment delivering. First, we give an overview of the sentiment delivering estimate scheme and propose its related definitions, that is, the trust chain among users, sentiment semantics, and sentiment ontology. Second, we propose the trust chain model and its evaluation method, which comprises the evaluation of atomic, serial, parallel, and combined trust chains. We then propose a sentiment modeling method by presenting its modeling rules. Further, we propose sentiment delivering estimate schemes of two kinds, explicit and implicit, based on the trust chain and the sentiment modeling method. Finally, experiments and results are given to further demonstrate the effectiveness and feasibility of our scheme.
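    Trust-chain aggregation can be sketched under common assumptions: a serial chain multiplies trust along the path, and parallel chains are combined by a weighted average that discounts longer paths. The paper's exact evaluation formulas may differ; the values below are invented.

```python
def serial(chain):
    """Trust along a path A -> B -> ...: product of link trust values."""
    t = 1.0
    for link in chain:
        t *= link
    return t

def parallel(chains):
    """Combine independent paths; longer chains get smaller weights."""
    weights = [1.0 / len(c) for c in chains]
    return sum(w * serial(c) for w, c in zip(weights, chains)) / sum(weights)

path_via_b = [0.9, 0.8]   # A trusts B 0.9, B trusts C 0.8
path_direct = [0.7]       # A trusts C directly
trust_ac = parallel([path_via_b, path_direct])
```

A combined chain in the paper's sense would nest these operators: evaluate each serial segment first, then merge the parallel branches.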

  15. RTP-Packets' Loss Recovery Scheme Based on Layered Buffer-Routers

    Institute of Scientific and Technical Information of China (English)

    XU Xian-bin; YU Wei; CHEN Xin-meng

    2004-01-01

    This paper introduces an RTP-packet loss recovery scheme for the MPEG-4 playback-type multicast application model, based on retransmission. Through the auxiliary, coordinated buffering-and-playing scheme of layered "buffer-routers", RTP-packet loss recovery within a limited time is made possible. In this scheme, retransmission requests are held in the buffer when network congestion occurs; thus the degree of congestion is not worsened, nor is the retransmission request lost when it is sent to the higher-level buffer-router. The proposed RTP-packet loss recovery scheme applies not only to MPEG-4 multicast in a LAN, but can also be extended to the more spacious WAN (wide area network) when user groups are relatively concentrated in a number of local areas.
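    The buffer-router behaviour described above can be sketched as a bounded packet cache that serves retransmission requests locally and holds, rather than escalates, requests during congestion. The class, field names, and capacities are invented for illustration.

```python
import collections

class BufferRouter:
    def __init__(self, capacity=64):
        self.buffer = collections.OrderedDict()   # seq -> payload
        self.capacity = capacity
        self.pending = collections.deque()        # requests held during congestion
        self.congested = False

    def forward(self, seq, payload):
        """Cache each forwarded RTP packet, evicting the oldest when full."""
        self.buffer[seq] = payload
        if len(self.buffer) > self.capacity:
            self.buffer.popitem(last=False)

    def request_retransmit(self, seq):
        """Serve from the local buffer; queue the request under congestion;
        None means the request must go to the higher-level buffer-router."""
        if self.congested:
            self.pending.append(seq)
            return None
        return self.buffer.get(seq)

    def relieve_congestion(self):
        """Answer the requests that were buffered while congested."""
        self.congested = False
        return [self.buffer.get(s) for s in self.pending]

r = BufferRouter(capacity=2)
r.forward(1, 'a'); r.forward(2, 'b'); r.forward(3, 'c')   # seq 1 evicted
hit = r.request_retransmit(3)
miss = r.request_retransmit(1)          # evicted, so escalate upward
r.congested = True
held = r.request_retransmit(2)          # queued instead of escalated
recovered = r.relieve_congestion()
```

Holding requests during congestion is the key design choice in the abstract: the request neither adds traffic to a congested link nor gets lost.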

  16. A Non-symmetric Digital Image Secure Communication Scheme Based on Generalized Chaos Synchronization System

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiao-Hong; MIN Le-Quan

    2005-01-01

    Based on a generalized chaos synchronization system and a discrete Sinai map, a non-symmetric true color (RGB) digital image secure communication scheme is proposed. The scheme first changes an ordinary 8-bit RGB digital image into unrecognizable disorder codes and then transforms the disorder codes into a 16-bit RGB digital image for transmission. A receiver uses a non-symmetric key to verify the authentication of the received data origin and decrypts the ciphertext. The scheme can encrypt and decrypt most formatted digital RGB images recognized by computers, and recovers the plaintext almost without any errors. The scheme is suitable for network image communications. Analysis of the key space, the sensitivity of key parameters, and the correlation of encrypted images implies that this scheme has sound security.

  17. A Novel Spectrum Detection Scheme Based on Dynamic Threshold in Cognitive Radio Systems

    OpenAIRE

    Guicai Yu; Chengzhi Long; Mantian Xiang

    2012-01-01

    In cognitive radio networks, nodes should have the capability to decide whether a signal from a primary transmitter is locally present in a certain spectrum within a short detection period. This study presents a new spectrum detection algorithm based on a dynamic threshold. Spectrum detection schemes based on a fixed threshold are sensitive to noise uncertainty; the proposed scheme can improve robustness to noise uncertainty and achieve good detection performance without increasing the c...
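    Energy detection with a dynamic, noise-tracking threshold can be sketched as follows; the factor k, the statistics used, and all sample values are illustrative assumptions rather than this record's algorithm.

```python
import statistics

def energy(samples):
    """Average signal energy of one sensing window."""
    return sum(x * x for x in samples) / len(samples)

def dynamic_threshold(noise_energies, k=3.0):
    """Threshold tracks the estimated noise floor instead of staying fixed."""
    return statistics.mean(noise_energies) + k * statistics.pstdev(noise_energies)

def primary_present(window, noise_energies, k=3.0):
    """Declare the primary user present when energy exceeds the threshold."""
    return energy(window) > dynamic_threshold(noise_energies, k)

noise_history = [1.02, 0.97, 1.05, 0.99, 1.01]   # noise-only energy estimates
quiet = [0.9, -1.1, 1.0, -0.95]                  # noise-like window
busy = [2.0, -2.1, 1.9, -2.2]                    # window with a primary signal
quiet_detect = primary_present(quiet, noise_history)
busy_detect = primary_present(busy, noise_history)
```

Because the threshold moves with the measured noise statistics, a drift in the noise floor shifts the decision boundary instead of causing systematic false alarms, which is the robustness to noise uncertainty the abstract claims.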

  18. Application of Object Based Classification and High Resolution Satellite Imagery for Savanna Ecosystem Analysis

    Directory of Open Access Journals (Sweden)

    Jane Southworth

    2010-12-01

    Full Text Available Savanna ecosystems are an important component of dryland regions and yet are exceedingly difficult to study using satellite imagery. Savannas are composed of varying amounts of trees, shrubs and grasses, and typically traditional classification schemes or vegetation indices cannot differentiate across class types. This research utilizes object-based classification (OBC) for a region in Namibia, using IKONOS imagery, to help differentiate tree canopies, and therefore woodland savanna, from shrub or grasslands. The methodology involved the identification and isolation of tree canopies within the imagery; the resulting tree polygon layers had an overall accuracy of 84%. In addition, the results were scaled up to a corresponding Landsat image of the same region, and the OBC results were compared to corresponding NDVI pixel values. The results were not compelling, indicating once more the problems of these traditional image analysis techniques for savanna ecosystems. Overall, the use of OBC holds great promise for this ecosystem and could be utilized more frequently in studies of vegetation structure.

  19. Hubble Classification

    Science.gov (United States)

    Murdin, P.

    2000-11-01

    A classification scheme for galaxies, devised in its original form in 1925 by Edwin P Hubble (1889-1953), and still widely used today. The Hubble classification recognizes four principal types of galaxy—elliptical, spiral, barred spiral and irregular—and arranges these in a sequence that is called the tuning-fork diagram....

  20. Credal Classification based on AODE and compression coefficients

    CERN Document Server

    Corani, Giorgio

    2012-01-01

    Bayesian model averaging (BMA) is an approach to average over alternative models; yet, it usually gets excessively concentrated around the single most probable model, therefore achieving only sub-optimal classification performance. The compression-based approach (Boulle, 2007) overcomes this problem, averaging over the different models by applying a logarithmic smoothing over the models' posterior probabilities. This approach has shown excellent performances when applied to ensembles of naive Bayes classifiers. AODE is another ensemble of models with high performance (Webb, 2005), based on a collection of non-naive classifiers (called SPODE) whose probabilistic predictions are aggregated by simple arithmetic mean. Aggregating the SPODEs via BMA rather than by arithmetic mean deteriorates the performance; instead, we aggregate the SPODEs via the compression coefficients and we show that the resulting classifier obtains a slight but consistent improvement over AODE. However, an important issue in any Bayesian e...