WorldWideScience

Sample records for classification schemes based

  1. Hepatic CT Image Query Based on Threshold-based Classification Scheme with Gabor Features

    Institute of Scientific and Technical Information of China (English)

    JIANG Li-jun; LUO Yong-xing; ZHAO Jun; ZHUANG Tian-ge

    2008-01-01

    Hepatic computed tomography (CT) images were analyzed with Gabor functions. A threshold-based classification scheme using Gabor features was then proposed and applied to the retrieval of hepatic CT images. In our experiments, a batch of hepatic CT images containing several types of CT findings was used, and the threshold-based scheme was compared with Zhao's image classification scheme and a support vector machine (SVM) scheme.
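
    As a rough illustration of the pipeline this record describes, the sketch below (Python, assuming NumPy and SciPy) extracts mean Gabor filter-bank energies from a synthetic ROI and applies a simple threshold rule; the kernel parameters, threshold value and labels are illustrative assumptions, not the paper's actual settings.

      import numpy as np
      from scipy.signal import fftconvolve

      def gabor_kernel(size=21, sigma=4.0, theta=0.0, lam=10.0):
          """Real part of a 2-D Gabor filter."""
          half = size // 2
          y, x = np.mgrid[-half:half + 1, -half:half + 1]
          xr = x * np.cos(theta) + y * np.sin(theta)
          yr = -x * np.sin(theta) + y * np.cos(theta)
          return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

      def gabor_energy(image, thetas=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
          """Mean filter-response energy over several orientations."""
          return np.array([np.abs(fftconvolve(image, gabor_kernel(theta=t), mode="same")).mean()
                           for t in thetas])

      rng = np.random.default_rng(0)
      roi = rng.normal(0.5, 0.1, (64, 64))          # stand-in for a hepatic CT ROI
      features = gabor_energy(roi)
      THRESHOLD = 0.5                               # illustrative decision threshold
      label = "finding" if features.mean() > THRESHOLD else "normal"
      print(features, label)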

  2. Investigation into Text Classification With Kernel Based Schemes

    Science.gov (United States)

    2010-03-01

    classification/categorization applications. The text database considered in this study was collected from the IEEE Xplore database website [2]. The documents collected were limited to Electrical engineering... Linear Discriminant Analysis (LDA) scheme. Titles, along with abstracts from IEEE journal articles published between 1990 and 1999 with specific key

  3. Evaluation of Effectiveness of Wavelet Based Denoising Schemes Using ANN and SVM for Bearing Condition Classification

    Directory of Open Access Journals (Sweden)

    Vijay G. S.

    2012-01-01

    Wavelet-based denoising has proven its ability to denoise bearing vibration signals by improving the signal-to-noise ratio (SNR) and reducing the root-mean-square error (RMSE). In this paper, seven wavelet-based denoising schemes are evaluated based on the performance of an Artificial Neural Network (ANN) and a Support Vector Machine (SVM) for bearing condition classification. The work consists of two parts. In the first part, a synthetic signal simulating a defective bearing vibration signal with Gaussian noise was subjected to the denoising schemes, and the best scheme in terms of SNR and RMSE was identified. In the second part, vibration signals collected from a customized Rolling Element Bearing (REB) test rig for four bearing conditions were subjected to the same schemes. Several time- and frequency-domain features were extracted from the denoised signals, of which a few sensitive features were selected using Fisher's Criterion (FC). The extracted features were used to train and test the ANN and the SVM. The best denoising scheme, identified from the classification performances of the ANN and the SVM, was found to be the same as the one obtained using the synthetic signal.
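
    A minimal sketch of one such denoising scheme evaluated by SNR and RMSE, assuming the PyWavelets package; the db8 wavelet, decomposition level and universal soft threshold are illustrative choices, not the seven schemes the paper compares.

      import numpy as np
      import pywt

      rng = np.random.default_rng(1)
      t = np.linspace(0, 1, 2048)
      clean = np.sin(2 * np.pi * 50 * t) * (1 + 0.5 * np.sin(2 * np.pi * 5 * t))
      noisy = clean + rng.normal(0, 0.4, t.size)    # bearing-like signal plus noise

      coeffs = pywt.wavedec(noisy, "db8", level=4)
      sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
      thr = sigma * np.sqrt(2 * np.log(noisy.size))           # universal threshold
      den = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
      denoised = pywt.waverec(den, "db8")[: noisy.size]

      rmse = np.sqrt(np.mean((denoised - clean) ** 2))
      snr = 10 * np.log10(np.sum(clean**2) / np.sum((denoised - clean) ** 2))
      print(f"SNR = {snr:.1f} dB, RMSE = {rmse:.3f}")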

  4. A new Fourier transform based CBIR scheme for mammographic mass classification: a preliminary invariance assessment

    Science.gov (United States)

    Gundreddy, Rohith Reddy; Tan, Maxine; Qiu, Yuchen; Zheng, Bin

    2015-03-01

    The purpose of this study is to develop and test a new content-based image retrieval (CBIR) scheme that achieves higher reproducibility when implemented in an interactive computer-aided diagnosis (CAD) system, without significantly reducing lesion classification performance. The new Fourier transform based CBIR algorithm determines the similarity of two regions of interest (ROI) from the difference in average regional image pixel value distribution between the two Fourier transform mapped images under comparison. A reference image database of 227 ROIs depicting verified soft-tissue breast lesions was used. For each testing ROI, the queried lesion center was systematically shifted by 10 to 50 pixels to simulate inter-user variation in querying a suspicious lesion center with an interactive CAD system. Lesion classification performance and reproducibility under these center shifts were assessed and compared among three CBIR schemes based on the Fourier transform, mutual information, and Pearson correlation. Each CBIR scheme retrieved the 10 most similar reference ROIs and computed a likelihood score of the queried ROI depicting a malignant lesion. The experimental results showed that the three CBIR schemes yielded very comparable lesion classification performance as measured by the areas under ROC curves (p > 0.498). However, the CBIR scheme using the Fourier transform yielded the highest invariance to both queried lesion center shift and lesion size change. This study demonstrates the feasibility of improving the robustness of interactive CAD systems by adding this new Fourier transform based image feature to CBIR schemes.
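
    The regional-spectrum similarity idea lends itself to a compact sketch. The code below (assuming NumPy; the region grid, the synthetic database and the malignancy labels are stand-ins) maps each ROI to its FFT magnitude, averages it over coarse regions, retrieves the 10 nearest references and scores malignancy likelihood as the fraction of malignant references retrieved.

      import numpy as np

      def regional_spectrum(roi, grid=4):
          mag = np.abs(np.fft.fftshift(np.fft.fft2(roi)))
          h, w = mag.shape
          blocks = mag.reshape(grid, h // grid, grid, w // grid)
          return blocks.mean(axis=(1, 3)).ravel()          # one mean per region

      def similarity(a, b):
          return -np.abs(regional_spectrum(a) - regional_spectrum(b)).sum()

      rng = np.random.default_rng(2)
      database = [rng.normal(size=(64, 64)) for _ in range(50)]
      labels = rng.integers(0, 2, 50)                      # 1 = "malignant" (synthetic)
      query = rng.normal(size=(64, 64))

      scores = np.array([similarity(query, ref) for ref in database])
      top10 = np.argsort(scores)[-10:]                     # 10 most similar ROIs
      likelihood = labels[top10].mean()                    # fraction of malignant refs
      print(f"malignancy likelihood score: {likelihood:.2f}")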

  5. Segmentation techniques evaluation based on a single compact breast mass classification scheme

    Science.gov (United States)

    Matheus, Bruno R. N.; Marcomini, Karem D.; Schiabel, Homero

    2016-03-01

    In this work, several segmentation techniques are evaluated using a simple centroid-based classification system for breast mass delineation in digital mammography images, with the aim of determining the best one for future CADx developments. Six techniques were tested: Otsu, SOM, EICAMM, Fuzzy C-Means, K-Means, and Level-Set. All of them were applied to segment 317 mammography images from the DDSM database. A single compact set of attributes was extracted, and two centroids were defined, one for malignant and one for benign cases. The final classification was based on proximity to a given centroid, and the best results were obtained with the Level-Set technique at 68.1% accuracy, indicating this method as the most promising for breast mass segmentation aimed at more precise interpretation in CADx schemes.
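
    A minimal sketch of the centroid rule described above: one centroid per class, with the label decided by the nearer centroid. The eight-dimensional feature vectors are synthetic stand-ins for the paper's attribute set.

      import numpy as np

      rng = np.random.default_rng(3)
      benign = rng.normal(0.0, 1.0, (150, 8))       # 8 shape/texture attributes each
      malignant = rng.normal(1.5, 1.0, (167, 8))

      c_benign = benign.mean(axis=0)
      c_malignant = malignant.mean(axis=0)

      def classify(x):
          d_b = np.linalg.norm(x - c_benign)
          d_m = np.linalg.norm(x - c_malignant)
          return "malignant" if d_m < d_b else "benign"

      print(classify(rng.normal(1.4, 1.0, 8)))      # most likely "malignant"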

  6. A new classification scheme of plastic wastes based upon recycling labels

    Energy Technology Data Exchange (ETDEWEB)

    Özkan, Kemal, E-mail: kozkan@ogu.edu.tr [Computer Engineering Dept., Eskişehir Osmangazi University, 26480 Eskişehir (Turkey); Ergin, Semih, E-mail: sergin@ogu.edu.tr [Electrical Electronics Engineering Dept., Eskişehir Osmangazi University, 26480 Eskişehir (Turkey); Işık, Şahin, E-mail: sahini@ogu.edu.tr [Computer Engineering Dept., Eskişehir Osmangazi University, 26480 Eskişehir (Turkey); Işıklı, İdil, E-mail: idil.isikli@bilecik.edu.tr [Electrical Electronics Engineering Dept., Bilecik University, 11210 Bilecik (Turkey)

    2015-01-15

    Highlights: • PET, HDPE and PP types of plastics are considered. • An automated classification of plastic bottles based on feature extraction and classification methods is performed. • The decision mechanism consists of PCA, Kernel PCA, FLDA, SVD and Laplacian Eigenmaps methods. • SVM is selected to achieve the classification task and a majority voting technique is used. - Abstract: Since recycling of materials is widely assumed to be environmentally and economically beneficial, reliable sorting and processing of waste packaging materials such as plastics is very important for recycling with high efficiency. An automated system that can quickly categorize these materials is certainly needed for obtaining maximum classification while maintaining high throughput. In this paper, first of all, photographs of plastic bottles were taken and several preprocessing steps were carried out. The first preprocessing step is to extract the plastic area of a bottle from the background. Then, morphological image operations are implemented: edge detection, noise removal, hole removal, image enhancement, and image segmentation. These morphological operations can generally be defined in terms of combinations of erosion and dilation, and they eliminate the effects of bottle color and labels. Secondly, the pixel-wise intensity values of the plastic bottle images were used together with the most popular subspace and statistical feature extraction methods to construct the feature vectors in this study. Only three types of plastics are considered because they occur far more commonly than other plastic types. The decision mechanism consists of five different feature extraction methods, namely Principal Component Analysis (PCA), Kernel PCA (KPCA), Fisher's Linear Discriminant Analysis (FLDA), Singular Value Decomposition (SVD) and Laplacian Eigenmaps (LEMAP), and uses a simple

  7. A novel fractal image compression scheme with block classification and sorting based on Pearson's correlation coefficient.

    Science.gov (United States)

    Wang, Jianji; Zheng, Nanning

    2013-09-01

    Fractal image compression (FIC) is an image coding technology based on the local similarity of image structure. It is widely used in many fields such as image retrieval, image denoising, image authentication, and encryption. FIC, however, suffers from high computational complexity in encoding. Although many schemes have been published to speed up encoding, they do not easily satisfy the encoding time or the reconstructed image quality requirements. In this paper, a new FIC scheme is proposed based on the fact that the affine similarity between two blocks in FIC is equivalent to the absolute value of Pearson's correlation coefficient (APCC) between them. First, all blocks in the range and domain pools are chosen and classified using an APCC-based block classification method to increase the matching probability. Second, the domain blocks in each class are sorted by their APCCs with a preset block of that class, so that the matching domain block for a range block can be searched within the subset of domain blocks whose APCCs to the preset block are close to the APCC between the range block and the preset block. Experimental results show that the proposed scheme can significantly speed up the encoding process in FIC while preserving the reconstructed image quality well.
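
    A minimal sketch of the APCC idea, assuming NumPy: the absolute Pearson correlation between two blocks measures affine matchability, so domain blocks can be pre-sorted by their APCC to a preset block and only nearby entries searched. Block contents, the preset block and the 0.1 search window are illustrative.

      import numpy as np

      def apcc(a, b):
          a, b = a.ravel(), b.ravel()
          return abs(np.corrcoef(a, b)[0, 1])

      rng = np.random.default_rng(4)
      domain_pool = [rng.normal(size=(8, 8)) for _ in range(200)]
      preset = rng.normal(size=(8, 8))                       # preset block of a class
      order = sorted(range(len(domain_pool)),
                     key=lambda i: apcc(domain_pool[i], preset))

      range_block = rng.normal(size=(8, 8))
      key = apcc(range_block, preset)
      # search only domain blocks whose APCC-to-preset is near the range block's
      candidates = [i for i in order
                    if abs(apcc(domain_pool[i], preset) - key) < 0.1] or order
      best = max(candidates, key=lambda i: apcc(domain_pool[i], range_block))
      print("best-matching domain block:", best)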

  8. A risk-based classification scheme for genetically modified foods. I: Conceptual development.

    Science.gov (United States)

    Chao, Eunice; Krewski, Daniel

    2008-12-01

    The predominant paradigm for the premarket assessment of genetically modified (GM) foods reflects heightened public concern by focusing on foods modified by recombinant deoxyribonucleic acid (rDNA) techniques, while foods modified by other methods of genetic modification are generally not assessed for safety. To determine whether a GM product requires less or more regulatory oversight and testing, we developed and evaluated a risk-based classification scheme (RBCS) for crop-derived GM foods. The results of this research are presented in three papers. This paper describes the conceptual development of the proposed RBCS that focuses on two categories of adverse health effects: (1) toxic and antinutritional effects, and (2) allergenic effects. The factors that may affect the level of potential health risks of GM foods are identified. For each factor identified, criteria for differentiating health risk potential are developed. The extent to which a GM food satisfies applicable criteria for each factor is rated separately. A concern level for each category of health effects is then determined by aggregating the ratings for the factors using predetermined aggregation rules. An overview of the proposed scheme is presented, as well as the application of the scheme to a hypothetical GM food.
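
    The rating-and-aggregation step can be illustrated with a toy sketch: each factor receives an ordinal rating and a predetermined rule (here simply the maximum) aggregates the ratings into a concern level for one health-effect category. The factor names, ratings and rule are hypothetical, not the paper's actual criteria.

      RATINGS = {"low": 0, "medium": 1, "high": 2}

      def concern_level(factor_ratings, aggregate=max):
          """Aggregate per-factor ratings into a concern level."""
          scores = [RATINGS[r] for r in factor_ratings.values()]
          return {0: "Level I", 1: "Level II", 2: "Level III"}[aggregate(scores)]

      toxicity_factors = {           # hypothetical ratings for one GM food
          "prior dietary history": "low",
          "estimated exposure": "medium",
          "known toxicity of substance": "low",
      }
      print(concern_level(toxicity_factors))   # -> "Level II"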

  9. Small-scale classification schemes

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2004-01-01

    While coordination mechanisms focus on how classification schemes enable cooperation among people pursuing a common goal, boundary objects embrace the implicit consequences of classification schemes in situations involving conflicting goals. Moreover, the requirements specification focused on functional requirements and provided little information about why these requirements were considered relevant. This stands in contrast to the discussions at the project meetings, where the software engineers made frequent use of both abstract goal descriptions and concrete examples to make sense of the requirements. This difference between the written requirements specification and the oral discussions at the meetings may help explain software engineers' general preference for people, rather than documents, as their information sources.

  10. Lexicon-enhanced sentiment analysis framework using rule-based classification scheme

    Science.gov (United States)

    Khan, Aurangzeb; Ahmad, Shakeel; Qasim, Maria; Khan, Imran Ali

    2017-01-01

    With the rapid increase in social networks and blogs, social media services are increasingly being used by online communities to share their views and experiences about particular products, policies and events. Due to the economic importance of these reviews, there is a growing trend of writing user reviews to promote a product. Nowadays, users consult online blogs and review sites before purchasing products, so user reviews are considered an important source of information in Sentiment Analysis (SA) applications for decision making. In this work, we exploit the wealth of user reviews available through online forums to analyze the semantic orientation of words by categorizing them into positive and negative classes, identifying and classifying the emoticons, modifiers, general-purpose and domain-specific words expressed in the public's feedback about products. The unsupervised learning approach employed in previous studies is becoming less effective due to data sparseness and the low accuracy caused by ignoring emoticons, modifiers, and domain-specific words, which may result in inaccurate classification of user reviews. Lexicon-enhanced sentiment analysis based on a rule-based classification scheme is an alternative approach for improving the sentiment classification of user reviews in online communities. In addition to the sentiment terms used in general-purpose sentiment analysis, we integrate emoticons, modifiers and domain-specific terms to analyze the reviews posted in online communities. To test the effectiveness of the proposed method, we considered user reviews in three domains. The results obtained from different experiments demonstrate that the proposed method overcomes the limitations of previous methods, and that sentiment analysis performance improves when emoticons, modifiers, negations, and domain-specific terms are considered, compared to baseline methods. PMID:28231286
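
    A toy sketch of lexicon-plus-rules scoring of the kind described: general-purpose sentiment terms combined with emoticons, intensity modifiers and negation handling. All lexicon entries and weights are illustrative assumptions.

      LEXICON = {"good": 1, "great": 2, "poor": -1, "terrible": -2}
      EMOTICONS = {":)": 1, ":(": -1}
      MODIFIERS = {"very": 1.5, "slightly": 0.5}
      NEGATIONS = {"not", "never"}

      def score(review):
          total, weight, negate = 0.0, 1.0, False
          for tok in review.lower().split():
              if tok in NEGATIONS:
                  negate = True
              elif tok in MODIFIERS:
                  weight = MODIFIERS[tok]
              else:
                  s = LEXICON.get(tok, EMOTICONS.get(tok, 0)) * weight
                  total += -s if negate else s
                  weight, negate = 1.0, False   # rules apply to the next term only
          return "positive" if total > 0 else "negative" if total < 0 else "neutral"

      print(score("not very good :("))   # negation flips the modified term -> negative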

  11. Current terminology and diagnostic classification schemes.

    Science.gov (United States)

    Okeson, J P

    1997-01-01

    This article reviews the current terminology and classification schemes available for temporomandibular disorders. The origin of each term is presented, and the classification schemes that have been offered for temporomandibular disorders are briefly reviewed. Several important classifications are presented in more detail, with mention of advantages and disadvantages. Final recommendations are provided for future direction in the area of classification schemes.

  12. A risk-based classification scheme for genetically modified foods. II: Graded testing.

    Science.gov (United States)

    Chao, Eunice; Krewski, Daniel

    2008-12-01

    This paper presents a graded approach to the testing of crop-derived genetically modified (GM) foods based on concern levels in a proposed risk-based classification scheme (RBCS) and currently available testing methods. A graded approach offers the potential for more efficient use of testing resources by focusing less on lower concern GM foods, and more on higher concern foods. In this proposed approach to graded testing, products that are classified as Level I would have met baseline testing requirements that are comparable to what is widely applied to premarket assessment of GM foods at present. In most cases, Level I products would require no further testing, or very limited confirmatory analyses. For products classified as Level II or higher, additional testing would be required, depending on the type of the substance, prior dietary history, estimated exposure level, prior knowledge of toxicity of the substance, and the nature of the concern related to unintended changes in the modified food. Level III testing applies only to the assessment of toxic and antinutritional effects from intended changes and is tailored to the nature of the substance in question. Since appropriate test methods are not currently available for all effects of concern, future research to strengthen the testing of GM foods is discussed.

  13. A classification scheme for chimera states

    Science.gov (United States)

    Kemeth, Felix P.; Haugland, Sindre W.; Schmidt, Lennart; Kevrekidis, Ioannis G.; Krischer, Katharina

    2016-09-01

    We present a universal characterization scheme for chimera states applicable to both numerical and experimental data sets. The scheme is based on two correlation measures that enable a meaningful definition of chimera states as well as their classification into three categories: stationary, turbulent, and breathing. In addition, these categories can be further subdivided according to the time-stationarity of these two measures. We demonstrate that this approach is both consistent with previously recognized chimera states and enables us to classify states as chimeras which have not been categorized as such before. Furthermore, the scheme allows for a qualitative and quantitative comparison of experimental chimeras with chimeras obtained through numerical simulations.
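
    As a rough illustration of a curvature-style correlation measure of this kind (not the paper's exact definition), the sketch below takes a discrete second derivative of a snapshot and reports the fraction of the medium that is locally smooth, which separates the coherent and incoherent parts of a synthetic half-chimera profile; the threshold is an assumption.

      import numpy as np

      def coherent_fraction(snapshot, rel_threshold=0.01):
          """Fraction of sites whose local curvature is (nearly) zero."""
          curvature = np.abs(np.gradient(np.gradient(snapshot)))
          return np.mean(curvature <= rel_threshold * curvature.max())

      x = np.linspace(0, 2 * np.pi, 1000)
      rng = np.random.default_rng(5)
      snapshot = np.where(x < np.pi, np.sin(x), rng.uniform(-1, 1, x.size))
      print(f"coherent fraction ~ {coherent_fraction(snapshot):.2f}")   # ~0.5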

  14. A new classification scheme of plastic wastes based upon recycling labels.

    Science.gov (United States)

    Özkan, Kemal; Ergin, Semih; Işık, Şahin; Işıklı, Idil

    2015-01-01

    results agree on. The proposed classification scheme provides a high accuracy rate and is able to run in real-time applications. It can automatically classify plastic bottle types with approximately 90% recognition accuracy. Besides this, the proposed methodology yields approximately 96% classification rate for the separation of PET and non-PET plastic types, and 92% accuracy for the categorization of non-PET plastic types into HDPE or PP.

  15. Fuzzy Vault Scheme Based on Classification of Fingerprint Features

    Institute of Scientific and Technical Information of China (English)

    孙方圆; 郑建德; 徐千惠

    2016-01-01

    To solve the problems of fingerprint template leakage and the inability to combine fingerprints with traditional keys in conventional fingerprint authentication, a fuzzy vault scheme based on classification of fingerprint features (CFM-FV) is proposed in this paper. In our scheme, singular points serve as helper data to pre-align the fingerprint, while minutia features are used to encode the vault. In the verification stage, singular points are extracted from the query fingerprint as helper data for pre-alignment, and the extracted minutia features are then used to reconstruct the polynomial. By combining the classification of fingerprint features with the fuzzy vault scheme, this scheme mitigates, to some extent, the inability of traditional fuzzy vault schemes to align fingerprints blindly.
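
    A heavily simplified fuzzy-vault sketch: a secret key is encoded as polynomial coefficients, genuine minutia values are projected onto the polynomial and hidden among chaff points, and a query that recovers enough genuine points reconstructs the polynomial. Real vaults work over finite fields with error-correcting decoding; this integer toy version is only illustrative.

      import random
      import numpy as np

      random.seed(0)
      key = [7, 3, 2]                                # secret polynomial coefficients
      poly = lambda x: sum(c * x**i for i, c in enumerate(key))

      genuine = [12, 25, 33, 47, 58]                 # quantized minutia features
      vault = [(x, poly(x)) for x in genuine]
      chaff_x = random.sample(range(60, 200), 40)    # chaff lies off the polynomial
      vault += [(x, poly(x) + random.randint(1, 50)) for x in chaff_x]
      random.shuffle(vault)

      # verification: the query fingerprint re-extracts (enough of) the genuine features
      query = {12, 25, 33}
      unlocked = [(x, y) for x, y in vault if x in query]
      xs, ys = zip(*unlocked)
      # three genuine points determine the degree-2 polynomial exactly
      recovered = np.rint(np.polyfit(xs, ys, len(key) - 1))[::-1].astype(int)
      print(recovered.tolist())                      # -> [7, 3, 2]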

  16. Intelligent Video Object Classification Scheme using Offline Feature Extraction and Machine Learning based Approach

    Directory of Open Access Journals (Sweden)

    Chandra Mani Sharma

    2012-01-01

    Classification of objects in video streams is important because of its applications in many emerging areas such as visual surveillance, content-based video retrieval and indexing. The task is challenging because video data is heavy and highly variable in nature and must be processed in real time. This paper presents a multiclass object classification technique using a machine learning approach. Haar-like features are used for training the classifier. The feature calculation is performed using the Integral Image representation, and the classifier is trained offline using Stage-wise Additive Modeling using a Multiclass Exponential loss function (SAMME). The validity of the method has been verified through the implementation of a real-time human-car detector. Experimental results show that the proposed method can accurately classify objects in video into their respective classes. The proposed object classifier works well in outdoor environments under moderate lighting conditions and variable scene backgrounds. The proposed technique is compared with other object classification techniques on various performance parameters.
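
    A minimal sketch of the integral-image mechanism the Haar-like features rely on: after one cumulative-sum pass, any rectangle sum costs four lookups, so a two-rectangle edge feature is O(1). The frame and feature placement are arbitrary stand-ins.

      import numpy as np

      def integral_image(img):
          return img.cumsum(axis=0).cumsum(axis=1)

      def rect_sum(ii, r0, c0, r1, c1):
          """Sum of img[r0:r1, c0:c1] from the integral image."""
          total = ii[r1 - 1, c1 - 1]
          if r0 > 0: total -= ii[r0 - 1, c1 - 1]
          if c0 > 0: total -= ii[r1 - 1, c0 - 1]
          if r0 > 0 and c0 > 0: total += ii[r0 - 1, c0 - 1]
          return total

      def haar_edge(ii, r, c, h, w):
          """Left-minus-right two-rectangle Haar-like feature at (r, c)."""
          return rect_sum(ii, r, c, r + h, c + w // 2) - \
                 rect_sum(ii, r, c + w // 2, r + h, c + w)

      frame = np.random.default_rng(6).uniform(size=(240, 320))
      ii = integral_image(frame)
      print(haar_edge(ii, 50, 60, 24, 24))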

  17. Parallel Implementation of Morphological Profile Based Spectral-Spatial Classification Scheme for Hyperspectral Imagery

    Science.gov (United States)

    Kumar, B.; Dikshit, O.

    2016-06-01

    The extended morphological profile (EMP) is a good technique for extracting spectral-spatial information from images, but the large size of hyperspectral images is an important concern when creating EMPs. With the availability of modern multi-core processors and commodity parallel processing systems like graphics processing units (GPUs) at the desktop level, parallel computing provides a viable option to significantly accelerate such computations. In this paper, a parallel implementation of an EMP based spectral-spatial classification method for hyperspectral imagery is presented. The parallel implementation is done both on a multi-core CPU and on a GPU. The impact of parallelization on speed-up and classification accuracy is analyzed. For the GPU, the implementation is done in compute unified device architecture (CUDA) C. The experiments are carried out on two well-known hyperspectral images. It is observed from the experimental results that the GPU implementation provides a speed-up of about 7 times, while the parallel implementation on the multi-core CPU results in a speed-up of about 3 times. It is also observed that parallelization has no adverse impact on classification accuracy.
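
    A minimal sketch of the structure of such a computation, assuming NumPy and SciPy: a morphological profile is built per band and the bands are processed in parallel with Python's multiprocessing. Plain openings and closings stand in for opening/closing by reconstruction, and this illustrates the parallelization pattern rather than the paper's CUDA kernels.

      import numpy as np
      from multiprocessing import Pool
      from scipy.ndimage import grey_opening, grey_closing

      def profile(band, sizes=(3, 5, 7)):
          """Stack of openings and closings of one image band."""
          layers = [grey_opening(band, size=s) for s in sizes]
          layers += [grey_closing(band, size=s) for s in sizes]
          return np.stack([band] + layers)

      if __name__ == "__main__":
          rng = np.random.default_rng(7)
          bands = [rng.uniform(size=(128, 128)) for _ in range(4)]  # e.g. first PCs
          with Pool(4) as pool:
              emp = np.concatenate(pool.map(profile, bands))        # extended profile
          print(emp.shape)   # (4 * 7, 128, 128): 7 layers per component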

  18. A new classification scheme for deep geothermal systems based on geologic controls

    Science.gov (United States)

    Moeck, I.

    2012-04-01

    A key element in the characterization, assessment and development of geothermal energy systems is resource classification. Throughout the past 30 years many classifications and definitions have been published, mainly based on temperature and thermodynamic properties. In past classification systems, temperature has been the essential measure of the quality of the resource, and geothermal systems have been divided into three temperature (or enthalpy) classes: low-temperature, moderate-temperature and high-temperature. There are, however, no uniform temperature ranges for these classes. It remains a key requirement of a geothermal classification that resource assessment provides logical and consistent frameworks simplified enough to communicate important aspects of geothermal energy potential to both non-experts and the general public. One possible solution may be to avoid classifying geothermal resources by temperature and simply state the range of temperatures at the individual site. Due to technological development, in particular in EGS (Enhanced Geothermal Systems or Engineered Geothermal Systems; both terms are used synonymously here) technology, more geothermal systems are potentially economic now than 30 years ago. An alternative possibility is to classify geothermal energy systems by their geologic setting. Understanding and characterizing the geologic controls on geothermal systems has been an ongoing focus on different scales, from plate tectonics to local tectonics/structural geology. In fact, the geologic setting has a fundamental influence on the potential temperature, the fluid composition, the reservoir characteristics, and whether the system is predominantly convective or conductive. The key element in this new classification for geothermal systems is the recognition that a geothermal system is part of a geological system. The structural geological and plate tectonic setting has a fundamental influence on

  19. DCT domain feature extraction scheme based on motor unit action potential of EMG signal for neuromuscular disease classification.

    Science.gov (United States)

    Doulah, Abul Barkat Mollah Sayeed Ud; Fattah, Shaikh Anowarul; Zhu, Wei-Ping; Ahmad, M Omair

    2014-01-01

    A feature extraction scheme based on the discrete cosine transform (DCT) of electromyography (EMG) signals is proposed for the classification of normal events and a neuromuscular disease, namely amyotrophic lateral sclerosis. Instead of employing the DCT directly on EMG data, it is applied to the motor unit action potentials (MUAPs) extracted from the EMG signal via a template matching-based decomposition technique. Unlike conventional MUAP-based methods, only one MUAP with maximum dynamic range is selected for DCT-based feature extraction. The magnitudes and frequencies of a few high-energy DCT coefficients corresponding to the selected MUAP are used as the desired features, which not only reduces the computational burden but also offers better feature quality, with high within-class compactness and between-class separation. For the purpose of classification, the K-nearest neighbour classifier is employed. Extensive analysis is performed on a clinical EMG database, and the proposed method is found to provide very satisfactory performance in terms of specificity, sensitivity and overall classification accuracy.
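
    A minimal sketch of the feature idea, assuming SciPy: take the DCT of the selected MUAP and keep the magnitudes and positions (frequencies) of a few highest-energy coefficients as the feature vector. The synthetic MUAP and the choice K = 5 are illustrative.

      import numpy as np
      from scipy.fft import dct

      rng = np.random.default_rng(8)
      t = np.arange(256)
      muap = np.exp(-((t - 128) / 12.0) ** 2) * np.sin(t / 4.0)  # synthetic MUAP
      muap += rng.normal(0, 0.02, t.size)

      coeffs = dct(muap, norm="ortho")
      K = 5
      idx = np.argsort(np.abs(coeffs))[-K:]                  # highest-energy coefficients
      feature = np.concatenate([np.abs(coeffs[idx]), idx])   # magnitudes + positions
      print(feature)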

  20. A classification scheme for risk assessment methods.

    Energy Technology Data Exchange (ETDEWEB)

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects: level of detail, and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. We present this two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. Each cell in the matrix represents a different arrangement of strengths and weaknesses; those arrangements shift gradually as one moves through the table, each cell optimal for a particular situation. The intention of this report is to enable informed use of the methods, so that the method chosen is optimal for the situation given. The matrix, with type names in the cells, is introduced in Table 2 on page 13 below. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method', though at times we use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows.

  1. A hierarchical classification scheme of psoriasis images

    DEFF Research Database (Denmark)

    Maletti, Gabriela Mariel; Ersbøll, Bjarne Kjær

    2003-01-01

    A two-stage hierarchical classification scheme of psoriasis lesion images is proposed. These images are basically composed of three classes: normal skin, lesion, and background. The scheme combines conventional tools to separate the skin from the background in the first stage, and the lesion from the normal skin in the second stage. These tools are the Expectation-Maximization algorithm, the quadratic discrimination function, and a classification window of optimal size. Extrapolation of the classification parameters of a given image to other images of the set is evaluated by means of Cohen's Kappa.

  2. A Classification Scheme for Glaciological AVA Responses

    Science.gov (United States)

    Booth, A.; Emir, E.

    2014-12-01

    A classification scheme is proposed for amplitude vs. angle (AVA) responses as an aid to the interpretation of seismic reflectivity in glaciological research campaigns. AVA responses are a powerful tool for characterising the material properties of glacier ice and its substrate. However, before interpreting AVA data, careful true-amplitude processing is required to constrain basal reflectivity and compensate for amplitude decay mechanisms, including anelastic attenuation and spherical divergence. These fundamental processing steps can be difficult to design in cases of noisy data, e.g. where a target reflection is contaminated by surface wave energy (in the case of shallow glaciers) or by energy reflected from out of the survey plane. AVA methods are equally powerful in estimating the fluid fill of potential hydrocarbon reservoirs. However, such applications seldom use true-amplitude data and instead consider qualitative AVA responses using a well-defined classification scheme. Such schemes are often defined in terms of the characteristics of best-fit responses to the observed reflectivity, e.g. the intercept (I) and gradient (G) of a linear approximation to the AVA data. The position of the response on a cross-plot of I and G then offers a diagnostic attribute for certain fluid types. We investigate the advantages of emulating this practice in glaciology, and develop a cross-plot based on the 3-term Shuey AVA approximation (using I, G, and a curvature term C). Model AVA curves define a clear lithification trend: AVA responses to stiff (lithified) substrates fall discretely into one quadrant of the cross-plot, with positive I and negative G, whereas those to fluid-rich substrates plot diagonally opposite (in the negative I and positive G quadrant). The remaining quadrants are unoccupied by plausible single-layer responses and may therefore be diagnostic of complex thin-layer reflectivity, and the magnitude and polarity of the C term serve as a further indicator
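
    The 3-term Shuey attributes can be estimated by least squares, as in the sketch below (NumPy): R(theta) = I + G*sin^2(theta) + C*(tan^2(theta) - sin^2(theta)) is fit to amplitude-vs-angle picks and the (I, G) quadrant read off. The synthetic response values and the quadrant labels are illustrative.

      import numpy as np

      theta = np.radians(np.arange(5, 41, 5))          # incidence angles
      I_true, G_true, C_true = 0.15, -0.25, 0.05       # stiff-substrate-like response
      s2, t2 = np.sin(theta) ** 2, np.tan(theta) ** 2
      R = I_true + G_true * s2 + C_true * (t2 - s2)
      R += np.random.default_rng(9).normal(0, 0.005, R.size)  # picking noise

      # least-squares fit of the three Shuey terms
      A = np.column_stack([np.ones_like(s2), s2, t2 - s2])
      (I, G, C), *_ = np.linalg.lstsq(A, R, rcond=None)
      quadrant = ("fluid-rich-like" if I < 0 and G > 0 else
                  "stiff-substrate-like" if I > 0 and G < 0 else "other")
      print(f"I={I:.3f}, G={G:.3f}, C={C:.3f} -> {quadrant}")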

  3. Environmental endocrine disruptors: A proposed classification scheme

    Energy Technology Data Exchange (ETDEWEB)

    Fur, P.L. de; Roberts, J. [Environmental Defense Fund, Washington, DC (United States)

    1995-12-31

    A number of chemicals known to act on animal systems through the endocrine system have been termed environmental endocrine disruptors. This group includes some of the PCBs and TCDDs, as well as lead, mercury and a large number of pesticides. The common feature is that these chemicals interact with endogenous endocrine systems at the cellular and/or molecular level to alter normal processes that are controlled or regulated by hormones. Although the existence of artificial or environmental estrogens (e.g. chlordecone and DES) has been known for some time, recent data indicate that this phenomenon is widespread. Indeed, anti-androgens have been held responsible for reproductive dysfunction in alligator populations in Florida, and the significance of endocrine disruption was recognized by pesticide manufacturers when insect growth regulators were developed to interfere with hormonal control of growth. Controlling, regulating or managing these chemicals depends in no small part on the ability to identify, screen or otherwise know that a chemical is an endocrine disruptor. Two possible classification schemes are: using the effects caused in an animal, or animals, as an exposure indicator; and using a known screen for the point of contact with the animal. The former would require extensive knowledge of cause and effect relationships in dozens of animal groups; the latter would require a screening tool comparable to an estrogen binding assay. The authors present a possible classification based on chemicals known to disrupt estrogenic, androgenic and ecdysone-regulated hormonal systems.

  4. A Physical Classification Scheme for Blazars

    CERN Document Server

    Landt, H; Perlman, E S; Giommi, P

    2004-01-01

    Blazars are currently separated into BL Lacertae objects (BL Lacs) and flat spectrum radio quasars (FSRQ) based on the strength of their emission lines. This is done rather arbitrarily by defining a diagonal line in the Ca H&K break value -- equivalent width plane, following Marcha et al. We readdress this problem and put the classification scheme for blazars on firm physical grounds. We study ~100 blazars and radio galaxies from the Deep X-ray Radio Blazar Survey (DXRBS) and 2 Jy radio survey and find a significant bimodality for the narrow emission line [OIII] 5007. This suggests the presence of two physically distinct classes of radio-loud AGN. We show that all radio-loud AGN, blazars and radio galaxies, can be effectively separated into weak- and strong-lined sources using the [OIII] 5007 -- [OII] 3727 equivalent width plane. This plane allows one to disentangle orientation effects from intrinsic variations in radio-loud AGN. Based on DXRBS, the strongly beamed sources of the new class of weak-lined r...

  5. Development and application of a new comprehensive image-based classification scheme for coastal and benthic environments along the southeast Florida continental shelf

    Science.gov (United States)

    Makowski, Christopher

    The coastal (terrestrial) and benthic environments along the southeast Florida continental shelf show a unique biophysical succession of marine features, from a highly urbanized, developed coastal region in the north (i.e. northern Miami-Dade County) to a protective marine sanctuary in the southeast (i.e. the Florida Keys National Marine Sanctuary). However, a standard bio-geomorphological classification scheme for this area of coastal and benthic environments has been lacking. The purpose of this study was to test whether new parameters integrating geomorphological components with dominant biological covers could be developed and applied across multiple remote sensing platforms as an innovative way to identify, interpret, and classify diverse coastal and benthic environments along the southeast Florida continental shelf. An ordered, manageable hierarchical classification scheme was developed incorporating the categories of Physiographic Realm, Morphodynamic Zone, Geoform, Landform, Dominant Surface Sediment, and Dominant Biological Cover. Six different remote sensing platforms (five multi-spectral satellite image sensors and one high-resolution aerial orthoimagery) were acquired, delineated according to the new classification scheme, and compared to determine optimal formats for classifying the study area. Cognitive digital classification at a nominal scale of 1:6000 proved to be more accurate than autoclassification programs and was therefore used to differentiate coastal marine environments based on spectral reflectance characteristics, such as color, tone, saturation, pattern, and texture of the seafloor. In addition, attribute tables were created in conjunction with the interpretations to quantify and compare the spatial relationships between classificatory units. IKONOS-2 satellite imagery was determined to be the optimal platform for applying the hierarchical classification scheme

  6. A “Salt and Pepper” Noise Reduction Scheme for Digital Images Based on Support Vector Machines Classification and Regression

    Directory of Open Access Journals (Sweden)

    Hilario Gómez-Moreno

    2014-01-01

    We present a new impulse noise removal technique based on Support Vector Machines (SVM). Both classification and regression are used to reduce the “salt and pepper” noise found in digital images: classification enables identification of noisy pixels, while regression provides a means to determine reconstruction values. The training vectors necessary for the SVM were generated synthetically in order to maintain control over quality and complexity. A modified median filter based on a prior noise detection stage and a regression-based filter are presented and compared to other well-known state-of-the-art noise reduction algorithms. The results show that the proposed filters achieve good results, outperforming other state-of-the-art algorithms for low and medium noise ratios and remaining comparable for very highly corrupted images.
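
    A minimal sketch of the two-stage idea, assuming scikit-learn: an SVM classifier flags impulse pixels from their 3x3 neighbourhoods (trained on synthetically generated windows, as in the paper's approach), and only flagged pixels are replaced, here by the local median rather than the paper's SVR regressor.

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(11)

      def windows(img):
          """All 3x3 neighbourhoods as a (H, W, 9) array (edge-padded)."""
          p = np.pad(img, 1, mode="edge")
          return np.stack([p[r:r + img.shape[0], c:c + img.shape[1]]
                           for r in range(3) for c in range(3)], axis=-1)

      # synthetic training set: clean windows vs. windows whose centre is an impulse
      clean = rng.uniform(0.1, 0.9, (2000, 9))
      noisy = clean.copy()
      noisy[:, 4] = rng.choice([0.0, 1.0], 2000)        # impulse at the centre pixel
      X = np.vstack([clean, noisy])
      y = np.r_[np.zeros(2000), np.ones(2000)]
      detector = SVC(kernel="rbf").fit(X, y)

      img = rng.uniform(0.1, 0.9, (64, 64))
      mask = rng.random(img.shape) < 0.1                # 10% salt & pepper
      img[mask] = rng.choice([0.0, 1.0], mask.sum())
      W = windows(img).reshape(-1, 9)
      flagged = detector.predict(W).reshape(img.shape).astype(bool)
      restored = img.copy()
      restored[flagged] = np.median(W.reshape(*img.shape, 9), axis=-1)[flagged]
      print("flagged pixels:", flagged.sum(), "of", mask.sum(), "corrupted")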

  7. An Efficient Machine Learning Based Classification Scheme for Detecting Distributed Command & Control Traffic of P2P Botnets

    Directory of Open Access Journals (Sweden)

    Pijush Barthakur

    2013-11-01

    The biggest Internet security threat is the rise of botnets with modular and flexible structures. The combined power of thousands of remotely controlled computers increases the speed and severity of attacks. In this paper, we provide a comparative analysis of machine-learning based classification of botnet command & control (C&C) traffic for proactive detection of Peer-to-Peer (P2P) botnets. We combine selected botnet C&C traffic flow features with carefully selected botnet behavioral characteristic features for better classification using machine learning algorithms. Our simulation results show that our method is very effective, achieving very good test accuracy with very little training time. We compare the performances of Decision Tree (C4.5), Bayesian Network and Linear Support Vector Machines using performance metrics like accuracy, sensitivity, positive predictive value (PPV) and F-measure. We also provide a comparative analysis of our predictive models using AUC (area under the ROC curve). Finally, we propose a rule induction algorithm derived from Quinlan's original C4.5 algorithm, which produces better accuracy than the original decision tree classifier.
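
    A minimal sketch of this kind of comparison, assuming scikit-learn: a decision tree is trained on labelled flow feature vectors and evaluated with accuracy, sensitivity, PPV, F-measure and AUC. The synthetic features stand in for the paper's C&C traffic flows.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.metrics import (accuracy_score, recall_score,
                                   precision_score, f1_score, roc_auc_score)

      X, y = make_classification(n_samples=1000, n_features=12, weights=[0.7],
                                 random_state=0)        # 1 = botnet C&C flow
      tree = DecisionTreeClassifier(random_state=0).fit(X[:700], y[:700])
      pred = tree.predict(X[700:])
      prob = tree.predict_proba(X[700:])[:, 1]
      print(f"accuracy={accuracy_score(y[700:], pred):.2f} "
            f"sensitivity={recall_score(y[700:], pred):.2f} "
            f"PPV={precision_score(y[700:], pred):.2f} "
            f"F={f1_score(y[700:], pred):.2f} "
            f"AUC={roc_auc_score(y[700:], prob):.2f}")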

  8. State of the Art in the Cramer Classification Scheme and ...

    Science.gov (United States)

    Slide presentation at the SOT FDA Colloquium on State of the Art in the Cramer Classification Scheme and Threshold of Toxicological Concern in College Park, MD.

  9. An improved fault detection classification and location scheme based on wavelet transform and artificial neural network for six phase transmission line using single end data only.

    Science.gov (United States)

    Koley, Ebha; Verma, Khushaboo; Ghosh, Subhojit

    2015-01-01

    Restrictions on right of way and increasing power demand have boosted the development of six-phase transmission, which offers a viable alternative for transmitting more power without major modification of the existing structure of three-phase double-circuit transmission systems. In spite of these advantages, the low acceptance of six-phase systems is attributed to the unavailability of a proper protection scheme: the complexity arising from the large number of possible faults in six-phase lines makes protection quite challenging. The proposed work presents a hybrid wavelet transform and modular artificial neural network based fault detector, classifier and locator for six-phase lines using single-end data only. The standard deviations of the approximate coefficients of voltage and current signals, obtained using the discrete wavelet transform, are applied as input to the modular artificial neural network for fault classification and location. The proposed scheme has been tested for all 120 types of shunt faults, with variation in location, fault resistance, and fault inception angle. The variation in power system parameters, viz. short circuit capacity of the source and its X/R ratio, voltage, frequency and CT saturation, has also been investigated. The results confirm the effectiveness and reliability of the proposed protection scheme, which makes it ideal for real-time implementation.
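
    A minimal sketch of the feature step, assuming PyWavelets and scikit-learn: each measured signal is decomposed with a DWT and the standard deviation of its approximation coefficients becomes one network input; a tiny MLP stands in for the paper's modular ANN, and the signals and labels are synthetic.

      import numpy as np
      import pywt
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(10)

      def features(signals, wavelet="db4", level=3):
          """Std of the approximation coefficients, one value per signal."""
          return [np.std(pywt.wavedec(s, wavelet, level=level)[0]) for s in signals]

      # synthetic dataset: 12 signals per case (6 phase voltages + 6 currents)
      X = np.array([features(rng.normal(0, 1 + lbl, (12, 512)))
                    for lbl in (0, 1) for _ in range(40)])
      y = np.array([lbl for lbl in (0, 1) for _ in range(40)])   # 0 = no fault

      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
      clf.fit(X[::2], y[::2])
      print("holdout accuracy:", clf.score(X[1::2], y[1::2]))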

  10. Sound classification of dwellings - Comparison of schemes in Europe

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2009-01-01

    National sound classification schemes for dwellings exist in nine countries in Europe, and proposals are under preparation in more countries. The schemes specify class criteria concerning several acoustic aspects, the main criteria being airborne and impact sound insulation between dwellings, facade sound insulation, and installation noise. The quality classes reflect different levels of acoustical comfort. The paper presents and compares the sound classification schemes in Europe. The schemes have been implemented and revised gradually since the 1990s. However, due to a lack of coordination, the schemes differ considerably between countries. The current variety of descriptors and classes also causes trade barriers. Thus, there is a need to harmonize concepts and other characteristics of the schemes.

  11. Comparing document classification schemes using k-means clustering

    OpenAIRE

    Šivić, Artur; Žmak, Lovro; Dalbelo Bašić, Bojana; Moens, Marie-Francine

    2008-01-01

    In this work, we jointly apply several text mining methods to a corpus of legal documents in order to compare the separation quality of two inherently different document classification schemes. The classification schemes are compared with the clusters produced by the k-means algorithm. In the future, we believe that our comparison method will be coupled with semi-supervised and active learning techniques. Also, this paper presents the idea of combining k-means and Principal Component Analysis...

  12. A Classification Scheme for Phenomenological Universalities in Growth Problems

    CERN Document Server

    Castorina, P; Guiot, C

    2006-01-01

    A classification into universality classes of broad categories of phenomenologies belonging to different disciplines may be very useful for cross-fertilization among them and for the purpose of pattern recognition. We present here a simple scheme for the classification of nonlinear growth problems. The success of the scheme in predicting and characterizing the well-known Gompertz, West and logistic models suggests the study of a hitherto unexplored class of nonlinear growth problems.

  13. International proposal for an acoustic classification scheme for dwellings

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2014-01-01

    European countries have introduced classification schemes. The schemes typically include four classes. Comparative studies have shown significant discrepancies between countries due to the national development of the schemes. The diversity is an obstacle to the exchange of construction experience for the different classes, and also implies trade barriers. Thus, a harmonized classification scheme would be useful, and the European COST Action TU0901 "Integrating and Harmonizing Sound Insulation Aspects in Sustainable Urban Housing Constructions", running 2009-2013 with members from 32 countries, including three overseas countries, had as one of its main objectives the preparation of a proposal for a harmonized acoustic classification scheme. The proposal developed has been approved as an ISO/TC43/SC2 work item, and a working group has been established. This paper describes the proposal, the background and the perspectives.

  14. Four classification schemes of adult motivation: current views and measures.

    Science.gov (United States)

    Barbuto, John E

    2006-04-01

    Classification of perspectives on motivation and recommendations for measurement are provided. Motivation is classified into four broad categories: content theories, process theories, decision-making theories, and sustained-effort theories--drawing from different theories and measures. Recommendations on measurement are developed for each classification scheme of motivation.

  15. Towards a Collaborative Intelligent Tutoring System Classification Scheme

    Science.gov (United States)

    Harsley, Rachel

    2014-01-01

    This paper presents a novel classification scheme for Collaborative Intelligent Tutoring Systems (CITS), an emergent research field. The three emergent classifications of CITS are unstructured, semi-structured, and fully structured. While all three types of CITS offer opportunities to improve student learning gains, the full extent to which these…

  16. A Unified Near Infrared Spectral Classification Scheme for T Dwarfs

    CERN Document Server

    Burgasser, A J; Leggett, S K; Kirkpatrick, J D; Golimowski, D A; Burgasser, Adam J.; Golimowski, David A.

    2006-01-01

    A revised near infrared classification scheme for T dwarfs is presented, based on and superseding prior schemes developed by Burgasser et al. and Geballe et al., and defined following the precepts of the MK Process. Drawing from two large spectroscopic libraries of T dwarfs identified largely in the Sloan Digital Sky Survey and the Two Micron All Sky Survey, nine primary spectral standards and five alternate standards spanning spectral types T0 to T8 are identified that match criteria of spectral character, brightness, absence of a resolved companion and accessibility from both northern and southern hemispheres. The classification of T dwarfs is formally made by the direct comparison of near infrared spectral data of equivalent resolution to the spectra of these standards. Alternately, we have redefined five key spectral indices measuring the strengths of the major H$_2$O and CH$_4$ bands in the 1-2.5 micron region that may be used as a proxy to direct spectral comparison. Two methods of determining T spectra...

  17. Classification and prioritization of usability problems using an augmented classification scheme.

    Science.gov (United States)

    Khajouei, R; Peute, L W P; Hasman, A; Jaspers, M W M

    2011-12-01

    Various methods exist for conducting usability evaluation studies in health care. But although the methodology is clear, no usability evaluation method provides a framework by which the usability reporting activities are fully standardized. Despite the frequent use of forms to report the usability problems and their context-information, this reporting is often hindered by information losses. This is due to the fact that evaluators' problem descriptions are based on individual judgments of what they find salient about a usability problem at a certain moment in time. Moreover, usability problems are typically classified in terms of their type, number, and severity. These classes are usually devised by the evaluator for the purpose at hand and the used problem types often are not mutually exclusive, complete and distinct. Also the impact of usability problems on the task outcome is usually not taken into account. Consequently, problem descriptions are often vague and even when combined with their classification in type or severity leave room for multiple interpretations when discussed with system designers afterwards. Correct interpretation of these problem descriptions is then highly dependent upon the extent to which the evaluators can retrieve relevant details from memory. To remedy this situation a framework is needed guiding usability evaluators in high quality reporting and unique classification of usability problems. Such a framework should allow the disclosure of the underlying essence of problem causes, the severity rating and the classification of the impact of usability problems on the task outcome. The User Action Framework (UAF) is an existing validated classification framework that allows the unique classification of usability problems, but it does not include a severity rating nor does it contain an assessment of the potential impact of usability flaws on the final task outcomes. We therefore augmented the UAF with a severity rating based on Nielsen

  18. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases...

  19. Transporter taxonomy - a comparison of different transport protein classification schemes.

    Science.gov (United States)

    Viereck, Michael; Gaulton, Anna; Digles, Daniela; Ecker, Gerhard F

    2014-06-01

    Currently, there are more than 800 well characterized human membrane transport proteins (including channels and transporters) and there are estimates that about 10% (approx. 2000) of all human genes are related to transport. Membrane transport proteins are of interest as potential drug targets, for drug delivery, and as a cause of side effects and drug–drug interactions. In light of the development of Open PHACTS, which provides an open pharmacological space, we analyzed selected membrane transport protein classification schemes (Transporter Classification Database, ChEMBL, IUPHAR/BPS Guide to Pharmacology, and Gene Ontology) for their ability to serve as a basis for pharmacology driven protein classification. A comparison of these membrane transport protein classification schemes by using a set of clinically relevant transporters as use-case reveals the strengths and weaknesses of the different taxonomy approaches.

  20. The interplay of descriptor-based computational analysis with pharmacophore modeling builds the basis for a novel classification scheme for feruloyl esterases

    DEFF Research Database (Denmark)

    Udatha, D.B.R.K. Gupta; Kouskoumvekaki, Irene; Olsson, Lisbeth

    2011-01-01

    Previous classification studies on FAEs were restricted to sequence similarity and substrate specificity on just four model substrates, and considered only a handful of FAEs belonging to the fungal kingdom. This study centers on the descriptor-based classification and structural analysis of experimentally verified FAEs, based on amino acid composition and physico-chemical composition descriptors derived from the respective amino acid sequences. A Support Vector Machine model was subsequently constructed for the classification of new FAEs into the pre-assigned clusters. The model successfully recognized 98.2% of the training sequences and all the sequences of the blind test. The underlying functionality of the 12 proposed FAE families was validated against a combination of prediction tools and published experimental data. Another important aspect of the present work involves the development of pharmacophore models for the new

  1. Brownian-motion ensembles of random matrix theory: A classification scheme and an integral transform method

    Energy Technology Data Exchange (ETDEWEB)

    Macedo-Junior, A.F. [Departamento de Fisica, Laboratorio de Fisica Teorica e Computacional, Universidade Federal de Pernambuco, 50670-901 Recife, PE (Brazil)]. E-mail: ailton@df.ufpe.br; Macedo, A.M.S. [Departamento de Fisica, Laboratorio de Fisica Teorica e Computacional, Universidade Federal de Pernambuco, 50670-901 Recife, PE (Brazil)

    2006-09-25

    We study a class of Brownian-motion ensembles obtained from the general theory of Markovian stochastic processes in random-matrix theory. The ensembles admit a complete classification scheme based on a recent multivariable generalization of classical orthogonal polynomials and are closely related to Hamiltonians of Calogero-Sutherland-type quantum systems. An integral transform is proposed to evaluate the n-point correlation function for a large class of initial distribution functions. Applications of the classification scheme and of the integral transform to concrete physical systems are presented in detail.

  2. The Nutraceutical Bioavailability Classification Scheme: Classifying Nutraceuticals According to Factors Limiting their Oral Bioavailability.

    Science.gov (United States)

    McClements, David Julian; Li, Fang; Xiao, Hang

    2015-01-01

    The oral bioavailability of a health-promoting dietary component (nutraceutical) may be limited by various physicochemical and physiological phenomena: liberation from food matrices, solubility in gastrointestinal fluids, interaction with gastrointestinal components, chemical degradation or metabolism, and epithelium cell permeability. Nutraceutical bioavailability can therefore be improved by designing food matrices that control their bioaccessibility (B*), absorption (A*), and transformation (T*) within the gastrointestinal tract (GIT). This article reviews the major factors influencing the gastrointestinal fate of nutraceuticals, and then uses this information to develop a new scheme to classify the major factors limiting nutraceutical bioavailability: the nutraceutical bioavailability classification scheme (NuBACS). This new scheme is analogous to the biopharmaceutical classification scheme (BCS) used by the pharmaceutical industry to classify drug bioavailability, but it contains additional factors important for understanding nutraceutical bioavailability in foods. The article also highlights potential strategies for increasing the oral bioavailability of nutraceuticals based on their NuBACS designation (B*A*T*).
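
    A toy sketch of a NuBACS-style designation: each of bioaccessibility (B*), absorption (A*) and transformation (T*) is marked as limiting or not, yielding a compact label. The cutoff and the example values are hypothetical.

      def nubacs(b, a, t, cutoff=0.5):
          """Flag each gastrointestinal-fate factor as favorable (+) or limiting (-)."""
          flags = [(name, value >= cutoff) for name, value in
                   (("B*", b), ("A*", a), ("T*", t))]
          return "".join(f"{n}(+)" if ok else f"{n}(-)" for n, ok in flags)

      # hypothetical profile: poorly bioaccessible, well absorbed, heavily transformed
      print(nubacs(b=0.2, a=0.8, t=0.3))   # -> "B*(-)A*(+)T*(-)"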

  3. Pitch Based Sound Classification

    DEFF Research Database (Denmark)

    Nielsen, Andreas Brinch; Hansen, Lars Kai; Kjems, U

    2006-01-01

    A sound classification model is presented that can classify signals into music, noise and speech. The model extracts the pitch of the signal using the harmonic product spectrum. Based on the pitch estimate and a pitch error measure, features are created and used in a probabilistic model with a soft-max output function. Both linear and quadratic inputs are used. The model is trained on 2 hours of sound and tested on publicly available data. A test classification error below 0.05 with 1 s classification windows is achieved. Furthermore, it is shown that linear input performs as well as quadratic, and that even though classification gets marginally better, not much is achieved by increasing the window size beyond 1 s.
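
    A minimal sketch of the harmonic product spectrum (HPS) pitch estimator the model builds on, assuming NumPy: the magnitude spectrum is downsampled by integer factors and multiplied, so energy accumulates at the fundamental. The test tone and parameters are illustrative.

      import numpy as np

      def hps_pitch(signal, fs, n_harmonics=4):
          spec = np.abs(np.fft.rfft(signal))
          hps = spec.copy()
          for h in range(2, n_harmonics + 1):
              # multiply by the spectrum downsampled by factor h
              hps[: len(spec) // h] *= spec[::h][: len(spec) // h]
          peak = np.argmax(hps[1:]) + 1                 # skip the DC bin
          return peak * fs / len(signal)

      fs = 16000
      t = np.arange(fs) / fs                            # 1 s analysis window
      tone = sum(np.sin(2 * np.pi * 220 * k * t) / k for k in range(1, 5))
      print(f"estimated pitch: {hps_pitch(tone, fs):.1f} Hz")  # ~220 Hz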

  4. Adaptive codebook selection schemes for image classification in correlated channels

    Science.gov (United States)

    Hu, Chia Chang; Liu, Xiang Lian; Liu, Kuan-Fu

    2015-09-01

    The multiple-input multiple-output (MIMO) system with transmit and receive antenna arrays achieves diversity and array gains via transmit beamforming. In the absence of full channel state information (CSI) at the transmitter, the transmit beamforming vector can be quantized at the receiver and sent back to the transmitter over a low-rate feedback channel, called limited feedback beamforming. One of the key problems in Vector Quantization (VQ) is how to generate a good codebook such that the distortion between the original image and the reconstructed image is minimized. In this paper, a novel adaptive codebook selection scheme for image classification is proposed that takes both the spatial and the temporal correlation inherent in the channel into consideration. A new codebook selection algorithm is developed that selects two codebooks from among the discrete Fourier transform (DFT) codebook, the generalized Lloyd algorithm (GLA) codebook and the Grassmannian codebook, to be combined and used as candidates for the original image and the reconstructed image in image transmission. The channel is estimated and divided into four regions based on the spatial and temporal correlation of the channel, and an appropriate codebook is adaptively assigned to each region. The proposed method can efficiently reduce the required feedback information under spatially and temporally correlated channels. Simulation results show that, in the case of temporally and spatially correlated channels, the bit-error-rate (BER) performance can be improved substantially by the proposed algorithm compared to using a single codebook.
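
    A toy sketch of the selection logic as described: estimated spatial and temporal correlations are quantized into four regions and each region is mapped to a codebook. The 0.5 boundaries and the particular region-to-codebook assignment are assumptions for illustration.

      def select_codebook(rho_spatial, rho_temporal):
          """Map estimated channel correlations to one of four regions."""
          region = (rho_spatial >= 0.5, rho_temporal >= 0.5)
          return {(False, False): "Grassmannian",
                  (False, True):  "GLA",
                  (True,  False): "GLA",
                  (True,  True):  "DFT"}[region]

      print(select_codebook(0.8, 0.9))   # highly correlated channel -> "DFT"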

  5. CANDELS Visual Classifications: Scheme, Data Release, and First Results

    CERN Document Server

    Kartaltepe, Jeyhan S; Kocevski, Dale; McIntosh, Daniel H; Lotz, Jennifer; Bell, Eric F; Faber, Sandra; Ferguson, Henry; Koo, David; Bassett, Robert; Bernyk, Maksym; Blancato, Kirsten; Bournaud, Frederic; Cassata, Paolo; Castellano, Marco; Cheung, Edmond; Conselice, Christopher J; Croton, Darren; Dahlen, Tomas; de Mello, Duilia F; DeGroot, Laura; Donley, Jennifer; Guedes, Javiera; Grogin, Norman; Hathi, Nimish; Hilton, Matt; Hollon, Brett; Inami, Hanae; Kassin, Susan; Koekemoer, Anton; Lani, Caterina; Liu, Nick; Lucas, Ray A; Martig, Marie; McGrath, Elizabeth; McPartland, Conor; Mobasher, Bahram; Morlock, Alice; Mutch, Simon; O'Leary, Erin; Peth, Mike; Pforr, Janine; Pillepich, Annalisa; Poole, Gregory B; Rizer, Zachary; Rosario, David; Soto, Emmaris; Straughn, Amber; Telford, Olivia; Sunnquist, Ben; Weiner, Benjamin; Wuyts, Stijn

    2014-01-01

    We have undertaken an ambitious program to visually classify all galaxies in the five CANDELS fields down to H<24.5 involving the dedicated efforts of 65 individual classifiers. Once completed, we expect to have detailed morphological classifications for over 50,000 galaxies up to z<4 over all the fields. Here, we present our detailed visual classification scheme, which was designed to cover a wide range of CANDELS science goals. This scheme includes the basic Hubble sequence types, but also includes a detailed look at mergers and interactions, the clumpiness of galaxies, k-corrections, and a variety of other structural properties. In this paper, we focus on the first field to be completed -- GOODS-S. The wide area coverage spanning the full field includes 7634 galaxies that have been classified by at least three different people. In the deep area of the field, 2534 galaxies have been classified by at least five different people at three different depths. With this paper, we release to the public all ...

  6. Minimally-sized balanced decomposition schemes for multi-class classification

    NARCIS (Netherlands)

    Smirnov, E.N.; Moed, M.; Nalbantov, G.I.; Sprinkhuizen-Kuyper, I.G.

    2011-01-01

    Error-Correcting Output Coding (ECOC) is a well-known class of decomposition schemes for multi-class classification. It allows representing any multi-class classification problem as a set of binary classification problems. Due to code redundancy, ECOC schemes can significantly improve generalization performance.
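
    A quick way to experiment with ECOC decomposition is scikit-learn's OutputCodeClassifier, sketched below; it draws random codes rather than the minimally-sized balanced codes the paper studies, and the dataset and code size are placeholders.

```python
# ECOC: represent a multi-class problem as a set of binary problems, one per
# code bit, then decode predictions against each class's codeword.
from sklearn.datasets import load_iris
from sklearn.multiclass import OutputCodeClassifier
from sklearn.svm import LinearSVC

X, y = load_iris(return_X_y=True)
# code_size controls redundancy: number of binary problems per class
ecoc = OutputCodeClassifier(LinearSVC(), code_size=2.0, random_state=0)
ecoc.fit(X, y)
print(ecoc.score(X, y))
```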

  7. Provable Secure Identity Based Generalized Signcryption Scheme

    CERN Document Server

    Yu, Gang; Shen, Yong; Han, Wenbao

    2010-01-01

    According to actual needs, a generalized signcryption scheme can flexibly work as an encryption scheme, a signature scheme or a signcryption scheme. In this paper, firstly, we give a security model for identity-based generalized signcryption which is more complete than the existing model. Secondly, we propose an identity-based generalized signcryption scheme. Thirdly, we give the security proof of the new scheme in this complete model. Compared with existing identity-based generalized signcryption schemes, the new scheme has lower implementation complexity. Moreover, the new scheme has computation complexity comparable to that of existing normal signcryption schemes.

  8. Provable Secure Identity Based Generalized Signcryption Scheme

    OpenAIRE

    Yu, Gang; Ma, Xiaoxiao; Shen, Yong; Han, Wenbao

    2010-01-01

    According to actual needs, a generalized signcryption scheme can flexibly work as an encryption scheme, a signature scheme or a signcryption scheme. In this paper, firstly, we give a security model for identity-based generalized signcryption which is more complete than the existing model. Secondly, we propose an identity-based generalized signcryption scheme. Thirdly, we give the security proof of the new scheme in this complete model. Compared with existing identity-based generalized signcryption...

  9. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Science.gov (United States)

    2010-01-01

    15 CFR 921.3 (Commerce and Foreign Trade, 2010): National Estuarine Research Reserve System biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are...

  10. An evolutionary scheme for morphological classification of Martian gullies

    Science.gov (United States)

    Aston, A. H.; Balme, M.

    2009-04-01

    Martian gullies are geologically recent small-scale features characterised by an alcove-channel-apron morphology associated on Earth with liquid water. Since their discovery by Malin and Edgett (1), several theories have been advanced to explain their formation. These typically emphasise either groundwater processes (1, 2) or melting of ground ice or snowpack (3). The former approach has been challenged on the basis of gullies observed on hills and central peaks, where aquifers are unlikely (4). Studies of gullied walls have been undertaken (5), but though morphological classifications of gullies have been proposed (1), they are largely descriptive. This study proposes an evolutionary classification scheme and a pilot study to determine its potential to address controversies in gully formation. A morphological classification of gullies was developed, and four types identified: Type I: V-shaped gullies in slope mantling material or scree (i.e. not cutting bedrock); no distinct alcoves. Type II: Alcoves capped by a distinct and continuous stratum of rock. Type III: Alcoves extending vertically upslope, without reaching top of slope. Type IV: Alcoves reaching top of slope and cutting back into cliff. The types form an evolutionary sequence: in particular, the sequence II-III-IV appears to represent the development of many Martian gullies. Moreover, we have found that average length increases from Type I to Type IV. Furthermore, the presence of small gullies (mostly I and II) in the mantling deposits filling larger alcoves suggests multiple stages of gully activity. To test the classifications in practice, a sample of gullied slope sections imaged by MOC (Mars Orbiter Camera) on Mars Global Surveyor at a resolution of 1-7 m/pixel was catalogued using ArcGIS software. 210 slope sections were covered, representing 1734 gullies across the southern mid-latitudes. Broad geographical coverage was obtained by working through MOC image numbers. For each slope section, the

  11. Fair Electronic Payment Scheme Based on DSA

    Institute of Scientific and Technical Information of China (English)

    WANG Shao-bin; HONG Fan; ZHU Xian

    2005-01-01

    We present a multi-signature scheme based on DSA and describe a fair electronic payment scheme based on improved DSA signatures. The scheme places both sides of an electronic transaction in equal positions. A Trusted Third Party (TTP) is involved in the scheme to guarantee fairness for both sides. However, TTP is needed only during registration and dispute resolution; it is not needed during the normal payment stage.

  12. Texture Classification based on Gabor Wavelet

    Directory of Open Access Journals (Sweden)

    Amandeep Kaur

    2012-07-01

    This paper presents a comparison of texture classification algorithms based on Gabor wavelets. The focus of this paper is on the feature extraction scheme for texture classification. The texture features of an image can be classified using texture descriptors. In this paper we have used the homogeneous texture descriptor, which is based on Gabor wavelets. For texture classification, we have used an online texture database, Brodatz's database, and three well-known classifiers: support vector machine, k-nearest neighbour and decision tree induction. The results show that classification using support vector machines gives better results compared to the other classifiers. It can accurately discriminate between testing image data and training data.
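
    A compact sketch of the Gabor-feature-plus-SVM pipeline using scikit-image and scikit-learn; the filter-bank frequencies and orientations, the mean/std statistics, and the synthetic stand-in patches are assumptions, not the paper's exact homogeneous texture descriptor or the Brodatz data.

```python
# Gabor filter bank -> magnitude statistics per (frequency, orientation) ->
# feature vector -> SVM classifier.
import numpy as np
from skimage.filters import gabor
from sklearn.svm import SVC

def gabor_features(img, freqs=(0.1, 0.2, 0.3), n_orient=4):
    feats = []
    for f in freqs:
        for theta in np.arange(n_orient) * np.pi / n_orient:
            real, imag = gabor(img, frequency=f, theta=theta)
            mag = np.hypot(real, imag)         # response magnitude
            feats += [mag.mean(), mag.std()]
    return np.array(feats)

rng = np.random.default_rng(0)
patches = [rng.random((32, 32)) for _ in range(20)]   # stand-in textures
labels = [i % 2 for i in range(20)]                   # two fake classes
X = np.stack([gabor_features(p) for p in patches])
clf = SVC(kernel='rbf').fit(X, labels)
print(clf.score(X, labels))
```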

  13. An Evaluation of Different Training Sample Allocation Schemes for Discrete and Continuous Land Cover Classification Using Decision Tree-Based Algorithms

    Directory of Open Access Journals (Sweden)

    René Roland Colditz

    2015-07-01

    Land cover mapping for large regions often employs satellite images of medium to coarse spatial resolution, which complicates mapping of discrete classes. Class memberships, which estimate the proportion of each class for every pixel, have been suggested as an alternative. This paper compares different strategies of training data allocation for discrete and continuous land cover mapping using classification and regression tree algorithms. In addition to measures of discrete and continuous map accuracy, the correct estimation of the area is another important criterion. A subset of the 30 m national land cover dataset of 2006 (NLCD2006) of the United States was used as the reference set to classify NADIR BRDF-adjusted surface reflectance time series of MODIS at 900 m spatial resolution. Results show that sampling of heterogeneous pixels and sample allocation according to the expected area of each class is best for classification trees. Regression trees for continuous land cover mapping should be trained with random allocation, and predictions should be normalized with a linear scaling function to correctly estimate the total area. Among the tested algorithms, random forest classification yields lower errors than boosted trees of C5.0, and Cubist shows higher accuracy than random forest regression.

  14. Inventory classification based on decoupling points

    Directory of Open Access Journals (Sweden)

    Joakim Wikner

    2015-01-01

    The ideal state of continuous one-piece flow may never be achieved. Still, the logistics manager can improve the flow by carefully positioning inventory to buffer against variations. Strategies such as lean, postponement, mass customization, and outsourcing all rely on strategic positioning of decoupling points to separate forecast-driven from customer-order-driven flows. Planning and scheduling of the flow are also based on classification of decoupling points as master scheduled or not. A comprehensive classification scheme for these types of decoupling points is introduced. The approach rests on identification of flows as being either demand based or supply based. The demand or supply is then combined with exogenous factors, classified as independent, or endogenous factors, classified as dependent. As a result, eight types of strategic as well as tactical decoupling points are identified, resulting in a process-based framework for inventory classification that can be used for flow design.

  15. Secure Identity-Authentication Scheme Based on Minutiae Classification

    Institute of Scientific and Technical Information of China (English)

    黄家斌; 曹珍富

    2013-01-01

    Because multi-biometric identification achieves higher accuracy and stability, the U.S. Department of Justice launched the Next Generation Identification (NGI) project, which adds fingerprint pore information to aid identification. However, leaked fingerprint pore information can be used to reconstruct the fingerprint ridges. This paper presents a scheme that protects private information such as fingerprint pores. The scheme uses a high-resolution fingerprint scanner to capture fingerprint images containing pores, pre-aligns fingerprints using minutiae, and binds the key using the pores. Existing fuzzy vault schemes assume that fingerprints are pre-aligned or use alignment methods of limited accuracy; this paper is the first to combine fingerprint Level 3 pore information with the fuzzy vault scheme.

  16. Acoustic classification schemes in Europe – Applicability for new, existing and renovated housing

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2016-01-01

    The first acoustic classification schemes for dwellings were published in the 1990s as national standards, with the main purpose of making it easy to specify stricter acoustic criteria for new-build housing than the minimum requirements found in building regulations. Since then, more countries have introduced acoustic classification schemes, the first countries have updated theirs several times, and some countries have introduced acoustic classification for other building categories as well. However, the classification schemes continued to focus on new buildings and have in general limited applicability to existing and renovated housing. The status of the international scheme for classification of dwellings under development in ISO/TC43/SC2 will be explained. One of several key characteristics of the proposal is a wide range of classes, implying applicability to a major part of the existing housing stock in Europe, thus enabling acoustic labelling like energy labelling.

  17. A novel adaptive classification scheme for digital modulations in satellite communication

    Institute of Scientific and Technical Information of China (English)

    Wu Dan; Gu Xuemai; Guo Qing

    2007-01-01

    To make the modulation classification system more suitable for signals over a wide range of signal-to-noise ratios (SNRs), a novel adaptive modulation classification scheme is presented in this paper. Different from traditional schemes, the proposed scheme employs a new SNR estimation algorithm for small samples before modulation classification, which makes the modulation classifier work adaptively according to the estimated SNRs. Furthermore, it uses three efficient features and support vector machines (SVM) for modulation classification. Computer simulation shows that the scheme can adaptively classify ten digital modulation types (i.e. 2ASK, 4ASK, 2FSK, 4FSK, 2PSK, 4PSK, 16QAM, TFM, π/4QPSK and OQPSK) at SNRs ranging from 0 dB to 25 dB, and success rates are over 95% when the SNR is not lower than 3 dB. Accuracy, efficiency and simplicity of the proposed scheme are clearly improved, making it better suited to engineering applications.

  18. The four-populations model: a new classification scheme for pre-planetesimal collisions

    CERN Document Server

    Geretshauser, Ralf J; Speith, Roland; Kley, Wilhelm

    2011-01-01

    Within the collision growth scenario for planetesimal formation, the growth step from centimetre-sized pre-planetesimals to kilometre-sized planetesimals is still unclear. The formation of larger objects from the highly porous pre-planetesimals may be halted by a combination of fragmentation in disruptive collisions and mutual rebound with compaction. However, the right amount of fragmentation is necessary to explain the observed dust features in late T Tauri discs. Therefore, detailed data on the outcome of pre-planetesimal collisions is required and has to be presented in a suitable and precise format. We propose and apply a new classification scheme for pre-planetesimal collisions based on the quantitative aspects of four fragment populations: the largest and second largest fragment, a power-law population, and a sub-resolution population. For the simulations of pre-planetesimal collisions, we adopt the SPH numerical scheme with extensions for the simulation of porous solid bodies. By means of laboratory b...

  19. A two-tier atmospheric circulation classification scheme for the European-North Atlantic region

    Science.gov (United States)

    Guentchev, Galina S.; Winkler, Julie A.

    A two-tier classification of large-scale atmospheric circulation was developed for the European-North-Atlantic domain. The classification was constructed using a combination of principal components and k-means cluster analysis applied to reanalysis fields of mean sea-level pressure for 1951-2004. Separate classifications were developed for the winter, spring, summer, and fall seasons. For each season, the two classification tiers were identified independently, such that the definition of one tier does not depend on the other tier having already been defined. The first tier of the classification is comprised of supertype patterns. These broad-scale circulation classes are useful for generalized analyses such as investigations of the temporal trends in circulation frequency and persistence. The second, more detailed tier consists of circulation types and is useful for numerous applied research questions regarding the relationships between large-scale circulation and local and regional climate. Three to five supertypes and up to 19 circulation types were identified for each season. An intuitive nomenclature scheme based on the physical entities (i.e., anomaly centers) which dominate the specific patterns was used to label each of the supertypes and types. Two example applications illustrate the potential usefulness of a two-tier classification. In the first application, the temporal variability of the supertypes was evaluated. In general, the frequency and persistence of supertypes dominated by anticyclonic circulation increased during the study period, whereas the supertypes dominated by cyclonic features decreased in frequency and persistence. The usefulness of the derived circulation types was exemplified by an analysis of the circulation associated with heat waves and cold spells reported at several cities in Bulgaria. These extreme temperature events were found to occur with a small number of circulation types, a finding that can be helpful in understanding past
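
    The construction described (principal components on mean sea-level-pressure fields, then k-means on the component scores, with each tier clustered independently) is easy to sketch with scikit-learn; the grid size, component count and cluster counts below are placeholders, not the study's settings.

```python
# Two independent clusterings of PCA-reduced MSLP fields: a coarse tier
# (supertypes) and a detailed tier (circulation types).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
slp = rng.normal(size=(500, 30 * 40))   # 500 days, 30x40 MSLP grid, flattened

scores = PCA(n_components=10).fit_transform(slp)
supertypes = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
types = KMeans(n_clusters=15, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(supertypes), np.bincount(types))
```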

  20. A/T/N: An unbiased descriptive classification scheme for Alzheimer disease biomarkers.

    Science.gov (United States)

    Jack, Clifford R; Bennett, David A; Blennow, Kaj; Carrillo, Maria C; Feldman, Howard H; Frisoni, Giovanni B; Hampel, Harald; Jagust, William J; Johnson, Keith A; Knopman, David S; Petersen, Ronald C; Scheltens, Philip; Sperling, Reisa A; Dubois, Bruno

    2016-08-02

    Biomarkers have become an essential component of Alzheimer disease (AD) research and because of the pervasiveness of AD pathology in the elderly, the same biomarkers are used in cognitive aging research. A number of current issues suggest that an unbiased descriptive classification scheme for these biomarkers would be useful. We propose the "A/T/N" system in which 7 major AD biomarkers are divided into 3 binary categories based on the nature of the pathophysiology that each measures. "A" refers to the value of a β-amyloid biomarker (amyloid PET or CSF Aβ42); "T," the value of a tau biomarker (CSF phospho tau, or tau PET); and "N," biomarkers of neurodegeneration or neuronal injury ([(18)F]-fluorodeoxyglucose-PET, structural MRI, or CSF total tau). Each biomarker category is rated as positive or negative. An individual score might appear as A+/T+/N-, or A+/T-/N-, etc. The A/T/N system includes the new modality tau PET. It is agnostic to the temporal ordering of mechanisms underlying AD pathogenesis. It includes all individuals in any population regardless of the mix of biomarker findings and therefore is suited to population studies of cognitive aging. It does not specify disease labels and thus is not a diagnostic classification system. It is a descriptive system for categorizing multidomain biomarker findings at the individual person level in a format that is easy to understand and use. Given the present lack of consensus among AD specialists on terminology across the clinically normal to dementia spectrum, a biomarker classification scheme will have broadest acceptance if it is independent from any one clinically defined diagnostic scheme.
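
    Since an A/T/N profile is just three independent binary flags, composing the label is mechanical, as the small sketch below shows; the boolean inputs stand for already-dichotomized biomarker readings.

```python
# Compose an A/T/N profile string from three binary biomarker categories.
def atn_profile(amyloid_positive, tau_positive, neurodegeneration_positive):
    sign = lambda b: '+' if b else '-'
    return (f"A{sign(amyloid_positive)}/"
            f"T{sign(tau_positive)}/"
            f"N{sign(neurodegeneration_positive)}")

print(atn_profile(True, True, False))   # -> A+/T+/N-
```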

  1. A/T/N: An unbiased descriptive classification scheme for Alzheimer disease biomarkers

    Science.gov (United States)

    Bennett, David A.; Blennow, Kaj; Carrillo, Maria C.; Feldman, Howard H.; Frisoni, Giovanni B.; Hampel, Harald; Jagust, William J.; Johnson, Keith A.; Knopman, David S.; Petersen, Ronald C.; Scheltens, Philip; Sperling, Reisa A.; Dubois, Bruno

    2016-01-01

    Biomarkers have become an essential component of Alzheimer disease (AD) research and because of the pervasiveness of AD pathology in the elderly, the same biomarkers are used in cognitive aging research. A number of current issues suggest that an unbiased descriptive classification scheme for these biomarkers would be useful. We propose the “A/T/N” system in which 7 major AD biomarkers are divided into 3 binary categories based on the nature of the pathophysiology that each measures. “A” refers to the value of a β-amyloid biomarker (amyloid PET or CSF Aβ42); “T,” the value of a tau biomarker (CSF phospho tau, or tau PET); and “N,” biomarkers of neurodegeneration or neuronal injury ([18F]-fluorodeoxyglucose–PET, structural MRI, or CSF total tau). Each biomarker category is rated as positive or negative. An individual score might appear as A+/T+/N−, or A+/T−/N−, etc. The A/T/N system includes the new modality tau PET. It is agnostic to the temporal ordering of mechanisms underlying AD pathogenesis. It includes all individuals in any population regardless of the mix of biomarker findings and therefore is suited to population studies of cognitive aging. It does not specify disease labels and thus is not a diagnostic classification system. It is a descriptive system for categorizing multidomain biomarker findings at the individual person level in a format that is easy to understand and use. Given the present lack of consensus among AD specialists on terminology across the clinically normal to dementia spectrum, a biomarker classification scheme will have broadest acceptance if it is independent from any one clinically defined diagnostic scheme. PMID:27371494

  2. Mathematics subject classification and related schemes in the OAI framework

    OpenAIRE

    De Robbio, Antonella; Maguolo, Dario; Marini, Alberto

    2002-01-01

    This paper aims to give a feeling of the roles that discipline-oriented subject classifications can play in the Open Archive movement for the free dissemination of information in research activities. Mathematics, and Mathematics Subject Classification, will be the focuses around which we will move to discover a variety of presentation modes, protocols and tools for human and machine interoperability. The Open Archives Initiative (OAI) is intended to be the effective framework for such a play....

  3. A Data Caching Scheme Based on Node Classification in Named Data Networking

    Institute of Scientific and Technical Information of China (English)

    黄胜; 滕明埝; 吴震; 许江华; 季瑞军

    2016-01-01

    Compared with the traditional Internet, in-network caching is one of the most distinctive features of named data networking (NDN). In NDN, by default every node caches every passing data packet. This caching scheme generates a large amount of redundant data in the network, so the network cache resource is seriously wasted. To solve this problem, a caching scheme based on node classification (BNC) is proposed in this paper. Based on their positions, the nodes that a data packet passes through are divided into two types: "edge" nodes and "core" nodes. When a data packet passes through "core" nodes, by considering location and data popularity distribution at different nodes, it is cached at the node most beneficial to other nodes. When the data packet passes through "edge" nodes, a node is selected through data popularity to be most beneficial to the client. Simulation results show that the proposed scheme can efficiently improve the in-network hit ratio and reduce the delay and hop count of retrieving data packets.

  4. Signcryption scheme based on schnorr digital signature

    CERN Document Server

    Savu, Laura

    2012-01-01

    This article presents a new signcryption scheme which is based on the Schnorr digital signature algorithm. The new scheme represents my personal contribution to the signcryption area. I have implemented the algorithm in a program, and the steps of the algorithm, the results and some examples are provided here. The paper also presents the original signcryption scheme, based on the ElGamal digital signature, and discusses practical applications of signcryption in real life.
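
    For orientation, a toy version of the underlying Schnorr signature (key generation, signing, verification) is sketched below; the parameters are tiny demonstration values, not secure sizes, and the hash encoding is an arbitrary choice.

```python
# Toy Schnorr signature over the order-q subgroup of Z_p*; q | p-1.
# DO NOT use these parameter sizes for anything real.
import hashlib, secrets

p, q = 607, 101                     # 607 = 6*101 + 1, both prime
g = pow(2, (p - 1) // q, p)         # generator of the order-q subgroup

x = secrets.randbelow(q - 1) + 1    # private key
y = pow(g, x, p)                    # public key

def H(r, m):
    return int.from_bytes(hashlib.sha256(f"{r}|{m}".encode()).digest(), 'big') % q

def sign(m):
    k = secrets.randbelow(q - 1) + 1
    r = pow(g, k, p)
    e = H(r, m)
    s = (k + x * e) % q
    return e, s

def verify(m, e, s):
    # g^s * y^(-e) = g^(k + x*e) * g^(-x*e) = g^k = r
    r = pow(g, s, p) * pow(y, q - e, p) % p
    return H(r, m) == e

e, s = sign("hello")
print(verify("hello", e, s))        # True
```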

  5. Sound classification of dwellings – A diversity of national schemes in Europe

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2011-01-01

    Sound classification schemes for dwellings exist in ten countries in Europe, typically prepared and published as national standards. The schemes define quality classes intended to reflect different levels of acoustical comfort. The main criteria concern airborne and impact sound insulation between dwellings, facade sound insulation and installation noise. This paper presents the sound classification schemes in Europe and compares the class criteria for sound insulation between dwellings. The schemes have been implemented and revised gradually since the early 1990s. However, due to lack of coordination among countries, the schemes differ significantly in terms of descriptors, number of classes and class intervals, implying very little exchange of experience about constructions fulfilling different classes. The current variety of descriptors and classes also causes trade barriers. Thus, there is a need to harmonize characteristics of the schemes, and a European COST Action TU0901 "Integrating and Harmonizing Sound Insulation Aspects in Sustainable Urban Housing" ...

  6. Biogeography based Satellite Image Classification

    CERN Document Server

    Panchal, V K; Kaur, Navdeep; Kundra, Harish

    2009-01-01

    Biogeography is the study of the geographical distribution of biological organisms. The mindset of the engineer is that we can learn from nature. Biogeography Based Optimization is a burgeoning nature-inspired technique for finding the optimal solution of a problem. Satellite image classification is an important task because it is the only way we can know about the land cover map of inaccessible areas. Though satellite images have been classified in the past by using various techniques, researchers are always seeking alternative strategies for satellite image classification so that they may be prepared to select the most appropriate technique for the feature extraction task at hand. This paper is focused on classification of the satellite image of a particular land cover using the theory of Biogeography Based Optimization. The original BBO algorithm does not have the inbuilt property of clustering, which is required during image classification. Hence modifications have been proposed to the original algorithm and...

  7. Block-based adaptive lifting schemes for multiband image compression

    Science.gov (United States)

    Masmoudi, Hela; Benazza-Benyahia, Amel; Pesquet, Jean-Christophe

    2004-02-01

    In this paper, we are interested in designing lifting schemes adapted to the statistics of the wavelet coefficients of multiband images for compression applications. More precisely, nonseparable vector lifting schemes are used in order to capture simultaneously the spatial and the spectral redundancies. The underlying operators are then computed in order to minimize the entropy of the resulting multiresolution representation. To this end, we have developed a new iterative block-based classification algorithm. Simulation tests carried out on remotely sensed multispectral images indicate that a substantial gain in terms of bit-rate is achieved by the proposed adaptive coding method w.r.t. the non-adaptive one.
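
    To make the predict/update structure concrete, here is one level of a generic linear lifting scheme (the 5/3 wavelet) with perfect reconstruction; the paper's nonseparable vector lifting with entropy-minimizing operators is more elaborate, so this is only the scalar, non-adaptive skeleton.

```python
# One level of a linear-prediction lifting scheme: split into even/odd
# samples, predict odds from even neighbours, update evens to preserve
# the running mean. Inverse simply mirrors the two steps.
import numpy as np

def lifting_forward(x):
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    detail = odd - 0.5 * (even + np.roll(even, -1))       # predict step
    approx = even + 0.25 * (detail + np.roll(detail, 1))  # update step
    return approx, detail

def lifting_inverse(approx, detail):
    even = approx - 0.25 * (detail + np.roll(detail, 1))
    odd = detail + 0.5 * (even + np.roll(even, -1))
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

x = np.arange(16, dtype=float)
a, d = lifting_forward(x)
print(np.allclose(lifting_inverse(a, d), x))   # True: perfect reconstruction
```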

  8. Texture classification based on EMD and FFT

    Institute of Scientific and Technical Information of China (English)

    XIONG Chang-zhen; XU Jun-yi; ZOU Jian-cheng; QI Dong-xu

    2006-01-01

    Empirical mode decomposition (EMD) is an adaptive and approximately orthogonal filtering process that reflects the human visual mechanism of differentiating textures. In this paper, we present a modified 2D EMD algorithm using the FastRBF and an appropriate number of iterations in the sifting process (SP), then apply it to texture classification. Rotation-invariant texture feature vectors are extracted using auto-registration and circular regions of the magnitude spectra of the 2D fast Fourier transform (FFT). In the experiments, we employ a Bayesian classifier to classify a set of 15 distinct natural textures selected from the Brodatz album. The experimental results, based on different testing datasets for images with different orientations, show the effectiveness of the proposed classification scheme.
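
    The rotation-invariance argument rests on the fact that rotating an image rotates its 2D FFT magnitude spectrum, so energies summed over circular (radial) regions are unchanged. A small sketch of that feature step, with the EMD pre-filtering omitted and the ring count chosen arbitrarily:

```python
# Rotation-invariant texture features: mean FFT magnitude within concentric
# rings of the centred 2-D spectrum.
import numpy as np

def radial_fft_features(img, n_rings=8):
    mag = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)     # radius of each pixel
    edges = np.linspace(0, r.max(), n_rings + 1)
    # mean spectral energy within each ring is unchanged by image rotation
    return np.array([mag[(r >= lo) & (r < hi)].mean()
                     for lo, hi in zip(edges[:-1], edges[1:])])

rng = np.random.default_rng(0)
print(radial_fft_features(rng.random((64, 64))))
```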

  9. Kernel Clustering with a Differential Harmony Search Algorithm for Scheme Classification

    Directory of Open Access Journals (Sweden)

    Yu Feng

    2017-01-01

    This paper presents kernel fuzzy clustering with a novel differential harmony search algorithm for diversion scheduling scheme classification. First, we employed a self-adaptive solution generation strategy and a differential evolution-based population update strategy to improve the classical harmony search. Second, we applied the differential harmony search algorithm to kernel fuzzy clustering to help the clustering method obtain better solutions. Finally, the combination of kernel fuzzy clustering and differential harmony search is applied to water diversion scheduling in East Lake. A comparison of the proposed method with other methods has been carried out. The results show that kernel clustering with the differential harmony search algorithm performs well on water diversion scheduling problems.

  10. CANDELS Visual Classifications: Scheme, Data Release, and First Results

    Science.gov (United States)

    Kartaltepe, Jeyhan S.; Mozena, Mark; Kocevski, Dale; McIntosh, Daniel H.; Lotz, Jennifer; Bell, Eric F.; Faber, Sandy; Ferguson, Henry; Koo, David; Bassett, Robert; Bernyk, Maksym; Blancato, Kirsten; Bournaud, Frederic; Cassata, Paolo; Castellano, Marco; Cheung, Edmond; Conselice, Christopher J.; Croton, Darren; Dahlen, Tomas; deMello, Duilia F.; DeGroot, Laura; Donley, Jennifer; Guedes, Javiera; Grogin, Norman; Hathi, Nimish; Hilton, Matt; Hollon, Brett; Inami, Hanae; Kassin, Susan; Koekemoer, Anton; Lani, Caterina; Liu, Nick; Lucas, Ray A.; Martig, Marie; McGrath, Elizabeth; McPartland, Conor; Mobasher, Bahram; Morlock, Alice; O'Leary, Erin; Peth, Mike; Pforr, Janine; Pillepich, Annalisa; Rizer, Zachary; Rosario, David; Soto, Emmaris; Straughn, Amber; Telford, Olivia; Sunnquist, Ben; Weiner, Benjamin; Wuyts, Stijn

    2014-01-01

    We have undertaken an ambitious program to visually classify all galaxies in the five CANDELS fields down to H<24.5. We find that the level of agreement among classifiers is quite good and depends on both the galaxy magnitude and the galaxy type, with disks showing the highest level of agreement and irregulars the lowest. A comparison of our classifications with the Sersic index and rest-frame colors shows a clear separation between disk and spheroid populations. Finally, we explore morphological k-corrections between the V-band and H-band observations and find that a small fraction (84 galaxies in total) are classified as being very different between these two bands. These galaxies typically have very clumpy and extended morphology or are very faint in the V-band.

  11. A comparison between national scheme for the acoustic classification of dwellings in Europe and in the U.S

    DEFF Research Database (Denmark)

    Berardi, Umberto; Rasmussen, Birgit

    2015-01-01

    The classification of dwellings according to different building performances has been proposed through many schemes worldwide in recent years. The general idea behind these schemes relates to the positive impact a higher label, and thus a better performance, should have. In particular, focusing on sound insulation performance, national schemes for sound classification of dwellings have been developed in several European countries. These schemes define acoustic classes according to different levels of sound insulation. Due to the lack of coordination among countries, a significant diversity in terms of descriptors, number of classes, and class intervals occurred between national schemes. However, a proposal "acoustic classification scheme for dwellings" has been developed recently in the European COST Action TU0901 with 32 member countries, and this proposal has been accepted as an ISO work item. This paper compares sound classification schemes in Europe with the current situation in the United States. Economic evaluations related to the technological choices necessary to achieve different sound classification classes are also discussed. The hope is that a common sound classification scheme may facilitate exchanging experiences about constructions fulfilling different classes, reducing trade barriers, and finally increasing the sound insulation of dwellings.

  12. Classification of basic facilities for high-rise residential: A survey from 100 housing scheme in Kajang area

    Science.gov (United States)

    Ani, Adi Irfan Che; Sairi, Ahmad; Tawil, Norngainy Mohd; Wahab, Siti Rashidah Hanum Abd; Razak, Muhd Zulhanif Abd

    2016-08-01

    High demand for housing and limited land in town areas have increased the provision of high-rise residential schemes. This type of housing has different owners but shares the same land lot and common facilities. Thus, maintenance works for the buildings and common facilities must be well organized. The purpose of this paper is to identify and classify basic facilities for high-rise residential buildings, with the aim of improving the management of such schemes. The method adopted is a survey of 100 high-rise residential schemes, ranging from affordable housing to high-cost housing, using snowball sampling. The scope of this research is the Kajang area, which is rapidly being developed with high-rise housing. The objective of the survey is to list all facilities in every sampled scheme. The results confirmed that the 11 pre-determined classifications hold true and provide a realistic classification for high-rise residential schemes. This paper proposes a redefinition of the facilities provided, to create a better management system and give a clear definition of the types of high-rise residential buildings based on their facilities.

  13. Job Performance Measurement Classification Scheme for Validation Research in the Military.

    Science.gov (United States)

    1986-02-01

    ...the information and procedures used with this approach. In order to develop a measurement methodology for job performance, a classification scheme of job performance measurement quality was needed (a) to summarize and organize research progress in terms of previous empirical work and (b) to identify ... literature review and specific directions for future job performance measurement research.

  14. A Novel Broadcast Scheme DSR-based Mobile Adhoc Networks

    Directory of Open Access Journals (Sweden)

    Muneer Bani Yassein

    2016-04-01

    Traffic classification seeks to assign packet flows to an appropriate quality of service (QoS). Although many studies have placed a lot of emphasis on broadcast communication, broadcasting in MANETs is still a problematic issue. Due to the absence of fixed infrastructure in MANETs, broadcast is an essential operation for all network nodes. Although blind flooding is the simplest broadcasting technique, it is inefficient and lacks resource utilization efficiency. One of the schemes proposed to mitigate the blind flooding deficiency is the counter-based broadcast scheme, in which a node counts the duplicate packets received from neighbors that have previously re-broadcast the packet. Because existing counter-based schemes are mainly based on a fixed counter threshold, they are not efficient under different operating conditions. Thus, unlike existing studies, this paper proposes a dynamic counter-based threshold and examines its effectiveness under the Dynamic Source Routing protocol (DSR), one of the well-known on-demand routing protocols. Specifically, we develop a new counter-based broadcast algorithm under the umbrella of DSR, namely Inspired Counter Based Broadcasting (DSR-ICB). Using various simulation experiments, DSR-ICB has shown good performance, especially in terms of delay and the number of redundant packets.
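
    The core decision rule is easy to sketch: rebroadcast only while the number of duplicates heard during a random assessment delay stays below a threshold, and let that threshold vary with local conditions. The density-to-threshold mapping below is a made-up illustration, not DSR-ICB's actual rule.

```python
def dynamic_threshold(n_neighbors):
    """Hypothetical mapping from local density to a rebroadcast threshold:
    sparse neighbourhoods tolerate more duplicates to preserve reachability."""
    if n_neighbors < 4:        # sparse
        return 5
    if n_neighbors < 10:       # medium density
        return 3
    return 2                   # dense

def should_rebroadcast(duplicates_heard, n_neighbors):
    # Counter-based rule: suppress the rebroadcast once enough duplicates
    # of the packet have been overheard.
    return duplicates_heard < dynamic_threshold(n_neighbors)

for dups, nbrs in [(1, 2), (3, 6), (4, 12)]:
    print(nbrs, "neighbours,", dups, "duplicates ->",
          should_rebroadcast(dups, nbrs))
```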

  15. Classification scheme for sedimentary and igneous rocks in Gale crater, Mars

    Science.gov (United States)

    Mangold, N.; Schmidt, M. E.; Fisk, M. R.; Forni, O.; McLennan, S. M.; Ming, D. W.; Sautter, V.; Sumner, D.; Williams, A. J.; Clegg, S. M.; Cousin, A.; Gasnault, O.; Gellert, R.; Grotzinger, J. P.; Wiens, R. C.

    2017-03-01

    Rocks analyzed by the Curiosity rover in Gale crater include a variety of clastic sedimentary rocks and igneous float rocks transported by fluvial and impact processes. To facilitate the discussion of the range of lithologies, we present in this article a petrological classification framework adapting terrestrial classification schemes to Mars compositions (such as Fe abundances typically higher than for comparable lithologies on Earth), to specific Curiosity observations (such as common alkali-rich rocks), and to the capabilities of the rover instruments. Mineralogy was acquired only locally for a few drilled rocks, and so it does not suffice as a systematic classification tool, in contrast to classical terrestrial rock classification. The core of this classification involves (1) the characterization of rock texture as sedimentary, igneous or undefined according to grain/crystal sizes and shapes using imaging from the ChemCam Remote Micro-Imager (RMI), Mars Hand Lens Imager (MAHLI) and Mastcam instruments, and (2) the assignment of geochemical modifiers based on the abundances of Fe, Si, alkali, and S determined by the Alpha Particle X-ray Spectrometer (APXS) and ChemCam instruments. The aims are to help understand Gale crater geology by highlighting the various categories of rocks analyzed by the rover. Several implications are proposed from the cross-comparisons of rocks of various texture and composition, for instance between in place outcrops and float rocks. All outcrops analyzed by the rover are sedimentary; no igneous outcrops have been observed. However, some igneous rocks are clasts in conglomerates, suggesting that part of them are derived from the crater rim. The compositions of in-place sedimentary rocks contrast significantly with the compositions of igneous float rocks. While some of the differences between sedimentary rocks and igneous floats may be related to physical sorting and diagenesis of the sediments, some of the sedimentary rocks (e

  16. Sound classification schemes in Europe - Quality classes intended for renovated housing

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2010-01-01

    According to social surveys in several European countries, occupants of multifamily housing are considerably annoyed by noise from neighbours' activities. The noise issue has also received increasing attention from the WHO. Neighbour noise has been identified as a health problem, and reduction of noise exposure in the home has been included in the proposed main objectives for a housing policy. In most countries in Europe, building regulations specify minimum requirements concerning acoustical conditions for new dwellings. In addition, several countries have introduced sound classification schemes with classes intended to reflect different levels of acoustical comfort. Consequently, acoustic requirements for a dwelling can be specified as the legal minimum requirements or as a specific class in a classification scheme. Most schemes have both higher classes than those corresponding to the regulatory requirements...

  17. A comparison between national scheme for the acoustic classification of dwellings in Europe and in the U.S

    DEFF Research Database (Denmark)

    Berardi, Umberto; Rasmussen, Birgit

    2015-01-01

    The classification of dwellings according to different building performances has been proposed through many schemes worldwide in recent years. In particular, focusing on sound insulation performance, national schemes for sound classification of dwellings have been developed in several European countries. These schemes define acoustic classes according to different levels of sound insulation. Due to the lack of coordination among countries, a significant diversity in terms of descriptors, number of classes, and class intervals occurred between national schemes. However, a proposal "acoustic classification scheme for dwellings" has been developed recently in the European COST Action TU0901 with 32 member countries, and this proposal has been accepted as an ISO work item. This paper compares sound classification schemes in Europe with the current situation in the United States. Economic evaluations related to the technological choices necessary to achieve different sound classification classes are also discussed. The hope is that a common sound classification scheme may facilitate exchanging experiences about constructions fulfilling different classes, reducing trade barriers, and finally increasing the sound insulation of dwellings.

  18. Efficient Identity Based Public Verifiable Signcryption Scheme

    CERN Document Server

    Kushwah, Prashant

    2011-01-01

    Signcryption is a cryptographic primitive which performs encryption and signature in a single logical step. In conventional signcryption, only the receiver of the signcrypted text can verify the authenticity of the origin, i.e. the sender's signature on the message, after decrypting the ciphertext. In a publicly verifiable signcryption scheme, anyone with access to the signcrypted text can verify the authenticity of the origin, i.e. the sender's signature on the ciphertext. A publicly verifiable signcryption scheme in which the receiver can convince a third party, by providing additional information other than his private key along with the signcryption, is called a third-party verifiable signcryption scheme. In this paper we propose an efficient identity-based publicly verifiable signcryption scheme with third-party verification and prove its security in the random oracle model.

  19. Hybrid Support Vector Machines-Based Multi-fault Classification

    Institute of Scientific and Technical Information of China (English)

    GAO Guo-hua; ZHANG Yong-zhong; ZHU Yu; DUAN Guang-huang

    2007-01-01

    Support Vector Machines (SVM) is a new general machine-learning tool based on the structural risk minimization principle. This characteristic is very significant for fault diagnostics when the number of fault samples is limited. Considering that SVM theory is originally designed for two-class classification, a hybrid SVM scheme is proposed for multi-fault classification of rotating machinery in this paper. Two SVM strategies, 1-v-1 (one versus one) and 1-v-r (one versus rest), are adopted at different classification levels. At the parallel classification level, using the 1-v-1 strategy, the fault features extracted by various signal analysis methods are fed into multiple parallel SVMs and local classification results are obtained. At the serial classification level, these local result values are fused by one serial SVM based on the 1-v-r strategy. The hybrid SVM scheme introduced in this paper not only generalizes the performance of single binary SVMs but also improves the precision and reliability of the fault classification results. The actual testing results show the availability and suitability of this new method.
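
    Both strategies are available off the shelf in scikit-learn, which makes the two levels easy to prototype; the synthetic data below stands in for the extracted fault features, and the paper's serial fusion of the parallel outputs is not reproduced.

```python
# 1-v-1 and 1-v-r multi-class SVM strategies: SVC decomposes one-versus-one
# internally, and OneVsRestClassifier wraps one binary SVM per class.
from sklearn.datasets import make_classification
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_classes=4, n_informative=6,
                           random_state=0)
ovo = SVC(decision_function_shape='ovo').fit(X, y)    # 1-v-1
ovr = OneVsRestClassifier(SVC()).fit(X, y)            # 1-v-r
print(ovo.score(X, y), ovr.score(X, y))
```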

  20. Modulation classification based on spectrogram

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The aim of modulation classification (MC) is to identify the modulation type of a communication signal. It plays an important role in many cooperative and noncooperative communication applications. Three spectrogram-based modulation classification methods are proposed. Their recognition scope and performance are investigated and evaluated by theoretical analysis and extensive simulation studies. The method taking moment-like features is robust to frequency offset, while the other two, which make use of principal component analysis (PCA) with different transformation inputs, can achieve satisfactory accuracy even at low SNR (as low as 2 dB). Due to the properties of the spectrogram, the statistical pattern recognition techniques, and the image preprocessing steps, all of our methods are insensitive to unknown phase and frequency offsets, timing errors, and the arriving sequence of symbols.
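
    The spectrogram-plus-PCA idea can be prototyped in a few lines with SciPy and scikit-learn; the toy two-class signals, spectrogram settings and nearest-neighbour classifier below are placeholders for the paper's modulation set and recognizers.

```python
# Spectrogram -> flatten -> PCA features -> simple classifier.
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
fs = 1000

def make_signal(freq):                     # toy narrowband signal + noise
    t = np.arange(1024) / fs
    return np.sin(2 * np.pi * freq * t) + 0.5 * rng.normal(size=t.size)

signals = [make_signal(f) for f in (50, 50, 50, 200, 200, 200)]
labels = [0, 0, 0, 1, 1, 1]

X = np.stack([spectrogram(s, fs=fs, nperseg=128)[2].ravel() for s in signals])
feats = PCA(n_components=3).fit_transform(X)
clf = KNeighborsClassifier(n_neighbors=1).fit(feats, labels)
print(clf.score(feats, labels))
```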

  1. An ECC-Based Blind Signature Scheme

    Directory of Open Access Journals (Sweden)

    Fuh-Gwo Jeng

    2010-08-01

    Cryptography is increasingly applied to the e-commerce world, especially to untraceable payment systems and electronic voting systems. Protocols for these systems strongly require the anonymous digital signature property, and thus a blind signature strategy is the answer. Chaum stated that every blind signature protocol should hold two fundamental properties, blindness and intractableness. Almost all previously proposed blind signature schemes are based on integer factorization problems, discrete logarithm problems, or quadratic residues, and Lee et al. have shown that none of these schemes is able to meet the two fundamental properties above. Therefore, an ECC-based blind signature scheme that possesses both properties is proposed in this paper.

  2. Joint efforts to harmonize sound insulation descriptors and classification schemes in Europe (COST TU0901)

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2010-01-01

    Sound insulation descriptors, regulatory requirements and classification schemes in Europe represent a high degree of diversity. One implication is very little exchange of experience of housing design and construction details for different levels of sound insulation; another is trade barriers for building systems and products. Unfortunately, there is evidence for a development in the "wrong" direction. For example, sound classification schemes for dwellings exist in nine countries. There is no sign of increasing harmonization, rather the contrary, as more countries are preparing proposals with new classes. Social surveys in several European countries have shown that occupants of multi-storey housing are considerably annoyed by noise from neighbours' activities. To keep towns and cities attractive, homes in multi-storey housing must be attractive for a variety of people and offer "quietness". Thus...

  3. Identity-based signature scheme based on quadratic residues

    Institute of Scientific and Technical Information of China (English)

    CHAI ZhenChuan; CAO ZhenFu; DONG XiaoLei

    2007-01-01

    Identity-based (ID-based) cryptography has drawn great attention in recent years, and most ID-based schemes are constructed from bilinear pairings. Therefore, an ID-based scheme without pairing is of great interest in the field of cryptography. Up to now, it has remained a challenge to construct an ID-based signature scheme from quadratic residues. We aim to meet this challenge by proposing a concrete scheme. In this paper, we first introduce the technique of how to calculate a 2^l-th root of a quadratic residue, and then give a concrete ID-based signature scheme using this technique. We also prove that our scheme is chosen-message and ID secure in the random oracle model, assuming the hardness of factoring.
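
    To illustrate the arithmetic behind a "2^l-th root", the sketch below iterates modular square roots. It works modulo a prime p ≡ 3 (mod 4) so it stays self-contained; in the actual scheme the modulus is a composite whose factorization is the signer's trapdoor.

```python
# Iterated modular square roots: a 2^l-th root of a quadratic residue mod a
# prime p with p % 4 == 3, where sqrt(a) = a^((p+1)/4) mod p.
def sqrt_mod(a, p):                 # valid when p % 4 == 3 and a is a QR
    return pow(a, (p + 1) // 4, p)

def root_2l(a, p, l):
    for _ in range(l):
        # Euler's criterion: a must stay a quadratic residue at each step
        assert pow(a, (p - 1) // 2, p) == 1, "not a quadratic residue"
        a = sqrt_mod(a, p)
    return a

p = 1000003                         # prime, and 1000003 % 4 == 3
x = 123456
a = pow(x, 2 ** 3, p)               # a = x^(2^3) mod p, so a QR by construction
r = root_2l(a, p, 3)
print(pow(r, 2 ** 3, p) == a)       # True: r is a 2^3-th root of a
```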

  4. Dissimilarity-based classification of anatomical tree structures

    DEFF Research Database (Denmark)

    Sørensen, Lauge Emil Borch Laurs; Lo, Pechin Chien Pau; Dirksen, Asger

    2011-01-01

    A novel method for classification of abnormality in anatomical tree structures is presented. A tree is classified based on direct comparisons with other trees in a dissimilarity-based classification scheme. The pair-wise dissimilarity measure between two trees is based on a linear assignment between the branch feature vectors representing those trees. Hereby, localized information in the branches is collectively used in classification and variations in feature values across the tree are taken into account. An approximate anatomical correspondence between matched branches can be achieved by including anatomical features in the branch feature vectors. The proposed approach is applied to classify airway trees in computed tomography images of subjects with and without chronic obstructive pulmonary disease (COPD). Using the wall area percentage (WA%), a common measure of airway abnormality in COPD...

  5. Evaluation of rock mass classification schemes: a case study from the Bowen Basin, Australia

    Science.gov (United States)

    Brook, Martin; Hebblewhite, Bruce; Mitra, Rudrajit

    2016-04-01

    The development of an accurate engineering geological model and adequate knowledge of spatial variation in rock mass conditions are important prerequisites for slope stability analyses, tunnel design, mine planning and risk management. Rock mass classification schemes such as Rock Mass Rating (RMR), Coal Mine Roof Rating (CMRR), Q-system and Roof Strength Index (RSI) have been used for a range of engineering geological applications, including transport tunnels, "hard rock" mining and underground and open-cut coal mines. Often, rock mass classification schemes have been evaluated on subaerial exposures, where weathering has affected joint characteristics and intact strength. In contrast, the focus of this evaluation of the above classification schemes is an underground coal mine in the Bowen Basin, central Queensland, Australia, 15 km east of the town of Moranbah. Rock mass classification was undertaken at 68 sites across the mine. Both the target coal seam and overlying rock show marked spatial variability in terms of RMR, CMRR and Q, but RSI showed limited sensitivity to changes in rock mass condition. Relationships were developed between different parameters with varying degrees of success. A mine-wide analysis of faulting was undertaken, and compared with in situ stress field and local-scale measurements of joint and cleat. While there are no unequivocal relationships between rock mass classification parameters and faulting, a central graben zone shows heterogeneous rock mass properties. The corollary is that if geological features can be accurately defined by remote sensing technologies, then this can assist in predicting rock mass conditions and risk management ahead of development and construction.

  6. Feasibility analysis of two identity- based proxy ring signature schemes

    Institute of Scientific and Technical Information of China (English)

    Wang Huaqun; Zhang Lijun; Zhao Junxi

    2007-01-01

    Recently, proxy ring signature schemes have been shown to be useful in various applications, such as electronic polling, electronic payment, etc. Although many proxy ring signature schemes have been proposed, only two identity-based proxy ring signature schemes have been proposed until now, i.e., Cheng's scheme and Lang's scheme. Unfortunately, both identity-based proxy ring signature schemes are infeasible. This paper points out the reasons why the two schemes are infeasible. In order to design feasible and efficient identity-based proxy ring signature schemes from bilinear pairings, other methods have to be sought.

  7. Classification based polynomial image interpolation

    Science.gov (United States)

    Lenke, Sebastian; Schröder, Hartmut

    2008-02-01

    Due to the fast migration of high-resolution displays into home and office environments, there is a strong demand for high-quality picture scaling. This is caused on the one hand by large picture sizes and on the other hand by the enhanced visibility of picture artifacts on these displays [1]. There are many proposals for enhanced spatial interpolation adaptively matched to picture content such as edges. The drawback of these approaches is the typically integer and often limited interpolation factor. In order to achieve rational factors, there exist combinations of adaptive and non-adaptive linear filters, but due to the non-adaptive step the overall quality is notably limited. We present in this paper a content-adaptive polyphase interpolation method which uses "offline" trained filter coefficients and "online" linear filtering depending on a simple classification of the input situation. Furthermore, we present a new approach to a content-adaptive interpolation polynomial, which allows arbitrary polyphase interpolation factors at runtime and further improves the overall interpolation quality. The main goal of our new approach is to optimize interpolation quality by adapting higher-order polynomials directly to the image content. In addition, we derive filter constraints for enhanced picture quality. Furthermore, we extend the classification-based filtering to the temporal dimension in order to use it for intermediate image interpolation.

  8. Identity-based ring signature scheme based on quadratic residues

    Institute of Scientific and Technical Information of China (English)

    Xiong Hu; Qin Zhiguang; Li Fagen

    2009-01-01

    Identity-based (ID-based) ring signature has drawn great concern in recent years, and many ID-based ring signature schemes have been proposed until now. Unfortunately, all of these ID-based ring signatures are constructed from bilinear pairings, a powerful but computationally expensive primitive. Hence, ID-based ring signature without pairing is of great interest in the field of cryptography. In this paper, the authors first propose an ID-based ring signature scheme based on quadratic residues. The proposed scheme is proved to be existentially unforgeable against adaptive chosen message-and-identity attack under the random oracle model, assuming the hardness of factoring. The proposed scheme is more efficient than those which are constructed from bilinear pairings.

  9. Dissimilarity-based classification of anatomical tree structures

    DEFF Research Database (Denmark)

    Sørensen, Lauge; Lo, Pechin Chien Pau; Dirksen, Asger

    2011-01-01

    A novel method for classification of abnormality in anatomical tree structures is presented. A tree is classified based on direct comparisons with other trees in a dissimilarity-based classification scheme. The pair-wise dissimilarity measure between two trees is based on a linear assignment between the branch feature vectors representing those trees. Hereby, localized information in the branches is collectively used in classification and variations in feature values across the tree are taken into account. An approximate anatomical correspondence between matched branches can be achieved by including anatomical features in the branch feature vectors. The proposed approach is applied to classify airway trees in computed tomography images of subjects with and without chronic obstructive pulmonary disease (COPD). Using the wall area percentage (WA%), a common measure of airway abnormality in COPD...
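
    The linear-assignment dissimilarity itself is a few lines with SciPy; the branch feature content is mocked with random vectors here, and the downstream classifier (e.g. k-NN on the resulting distances) is left out.

```python
# Pair-wise tree dissimilarity via optimal linear assignment between the
# branch feature vectors of two trees.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def tree_dissimilarity(branches_a, branches_b):
    cost = cdist(branches_a, branches_b)        # pairwise branch distances
    rows, cols = linear_sum_assignment(cost)    # optimal branch matching
    return cost[rows, cols].sum()               # total matching cost

rng = np.random.default_rng(0)
tree1 = rng.normal(size=(30, 2))                # 30 branches, 2 features each
tree2 = rng.normal(size=(30, 2))
print(tree_dissimilarity(tree1, tree2))
```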

  10. DIFFERENCE SCHEMES BASING ON COEFFICIENT APPROXIMATION

    Institute of Scientific and Technical Information of China (English)

    MOU Zong-ze; LONG Yong-xing; QU Wen-xiao

    2005-01-01

    For variable-coefficient differential equations, schemes based on approximating the coefficient function are more accurate than those freezing the coefficient as a constant on every discrete subinterval. Difference schemes constructed from a Taylor expansion approximation of the solution usually do not suit solutions with sharp behavior. By introducing local bases combined with coefficient function approximation, the resulting difference schemes can depict more complex physical phenomena with sharp behavior, for example boundary layers as well as highly oscillatory solutions. The numerical tests show the method is more effective than the traditional one.

  11. Cluster-based adaptive metric classification

    NARCIS (Netherlands)

    Giotis, Ioannis; Petkov, Nicolai

    2012-01-01

    Introducing an adaptive metric has been shown to improve the results of distance-based classification algorithms. Existing methods are often computationally intensive, either in the training or in the classification phase. We present a novel algorithm that we call Cluster-Based Adaptive Metric (CLAM) classification.

  12. Ontology-Based Classification System Development Methodology

    Directory of Open Access Journals (Sweden)

    Grabusts Peter

    2015-12-01

    The aim of the article is to analyse and develop an ontology-based classification system methodology that uses decision tree learning with statement-propositionalized attributes. Classical decision tree learning algorithms, as well as decision tree learning with taxonomy and propositionalized attributes, are examined. Domain ontology can thus be extracted from the data sets and used for data classification with the help of a decision tree. The use of ontology methods in decision tree-based classification systems has been researched. Using such methodologies, the classification accuracy can in some cases be improved.

  13. Head/tail Breaks: A New Classification Scheme for Data with a Heavy-tailed Distribution

    CERN Document Server

    Jiang, Bin

    2012-01-01

    This paper introduces a new classification scheme - head/tail breaks - in order to find groupings or hierarchy for data with a heavy-tailed distribution. The heavy-tailed distributions are heavily right skewed, with a minority of large values in the head and a majority of small values in the tail, commonly characterized by a power law, a lognormal or an exponential function. For example, a country's population is often distributed in such a heavy-tailed manner, with a minority of people (e.g., 20 percent) in the countryside and the vast majority (e.g., 80 percent) in urban areas. This heavy-tailed distribution is also called scaling, hierarchy or scaling hierarchy. This new classification scheme partitions all of the data values around the mean into two parts and continues the process iteratively for the values (above the mean) in the head until the head part values are no longer heavy-tailed distributed. Thus, the number of classes and the class intervals are both naturally determined. We therefore claim tha...
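
    The iterative partitioning described here is simple to operationalize. A minimal sketch in Python follows; the 40% head-fraction stop rule is a common way to test whether the head is still a minority, an assumption rather than the record's exact wording.

        def head_tail_breaks(values, head_limit=0.4):
            # Split around the mean; recurse on the head while it stays a minority.
            breaks, data = [], list(values)
            while len(data) > 1:
                mean = sum(data) / len(data)
                head = [v for v in data if v > mean]
                if not head or len(head) / len(data) > head_limit:
                    break
                breaks.append(mean)   # each mean becomes a class boundary
                data = head
            return breaks

        print(head_tail_breaks([1, 1, 2, 2, 3, 5, 8, 13, 21, 100]))  # [15.6]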

  14. AN EFFICIENT BIT COMMITMENT SCHEME BASED ON FACTORING ASSUMPTION

    Institute of Scientific and Technical Information of China (English)

    Zhong Ming; Yang Yixian

    2001-01-01

    Recently, many bit commitment schemes have been presented. This paper presents a new practical bit commitment scheme based on Schnorr's one-time knowledge proof scheme, where the cut-and-choose method and the many random exam candidates in the protocols are replaced by a single challenge number. Therefore the proposed bit commitment scheme is more efficient and practical than the previous schemes. In addition, the security of the proposed scheme under the factoring assumption is proved, thus clarifying the cryptographic basis of the proposed scheme.

  15. Climatological characteristics of the tropics in China: climate classification schemes between German scientists and Huang Bingwei

    Institute of Scientific and Technical Information of China (English)

    Manfred Domroes

    2003-01-01

    Reviewing some important German scientists who have developed climatic regionalization schemes on either a global or a Chinese scale, their various definitions of tropical climate characteristics in China are discussed and compared with Huang Bingwei's climate classification scheme and the identification of the tropical climate therein. Due to the different methodological approaches of the climatic regionalization schemes, the definitions of the tropics vary, and hence so does their spatial distribution in China. However, it is found that the tropical climate type occupies only a peripheral part of southern China, though it firmly represents a distinctive type of climate that is associated with great economic importance for China. As such, the tropical climate type was mostly identified with its agro-climatological significance, that is, by giving favourable growing conditions all year round for perennial crops with a great heat demand. Tropical climate is hence conventionally regarded as governed by all-year-round summer conditions "where winter never comes".

  16. Hyperspectral image classification based on volumetric texture and dimensionality reduction

    Science.gov (United States)

    Su, Hongjun; Sheng, Yehua; Du, Peijun; Chen, Chen; Liu, Kui

    2015-06-01

    A novel approach using volumetric texture and reduced spectral features is presented for hyperspectral image classification. In this approach, the volumetric textural features were extracted by volumetric gray-level co-occurrence matrices (VGLCM). The spectral features were extracted by minimum estimated abundance covariance (MEAC) and linear prediction (LP)-based band selection, and by a semi-supervised k-means (SKM) clustering method with deletion of the worst cluster (SKMd) as band-clustering algorithms. Moreover, four feature combination schemes were designed for hyperspectral image classification using spectral and textural features. It has been proven that the proposed method using VGLCM outperforms the gray-level co-occurrence matrices (GLCM) method, and the experimental results indicate that combining spectral information with volumetric textural features leads to improved classification performance in hyperspectral imagery.

  17. Hash function based secret sharing scheme designs

    CERN Document Server

    Chum, Chi Sing

    2011-01-01

    Secret sharing schemes create an effective method to safeguard a secret by dividing it among several participants. By using hash functions and the herding hashes technique, we first set up a (t+1, n) threshold scheme which is perfect and ideal, and then extend it to schemes for any general access structure. The schemes can further be made proactive or verifiable if necessary. The setup and recovery of the secret are efficient due to the fast calculation of the hash function. The proposed scheme is flexible because of the use of existing hash functions.

  18. Password Authentication Based on Fractal Coding Scheme

    Directory of Open Access Journals (Sweden)

    Nadia M. G. Al-Saidi

    2012-01-01

    Password authentication is a mechanism used to authenticate user identity over an insecure communication channel. In this paper, a new method to improve the security of password authentication is proposed. It is based on the compression capability of fractal image coding to provide an authorized user secure access to the registration and login process. In the proposed scheme, a hashed password string is generated and encrypted to be captured together with the user identity using text-to-image mechanisms. The advantage of fractal image coding is that the compressed image data can be sent securely through a non-secured communication channel to the server. The verification of client information against the database system is carried out on the server to authenticate the legal user. The encrypted hashed password in the decoded fractal image is recognized using optical character recognition. The authentication process is performed after a successful verification of the client identity by comparing the decrypted hashed password with the one stored in the database system. The system is analyzed and discussed from the attacker's viewpoint. A security comparison is performed to show that the proposed scheme provides essential security requirements, while its efficiency makes it easy to apply alone or in hybrid with other security methods. Computer simulation and statistical analysis are presented.

  19. Secure Order-Specified Multisignature Scheme Based on DSA

    Institute of Scientific and Technical Information of China (English)

    YANG Muxiang; SU Li; LI Jun; HONG Fan

    2006-01-01

    In multisignature schemes signers can sign either in a linear order or in no specified order, but neither option is adequate in scenarios that require a mixture of orderless and ordered multisignatures. Most order-specified multisignature schemes specify the orders as linear ones. In this paper, we propose an order-specified multisignature scheme based on DSA that is secure against active insider attack. To our knowledge, it is the first order-specified multisignature scheme based on the DSA signature scheme in which signers can sign in flexible orders represented by series-parallel graphs. In the multisignature scheme, verification of both the signers and the signing order is available. The security of the scheme is proved by reduction to an identification scheme with proven concrete security. The running time of verifying a signature is comparable to previous schemes, while the running time of multisignature generation and the space needed are less than in those schemes.

  20. Sound classification of dwellings in the Nordic countries – Differences and similarities between the five national schemes

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2012-01-01

    In all five Nordic countries, sound classification schemes for dwellings have been published in national standards that have been implemented and revised gradually since the late 1990s. The national classification criteria for dwellings originate from a common Nordic INSTA-B proposal from the 1990s, thus...... having several similarities. As of 2012, the number and denotations of classes for dwellings are identical in the Nordic countries, but the structures of the standards and several details are quite different. The issues dealt with also differ. Examples of differences are sound insulation...... for classification of such buildings. This paper presents and compares the main class criteria for sound insulation of dwellings and summarizes differences and similarities in the criteria and in the structures of the standards. Classification schemes for dwellings also exist in several other countries in Europe

  1. Harmonization of sound insulation descriptors and classification schemes in Europe: COST Action TU0901

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    countries and to reduce trade barriers. Most important is, however, that review of sound insulation requirements should be encouraged in several countries to adapt regulations to current construction trends and peoples' needs for health, wellbeing and comfort. Looking into the future, harmonization of sound...... insulation requirements seems unrealistic. However, by preparing a harmonized European classification scheme with a number of quality classes, member states could select a "harmonized" class fitting the national needs and conditions. A joint European Action, COST Action TU0901 "Integrating and Harmonizing......-in-Chief. Handbook of noise and vibration control, USA: Wiley and Son; 2007 [Ch. 114]. [4] COST Action TU0901 “Integrating and Harmonizing Sound Insulation Aspects in Sustainable Urban Housing Constructions”, 2009-2013, www.cost.eu/index.php?id=240&action_number=tu0901 (public information at COST website) or http...

  2. Blind Signature Scheme Based on Chebyshev Polynomials

    OpenAIRE

    Maheswara Rao Valluri

    2011-01-01

    A blind signature scheme is a cryptographic protocol for obtaining a valid signature on a message from a signer such that the signer's view of the protocol cannot be linked to the resulting message-signature pair. This paper presents a blind signature scheme using Chebyshev polynomials. The security of the given scheme depends upon the intractability of the integer factorization problem and of discrete logarithms of Chebyshev polynomials.

  3. Blind Signature Scheme Based on Chebyshev Polynomials

    Directory of Open Access Journals (Sweden)

    Maheswara Rao Valluri

    2011-12-01

    A blind signature scheme is a cryptographic protocol for obtaining a valid signature on a message from a signer such that the signer's view of the protocol cannot be linked to the resulting message-signature pair. This paper presents a blind signature scheme using Chebyshev polynomials. The security of the given scheme depends upon the intractability of the integer factorization problem and of discrete logarithms of Chebyshev polynomials.

  4. A multi-label image annotation scheme based on improved SVM multiple kernel learning

    Science.gov (United States)

    Jin, Cong; Jin, Shu-Wei

    2017-02-01

    Multi-label image annotation (MIA) has been widely studied in recent years and many MIA schemes have been proposed. However, most existing schemes are not satisfactory. In this paper, an improved multiple kernel learning (IMKL) method for support vector machines (SVM) is proposed to improve the classification accuracy of SVM; a novel MIA scheme based on IMKL is then presented, which uses the discriminant loss to control the number of top semantic labels, and a feature selection approach is also used to improve the performance of MIA. The experimental results show that the proposed MIA scheme achieves higher performance than other existing MIA schemes, and its performance is satisfactory for large image datasets.
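
    The multiple kernel idea can be illustrated with a fixed-weight combination of base kernels fed to an SVM; an MKL method such as the IMKL described here would learn the weights instead. The weights, kernels and data below are illustrative assumptions.

        import numpy as np
        from sklearn.metrics.pairwise import linear_kernel, rbf_kernel
        from sklearn.svm import SVC

        def combined_kernel(X1, X2, w=(0.7, 0.3), gamma=0.5):
            # Convex combination of two base kernels (weights fixed, not learned).
            return w[0] * rbf_kernel(X1, X2, gamma=gamma) + w[1] * linear_kernel(X1, X2)

        X, y = np.random.randn(60, 8), np.random.randint(0, 2, 60)
        clf = SVC(kernel="precomputed").fit(combined_kernel(X, X), y)
        pred = clf.predict(combined_kernel(X, X))  # test rows vs. training columns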

  5. A NEW SCHEME BASED ON THE MI SCHEME AND ITS ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Jiao Luyao; Li Yifa; Qiao Shuaiting

    2013-01-01

    This article aims at designing a new Multivariate Quadratic (MQ) public-key scheme to avoid the linearization attack and differential attack against the Matsumoto-Imai (MI) scheme. Based on the original scheme, our new scheme, named the Multi-layer MI (MMI) scheme, has a multi-layer central map structure. Firstly, this article introduces the MI scheme and describes the linearization attack and differential attack; it then presents the design of MMI in detail, and proves that MMI can resist both linearization attack and differential attack. Besides, this article also proves that MMI can resist recent eXtended Linearization (XL)-like methods. In the end, this article concludes that MMI also maintains the efficiency of MI.

  6. A new threshold signature scheme based on fuzzy biometric identity

    Institute of Scientific and Technical Information of China (English)

    Yongquan Cai; Ke Zhang

    2009-01-01

    The focus of this paper is to present the first threshold signature scheme based on biometric identity, which is derived from a recently proposed fuzzy identity-based encryption scheme. An important feature of this scheme, which differs from other previous ID-based threshold signature schemes, is that it can be applied to situations using not only average personal attributes in social contact but also people's noisy biometric inputs as identities. The security of our scheme in the selective-ID model reduces to the hardness of the Decisional BDH assumption.

  7. Clinical presentation and outcome prediction of clinical, serological, and histopathological classification schemes in ANCA-associated vasculitis with renal involvement.

    Science.gov (United States)

    Córdova-Sánchez, Bertha M; Mejía-Vilet, Juan M; Morales-Buenrostro, Luis E; Loyola-Rodríguez, Georgina; Uribe-Uribe, Norma O; Correa-Rotter, Ricardo

    2016-07-01

    Several classification schemes have been developed for anti-neutrophil cytoplasmic antibody (ANCA)-associated vasculitis (AAV), with current debate focusing on their clinical and prognostic performance. Sixty-two patients with renal biopsy-proven AAV from a single center in Mexico City diagnosed between 2004 and 2013 were analyzed and classified under clinical (granulomatosis with polyangiitis [GPA], microscopic polyangiitis [MPA], renal-limited vasculitis [RLV]), serological (proteinase 3 anti-neutrophil cytoplasmic antibodies [PR3-ANCA], myeloperoxidase anti-neutrophil cytoplasmic antibodies [MPO-ANCA], ANCA negative), and histopathological (focal, crescentic, mixed-type, sclerosing) categories. Clinical presentation parameters were compared at baseline between classification groups, and the predictive value of the different classification categories for disease and renal remission, relapse, and renal and patient survival was analyzed. Serological classification predicted relapse rate (PR3-ANCA hazard ratio for relapse 2.93, 1.20-7.17, p = 0.019). There were no differences in disease or renal remission, or in renal or patient survival, between clinical and serological categories. Histopathological classification predicted response to therapy, with a poorer renal remission rate for the sclerosing group and those with less than 25 % normal glomeruli; in addition, it adequately delimited 24-month estimated glomerular filtration rate (eGFR) evolution, but it did not predict renal or patient survival. In multivariate models, renal replacement therapy (RRT) requirement (HR 8.07, CI 1.75-37.4, p = 0.008) and proteinuria (HR 1.49, CI 1.03-2.14, p = 0.034) at presentation predicted renal survival, while age (HR 1.10, CI 1.01-1.21, p = 0.041) and infective events during the induction phase (HR 4.72, 1.01-22.1, p = 0.049) negatively influenced patient survival. At present, ANCA-based serological classification may predict AAV relapses, but neither clinical nor serological

  8. A NEW EFFICIENT ID-BASED PROXY BLIND SIGNATURE SCHEME

    Institute of Scientific and Technical Information of China (English)

    Ming Yang; Wang Yumin

    2008-01-01

    In a proxy blind signature scheme, the proxy signer is allowed to generate a blind signature on behalf of the original signer. Proxy blind signature schemes are useful in several applications such as e-voting and e-payment. Recently, Zheng, et al. presented an IDentity (ID)-based proxy blind signature. In this paper, a new efficient ID-based proxy blind signature scheme from bilinear pairings is proposed, which satisfies the security properties of both proxy signature and blind signature schemes. Efficiency analysis shows that the new scheme is more efficient than Zheng, et al.'s scheme, and therefore more practical in the real world.

  9. A quantum identification scheme based on polarization modulation

    Institute of Scientific and Technical Information of China (English)

    He Guang-Qiang; Zeng Gui-Hua

    2005-01-01

    A quantum identification scheme including registration and identification phases is proposed. The users' passwords are transmitted as qubit strings and recorded as sets of quantum operators. The security of the proposed scheme is guaranteed by the no-cloning theorem. Based on photon polarization modulation, an experimental approach is also designed to implement our proposed scheme.

  10. An Identity-Based Strong Designated Verifier Proxy Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    WANG Qin; CAO Zhenfu

    2006-01-01

    In a strong designated verifier proxy signature scheme, a proxy signer can generate proxy signature on behalf of an original signer, but only the designated verifier can verify the validity of the proxy signature. In this paper, we first define the security requirements for strong designated verifier proxy signature schemes. And then we construct an identity-based strong designated verifier proxy signature scheme. We argue that the proposed scheme satisfies all of the security requirements.

  11. Verifiable (t, n) Threshold Signature Scheme Based on Elliptic Curve

    Institute of Scientific and Technical Information of China (English)

    WANG Hua-qun; ZHAO Jun-xi; ZHANG Li-jun

    2005-01-01

    Based on the difficulty of solving the ECDLP (elliptic curve discrete logarithm problem) over a finite field, we present a (t, n) threshold signature scheme and a verifiable key agreement scheme without a trusted party. Applying a modified elliptic curve signature equation, we obtain a signature scheme that is more efficient than the existing ECDSA (elliptic curve digital signature algorithm) from the computability and security points of view. Our scheme has a shorter key, faster computation, and better security.

  12. A New Efficient Certificate-Based Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yichen; LI Jiguo; WANG Zhiwei; YAO Wei

    2015-01-01

    Certificate-based cryptography is a new kind of public key algorithm, which combines the merits of traditional Public key infrastructure (PKI) and identity-based cryptography. It removes the inherent key escrow problem of identity-based cryptography and eliminates the certificate revocation problem and third-party queries of traditional PKI. In this paper, we propose an efficient certificate-based signature scheme based on bilinear pairings. Under the strong security model of certificate-based signature schemes, we prove that our scheme is existentially unforgeable against adaptive chosen message and identity attacks in the random oracle model. In our scheme, only two pairing operations are needed in the signing and verification processes. Compared with other certificate-based signature schemes from bilinear pairings, our scheme enjoys advantages in computational cost and communication cost.

  13. Threshold Signature Scheme Based on Discrete Logarithm and Quadratic Residue

    Institute of Scientific and Technical Information of China (English)

    FEI Ru-chun; WANG Li-na

    2004-01-01

    The digital signature scheme is a very important research field in computer security and modern cryptography. A (k, n) threshold digital signature scheme is proposed by integrating a digital signature scheme with the Shamir secret sharing scheme. It can realize group-oriented digital signatures, and its security is based on the difficulty of computing discrete logarithms and quadratic residues under certain conditions. In this scheme, an effective digital signature cannot be generated by any k-1 or fewer legal users, or by the signature executive alone. In addition, this scheme can identify any legal user who presents an incorrect partial digital signature to disrupt the correct signature, or any illegal user who forges a digital signature. A method of extending this scheme to an Abelian group, such as an elliptic curve group, is also discussed. The extended scheme can provide faster computation and stronger security with a shorter key.
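
    The Shamir component can be illustrated on its own. The following minimal sketch of (k, n) secret sharing over a prime field omits the signature layer described above; the modulus and parameters are illustrative, and pow(x, -1, p) requires Python 3.8+.

        import random

        PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy demo

        def make_shares(secret, k, n, prime=PRIME):
            # Random degree-(k-1) polynomial with the secret as constant term.
            coeffs = [secret] + [random.randrange(prime) for _ in range(k - 1)]
            return [(x, sum(c * pow(x, e, prime) for e, c in enumerate(coeffs)) % prime)
                    for x in range(1, n + 1)]

        def recover(shares, prime=PRIME):
            # Lagrange interpolation at x = 0 recovers the constant term.
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num, den = 1, 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * (-xj) % prime
                        den = den * (xi - xj) % prime
                secret = (secret + yi * num * pow(den, -1, prime)) % prime
            return secret

        shares = make_shares(123456789, k=3, n=5)
        assert recover(shares[:3]) == 123456789  # any 3 of the 5 shares suffice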

  14. An improved identity-based proxy ring signature scheme

    Institute of Scientific and Technical Information of China (English)

    Lang Weimin; Yang Zongkai; Cheng Wenqing; Tan Yunmeng

    2005-01-01

    Proxy ring signature schemes have been shown to be useful in various applications, such as electronic polling and electronic payment. In this paper, we point out that Lang's scheme is unreasonable and propose an improved identity-based proxy ring signature scheme from bilinear pairings which is reasonable and overcomes the deficiencies of Lang's scheme. Our scheme can prevent the original signer from generating the proxy ring signature, thus the interests of the proxy signer are guaranteed. In addition, our scheme satisfies all the security requirements of a proxy ring signature, i.e., signer ambiguity, non-forgeability, verifiability, non-deniability and distinguishability. Compared with Zhang's scheme, our scheme improves the computational efficiency of signature verification, since the number of bilinear pairing computations required is reduced from O(n) to O(1).

  15. Ontology-Based Classification System Development Methodology

    OpenAIRE

    2015-01-01

    The aim of the article is to analyse and develop an ontology-based classification system methodology that uses decision tree learning with statement propositionalized attributes. Classical decision tree learning algorithms, as well as decision tree learning with taxonomy and propositionalized attributes have been observed. Thus, domain ontology can be extracted from the data sets and can be used for data classification with the help of a decision tree. The use of ontology methods in decision ...

  16. An Authentication Technique Based on Classification

    Institute of Scientific and Technical Information of China (English)

    李钢; 杨杰

    2004-01-01

    We present a novel watermarking approach based on classification for authentication, in which a watermark is embedded into the host image. When the marked image is modified, the extracted watermark differs from the original watermark, and different kinds of modification lead to different extracted watermarks. In this paper, different kinds of modification are treated as classes, and a classification algorithm is used to recognize the modifications with high probability. Simulation results show that the proposed method is promising and effective.

  17. Atmospheric circulation classification comparison based on wildfires in Portugal

    Science.gov (United States)

    Pereira, M. G.; Trigo, R. M.

    2009-04-01

    Atmospheric circulation classifications are not a simple description of atmospheric states but a tool to understand and interpret atmospheric processes and to model the relation between atmospheric circulation and surface climate and other related variables (Huth et al., 2008). Classifications were initially developed for weather forecasting purposes; however, with the progress in computer processing capability, new and more robust objective methods were developed and applied to large datasets, promoting atmospheric circulation classification into one of the most important fields in synoptic and statistical climatology. Classification studies have been extensively used in climate change studies (e.g. reconstructed past climates, recent observed changes and future climates), in bioclimatological research (e.g. relating human mortality to climatic factors) and in a wide variety of synoptic climatological applications (e.g. comparison between datasets, air pollution, snow avalanches, wine quality, fish captures and forest fires). Likewise, atmospheric circulation classifications are important for studying the role of weather in wildfire occurrence in Portugal, because daily synoptic variability is the most important driver of local weather conditions (Pereira et al., 2005). In particular, the objective classification scheme developed by Trigo and DaCamara (2000) to classify the atmospheric circulation affecting Portugal has proved to be quite useful in discriminating the occurrence and development of wildfires, as well as the distribution over Portugal of surface climatic variables with an impact on wildfire activity, such as maximum and minimum temperature and precipitation. This work aims to present: (i) an overview of the existing circulation classifications for the Iberian Peninsula, and (ii) the results of a comparison study between these atmospheric circulation classifications based on their relation to wildfires and relevant meteorological

  18. The Performance-based Funding Scheme of Universities

    Directory of Open Access Journals (Sweden)

    Juha KETTUNEN

    2016-05-01

    Full Text Available The purpose of this study is to analyse the effectiveness of the performance-based funding scheme of the Finnish universities that was adopted at the beginning of 2013. The political decision-makers expect that the funding scheme will create incentives for the universities to improve performance, but these funding schemes have largely failed in many other countries, primarily because public funding is only a small share of the total funding of universities. This study is interesting because Finnish universities have no tuition fees, unlike in many other countries, and the state allocates funding based on the objectives achieved. The empirical evidence of the graduation rates indicates that graduation rates increased when a new scheme was adopted, especially among male students, who have more room for improvement than female students. The new performance-based funding scheme allocates the funding according to the output-based indicators and limits the scope of strategic planning and the autonomy of the university. The performance-based funding scheme is transformed to the strategy map of the balanced scorecard. The new funding scheme steers universities in many respects but leaves the research and teaching skills to the discretion of the universities. The new scheme has also diminished the importance of the performance agreements between the university and the Ministry. The scheme increases the incentives for universities to improve the processes and structures in order to attain as much public funding as possible. It is optimal for the central administration of the university to allocate resources to faculties and other organisational units following the criteria of the performance-based funding scheme. The new funding scheme has made the universities compete with each other, because the total funding to the universities is allocated to each university according to the funding scheme. There is a tendency that the funding schemes are occasionally

  19. Sound insulation and reverberation time for classrooms - Criteria in regulations and classification schemes in the Nordic countries

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2016-01-01

    have become more extensive and stricter during the last two decades. The paper focuses on comparison of sound insulation and reverberation time criteria for classrooms in regulations and classification schemes in the Nordic countries. Limit values and changes over time will be discussed as well as how...

  20. Broadcast encryption schemes based on RSA

    Institute of Scientific and Technical Information of China (English)

    MU Ning-bo; HU Yu-pu; OU Hai-wen

    2009-01-01

    Three broadcast encryption schemes for small receiver sets, based on properties of the RSA modulus, are presented. They solve the problem of data redundancy when the receiver set is small. In the proposed schemes, the center uses one key to encrypt the message and can revoke authorization conveniently. Every authorized user only needs to store one decryption key of constant size. Among these three schemes, the first is secure in the sense of indistinguishability against adaptive chosen ciphertext attack (IND-CCA2), and no collusion of authorized users can produce a new decryption key, but the sizes of the encryption modulus and ciphertext are linear in the number of receivers. In the second scheme, the size of the ciphertext is half that of the first, and any two authorized users can produce a new decryption key, but the center can identify them using the traitor tracing algorithm. The third is the most efficient, but the center cannot identify the traitors exactly.

  1. Distance-based features in pattern classification

    Directory of Open Access Journals (Sweden)

    Lin Wei-Yang

    2011-01-01

    In data mining and pattern classification, feature extraction and representation methods are a very important step, since the extracted features have a direct and significant impact on classification accuracy. In the literature, a number of novel feature extraction and representation methods have been proposed. However, many of them only focus on specific domain problems. In this article, we introduce a novel distance-based feature extraction method for various pattern classification problems. Specifically, two distances are extracted, based on (1) the distance between the data and its intra-cluster center and (2) the distance between the data and its extra-cluster centers. Experiments based on ten datasets containing different numbers of classes, samples, and dimensions are examined. The experimental results using naïve Bayes, k-NN, and SVM classifiers show that concatenating the distance-based features to the original features provided by the datasets can improve classification accuracy, except on image-related datasets. In particular, the distance-based features are suitable for datasets which have smaller numbers of classes, smaller numbers of samples, and lower dimensionality of features. Moreover, two datasets which have similar characteristics were further used to validate this finding. The result is consistent with the first experiment, in that adding the distance-based features can improve the classification performance.
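
    A minimal sketch of the two distances on a training set follows, assuming the class means act as the intra- and extra-cluster centers; variable names are illustrative.

        import numpy as np

        def add_distance_features(X, y):
            # (1) distance to the sample's own-class center (intra-cluster) and
            # (2) distances to the other class centers (extra-cluster),
            # concatenated to the original features.
            classes = np.unique(y)
            centers = np.vstack([X[y == c].mean(axis=0) for c in classes])
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            idx = np.searchsorted(classes, y)
            intra = d[np.arange(len(X)), idx]
            extra = np.array([np.delete(row, i) for row, i in zip(d, idx)])
            return np.hstack([X, intra[:, None], extra])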

  2. Adaptive Image Transmission Scheme over Wavelet-Based OFDM System

    Institute of Scientific and Technical Information of China (English)

    GAO Xinying; YUAN Dongfeng; ZHANG Haixia

    2005-01-01

    In this paper an adaptive image transmission scheme is proposed over a Wavelet-based OFDM (WOFDM) system with Unequal error protection (UEP), achieved by the design of a non-uniform signal constellation in MLC. Two different data division schemes, byte-based and bit-based, are analyzed and compared. In the bit-based data division scheme, different bits are protected unequally according to their different contributions to the image quality, which makes UEP combined with this scheme more powerful than with the byte-based scheme. Simulation results demonstrate that image transmission by UEP with the bit-based data division scheme presents much higher PSNR values and noticeably better image quality. Furthermore, considering the tradeoff between complexity and BER performance, the Haar wavelet, with the shortest compactly supported filter length, is the most suitable among the orthogonal Daubechies wavelet series in our proposed system.

  3. A Proxy Blind Signature Scheme Based on ECDLP

    Institute of Scientific and Technical Information of China (English)

    WANG Haiyan; WANG Ruchuan

    2005-01-01

    While a proxy signature scheme enables an original signer to fully authorize a proxy to sign a message on his or her behalf legally and undeniably, a blind signature scheme keeps the message blind from the signer so that the signer cannot link the signature to the identity of the requester (receiver). Both schemes have been widely applied in electronic business. A new ECDLP (elliptic curve discrete logarithm problem)-based proxy blind signature scheme is proposed in this paper by integrating the security properties of both schemes.

  4. Quantum election scheme based on anonymous quantum key distribution

    Institute of Scientific and Technical Information of China (English)

    Zhou Rui-Rui; Yang Li

    2012-01-01

    An unconditionally secure authority-certified anonymous quantum key distribution scheme using conjugate coding is presented, based on which we construct a quantum election scheme without the help of an entangled state. We show that this election scheme ensures the completeness, soundness, privacy, eligibility, unreusability, fairness, and verifiability of a large-scale election in which the administrator and counter are semi-honest. This election scheme can work even if there are losses and errors in the quantum channels. In addition, any irregularity in this scheme is detectable.

  5. An Improved Scalar Costa Scheme Based on Watson Perceptual Model

    Institute of Scientific and Technical Information of China (English)

    QI Kai-yue; CHEN Jian-bo; ZHOU Yi

    2008-01-01

    An improved scalar Costa scheme (SCS) is proposed that uses an improved Watson perceptual model to adaptively decide the quantization step size and scaling factor. The improved scheme is equivalent to embedding the hidden data adaptively based on the actual image. In order to withstand amplitude scaling attacks, the Watson perceptual model was redefined, and with the new definition the improved scheme ensures that the quantization step size in the decoder is proportional to the amplitude scaling attack factor. The performance of the improved scheme outperforms that of SCS with a fixed quantization step size. The improved scheme combines information theory and a visual model.

  6. Third-order modified coefficient scheme based on essentially non-oscillatory scheme

    Institute of Scientific and Technical Information of China (English)

    LI Ming-jun; YANG Yu-yue; SHU Shi

    2008-01-01

    A third-order numerical scheme is presented to give approximate solutions to multi-dimensional hyperbolic conservation laws, using only modified coefficients of an essentially non-oscillatory (MCENO) scheme without increasing the base points during construction of the scheme. The construction process shows that the modified coefficient approach preserves the favourable properties inherent in the original essentially non-oscillatory (ENO) scheme, such as essential non-oscillation and total variation boundedness (TVB). The new scheme improves accuracy by one order compared to the original one. The proposed MCENO scheme is applied to simulate two-dimensional Rayleigh-Taylor (RT) instability with density ratios 1:3 and 1:100, and to solve the Lax shock-wave tube numerically. The ratio of CPU time used to implement the MCENO, third-order ENO and fifth-order weighted ENO (WENO) schemes is 0.62:1:2.19. This indicates that MCENO improves accuracy in smooth regions and has higher accuracy and better efficiency than the original ENO scheme.

  7. Design Scheme of Remote Monitoring System Based on Qt

    Directory of Open Access Journals (Sweden)

    Xu Dawei

    2015-01-01

    This paper introduces a design scheme for a remote monitoring system based on Qt. The system is built on the S3C2410 ARM platform and implemented with the aid of the cross-platform development framework Qt. The development of a remote video surveillance system based on an embedded terminal has practical significance and value.

  8. Classification Scheme for Random Longitudinal Road Unevenness Considering Road Waviness and Vehicle Response

    Directory of Open Access Journals (Sweden)

    Oldřich Kropáč

    2009-01-01

    A novel approach to road unevenness classification is presented, based on the power spectral density with consideration of the vehicle vibration response and a broad interval of road waviness (road elevation PSD slope). This approach enables transformation of the two basic parameters of the road profile elevation PSD (unevenness index, C, and waviness, w) into a single-number indicator Cw by using a correction factor Kw accounting for w. For the road classification proposal, two planar vehicle models (passenger car and truck), ten responses (reflecting ride comfort, dynamic load of road and cargo, and ride safety) and three different vehicle velocities have been considered. The minimum of the sum of the ten estimated vibration response ranges over a broad waviness interval typical for real road sections (w = 1.5 to 3.5) has been used for the correction factor estimation. The introduced unevenness indicator, Cw, reflects the vehicle vibration response and seems to be a suitable alternative to the other currently used single-number indicators, or an extension of the road classification according to ISO 8608:1995, which is based on a constant waviness value, w = 2.

  9. Classifying Obstructive and Nonobstructive Code Clones of Type I Using Simplified Classification Scheme: A Case Study

    Directory of Open Access Journals (Sweden)

    Miroslaw Staron

    2015-01-01

    Code cloning is a part of many commercial and open source development products. Multiple methods for detecting code clones have been developed, and finding clones is often used in modern quality assurance tools in industry. There is no consensus on whether the detected clones are negative for the product, and therefore the detected clones are often left unmanaged in the product code base. In this paper we investigate how obstructive code clones of Type I (duplicated exact code fragments) are in large software systems, from the perspective of the quality of the product after the release. We conduct a case study at Ericsson and three of its large products, which handle mobile data traffic. We show how to use automated analogy-based classification to decrease the classification effort required to determine whether a clone pair should be refactored or remain untouched. The automated method allows classifying 96% of Type I clones (both algorithms and data declarations), leaving the remaining 4% for manual classification. The results show that cloning is common in the studied commercial software, but that only 1% of these clones are potentially obstructive and can jeopardize the quality of the product if left unmanaged.

  10. An Encoder/Decoder Scheme of OCDMA Based on Waveguide

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A new encoder/decoder scheme for OCDMA based on waveguides is proposed in this paper. The principle as well as the structure of the waveguide encoder/decoder is given. It can be seen that an all-optical OCDMA encoder/decoder can be realized by the proposed waveguide encoder/decoder scheme. It also allows the OCDMA encoder/decoder to be integrated easily and access to be controlled easily. The system based on this scheme can work under entirely asynchronous conditions.

  11. A Dynamic ID-based Remote User Authentication Scheme

    CERN Document Server

    Das, Manik Lal; Gulati, Ved P

    2004-01-01

    Password-based authentication schemes are the most widely used techniques for remote user authentication. Many static ID-based remote user authentication schemes, both with and without smart cards, have been proposed. Most of these schemes do not allow the users to choose and change their passwords, and they maintain a verifier table to check the validity of user logins. In this paper we present a dynamic ID-based remote user authentication scheme using smart cards. Our scheme allows users to choose and change their passwords freely, and does not maintain any verifier table. The scheme is secure against ID theft, and can resist replay attacks, forgery attacks, guessing attacks, insider attacks and stolen-verifier attacks.

  12. Evaluation of a 5-tier scheme proposed for classification of sequence variants using bioinformatic and splicing assay data

    DEFF Research Database (Denmark)

    Walker, Logan C; Whiley, Phillip J; Houdayer, Claude;

    2013-01-01

    of results, and the lack of quantitative data for the aberrant transcripts. We propose suggestions for minimum reporting guidelines for splicing assays, and improvements to the 5-tier splicing classification system to allow future evaluation of its performance as a clinical tool.......Splicing assays are commonly undertaken in the clinical setting to assess the clinical relevance of sequence variants in disease predisposition genes. A 5-tier classification system incorporating both bioinformatic and splicing assay information was previously proposed as a method to provide...... consistent clinical classification of such variants. Members of the ENIGMA Consortium Splicing Working Group undertook a study to assess the applicability of the scheme to published assay results, and the consistency of classifications across multiple reviewers. Splicing assay data were identified for 235...

  13. A Digital Signature Scheme Based on MST3 Cryptosystems

    Directory of Open Access Journals (Sweden)

    Haibo Hong

    2014-01-01

    As special types of factorization of finite groups, logarithmic signatures and covers have been used as the main components of cryptographic keys for secret key cryptosystems such as PGM and public key cryptosystems like MST1, MST2, and MST3. Recently, Svaba et al. proposed a revised MST3 encryption scheme with greater security. Meanwhile, they put forward the idea of constructing signature schemes on the basis of logarithmic signatures and random covers. In this paper, we first design a secure digital signature scheme based on logarithmic signatures and random covers. In order to complete the task, we devise a new encryption scheme based on MST3 cryptosystems.

  14. Improved ID-Based Signature Scheme Solving Key Escrow

    Institute of Scientific and Technical Information of China (English)

    LIAO Jian; QI Ying-hao; HUANG Pei-wei; RONG Men-tian; LI Sheng-hong

    2006-01-01

    Key escrow is an inherent disadvantage of traditional ID-based cryptosystems, i.e., the dishonest private key generator (PKG) can forge the signature of any user, while the user can deny a signature actually signed by him/herself. To avoid the key escrow problem, an ID-based signature scheme is presented without a trusted PKG. An exact proof of security is presented to demonstrate that our scheme is secure against existential forgery under adaptively chosen message and ID attacks, assuming the complexity of the computational Diffie-Hellman (CDH) problem. Compared with other signature schemes, the proposed scheme is more efficient.

  15. Task Classification Based Energy-Aware Consolidation in Clouds

    Directory of Open Access Journals (Sweden)

    HeeSeok Choi

    2016-01-01

    We consider a cloud data center in which the service provider supplies virtual machines (VMs) on hosts or physical machines (PMs) to its subscribers for computation in an on-demand fashion. For the cloud data center, we propose a task consolidation algorithm based on task classification (i.e., computation-intensive and data-intensive) and resource utilization (e.g., CPU and RAM). Furthermore, we design a VM consolidation algorithm to balance task execution time and energy consumption without violating a predefined service level agreement (SLA). Unlike existing research on VM consolidation or scheduling that applies no threshold or a single-threshold scheme, we focus on a double-threshold (upper and lower) scheme for VM consolidation. More specifically, when a host operates with resource utilization below the lower threshold, all the VMs on the host are scheduled to be migrated to other hosts and the host is then powered down, while when a host operates with resource utilization above the upper threshold, a VM is migrated to avoid using 100% of the resource. Based on experimental performance evaluations with real-world traces, we show that our task classification based energy-aware consolidation algorithm (TCEA) achieves a significant energy reduction without incurring predefined SLA violations.
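
    A minimal sketch of the double-threshold decision rule follows; the threshold values and data layout are illustrative assumptions, not the paper's parameters.

        def plan_migrations(hosts, lower=0.2, upper=0.8):
            # hosts: dict mapping host id -> list of VM utilization fractions.
            actions = []
            for host, vms in hosts.items():
                util = sum(vms)
                if util < lower:
                    # Under-utilized: migrate all VMs away, then power down.
                    actions.append(("drain_and_power_down", host, list(vms)))
                elif util > upper:
                    # Over-utilized: migrate one VM (here the largest) away.
                    actions.append(("migrate_one_vm", host, max(vms)))
            return actions

        print(plan_migrations({"h1": [0.05, 0.1], "h2": [0.5, 0.45], "h3": [0.3]}))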

  16. CONTROL SCHEMES FOR CMAC NEURAL NETWORK-BASED VISUAL SERVOING

    Institute of Scientific and Technical Information of China (English)

    Wang Huaming; Xi Wenming; Zhu Jianying

    2003-01-01

    In IBVS (image-based visual servoing), the error signal in image space should be transformed into a control signal in the input space quickly. To avoid the iterative adjustment and complicated inverse solution of the image Jacobian, a CMAC (cerebellar model articulation controller) neural network is inserted into the visual servo control loop to implement the nonlinear mapping. Two control schemes are used. Simulation results on the two schemes are provided, which show that better tracking precision and stability can be achieved using scheme 2.

  17. Comprehensive Evaluation of Car-Body Light-Weighting Scheme Based on LCC Theory

    Directory of Open Access Journals (Sweden)

    Han Qing-lan

    2016-01-01

    In this paper, a comprehensive evaluation model for light-weighting schemes is established, based on three dimensions: the life cycle costs of the resources consumed by the designed object (LCC), willingness to pay for the environmental effect of resource consumption (WTP), and performance (P). First, the cost of each stage is determined. Then, based on the resource classification derived from the cost elements, the list of materials needed is determined, and the WTP weight coefficient is applied to monetize the life cycle environmental impact and obtain the life cycle comprehensive cost of the designed scheme (TCC). In the next step, the performance (P) index is calculated to measure the value of the life cycle costs by applying the AHP and SAW methods, and (TCC) and (P) are integrated to achieve a comprehensive evaluation of the light-weighting scheme. Finally, the effectiveness of the evaluation model is verified with the example of a car engine hood.

  18. The "chessboard" classification scheme of mineral deposits: Mineralogy and geology from aluminum to zirconium

    Science.gov (United States)

    Dill, Harald G.

    2010-06-01

    Economic geology is a mixtum compositum of all geoscientific disciplines focused on one goal: finding new mineral deposits and enhancing their exploitation. The keystones of this mixtum compositum are geology and mineralogy, whose studies are centered around the emplacement of the ore body and the development of its minerals and rocks. In the present study, mineralogy and geology act as the x- and y-coordinates of a classification chart of mineral resources called the "chessboard" (or "spreadsheet") classification scheme. Magmatic and sedimentary lithologies together with tectonic structures (1-D/pipes, 2-D/veins) are plotted along the x-axis in the header of the spreadsheet diagram, representing the columns of this chart. 63 commodity groups, encompassing minerals and elements, are plotted along the y-axis, forming the rows of the spreadsheet. These commodities are subjected to a tripartite subdivision into ore minerals, industrial minerals/rocks, and gemstones/ornamental stones. Further information on the various types of mineral deposits, such as the major ore and gangue minerals, the current models and the mode of formation, or when and in which geodynamic setting these deposits mainly formed throughout the geological past, may be obtained from the text by simply using the code of each deposit in the chart. This code can be created by combining the commodity (rows), shown by numbers plus lower-case letters, with the host rock or structure (columns), given by capital letters. Each commodity has a short preface on its mineralogy and chemistry and ends with an outlook on its final use and the supply situation of the raw material on a global basis, which may be updated by the user through a direct link to databases available on the internet. In this case the study has been linked to the commodity database of the US Geological Survey. The internal subdivision of each commodity section corresponds to the common host rock lithologies (magmatic, sedimentary, and

  19. Efficient Identity Based Public Verifiable Signcryption Scheme

    OpenAIRE

    Kushwah, Prashant; Lal, Sunder

    2011-01-01

    Signcryption is a cryptographic primitive which performs encryption and signature in a single logical step. In conventional signcryption, only the receiver of the signcrypted text can verify the authenticity of the origin, i.e. the signature of the sender on the message, after decrypting the cipher text. In a publicly verifiable signcryption scheme, anyone who can access the signcrypted text can verify the authenticity of the origin, i.e. the signature of the sender on the cipher text. Public verifiable signcrypt...

  20. Cost Based Droop Schemes for Economic Dispatch in Islanded Microgrids

    DEFF Research Database (Denmark)

    Chen, Feixiong; Chen, Minyou; Li, Qiang

    2017-01-01

    In this paper, cost based droop schemes are proposed to minimize the total active power generation cost in an islanded microgrid (MG), while the simplicity and decentralized nature of droop control are retained. In cost based droop schemes, the incremental costs of distributed generators (DGs......) are embedded into the droop schemes, where the incremental cost is the derivative of the DG cost function with respect to output power. In the steady state, DGs share a single common frequency, and the cost based droop schemes equalize the incremental costs of the DGs, thus minimizing the total active power generation cost......, in terms of the equal incremental cost principle. Finally, simulation results in an islanded MG with high penetration of intermittent renewable energy sources are presented to demonstrate the effectiveness, as well as the plug and play capability, of the cost based droop schemes....
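
    The embedding of incremental cost into the droop characteristic can be sketched for a single DG with a quadratic cost function; the coefficients and droop gain below are illustrative assumptions.

        # Conventional droop: f = f0 - m * P. A cost based droop instead uses
        # f = f0 - k * IC(P), so a common steady-state frequency equalizes the
        # incremental costs IC across DGs.

        def incremental_cost(p, a=0.02, b=1.5):
            # For a quadratic cost C(P) = a*P^2 + b*P + c, IC(P) = dC/dP = 2aP + b.
            return 2 * a * p + b

        def droop_frequency(p, f0=50.0, k=0.01):
            return f0 - k * incremental_cost(p)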

  1. Texture Image Classification Based on Gabor Wavelet

    Institute of Scientific and Technical Information of China (English)

    DENG Wei-bing; LI Hai-fei; SHI Ya-li; YANG Xiao-hui

    2014-01-01

    For a texture image, by recognizing the class of every pixel, the image can be partitioned into disjoint regions of uniform texture. This paper proposes a texture image classification algorithm based on Gabor wavelets. In this algorithm, the features of every pixel of the image are obtained from the pixel and its neighborhood, and the algorithm can transfer information between neighborhoods of different sizes. Experiments on the standard Brodatz texture image dataset show that the proposed algorithm can achieve good classification rates.
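
    Per-pixel Gabor features of the kind described here can be sketched with a standard filter bank; the frequencies and orientations below are illustrative, not the paper's parameters.

        import numpy as np
        from skimage import data
        from skimage.filters import gabor

        image = data.brick().astype(float)  # a standard texture test image
        features = []
        for frequency in (0.1, 0.2, 0.3):
            for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
                real, imag = gabor(image, frequency=frequency, theta=theta)
                features.append(np.hypot(real, imag))  # magnitude response
        feature_stack = np.stack(features, axis=-1)    # (H, W, 12) per pixel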

  2. Density Based Support Vector Machines for Classification

    Directory of Open Access Journals (Sweden)

    Zahra Nazari

    2015-04-01

    Support Vector Machines (SVM) are among the most successful algorithms for classification problems. SVM learns the decision boundary from the training points of two classes (for binary classification). However, sometimes there are less meaningful samples among the training points, which are corrupted by noise or misplaced on the wrong side; these are called outliers. Outliers affect the margin and the classification performance, and the classifier should preferably discard them. SVM, as a popular and widely used classification algorithm, is very sensitive to these outliers and lacks the ability to discard them. Many research results demonstrate this sensitivity, which is a weak point of SVM. Different approaches have been proposed to reduce the effect of outliers, but no method is suitable for all types of data sets. In this paper, the new method of Density Based SVM (DBSVM) is introduced. Population density is the basic concept used in this method, for both linear and non-linear SVM, to detect outliers. Experiments on artificial data sets, real high-dimensional benchmark data sets of liver disorder and heart disease, and data sets of new and fatigued banknotes' acoustic signals prove the efficiency of this method for noisy data classification and the better generalization it can provide compared to the standard SVM.
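
    One way to operationalize density-based outlier removal before SVM training is sketched below; the k-NN density estimate and the keep fraction are illustrative assumptions, not necessarily the DBSVM formulation.

        import numpy as np
        from sklearn.neighbors import NearestNeighbors
        from sklearn.svm import SVC

        def fit_density_filtered_svm(X, y, k=5, keep=0.9):
            # Local density = inverse of mean distance to the k nearest neighbors;
            # drop the sparsest (1 - keep) fraction as outliers, then fit an SVM.
            nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
            dist, _ = nn.kneighbors(X)               # column 0 is the point itself
            density = 1.0 / (dist[:, 1:].mean(axis=1) + 1e-12)
            mask = density >= np.quantile(density, 1 - keep)
            return SVC(kernel="rbf").fit(X[mask], y[mask])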

  3. Classification of Base Sequences BS(n+1, n)

    Directory of Open Access Journals (Sweden)

    Dragomir Ž. Ðoković

    2010-01-01

    Base sequences BS(n+1, n) are quadruples of {±1}-sequences (A; B; C; D), with A and B of length n+1 and C and D of length n, such that the sum of their nonperiodic autocorrelation functions is a δ-function. The base sequence conjecture, asserting that BS(n+1, n) exist for all n, is stronger than the famous Hadamard matrix conjecture. We introduce a new definition of equivalence for base sequences BS(n+1, n) and construct a canonical form. By using this canonical form, we have enumerated the equivalence classes of BS(n+1, n) for n ≤ 30. As the number of equivalence classes grows rapidly (but not monotonically) with n, the tables in the paper cover only the cases n ≤ 13.

  4. MIXED SCHEME FOR IMAGE EDGE DETECTION BASED ON WAVELET TRANSFORM

    Institute of Scientific and Technical Information of China (English)

    Xie Hongmei; Yu Bianzhang; Zhao Jian

    2004-01-01

    A mixed scheme based on the Wavelet Transform (WT) is proposed for image edge detection. The scheme combines the wavelet transform with the traditional Sobel and LoG (Laplacian of Gaussian) edge-detection operators. A precise theoretical analysis is given to show that the wavelet transform has an advantage for signal processing. Simulation results show that the new scheme is better than using only the Sobel or LoG methods. A complexity analysis is also given and the conclusion is acceptable; therefore the proposed scheme is effective for edge detection.

  5. A novel chaotic encryption scheme based on pseudorandom bit padding

    CERN Document Server

    Sadra, Yaser; Fard, Zahra Arasteh

    2012-01-01

    Cryptography is very important for data origin authentication, entity authentication, data integrity and confidentiality. In recent years, a variety of chaotic cryptographic schemes have been proposed. These schemes have a typical structure which performs the permutation and the diffusion stages alternately. Random number generators are essential in cryptographic schemes and are used in the diffusion functions of image encryption to diffuse the pixels of the plain image. In this paper, we propose a chaotic encryption scheme based on pseudorandom bit padding, in which the bits are generated by a novel logistic pseudorandom image algorithm. To evaluate the security of the cipher image of this scheme, key space analysis, the correlation of two adjacent pixels, and a differential attack were performed. This scheme attempts to improve on weaknesses of earlier encryption schemes such as small key space and low level of security.
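
    A logistic-map bit generator of the kind such schemes rely on can be sketched in a few lines; the threshold binarization and parameter values are illustrative, not the paper's exact algorithm.

        def logistic_bits(seed, n, r=3.99):
            # Iterate x -> r*x*(1-x) in the chaotic regime and binarize by threshold.
            x, bits = seed, []
            for _ in range(n):
                x = r * x * (1 - x)
                bits.append(1 if x >= 0.5 else 0)
            return bits

        pad = logistic_bits(0.41, 128)  # 128 padding bits from seed 0.41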

  6. A new structure-based classification of gram-positive bacteriocins.

    Science.gov (United States)

    Zouhir, Abdelmajid; Hammami, Riadh; Fliss, Ismail; Hamida, Jeannette Ben

    2010-08-01

    Bacteriocins are ribosomally synthesized peptides or proteins produced by a wide range of bacteria. The antimicrobial activity of this group of natural substances against foodborne pathogenic and spoilage bacteria has raised considerable interest in their application to food preservation. Classifying these bacteriocins into well defined classes according to their biochemical properties is a major step towards characterizing these anti-infective peptides and understanding their mode of action. Currently, the chosen criteria for bacteriocin classification lack consistency and coherence, so the various classification schemes have resulted in various levels of contradiction and sorting inefficiencies, leading to bacteriocins belonging to more than one class at the same time and to a general lack of classification of many bacteriocins. Establishing a coherent and adequate classification scheme for these bacteriocins is sought after by several researchers in the field, but it is not straightforward to formulate an efficient classification scheme that encompasses all of the existing bacteriocins. In the light of the structural data, here we revisit the previously proposed contradictory classifications and define new structure-based sequence fingerprints that support a subdivision of the bacteriocins into 12 groups. The paper lays down a resourceful and consistent classification approach that classifies more than 70% of bacteriocins known to date, with the potential to identify distinct classes for the remaining unclassified bacteriocins. The identified groups are characterized by the presence of highly conserved short amino acid motifs. Furthermore, the unclassified bacteriocins are expected to form an identifiable group when sufficient sequences become available.

  7. Comparison of two SVD-based color image compression schemes.

    Science.gov (United States)

    Li, Ying; Wei, Musheng; Zhang, Fengxia; Zhao, Jianli

    2017-01-01

    Color image compression is a commonly used process to represent image data with as few bits as possible, removing redundancy in the data while maintaining an appropriate level of quality for the user. Color image compression algorithms based on quaternions have become common in recent years. In this paper, we propose a color image compression scheme based on the real SVD, named the real compression scheme. First, we form a new real rectangular matrix C according to the red, green and blue components of the original color image and perform the real SVD on C. Then we select the several largest singular values and the corresponding vectors in the left and right unitary matrices to compress the color image. We compare the real compression scheme with a quaternion compression scheme that performs the quaternion SVD using the real structure-preserving algorithm. We compare the two schemes in terms of operation count, number of assignments, operation speed, PSNR and CR. The experimental results show that with the same number of selected singular values, the real compression scheme offers a higher CR and much less operation time, but a slightly lower PSNR than the quaternion compression scheme. When the two schemes have the same CR, the real compression scheme shows more prominent advantages in both operation time and PSNR.
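    A minimal NumPy sketch of the real-SVD idea: stack the three colour planes into one real matrix, truncate its SVD, and unstack. How the paper actually forms the matrix C may differ from this channel stacking.

```python
import numpy as np

def svd_compress_rgb(img: np.ndarray, k: int) -> np.ndarray:
    """Rank-k compression of an H x W x 3 image via one real SVD
    (a sketch of the 'real compression scheme' idea)."""
    h, w, _ = img.shape
    c = img.astype(float).transpose(2, 0, 1).reshape(3 * h, w)  # stack channels
    u, s, vt = np.linalg.svd(c, full_matrices=False)
    approx = (u[:, :k] * s[:k]) @ vt[:k, :]                     # rank-k truncation
    return approx.reshape(3, h, w).transpose(1, 2, 0).clip(0, 255)

# Storage cost drops from 3*h*w values to k*(3*h + w + 1),
# which is where the compression ratio (CR) comes from.
```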

  8. Gemstones and geosciences in space and time. Digital maps to the "Chessboard classification scheme of mineral deposits"

    Science.gov (United States)

    Dill, Harald G.; Weber, Berthold

    2013-12-01

    The gemstones, covering the spectrum from jeweler's to showcase quality, are presented in a tripartite subdivision (by country, geology and geomorphology) realized in 99 digital maps with more than 2600 mineralized sites. The various maps were designed based on the "Chessboard classification scheme of mineral deposits" proposed by Dill (2010a, 2010b) to reveal the interrelations between gemstone deposits and mineral deposits of other commodities and to direct our thoughts to potential new target areas for exploration. Thirty-three categories were used for these digital maps: chromium, nickel, titanium, iron, manganese, copper, tin-tungsten, beryllium, lithium, zinc, calcium, boron, fluorine, strontium, phosphorus, zirconium, silica, feldspar, feldspathoids, zeolite, amphibole (tiger's eye), olivine, pyroxenoid, garnet, epidote, sillimanite-andalusite, corundum-spinel - diaspore, diamond, vermiculite-pagodite, prehnite, sepiolite, jet, and amber. Besides the political base map (gems by country), the mineral deposits are drawn on a geological map illustrating the main lithologies, stratigraphic units and tectonic structure, to unravel the evolution of primary gemstone deposits in time and space. The geomorphological map shows the control of climate and subaerial and submarine hydrography on the deposition of secondary gemstone deposits. The digital maps are designed to be plotted on paper at different scales, to be upgraded for interactive use, and to be linked to gemological databases.

  9. A FRACTAL-BASED STOCHASTIC INTERPOLATION SCHEME IN SUBSURFACE HYDROLOGY

    Science.gov (United States)

    The need for a realistic and rational method for interpolating sparse data sets is widespread. Real porosity and hydraulic conductivity data do not vary smoothly over space, so an interpolation scheme that preserves irregularity is desirable. Such a scheme based on the properties...
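    The abstract is truncated, but the roughness-preserving idea it alludes to can be illustrated with a generic fractional-Brownian-motion style midpoint-displacement interpolant. The Hurst-like exponent and noise scale below are assumptions for illustration, not the scheme itself.

```python
import numpy as np

def midpoint_displacement(y0, y1, levels=6, h=0.8, sigma=1.0, rng=None):
    """Interpolate between two measurements while preserving roughness:
    classic random midpoint displacement, where the displacement variance
    decays with scale to produce fractal (non-smooth) behaviour."""
    rng = np.random.default_rng(rng)
    pts = np.array([y0, y1], dtype=float)
    scale = sigma
    for _ in range(levels):
        mids = (pts[:-1] + pts[1:]) / 2.0 + rng.normal(0.0, scale, len(pts) - 1)
        out = np.empty(2 * len(pts) - 1)
        out[0::2], out[1::2] = pts, mids
        pts = out
        scale *= 2.0 ** (-h)   # smaller displacements at finer scales
    return pts
```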

  10. PILOT-BASED FREQUENCY OFFSET DETECTION SCHEME IN OFDM SYSTEM

    Institute of Scientific and Technical Information of China (English)

    Du Zheng; Zhu Jinkang

    2003-01-01

    The frequency offset information is extracted from local pilot amplitude characteristics, which suffer much less distortion in frequency-selective fading channels than frequency-domain correlation techniques do. Simulations show that this scheme performs better than the existing frequency-domain pilot-based frequency offset detection scheme.

  11. Ciphertext-Policy Attribute-Based Broadcast Encryption Scheme

    NARCIS (Netherlands)

    Asim, Muhammad; Ibraimi, Luan; Petkovic, Milan

    2011-01-01

    In this work, we design a new attribute-based encryption scheme with revocation capability. In the proposed scheme, the user (broadcaster) encrypts the data according to an access policy over the set of attributes and a list of the identities of revoked users. Only recipients whose attributes satisfy the access policy and who are not on the revocation list can decrypt the data.

  12. Developing Land Surface Type Map with Biome Classification Scheme Using Suomi NPP/JPSS VIIRS Data

    Science.gov (United States)

    Zhang, Rui; Huang, Chengquan; Zhan, Xiwu; Jin, Huiran

    2016-08-01

    Accurate representation of actual terrestrial surface types at regional to global scales is an important element for a wide range of applications, such as land surface parameterization, modeling of biogeochemical cycles, and carbon cycle studies. In this study, to support the retrieval of global leaf area index (LAI) and the fraction of photosynthetically active radiation absorbed by vegetation (fPAR), among other applications, a global map was created from Suomi National Polar-orbiting Partnership (S-NPP) Visible Infrared Imaging Radiometer Suite (VIIRS) surface reflectance data in six major biome classes defined by canopy structure: Grass/Cereal Crops, Shrubs, Broadleaf Crops, Savannas, Broadleaf Forests, and Needleleaf Forests. The primary biome classes were converted from an International Geosphere-Biosphere Programme (IGBP) legend global surface type dataset created in a previous study, and the separation of the two crop types is based on a secondary classification.

  13. An Agent Based Classification Model

    CERN Document Server

    Gu, Feng; Greensmith, Julie

    2009-01-01

    The major function of this model is to access the UCI Wisconsin Breast Cancer data-set [1] and classify the data items into two categories, normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, principles and models, which are applied to problem solving. The Dendritic Cell Algorithm (DCA) [2] is an AIS algorithm developed specifically for anomaly detection. It has been successfully applied to intrusion detection in computer security. It is believed that agent-based modelling is an ideal approach for implementing AIS, as intelligent agents could be the perfect representations of immune entities in AIS. This model evaluates the feasibility of re-implementing the DCA in an agent-based simulation environment...

  14. Identity-based threshold signature and mediated proxy signature schemes

    Institute of Scientific and Technical Information of China (English)

    YU Yong; YANG Bo; SUN Ying

    2007-01-01

    Proxy signature schemes allow an original signer to delegate his signing rights to a proxy signer. However, many proxy signature schemes cannot solve the proxy revocation problem. In this article, we first propose an identity-based threshold signature scheme and show that it has the properties of unforgeability and robustness. In our threshold signature scheme, the private key associated with an identity, rather than the master key, is shared. Then, based on the threshold signature scheme, an identity-based mediated proxy signature scheme is proposed in which a security mediator (SEM) is introduced to help a proxy signer generate valid proxy signatures, examine whether a proxy signer signs according to the warrant, and check the revocation of a proxy signer. It is shown that the proposed scheme satisfies all the security requirements of a secure proxy signature. Moreover, a proxy signer must cooperate with the SEM to generate a valid proxy signature, which gives the new scheme effective and fast proxy revocation.

  15. Image-based Vehicle Classification System

    CERN Document Server

    Ng, Jun Yee

    2012-01-01

    Electronic toll collection (ETC) systems have become a common means of toll collection on toll roads. Electronic toll collection allows vehicles to pay while travelling at low or full speed, which helps to avoid traffic delays at toll plazas. One of the major components of an electronic toll collection system is the automatic vehicle detection and classification (AVDC) system, which classifies the vehicle so that the toll is charged according to vehicle class. A vision-based vehicle classification system is one type of vehicle classification system; it adopts a camera as the input sensing device. This type of system has a cost advantage over the rest, as low-cost cameras can be used. Implementing a vision-based vehicle classification system requires a lower initial investment and is well suited to the toll collection trend in Malaysia of migrating from single ETC systems to full-scale multi-lane free flow (MLFF). This project ...

  16. Correlation technique and least square support vector machine combine for frequency domain based ECG beat classification.

    Science.gov (United States)

    Dutta, Saibal; Chatterjee, Amitava; Munshi, Sugata

    2010-12-01

    The present work proposes an automated medical diagnostic tool that classifies ECG beats. This is an important problem, as accurate and timely detection of cardiac arrhythmia can help provide proper medical attention to cure or reduce the ailment. The proposed scheme utilizes a cross-correlation based approach in which cross-spectral density information in the frequency domain is used to extract suitable features. A least squares support vector machine (LS-SVM) classifier is developed using these features to classify ECG beats into three categories: normal beats, PVC beats and other beats. The three-class classification scheme is trained on a small dataset and tested on a very large dataset to show the generalization capability of the scheme. When employed on 40 files from the MIT/BIH arrhythmia database, the scheme produced high classification accuracy in the range of 95.51-96.12% and outperformed several competing algorithms.
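    A hedged sketch of the pipeline: cross-spectral density magnitudes as frequency-domain features, fed to a kernel SVM. Here scipy.signal.csd and scikit-learn's SVC stand in for the paper's exact feature definition and its LS-SVM; the beats, reference template and labels are assumed given.

```python
import numpy as np
from scipy.signal import csd
from sklearn.svm import SVC

def csd_features(beat: np.ndarray, template: np.ndarray, fs: float = 360.0):
    """Cross-spectral density magnitude between a beat and a reference
    template, used as a frequency-domain feature vector (sketch)."""
    _, pxy = csd(beat, template, fs=fs, nperseg=64)
    return np.abs(pxy)

# Hypothetical training flow (beats, template, labels assumed given):
# X = np.array([csd_features(b, template) for b in beats])
# clf = SVC(kernel="rbf").fit(X, labels)   # standard SVM as LS-SVM stand-in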

  17. An Identity-Based Encryption Scheme with Compact Ciphertexts

    Institute of Scientific and Technical Information of China (English)

    LIU Sheng-li; GUO Bao-an; ZHANG Qing-sheng

    2009-01-01

    This paper proposes an identity-based encryption scheme with the help of bilinear pairings, where the identity information of a user functions as the user's public key. The advantage of an identity-based public key system is that it can avoid public key certificates and certificate management. Our identity-based encryption scheme enjoys short ciphertexts and provable security against chosen-ciphertext attack (CCA).

  18. A State-Based Dynamic Location Management Scheme

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    We consider a state-based dynamic location management scheme in which users are partitioned into different mobility state sets, and the location area size changes dynamically according to the state set a user belongs to. Compared with the fixed LA scheme, numerical experiments show that performance can be improved by 30% while the current location update and paging procedures still apply. Moreover, as this scheme need not process complicated user information, the required computing power is significantly lower than in user-based schemes. The scheme can be used in current 2G mobile systems (such as GSM and CDMA) and in third-generation (3G) mobile systems with slight modifications to the equipment software.

  19. Improved RB-HARQ scheme based on structured LDPC codes

    Institute of Scientific and Technical Information of China (English)

    WANG Wen-jun; LIN Yue-wei; YAN Yuan

    2007-01-01

    Reliability-based hybrid automatic repeat request (ARQ) (RB-HARQ) is a recently introduced approach to incremental-redundancy ARQ. In RB-HARQ schemes, the bits to be retransmitted are adaptively selected at the receiver based on estimated bit reliability. This can yield significant performance gains but requires huge overhead on the feedback channel. In this study, an improved RB-HARQ scheme (IRB-HARQ) for structured low-density parity-check codes is proposed. It simplifies the comparison operations needed to find the bits to be retransmitted and outperforms the RB-HARQ scheme when the bit transmission power for the request messages on the feedback link is taken into account. Simulation results show that the IRB-HARQ scheme is more efficient and practical than the RB-HARQ scheme.

  20. Cost-based droop scheme for DC microgrid

    DEFF Research Database (Denmark)

    Nutkani, Inam Ullah; Wang, Peng; Loh, Poh Chiang

    2014-01-01

    DC microgrids are gaining interest due to the higher efficiency of DC distribution compared with AC. The benefits of DC systems have been widely researched for data centers, IT facilities and residential applications. The research focus, however, has been more on system architecture and optimal voltage level, and less on optimized operation and control of generation sources. The latter theme is pursued in this paper, where a cost-based droop scheme is proposed for distributed generators (DGs) in DC microgrids. Unlike the traditional proportional power sharing based droop scheme, the proposed scheme … -connected operation. Most importantly, the proposed scheme can reduce the overall generation cost in DC microgrids without a centralized controller or communication links. The performance of the proposed scheme has been verified under different load conditions.
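    The core idea admits a one-line illustration: make the droop gain grow with the source's generation cost, so that cheaper DGs naturally pick up more load without any communication. The gain scaling below is an assumption for illustration, not the paper's design.

```python
def droop_voltage(v_ref: float, p_out: float, cost_per_kw: float,
                  k0: float = 0.05) -> float:
    """Cost-based droop (sketch): a higher marginal generation cost gives
    a steeper droop gain, shifting load towards cheaper generators."""
    k = k0 * cost_per_kw      # droop gain scaled by generation cost (assumed form)
    return v_ref - k * p_out  # classic V-P droop characteristic
```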

  1. The method of narrow-band audio classification based on universal noise background model

    Science.gov (United States)

    Rui, Rui; Bao, Chang-chun

    2013-03-01

    Audio classification is the basis of content-based audio analysis and retrieval. Conventional classification methods mainly depend on feature extraction over whole audio clips, which considerably increases the time required for classification. An approach for classifying narrow-band audio streams based on frame-level feature extraction is presented in this paper. The audio signals are divided into speech, instrumental music, song with accompaniment and noise using Gaussian mixture models (GMM). To cope with changing real-world environments, a universal noise background model (UNBM) covering white noise, street noise, factory noise and car interior noise is built. In addition, three feature schemes are considered to optimize feature selection. The experimental results show that the proposed algorithm achieves high accuracy for audio classification, especially under each of the noise backgrounds used, and keeps the classification time under one second.
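    A generic sketch of the GMM decision rule described above, using scikit-learn: one mixture model per class, with each clip assigned to the class whose model gives the highest total log-likelihood over its frames. The UNBM construction and the frame-level features themselves are not reproduced here.

```python
from sklearn.mixture import GaussianMixture

def train_gmms(features_by_class, n_components=8):
    """Fit one GMM per audio class; features_by_class maps a class name
    (e.g. 'speech', 'music', 'song', 'noise') to an (n_frames, n_dims)
    array of frame-level features."""
    return {
        name: GaussianMixture(n_components).fit(feats)
        for name, feats in features_by_class.items()
    }

def classify_clip(gmms, frames):
    """Maximum-likelihood class for a clip's frame features."""
    scores = {name: g.score_samples(frames).sum() for name, g in gmms.items()}
    return max(scores, key=scores.get)
```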

  2. A Neuro-Fuzzy based System for Classification of Natural Textures

    Science.gov (United States)

    Jiji, G. Wiselin

    2016-12-01

    A statistical approach based on the coordinated clusters representation of images is used for classification and recognition of textured images. This paper addresses two issues: the extraction of texture features from the fuzzy texture spectrum, in the chromatic and achromatic domains, of each colour component histogram of natural texture images, and the fusion of multiple classifiers. An advanced neuro-fuzzy learning scheme has also been adopted. The classification tests show the high performance of the proposed method compared with other works, suggesting industrial applications in texture classification.

  3. Quantum Authentication Based on the Lengthened String Scheme

    Institute of Scientific and Technical Information of China (English)

    GUO Fen-zhuo; WEN Qiao-yan; ZHU Fu-chen

    2004-01-01

    Assuming there is a shared string between the users, we present a novel scheme, the lengthened string scheme (LSS), in which the shared string is first lengthened by a specially designed algorithm and then given some special treatment before it is used for authentication. Based on the LSS we propose a quantum authentication protocol using entanglement, and we discuss the robustness of our protocol. In fact, all quantum key distribution protocols can perform identification simultaneously based on the LSS.

  4. A SECURE PROXY SIGNATURE SCHEME BASED ON ELLIPTIC CURVE CRYPTOSYSTEM

    Institute of Scientific and Technical Information of China (English)

    Hu Bin; Jin Chenhui

    2006-01-01

    Proxy signatures are special digital signatures that enable a proxy signer to sign messages on behalf of the original signer. This paper proposes a strongly secure proxy signature scheme and a secure multi-proxy signature scheme based on an elliptic curve cryptosystem. In contrast with universal proxy signature schemes, they are secure against key substitution attacks even without a certificate authority in the system, and are also secure against the original signer's forgery attack. Furthermore, being based on an elliptic curve cryptosystem, they are more efficient and have smaller key sizes than other systems. They can be used in electronic transactions and mobile agent environments.

  5. A threshold key escrow scheme based on public key cryptosystem

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In the key escrow field it is important to solve the problem that a user's secret key depends completely on the trusted escrow agency. In 1995, some methods for solving this problem were presented, but they are no better than directly using threshold cryptography. In this paper, we present a common pattern for threshold key escrow schemes based on public key cryptosystems, and give a detailed design based on an improved RSA algorithm. The above problem is solved by this scheme.
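    For intuition, a (t, n) threshold splitting of a key in the style of Shamir's scheme is sketched below; the paper's construction is RSA-based, so this illustrates the threshold principle rather than the proposed scheme itself.

```python
import random

P = 2**127 - 1  # illustrative prime field (use the secrets module in practice)

def split_secret(secret: int, t: int, n: int):
    """(t, n)-threshold splitting: any t of the n shares recover the key,
    so no single escrow agency holds it alone."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

shares = split_secret(123456789, t=3, n=5)
assert recover_secret(shares[:3]) == 123456789
```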

  6. An expert system based intelligent control scheme for space bioreactors

    Science.gov (United States)

    San, Ka-Yiu

    1988-01-01

    An expert system based intelligent control scheme is being developed for the effective control and full automation of bioreactor systems in space. The scheme developed will have the capability to capture information from various resources including heuristic information from process researchers and operators. The knowledge base of the expert system should contain enough expertise to perform on-line system identification and thus be able to adapt the controllers accordingly with minimal human supervision.

  7. Distance-based classification of keystroke dynamics

    Science.gov (United States)

    Tran Nguyen, Ngoc

    2016-07-01

    This paper applies keystroke dynamics to user authentication. The relationship between distance metrics and the data template is analyzed for the first time, and a new distance-based algorithm for keystroke dynamics classification is proposed. In experiments on the CMU keystroke dynamics benchmark dataset, the method achieved an equal error rate of 0.0614. Classifiers using the proposed distance metric outperform existing top-performing keystroke dynamics classifiers that use traditional distance metrics.
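    A minimal sketch of distance-based keystroke verification: enroll a template from a user's timing vectors, then score new attempts with a scaled Manhattan distance. The paper proposes its own metric, which is not reproduced here; the scoring below is a common baseline on the same benchmark.

```python
import numpy as np

def enroll(timing_vectors: np.ndarray):
    """Build a user template: per-feature mean and spread of the
    enrollment keystroke timing vectors."""
    return timing_vectors.mean(axis=0), timing_vectors.std(axis=0) + 1e-9

def anomaly_score(template, sample: np.ndarray) -> float:
    """Scaled Manhattan distance; lower = more likely the genuine user."""
    mean, spread = template
    return float(np.sum(np.abs(sample - mean) / spread))

# Decision: accept when the score falls below a threshold tuned on the
# benchmark to the desired equal error rate, e.g.
# accept = anomaly_score(template, attempt) < threshold
```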

  8. A Secure Code-Based Authentication Scheme for RFID Systems

    Directory of Open Access Journals (Sweden)

    Noureddine Chikouche

    2015-08-01

    Full Text Available Two essential problems are still posed by Radio Frequency Identification (RFID) systems: security and limitation of resources. In 2014, Li et al. proposed a mutual authentication scheme for RFID systems based on the Quasi Cyclic-Moderate Density Parity Check (QC-MDPC) McEliece cryptosystem, which is designed to reduce key sizes. In this paper, we found that this scheme does not provide the untraceability and forward secrecy properties. Furthermore, we propose an improved version of this scheme to eliminate the existing vulnerabilities of the studied scheme. It is based on the QC-MDPC McEliece cryptosystem with the plaintext padded by a random bit-string. Our work also includes a security comparison between our improved scheme and different code-based RFID authentication schemes. We prove the secrecy and mutual authentication properties with the AVISPA (Automated Validation of Internet Security Protocols and Applications) tools. Concerning performance, our scheme is suitable for low-cost tags with resource limitations.

  9. Ship Classification with High Resolution TerraSAR-X Imagery Based on Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Zhi Zhao

    2013-01-01

    Full Text Available Ship surveillance using space-borne synthetic aperture radar (SAR), taking advantage of high resolution over wide swaths and all-weather working capability, has attracted worldwide attention. Recent activity in this field has concentrated mainly on ship detection, while classification remains largely open. In this paper, we propose a novel ship classification scheme based on the analytic hierarchy process (AHP) in order to achieve better performance. The main idea is to apply AHP to both feature selection and the classification decision. On one hand, the AHP-based feature selection constructs a selection decision problem from several feature evaluation measures (e.g., discriminability, stability, and information measure) and provides objective criteria for combining them quantitatively. On the other hand, we take the selected feature sets as the input of KNN classifiers and fuse the multiple classification results based on AHP, taking the feature sets' confidence into account when the AHP-based classification decision is made. We analyze the proposed classification scheme and demonstrate its results on a ship dataset derived from TerraSAR-X SAR images.
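    The AHP machinery referred to above can be sketched compactly: derive priority weights from a pairwise-comparison matrix via its principal eigenvector, with a consistency index to sanity-check the judgments. The example judgments below are illustrative.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray):
    """Priority weights from an AHP pairwise-comparison matrix, where
    entry a_ij states how strongly criterion i is preferred over j.
    Returns the normalized principal eigenvector and the consistency
    index CI = (lambda_max - n) / (n - 1)."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = pairwise.shape[0]
    ci = (vals.real[k] - n) / (n - 1)
    return w, ci

# Illustrative judgments over three feature-evaluation measures
# (discriminability, stability, information measure):
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights, ci = ahp_weights(A)
```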

  10. Application of Bayesian Classification to Content-Based Data Management

    Science.gov (United States)

    Lynnes, Christopher; Berrick, S.; Gopalan, A.; Hua, X.; Shen, S.; Smith, P.; Yang, K-Y.; Wheeler, K.; Curry, C.

    2004-01-01

    The high volume of Earth Observing System data has proven to be challenging to manage for data centers and users alike. At the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC), about 1 TB of new data are archived each day. Distribution to users is also about 1 TB/day. A substantial portion of this distribution is MODIS calibrated radiance data, which has a wide variety of uses. However, much of the data is not useful for a particular user's needs: for example, ocean color users typically need oceanic pixels that are free of cloud and sun-glint. The GES DAAC is using a simple Bayesian classification scheme to rapidly classify each pixel in the scene in order to support several experimental content-based data services for near-real-time MODIS calibrated radiance products (from Direct Readout stations). Content-based subsetting would allow distribution of, say, only clear pixels to the user if desired. Content-based subscriptions would distribute data to users only when they fit the user's usability criteria in their area of interest within the scene. Content-based cache management would retain more useful data on disk for easy online access. The classification may even be exploited in an automated quality assessment of the geolocation product. Though initially to be demonstrated at the GES DAAC, these techniques have applicability in other resource-limited environments, such as spaceborne data systems.
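    A hedged sketch of per-pixel Bayesian classification for content-based subsetting, using a Gaussian naive Bayes stand-in; the band layout and class names below are assumptions for illustration, not the GES DAAC implementation.

```python
from sklearn.naive_bayes import GaussianNB

def train_pixel_classifier(X_train, y_train):
    """Per-pixel Bayesian classification (sketch). X_train holds one row
    of band radiances per training pixel; y_train holds labels such as
    'clear', 'cloud', 'glint' (illustrative names)."""
    return GaussianNB().fit(X_train, y_train)

def clear_pixel_mask(clf, scene):
    """Content-based subsetting: keep only pixels classified 'clear'.
    scene has shape (rows, cols, n_bands)."""
    flat = scene.reshape(-1, scene.shape[-1])
    return (clf.predict(flat) == "clear").reshape(scene.shape[:-1])
```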

  11. A new stylolite classification scheme to estimate compaction and local permeability variations

    Science.gov (United States)

    Koehn, D.; Rood, M. P.; Beaudoin, N.; Chung, P.; Bons, P. D.; Gomez-Rivas, E.

    2016-12-01

    We modeled the geometrical roughening of bedding-parallel, mainly layer-dominated stylolites in order to understand their structural evolution, to present an advanced classification of stylolite shapes and to relate this classification to chemical compaction and permeability variations at stylolites. Stylolites are rough dissolution seams that develop in sedimentary basins during chemical compaction. In the Zechstein 2 carbonate units, an important lean gas reservoir in the southern Permian Zechstein basin in Germany, stylolites influence local fluid flow, mineral replacement reactions and hence the permeability of the reservoir. Our simulations demonstrate that layer-dominated stylolites can grow in three distinct stages: an initial slow nucleation phase, a fast layer-pinning phase and a final freezing phase if the layer is completely dissolved during growth. Dissolution of the pinning layer and thus destruction of the stylolite's compaction tracking capabilities is a function of the background noise in the rock and the dissolution rate of the layer itself. Low background noise needs a slower dissolving layer for pinning to be successful but produces flatter teeth than higher background noise. We present an advanced classification based on our simulations and separate stylolites into four classes: (1) rectangular layer type, (2) seismogram pinning type, (3) suture/sharp peak type and (4) simple wave-like type. Rectangular layer type stylolites are the most appropriate for chemical compaction estimates because they grow linearly and record most of the actual compaction (up to 40 mm in the Zechstein example). Seismogram pinning type stylolites also provide good tracking capabilities, with the largest teeth tracking most of the compaction. Suture/sharp peak type stylolites grow in a non-linear fashion and thus do not record most of the actual compaction. However, when a non-linear growth law is used, the compaction estimates are similar to those making use of the

  12. Collaborative Representation based Classification for Face Recognition

    CERN Document Server

    Zhang, Lei; Feng, Xiangchu; Ma, Yi; Zhang, David

    2012-01-01

    By coding a query sample as a sparse linear combination of all training samples and then classifying it by evaluating which class leads to the minimal coding residual, sparse representation based classification (SRC) leads to interesting results for robust face recognition. It is widely believed that the l1-norm sparsity constraint on coding coefficients plays a key role in the success of SRC, while its use of all training samples to collaboratively represent the query sample is rather ignored. In this paper we discuss how SRC works, and show that the collaborative representation mechanism used in SRC is much more crucial to its success in face classification. SRC is a special case of collaborative representation based classification (CRC), which has various instantiations obtained by applying different norms to the coding residual and coding coefficient. More specifically, the l1 or l2 norm characterization of the coding residual is related to the robustness of CRC to outlier facial pixels, while the l1 or l2 norm c...
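    The regularized least-squares instantiation of CRC can be sketched in a few lines: code the query over all training samples with an l2 penalty, then assign the class with the smallest coefficient-normalized residual. A minimal sketch, assuming X holds training samples as columns and labels is a NumPy array.

```python
import numpy as np

def crc_classify(X, labels, query, lam=1e-3):
    """Collaborative representation with l2 regularization: solve the
    ridge system a = (X^T X + lam I)^{-1} X^T y, then pick the class
    whose samples explain the query with the least residual."""
    a = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ query)
    best, best_err = None, np.inf
    for c in np.unique(labels):
        mask = labels == c
        resid = np.linalg.norm(query - X[:, mask] @ a[mask])
        err = resid / (np.linalg.norm(a[mask]) + 1e-12)  # normalized residual
        if err < best_err:
            best, best_err = c, err
    return best
```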

  13. Texture feature based liver lesion classification

    Science.gov (United States)

    Doron, Yeela; Mayer-Wolf, Nitzan; Diamant, Idit; Greenspan, Hayit

    2014-03-01

    Liver lesion classification is a difficult clinical task. Computerized analysis can support clinical workflow by enabling more objective and reproducible evaluation. In this paper, we evaluate the contribution of several types of texture features for a computer-aided diagnostic (CAD) system which automatically classifies liver lesions from CT images. Based on the assumption that liver lesions of various classes differ in their texture characteristics, a variety of texture features were examined as lesion descriptors. Although texture features are often used for this task, there is currently a lack of detailed research focusing on the comparison across different texture features, or their combinations, on a given dataset. In this work we investigated the performance of Gray Level Co-occurrence Matrix (GLCM), Local Binary Patterns (LBP), Gabor, gray level intensity values and Gabor-based LBP (GLBP), where the features are obtained from a given lesion's region of interest (ROI). For the classification module, SVM and KNN classifiers were examined. Using a single type of texture feature, the best result, 91% accuracy, was obtained with Gabor filtering and SVM classification. Combining Gabor, LBP and intensity features improved the results to a final accuracy of 97%.

  14. Feature-Based Classification of Networks

    CERN Document Server

    Barnett, Ian; Kuijjer, Marieke L; Mucha, Peter J; Onnela, Jukka-Pekka

    2016-01-01

    Network representations of systems from various scientific and societal domains are neither completely random nor fully regular, but instead appear to contain recurring structural building blocks. These features tend to be shared by networks belonging to the same broad class, such as the class of social networks or the class of biological networks. At a finer scale of classification within each such class, networks describing more similar systems tend to have more similar features. This occurs presumably because networks representing similar purposes or constructions would be expected to be generated by a shared set of domain specific mechanisms, and it should therefore be possible to classify these networks into categories based on their features at various structural levels. Here we describe and demonstrate a new, hybrid approach that combines manual selection of features of potential interest with existing automated classification methods. In particular, selecting well-known and well-studied features that ...

  15. SQL based cardiovascular ultrasound image classification.

    Science.gov (United States)

    Nandagopalan, S; Suryanarayana, Adiga B; Sudarshan, T S B; Chandrashekar, Dhanalakshmi; Manjunath, C N

    2013-01-01

    This paper proposes a novel method to analyze and classify cardiovascular ultrasound echocardiographic images using a Naïve-Bayesian model via OLAP-SQL. Efficient data mining algorithms based on a tightly-coupled model are used to extract features. Three algorithms are proposed for classification, namely Naïve-Bayesian Classifier for Discrete variables (NBCD) with SQL, NBCD with OLAP-SQL, and Naïve-Bayesian Classifier for Continuous variables (NBCC) using OLAP-SQL. The proposed model is trained with 207 patient images containing normal and abnormal categories. Of the three proposed algorithms, the highest classification accuracy, 96.59%, was achieved with NBCC, which is better than earlier methods.

  16. A New ID-Based Proxy Blind Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    LANG Wei-min; YANG Zong-kai; CHENG Wen-qing; TAN Yun-meng

    2005-01-01

    An identity-based proxy blind signature scheme from bilinear pairings is introduced, which combines the advantages of proxy signatures and blind signatures. Furthermore, our scheme can prevent the original signer from generating the proxy blind signature, so the interests of the proxy signer are guaranteed. We introduce bilinear pairings to minimize computational overhead and to improve the performance of our scheme. In addition, the proposed proxy blind signature is non-repudiable and perfectly fulfills the security requirements of a proxy blind signature.

  17. Towards biological plausibility of electronic noses: A spiking neural network based approach for tea odour classification.

    Science.gov (United States)

    Sarkar, Sankho Turjo; Bhondekar, Amol P; Macaš, Martin; Kumar, Ritesh; Kaur, Rishemjit; Sharma, Anupma; Gulati, Ashu; Kumar, Amod

    2015-11-01

    The paper presents a novel encoding scheme for generating neuronal codes for odour recognition using an electronic nose (EN). The scheme is based on channel encoding using multiple Gaussian receptive fields superimposed over the temporal EN responses. The encoded data are then applied to a spiking neural network (SNN) for pattern classification. Two forms of SNN, a back-propagation based SpikeProp and a dynamic evolving SNN, are used to learn the encoded responses. The effects of information encoding on the performance of the SNNs have been investigated. Statistical tests have been performed to determine the contributions of the SNN and the encoding scheme to overall odour discrimination. The approach has been applied to odour classification of orthodox black tea (Kangra, Himachal Pradesh Region), thereby demonstrating a biomimetic approach to EN data analysis.
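    A minimal sketch of the Gaussian receptive field channel encoding: each sensor reading activates a bank of overlapping Gaussians, and stronger activation maps to an earlier spike time. The field count, widths and time mapping below are illustrative assumptions.

```python
import numpy as np

def grf_encode(value, n_fields=8, lo=0.0, hi=1.0, t_max=100.0):
    """Population-code one sensor reading into n_fields spike times.
    Each receptive field is a Gaussian over the input range; activation
    close to 1 fires near t = 0, weak activation fires near t_max."""
    centers = np.linspace(lo, hi, n_fields)
    width = (hi - lo) / (n_fields - 1) / 1.5   # illustrative overlap
    activation = np.exp(-0.5 * ((value - centers) / width) ** 2)
    return t_max * (1.0 - activation)

spike_times = grf_encode(0.42)   # one reading -> 8 spike latencies
```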

  18. Generic-Model-Based Description Scheme for MPEG-7

    Institute of Scientific and Technical Information of China (English)

    Deng Juan; Tan Hut; Chen Xin-meng

    2004-01-01

    We propose a new description scheme for MPEG-7: a generic-model-based description scheme to describe the content of audio, video, text and other sorts of multimedia. It uses a generic model as the description frame, which provides a simple but useful object-based structure. The main components of the description scheme are the generic model, objects and object features. The proposed description scheme is illustrated and exemplified in Extensible Markup Language. It aims at clarity and flexibility to support MPEG-7 applications such as query and edit. We demonstrate its feasibility and efficiency by presenting applications, the Digital Broadcasting and Edit System (DEBS) and the Non-linear Edit System (NLES), that already use the generic structure or will greatly benefit from it.

  19. Dihedral-Based Segment Identification and Classification of Biopolymers II: Polynucleotides

    Science.gov (United States)

    2013-01-01

    In an accompanying paper (Nagy, G.; Oostenbrink, C. Dihedral-based segment identification and classification of biopolymers I: Proteins. J. Chem. Inf. Model. 2013, DOI: 10.1021/ci400541d), we introduce a new algorithm for structure classification of biopolymeric structures based on main-chain dihedral angles. The DISICL algorithm (short for DIhedral-based Segment Identification and CLassification) classifies segments of structures containing two central residues. Here, we introduce the DISICL library for polynucleotides, which is based on the dihedral angles ε, ζ, and χ for the two central residues of a three-nucleotide segment of a single strand. Seventeen distinct structural classes are defined for nucleotide structures, some of which—to our knowledge—were not described previously in other structure classification algorithms. In particular, DISICL also classifies noncanonical single-stranded structural elements. DISICL is applied to databases of DNA and RNA structures containing 80,000 and 180,000 segments, respectively. The classifications according to DISICL are compared to those of another popular classification scheme in terms of the amount of classified nucleotides, average occurrence and length of structural elements, and pairwise matches of the classifications. While the detailed classification of DISICL adds sensitivity to a structure analysis, it can be readily reduced to eight simplified classes providing a more general overview of the secondary structure in polynucleotides. PMID:24364355

  20. Dihedral-based segment identification and classification of biopolymers II: polynucleotides.

    Science.gov (United States)

    Nagy, Gabor; Oostenbrink, Chris

    2014-01-27

    In an accompanying paper (Nagy, G.; Oostenbrink, C. Dihedral-based segment identification and classification of biopolymers I: Proteins. J. Chem. Inf. Model. 2013, DOI: 10.1021/ci400541d), we introduce a new algorithm for structure classification of biopolymeric structures based on main-chain dihedral angles. The DISICL algorithm (short for DIhedral-based Segment Identification and CLassification) classifies segments of structures containing two central residues. Here, we introduce the DISICL library for polynucleotides, which is based on the dihedral angles ε, ζ, and χ for the two central residues of a three-nucleotide segment of a single strand. Seventeen distinct structural classes are defined for nucleotide structures, some of which--to our knowledge--were not described previously in other structure classification algorithms. In particular, DISICL also classifies noncanonical single-stranded structural elements. DISICL is applied to databases of DNA and RNA structures containing 80,000 and 180,000 segments, respectively. The classifications according to DISICL are compared to those of another popular classification scheme in terms of the amount of classified nucleotides, average occurrence and length of structural elements, and pairwise matches of the classifications. While the detailed classification of DISICL adds sensitivity to a structure analysis, it can be readily reduced to eight simplified classes providing a more general overview of the secondary structure in polynucleotides.

  1. Markov chaotic sequences for correlation based watermarking schemes

    Energy Technology Data Exchange (ETDEWEB)

    Tefas, A.; Nikolaidis, A.; Nikolaidis, N.; Solachidis, V.; Tsekeridou, S.; Pitas, I. E-mail: pitas@zeus.csd.auth.gr

    2003-07-01

    In this paper, statistical analysis of watermarking schemes based on correlation detection is presented. Statistical properties of watermark sequences generated by piecewise-linear Markov maps are exploited, resulting in superior watermark detection reliability. Correlation/spectral properties of such sequences are easily controllable, a fact that affects the watermarking system performance. A family of chaotic maps, namely the skew tent map family, is proposed for use in watermarking schemes.
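    A small sketch of the ingredients named above: a skew tent map watermark sequence and a correlation detector. The parameters and the bipolarization step are illustrative; the paper's statistical analysis is not reproduced.

```python
import numpy as np

def skew_tent_sequence(x0: float, a: float, n: int) -> np.ndarray:
    """Bipolar watermark sequence from the skew tent map; the parameter
    a (0 < a < 1) shapes the correlation/spectral properties that the
    paper exploits for detection reliability."""
    x = np.empty(n)
    x[0] = x0
    for k in range(1, n):
        x[k] = x[k - 1] / a if x[k - 1] < a else (1.0 - x[k - 1]) / (1.0 - a)
    return np.sign(x - 0.5)

def correlation_detect(signal: np.ndarray, watermark: np.ndarray,
                       tau: float) -> bool:
    """Correlation detector: declare 'watermark present' when the
    normalized correlation exceeds the threshold tau (sketch)."""
    c = float(signal @ watermark) / len(watermark)
    return c > tau
```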

  2. A Directly Public Verifiable Signcryption Scheme based on Elliptic Curves

    CERN Document Server

    Toorani, Mohsen; 10.1109/ISCC.2009.5202242

    2010-01-01

    A directly public verifiable signcryption scheme is introduced in this paper that provides the security attributes of message confidentiality, authentication, integrity, non-repudiation, unforgeability, and forward secrecy of message confidentiality. It provides direct public verifiability, so anyone can verify the signcryption without needing any secret information from the corresponding participants. The proposed scheme is based on elliptic curve cryptography and is thus suitable for environments with resource constraints.

  3. An Efficient Attack on a Code-Based Signature Scheme

    OpenAIRE

    Phesso, Aurélie; Tillich, Jean-Pierre

    2016-01-01

    Baldi et al. introduced in [BBC+13] a novel code-based signature scheme. However, we prove here that some of the bits of the signatures are correlated in this scheme, which allows an attack that recovers enough of the underlying secret structure to forge new signatures. This cryptanalysis was performed on the parameters devised for 80 bits of security and broke them with 100,000 signatures originating from the same secret key.

  4. Hyperspectral remote sensing image classification based on decision level fusion

    Institute of Scientific and Technical Information of China (English)

    Peijun Du; Wei Zhang; Junshi Xia

    2011-01-01

    To apply decision level fusion to hyperspectral remote sensing (HRS) image classification, three decision level fusion strategies are experimented on and compared, namely, the linear consensus algorithm, improved evidence theory, and the proposed support vector machine (SVM) combiner. To evaluate the effects of the input features on classification performance, four schemes are used to organize input features for member classifiers. In the experiment, using the operational modular imaging spectrometer (OMIS) II HRS image, decision level fusion is shown to be an effective way of improving the classification accuracy of the HRS image, and the proposed SVM combiner is especially suitable for decision level fusion. The results also indicate that the optimization of input features can improve the classification performance.

  5. Digital image-based classification of biodiesel.

    Science.gov (United States)

    Costa, Gean Bezerra; Fernandes, David Douglas Sousa; Almeida, Valber Elias; Araújo, Thomas Souto Policarpo; Melo, Jessica Priscila; Diniz, Paulo Henrique Gonçalves Dias; Véras, Germano

    2015-07-01

    This work proposes a simple, rapid, inexpensive, and non-destructive methodology based on digital images and pattern recognition techniques for classification of biodiesel according to oil type (cottonseed, sunflower, corn, or soybean). For this, color histograms in the RGB (extracted from digital images), HSI and grayscale channels, and their combinations, were used as analytical information, which was then statistically evaluated using Soft Independent Modeling by Class Analogy (SIMCA), Partial Least Squares Discriminant Analysis (PLS-DA), and variable selection using the Successive Projections Algorithm associated with Linear Discriminant Analysis (SPA-LDA). Despite good performances by the SIMCA and PLS-DA classification models, SPA-LDA provided better results (up to 95% for all approaches) in terms of accuracy, sensitivity, and specificity for both the training and test sets. The variables selected by the Successive Projections Algorithm clearly contained the information necessary for biodiesel type classification. This is important since a product may exhibit different properties depending on the feedstock used, and such variations directly influence quality and, consequently, price. Moreover, intrinsic advantages such as quick analysis, no reagent requirements, and a noteworthy reduction in waste generation (avoiding chemical characterization) all contribute towards the primary objective of green chemistry.

  6. Genome-based Taxonomic Classification of Bacteroidetes

    Directory of Open Access Journals (Sweden)

    Richard L. Hahnke

    2016-12-01

    Full Text Available The bacterial phylum Bacteroidetes, characterized by a distinct gliding motility, occurs in a broad variety of ecosystems, habitats, life styles and physiologies. Accordingly, taxonomic classification of the phylum, based on a limited number of features, proved difficult and controversial in the past, for example, when decisions were based on unresolved phylogenetic trees of the 16S rRNA gene sequence. Here we use a large collection of type-strain genomes from Bacteroidetes and closely related phyla for assessing their taxonomy based on the principles of phylogenetic classification and trees inferred from genome-scale data. No significant conflict between 16S rRNA gene and whole-genome phylogenetic analysis is found, whereas many but not all of the involved taxa are supported as monophyletic groups, particularly in the genome-scale trees. Phenotypic and phylogenomic features support the separation of Balneolaceae as new phylum Balneolaeota from Rhodothermaeota and of Saprospiraceae as new class Saprospiria from Chitinophagia. Epilithonimonas is nested within the older genus Chryseobacterium and without significant phenotypic differences; thus merging the two genera is proposed. Similarly, Vitellibacter is proposed to be included in Aequorivita. Flexibacter is confirmed as being heterogeneous and dissected, yielding six distinct genera. Hallella seregens is a later heterotypic synonym of Prevotella dentalis. Compared to values directly calculated from genome sequences, the G+C content mentioned in many species descriptions is too imprecise; moreover, corrected G+C content values have a significantly better fit to the phylogeny. Corresponding emendations of species descriptions are provided where necessary. Whereas most observed conflict with the current classification of Bacteroidetes is already visible in 16S rRNA gene trees, as expected whole-genome phylogenies are much better resolved.

  7. Genome-Based Taxonomic Classification of Bacteroidetes.

    Science.gov (United States)

    Hahnke, Richard L; Meier-Kolthoff, Jan P; García-López, Marina; Mukherjee, Supratim; Huntemann, Marcel; Ivanova, Natalia N; Woyke, Tanja; Kyrpides, Nikos C; Klenk, Hans-Peter; Göker, Markus

    2016-01-01

    The bacterial phylum Bacteroidetes, characterized by a distinct gliding motility, occurs in a broad variety of ecosystems, habitats, life styles, and physiologies. Accordingly, taxonomic classification of the phylum, based on a limited number of features, proved difficult and controversial in the past, for example, when decisions were based on unresolved phylogenetic trees of the 16S rRNA gene sequence. Here we use a large collection of type-strain genomes from Bacteroidetes and closely related phyla for assessing their taxonomy based on the principles of phylogenetic classification and trees inferred from genome-scale data. No significant conflict between 16S rRNA gene and whole-genome phylogenetic analysis is found, whereas many but not all of the involved taxa are supported as monophyletic groups, particularly in the genome-scale trees. Phenotypic and phylogenomic features support the separation of Balneolaceae as new phylum Balneolaeota from Rhodothermaeota and of Saprospiraceae as new class Saprospiria from Chitinophagia. Epilithonimonas is nested within the older genus Chryseobacterium and without significant phenotypic differences; thus merging the two genera is proposed. Similarly, Vitellibacter is proposed to be included in Aequorivita. Flexibacter is confirmed as being heterogeneous and dissected, yielding six distinct genera. Hallella seregens is a later heterotypic synonym of Prevotella dentalis. Compared to values directly calculated from genome sequences, the G+C content mentioned in many species descriptions is too imprecise; moreover, corrected G+C content values have a significantly better fit to the phylogeny. Corresponding emendations of species descriptions are provided where necessary. Whereas most observed conflict with the current classification of Bacteroidetes is already visible in 16S rRNA gene trees, as expected whole-genome phylogenies are much better resolved.

  8. Mining discriminative class codes for multi-class classification based on minimizing generalization errors

    Science.gov (United States)

    Eiadon, Mongkon; Pipanmaekaporn, Luepol; Kamonsantiroj, Suwatchai

    2016-07-01

    Error Correcting Output Codes (ECOC) have emerged as a promising technique for solving multi-class classification. In the ECOC framework, a multi-class problem is decomposed into several binary ones with a coding design scheme. Despite this, finding a suitable multi-class decomposition scheme is still an open research problem in machine learning. In this work, we propose a novel multi-class coding design method to mine effective and compact class codes for multi-class classification. For a given n-class problem, the method decomposes the classes into subsets by embedding a binary tree structure. We put forward a novel splitting criterion based on minimizing generalization errors across the classes. A greedy search procedure is then applied to explore the optimal tree structure for the problem domain. Experiments on many multi-class UCI datasets show that our proposed method achieves better classification performance than common ECOC design methods.
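    A compact sketch of the ECOC framework the paper builds on: train one binary learner per code-matrix column and decode by Hamming distance. The tree-derived code design proposed in the paper is replaced here by a user-supplied matrix.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ecoc_train(X, y, code: np.ndarray):
    """Train one binary learner per code-matrix column; code[c, j] in
    {-1, +1} tells classifier j how to relabel class c."""
    classes = np.unique(y)
    learners = []
    for j in range(code.shape[1]):
        relabel = np.array([code[np.where(classes == yi)[0][0], j] for yi in y])
        learners.append(LogisticRegression(max_iter=1000).fit(X, relabel))
    return classes, learners

def ecoc_predict(x, classes, learners, code):
    """Hamming decoding: pick the class whose codeword best matches the
    vector of binary predictions."""
    word = np.array([l.predict(x.reshape(1, -1))[0] for l in learners])
    dists = ((code - word) != 0).sum(axis=1)
    return classes[np.argmin(dists)]
```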

  9. An Efficient Content Based Image Retrieval Scheme

    Directory of Open Access Journals (Sweden)

    Zukuan WEI

    2013-11-01

    Full Text Available Due to recent improvements in digital photography and storage capacity, storing large amounts of images has become possible, and efficient means to retrieve images matching a user's query are needed. In this paper, we propose a framework based on a bipartite graph model (BGM) for semantic image retrieval. BGM is a scalable data structure that aids semantic indexing in an efficient manner and can be incrementally updated. First, all the images are segmented into several regions with an image segmentation algorithm; pre-trained SVMs are used to annotate each region, and the final label is obtained by merging all the region labels. We then use the set of images and the set of region labels to build a bipartite graph. When a query is given, a query node, initially containing a fixed number of labels, is created and attached to the bipartite graph. The node then distributes the labels based on the edge weights between the node and its neighbors. The image nodes receiving the most labels represent the most relevant images. Experimental results demonstrate that our proposed technique is promising.

  10. How does the selection of landscape classification schemes affect the spatial pattern of natural landscapes? An assessment on a coastal wetland site in southern Italy.

    Science.gov (United States)

    Tomaselli, V; Veronico, G; Sciandrello, S; Blonda, P

    2016-06-01

    It is widely known that thematic resolution affects spatial pattern and the performance of landscape metrics. In the literature, data dealing with this issue usually refer to a specific class scheme and its thematic levels. In this paper, the effects of different land cover (LC) and habitat classification schemes on the spatial pattern of a coastal landscape were compared. One of the largest components of the Mediterranean wetland system was considered as the study site, and different schemes widely used in the EU were selected and harmonized with a common thematic resolution, suitable for habitat discrimination and monitoring. For each scheme, a thematic map was produced and, for each map, 28 landscape metrics were calculated. The landscape composition, even in terms of the number of classes, class area, and number of patches, changes significantly among the different classification schemes. Landscape complexity varies according to the class scheme considered and its underlying semantics, depending on how the different types aggregate or split when the class scheme changes. The results confirm that the selection of a specific class scheme affects the spatial pattern of the derived landscapes and consequently the landscape metrics, especially at class level. Moreover, among the classification schemes considered, EUNIS seems to be the best choice for a comprehensive representation of both natural and anthropogenic classes.

  11. The polarimetric entropy classification of SAR based on clustering and signal-to-noise ratio

    Science.gov (United States)

    Shi, Lei; Yang, Jie; Lang, Fengkai

    2009-10-01

    Wishart H/α/A classification is usually an effective unsupervised classification method. However, the anisotropy parameter (A) is unstable in low signal-to-noise ratio (SNR) areas, and many clusters are useless for manual recognition. To prevent an excessive number of clusters from hindering manual recognition and the convergence of the iteration, and to address this drawback of Wishart classification, an enhanced unsupervised Wishart classification scheme for POLSAR data sets is introduced in this paper. The anisotropy parameter A is used to subdivide the target after H/α classification; this parameter can subdivide homogeneous areas under high-SNR conditions that cannot be separated using H/α alone, which is very useful for enhancing adaptability in difficult areas. Since the target polarimetric decomposition is affected by SNR before classification, a local SNR evaluation of homogeneous areas is necessary. After using directional edge detection templates to examine the direction of the POLSAR images, the results can be processed to estimate the SNR, which then becomes a powerful tool to guide the H/α/A classification. This scheme is able to correct misjudgments arising from the A parameter, eliminating many insignificant spots on roads and in urban aggregations, and it even performs well in complex forest. To facilitate manual recognition, an agglomerative clustering algorithm based on a deviation-class method is used to consolidate clusters that are similar in their 3×3 polarimetric coherency matrices. This classification scheme is applied to a fully polarimetric L-band SAR image of the Foulum area, Denmark.

  12. Key Updating Methods for Combinatorial Design Based Key Management Schemes

    Directory of Open Access Journals (Sweden)

    Chonghuan Xu

    2014-01-01

    Full Text Available Wireless sensor networks (WSNs) have become one of the most promising network technologies for many useful applications. However, given the lack of resources, it is difficult but important to ensure the security of WSNs. Key management is a cornerstone on which to build secure WSNs, for it plays a fundamental role in confidentiality, authentication, and so on. Combinatorial design theory has been used to generate well-designed key rings for each sensor node in WSNs. A large number of combinatorial design based key management schemes have been proposed, but none of them take key updating into consideration. In this paper, we point out the essence of key updating for the unital design based key management scheme and propose two key updating methods; we then conduct a performance analysis of the two methods from three aspects; at last, we generalize the two methods to other combinatorial design based key management schemes and enhance the second method.

  13. Cirrhosis Classification Based on Texture Classification of Random Features

    Directory of Open Access Journals (Sweden)

    Hui Liu

    2014-01-01

    Full Text Available Accurate staging of hepatic cirrhosis is important in investigating the cause and slowing down the effects of cirrhosis. Computer-aided diagnosis (CAD) can provide doctors with an alternative second opinion and assist them in choosing a specific treatment based on an accurate cirrhosis stage. MRI has many advantages, including high resolution for soft tissue, no radiation, and multiparameter imaging modalities. So in this paper, multi-sequence MRIs, including T1-weighted, T2-weighted, arterial, portal venous, and equilibrium phase, are applied. However, CAD does not yet meet the clinical needs for cirrhosis, and few researchers are currently concerned with it. Cirrhosis is characterized by the presence of widespread fibrosis and regenerative nodules in the liver, leading to different texture patterns at different stages, so extracting texture features is the primary task. Compared with typical gray level cooccurrence matrix (GLCM) features, texture classification from random features provides an effective alternative; we adopt it and propose CCTCRF for triple classification (normal, early, and middle/advanced stage). CCTCRF needs no strong assumptions except the sparse character of the image, contains sufficient texture information, follows a concise and effective process, and makes case decisions with high accuracy. Experimental results illustrate its satisfying performance, which is also compared with a typical NN using GLCM features.

  14. Cirrhosis classification based on texture classification of random features.

    Science.gov (United States)

    Liu, Hui; Shao, Ying; Guo, Dongmei; Zheng, Yuanjie; Zhao, Zuowei; Qiu, Tianshuang

    2014-01-01

    Accurate staging of hepatic cirrhosis is important in investigating the cause and slowing down the effects of cirrhosis. Computer-aided diagnosis (CAD) can provide doctors with an alternative second opinion and assist them in choosing a specific treatment based on an accurate cirrhosis stage. MRI has many advantages, including high resolution for soft tissue, no radiation, and multiparameter imaging modalities. Therefore, multisequence MRIs, including T1-weighted, T2-weighted, arterial, portal venous, and equilibrium phase, are applied in this paper. However, CAD does not yet meet the clinical needs of cirrhosis, and few researchers are concerned with it at present. Cirrhosis is characterized by the presence of widespread fibrosis and regenerative nodules in the liver, leading to different texture patterns at different stages, so extracting texture features is the primary task. Compared with typical gray level co-occurrence matrix (GLCM) features, texture classification from random features provides an effective alternative; we adopt it and propose CCTCRF for triple classification (normal, early, and middle-and-advanced stage). CCTCRF requires no strong assumptions except the sparsity of the image, contains sufficient texture information, involves a concise and effective process, and makes case decisions with high accuracy. Experimental results also illustrate its satisfying performance, which is compared with a typical neural network using GLCM features.
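
    Since the abstract contrasts random-feature texture classification with GLCM features, here is a minimal sketch of the GLCM baseline using scikit-image (function names graycomatrix/graycoprops as in scikit-image ≥ 0.19); the patch below is a random placeholder for a liver ROI, and the random-feature pipeline itself is not shown.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Placeholder 8-bit patch; in practice an ROI from one of the MRI sequences.
patch = np.random.randint(0, 256, (64, 64), dtype=np.uint8)

glcm = graycomatrix(patch, distances=[1, 2], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
features = np.hstack([graycoprops(glcm, prop).ravel()
                      for prop in ("contrast", "homogeneity",
                                   "energy", "correlation")])
print(features.shape)   # one GLCM feature vector per patch
```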

  15. "Chromosome": a knowledge-based system for the chromosome classification.

    Science.gov (United States)

    Ramstein, G; Bernadet, M

    1993-01-01

    Chromosome, a knowledge-based analysis system, has been designed for the classification of human chromosomes. Its aim is to perform an optimal classification by driving a toolbox containing procedures for image processing, pattern recognition and classification. This paper presents the general architecture of Chromosome, based on a multiagent system generator. The image-processing toolbox is described, from metaphase enhancement to fine classification. Emphasis is then put on the knowledge base intended for chromosome recognition. The global classification process is also presented, showing how Chromosome proceeds to classify a given chromosome. Finally, we discuss further extensions of the system for karyotype building.

  16. Optical tomographic detection of rheumatoid arthritis with computer-aided classification schemes

    Science.gov (United States)

    Klose, Christian D.; Klose, Alexander D.; Netz, Uwe; Beuthan, Jürgen; Hielscher, Andreas H.

    2009-02-01

    A recent research study has shown that combining multiple parameters drawn from optical tomographic images leads to better classification results in identifying human finger joints that are or are not affected by rheumatoid arthritis (RA). Building on the findings of that study, this article presents an advanced computer-aided classification approach for interpreting optical image data to detect RA in finger joints. Additional data are used, including, for example, maximum and minimum values of the absorption coefficient as well as their ratios and image variances. Classification performances obtained by the proposed method were evaluated in terms of sensitivity, specificity, Youden index and area under the curve (AUC). Results were compared to different benchmarks ("gold standards"): magnetic resonance, ultrasound and clinical evaluation. Maximum accuracy (AUC = 0.88) was reached when combining minimum/maximum ratios and image variances and using ultrasound as the gold standard.
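
    A toy sketch of the evaluation loop the abstract describes: combine two image-derived parameters into a single classifier score, then compute the AUC and Youden index with scikit-learn. The feature matrix and labels below are random placeholders, and logistic regression is only an illustrative choice of combiner.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical features per joint: [absorption min/max ratio, image variance],
# with labels 1 = affected by RA. All values are placeholders.
X = np.random.rand(100, 2)
y = np.random.randint(0, 2, 100)

clf = LogisticRegression().fit(X, y)
scores = clf.predict_proba(X)[:, 1]

auc = roc_auc_score(y, scores)
fpr, tpr, _ = roc_curve(y, scores)
youden = (tpr - fpr).max()   # Youden index J = sensitivity + specificity - 1
print(f"AUC={auc:.2f}, Youden J={youden:.2f}")
```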

  17. Judgement of Design Scheme Based on Flexible Constraint in ICAD

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The concept of the flexible constraint is proposed in this paper. The solution of a flexible constraint lies within a specified range and may differ between instances of the same design scheme. The paper emphasizes how to evaluate and optimize a design scheme with flexible constraints, based on a satisfaction-degree function defined on the flexible constraints. The concept of the flexible constraint is used to resolve constraint conflicts and to optimize designs in complicated constraint-based assembly design within the PFM parametric assembly design system. A gear-box design example is used to verify the optimization method.

  18. An Elliptic Curve-based Signcryption Scheme with Forward Secrecy

    CERN Document Server

    Toorani, Mohsen; 10.3923/jas.2009.1025.1035

    2010-01-01

    An elliptic curve-based signcryption scheme is introduced in this paper that effectively combines the functionalities of digital signature and encryption, and decreases the computational costs and communication overheads in comparison with traditional signature-then-encryption schemes. It simultaneously provides the attributes of message confidentiality, authentication, integrity, unforgeability, non-repudiation, public verifiability, and forward secrecy of message confidentiality. Since it is based on elliptic curves and can use any fast and secure symmetric algorithm for encrypting messages, it is well suited to security establishment in store-and-forward applications and to dealing with resource-constrained devices.

  19. Fuzzy Rule Base System for Software Classification

    Directory of Open Access Journals (Sweden)

    Adnan Shaout

    2013-07-01

    Full Text Available Given the central role that software development plays in the delivery and application of information technology, managers have been focusing on process improvement in the software development area. This improvement has increased the demand for software measures, or metrics, to manage the process. These metrics provide a quantitative basis for the development and validation of models during the software development process. In this paper a fuzzy rule-based system is developed to classify Java applications using object-oriented metrics. The system contains the following features: an automated method to extract the OO metrics from the source code; a default/base set of rules that can be easily configured via an XML file, so that companies, developers, team leaders, etc., can modify the set of rules according to their needs; a framework so that new metrics, fuzzy sets and fuzzy rules can be added or removed depending on the needs of the end user; general classification of the software application and fine-grained classification of the Java classes based on OO metrics; and two interfaces to the system, a GUI and a command line.
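
    A minimal sketch of how such fuzzy rules over OO metrics might look, with Mamdani-style min/max inference. The metric ranges, fuzzy sets and rules below are invented for illustration and are not the paper's rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Illustrative fuzzy sets over two OO metrics (ranges are made up):
# WMC = weighted methods per class, CBO = coupling between objects.
def classify(wmc, cbo):
    wmc_high = tri(wmc, 20, 50, 80)
    wmc_low = tri(wmc, -1, 0, 25)
    cbo_high = tri(cbo, 5, 15, 25)
    cbo_low = tri(cbo, -1, 0, 8)

    # Mamdani-style rules: AND = min, aggregation over rules = max.
    poor = min(wmc_high, cbo_high)   # IF WMC high AND CBO high THEN poor
    good = min(wmc_low, cbo_low)     # IF WMC low  AND CBO low  THEN good
    return ("poor" if poor > good else "good"), {"poor": poor, "good": good}

print(classify(wmc=55, cbo=18))
```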

  20. A Detection Scheme for Cavity-based Dark Matter Searches

    CERN Document Server

    Bukhari, M H S

    2016-01-01

    We present here a proposal for a scheme and some useful ideas for resonant-cavity-based detection of cold dark matter axions, with the hope of improving on existing endeavors. The scheme is based upon our idea of a detector that incorporates an integrated tunnel diode and a GaAs HEMT or HFET (high electron mobility transistor or heterostructure FET) for resonance detection and amplification from a resonant cavity (in a strong transverse magnetic field from a cylindrical array of Halbach magnets). The TD-oscillator-amplifier combination could possibly serve as a more sensitive and viable resonance detection regime while maintaining excellent performance with a low noise temperature, whereas the Halbach magnet array may offer a compact and permanent solution replacing the conventional electromagnet scheme. We believe that all these factors could possibly increase the sensitivity and accuracy of axion detection searches and reduce complications (and associated costs) in the experiments, in addition to help re...

  1. Malware Classification based on Call Graph Clustering

    CERN Document Server

    Kinable, Joris

    2010-01-01

    Each day, anti-virus companies receive tens of thousands of samples of potentially harmful executables. Many of the malicious samples are variations of previously encountered malware, created by their authors to evade pattern-based detection. Dealing with these large amounts of data requires robust, automatic detection approaches. This paper studies malware classification based on call graph clustering. By representing malware samples as call graphs, it is possible to abstract certain variations away and enable the detection of structural similarities between samples. The ability to cluster similar samples together will make more generic detection techniques possible, thereby targeting the commonalities of the samples within a cluster. To compare call graphs mutually, we compute pairwise graph similarity scores via graph matchings which approximately minimize the graph edit distance. Next, to facilitate the discovery of similar malware samples, we employ several clustering algorithms, including k-medoids and DB...
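
    A small sketch of pairwise call-graph comparison with NetworkX. Exact graph edit distance is NP-hard, so nx.graph_edit_distance is only practical on tiny graphs; the two "call graphs" and the size-based normalization below are placeholders for the paper's approximate matchings.

```python
import networkx as nx

# Two tiny "call graphs" standing in for malware samples (names invented).
g1 = nx.DiGraph([("main", "read"), ("main", "decrypt"), ("decrypt", "send")])
g2 = nx.DiGraph([("main", "read"), ("main", "decode"), ("decode", "send")])

# The search over edit paths can be capped in time on larger graphs.
ged = nx.graph_edit_distance(g1, g2, timeout=5)

# A crude similarity score normalized by combined graph size.
size = (g1.number_of_nodes() + g2.number_of_nodes()
        + g1.number_of_edges() + g2.number_of_edges())
print(ged, round(1 - ged / size, 2))
```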

  2. Wittgenstein's philosophy and a dimensional approach to the classification of mental disorders -- a preliminary scheme.

    Science.gov (United States)

    Mackinejad, Kioumars; Sharifi, Vandad

    2006-01-01

    In this paper the importance of Wittgenstein's philosophical ideas for the justification of a dimensional approach to the classification of mental disorders is discussed. Some of his basic concepts in his Philosophical Investigations, such as 'family resemblances', 'grammar' and 'language-game' and their relations to the concept of mental disorder are explored.

  3. Movie Popularity Classification based on Inherent Movie Attributes using C4.5, PART and Correlation Coefficient

    DEFF Research Database (Denmark)

    Ibnal Asad, Khalid; Ahmed, Tanvir; Rahman, Md. Saiedur

    2012-01-01

    Abundance of movie data across the internet makes it an obvious candidate for machine learning and knowledge discovery. But most research is directed towards bi-polar classification of movies or the generation of movie recommendation systems based on reviews given by viewers on various internet si... ...propose a classification scheme for pre-release movie popularity based on inherent attributes using the C4.5 and PART classifier algorithms, and define the relation between attributes of post-release movies using the correlation coefficient.
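
    A short scikit-learn sketch in the spirit of the abstract: an entropy-based decision tree (CART, as an accessible stand-in for C4.5/PART, which scikit-learn does not ship) classifies popularity from pre-release attributes, and a Pearson correlation relates one attribute to popularity. All data are random placeholders.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.tree import DecisionTreeClassifier

# Hypothetical pre-release attributes (genre id, star power, budget, ...) and
# a popularity class per movie; all values are placeholders.
X = np.random.rand(200, 4)
y = np.random.randint(0, 3, 200)   # e.g. flop / average / hit

tree = DecisionTreeClassifier(criterion="entropy", max_depth=4).fit(X, y)
print("training accuracy:", tree.score(X, y))

# Correlation between one attribute and popularity, as in the paper's
# post-release analysis.
r, p = pearsonr(X[:, 2], y)
print(f"Pearson r={r:.2f} (p={p:.2f})")
```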

  4. Dihedral-Based Segment Identification and Classification of Biopolymers I: Proteins

    Science.gov (United States)

    2013-01-01

    A new structure classification scheme for biopolymers is introduced, which is solely based on main-chain dihedral angles. It is shown that by dividing a biopolymer into segments containing two central residues, a local classification can be performed. The method is referred to as DISICL, short for Dihedral-based Segment Identification and Classification. Compared to other popular secondary structure classification programs, DISICL is more detailed as it offers 18 distinct structural classes, which may be simplified into a classification in terms of seven more general classes. It was designed with an eye to analyzing subtle structural changes as observed in molecular dynamics simulations of biomolecular systems. Here, the DISICL algorithm is used to classify two databases of protein structures, jointly containing more than 10 million segments. The data are compared to two alternative approaches in terms of the amount of classified residues, average occurrence and length of structural elements, and pairwise matches of the classifications by the different programs. In an accompanying paper (Nagy, G.; Oostenbrink, C. Dihedral-based segment identification and classification of biopolymers II: Polynucleotides. J. Chem. Inf. Model. 2013, DOI: 10.1021/ci400542n), the analysis of polynucleotides is described and applied. Overall, DISICL represents a potentially useful tool to analyze biopolymer structures at a high level of detail. PMID:24364820

  5. Dihedral-based segment identification and classification of biopolymers I: proteins.

    Science.gov (United States)

    Nagy, Gabor; Oostenbrink, Chris

    2014-01-27

    A new structure classification scheme for biopolymers is introduced, which is solely based on main-chain dihedral angles. It is shown that by dividing a biopolymer into segments containing two central residues, a local classification can be performed. The method is referred to as DISICL, short for Dihedral-based Segment Identification and Classification. Compared to other popular secondary structure classification programs, DISICL is more detailed as it offers 18 distinct structural classes, which may be simplified into a classification in terms of seven more general classes. It was designed with an eye to analyzing subtle structural changes as observed in molecular dynamics simulations of biomolecular systems. Here, the DISICL algorithm is used to classify two databases of protein structures, jointly containing more than 10 million segments. The data are compared to two alternative approaches in terms of the amount of classified residues, average occurrence and length of structural elements, and pairwise matches of the classifications by the different programs. In an accompanying paper (Nagy, G.; Oostenbrink, C. Dihedral-based segment identification and classification of biopolymers II: Polynucleotides. J. Chem. Inf. Model. 2013, DOI: 10.1021/ci400542n), the analysis of polynucleotides is described and applied. Overall, DISICL represents a potentially useful tool to analyze biopolymer structures at a high level of detail.
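
    The classification rests on main-chain dihedral angles; a standard way to compute a dihedral from four atom positions is sketched below. The coordinates are made up, and DISICL's own segment logic is not reproduced.

```python
import numpy as np

def dihedral(p0, p1, p2, p3):
    """Dihedral angle (degrees) defined by four points, via the standard
    atan2 formulation."""
    b0, b1, b2 = p1 - p0, p2 - p1, p3 - p2
    b1 = b1 / np.linalg.norm(b1)
    v = b0 - np.dot(b0, b1) * b1   # component of b0 normal to b1
    w = b2 - np.dot(b2, b1) * b1   # component of b2 normal to b1
    x = np.dot(v, w)
    y = np.dot(np.cross(b1, v), w)
    return np.degrees(np.arctan2(y, x))

# Backbone phi/psi angles come from N-CA-C atoms of consecutive residues;
# four made-up coordinates just exercise the function here.
pts = np.array([[0., 0, 0], [1, 0, 0], [1, 1, 0], [1, 1, 1]])
print(dihedral(*pts))   # prints -90.0 for this twist
```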

  6. Age Classification Based On Integrated Approach

    Directory of Open Access Journals (Sweden)

    Pullela. SVVSR Kumar

    2014-05-01

    Full Text Available The present paper presents a new age classification method by integrating features derived from the grey level co-occurrence matrix (GLCM) with a new structural approach derived from four distinct LBPs (4-DLBP) on a 3 x 3 image. The present paper derives four distinct patterns called left diagonal (LD), right diagonal (RD), vertical centre (VC) and horizontal centre (HC) LBPs. For all the LBPs the central pixel value of the 3 x 3 neighbourhood is significant. That is why, in the present research, LBP values are evaluated by comparing all 9 pixels of the 3 x 3 neighbourhood with the average value of the neighbourhood. The four distinct LBPs are grouped into two distinct LBPs. Based on these two distinct LBPs the GLCM is computed and features are evaluated to classify the human age into four age groups, i.e., child (0-15), young adult (16-30), middle-aged adult (31-50) and senior adult (>50). The co-occurrence features extracted from the 4-DLBP provide complete texture information about an image, which is useful for classification. The proposed 4-DLBP reduces the size of the LBP from 6561 to 79 in the case of the original texture spectrum and from 2020 to 79 in the case of the fuzzy texture approach.
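
    A rough sketch of the average-thresholded 3 x 3 patterns the abstract describes: every pixel in the window is compared with the window mean, and the four diagonal/centre-line patterns are packed into small codes. The exact bit layout and the grouping into two LBPs used by the paper are not specified here, so this is illustrative only.

```python
import numpy as np

def avg_lbp_codes(img):
    """Compare all 9 pixels of each 3x3 window with the window average
    (the 4-DLBP idea of using the mean instead of the centre pixel) and
    pack the LD/RD/VC/HC patterns into 3-bit codes."""
    h, w = img.shape
    codes = np.zeros((h - 2, w - 2, 4), dtype=np.uint8)
    # pixel index layout of the flattened 3x3 window:
    # 0 1 2
    # 3 4 5
    # 6 7 8
    patterns = [(0, 4, 8),   # left diagonal (LD)
                (2, 4, 6),   # right diagonal (RD)
                (1, 4, 7),   # vertical centre (VC)
                (3, 4, 5)]   # horizontal centre (HC)
    for i in range(h - 2):
        for j in range(w - 2):
            win = img[i:i + 3, j:j + 3].ravel().astype(float)
            bits = (win >= win.mean()).astype(np.uint8)
            for k, (a, b, c) in enumerate(patterns):
                codes[i, j, k] = (bits[a] << 2) | (bits[b] << 1) | bits[c]
    return codes
```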

  7. Automatic web services classification based on rough set theory

    Institute of Scientific and Technical Information of China (English)

    陈立; 张英; 宋自林; 苗壮

    2013-01-01

    With the development of web services technology, the number of services on the internet grows day by day. To achieve automatic and accurate service classification, which benefits service-related tasks, a rough set theory based method for service classification is proposed. First, the service descriptions are preprocessed and represented as vectors. Inspired by discernibility-matrix-based attribute reduction in rough set theory, and taking into account the characteristics of the decision table for service classification, a method based on continuous discernibility matrices is proposed for dimensionality reduction. Finally, service classification is performed automatically. In the experiment, the proposed method achieves promising classification results in all five testing categories. The experimental results show that the proposed method is accurate and could be used in practical web service classification.

  8. Hyperspectral Image Classification Based on the Combination of Spatial-spectral Feature and Sparse Representation

    Directory of Open Access Journals (Sweden)

    YANG Zhaoxia

    2015-07-01

    Full Text Available In order to avoid the problem of being over-dependent on high-dimensional spectral features in traditional hyperspectral image classification, a novel approach based on the combination of spatial-spectral features and sparse representation is proposed in this paper. Firstly, we extract the spatial-spectral feature by reorganizing the local image patch with the first d principal components (PCs) into a vector representation, followed by a sorting scheme to make the vector invariant to local image rotation. Secondly, we learn the dictionary through a supervised method, and use it to code the features from test samples afterwards. Finally, we embed the resulting sparse feature coding into the support vector machine (SVM) for hyperspectral image classification. Experiments using three hyperspectral data sets show that the proposed method can effectively improve the classification accuracy compared with traditional classification methods.

  9. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    Science.gov (United States)

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

    The increasing technology of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is oriented towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used for the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989

  10. DWT-Based Watermarking Scheme for Digital Images

    Institute of Scientific and Technical Information of China (English)

    何泉; 苏广川

    2003-01-01

    A watermarking scheme for digital images is introduced. The method is based on the discrete wavelet transform and the spread spectrum technique. A discrete-wavelet-transformed binary signature image is spread by an m-sequence and added to the large wavelet coefficients of a host image with a scale factor. A good balance between transparency and robustness is achieved by the selection of the scale factor. In addition, the spread spectrum technique is adopted to increase the robustness of this watermarking scheme. The experimental results show that the proposed method offers good performance and robustness under common image operations such as JPEG lossy compression.
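
    A compact sketch of the embedding idea using PyWavelets: spread a binary signature with a pseudo-random ±1 sequence (standing in for the m-sequence) and add it to the largest detail coefficients with a scale factor alpha. Extraction/detection is omitted, and the parameter values are placeholders.

```python
import numpy as np
import pywt

def embed(host, sig_bits, alpha=0.05, seed=1):
    """Spread a binary signature over the largest horizontal-detail
    coefficients of the host image's one-level Haar DWT."""
    cA, (cH, cV, cD) = pywt.dwt2(host.astype(float), "haar")
    rng = np.random.default_rng(seed)
    chips = rng.choice([-1.0, 1.0], size=sig_bits.size)  # stand-in m-sequence
    spread = (2 * sig_bits - 1) * chips                  # BPSK-style spreading

    flat = cH.ravel()
    idx = np.argsort(np.abs(flat))[-sig_bits.size:]      # largest coefficients
    flat[idx] += alpha * np.abs(flat[idx]) * spread      # scaled embedding
    return pywt.idwt2((cA, (flat.reshape(cH.shape), cV, cD)), "haar")

host = np.random.randint(0, 256, (64, 64))
watermarked = embed(host, np.random.randint(0, 2, 128))
```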

  11. A Novel Block-Based Scheme for Arithmetic Coding

    Directory of Open Access Journals (Sweden)

    Qi-Bin Hou

    2014-06-01

    Full Text Available It is well-known that for a given sequence, its optimal codeword length is fixed. Many coding schemes have been proposed to make the codeword length as close to the optimal value as possible. In this paper, a new block-based coding scheme operating on the subsequences of a source sequence is proposed. It is proved that the optimal codeword lengths of the subsequences are not larger than that of the given sequence. Experimental results using arithmetic coding will be presented.
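
    Under a zeroth-order (iid) source model, the optimal codeword length of a sequence is its empirical entropy times its length. The snippet below checks on a toy string that coding two subsequences separately never needs more bits than coding the whole sequence, which is the property the block-based scheme exploits; the split point is arbitrary.

```python
from collections import Counter
from math import log2

def optimal_bits(seq):
    """Zeroth-order empirical-entropy bound: n * H(empirical distribution)."""
    n, counts = len(seq), Counter(seq)
    return -sum(c * log2(c / n) for c in counts.values())

s = "abracadabraabracadabra"
whole = optimal_bits(s)
split = optimal_bits(s[:11]) + optimal_bits(s[11:])
print(f"{whole:.1f} bits whole vs {split:.1f} bits for the two halves")
```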

  12. Wavelet based hierarchical coding scheme for radar image compression

    Science.gov (United States)

    Sheng, Wen; Jiao, Xiaoli; He, Jifeng

    2007-12-01

    This paper presents a wavelet-based hierarchical coding scheme for radar image compression. The radar signal is first quantized to a digital signal and reorganized as a raster-scanned image according to the radar's pulse repetition frequency. After reorganization, the reformed image is decomposed into image blocks of different frequency bands by a 2-D wavelet transformation, and each block is quantized and coded by the Huffman coding scheme. A demonstration system is developed, showing that under the requirement of real-time processing the compression ratio can be very high, with no significant loss of target signal in the restored radar image.

  13. A Fair E-Cash Payment Scheme Based on Credit

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A new fair e-cash payment scheme based on credit is presented in this paper. In the scheme, an overdraft credit certificate is issued to the user by the bank. Using the overdraft credit certificate, the user can produce e-cash himself to pay in exchanges. The merchant can verify the e-cash received from the user, and the bank can reach a fair dispute resolution when there is a disagreement between user and merchant. The scheme avoids the problem of partitioning e-cash for change and prevents e-cash from being reused or forged. It satisfies fairness, anonymity, non-repudiation and impartiality.

  14. A novel current-sharing scheme based on magamp

    Institute of Scientific and Technical Information of China (English)

    Wen-xi YAO; Xiao-yuan HONG; Zheng-yu LU

    2008-01-01

    The magamp (magnetic amplifier) is widely used in power supplies due to its low cost, simplicity and other advantages. This paper discusses a novel application of the magamp in switching power supplies, where the magamp regulates the pulse width modulation (PWM) signal instead of the power signal in the main circuit. This method extends the application of the magamp in power supplies and makes it possible to further regulate the control signal after the PWMs have been generated. Based on this application, a new current-sharing (CS) scheme using the magamp is proposed, which uses a modified inner-loop CS structure. In this scheme PWMs are generated by one main controller, and CS is achieved by regulating the PWMs with a magamp in each module. Compared with the traditional application of the magamp, the new CS scheme can be used in most topologies and only requires magamps of low power capacity. A test circuit of a parallel power supply is developed, in which CS is achieved by a PWM regulator with the magamp. The proposed scheme is also used to upgrade an electroplating power supply to make it capable of operating supplies in parallel. Experimental results show that the proposed scheme has good CS performance.

  15. An Efficient Provable Secure ID-Based Proxy Signature Scheme Based on CDH Assumption

    Institute of Scientific and Technical Information of China (English)

    CHAI Zhen-chuan; CAO Zhen-fu; LU Rong-xing

    2006-01-01

    Identity-based proxy signature enables an entity to delegate its signing rights to another entity in identity-based cryptosystem settings. However, few existing schemes have been proved secure in a formalized model or achieve optimized performance. To achieve the goals of both provable security and high efficiency, this paper proposes an efficient identity-based proxy signature scheme. The scheme is constructed from bilinear pairings and proved secure in the random oracle model, using the oracle replay attack technique introduced by Pointcheval and Stern. The analysis shows that the scheme needs less computation and produces a shorter signature than the other schemes.

  16. A privacy authentication scheme based on cloud for medical environment.

    Science.gov (United States)

    Chen, Chin-Ling; Yang, Tsai-Tung; Chiang, Mao-Lun; Shih, Tzay-Farn

    2014-11-01

    With the rapid development of information technology, health care technologies have matured; electronic medical records, for example, can now be easily stored. However, how to access medical resources more conveniently is a current concern. Although many studies have discussed medical systems, these systems face many security challenges, the most important of which is patients' privacy. Therefore, we propose a privacy authentication scheme based on the cloud environment. In our scheme, we use mobile devices' characteristics, allowing people to use medical resources in the cloud environment to find medical advice conveniently. A digital signature is used to ensure the security of the medical information, which is certified by the medical department in our proposed scheme.

  17. A Sweep Coverage Scheme Based on Vehicle Routing Problem

    Directory of Open Access Journals (Sweden)

    Li Shu

    2013-04-01

    Full Text Available As an emerging coverage problem in wireless sensor networks, sweep coverage, which introduces mobile sensors to cover points of interest (POIs) within a certain time interval, can satisfy monitoring requests in some particular application scenarios with fewer nodes than the conventional static coverage approach. In this work, aiming to support dynamic POI coverage and data delivery simultaneously, a novel sweep coverage scheme named VRPSC (Vehicle Routing Problem based Sweep Coverage) is proposed by modeling the minimum-number-of-required-sensors problem in sweep coverage as a Vehicle Routing Problem (VRP). In VRPSC, an insertion algorithm is first introduced to create the initial scanning routes for POIs, and then Simulated Annealing is employed to optimize these routes. The simulation results show that the VRPSC scheme achieves better performance than existing schemes.

  18. A classification scheme of Amino Acids in the Genetic Code by Group Theory

    CERN Document Server

    Sachse, Sebastian

    2012-01-01

    We derive the amino acid assignment to one codon representation (the typical 64-dimensional irreducible representation) of the basic classical Lie superalgebra osp(5|2) from biochemical arguments. We motivate the approach of mathematical symmetries to the classification of the building constituents of the biosphere by analogy with its success in particle physics and chemistry. The model enables the calculation of the polarity and molecular volume of amino acids to a good approximation.

  19. Graph-based Methods for Orbit Classification

    Energy Technology Data Exchange (ETDEWEB)

    Bagherjeiran, A; Kamath, C

    2005-09-29

    An important step in the quest for low-cost fusion power is the ability to perform and analyze experiments in prototype fusion reactors. One of the tasks in the analysis of experimental data is the classification of orbits in Poincare plots. These plots are generated by the particles in a fusion reactor as they move within the toroidal device. In this paper, we describe the use of graph-based methods to extract features from orbits. These features are then used to classify the orbits into several categories. Our results show that existing machine learning algorithms are successful in classifying orbits with few points, a situation which can arise in data from experiments.

  20. Sentiment classification technology based on Markov logic networks

    Science.gov (United States)

    He, Hui; Li, Zhigang; Yao, Chongchong; Zhang, Weizhe

    2016-07-01

    With diverse online media emerging, there is growing concern with the sentiment classification problem. At present, text sentiment classification mainly utilizes supervised machine learning methods, which exhibit a certain domain dependency. On the basis of Markov logic networks (MLNs), this study proposed a cross-domain multi-task text sentiment classification method rooted in transfer learning. Through many-to-one knowledge transfer, labeled text sentiment classification knowledge was successfully transferred into other domains, and the precision of the sentiment classification analysis in the target domain was improved. The experimental results revealed the following: (1) the MLN-based model demonstrated higher precision than the single individual learning model; (2) multi-task transfer learning based on Markov logic networks could acquire more knowledge than single-domain learning. The cross-domain text sentiment classification model can significantly improve the precision and efficiency of text sentiment classification.

  1. Multifocus image fusion scheme based on nonsubsampled contourlet transform

    Science.gov (United States)

    Zhou, Xinxing; Wang, Dianhong; Duan, Zhijuan; Li, Dongming

    2011-06-01

    This paper proposes a novel multifocus image fusion scheme based on the nonsubsampled contourlet transform (NSCT). The selection principles for the different subband coefficients in the NSCT domain are discussed in detail. In order to be consistent with the characteristics of the human visual system and to improve the robustness of the fusion algorithm to noise, the NSCT-DCT energy is first developed. Based on it, the clarity measure and the bandpass energy contrast are defined and employed to motivate the pulse coupled neural networks (PCNN) for the fusion of the lowpass and bandpass subbands, respectively. The performance of the proposed fusion scheme is assessed by experiments, and the results demonstrate that the proposed algorithm compares favorably to wavelet-based, contourlet-based and NSCT-based fusion algorithms in terms of visual appearance and objective criteria.

  2. Computerized scheme for duplicate checking of bibliographic data bases

    Energy Technology Data Exchange (ETDEWEB)

    Giles, C.A.; Brooks, A.A.; Doszkocs, T.; Hummel, D.J.

    1976-08-01

    A technique for the automatic identification of duplicate documents within large bibliographic data bases has been designed and tested with encouraging results. The procedure is based on the generation and comparison of significant elements compressed from existing document descriptions. Problems arising from inconsistencies in editorial style and data base formats and from discrepancies in spelling, punctuation, translation and transliteration schemes are discussed; one method for circumventing ambiguities and errors of this type is proposed. The generalized computer program employs a key-making, sorting, weighting, and summation scheme for the detection of duplicates and, according to preliminary findings, achieves this objective with a high degree of accuracy. Sample results from five large data bases suggest that this automatic system performs as effectively as manual techniques.
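
    A toy version of the key-making-and-sorting idea: compress each record into a normalized match key so that near-identical descriptions collide. The compression rule, field names and sample records below are invented for illustration, not the original program's key scheme.

```python
import re

def make_key(record):
    """Compressed match key: normalized title skeleton + year + first
    author surname, as a crude duplicate-detection signature."""
    title = re.sub(r"[^a-z0-9 ]", "", record["title"].lower())
    skeleton = "".join(w[:3] for w in title.split()[:6])
    surname = record["authors"][0].split(",")[0].strip().lower()
    return f'{skeleton}|{record["year"]}|{surname}'

records = [
    {"title": "A Study of Duplicate Detection", "year": 1976, "authors": ["Giles, C.A."]},
    {"title": "A study of duplicate detection.", "year": 1976, "authors": ["Giles, C. A."]},
]
seen = {}
for r in records:
    k = make_key(r)
    print("duplicate!" if k in seen else "new", k)
    seen[k] = r
```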

  3. Knowledge-based sea ice classification by polarimetric SAR

    DEFF Research Database (Denmark)

    Skriver, Henning; Dierking, Wolfgang

    2004-01-01

    Polarimetric SAR images acquired at C- and L-band over sea ice in the Greenland Sea, Baltic Sea, and Beaufort Sea have been analysed with respect to their potential for ice type classification. The polarimetric data were gathered by the Danish EMISAR and the US AIRSAR, which both are airborne systems. A hierarchical classification scheme was chosen for sea ice because our knowledge about magnitudes, variations, and dependences of sea ice signatures can be directly considered. The optimal sequence of classification rules and the rules themselves depend on the ice conditions/regimes. The use of the polarimetric phase information improves the classification only in the case of thin ice types but is not necessary for thicker ice (above about 30 cm thickness)...

  4. Color Based Authentication Scheme for Publically Disclosable Entities

    Directory of Open Access Journals (Sweden)

    A. S. Syed Shahul Hameed

    2015-01-01

    Full Text Available Traditional password authentication systems are not strong enough to cope with the current age of cybercrime. It is high time that new password authentication schemes were explored and studied. This paper aims to provide an authentication system based on color recognition that offers a way to encrypt both the username and the password. The color based authentication system would provide an efficient way to encrypt account details for sensitive applications such as defense and banking.

  5. A New Symbol Synchronization Scheme for Cyclic Prefix Based Systems

    Institute of Scientific and Technical Information of China (English)

    KUANG Yu-jun; TENG Yong; YIN Chang-chuan; HAO Jian-jun; YUE Guang-xin

    2003-01-01

    This contribution proposes a new symbol synchronization scheme for cyclic prefix based modulation systems, as disclosed in Ref.[16]. The proposed algorithm involves two steps. By using the short-time Fourier transform, ISI-free intervals are estimated from the time-frequency spectrum of the received signal, and then an optimum symbol start time is obtained. Computer simulation results show that the algorithm is very robust and outperforms those based upon time-domain correlations.

  6. Cluster analysis based on dimensional information with applications to feature selection and classification

    Science.gov (United States)

    Eigen, D. J.; Fromm, F. R.; Northouse, R. A.

    1974-01-01

    A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.

  7. Deflection routing scheme for GMPLS-based OBS networks

    DEFF Research Database (Denmark)

    Eid, Arafat; Mahmood, Waqar; Alomar, Anwar;

    2010-01-01

    Integrating the Generalized Multi-Protocol Label Switching (GMPLS) framework into an Optical Burst Switching (OBS) Control Plane is a promising solution to alleviating most of OBS performance and design issues. However, implementing the already proposed OBS deflection routing schemes is not applicable in such an integrated solution. This is due to the existence of already established Label Switched Paths (LSPs) between edge nodes in a GMPLS-based OBS network which guide the Data Burst Headers (DBHs) through the network. In this paper we propose a novel deflection routing scheme which can be implemented in a GMPLS-based OBS Control Plane. In this scheme, deflection routes or LSPs are designed and pre-established for the whole network. The ingress nodes are responsible for enabling DBHs for deflection at contending core ports prior to DBH transmission. Moreover, we propose an object extension...

  8. Gaussian Mixture Model and Deep Neural Network based Vehicle Detection and Classification

    Directory of Open Access Journals (Sweden)

    S Sri Harsha

    2016-09-01

    Full Text Available The exponential rise in the demand for vision based traffic surveillance systems has motivated academia and industry to develop optimal vehicle detection and classification schemes. In this paper, an adaptive-learning-rate Gaussian mixture model (GMM) algorithm has been developed for background subtraction of multilane traffic data. Here, vehicle rear information and road dash-markings have been used for vehicle detection. After background subtraction, connected component analysis has been applied to retrieve the vehicle region. A multilayered AlexNet deep neural network (DNN) has been applied to extract higher-layer features. Furthermore, scale invariant feature transform (SIFT) based vehicle feature extraction has been performed. The extracted 4096-dimensional features have been processed for dimensionality reduction using principal component analysis (PCA) and linear discriminant analysis (LDA). The features have been mapped to SVM-based classification. The classification results show that AlexNet-FC6 features with LDA give an accuracy of 97.80%, followed by AlexNet-FC6 with PCA (96.75%). AlexNet-FC7 features with LDA and PCA exhibit classification accuracies of 91.40% and 96.30%, respectively. In contrast, SIFT features with the LDA algorithm exhibit 96.46% classification accuracy. The results reveal that the enhanced GMM with AlexNet DNN features at FC6 and FC7 can be significant for optimal vehicle detection and classification.
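
    A brief OpenCV sketch of the adaptive-GMM background-subtraction front end plus connected-component analysis; the video path, blob-size threshold and MOG2 parameters are placeholders, and the AlexNet/SIFT classification stages are not shown.

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")   # placeholder clip path
bg = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg = bg.apply(frame)                 # 0 background, 255 foreground, 127 shadow
    fg = cv2.medianBlur(fg, 5)
    _, fg = cv2.threshold(fg, 200, 255, cv2.THRESH_BINARY)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(fg)
    # per-frame candidate vehicle regions: (x, y, w, h) of large-enough blobs
    boxes = [stats[i, :4] for i in range(1, n)
             if stats[i, cv2.CC_STAT_AREA] > 400]
cap.release()
```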

  9. A New Wavelet-Based Document Image Segmentation Scheme

    Institute of Scientific and Technical Information of China (English)

    赵健; 李道京; 俞卞章; 耿军平

    2002-01-01

    Document image segmentation is very useful for printing, faxing and data processing. An algorithm is developed for segmenting and classifying document images. The feature used for classification is based on the histogram distribution patterns of the different image classes. An important attribute of the algorithm is the use of a wavelet correlation image to enhance the raw image's pattern, which improves the classification accuracy. In this paper the document image is divided into four types: background, photo, text and graph. First, the document image background is easily distinguished by a conventional method; second, the three remaining image types are distinguished by their typical histograms, and in order to make the histogram features clearer, each resolution's HH wavelet subimage is added to the raw image at that resolution. Finally, photo, text and graph are divided according to how well the feature fits the Laplacian distribution, measured by the χ2 and L criteria. Simulations show that the classification accuracy is significantly improved. The comparison with related work shows that our algorithm provides both lower classification error rates and better visual results.

  10. Development of a classification scheme for disease-related enzyme information

    Directory of Open Access Journals (Sweden)

    Söhngen Carola

    2011-08-01

    Full Text Available Abstract Background BRENDA (BRaunschweig ENzyme DAtabase, http://www.brenda-enzymes.org) is a major resource for enzyme related information. First and foremost, it provides data which are manually curated from the primary literature. DRENDA (Disease RElated ENzyme information DAtabase) complements BRENDA with a focus on the automatic search and categorization of enzyme and disease related information from titles and abstracts of primary publications. In a two-step procedure DRENDA makes use of text mining and machine learning methods. Results Currently enzyme and disease related references are biannually updated as part of the standard BRENDA update. 910,897 relations of EC numbers and diseases were extracted from titles or abstracts and are included in the second release in 2010. The enzyme and disease entity recognition has been successfully enhanced by a further relation classification via machine learning. The classification step has been evaluated by a 5-fold cross-validation and achieves an F1 score between 0.738 ± 0.033 and 0.802 ± 0.032, depending on the categories and pre-processing procedures. In the eventual DRENDA content every category reaches a classification specificity of at least 96.7% and a precision that ranges from 86-98% at the highest confidence level, and 64-83% at the smallest confidence level associated with higher recall. Conclusions The DRENDA processing chain analyses PubMed, locates references with disease-related information on enzymes and categorises their focus according to the categories causal interaction, therapeutic application, diagnostic usage and ongoing research. The categorisation gives an impression of the focus of the located references. Thus, the relation categorisation can facilitate orientation within the rapidly growing number of references with impact on diseases and enzymes. The DRENDA information is available as additional information in BRENDA.

  11. Video segmentation and classification for content-based storage and retrieval using motion vectors

    Science.gov (United States)

    Fernando, W. A. C.; Canagarajah, Cedric N.; Bull, David R.

    1998-12-01

    Video parsing is an important step in content-based indexing techniques, where the input video is decomposed into segments with uniform content. In video parsing, the detection of scene changes is one of the approaches widely used for extracting key frames from the video sequence. In this paper, an algorithm based on motion vectors is proposed to detect sudden scene changes and gradual scene changes (camera movements such as panning, tilting and zooming). Unlike some of the existing schemes, the proposed scheme is capable of detecting both sudden and gradual changes in uncompressed as well as compressed domain video. It is shown that the resultant motion vector can be used to identify and classify gradual changes due to camera movements. Results show that the algorithm performed as well as the histogram-based schemes with uncompressed video. The performance of the algorithm was also investigated with H.263 compressed video. The detection and classification of both sudden and gradual scene changes was successfully demonstrated.
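
    A sketch of the motion-vector idea using dense optical flow as a stand-in for codec motion vectors: a sudden cut produces a burst of large, incoherent flow, while a coherent resultant vector would instead indicate pan/tilt/zoom. The file name and threshold are placeholders, and the gradual-change classification step is not shown.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("video.mp4")   # placeholder path
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
cuts, i = [], 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2).mean()   # mean motion magnitude
    if mag > 8.0:                               # placeholder cut threshold
        cuts.append(i)
    prev_gray, i = gray, i + 1
cap.release()
print("candidate cuts at frames:", cuts)
```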

  12. A Cartesian grid-based unified gas kinetic scheme

    Science.gov (United States)

    Chen, Songze; Xu, Kun

    2014-12-01

    A Cartesian grid-based unified gas kinetic scheme is developed. In this approach, any oriented boundary in a Cartesian grid is represented by many directional boundary points. The numerical flux is evaluated on each boundary point. Then, a boundary flux interpolation method (BFIM) is constructed to distribute the boundary effect to the flow evolution on regular Cartesian grid points. The BFIM provides a general strategy to implement any kind of boundary condition on Cartesian grid. The newly developed technique is implemented in the unified gas kinetic scheme, where the scheme is reformulated into a finite difference format. Several typical test cases are simulated with different geometries. For example, the thermophoresis phenomenon for a plate with infinitesimal thickness immersed in a rarefied flow environment is calculated under different orientations on the same Cartesian grid. These computational results validate the BFIM in the unified scheme for the capturing of different thermal boundary conditions. The BFIM can be extended to the moving boundary problems as well.

  13. Structure-Based Algorithms for Microvessel Classification

    KAUST Repository

    Smith, Amy F.

    2015-02-01

    Objective: Recent developments in high-resolution imaging techniques have enabled digital reconstruction of three-dimensional sections of microvascular networks down to the capillary scale. To better interpret these large data sets, our goal is to distinguish branching trees of arterioles and venules from capillaries. Methods: Two novel algorithms are presented for classifying vessels in microvascular anatomical data sets without requiring flow information. The algorithms are compared with a classification based on observed flow directions (considered the gold standard), and with an existing resistance-based method that relies only on structural data. Results: The first algorithm, developed for networks with one arteriolar and one venular tree, performs well in identifying arterioles and venules and is robust to parameter changes, but incorrectly labels a significant number of capillaries as arterioles or venules. The second algorithm, developed for networks with multiple inlets and outlets, correctly identifies more arterioles and venules, but is more sensitive to parameter changes. Conclusions: The algorithms presented here can be used to classify microvessels in large microvascular data sets lacking flow information. This provides a basis for analyzing the distinct geometrical properties and modelling the functional behavior of arterioles, capillaries, and venules.

  14. A new support vector machine based multiuser detection scheme

    Institute of Scientific and Technical Information of China (English)

    WANG Yong-jian; ZHAO Hong-lin

    2008-01-01

    In order to suppress the multiple access interference (MAI) in 3G, which limits the capacity of a CDMA communication system, a fast relevance vector machine (FRVM) is employed in the multiuser detection (MUD) scheme. This method aims to overcome the shortcomings of many ordinary support vector machine (SVM) based MUD schemes, such as the long training time and the inaccuracy of the decision data, and to enhance the performance of a CDMA communication system. Computer simulation results demonstrate that the proposed FRVM based multiuser detection has a lower bit error rate, requires a shorter training time, needs fewer kernel functions and possesses better near-far resistance.

  15. Enhancing Community Detection By Affinity-based Edge Weighting Scheme

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Andy [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sanders, Geoffrey [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Henson, Van [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Vassilevski, Panayot [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-05

    Community detection refers to an important graph analytics problem of finding a set of densely-connected subgraphs in a graph and has gained a great deal of interest recently. The performance of current community detection algorithms is limited by an inherent constraint of unweighted graphs that offer very little information on their internal community structures. In this paper, we propose a new scheme to address this issue that weights the edges in a given graph based on recently proposed vertex affinity. The vertex affinity quantifies the proximity between two vertices in terms of their clustering strength, and therefore, it is ideal for graph analytics applications such as community detection. We also demonstrate that the affinity-based edge weighting scheme can improve the performance of community detection algorithms significantly.
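
    A small sketch of the idea with NetworkX, using Jaccard similarity of neighbourhoods as a simple stand-in for the paper's vertex affinity, then running weighted modularity-based community detection.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Weight each edge by an affinity proxy: Jaccard similarity of the two
# endpoints' neighbourhoods (a stand-in for the proposed vertex affinity).
G = nx.karate_club_graph()
for u, v in G.edges():
    nu, nv = set(G[u]), set(G[v])
    G[u][v]["weight"] = len(nu & nv) / max(len(nu | nv), 1)

parts = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in parts])
```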

  16. A Color Image Digital Watermarking Scheme Based on SOFM

    CERN Document Server

    Anitha, J

    2011-01-01

    Digital watermarking technique has been presented and widely researched to solve some important issues in the digital world, such as copyright protection, copy protection and content authentication. Several robust watermarking schemes based on vector quantization (VQ) have been presented. In this paper, we present a new digital image watermarking method based on SOFM vector quantizer for color images. This method utilizes the codebook partition technique in which the watermark bit is embedded into the selected VQ encoded block. The main feature of this scheme is that the watermark exists both in VQ compressed image and in the reconstructed image. The watermark extraction can be performed without the original image. The watermark is hidden inside the compressed image, so much transmission time and storage space can be saved when the compressed data are transmitted over the Internet. Simulation results demonstrate that the proposed method has robustness against various image processing operations without sacrif...

  17. C60-based clustering scheme for sensor management in STSS

    Institute of Scientific and Technical Information of China (English)

    Yiyu Zhou

    2015-01-01

    Clustering-based sensor-management schemes have been widely used for various wireless sensor networks (WSNs), as they are wel suited to the distributive and col aborative nature of WSN. In this paper, a C60-based clustering algorithm is proposed for the specific planned network of space tracking and surveil ance system (STSS), where al the sensors are partitioned into 12 clus-ters according to the C60 (or footbal surface) architecture, and then a hierarchical sensor-management scheme is wel designed. Final y, the algorithm is applied to a typical STSS constel ation, and the simulation results show that the proposed method has bet-ter target-tracking performance than the nonclustering scheduling method.

  18. RECURSIVE CLASSIFICATION OF MQAM SIGNALS BASED ON HIGHER ORDER CUMULANTS

    Institute of Scientific and Technical Information of China (English)

    Chen Weidong; Yang Shaoquan

    2002-01-01

    A new feature based on higher order cumulants is proposed for the classification of MQAM signals. Theoretical analysis justifies that the new feature is invariant with respect to translation (shift), scale and rotation transforms of signal constellations, and can suppress colored or white additive Gaussian noise. Computer simulation shows that the proposed recursive order-reduction based classification algorithm can classify MQAM signals of any order.
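
    The standard sample fourth-order cumulants behind such features can be computed directly; the snippet below estimates C40 and C42 for noisy 16-QAM (for unit-power 16-QAM, C42 is about -0.68, and additive Gaussian noise contributes nothing asymptotically). The paper's recursive order-reduction step is not reproduced.

```python
import numpy as np

def c40_c42(s):
    """Sample fourth-order cumulants of a zero-mean complex signal,
    standard formulas used in modulation classification."""
    c20 = np.mean(s * s)
    c21 = np.mean(np.abs(s) ** 2)
    c40 = np.mean(s ** 4) - 3 * c20 ** 2
    c42 = np.mean(np.abs(s) ** 4) - np.abs(c20) ** 2 - 2 * c21 ** 2
    return c40, c42

# Unit-average-power 16-QAM symbols plus white Gaussian noise.
n = 10000
qam = (np.random.choice([-3, -1, 1, 3], n)
       + 1j * np.random.choice([-3, -1, 1, 3], n)) / np.sqrt(10)
noise = (np.random.randn(n) + 1j * np.random.randn(n)) * 0.1
print(c40_c42(qam + noise))   # C42 near -0.68 flags 16-QAM
```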

  19. Operator functional state classification using least-square support vector machine based recursive feature elimination technique.

    Science.gov (United States)

    Yin, Zhong; Zhang, Jianhua

    2014-01-01

    This paper proposed two psychophysiological-data-driven classification frameworks for operator functional state (OFS) assessment in safety-critical human-machine systems with stable generalization ability. Recursive feature elimination (RFE) and the least squares support vector machine (LSSVM) are combined and used for binary and multiclass feature selection. Besides typical binary LSSVM classifiers for two-class OFS assessment, two multiclass classifiers based on multiclass LSSVM-RFE and the decision directed acyclic graph (DDAG) scheme are developed, one used for recognizing the high-mental-workload and fatigued state and the other for differentiating overloaded and baseline states from normal states. Feature selection results reveal that different dimensions of OFS can be characterized by specific sets of psychophysiological features. Performance comparison studies show that reasonably high and stable classification accuracy can be achieved with both classification frameworks if the RFE procedure is properly implemented and utilized.
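
    A minimal scikit-learn sketch of the RFE wrapper; a linear-kernel SVM stands in for the LSSVM (which scikit-learn does not ship), and the data are synthetic placeholders.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

# Synthetic stand-in for a psychophysiological feature matrix.
X, y = make_classification(n_samples=200, n_features=40, n_informative=8,
                           random_state=0)

# Recursively drop the weakest features according to the SVM weights.
selector = RFE(SVC(kernel="linear", C=1.0), n_features_to_select=8, step=2)
selector.fit(X, y)
print("selected feature indices:", list(selector.get_support(indices=True)))
```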

  20. Kinetic energy decomposition scheme based on information theory.

    Science.gov (United States)

    Imamura, Yutaka; Suzuki, Jun; Nakai, Hiromi

    2013-12-15

    We proposed a novel kinetic energy decomposition analysis based on information theory. Since the Hirshfeld partitioning for electron densities can be formulated in terms of the Kullback-Leibler information deficiency in information theory, a similar partitioning for kinetic energy densities was newly proposed. The numerical assessments confirm that the current kinetic energy decomposition scheme provides reasonable chemical pictures for ionic and covalent molecules, and can also estimate atomic energies using a correction with virial ratios.
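
    For reference, the standard Hirshfeld relations the abstract builds on, plus the analogous kinetic-energy weighting; the last line is the natural analogue suggested by the abstract (with τ a kinetic energy density), and the paper's own definitions may differ.

```latex
% Hirshfeld stockholder weight built from free-atom promolecular densities
w_A(\mathbf{r}) = \frac{\rho_A^{0}(\mathbf{r})}{\sum_B \rho_B^{0}(\mathbf{r})},
\qquad \rho_A(\mathbf{r}) = w_A(\mathbf{r})\,\rho(\mathbf{r})
% This partition minimizes the Kullback--Leibler information deficiency
\Delta S = \sum_A \int \rho_A(\mathbf{r})
           \ln\frac{\rho_A(\mathbf{r})}{\rho_A^{0}(\mathbf{r})}\, d\mathbf{r}
% Analogous weighting applied to a kinetic-energy density
T_A = \int w_A(\mathbf{r})\,\tau(\mathbf{r})\, d\mathbf{r}
```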

  1. Revisiting Quantum Authentication Scheme Based on Entanglement Swapping

    Science.gov (United States)

    Naseri, Mosayeb

    2016-05-01

    The crucial issue for a quantum communication protocol is its security. In this paper, the security of the Quantum Authentication Scheme Based on Entanglement Swapping proposed by Penghao et al. (Int J Theor Phys., doi: 10.1007/s10773-015-2662-7) is reanalyzed. It is shown that the original scheme does not securely complete the task of quantum authentication and communication. Furthermore, a simple improvement of the protocol is proposed.

  2. Functions and Design Scheme of Tibet High Altitude Test Base

    Institute of Scientific and Technical Information of China (English)

    Yu Yongqing; Guo Jian; Yin Yu; Mao Yan; Li Guangfan; Fan Jianbin; Lu Jiayu; Su Zhiyi; Li Peng; Li Qingfeng; Liao Weiming; Zhou Jun

    2010-01-01

    The functional orientation of the Tibet High Altitude Test Base, subordinate to the State Grid Corporation of China (SGCC), is to serve power transmission projects in high altitude areas, especially to provide technical support for southwestern hydropower delivery projects by UHVDC transmission and for the Qinghai-Tibet grid interconnection project. This paper presents the considerations behind siting and planning, the functions, the design scheme, the main performance figures and parameters of the test facilities, as well as the test and research tasks already carried out.

  3. WekaPyScript: Classification, Regression, and Filter Schemes for WEKA Implemented in Python

    Directory of Open Access Journals (Sweden)

    Christopher Beckham

    2016-08-01

    Full Text Available WekaPyScript is a package for the machine learning software WEKA that allows learning algorithms and preprocessing methods for classification and regression to be written in Python, as opposed to WEKA's implementation language, Java. This opens up WEKA to Python's machine learning and scientific computing ecosystem. Furthermore, due to Python's minimalist syntax, learning algorithms and preprocessing methods can be prototyped easily and utilised from within WEKA. WekaPyScript works by running a local Python server using the host's installation of Python; as a result, any libraries installed in the host installation can be leveraged when writing a script for WekaPyScript. Three example scripts (two learning algorithms and one preprocessing method) are presented.

  4. Spectral-Spatial Hyperspectral Image Classification Based on KNN

    Science.gov (United States)

    Huang, Kunshan; Li, Shutao; Kang, Xudong; Fang, Leyuan

    2016-12-01

    Fusion of spectral and spatial information is an effective way in improving the accuracy of hyperspectral image classification. In this paper, a novel spectral-spatial hyperspectral image classification method based on K nearest neighbor (KNN) is proposed, which consists of the following steps. First, the support vector machine is adopted to obtain the initial classification probability maps which reflect the probability that each hyperspectral pixel belongs to different classes. Then, the obtained pixel-wise probability maps are refined with the proposed KNN filtering algorithm that is based on matching and averaging nonlocal neighborhoods. The proposed method does not need sophisticated segmentation and optimization strategies while still being able to make full use of the nonlocal principle of real images by using KNN, and thus, providing competitive classification with fast computation. Experiments performed on two real hyperspectral data sets show that the classification results obtained by the proposed method are comparable to several recently proposed hyperspectral image classification methods.

  5. Integrating Globality and Locality for Robust Representation Based Classification

    Directory of Open Access Journals (Sweden)

    Zheng Zhang

    2014-01-01

    Full Text Available The representation based classification method (RBCM) has shown huge potential for face recognition since it first emerged. The linear regression classification (LRC) method and the collaborative representation classification (CRC) method are two well-known RBCMs. LRC and CRC exploit the training samples of each class and all the training samples, respectively, to represent the testing sample, and subsequently conduct classification on the basis of the representation residual. The LRC method can be viewed as a "locality representation" method because it uses only the training samples of each class to represent the testing sample, and so it cannot embody the effectiveness of the "globality representation." Conversely, the CRC method cannot enjoy the locality benefit of the general RBCM. Thus we propose to integrate CRC and LRC to perform more robust representation based classification. The experimental results on benchmark face databases substantially demonstrate that the proposed method achieves high classification accuracy.
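
    A compact sketch of the "locality representation" (LRC) side: each class's training matrix represents the probe by least squares, and the class with the smallest residual wins. The data are random placeholders; CRC would instead solve one regularized system over all classes jointly.

```python
import numpy as np

def lrc_predict(x, class_blocks):
    """Linear regression classification: pick the class whose training
    samples give the smallest least-squares representation residual."""
    residuals = {}
    for label, X_c in class_blocks.items():   # X_c: (dim, n_samples_c)
        beta, *_ = np.linalg.lstsq(X_c, x, rcond=None)
        residuals[label] = np.linalg.norm(x - X_c @ beta)
    return min(residuals, key=residuals.get), residuals

# Toy data: two classes of 64-dim "face" vectors (random placeholders).
rng = np.random.default_rng(0)
blocks = {0: rng.normal(size=(64, 5)), 1: rng.normal(size=(64, 5)) + 1.0}
probe = blocks[1][:, 0] + 0.05 * rng.normal(size=64)
print(lrc_predict(probe, blocks)[0])   # -> 1
```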

  6. [Evaluation of interventions intended to reduce psychosocial work stress. Proposal for a classification scheme].

    Science.gov (United States)

    Neuner, R; Bauer, J; Nübling, M; Rose, U; Krause, A

    2011-08-01

    Evidence for the effectiveness of measures aiming to reduce psychosocial work stress is sparse. This contradicts the requirement stated in the German Social Security Code (SGB VII) that interventions constitute the most important method of maintaining and improving employees' health. Reasons for this can be seen in the complexity of the subject and in methodological issues concerning scientific standards. In addition, there are no agreed quality standards for the evaluation of intervention measures. For this reason, a synopsis of existing audit and evaluation schemes was performed, resulting in refined and adapted quality standards for intervention measures aiming to reduce psychosocial work stress. The quality criteria presented in this paper comprise aims, effectiveness, and facilitators, each composed of several indicators. The criteria are designed as quality indicators that translate the outcome of an evaluation into quality figures. The process is transparent and offers a rational basis for communication, planning, and decision-making in health promotion.

  7. Opposition-Based Discrete PSO Using Natural Encoding for Classification Rule Discovery

    Directory of Open Access Journals (Sweden)

    Naveed Kazim Khan

    2012-11-01

    Full Text Available In this paper we present a new Discrete Particle Swarm Optimization approach to induce rules from discrete data. The proposed algorithm, called Opposition-based Natural Discrete PSO (ONDPSO), initializes its population by taking into account the discrete nature of the data. Particles are encoded using a Natural Encoding scheme. Each member of the population updates its position iteratively on the basis of a newly designed position update rule. Opposition-based learning is implemented in the optimization process. The encoding scheme and position update rule used by the algorithm allow individual terms corresponding to different attributes within the rule's antecedent to be a disjunction of the values of those attributes. The performance of the proposed algorithm is evaluated against seven different datasets using a tenfold testing scheme. The achieved median accuracy is compared against various evolutionary and non-evolutionary classification techniques. The algorithm produces promising results by creating highly accurate and precise rules for each dataset.
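
    As an illustration of the opposition-based initialization step (the fitness function, domain encoding, and the decision to keep the fitter of each particle/opposite pair are assumptions for this sketch; the paper's ONDPSO details may differ):

    ```python
    import random

    def opposite(particle, domain_sizes):
        # mirror each discrete attribute value within its domain {0..m-1}
        return [m - 1 - v for v, m in zip(particle, domain_sizes)]

    def init_population(pop_size, domain_sizes, fitness):
        population = []
        for _ in range(pop_size):
            p = [random.randrange(m) for m in domain_sizes]
            q = opposite(p, domain_sizes)
            # keep whichever of the particle and its opposite is fitter
            population.append(p if fitness(p) >= fitness(q) else q)
        return population
    ```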

  8. A new classification algorithm based on RGH-tree search

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    In this paper, we put forward a new classification algorithm based on RGH-tree search and perform a classification analysis and comparison study. The algorithm can save computing resources and increase classification efficiency. Experiments show that it performs well on three-dimensional, multi-class data and that it has better generalization ability for small training sets and large testing sets.

  9. Classification of types of stuttering symptoms based on brain activity.

    Science.gov (United States)

    Jiang, Jing; Lu, Chunming; Peng, Danling; Zhu, Chaozhe; Howell, Peter

    2012-01-01

    Among the non-fluencies seen in speech, some are more typical (MT) of stuttering speakers, whereas others are less typical (LT) and are common to both stuttering and fluent speakers. No neuroimaging work has evaluated the neural basis for grouping these symptom types. Another long-debated issue is which type (LT, MT) whole-word repetitions (WWR) should be placed in. In this study, a sentence completion task was performed by twenty stuttering patients who were scanned using an event-related design. This task elicited stuttering in these patients. Each stuttered trial from each patient was sorted into the MT or LT types with WWR put aside. Pattern classification was employed to train a patient-specific single trial model to automatically classify each trial as MT or LT using the corresponding fMRI data. This model was then validated by using test data that were independent of the training data. In a subsequent analysis, the classification model, just established, was used to determine which type the WWR should be placed in. The results showed that the LT and the MT could be separated with high accuracy based on their brain activity. The brain regions that contributed most to the separation of the types were: the left inferior frontal cortex and bilateral precuneus, both of which showed higher activity in the MT than in the LT; and the left putamen and right cerebellum, which showed the opposite activity pattern. The results also showed that the brain activity for WWR was more similar to that of the LT and fluent speech than to that of the MT. These findings provide a neurological basis for separating the MT and the LT types, and support the widely-used MT/LT symptom grouping scheme. In addition, WWR behave similarly to the LT, and thus should be placed in the LT type.
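
    An illustrative sketch of the patient-specific single-trial analysis (the study's actual features and classifier may differ; the linear SVM and split ratio here are assumptions):

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    def classify_patient(trials, labels, wwr_trials):
        """trials: (n_trials, n_voxels) fMRI activity patterns; labels: 0 = LT,
        1 = MT; wwr_trials: patterns of whole-word repetition trials."""
        X_tr, X_te, y_tr, y_te = train_test_split(
            trials, labels, test_size=0.3, stratify=labels, random_state=0)
        model = SVC(kernel="linear").fit(X_tr, y_tr)
        accuracy = model.score(X_te, y_te)    # validation on independent trials
        wwr_pred = model.predict(wwr_trials)  # which type do WWR resemble?
        return accuracy, float(np.mean(wwr_pred))  # fraction of WWR labeled MT
    ```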

  10. Classification of types of stuttering symptoms based on brain activity.

    Directory of Open Access Journals (Sweden)

    Jing Jiang

    Full Text Available Among the non-fluencies seen in speech, some are more typical (MT) of stuttering speakers, whereas others are less typical (LT) and are common to both stuttering and fluent speakers. No neuroimaging work has evaluated the neural basis for grouping these symptom types. Another long-debated issue is which type (LT, MT) whole-word repetitions (WWR) should be placed in. In this study, a sentence completion task was performed by twenty stuttering patients who were scanned using an event-related design. This task elicited stuttering in these patients. Each stuttered trial from each patient was sorted into the MT or LT types with WWR put aside. Pattern classification was employed to train a patient-specific single trial model to automatically classify each trial as MT or LT using the corresponding fMRI data. This model was then validated by using test data that were independent of the training data. In a subsequent analysis, the classification model, just established, was used to determine which type the WWR should be placed in. The results showed that the LT and the MT could be separated with high accuracy based on their brain activity. The brain regions that contributed most to the separation of the types were: the left inferior frontal cortex and bilateral precuneus, both of which showed higher activity in the MT than in the LT; and the left putamen and right cerebellum, which showed the opposite activity pattern. The results also showed that the brain activity for WWR was more similar to that of the LT and fluent speech than to that of the MT. These findings provide a neurological basis for separating the MT and the LT types, and support the widely-used MT/LT symptom grouping scheme. In addition, WWR behave similarly to the LT, and thus should be placed in the LT type.

  11. Evaluation of ALOS PALSAR Imagery for Burned Area Mapping in Greece Using Object-Based Classification

    Directory of Open Access Journals (Sweden)

    Anastasia Polychronaki

    2013-11-01

    Full Text Available In this work, the potential of Advanced Land Observing Satellite (ALOS) Phased Array type L-band Synthetic Aperture Radar (PALSAR) imagery to map burned areas was evaluated in two study areas in Greece. For this purpose, we developed an object-based classification scheme to map the fire-disturbed areas using PALSAR imagery acquired before and shortly after fire events. The advantage of employing an object-based approach was not only the use of the temporal variation of the backscatter coefficient, but also the incorporation into the classification of topological features, such as neighboring objects, and class-related features, such as objects already classified as burned. The classification scheme mapped the burned areas with satisfactory results: 0.71 and 0.82 probabilities of detection for the two study areas. Our investigation revealed that pre-fire vegetation conditions and fire severity should be taken into consideration when mapping burned areas with PALSAR in Mediterranean regions. Overall, the findings suggest that the developed scheme could be applied for rapid burned area assessment, especially in areas where cloud cover and fire smoke prevent accurate mapping when optical data are used.
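
    A much-simplified sketch of the object-based idea (all thresholds are invented, and the real scheme uses a richer rule set): objects with a large pre/post-fire backscatter change are labeled burned, and neighbors of burned objects are accepted with a relaxed threshold, mimicking the use of neighbor and class-related features:

    ```python
    import numpy as np

    def classify_objects(pre_db, post_db, adjacency, t_strong=2.0, t_weak=1.0):
        """pre_db/post_db: per-object mean backscatter (dB);
        adjacency: dict mapping object id -> set of neighboring object ids."""
        change = np.abs(post_db - pre_db)
        burned = set(np.where(change > t_strong)[0])
        grew = True
        while grew:                      # grow the burned set via neighbor evidence
            grew = False
            for i in range(len(change)):
                if i not in burned and change[i] > t_weak \
                        and adjacency.get(i, set()) & burned:
                    burned.add(i)
                    grew = True
        return burned
    ```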

  12. A Novel User Authentication Scheme Based on QR-Code

    Directory of Open Access Journals (Sweden)

    Kuan-Chieh Liao

    2010-08-01

    Full Text Available User authentication is one of the fundamental procedures for ensuring secure communications and sharing system resources over an insecure public network channel. Thus, a simple and efficient authentication mechanism is required for securing network systems in real environments. In general, a password-based authentication mechanism provides the basic capability to prevent unauthorized access, and the purpose of a one-time password is to make it more difficult to gain unauthorized access to restricted resources. Instead of using the password file of conventional authentication systems, many researchers have worked to implement various one-time password schemes using smart cards, time-synchronized tokens, or short message service in order to reduce the risk of tampering and maintenance cost. However, these schemes are impractical because the required hardware devices are far from ubiquitous or because of infrastructure requirements. To remedy these weaknesses, we introduce the QR-code technique into our one-time password authentication protocol. Unlike previous approaches, the proposed scheme based on QR codes not only eliminates the password verification table, but is also a cost-effective solution since most internet users already have mobile phones. For this reason, instead of carrying around a separate hardware token for each security domain, the handiness of the mobile phone makes our approach more practical and convenient.
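
    The record does not spell out the OTP construction; as a stand-in, the sketch below derives an HOTP-style one-time password (in the spirit of RFC 4226) that a server could encode in a QR code for the user's phone. Key handling and digit count are assumptions, and the QR encoding step is omitted:

    ```python
    import hashlib
    import hmac
    import struct

    def one_time_password(secret: bytes, counter: int, digits: int = 6) -> str:
        msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
        digest = hmac.new(secret, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                        # dynamic truncation
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # one_time_password(b"shared-secret", 42)  -> a 6-digit code for this counter
    ```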

  13. Image integrity authentication scheme based on fixed point theory.

    Science.gov (United States)

    Li, Xu; Sun, Xingming; Liu, Quansheng

    2015-02-01

    Based on fixed point theory, this paper proposes a new scheme for image integrity authentication, which is very different from digital signatures and fragile watermarking. In the new scheme, the sender transforms an original image into a fixed point image (very close to the original one) of a well-chosen transform and sends the fixed point image (instead of the original one) to the receiver; using the same transform, the receiver checks the integrity of the received image by testing whether it is a fixed point image, and locates the tampered areas if the image has been modified during transmission. A realization of the new scheme is based on the Gaussian convolution and deconvolution (GCD) transform, for which an existence theorem for fixed points is proved. The semifragility is analyzed via the commutativity of transforms, and three commutativity theorems are found for the GCD transform. Three iterative algorithms are presented for finding a fixed point image within a small number of iterations, and for the whole procedure of image integrity authentication, a fragile authentication system and a semifragile one are built separately. Experiments show that both systems perform well in transparency, fragility, security, and tampering localization. In particular, the semifragile system can perfectly resist rotation by a multiple of 90°, flipping, and brightness attacks.

  14. An image encryption scheme based on quantum logistic map

    Science.gov (United States)

    Akhshani, A.; Akhavan, A.; Lim, S.-C.; Hassan, Z.

    2012-12-01

    The topic of quantum chaos has begun to draw increasing attention in recent years. Although a satisfactory definition that differentiates it from its classical counterpart has not yet been settled, dissipative quantum maps, like classical maps, can be characterized by sensitive dependence on initial conditions. Exploiting this property, an implementation of an image encryption scheme based on the quantum logistic map is proposed. The security and performance of the proposed image encryption are analyzed using well-known methods. The results of the reliability analysis are encouraging, and it can be concluded that the proposed scheme is efficient and secure. The results of this study also suggest the application of other quantum maps, such as the quantum standard map and the quantum baker map, in cryptography and other aspects of security and privacy.
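
    The scheme iterates the quantum logistic map (three coupled equations) to produce a keystream; as a deliberately simplified classical stand-in, the sketch below shows the same keystream-XOR structure using the classical logistic map, with illustrative parameter values:

    ```python
    import numpy as np

    def logistic_keystream(x0: float, r: float, n: int, burn_in: int = 1000):
        x, out = x0, np.empty(n, dtype=np.uint8)
        for i in range(burn_in + n):
            x = r * x * (1.0 - x)          # classical logistic map iteration
            if i >= burn_in:
                out[i - burn_in] = int(x * 256) % 256
        return out

    def encrypt(image: np.ndarray, key=(0.3141592, 3.99)) -> np.ndarray:
        """image: uint8 array; XOR with the keystream is its own inverse,
        so the same call decrypts."""
        ks = logistic_keystream(key[0], key[1], image.size)
        return (image.reshape(-1) ^ ks).reshape(image.shape)
    ```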

  15. About the Key Escrow Properties of Identity Based Encryption Schemes

    Directory of Open Access Journals (Sweden)

    Ruxandra Olimid

    2012-09-01

    Full Text Available IBE (Identity Based Encryption) represents a type of public key encryption that allows a party to encrypt a message using the recipient's identity as the public key. The private keys needed for decryption are generated and distributed to each party by a KGC (Key Generation Center). The existence of such an entity in an IBE scheme allows, by construction, access to the encrypted information for parties other than the intended recipient: the KGC, or any other entity that receives the cryptographic keys from the KGC, may perform decryption. A system that permits other parties to have access to the private keys of the users is said to have key escrow abilities. The paper performs a brief analysis of the key escrow properties of IBE schemes and gives a practical example of a communication protocol that improves on the key escrow capabilities.

  16. Prediction-based association control scheme in dense femtocell networks

    Science.gov (United States)

    Pham, Ngoc-Thai; Huynh, Thong; Hwang, Won-Joo; You, Ilsun; Choo, Kim-Kwang Raymond

    2017-01-01

    The deployment of a large number of femtocell base stations allows us to extend coverage and efficiently utilize resources in a low-cost manner. However, the small cell size of femtocell networks can result in frequent handovers for the mobile user and, consequently, throughput degradation. Thus, in this paper, we propose predictive association control schemes to improve the system's effective throughput. Our design focuses on reducing handover frequency without impacting throughput. The proposed schemes determine the handover decisions that contribute most to the network throughput and are suitable for distributed implementation. The simulation results show significant gains compared with existing methods in terms of handover frequency and network throughput. PMID:28328992

  17. A 3D Automated Classification Scheme for the TAUVEX data pipeline

    CERN Document Server

    Bora, Archana; Singh, Harinder P; Murthy, Jayant; Mohan, Rekhesh

    2007-01-01

    In order to develop a pipeline for automated classification of stars to be observed by the TAUVEX ultraviolet space telescope, we employ an artificial neural network (ANN) technique for classifying stars, using synthetic spectra in the UV region from 1250 Å to 3220 Å as the training set and International Ultraviolet Explorer (IUE) low resolution spectra as the test set. Both data sets have been pre-processed to mimic the observations of the TAUVEX ultraviolet imager. We have successfully classified 229 stars from the IUE low resolution catalog to within 3-4 spectral sub-classes using two different simulated training sets, the TAUVEX spectra of 286 spectral types and the UVBLUE spectra of 277 spectral types. Further, we have also been able to obtain the colour excess (i.e., E(B-V) in magnitude units), or the interstellar reddening, for those IUE spectra with known reddening to an accuracy of better than 0.1 magnitudes. It has been shown that even with the limitation of data from just photometric bands,...

  18. An improved biometrics-based remote user authentication scheme with user anonymity.

    Science.gov (United States)

    Khan, Muhammad Khurram; Kumari, Saru

    2013-01-01

    The authors review the biometrics-based user authentication scheme proposed by An in 2012 and show that there exist loopholes in the scheme which are detrimental to its security. The authors therefore propose an improved scheme eradicating the flaws of An's scheme, followed by a detailed security analysis and an efficiency comparison. The proposed scheme not only withstands the security problems found in An's scheme but also provides some extra features with the mere addition of only two hash operations. The proposed scheme allows the user to freely change his password and also provides user anonymity with untraceability.

  19. A novel transferable individual tree crown delineation model based on Fishing Net Dragging and boundary classification

    Science.gov (United States)

    Liu, Tao; Im, Jungho; Quackenbush, Lindi J.

    2015-12-01

    This study provides a novel approach to individual tree crown delineation (ITCD) using airborne Light Detection and Ranging (LiDAR) data in dense natural forests using two main steps: crown boundary refinement based on a proposed Fishing Net Dragging (FiND) method, and segment merging based on boundary classification. FiND starts with approximate tree crown boundaries derived using a traditional watershed method with Gaussian filtering and refines these boundaries using an algorithm that mimics how a fisherman drags a fishing net. Random forest machine learning is then used to classify boundary segments into two classes: boundaries between trees and boundaries between branches that belong to a single tree. Three groups of LiDAR-derived features were used in the classification: two derived from the pseudo waveform generated along the crown boundaries and one from a canopy height model (CHM). The proposed ITCD approach was tested using LiDAR data collected over a mountainous region in the Adirondack Park, NY, USA. The overall accuracy of boundary classification was 82.4%. Features derived from the CHM were generally more important in the classification than the features extracted from the pseudo waveform. A comprehensive accuracy assessment scheme for ITCD was also introduced by considering both the area of crown overlap and crown centroids. Accuracy assessment using this new scheme shows the proposed ITCD achieved overall accuracies of 74% and 78% for deciduous and mixed forests, respectively.
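
    A sketch of the boundary-classification step (feature layout and hyperparameters are invented for illustration): a random forest labels each candidate boundary as a between-tree boundary (keep) or a within-crown branch boundary (merge the two segments it separates):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def train_boundary_classifier(features, labels):
        """features: (n_boundaries, n_features) from the pseudo waveform and
        the CHM; labels: 1 = boundary between trees, 0 = within-crown boundary."""
        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        return rf.fit(features, labels)

    def segments_to_merge(rf, features):
        # boundaries predicted as within-crown trigger a merge of their segments
        return np.where(rf.predict(features) == 0)[0]
    ```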

  20. Target searching based on modified implicit ROI encoding scheme

    Institute of Scientific and Technical Information of China (English)

    Bai Xu; Zhang Zhongzhao

    2008-01-01

    An EBCOT-based method is proposed to reduce the priority of background coefficients in the ROI code block without increasing algorithm complexity. The region of interest is encoded to a higher quality level than the background, so the target searching time in a video-guided penetrating missile can be shortened. Three kinds of coding schemes based on EBCOT are discussed. Experimental results demonstrate that the proposed method shows higher compression efficiency, lower complexity, and good reconstructed ROI image quality at lower channel capacities.

  1. Multiresolution image fusion scheme based on fuzzy region feature

    Institute of Scientific and Technical Information of China (English)

    LIU Gang; JING Zhong-liang; SUN Shao-yuan

    2006-01-01

    This paper proposes a novel region-based image fusion scheme built on multiresolution analysis. The low frequency band of the image's multiresolution representation is segmented into important regions, sub-important regions, and background regions. The features of each region are used to determine the region's degree of membership in the multiresolution representation, and then to obtain the multiresolution representation of the fusion result. The final fused image is obtained by applying the inverse multiresolution transform. Experiments showed that the proposed image fusion method performs better than existing image fusion methods.

  2. Biometrics based authentication scheme for session initiation protocol.

    Science.gov (United States)

    Xie, Qi; Tang, Zhixiong

    2016-01-01

    Many two-factor challenge-response based session initiation protocol (SIP) authentication schemes have been proposed, but most of them are vulnerable to smart card stolen attacks and password guessing attacks. In this paper, we propose a novel three-factor SIP authentication scheme using biometrics, a password, and a smart card, and utilize the pi calculus-based formal verification tool ProVerif to prove that the proposed protocol achieves security and authentication. Furthermore, our protocol is highly efficient when compared to other related protocols.

  3. Image-Based Coral Reef Classification and Thematic Mapping

    Directory of Open Access Journals (Sweden)

    Brooke Gintert

    2013-04-01

    Full Text Available This paper presents a novel image classification scheme for benthic coral reef images that can be applied to both single image and composite mosaic datasets. The proposed method can be configured to the characteristics (e.g., the size of the dataset, number of classes, resolution of the samples, color information availability, class types, etc.) of individual datasets. The proposed method uses completed local binary pattern (CLBP), grey level co-occurrence matrix (GLCM), Gabor filter response, and opponent angle and hue channel color histograms as feature descriptors. For classification, either k-nearest neighbor (KNN), neural network (NN), support vector machine (SVM) or probability density weighted mean distance (PDWMD) is used. The combination of features and classifiers that attains the best results is presented together with the guidelines for selection. The accuracy and efficiency of our proposed method are compared with other state-of-the-art techniques using three benthic and three texture datasets. The proposed method achieves the highest overall classification accuracy of any of the tested methods and has moderate execution time. Finally, the proposed classification scheme is applied to a large-scale image mosaic of the Red Sea to create a completely classified thematic map of the reef benthos.
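
    A hedged sketch of one feature/classifier pairing from the paper's menu, GLCM texture with an SVM (the CLBP, Gabor, and color-histogram features are omitted, and all parameters are illustrative):

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.svm import SVC

    def glcm_features(patch):
        """patch: 2-D uint8 grayscale patch of one benthic sample."""
        glcm = graycomatrix(patch, distances=[1, 2], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        props = ["contrast", "homogeneity", "energy", "correlation"]
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    def train_classifier(patches, labels):
        X = np.vstack([glcm_features(p) for p in patches])
        return SVC(kernel="rbf", C=10.0).fit(X, labels)
    ```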

  4. Dynamic Symmetric Key Mobile Commerce Scheme Based on Self-Verified Mechanism

    Directory of Open Access Journals (Sweden)

    Jiachen Yang

    2014-01-01

    Full Text Available Considering the security and efficiency of mobile e-commerce, the authors summarize the advantages and disadvantages of several related schemes, especially the self-verified mobile payment scheme based on the elliptic curve cryptosystem (ECC), and then propose a new type of dynamic symmetric key mobile commerce scheme based on a self-verified mechanism. The authors analyze the basic algorithm based on self-verified mechanisms, detail the complete transaction process of the proposed scheme, and analyze it with respect to security and efficiency. The analysis shows that the proposed scheme not only meets the efficiency requirements of mobile electronic payment but also takes security into account. The user confirmation mechanism at the end of the proposed scheme further strengthens its security. In brief, the proposed scheme is more efficient and practical than most existing schemes.

  5. Staggering behavior of the first excited 2+ states of even-even nuclei in a Sp(4, R) classification scheme

    Energy Technology Data Exchange (ETDEWEB)

    Drenska, S.; Georgieva, A.; Minkov, N. [Bulgarian Academy of Sciences, Inst. for Nuclear Research and Nuclear Energy, Sofia (Bulgaria)]

    2002-12-01

    We implement a high order discrete derivative analysis of the lowest nuclear collective excitations in terms of the quantum numbers of an algebraic Sp(4, R) classification scheme. The results reveal a fine systematic behavior of nuclear collectivity in terms of nucleon pairing and high order quartetting correlations. (author)

  6. AN OBJECT-BASED METHOD FOR CHINESE LANDFORM TYPES CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Ding

    2016-06-01

    Full Text Available Landform classification is a necessary task for various fields of landscape and regional planning, for example landscape evaluation, erosion studies, and hazard prediction. This study proposes an improved object-based classification for Chinese landform types using the factor importance analysis of random forest and the gray-level co-occurrence matrix (GLCM). Based on the 1 km DEM of China, a combination of terrain factors extracted from the DEM is selected by correlation analysis and Sheffield's entropy method. A random forest classification tree is applied to evaluate the importance of the terrain factors, which are used as multi-scale segmentation thresholds. Then the GLCM is computed to build the knowledge base for classification. The classification result was checked against the 1:4,000,000 Chinese Geomorphological Map as reference; the overall classification accuracy of the proposed method is 5.7% higher than ISODATA unsupervised classification, and 15.7% higher than the traditional object-based classification method.

  7. An Object-Based Method for Chinese Landform Types Classification

    Science.gov (United States)

    Ding, Hu; Tao, Fei; Zhao, Wufan; Na, Jiaming; Tang, Guo'an

    2016-06-01

    Landform classification is a necessary task for various fields of landscape and regional planning, for example landscape evaluation, erosion studies, and hazard prediction. This study proposes an improved object-based classification for Chinese landform types using the factor importance analysis of random forest and the gray-level co-occurrence matrix (GLCM). Based on the 1 km DEM of China, a combination of terrain factors extracted from the DEM is selected by correlation analysis and Sheffield's entropy method. A random forest classification tree is applied to evaluate the importance of the terrain factors, which are used as multi-scale segmentation thresholds. Then the GLCM is computed to build the knowledge base for classification. The classification result was checked against the 1:4,000,000 Chinese Geomorphological Map as reference; the overall classification accuracy of the proposed method is 5.7% higher than ISODATA unsupervised classification, and 15.7% higher than the traditional object-based classification method.

  8. Radar-Derived Quantitative Precipitation Estimation Based on Precipitation Classification

    Directory of Open Access Journals (Sweden)

    Lili Yang

    2016-01-01

    Full Text Available A method for improving radar-derived quantitative precipitation estimation is proposed. Tropical vertical profiles of reflectivity (VPRs) are first determined from multiple VPRs. Upon identifying a tropical VPR, the event can be further classified as either tropical-stratiform or tropical-convective rainfall by a fuzzy logic (FL) algorithm. Based on the precipitation-type fields, the reflectivity values are converted into rainfall rates using a Z-R relationship. In order to evaluate the performance of this rainfall classification scheme, three experiments were conducted using three months of data and two study cases. In Experiment I, the Weather Surveillance Radar-1988 Doppler (WSR-88D) default Z-R relationship was applied. In Experiment II, the precipitation regime was separated into convective and stratiform rainfall using the FL algorithm, and the corresponding Z-R relationships were used. In Experiment III, the precipitation regime was separated into convective, stratiform, and tropical rainfall, and the corresponding Z-R relationships were applied. The results show that the rainfall rates obtained from all three experiments match closely with the gauge observations; Experiment II reduced the underestimation seen in Experiment I, and Experiment III reduced it significantly, generating the most accurate radar estimates of rain rate among the three experiments.
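
    The per-regime conversion follows Z = a R^b, inverted as R = (Z/a)^(1/b) on the linear reflectivity. The coefficients below are common published values (the WSR-88D default, Marshall-Palmer, and a tropical relationship); the paper's exact tropical coefficients may differ:

    ```python
    import numpy as np

    ZR = {"convective": (300.0, 1.4),   # WSR-88D default Z = 300 R^1.4
          "stratiform": (200.0, 1.6),   # Marshall-Palmer Z = 200 R^1.6
          "tropical":   (250.0, 1.2)}   # Rosenfeld tropical Z = 250 R^1.2

    def rain_rate(dbz, regime):
        """dbz: reflectivity field (dBZ); regime: per-pixel class labels."""
        z_lin = 10.0 ** (np.asarray(dbz) / 10.0)   # dBZ -> linear units
        rate = np.zeros_like(z_lin)
        for name, (a, b) in ZR.items():
            mask = np.asarray(regime) == name
            rate[mask] = (z_lin[mask] / a) ** (1.0 / b)
        return rate                                 # rain rate in mm/h
    ```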

  9. Periodic Sweep Coverage Scheme Based on Periodic Vehicle Routing Problem

    Directory of Open Access Journals (Sweden)

    Li Shu

    2014-03-01

    Full Text Available We provide a sweep coverage algorithm for routing mobile sensors that communicate with a central data sink. This algorithm improves on its predecessors by reducing the number of unnecessary scans when different points of interest (POIs) have different requirements for the time interval within which they must be scanned (the sweep period). Most sweep coverage algorithms seek to minimize the number of sensors required to cover a given collection of POIs. When POIs have different sweep period requirements, existing algorithms will produce solutions in which sensors visit some POIs much more frequently than is necessary. We define this as the POI Over-Coverage problem. In order to address this problem we develop a Periodic Sweep Coverage (PSC) scheme based on a well-known solution to the Periodic Vehicle Routing Problem (PVRP). Our algorithm seeks a route for the mobile sensors that minimizes the number of unnecessary visits to each POI. To verify and test the proposed scheme we implemented a C++ simulation and ran scenarios with a variety of POI topologies (number and distribution of the POIs) and speeds at which sensors could travel. The simulation results show that the PSC algorithm outperforms other sweep coverage algorithms such as CSweep and Vehicle Routing Problem Sweep Coverage (VRPSC) on both the average number of sensors in a solution and the computational time required to find a solution. Our results also demonstrate that the PSC scheme is more suitable for sweep coverage scenarios in which higher speed mobile sensors are used.

  10. Demand response scheme based on lottery-like rebates

    KAUST Repository

    Schwartz, Galina A.

    2014-08-24

    In this paper, we develop a novel mechanism for reducing the volatility of residential demand for electricity. We construct a reward-based (rebate) mechanism that provides consumers with incentives to shift their demand to off-peak times. In contrast to most other mechanisms proposed in the literature, the key feature of our mechanism is its modest requirements on user preferences, i.e., it does not require exact knowledge of user responsiveness to rewards for shifting demand from peak to off-peak times. Specifically, our mechanism utilizes a probabilistic reward structure for users who shift their demand to off-peak times, and is robust to incomplete information about user demand and/or risk preferences. We approach the problem from the public good perspective and demonstrate that the mechanism can be implemented via lottery-like schemes. Our mechanism makes it possible to reduce distribution losses and thus improve the efficiency of electricity distribution. Finally, the mechanism can be readily incorporated into emerging demand response schemes (e.g., time-of-day pricing and critical peak pricing) and has security and privacy-preserving properties.

  11. Fast Wavelet-Based Visual Classification

    CERN Document Server

    Yu, Guoshen

    2008-01-01

    We investigate a biologically motivated approach to fast visual classification, directly inspired by the recent work of Serre et al. Specifically, trading off biological accuracy for computational efficiency, we explore using wavelet and grouplet-like transforms to parallel the tuning of visual cortex V1 and V2 cells, alternated with max operations to achieve scale and translation invariance. A feature selection procedure is applied during learning to accelerate recognition. We introduce a simple attention-like feedback mechanism, significantly improving recognition and robustness in multiple-object scenes. In experiments, the proposed algorithm achieves or exceeds state-of-the-art success rates on object recognition, texture and satellite image classification, language identification, and sound classification.

  12. Arbitrated quantum signature scheme based on reusable key

    Science.gov (United States)

    Yu, ChaoHua; Guo, GongDe; Lin, Song

    2014-11-01

    An arbitrated quantum signature scheme without using entangled states is proposed. In the scheme, by employing a classical hash function and random numbers, the secret keys of signer and receiver can be reused. It is shown that the proposed scheme is secure against several well-known attacks. Specifically, it can stand against the receiver's disavowal attack. Moreover, compared with previous relevant arbitrated quantum signature schemes, the scheme proposed has the advantage of less transmission complexity.

  13. Arbitrated quantum signature scheme based on reusable key

    Institute of Scientific and Technical Information of China (English)

    YU ChaoHua; GUO GongDe; LIN Song

    2014-01-01

    An arbitrated quantum signature scheme without using entangled states is proposed. In the scheme, by employing a classical hash function and random numbers, the secret keys of signer and receiver can be reused. It is shown that the proposed scheme is secure against several well-known attacks. Specifically, it can stand against the receiver's disavowal attack. Moreover, compared with previous relevant arbitrated quantum signature schemes, the scheme proposed has the advantage of less transmission complexity.

  14. A secure biometrics-based authentication scheme for telecare medicine information systems.

    Science.gov (United States)

    Yan, Xiaopeng; Li, Weiheng; Li, Ping; Wang, Jiantao; Hao, Xinhong; Gong, Peng

    2013-10-01

    The telecare medicine information system (TMIS) allows patients and doctors to access medical services or medical information at remote sites, and therefore brings great convenience. To safeguard patients' privacy, authentication schemes for the TMIS have attracted wide attention. Recently, Tan proposed an efficient biometrics-based authentication scheme for the TMIS and claimed the scheme could withstand various attacks. However, in this paper, we point out that Tan's scheme is vulnerable to the denial-of-service attack. To enhance security, we also propose an improved scheme based on Tan's work. Security and performance analysis shows that our scheme not only overcomes the weakness in Tan's scheme but also has better performance.

  15. Knowledge-Based Classification in Automated Soil Mapping

    Institute of Scientific and Technical Information of China (English)

    ZHOU BIN; WANG RENCHAO

    2003-01-01

    A machine-learning approach was developed for the automated building of knowledge bases for soil resources mapping, using a classification tree to generate knowledge from training data. With this method, building a knowledge base for automated soil mapping is easier than using the conventional knowledge acquisition approach. The knowledge base built by the classification tree was used by the knowledge classifier to perform soil type classification of Longyou County, Zhejiang Province, China, using Landsat TM bi-temporal images and GIS data. To evaluate the performance of the resultant knowledge bases, the classification results were compared to an existing soil map based on a field survey. The accuracy assessment and analysis of the resultant soil maps suggested that the knowledge bases built by the machine-learning method were of good quality for mapping the distribution of soil classes over the study area.
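
    An illustrative sketch of the machine-learning step, under the assumption that the features are spectral bands plus GIS-derived terrain variables: induce a classification tree from training pixels and read it out as human-inspectable rules for the knowledge base:

    ```python
    from sklearn.tree import DecisionTreeClassifier, export_text

    def build_knowledge_base(X, y, feature_names):
        """X: (n_samples, n_features) training data; y: soil class labels."""
        tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X, y)
        rules = export_text(tree, feature_names=feature_names)
        return tree, rules   # 'rules' is a text rendering of the learned knowledge
    ```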

  16. Shape classification based on singular value decomposition transform

    Institute of Scientific and Technical Information of China (English)

    SHAABAN Zyad; ARIF Thawar; BABA Sami; KREKOR Lala

    2009-01-01

    In this paper, a new shape classification system based on the singular value decomposition (SVD) transform and a nearest neighbour classifier is proposed. The gray scale image of the shape object is converted into a black and white image, and the squared Euclidean distance transform is applied to the binary image to extract the boundary image of the shape. SVD transform features are then extracted from the boundary of the object shape. The proposed classification system based on SVD transform feature extraction was compared with a classifier based on moment invariants using the same nearest neighbour classifier. The experimental results showed the advantage of the proposed classification system.
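
    A sketch of the feature pipeline under stated assumptions (the threshold, descriptor length, and the crude boundary test are illustrative): binarize, locate the boundary via the squared Euclidean distance transform, and use the leading singular values of the boundary image as the shape descriptor:

    ```python
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def svd_shape_features(gray, k=10, thresh=128):
        binary = (gray >= thresh).astype(np.uint8)      # black-and-white image
        sq_dist = distance_transform_edt(binary) ** 2   # squared EDT
        boundary = (sq_dist == 1).astype(float)         # pixels one step inside
        s = np.linalg.svd(boundary, compute_uv=False)   # singular values only
        return s[:k] / (s.sum() + 1e-12)                # normalized descriptor

    def nearest_neighbour(query_feat, gallery_feats, gallery_labels):
        d = [np.linalg.norm(query_feat - f) for f in gallery_feats]
        return gallery_labels[int(np.argmin(d))]
    ```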

  17. Multiclass Classification Based on the Analytical Center of Version Space

    Institute of Scientific and Technical Information of China (English)

    ZENG Fanzi; QIU Zhengding; YUE Jianhai; LI Xiangqian

    2005-01-01

    The analytical center machine, based on the analytical center of version space, outperforms the support vector machine, especially when the version space is elongated or asymmetric. While the analytical center machine for binary classification is well understood, little is known about the corresponding multiclass classification. Moreover, the current multiclass classification method, "one versus all," needs to repeatedly construct classifiers to separate a single class from all the others, which leads to daunting computation and low classification efficiency; and though the multiclass support vector machine corresponds to a simple quadratic optimization, it is not very effective when the version space is asymmetric or elongated. Thus, a multiclass classification approach based on the analytical center of version space is proposed to address the above problems. Experiments on wine recognition and glass identification datasets demonstrate the validity of the proposed approach.

  18. Parallel Implementation of Classification Algorithms Based on Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wenbo Wang

    2012-09-01

    Full Text Available As an important task of data mining, classification has received considerable attention in many applications, such as information retrieval and web searching. The growing volume of information produced by technological progress and the increasing individual needs of data mining make classifying very large datasets a challenging task. In order to deal with this problem, many researchers try to design efficient parallel classification algorithms. This paper briefly introduces classification algorithms and cloud computing, analyses the shortcomings of present parallel classification algorithms, and then proposes a new model of parallel classification algorithms. It mainly introduces a parallel Naïve Bayes classification algorithm based on MapReduce, which is a simple yet powerful parallel programming technique. The experimental results demonstrate that the proposed algorithm improves on the original algorithm's performance and can process large datasets efficiently on commodity hardware.
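
    A toy, in-process sketch of the MapReduce formulation of Naïve Bayes training (a real job would run on a Hadoop-style cluster; the record layout is an assumption): mappers emit one key per class prior and per (class, feature, value) observation, and the reduce phase sums identical keys into the count tables from which conditional probabilities are estimated:

    ```python
    from collections import Counter
    from itertools import chain

    def mapper(record):
        label, features = record
        yield ("prior", label)
        for j, v in enumerate(features):
            yield (label, j, v)

    def reduce_counts(records):
        # shuffle/reduce phase: count identical keys emitted by all mappers
        return Counter(chain.from_iterable(mapper(r) for r in records))

    # counts = reduce_counts([("spam", [1, 0]), ("ham", [0, 0]), ("spam", [1, 1])])
    # P(feature_0 = 1 | spam) ~ counts[("spam", 0, 1)] / counts[("prior", "spam")]
    ```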

  19. An Efficient Audio Classification Approach Based on Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Lhoucine Bahatti

    2016-05-01

    Full Text Available In order to achieve an audio classification aimed at identifying the composer, the use of adequate and relevant features is important to improve performance, especially when the classification algorithm is based on support vector machines. As opposed to conventional approaches that often use timbral features based on a time-frequency representation of the musical signal with a constant window, this paper deals with a new audio classification method which improves feature extraction according to the Constant Q Transform (CQT) approach and includes original audio features related to the musical context in which the notes appear. This work also proposes an optimal feature selection procedure which combines filter and wrapper strategies. Experimental results show the accuracy and efficiency of the adopted approach in binary classification as well as in multi-class classification.

  20. Behavior Based Social Dimensions Extraction for Multi-Label Classification.

    Science.gov (United States)

    Li, Le; Xu, Junyi; Xiao, Weidong; Ge, Bin

    2016-01-01

    Classification based on social dimensions is commonly used to handle the multi-label classification task in heterogeneous networks. However, traditional methods, which mostly rely on community detection algorithms to extract the latent social dimensions, produce unsatisfactory performance when community detection algorithms fail. In this paper, we propose a novel behavior based social dimensions extraction method to improve classification performance in multi-label heterogeneous networks. In our method, nodes' behavior features, instead of community memberships, are used to extract social dimensions. By introducing Latent Dirichlet Allocation (LDA) to model the network generation process, nodes' connection behaviors with different communities can be extracted accurately and applied as latent social dimensions for classification. Experiments on various public datasets reveal that the proposed method obtains satisfactory classification results in comparison to other state-of-the-art methods with smaller social dimensions.

  1. A New Images Hiding Scheme Based on Chaotic Sequences

    Institute of Scientific and Technical Information of China (English)

    LIU Nian-sheng; GUO Dong-hui; WU Bo-xi; Parr G

    2005-01-01

    We propose a data hiding technique for still images. The technique is based on chaotic sequences in the transform domain of the cover image. We use different chaotic random sequences, multiplied by multiple sensitive images respectively, to spread the spectrum of the sensitive images. Multiple sensitive images are hidden in a cover image as a form of noise. The results of theoretical analysis and computer simulation show that the new hiding technique has better properties, with high security, imperceptibility, and capacity for hidden information, in comparison with conventional schemes such as LSB (Least Significant Bit).

  2. An Industrial Model Based Disturbance Feedback Control Scheme

    DEFF Research Database (Denmark)

    Kawai, Fukiko; Nakazawa, Chikashi; Vinther, Kasper;

    2014-01-01

    This paper presents a model based disturbance feedback control scheme. Industrial process systems have traditionally been controlled by using relay and PID controllers. However, these controllers are affected by disturbances and model errors, and these effects degrade control performance. The authors propose a new control method that can decrease the negative impact of disturbances and model errors. The control method is motivated by industrial practice at Fuji Electric. Simulation tests are examined with a conventional PID controller and the disturbance feedback control.

  3. GSM-MRF based classification approach for real-time moving object detection

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Statistical and contextual information are typically used to detect moving regions in image sequences from a fixed camera. In this paper, we propose a fast and stable linear discriminant approach based on the Gaussian Single Model (GSM) and the Markov Random Field (MRF). The performance of GSM is analyzed first, and then two main improvements addressing the drawbacks of GSM are proposed: an update scheme for the background model based on the latest filtered data, and a linear classification judgment rule based on spatial-temporal features specified by the MRF. Experimental results show that the proposed method runs more rapidly and accurately when compared with other methods.
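
    A sketch of the two improvements as described, with an invented learning rate and threshold (the MRF-based spatial smoothing of the foreground mask is omitted): the Gaussian background model is updated only from the latest filtered, i.e. background-classified, pixels:

    ```python
    import numpy as np

    def update_gsm(frame, mean, var, alpha=0.02, k=2.5):
        """frame, mean, var: float arrays of identical shape."""
        fg = np.abs(frame - mean) > k * np.sqrt(var)   # classification rule
        bg = ~fg                                       # latest filtered data
        mean[bg] = (1 - alpha) * mean[bg] + alpha * frame[bg]
        var[bg] = (1 - alpha) * var[bg] + alpha * (frame[bg] - mean[bg]) ** 2
        return fg, mean, var
    ```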

  4. Cardiac Arrhythmias Classification Method Based on MUSIC, Morphological Descriptors, and Neural Network

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available An electrocardiogram (ECG) beat classification scheme based on the multiple signal classification (MUSIC) algorithm, morphological descriptors, and neural networks is proposed for discriminating nine ECG beat types. These are normal, fusion of ventricular and normal, fusion of paced and normal, left bundle branch block, right bundle branch block, premature ventricular contraction, atrial premature contraction, paced beat, and ventricular flutter. ECG signal samples from the MIT-BIH arrhythmia database are used to evaluate the scheme. The MUSIC algorithm is used to calculate the pseudospectrum of ECG signals. The low-frequency samples are picked to have the most valuable heartbeat information. These samples, along with two morphological descriptors which deliver the characteristics and features of all parts of the heart, form an input feature vector. This vector is used for the initial training of a classifier neural network. The neural network is designed to have nine sample outputs which constitute the nine beat types. Two neural network schemes, namely the multilayered perceptron (MLP) neural network and the probabilistic neural network (PNN), are employed. The experimental results achieved a promising accuracy of 99.03% for classifying the beat types using the MLP neural network. In addition, our scheme recognizes the NORMAL class with 100% accuracy and never misclassifies any other class as NORMAL.

  5. Cardiac Arrhythmias Classification Method Based on MUSIC, Morphological Descriptors, and Neural Network

    Science.gov (United States)

    Naghsh-Nilchi, Ahmad R.; Kadkhodamohammadi, A. Rahim

    2009-12-01

    An electrocardiogram (ECG) beat classification scheme based on the multiple signal classification (MUSIC) algorithm, morphological descriptors, and neural networks is proposed for discriminating nine ECG beat types. These are normal, fusion of ventricular and normal, fusion of paced and normal, left bundle branch block, right bundle branch block, premature ventricular contraction, atrial premature contraction, paced beat, and ventricular flutter. ECG signal samples from the MIT-BIH arrhythmia database are used to evaluate the scheme. The MUSIC algorithm is used to calculate the pseudospectrum of ECG signals. The low-frequency samples are picked to have the most valuable heartbeat information. These samples, along with two morphological descriptors which deliver the characteristics and features of all parts of the heart, form an input feature vector. This vector is used for the initial training of a classifier neural network. The neural network is designed to have nine sample outputs which constitute the nine beat types. Two neural network schemes, namely the multilayered perceptron (MLP) neural network and the probabilistic neural network (PNN), are employed. The experimental results achieved a promising accuracy of 99.03% for classifying the beat types using the MLP neural network. In addition, our scheme recognizes the NORMAL class with 100% accuracy and never misclassifies any other class as NORMAL.

  6. Twig Pattern Matching Based on Compressed Path Labeling Scheme

    Institute of Scientific and Technical Information of China (English)

    NING Bo; WANG Guoren; DONG Ke

    2007-01-01

    Holistic twig query processing techniques based on region encoding have been developed to minimize intermediate results, namely those root-to-leaf path matches that are not in the final twig results. These algorithms have to scan all the streams of tags in query patterns, yet useless path matches cannot be completely avoided. TJFast, which is based on the Extended Dewey labeling scheme, has been proposed to avoid useless intermediate results, and it only needs to access the labels of the leaf query nodes. However, it does not consider the characteristics of elements with the same parent, and it has to merge-join all the intermediate results evaluated during the first phase. We propose a new labeling scheme to compress the XML elements which share the same characteristic. Based on the compressed path-labeled streams, a novel holistic twig query algorithm named CPJoin is designed. Finally, implementation results are provided to show that CPJoin has good performance on both real and synthetic data.

  7. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  8. Geometrically Invariant Watermarking Scheme Based on Local Feature Points

    Directory of Open Access Journals (Sweden)

    Jing Li

    2012-06-01

    Full Text Available Based on local invariant feature points and the cross ratio principle, this paper presents a feature-point-based image watermarking scheme that is robust to geometric attacks and some signal processing operations. The scheme extracts local invariant feature points from the image using an improved scale invariant feature transform algorithm. Using these points as vertexes, it constructs quadrilaterals that serve as local feature regions, and the watermark is embedded repeatedly in these regions. In order to obtain stable local regions, it adjusts the number and distribution of the extracted feature points. In every chosen local feature region, it selects the locations for embedding watermark bits based on the cross ratio of four collinear points; the cross ratio is invariant under projective transformation. Watermark bits are embedded by quantization modulation, in which the quantization step value is computed from the given PSNR. Experimental results show that the proposed method can withstand many geometric attacks as well as compound geometric attacks.
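
    A sketch of bit embedding by quantization modulation: the value at a chosen location is quantized onto one of two interleaved lattices depending on the bit, with the step delta assumed here to be given (in the paper it is derived from the target PSNR):

    ```python
    import numpy as np

    def embed_bit(value: float, bit: int, delta: float) -> float:
        # multiples of delta encode 0; shifted by delta/2 they encode 1
        q = np.round((value - bit * delta / 2.0) / delta)
        return float(q * delta + bit * delta / 2.0)

    def extract_bit(value: float, delta: float) -> int:
        # decode by finding which lattice the received value is closer to
        d0 = abs(value - embed_bit(value, 0, delta))
        d1 = abs(value - embed_bit(value, 1, delta))
        return 0 if d0 <= d1 else 1
    ```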

  9. A Region-Based GeneSIS Segmentation Algorithm for the Classification of Remotely Sensed Images

    Directory of Open Access Journals (Sweden)

    Stelios K. Mylonas

    2015-03-01

    Full Text Available This paper proposes an object-based segmentation/classification scheme for remotely sensed images, based on a novel variant of the recently proposed Genetic Sequential Image Segmentation (GeneSIS) algorithm. GeneSIS segments the image in an iterative manner, whereby at each iteration a single object is extracted via a genetic-based object extraction algorithm. Contrary to the previous pixel-based GeneSIS, where the candidate objects to be extracted were evaluated through the fuzzy content of their included pixels, in the newly developed region-based GeneSIS algorithm a watershed-driven fine segmentation map is initially obtained from the original image, which serves as the basis for the subsequent GeneSIS segmentation. Furthermore, in order to enhance the spatial search capabilities, we introduce a more descriptive encoding scheme in the object extraction algorithm, where the structural search modules are represented by polygonal shapes. Our objectives in the new framework are as follows: enhance the flexibility of the algorithm in extracting more flexible object shapes, assure high classification accuracy, and reduce the execution time of the segmentation, while preserving all the inherent attributes of the GeneSIS approach. Finally, exploiting the inherent ability of GeneSIS to produce multiple segmentations, we also propose two segmentation fusion schemes that operate on the ensemble of segmentations generated by GeneSIS. Our approaches are tested on one urban and two agricultural images. The results show that region-based GeneSIS has considerably lower computational demands compared to the pixel-based version. Furthermore, the suggested methods achieve higher classification accuracies and good segmentation maps compared to a series of existing algorithms.

  10. TENSOR MODELING BASED FOR AIRBORNE LiDAR DATA CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    N. Li

    2016-06-01

    Full Text Available Feature selection and description is a key factor in the classification of Earth observation data. In this paper a classification method based on tensor decomposition is proposed. First, multiple features are extracted from the raw LiDAR point cloud, and raster LiDAR images are derived by accumulating features or the "raw" data attributes. Then, the feature rasters of the LiDAR data are stored as a tensor, and tensor decomposition is used to select component features. This tensor representation keeps the initial spatial structure and ensures that the neighborhood is taken into consideration. Based on a small number of component features, a k-nearest-neighbor classification is applied.

  11. Tensor Modeling Based for Airborne LiDAR Data Classification

    Science.gov (United States)

    Li, N.; Liu, C.; Pfeifer, N.; Yin, J. F.; Liao, Z. Y.; Zhou, Y.

    2016-06-01

    Feature selection and description is a key factor in the classification of Earth observation data. In this paper a classification method based on tensor decomposition is proposed. First, multiple features are extracted from the raw LiDAR point cloud, and raster LiDAR images are derived by accumulating features or the "raw" data attributes. Then, the feature rasters of the LiDAR data are stored as a tensor, and tensor decomposition is used to select component features. This tensor representation keeps the initial spatial structure and ensures that the neighborhood is taken into consideration. Based on a small number of component features, a k-nearest-neighbor classification is applied.

  12. Staggering behavior of the low lying excited states of even-even nuclei in a Sp(4,R) classification scheme

    CERN Document Server

    Drenska, S B; Minkov, N

    2002-01-01

    We implement a high order discrete derivative analysis of the low lying collective energies of even-even nuclei with respect to the total number of valence nucleon pairs N, in the framework of the F-spin multiplets appearing in a symplectic sp(4,R) classification scheme. We find that for the nuclei of any given F-multiplet the respective experimental energies exhibit a Delta N=2 staggering behavior, and for the nuclei of two united neighboring F-multiplets well pronounced Delta N=1 staggering patterns are observed. These effects have been reproduced successfully through a generalized sp(4,R) model energy expression and explained in terms of the step-like changes in collective modes within the F-multiplets and the alternation of the F-spin projection in the united neighboring multiplets. On this basis we suggest that the observed Delta N=2 and Delta N=1 staggering effects carry detailed information about the respective systematic manifestation of both high order alpha-particle-like quartetting of nucleons and ...
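
    A high order discrete derivative of this kind is commonly written as a five-point filter of the energy differences; the form below is a standard one from the staggering literature, and the paper's exact normalization may differ:

    ```latex
    \[
      \Delta E(N) = E(N+1) - E(N),
    \]
    \[
      \mathrm{Stg}(N) = 6\,\Delta E(N) - 4\,\Delta E(N-1) - 4\,\Delta E(N+1)
                        + \Delta E(N-2) + \Delta E(N+2).
    \]
    ```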

  13. Generating Unstable Resonances for Extraction Schemes Based on Transverse Splitting

    CERN Document Server

    Giovannozzi, M; Turchetti, G

    2009-01-01

    A few years ago, a novel multi-turn extraction scheme was proposed, based on particle trapping inside stable resonances. Numerical simulations and experimental tests have confirmed the feasibility of such a scheme for low order resonances. While the third-order resonance is generically unstable and resonances higher than fourth order are generically stable, the fourth-order resonance can be either stable or unstable depending on the specifics of the system under consideration. By means of Normal Forms, a general approach to controlling the stability of the fourth-order resonance has been derived. This approach is based on controlling the amplitude detuning, and the general form for a lattice with an arbitrary number of sextupole and octupole families is derived in this paper. Numerical simulations have confirmed the analytical results and have shown that, when crossing the unstable fourth-order resonance, the region around the centre of the phase space is depleted and particles are trapped in only the four stable ...

  14. Motion feature extraction scheme for content-based video retrieval

    Science.gov (United States)

    Wu, Chuan; He, Yuwen; Zhao, Li; Zhong, Yuzhuo

    2001-12-01

    This paper proposes a scheme for extracting global motion and object trajectories from a video shot for content-based video retrieval. Motion is the key feature representing the temporal information of videos, and it is more objective and consistent compared to other features such as color and texture. Efficient motion feature extraction is therefore an important step for content-based video retrieval. Some approaches have been taken to extract camera motion and motion activity in video sequences, but when dealing with object tracking, algorithms are usually proposed on the basis of a known object region in the frames. In this paper, a complete picture of the motion information in a video shot is obtained by automatically analyzing the motion of the background and foreground separately. A 6-parameter affine model is utilized as the model for the background motion, and a fast and robust global motion estimation algorithm is developed to estimate the parameters of the motion model. The object region is obtained by means of global motion compensation between two consecutive frames. Then the center of the object region is calculated and tracked to obtain the object motion trajectory in the video sequence. Global motion and object trajectory are described with MPEG-7 parametric motion and motion trajectory descriptors, and valid similarity measures are defined for the two descriptors. Experimental results indicate that our proposed scheme is reliable and efficient.
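
    For the background model, the 6-parameter affine map x' = a1 x + a2 y + a3, y' = a4 x + a5 y + a6 can be fitted to point correspondences by least squares; the sketch below shows the plain least-squares core (the paper's fast, robust estimator would add outlier rejection for foreground points):

    ```python
    import numpy as np

    def estimate_affine(src, dst):
        """src, dst: (n, 2) matched point coordinates in consecutive frames;
        returns [a1, a2, a3, a4, a5, a6]."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        n = src.shape[0]
        A = np.zeros((2 * n, 6))
        A[0::2, 0:2], A[0::2, 2] = src, 1.0   # rows for the x' equations
        A[1::2, 3:5], A[1::2, 5] = src, 1.0   # rows for the y' equations
        b = dst.reshape(-1)                   # [x1', y1', x2', y2', ...]
        params, *_ = np.linalg.lstsq(A, b, rcond=None)
        return params
    ```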

  15. ID-based authentication scheme combined with identity-based encryption with fingerprint hashing

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Current identity-based (ID) cryptosystems lack mechanisms for two-party authentication and user private-key distribution. Some ID-based signcryption schemes and ID-based authenticated key agreement protocols have been presented, but they cannot solve the problem completely. A novel ID-based authentication scheme based on ID-based encryption (IBE) and a fingerprint hashing method is proposed to solve the difficulties in the IBE scheme, in which the message receiver authenticates the sender, and the trusted authority (TA) authenticates the users and transmits their private keys to them. Furthermore, the scheme extends the application of fingerprint authentication from the terminal to the network and protects against fingerprint data fabrication. The fingerprint authentication method uses two factors: it combines a token key, for example a USB key, with the user's fingerprint hash by mixing a pseudo-random number with the fingerprint feature. The security and experimental efficiency meet the requirements of practical applications.
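
    As a rough illustration of the two-factor mixing idea, the sketch below hashes the fingerprint feature together with a pseudo-random nonce and binds the result to the token-held key with an HMAC. This is a hypothetical construction for intuition only, not the paper's exact method:

        import hmac, hashlib, os

        def two_factor_credential(token_key: bytes, fingerprint_feature: bytes):
            """Mix a fresh pseudo-random nonce into the fingerprint feature, then
            bind the result to the USB-token key (hypothetical construction)."""
            nonce = os.urandom(16)                     # pseudo-random mixing value
            fp_hash = hashlib.sha256(nonce + fingerprint_feature).digest()
            credential = hmac.new(token_key, fp_hash, hashlib.sha256).digest()
            return nonce, credential                   # nonce travels with the credential

        nonce, cred = two_factor_credential(b"usb-token-secret", b"minutiae-template-bytes")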

  16. A Lattice-Based Identity-Based Proxy Blind Signature Scheme in the Standard Model

    Directory of Open Access Journals (Sweden)

    Lili Zhang

    2014-01-01

    Full Text Available A proxy blind signature scheme is a special form of blind signature which allows a designated person, called the proxy signer, to sign on behalf of the original signer without knowing the content of the message. It combines the advantages of proxy signatures and blind signatures. To date, most proxy blind signature schemes rely on hard number-theoretic problems, the discrete logarithm, and bilinear pairings. Unfortunately, these underlying number-theoretic problems become solvable in the post-quantum era. Lattice-based cryptography is enjoying great interest these days, due to implementation simplicity and provable security reductions; moreover, lattice problems are believed to be hard even for quantum computers. In this paper, we present a new identity-based proxy blind signature scheme from lattices without random oracles. The new scheme is proven to be strongly unforgeable under the standard hardness assumptions of the short integer solution problem (SIS) and the inhomogeneous small integer solution problem (ISIS). Furthermore, the secret key size and the signature length of our scheme are invariant and much shorter than those of previous lattice-based proxy blind signature schemes. To the best of our knowledge, our construction is the first short lattice-based identity-based proxy blind signature scheme in the standard model.
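
    For reference, the standard statements of the two hardness assumptions named above are:

        \textbf{SIS}(n,m,q,\beta):\ \text{given } A \in \mathbb{Z}_q^{n \times m},
        \text{ find } z \in \mathbb{Z}^m \setminus \{0\} \text{ with }
        Az \equiv 0 \pmod q \text{ and } \|z\| \le \beta.

        \textbf{ISIS}(n,m,q,\beta):\ \text{given } A \in \mathbb{Z}_q^{n \times m}
        \text{ and } y \in \mathbb{Z}_q^{n}, \text{ find } z \in \mathbb{Z}^m
        \text{ with } Az \equiv y \pmod q \text{ and } \|z\| \le \beta.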

  17. Speech Segregation based on Binary Classification

    Science.gov (United States)

    2016-07-15

    ... to the adoption of the ideal ratio mask (IRM). A subsequent listening evaluation shows increased intelligibility in noise for human listeners. Subject terms: binary classification, time-frequency masking, supervised speech segregation, speech intelligibility, room reverberation.
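
    The ideal ratio mask mentioned in the record has a standard definition, IRM(t,f) = (S^2/(S^2+N^2))^beta, with beta commonly set to 0.5. A minimal sketch, assuming magnitude spectrograms of the clean speech and the noise are available:

        import numpy as np

        def ideal_ratio_mask(speech_mag, noise_mag, beta=0.5):
            """IRM(t,f) = (S^2 / (S^2 + N^2))**beta per time-frequency unit."""
            s2, n2 = speech_mag ** 2, noise_mag ** 2
            return (s2 / (s2 + n2 + 1e-12)) ** beta

        # Applying the mask to the mixture attenuates noise-dominated units:
        # enhanced_mag = ideal_ratio_mask(S, N) * mixture_mag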

  18. Iris image recognition wavelet filter-banks based iris feature extraction schemes

    CERN Document Server

    Rahulkar, Amol D

    2014-01-01

    This book provides new results in wavelet filter-bank based feature extraction and classifier design in the field of iris image recognition. It gives a broad treatment of the design of separable and non-separable wavelet filter banks and of the classifier, and the design techniques presented are applied to iris image analysis for person authentication. The book also brings together three strands of research (wavelets, iris image analysis, and classifiers) and compares the performance of the presented techniques with state-of-the-art schemes. It contains a compilation of basic material on the design of wavelets that avoids reading many different books, and therefore provides an easier path for newcomers and researchers to master the contents. In addition, the designed filter banks and classifier can be used more effectively than existing filter banks in many signal processing applications such as pattern classification, data compression, watermarking, denoising, etc., that will...

  19. Adaptive SPC monitoring scheme for DOE-based APC

    Institute of Scientific and Technical Information of China (English)

    Ye Liang; Pan Ershun; Xi Lifeng

    2008-01-01

    Automatic process control (APC) based on design of experiments (DOE) is a cost-efficient approach to variation reduction. Because online parameter adjustment changes the process in both mean and variance, traditional SPC charts are hard to apply in such DOE-based APC processes. An adaptive SPC scheme is developed that can better track process transitions and reduce SPC run cost when the process is stable. The control law of the SPC parameters is designed by fully utilizing the estimation properties of the process model instead of, as is traditional, the data collected from the production line. An example is provided to illustrate the proposed adaptive SPC design approach.
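
    As a generic illustration of chart parameters driven by model estimates rather than production data (the paper's actual control law is more elaborate), consider Shewhart-style limits recomputed from the model-predicted state after each adjustment:

        def shewhart_limits(mu_hat, sigma_hat, n, k=3.0):
            """X-bar chart limits when the DOE-based process model supplies the
            predicted mean and standard deviation after an online adjustment."""
            half_width = k * sigma_hat / n ** 0.5
            return mu_hat - half_width, mu_hat + half_width

        # Re-centre the chart on the model-predicted state instead of waiting
        # for fresh production data (values below are illustrative):
        lcl, ucl = shewhart_limits(mu_hat=10.2, sigma_hat=0.4, n=5)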

  20. Triangle-based key management scheme for wireless sensor networks

    Institute of Scientific and Technical Information of China (English)

    Hangyang DAI; Hongbing XU

    2009-01-01

    For security services in wireless sensor networks, key management is a fundamental building block. In this article, we propose a triangle-based key predistribution approach and show that it can improve the effectiveness of key management in wireless sensor networks. This is achieved by using a bivariate polynomial in a triangle deployment system based on deployment information about the expected locations of the sensor nodes. The analysis indicates that this scheme can achieve a higher probability of both direct and indirect key establishment. The security analysis shows that resilience against node capture increases as the sensor deployment density and the size of the deployment model decrease and as the polynomial degree increases.
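
    A minimal sketch of symmetric bivariate-polynomial key predistribution, the Blundo-style primitive underlying such schemes (parameters are illustrative): node i stores the share f(i, y), and nodes i and j both compute the pairwise key f(i, j) = f(j, i).

        import random

        Q = 2**31 - 1          # prime field modulus (illustrative)
        T = 3                  # polynomial degree -> t-collusion resistance

        # Symmetric coefficients c[a][b] == c[b][a] define
        # f(x, y) = sum_{a,b} c[a][b] * x^a * y^b  (mod Q).
        c = [[0] * (T + 1) for _ in range(T + 1)]
        for a in range(T + 1):
            for b in range(a, T + 1):
                c[a][b] = c[b][a] = random.randrange(Q)

        def share(i):
            """Share stored by node i: coefficients of g_i(y) = f(i, y)."""
            return [sum(c[a][b] * pow(i, a, Q) for a in range(T + 1)) % Q
                    for b in range(T + 1)]

        def pairwise_key(my_share, j):
            return sum(my_share[b] * pow(j, b, Q) for b in range(T + 1)) % Q

        s1, s2 = share(17), share(42)
        assert pairwise_key(s1, 42) == pairwise_key(s2, 17)   # f(17,42) == f(42,17)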

  1. Intelligent Hybrid Cluster Based Classification Algorithm for Social Network Analysis

    Directory of Open Access Journals (Sweden)

    S. Muthurajkumar

    2014-05-01

    Full Text Available In this paper, we propose a hybrid clustering-based classification algorithm, based on a mean approach, to mine and classify the ordered sequences (paths) from weblog data in order to perform social network analysis. In the proposed system for social pattern analysis, the sequences of human activities are typically analyzed by switching behaviors, which are likely to produce overlapping clusters. A robust modified boosting algorithm is proposed for the hybrid clustering-based classification used to cluster the data. This work helps connect the aggregated features derived from the network data with traditional indices used in social network analysis. Experimental results show that the proposed algorithm improves the decision results from data clustering when combined with the proposed classification algorithm, providing better classification accuracy when tested with a weblog dataset. In addition, the algorithm improves predictive performance, especially for multiclass datasets.

  2. Cluster-based Multihop Synchronization Scheme for Femtocell Network

    Directory of Open Access Journals (Sweden)

    Aisha H. Abdalla

    2012-10-01

    Full Text Available Femtocell technology has been drawing considerable attention as a cost-effective means of improving cellular coverage and capacity. A femtocell is connected to the core network through an IP backhaul and can only use timing protocols such as IEEE 1588 or the Network Time Protocol (NTP). Furthermore, the femtocell is installed indoors and cannot use a GPS antenna for time synchronization. High-precision crystal oscillators can solve the timing problem, but they are often too expensive for consumer-grade devices. Therefore, femtocell Base Station (fBS) synchronization is one of the principal technical trends in femtocell deployment. Since the fBS and macrocell Base Station (mBS) networks operate on the same frequency under a licensed spectrum, the fBS network can interfere with the macrocell network; in addition, fBSs can interfere with each other if multiple units are in close proximity. Furthermore, in a flat fBS-structured network, using the IEEE 1588 synchronization algorithm with an fBS-fBS synchronization scheme creates offset and frequency error, which results in inaccurate synchronization. In order to reduce offset and frequency error (skew), this paper proposes a cluster-based multihop synchronization scheme to achieve precise synchronization among fBS neighbor nodes. The proposed scheme is able to reduce the offset and skew significantly.
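
    The offset and skew that the scheme tries to contain come from the standard IEEE 1588 two-way exchange: with four timestamps and a symmetric-path assumption, offset and delay follow directly. A worked sketch:

        def ptp_offset_and_delay(t1, t2, t3, t4):
            """IEEE 1588 two-way exchange: master sends Sync at t1, slave receives
            at t2, slave sends Delay_Req at t3, master receives at t4
            (symmetric-path assumption)."""
            offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
            delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way propagation delay
            return offset, delay

        # Multihop schemes apply this hop by hop, so per-hop errors accumulate;
        # clustering shortens the hop chain and thereby reduces offset and skew.
        print(ptp_offset_and_delay(100.0, 100.8, 101.0, 101.4))  # -> (0.2, 0.6)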

  3. Energy Aware Cluster Based Routing Scheme For Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Roy Sohini

    2015-09-01

    Full Text Available The Wireless Sensor Network (WSN) has emerged as an important supplement to modern wireless communication systems due to its wide range of applications. Recent research addresses the various challenges of sensor networks more gracefully; however, energy efficiency remains a matter of concern. Meeting countless security needs, delivering data on time for quick action, and selecting efficient routes with multi-path routing can only be achieved at the cost of energy. Hierarchical routing is especially useful in this regard. The proposed algorithm, Energy Aware Cluster Based Routing Scheme (EACBRS), aims at conserving energy with the help of hierarchical routing by calculating the optimum number of cluster heads for the network, selecting an energy-efficient route to the sink, and offering congestion control. Simulation results prove that EACBRS performs better than existing hierarchical routing algorithms such as the Distributed Energy-Efficient Clustering (DEEC) algorithm for heterogeneous wireless sensor networks and the Energy Efficient Heterogeneous Clustered scheme for Wireless Sensor Networks (EEHC).

  4. Saturation Detection-Based Blocking Scheme for Transformer Differential Protection

    Directory of Open Access Journals (Sweden)

    Byung Eun Lee

    2014-07-01

    Full Text Available This paper describes a current differential relay for transformer protection that operates in conjunction with a core saturation detection-based blocking algorithm. The differential current for magnetic inrush or over-excitation has a point of inflection at the start and end of each saturation period of the transformer core. At these instants, discontinuities arise in the first-difference function of the differential current. The second- and third-difference functions convert the points of inflection into pulses, the magnitudes of which are large enough to detect core saturation. The blocking signal is activated if the third difference of the differential current is larger than the threshold, and it is maintained for one cycle. In addition, a method to discriminate between transformer saturation and current transformer (CT) saturation is included. The performance of the proposed blocking scheme was compared with that of a conventional harmonic blocking method. The test results indicate that the proposed scheme successfully discriminates internal faults, even with CT saturation, from magnetic inrush, over-excitation, and external faults with CT saturation, and can significantly reduce the operating time delay of the relay.
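
    A minimal sketch of the difference-function logic described above, assuming sampled differential current; the threshold value and the one-cycle hold length are illustrative parameters:

        import numpy as np

        def blocking_signal(i_diff, threshold, samples_per_cycle):
            """Activate blocking when |third difference| of the differential
            current exceeds the threshold; hold the block for one full cycle."""
            d3 = np.abs(np.diff(i_diff, n=3))          # third-difference function
            block = np.zeros(len(i_diff), dtype=bool)
            for k in np.flatnonzero(d3 > threshold):
                block[k:k + samples_per_cycle] = True  # maintain for one cycle
            return block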

  5. Ensemble-based Malware Detection with Different Voting Schemes

    Directory of Open Access Journals (Sweden)

    Ms. Jyoti H. Landage

    2014-10-01

    Full Text Available Nowadays, computer security is the field that attempts to keep information on computers safe and secure. Security means permitting the things you do want while preventing the things you don't want from happening. Malware represents a serious threat to the security of computer systems. Traditional anti-malware products use signature-based and heuristic-based detection techniques; these detect known malware accurately but cannot detect new, unknown malware. This paper presents a malware detection system based on data mining and machine learning techniques. The proposed method consists of a disassembly process, a feature extraction process, and a feature selection process. Three classification algorithms, namely Ripper, C4.5, and IBk, are employed on the dataset to generate and train the classifiers. The ensemble method of voting is used to improve the accuracy of the results. Both majority voting and veto voting are implemented, with the expected output decided on the basis of majority or veto voting. The decision strategy of the veto is improved by introducing trust-based veto voting, and the results of majority voting, veto voting, and trust-based veto voting are compared. The experimental results show that trust-based veto voting detects known and unknown malware instances more accurately than majority voting and identifies benign files better than veto voting.
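
    The voting rules compared in the paper can be sketched compactly; the trust threshold below is an illustrative stand-in for the paper's trust model:

        def majority_vote(votes):
            """votes: list of 0 (benign) / 1 (malware) from Ripper, C4.5, IBk, ..."""
            return int(sum(votes) > len(votes) / 2)

        def veto_vote(votes):
            """Any single malware vote vetoes the benign decision."""
            return int(any(votes))

        def trust_based_veto(votes, trust, min_trust=0.7):
            """Only classifiers trusted above min_trust may exercise the veto
            (threshold is illustrative, not the paper's exact rule)."""
            return int(any(v == 1 and t >= min_trust for v, t in zip(votes, trust)))

        print(majority_vote([1, 0, 0]), veto_vote([1, 0, 0]),
              trust_based_veto([1, 0, 0], [0.4, 0.9, 0.8]))  # -> 0 1 0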

  7. Analysis on Target Detection and Classification in LTE Based Passive Forward Scattering Radar

    Directory of Open Access Journals (Sweden)

    Raja Syamsul Azmir Raja Abdullah

    2016-09-01

    Full Text Available A passive bistatic radar (PBR) system can utilize illuminators of opportunity to enhance radar capability. Applying the forward scattering technique to a specific mode of PBR can improve target detection and classification; the resulting system is known as passive Forward Scattering Radar (FSR). The passive FSR system can exploit the peculiar advantage of the enhanced forward-scatter radar cross section (FSRCS) for target detection. The aim of this paper is to show the feasibility of passive FSR for moving-target detection and classification through experimental analysis and results. The signal source is the latest technology of 4G Long-Term Evolution (LTE) base stations. A detailed explanation of the passive FSR receiver circuit, the detection scheme, and the classification algorithm is given. In addition, the proposed passive FSR circuit employs a self-mixing technique at the receiver; hence, a synchronization signal from the transmitter is not required. The experimental results confirm the passive FSR system's capability for ground-target detection and classification. Furthermore, this paper illustrates the first classification result for a passive FSR system. The great potential of passive FSR opens a new research area in passive radar that can be used for diverse remote monitoring applications.

  8. A secure smart-card based authentication and key agreement scheme for telecare medicine information systems.

    Science.gov (United States)

    Lee, Tian-Fu; Liu, Chuan-Ming

    2013-06-01

    A smart-card based authentication scheme for telecare medicine information systems enables patients, doctors, nurses, health visitors, and the medicine information systems to establish a secure communication platform through public networks. Zhu recently presented an improved authentication scheme in order to solve a weakness of the authentication scheme of Wei et al., which cannot resist off-line password guessing attacks. This investigation indicates that Zhu's improved scheme has faults such that the scheme cannot execute correctly and is vulnerable to parallel session attacks. Additionally, an enhanced authentication scheme based on Zhu's scheme is proposed. The enhanced scheme not only avoids the weaknesses of the original scheme, but also provides user anonymity and authenticated key agreement for secure data communications.

  9. Key-phrase based classification of public health web pages.

    Science.gov (United States)

    Dolamic, Ljiljana; Boyer, Célia

    2013-01-01

    This paper describes and evaluates a public health web page classification model based on key-phrase extraction and matching. Easily extensible both to new classes and to new languages, this method proves to be a good solution for text classification in the face of a total lack of training data. To evaluate the proposed solution, we used a small collection of public-health-related web pages created by double-blind manual classification. Our experiments have shown that by choosing an adequate threshold value, the desired value of either precision or recall can be achieved.

  10. Support vector classification algorithm based on variable parameter linear programming

    Institute of Scientific and Technical Information of China (English)

    Xiao Jianhua; Lin Jian

    2007-01-01

    To solve the problems of SVMs in dealing with large sample sizes and asymmetrically distributed samples, a support vector classification algorithm based on variable-parameter linear programming is proposed. In the proposed algorithm, linear programming is employed to solve the classification optimization problem, decreasing the computation time and reducing the complexity compared with the original model. The adjustable punishment parameter greatly reduces the classification error resulting from asymmetrically distributed samples, and the detailed procedure of the proposed algorithm is given. An experiment is conducted to verify whether the proposed algorithm is suitable for asymmetrically distributed samples.

  11. Words semantic orientation classification based on HowNet

    Institute of Scientific and Technical Information of China (English)

    LI Dun; MA Yong-tao; GUO Jian-li

    2009-01-01

    Based on text orientation classification, a new measurement approach to the semantic orientation of words is proposed. According to the integrated and detailed definitions of words in HowNet, seed sets of words with intense orientations were built. The orientation similarity between the seed words and a given word is then calculated using sentiment weight priority so as to recognize the semantic orientation of common words. Finally, the word's semantic orientation and its context are combined to recognize the given word's orientation. The experiments show that this measurement approach achieves better results for the orientation classification of common words and contributes particularly to the text orientation classification of large granularities.

  12. Radar Target Classification using Recursive Knowledge-Based Methods

    DEFF Research Database (Denmark)

    Jochumsen, Lars Wurtz

    The topic of this thesis is target classification of radar tracks from a 2D mechanically scanning coastal surveillance radar. The measurements provided by the radar are position data, and therefore the classification is mainly based on kinematic data deduced from the position. The target...... been terminated. Therefore, an update of the classification results must be made for each measurement of the target. The data for this work were collected throughout the PhD, both from radars and from other sensors such as GPS....

  13. Cancer classification based on gene expression using neural networks.

    Science.gov (United States)

    Hu, H P; Niu, Z J; Bai, Y P; Tan, X H

    2015-12-21

    Based on gene expression, we classified 53 colon cancer patients with UICC II into two groups: relapse and no relapse. Samples were taken from each patient and gene information was extracted. Of the 53 samples examined, 500 genes were considered suitable through analyses by S-Kohonen, BP, and SVM neural networks. The classification accuracy obtained by the S-Kohonen neural network reaches 91%, which is more accurate than classification by the BP and SVM neural networks. The results show that the S-Kohonen neural network is more plausible for classification and has a certain feasibility and validity compared with the BP and SVM neural networks.

  14. Fuzzy Aspect Based Opinion Classification System for Mining Tourist Reviews

    Directory of Open Access Journals (Sweden)

    Muhammad Afzaal

    2016-01-01

    Full Text Available Due to the large number of opinions available on websites, tourists are often overwhelmed with information and find it extremely difficult to use the available information to decide which tourist places to visit. A number of opinion mining methods have been proposed in the past to identify an opinion and classify it as positive or negative. Recently, aspect-based opinion mining has been introduced, which targets the various aspects present in the opinion text. A number of existing aspect-based opinion classification methods are available in the literature, but very limited research work has targeted the automatic identification and extraction of implicit, infrequent, and coreferential aspects. Aspect-based classification suffers from the presence of irrelevant sentences in a typical user review; such sentences make the data noisy and degrade the classification accuracy of machine learning algorithms. This paper presents a fuzzy aspect-based opinion classification system which efficiently extracts aspects from user opinions and performs near-accurate classification. We conducted experiments on real-world datasets to evaluate the effectiveness of our proposed system. Experimental results prove that the proposed system is not only effective in aspect extraction but also improves classification accuracy.

  15. A Syntactic Classification based Web Page Ranking Algorithm

    CERN Document Server

    Mukhopadhyay, Debajyoti; Kim, Young-Chon

    2011-01-01

    Existing search engines sometimes give unsatisfactory search results for lack of any categorization of the results. If there were some means of knowing a user's preferences about the search results and ranking pages according to those preferences, the results would be more useful and accurate for the user. In this paper, a web page ranking algorithm is proposed based on syntactic classification of web pages; syntactic classification does not consider the meaning of the content of a web page. The proposed approach mainly consists of three steps: select some properties of web pages based on the user's demand, measure them, and give a different weightage to each property during ranking for different types of pages. The existence of syntactic classes is supported by running the fuzzy c-means algorithm and neural network classification on a set of web pages. The change in ranking for different types of pages given the same query string is also demonstrated.

  16. Texture Classification Using Sparse Frame-Based Representations

    Directory of Open Access Journals (Sweden)

    Skretting Karl

    2006-01-01

    Full Text Available A new method for supervised texture classification, denoted the frame texture classification method (FTCM), is proposed. The method is based on a deterministic texture model in which a small image block, taken from a texture region, is modeled as a sparse linear combination of frame elements. FTCM has two phases. In the design phase, a frame is trained for each texture class based on given example texture images; the design method is an iterative procedure in which the representation error, given a sparseness constraint, is minimized. In the classification phase, each pixel in a test image is labeled by analyzing its spatial neighborhood: this block is represented by each of the frames designed for the texture classes under consideration, and the frame giving the best representation determines the class. FTCM is applied to nine test images of natural textures commonly used in other texture classification work, yielding excellent overall performance.
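
    A minimal sketch of the classification phase, assuming per-class frames have already been trained: each block is sparsely coded against every class frame and labeled by the smallest representation error (scikit-learn's OMP solver is a stand-in for the paper's solver):

        import numpy as np
        from sklearn.linear_model import orthogonal_mp

        def ftcm_label(block, frames, sparseness=4):
            """block: flattened image patch; frames: dict class -> (dim x n_atoms)
            frame matrix. Return the class whose frame best represents the block."""
            errors = {}
            for cls, F in frames.items():
                w = orthogonal_mp(F, block, n_nonzero_coefs=sparseness)
                errors[cls] = np.linalg.norm(block - F @ w)
            return min(errors, key=errors.get)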

  17. Feature Extraction based Face Recognition, Gender and Age Classification

    Directory of Open Access Journals (Sweden)

    Venugopal K R

    2010-01-01

    Full Text Available A face recognition system with large training sets for personal identification normally attains good accuracy. In this paper, we propose the Feature Extraction based Face Recognition, Gender and Age Classification (FEBFRGAC) algorithm, which requires only small training sets and yields good results even with one image per person. The process involves three stages: pre-processing, feature extraction, and classification. The geometric features of facial images, such as the eyes, nose, and mouth, are located using the Canny edge operator, and face recognition is performed. Based on texture and shape information, gender and age classification is done using posteriori class probability and an artificial neural network, respectively. It is observed that the face recognition accuracy is 100%, while the gender and age classification accuracies are around 98% and 94%, respectively.

  18. Analysis of Kernel Approach in Fuzzy-Based Image Classifications

    Directory of Open Access Journals (Sweden)

    Mragank Singhal

    2013-03-01

    Full Text Available This paper presents a framework for the kernel approach in fuzzy-based image classification in remote sensing. The goal of image classification is to separate images according to their visual content into two or more disjoint classes. Fuzzy logic is a relatively young theory; its major advantage is that it allows the natural description, in linguistic terms, of the problems to be solved, rather than in terms of relationships between precise numerical values. This paper describes how remote sensing data with uncertainty are handled with fuzzy-based classification using the kernel approach for land use/land cover map generation. The introduction of fuzzification using the kernel approach provides the basis for the development of more robust approaches to the remote sensing classification problem. The kernel explicitly defines a similarity measure between two samples and implicitly represents the mapping of the input space to the feature space.
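
    The kernel trick referred to above lets fuzzy clustering and classification work with feature-space distances computed from the kernel alone, via the standard identity (the RBF kernel is shown as a common choice):

        \|\phi(x)-\phi(v)\|^2 = K(x,x) - 2K(x,v) + K(v,v),
        \qquad K(x,v) = \exp\!\left(-\frac{\|x-v\|^2}{2\sigma^2}\right)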

  19. Audio Classification from Time-Frequency Texture

    CERN Document Server

    Yu, Guoshen

    2008-01-01

    Time-frequency representations of audio signals often resemble texture images. This paper derives a simple audio classification algorithm based on treating sound spectrograms as texture images. The algorithm is inspired by an earlier visual classification scheme particularly efficient at classifying textures. While solely based on time-frequency texture features, the algorithm achieves surprisingly good performance in musical instrument classification experiments.

  20. Classification of normal and pathological aging processes based on brain MRI morphology measures

    Science.gov (United States)

    Perez-Gonzalez, J. L.; Yanez-Suarez, O.; Medina-Bañuelos, V.

    2014-03-01

    Reported studies describing normal and abnormal aging based on anatomical MRI analysis do not consider morphological brain changes, but only volumetric measures, to distinguish among these processes. This work presents a classification scheme, based on both size and shape features extracted from brain volumes, to determine different aging stages: healthy control (HC) adults, mild cognitive impairment (MCI), and Alzheimer's disease (AD). Three support vector machines were optimized and validated for the pair-wise separation of these three classes, using selected features from a set of 3D discrete compactness measures and normalized volumes of several global and local anatomical structures. Our analysis shows classification rates of up to 98.3% between HC and AD, 85% between HC and MCI, and 93.3% for MCI and AD separation. These results outperform those reported in the literature and demonstrate the viability of the proposed morphological indexes to classify different aging stages.

  1. FEATURE RANKING BASED NESTED SUPPORT VECTOR MACHINE ENSEMBLE FOR MEDICAL IMAGE CLASSIFICATION.

    Science.gov (United States)

    Varol, Erdem; Gaonkar, Bilwaj; Erus, Guray; Schultz, Robert; Davatzikos, Christos

    2012-01-01

    This paper presents a method for the classification of structural magnetic resonance images (MRI) of the brain. An ensemble of linear support vector machine (SVM) classifiers is used to classify a subject as either patient or normal control. Image voxels are first ranked based on the voxel-wise t-statistics between the voxel intensity values and the class labels. Voxel subsets are then selected based on the rank value using a forward feature selection scheme, and an SVM classifier is trained on each subset of image voxels. The class label of a test subject is calculated by combining the individual decisions of the SVM classifiers using a voting mechanism. The method is applied to classifying patients with neurological diseases such as Alzheimer's disease (AD) and autism spectrum disorder (ASD). The results on both datasets demonstrate superior performance compared to two state-of-the-art methods for medical image classification.
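
    A minimal sketch of the pipeline, assuming a subjects-by-voxels data matrix; the nested subset sizes are illustrative:

        import numpy as np
        from scipy.stats import ttest_ind
        from sklearn.svm import LinearSVC

        def nested_svm_ensemble(X, y, subset_sizes=(100, 500, 1000)):
            """X: subjects x voxels, y: 0/1 labels. Rank voxels by two-sample
            t-statistic, train one linear SVM per nested voxel subset, then
            vote at test time."""
            t, _ = ttest_ind(X[y == 1], X[y == 0], axis=0)
            order = np.argsort(-np.abs(t))                    # best voxels first
            members = []
            for k in subset_sizes:
                idx = order[:k]
                members.append((idx, LinearSVC().fit(X[:, idx], y)))
            def predict(Xt):
                votes = np.array([clf.predict(Xt[:, idx]) for idx, clf in members])
                return (votes.mean(axis=0) > 0.5).astype(int)  # majority vote
            return predict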

  2. Holographic storage scheme based on digital signal processing

    Institute of Scientific and Technical Information of China (English)

    Kebin Jia(贾克斌); Dapeng Yang(杨大鹏); Shubo Dun(敦书波); Shiquan Tao(陶世荃); Mingyan Qin(覃鸣燕)

    2003-01-01

    In this paper, a holographic storage scheme for multimedia data storage and retrieval based on digital signal processing (DSP) is designed. A communication model for the holographic storage system is obtained by analogy with a traditional communication system, and many characteristics of holographic storage are embodied in this model. Several new DSP methods, including two-dimensional (2-D) shifting interleaving, encoding and decoding of a modulation-array (MA) code, and a soft-decision method, are proposed and employed in the system. The experimental results show that these measures can effectively reduce the influence of noise. A segment of multimedia data, including video and audio, was retrieved successfully after holographic storage using these techniques.

  3. Watermarking scheme of colour image based on chaotic sequences

    Institute of Scientific and Technical Information of China (English)

    LIU Nian-sheng; GUO Dong-hui

    2009-01-01

    The proposed perceptual mask is based on the singularity of the cover image and matches very well the properties of the human visual system. The cover colour image is decomposed into several subbands by the wavelet transform, and the watermark, composed of a chaotic sequence and the covert image, is embedded into the subband with the largest energy. The chaos system plays an important role in the security, invisibility, and robustness of the proposed scheme: the parameters and initial state of the chaos system directly influence the generation of the watermark information, acting as a key. Moreover, the watermark information has the properties of a spread-spectrum signal thanks to the chaotic sequence, improving the invisibility and security of the watermarked image. Experimental results and comparisons with other watermarking techniques prove that the proposed algorithm is effective and feasible, and improves the security, invisibility, and robustness of the watermark.
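
    Chaotic watermark sequences of this kind are commonly generated with the logistic map x_{n+1} = mu*x_n*(1 - x_n), with the map parameter and initial state acting as the key, as the abstract notes. A minimal sketch (the binarization rule is one common choice, not necessarily the paper's):

        def logistic_sequence(mu, x0, n, burn_in=100):
            """Generate a chaotic watermark sequence; (mu, x0) act as the secret
            key. mu near 4.0 keeps the map in its chaotic regime."""
            x, out = x0, []
            for i in range(burn_in + n):
                x = mu * x * (1.0 - x)
                if i >= burn_in:
                    out.append(1 if x > 0.5 else -1)   # binarized spreading sequence
            return out

        w = logistic_sequence(mu=3.99, x0=0.3141, n=16)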

  4. A LAGUERRE VORONOI BASED SCHEME FOR MESHING PARTICLE SYSTEMS.

    Science.gov (United States)

    Bajaj, Chandrajit

    2005-06-01

    We present Laguerre Voronoi based subdivision algorithms for the quadrilateral and hexahedral meshing of particle systems within a bounded region in two and three dimensions, respectively. Particles are smooth functions over circular or spherical domains. The algorithm first breaks the bounded region containing the particles into Voronoi cells that are then subsequently decomposed into an initial quadrilateral or an initial hexahedral scaffold conforming to individual particles. The scaffolds are subsequently refined via applications of recursive subdivision (splitting and averaging rules). Our choice of averaging rules yield a particle conforming quadrilateral/hexahedral mesh, of good quality, along with being smooth and differentiable in the limit. Extensions of the basic scheme to dynamic re-meshing in the case of addition, deletion, and moving particles are also discussed. Motivating applications of the use of these static and dynamic meshes for particle systems include the mechanics of epoxy/glass composite materials, bio-molecular force field calculations, and gas hydrodynamics simulations in cosmology.

  5. Hierarchical Spread Spectrum Fingerprinting Scheme Based on the CDMA Technique

    Directory of Open Access Journals (Sweden)

    Kuribayashi Minoru

    2011-01-01

    Full Text Available Digital fingerprinting is a method of inserting a user's own ID into digital contents in order to identify illegal users who distribute unauthorized copies. One of the serious problems in a fingerprinting system is the collusion attack, in which several users combine their copies of the same content to modify or delete the embedded fingerprints. In this paper, we propose a collusion-resistant fingerprinting scheme based on the CDMA technique. Our fingerprint sequences are orthogonal sequences of DCT basis vectors modulated by a PN sequence. In order to increase the number of users, a hierarchical structure is produced by assigning a pair of fingerprint sequences to each user. Under the assumption that the frequency components of detected sequences modulated by the PN sequence follow a Gaussian distribution, the design of thresholds and the weighting of parameters are studied to improve the performance. The robustness against collusion attacks and the computational cost required for detection are estimated in our simulation.

  6. Object Based and Pixel Based Classification Using RapidEye Satellite Imagery of ETI-OSA, Lagos, Nigeria

    Directory of Open Access Journals (Sweden)

    Esther Oluwafunmilayo Makinde

    2016-12-01

    Full Text Available Several studies have been carried out to find an appropriate method to classify remote sensing data. Traditional classification approaches are all pixel-based and do not utilize the spatial information within an object, which is an important source of information for image classification. Thus, this study compared pixel-based and object-based classification algorithms using a RapidEye satellite image of Eti-Osa LGA, Lagos. In the object-oriented approach, the image was segmented into homogeneous areas by suitable parameters such as scale parameter, compactness, and shape, and classification based on the segments was done by a nearest neighbour classifier. In the pixel-based classification, the spectral angle mapper was used to classify the images. The user accuracies for the object-based classification were 98.31% for waterbody, 92.31% for vegetation, 86.67% for bare soil, and 90.57% for built-up, while the user accuracies for the pixel-based classification were 98.28% for waterbody, 84.06% for vegetation, 86.36% for bare soil, and 79.41% for built-up. These classification techniques were subjected to accuracy assessment, and the overall accuracy of the object-based classification was 94.47%, while that of the pixel-based classification was 86.64%. The results of the classification and accuracy assessment show that the object-based approach gave more accurate and satisfying results.
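
    For reference, user's accuracy is the fraction of pixels assigned to a class that truly belong to it (a row of the confusion matrix), and overall accuracy is the trace divided by the total. A small sketch with hypothetical counts:

        import numpy as np

        def accuracies(cm):
            """cm[i, j]: pixels mapped to class i that truly belong to class j.
            Returns per-class user's accuracy and the overall accuracy."""
            cm = np.asarray(cm, dtype=float)
            users = np.diag(cm) / cm.sum(axis=1)
            overall = np.trace(cm) / cm.sum()
            return users, overall

        # Hypothetical 2-class example (waterbody vs. built-up):
        print(accuracies([[58, 1], [5, 46]]))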

  7. A robust anonymous biometric-based remote user authentication scheme using smart cards

    Directory of Open Access Journals (Sweden)

    Ashok Kumar Das

    2015-04-01

    Full Text Available Several biometric-based remote user authentication schemes using smart cards have been proposed in the literature in order to improve upon the security weaknesses of user authentication systems. In 2012, An proposed an enhanced biometric-based remote user authentication scheme using smart cards. It was claimed that the proposed scheme is secure against the user impersonation attack, the server masquerading attack, the password guessing attack, and the insider attack, and that it provides mutual authentication between the user and the server. In this paper, we first analyze the security of An's scheme and show that it has three serious security flaws in its design: (i) a flaw in the user's biometric verification during the login phase, (ii) a flaw in the user's password verification during the login and authentication phases, and (iii) a flaw in the user's ability to change the password locally at any time. Due to these security flaws, An's scheme cannot support mutual authentication between the user and the server. Further, we show that An's scheme cannot prevent the insider attack. In order to remedy the security weaknesses found in An's scheme, we propose a new robust and secure anonymous biometric-based remote user authentication scheme using smart cards. Through informal and formal security analysis, we show that our scheme is secure against all possible known attacks, including the attacks found in An's scheme. The simulation results of our scheme using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool ensure that our scheme is secure against passive and active attacks. In addition, our scheme is comparable in terms of communication and computational overheads with An's scheme and other related existing schemes. As a result, our scheme is more appropriate for practical applications than other approaches.

  8. An ensemble training scheme for machine-learning classification of Hyperion satellite imagery with independent hyperspectral libraries

    Science.gov (United States)

    Friedel, Michael; Buscema, Massimo

    2016-04-01

    A training scheme is proposed for the real-time classification of soil and vegetation (landscape) components in EO-1 Hyperion hyperspectral images. First, an auto-contractive map is used to compute the connectivity of reflectance values for spectral bands (N=200) from independent laboratory spectral library components. Second, a minimum spanning tree is used to identify the optimal grouping of training components from the connectivity values. Third, the reflectance values for the optimal landscape component signatures are sorted. Fourth, empirical distribution functions (EDFs) are computed for each landscape component. Fifth, the Monte Carlo technique is used to generate realizations (N=30) of each landscape EDF; the correspondence of the component realizations to the original signatures validates the stochastic procedure. The realizations are presented to the self-organizing map (SOM) using three different map sizes: 14x10, 28x20, and 40x30. In each case, SOM training proceeds first with a rough phase (20 iterations using a Gaussian neighborhood with initial and final radii of 11 and 3 units) and then a fine phase (400 iterations using a Gaussian neighborhood with initial and final radii of 3 and 1 units). The initial and final learning rates of 0.5 and 0.05 decay linearly down to 10^-5, and the Gaussian neighborhood function decreases exponentially (decay rate of 10^-3 per iteration), providing reasonable convergence. Following training of the three networks, each corresponding SOM is used to independently classify the original spectral library signatures. In comparing the different SOM networks, the 28x20 map size is chosen for independent reproducibility and processing speed. The corresponding unified distance matrix reveals separation of the seven component classes for this map size, thereby supporting its use as a Hyperion classifier.
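
    The Monte Carlo step amounts to inverse-CDF sampling from each component's EDF. A minimal sketch for one spectral band of one component (band-by-band application is assumed):

        import numpy as np

        def edf_realizations(reflectance, n_realizations=30, seed=0):
            """Monte-Carlo draws from the empirical distribution function (EDF)
            of one landscape component's reflectance values in a single band."""
            rng = np.random.default_rng(seed)
            sorted_vals = np.sort(np.asarray(reflectance, dtype=float))
            probs = np.linspace(0.0, 1.0, sorted_vals.size)
            u = rng.random(n_realizations)
            return np.interp(u, probs, sorted_vals)   # inverse-CDF sampling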

  9. Tomato classification based on laser metrology and computer algorithms

    Science.gov (United States)

    Igno Rosario, Otoniel; Muñoz Rodríguez, J. Apolinar; Martínez Hernández, Haydeé P.

    2011-08-01

    An automatic technique for tomato classification based on size and color is presented. The size is determined from surface contouring by laser line scanning, where a Bezier network computes the tomato height from the line position. The tomato color is determined in the CIELCH color space from the red and green components. The tomato size is thus classified as large, medium, or small, and the tomato is classified into six colors associated with its maturity. The performance and accuracy of the classification system are evaluated based on methods reported in recent years. The technique is tested and experimental results are presented.

  10. Cryptanalysis And Further Improvement Of A Biometric-Based Remote User Authentication Scheme Using Smart Cards

    CERN Document Server

    Das, Ashok Kumar

    2011-01-01

    Recently, Li et al. proposed a secure biometric-based remote user authentication scheme using smart cards to withstand the security flaws of Li and Hwang's efficient biometric-based remote user authentication scheme using smart cards. Li et al.'s scheme is based on biometric verification, smart cards, and a one-way hash function; it also uses a random nonce rather than a synchronized clock, and is thus efficient in computational cost and more secure than Li-Hwang's scheme. Unfortunately, in this paper we show that Li et al.'s scheme still has some security weaknesses in its design. In order to withstand those weaknesses, we further propose an improvement of their scheme so that the improved scheme always provides proper authentication and, as a result, establishes a session key between the user and the server at the end of successful user authentication.

  11. Visual words based approach for tissue classification in mammograms

    Science.gov (United States)

    Diamant, Idit; Goldberger, Jacob; Greenspan, Hayit

    2013-02-01

    The presence of microcalcifications (MC) is an important indicator of developing breast cancer. Additional indicators of cancer risk exist, such as breast tissue density type, and different methods have been developed for breast tissue classification for use in computer-aided diagnosis systems. Recently, the visual words (VW) model has been successfully applied to various classification tasks. The goal of our work is to explore VW-based methodologies for several mammography classification tasks: we start with the challenge of classifying breast density and then focus on the classification of normal tissue versus microcalcifications. The presented methodology is based on a patch-based visual words model, which includes building a dictionary for a training set using local descriptors and representing each image with a visual word histogram. Classification is then performed using k-nearest-neighbour (KNN) and support vector machine (SVM) classifiers. We tested our algorithm on the publicly available MIAS and DDSM datasets. The input is a representative region of interest per mammography image, manually selected and labelled by an expert. In the tissue density task, classification accuracy reached 85% using KNN and 88% using SVM, which competes with state-of-the-art results. For MC versus normal tissue, accuracy reached 95.6% using SVM. The results demonstrate the feasibility of classifying breast tissue with our model. Currently, we are improving the results further while also investigating the VW model's capability on additional important mammogram classification problems. We expect that the presented methodology will enable high levels of classification, suggesting new means for automated tools to support mammography diagnosis.
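
    A minimal sketch of the patch-based visual-words pipeline with scikit-learn stand-ins (the dictionary size and classifier settings are illustrative, not the paper's):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.svm import SVC

        def build_dictionary(train_patches, n_words=200):
            """train_patches: (n_patches, descriptor_dim) local descriptors."""
            return KMeans(n_clusters=n_words, n_init=10).fit(train_patches)

        def vw_histogram(roi_patches, dictionary):
            """Represent one ROI as a normalized visual-word histogram."""
            words = dictionary.predict(roi_patches)
            hist = np.bincount(words, minlength=dictionary.n_clusters).astype(float)
            return hist / hist.sum()

        # Train an SVM on the histograms of labelled ROIs, e.g.:
        # clf = SVC(kernel="rbf").fit([vw_histogram(p, dic) for p in rois], labels)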

  12. Enhanced ID-Based Authentication Scheme Using OTP in Smart Grid AMI Environment

    Directory of Open Access Journals (Sweden)

    Sang-Soo Yeo

    2014-01-01

    Full Text Available This paper presents a vulnerability analysis of the KL scheme, an ID-based authentication scheme for AMI networks attached to SCADA in the smart grid, and proposes a security-enhanced authentication scheme that satisfies forward secrecy as well as the security requirements introduced in the KL scheme and in other existing schemes. The proposed scheme uses the MDMS, the supervising system located at the electrical company, as a time-synchronizing server to synchronize the smart devices at home, and conducts authentication between the smart meter and the smart devices using a new secret value generated by an OTP generator every session. The proposed scheme has forward secrecy, so it increases overall security, but its communication and computation overheads reduce performance slightly compared with existing schemes. Nonetheless, the hardware specifications and communication bandwidth of smart devices will continue to improve, so the proposed scheme is a good choice for a secure AMI environment.
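
    The record does not specify the OTP construction; a standard time-based generator in the spirit of RFC 4226/6238, keyed by a shared secret and the MDMS-synchronized clock, would look like this sketch:

        import hmac, hashlib, struct, time

        def totp(secret: bytes, period: int = 30, digits: int = 6, now=None) -> str:
            """Time-based OTP: both sides derive the same code from the shared
            secret and the synchronized clock (standard HOTP truncation)."""
            counter = int((now if now is not None else time.time()) // period)
            mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
            offset = mac[-1] & 0x0F
            code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
            return str(code % 10 ** digits).zfill(digits)

        print(totp(b"meter-device-shared-secret", now=1_700_000_000))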

  13. Cryptanalysis of an Elliptic Curve-based Signcryption Scheme

    CERN Document Server

    Toorani, Mohsen

    2010-01-01

    Signcryption is a relatively new cryptographic technique that is supposed to fulfill the functionalities of encryption and digital signature in a single logical step. Although several signcryption schemes have been proposed over the years, some of them have been shown to have security problems. In this paper, the security of Han et al.'s signcryption scheme is analyzed, and it is proved to have many security flaws and shortcomings. Several devastating attacks on the scheme are also introduced, whereby it fails all the desired and essential security attributes of a signcryption scheme.

  14. Classification of LiDAR Data with Point Based Classification Methods

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2016-06-01

    LiDAR is one of the most effective systems for three-dimensional (3D) data collection over wide areas. Nowadays, airborne LiDAR data are used frequently in various applications, such as object extraction, 3D modelling, change detection, and map revision, with increasing point density and accuracy. The classification of LiDAR points is the first step of the LiDAR data processing chain and should be handled properly, since 3D city modelling, building extraction, DEM generation, and similar applications directly use the classified point clouds. Different classification methods can be seen in recent research, and most work with a gridded LiDAR point cloud. In grid-based processing of LiDAR data, the loss of characteristic points, especially on vegetation and buildings, or the loss of height accuracy during the interpolation stage, is inevitable. In this case, a possible solution is to use the raw point cloud for classification to avoid data and accuracy loss in the gridding process. In this study, the point-based classification possibilities of the LiDAR point cloud are investigated to obtain more accurate classes. Automatic point-based approaches based on hierarchical rules are proposed to derive ground, building, and vegetation classes from the raw LiDAR point cloud. In the proposed approaches, every single LiDAR point is analyzed according to features such as height and multi-return, and then automatically assigned to the class to which it belongs. The use of the un-gridded point cloud in the proposed point-based classification process helped in determining more realistic rule sets. Detailed parameter analyses were performed to obtain the most appropriate parameters in the rule sets to achieve accurate classes. Hierarchical rule sets were created for the proposed Approach 1 (using selected spatial-based and echo-based features) and Approach 2 (using only selected spatial-based features
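
    A toy illustration of hierarchical point-based rules on raw (un-gridded) points; the features and thresholds are illustrative stand-ins, not the paper's tuned rule sets:

        import numpy as np

        def classify_points(height_above_ground, num_returns, return_no):
            """Toy hierarchical rules per point: 0 = ground, 1 = vegetation,
            2 = building (thresholds are illustrative)."""
            cls = np.zeros(len(height_above_ground), dtype=int)
            elevated = height_above_ground > 0.5
            multi = num_returns > 1                         # multiple echoes -> vegetation
            cls[elevated & multi] = 1
            cls[elevated & ~multi & (return_no == 1)] = 2   # single hard echo -> building
            return cls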

  15. Entropy-based gene ranking without selection bias for the predictive classification of microarray data

    Directory of Open Access Journals (Sweden)

    Serafini Maria

    2003-11-01

    Full Text Available Background: We describe the E-RFE method for gene ranking, which is useful for the identification of markers in the predictive classification of array data. The method supports a practical modeling scheme designed to avoid the construction of classification rules based on the selection of too-small gene subsets (an effect known as selection bias, in which the estimated predictive errors are too optimistic because samples already considered in the feature selection process are used for testing). Results: With E-RFE, we speed up recursive feature elimination (RFE) with SVM classifiers by eliminating chunks of uninteresting genes using an entropy measure of the SVM weight distribution. An optimal subset of genes is selected according to a two-strata model evaluation procedure: modeling is replicated by an external stratified-partition resampling scheme, and, within each run, an internal K-fold cross-validation is used for E-RFE ranking. Also, the optimal number of genes can be estimated according to the saturation of Zipf's law profiles. Conclusions: Without a decrease in classification accuracy, E-RFE allows a speed-up factor of 100 with respect to standard RFE, while improving on alternative parametric RFE reduction strategies. Thus, a process for gene selection and error estimation is made practical, ensuring control of the selection bias and providing additional diagnostic indicators of gene importance.
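
    A minimal sketch of the E-RFE idea: retrain a linear SVM, measure the entropy of the weight-magnitude distribution, and discard a chunk of the lowest-weight genes sized by that entropy (the chunk-sizing rule below is illustrative, not the paper's exact formula):

        import numpy as np
        from sklearn.svm import LinearSVC

        def e_rfe(X, y, n_keep=50, n_bins=20):
            """Entropy-based recursive feature elimination on X (samples x genes)."""
            active = np.arange(X.shape[1])
            while active.size > n_keep:
                w = np.abs(LinearSVC().fit(X[:, active], y).coef_).ravel()
                p = np.histogram(w, bins=n_bins)[0] / w.size
                H = -np.sum(p[p > 0] * np.log2(p[p > 0]))        # weight entropy
                # Low entropy (concentrated weights) -> drop a bigger chunk.
                chunk = max(1, int(active.size * (1 - H / np.log2(n_bins)) * 0.5))
                chunk = min(chunk, active.size - n_keep)
                active = active[np.argsort(w)[chunk:]]           # drop smallest |w|
            return active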

  16. S1 gene-based phylogeny of infectious bronchitis virus: An attempt to harmonize virus classification.

    Science.gov (United States)

    Valastro, Viviana; Holmes, Edward C; Britton, Paul; Fusaro, Alice; Jackwood, Mark W; Cattoli, Giovanni; Monne, Isabella

    2016-04-01

    Infectious bronchitis virus (IBV) is the causative agent of a highly contagious disease that results in severe economic losses to the global poultry industry. The virus exists in a wide variety of genetically distinct viral types, and both phylogenetic analysis and measures of pairwise similarity among nucleotide or amino acid sequences have been used to classify IBV strains. However, there is currently no consensus on the method by which IBV sequences should be compared, and heterogeneous genetic group designations that are inconsistent with phylogenetic history have been adopted, leading to the confusing coexistence of multiple genotyping schemes. Herein, we propose a simple and repeatable phylogeny-based classification system combined with an unambiguous and rational lineage nomenclature for the assignment of IBV strains. By using complete nucleotide sequences of the S1 gene, we determined the phylogenetic structure of IBV, which in turn allowed us to define 6 genotypes that together comprise 32 distinct viral lineages and a number of inter-lineage recombinants. Because of extensive rate variation among IBVs, we suggest that the inference of phylogenetic relationships alone represents a more appropriate criterion for sequence classification than pairwise sequence comparisons. The adoption of an internationally accepted viral nomenclature is crucial for future studies of IBV epidemiology and evolution, and the classification scheme presented here can be updated and revised as novel S1 sequences become available.

  17. Semantic Document Image Classification Based on Valuable Text Pattern

    Directory of Open Access Journals (Sweden)

    Hossein Pourghassem

    2011-01-01

    Full Text Available Knowledge extraction from detected document images is a complex problem in the field of information technology, and it becomes more intricate given that only a negligible percentage of the detected document images are valuable. In this paper, a segmentation-based classification algorithm is used to analyze the document image: using a two-stage segmentation approach, regions of the image are detected and then classified into document and non-document (pure) regions in a hierarchical classification. A novel definition of value is proposed to classify document images into valuable or invaluable categories. The proposed algorithm is evaluated on a database consisting of document and non-document images obtained from the Internet. Experimental results show the efficiency of the proposed algorithm for semantic document image classification; it provides an accuracy rate of 98.8% on the valuable versus invaluable document image classification problem.

  18. Indoor scene classification of robot vision based on cloud computing

    Science.gov (United States)

    Hu, Tao; Qi, Yuxiao; Li, Shipeng

    2016-07-01

    For intelligent service robots, indoor scene classification is an important issue. To overcome the weak real-time performance of conventional algorithms, a new method based on cloud computing is proposed for global image features in indoor scene classification. With the MapReduce method, the global PHOG feature of an indoor scene image is extracted in parallel, and the feature vectors are used to train the decision classifier through an SVM concurrently. The indoor scene is then classified by the decision classifier. To verify the algorithm's performance, we carried out an experiment with 350 typical indoor scene images from the MIT LabelMe image library. Experimental results show that the proposed algorithm attains better real-time performance: generally, it is 1.4-2.1 times faster than traditional classification methods that rely on a single computation node, while keeping a stable classification accuracy of 70%.

  19. Classification approach based on association rules mining for unbalanced data

    CERN Document Server

    Ndour, Cheikh

    2012-01-01

    This paper deals with supervised classification when the response variable is binary and its class distribution is unbalanced. In such situations, it is not possible to build a powerful classifier using standard methods such as logistic regression, classification trees, or discriminant analysis. To overcome this shortcoming of these methods, which provide classifiers with low sensitivity, we tackle the classification problem through an approach based on association rule learning, because this approach has the advantage of allowing the identification of patterns that are well correlated with the target class. Association rule learning is a well-known method in the area of data mining, used with large databases for the unsupervised discovery of local patterns that express hidden relationships between variables. In considering association rules from a supervised learning point of view, a relevant set of weak classifiers is obtained, from which one derives a classification rule...

  20. Ensemble polarimetric SAR image classification based on contextual sparse representation

    Science.gov (United States)

    Zhang, Lamei; Wang, Xiao; Zou, Bin; Qiao, Zhijun

    2016-05-01

    Polarimetric SAR image interpretation has become one of the most interesting topics, in which the construction of a reasonable and effective image classification technique is of key importance. Sparse representation represents the data using the most succinct sparse atoms of an over-complete dictionary, and its advantages have also been confirmed in the field of PolSAR classification. However, like any ordinary classifier, it is imperfect in various respects. Ensemble learning is therefore introduced to address this issue: a number of different learners are trained and their outputs are combined to obtain more accurate and reliable results. Accordingly, this paper presents a polarimetric SAR image classification method based on ensemble learning of sparse representations to achieve optimal classification.

  1. Pathological Bases for a Robust Application of Cancer Molecular Classification

    Directory of Open Access Journals (Sweden)

    Salvador J. Diaz-Cano

    2015-04-01

    Full Text Available Any robust classification system depends on its purpose and must refer to accepted standards, its strength relying on predictive values and a careful consideration of known factors that can affect its reliability. In this context, a molecular classification of human cancer must refer to the current gold standard (histological classification) and try to improve it with key prognosticators for metastatic potential, staging and grading. Although organ-specific examples have been published based on proteomics, transcriptomics and genomics evaluations, the most popular approach uses gene expression analysis as a direct correlate of cellular differentiation, which represents the key feature of the histological classification. RNA is a labile molecule that varies significantly according to the preservation protocol, its transcription reflects the adaptation of the tumor cells to the microenvironment, it can be passed between cells through mechanisms of intercellular transference of genetic information (exosomes), and it is exposed to epigenetic modifications. More robust classifications should be based on stable molecules, represented at the genetic level by DNA to improve reliability, and their analysis must deal with the concept of intratumoral heterogeneity, which is at the origin of tumor progression and is the byproduct of the selection process during the clonal expansion and progression of neoplasms. The simultaneous analysis of multiple DNA targets and next generation sequencing offer the best practical approach for an analytical genomic classification of tumors.

  2. New Cryptanalysis of an ID-based Password Authentication Scheme using Smart Cards and Fingerprints

    Directory of Open Access Journals (Sweden)

    MING-JHENG LI

    2010-11-01

    Full Text Available In 2002, Lee, Ryu and Yoo proposed a fingerprint-based remote user authentication scheme using a smart card. Their scheme strengthens system security by verifying the smart card owner's fingerprint. In 2003, Kim, Lee and Yoo proposed two ID-based password authentication schemes, without passwords or verification tables, using smart cards and fingerprints. The proposed nonce-based and timestamp-based schemes can withstand message replay attacks and impersonation attacks. In this paper, we first review Kim et al.'s ID-based password authentication schemes. We then show that Kim et al.'s schemes are still vulnerable to impersonation attacks.

  3. An Extended Energy Consumption Analysis of Reputation-based Trust Management Schemes of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Riaz Ahmed Shaikh

    2010-03-01

    Full Text Available Energy consumption is one of the most important parameters for the evaluation of a scheme proposed for wireless sensor networks (WSNs) because of their resource-constrained nature. A comprehensive comparative analysis of proposed reputation-based trust management schemes for WSNs from this perspective is currently not available in the literature. In this paper, we present a theoretical and simulation-based energy consumption analysis and evaluation of three state-of-the-art reputation-based trust management schemes for WSNs. Results show that the GTMS scheme consumes less energy than the RFSN and PLUS schemes.

  4. Spitzer IRS Spectra of Luminous 8 micron Sources in the Large Magellanic Cloud: Testing color-based classifications

    CERN Document Server

    Buchanan, Catherine L; Hrivnak, Bruce J; Sahai, Raghvendra

    2009-01-01

    We present archival Spitzer IRS spectra of 19 luminous 8 micron selected sources in the Large Magellanic Cloud (LMC). The object classes derived from these spectra and from an additional 24 spectra in the literature are compared with classifications based on 2MASS/MSX (J, H, K, and 8 micron) colors in order to test the "JHK8" classification scheme (Kastner et al. 2008). The IRS spectra confirm the classifications of 22 of the 31 sources that can be classified under the JHK8 system. The spectroscopic classification of 12 objects that were unclassifiable in the JHK8 scheme allows us to characterize regions of the color-color diagrams that previously lacked spectroscopic verification, enabling refinements to the JHK8 classification system. The results of these new classifications are consistent with previous results concerning the identification of the most infrared-luminous objects in the LMC. In particular, while the IRS spectra reveal several new examples of asymptotic giant branch (AGB) stars with O-rich enve...

  5. A novel chaotic encryption scheme based on arithmetic coding

    Energy Technology Data Exchange (ETDEWEB)

    Mi Bo [Department of Computer Science and Engineering, Chongqing University, Chongqing 400044 (China)], E-mail: mi_bo@163.com; Liao Xiaofeng; Chen Yong [Department of Computer Science and Engineering, Chongqing University, Chongqing 400044 (China)

    2008-12-15

    In this paper, a novel chaotic encryption scheme is presented that combines arithmetic coding with the logistic map. The plaintexts are encrypted and compressed by using an arithmetic coder whose mapping intervals are changed irregularly according to a keystream derived from a chaotic map and the plaintext. The performance and security of the scheme are also studied experimentally and theoretically in detail.
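
    A minimal sketch of the keystream half of such a scheme: bytes are derived by iterating the logistic map x(n+1) = r * x(n) * (1 - x(n)) from a secret initial condition. How the keystream perturbs the arithmetic coder's mapping intervals is omitted, and the function name and parameters are illustrative assumptions:

        def logistic_keystream(x0, n_bytes, r=3.99):
            # x0 in (0, 1) acts as the secret key; r near 4 keeps the map chaotic.
            x, stream = x0, []
            for _ in range(n_bytes):
                x = r * x * (1.0 - x)
                stream.append(int(x * 256) % 256)  # quantize each state to one byte
            return bytes(stream)

        key_stream = logistic_keystream(0.3141592653, 16)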

  6. On the Security of Provably Secure Multi-Receiver ID-Based Signcryption Scheme

    Science.gov (United States)

    Tan, Chik-How

    Recently, Duan and Cao proposed a multi-receiver identity-based signcryption scheme. They showed that their scheme is secure against adaptive chosen ciphertext attacks in the random oracle model. In this paper, we show that their scheme is in fact not secure against adaptive chosen ciphertext attacks under their defined security model.

  7. An Anonymous Voting Scheme based on Confirmation Numbers

    Science.gov (United States)

    Alam, Kazi Md. Rokibul; Tamura, Shinsuke; Taniguchi, Shuji; Yanase, Tatsuro

    This paper proposes a new electronic voting (e-voting) scheme that fulfills all the security requirements of e-voting, i.e., privacy, accuracy, universal verifiability, fairness, receipt-freeness, incoercibility, dispute-freeness, robustness, practicality and scalability, some of which are usually traded off against each other. Compared with other existing schemes, this scheme requires much simpler computations and weaker assumptions about the trustworthiness of individual election authorities. The key mechanism uses confirmation numbers involved in individual votes to make votes verifiable while preventing all entities, including voters themselves, from knowing the linkages between voters and their votes. Many existing e-voting schemes extensively deploy zero-knowledge proofs (ZKP) to achieve verifiability. However, ZKP is expensive and complicated. The confirmation numbers attain the verifiability requirement in a much simpler and more intuitive way, making the scheme scalable and practical.

  8. UPF based autonomous navigation scheme for deep space probe

    Institute of Scientific and Technical Information of China (English)

    Li Peng; Cui Hutao; Cui Pingyuan

    2008-01-01

    The autonomous "celestial navigation scheme" for a deep space probe departing from the earth and the autonomous "optical navigation scheme" for encountering the target celestial body are presented. Then, considering that large initial estimation errors and non-Gaussian distributions of state or measurement errors may exist in the orbit determination of the two phases, the UPF (unscented particle filter) is introduced into the navigation schemes. By tackling nonlinear and non-Gaussian problems, the UPF avoids the accuracy loss that the traditional EKF (extended Kalman filter), UKF (unscented Kalman filter), and PF (particle filter) schemes incur through the approximate treatment of nonlinear, non-Gaussian state and measurement models. The numerical simulations demonstrate the feasibility and higher accuracy of the UPF navigation scheme.

  9. FINGERPRINT CLASSIFICATION BASED ON RECURSIVE NEURAL NETWORK WITH SUPPORT VECTOR MACHINE

    Directory of Open Access Journals (Sweden)

    T. Chakravarthy

    2011-01-01

    Full Text Available Fingerprint classification based on combined statistical and structural (RNN and SVM) approaches. RNNs are trained on a structured representation of the fingerprint image and are also used to extract a set of distributed features of the fingerprint, which can be integrated into the support vector machine. SVMs are combined with a new error-correcting codes scheme. This approach has two main advantages: (a) it can tolerate the presence of ambiguous fingerprint images in the training set, and (b) it can effectively identify the most difficult fingerprint images in the test set. In our experiment on the fingerprint database NIST-4 (National Institute of Standards and Technology), the best classification accuracy of 94.7% is obtained by training the SVM on both FingerCode and RNN-extracted features from a segmentation algorithm that uses a sophisticated region-growing process.

  10. An Enhanced Leakage-Based Precoding Scheme for Multi-User Multi-Layer MIMO Systems

    OpenAIRE

    Yang, Chunliang

    2014-01-01

    In this paper, we propose an enhanced leakage-based precoding scheme, i.e., the layer signal-to-leakage-plus-noise ratio (layer SLNR) scheme, for multi-user multi-layer MIMO systems. Specifically, the layer SLNR scheme incorporates the MIMO receiver structure into the precoder design procedure, which makes the formulation of signal power and interference/leakage power more accurate. Besides, the layer SLNR scheme not only takes into account the inter-layer interference from different users, but...
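
    For context, the classical per-user, single-layer SLNR precoder maximizes SLNR_k = |h_k^H w_k|^2 / (sigma^2 + sum_{j != k} |h_j^H w_k|^2), whose maximizer is the dominant eigenvector of (sigma^2 I + sum_{j != k} h_j h_j^H)^{-1} h_k h_k^H. A numpy sketch of that baseline follows; it is not the enhanced layer-SLNR design itself, whose details this record truncates:

        import numpy as np

        def slnr_precoder(H, noise_var):
            # H: K x M matrix; row k holds user k's channel h_k^H
            # for an M-antenna transmitter. Returns unit-norm precoders.
            K, M = H.shape
            W = np.zeros((K, M), dtype=complex)
            for k in range(K):
                hk = H[k].conj().reshape(-1, 1)                # h_k as a column
                leak = noise_var * np.eye(M, dtype=complex)    # sigma^2 * I
                for j in range(K):
                    if j != k:
                        hj = H[j].conj().reshape(-1, 1)
                        leak += hj @ hj.conj().T               # leakage toward user j
                A = np.linalg.inv(leak) @ (hk @ hk.conj().T)
                vals, vecs = np.linalg.eig(A)
                w = vecs[:, np.argmax(vals.real)]              # dominant eigenvector
                W[k] = w / np.linalg.norm(w)
            return W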

  11. Classification of Gait Types Based on the Duty-factor

    DEFF Research Database (Denmark)

    Fihl, Preben; Moeslund, Thomas B.

    2007-01-01

    This paper deals with classification of human gait types based on the notion that different gait types are in fact different types of locomotion, i.e., running is not simply walking done faster. We present the duty-factor, which is a descriptor based on this notion. The duty-factor is independent...

  12. 3D Land Cover Classification Based on Multispectral LIDAR Point Clouds

    Science.gov (United States)

    Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong

    2016-06-01

    A multispectral Lidar system can emit simultaneous laser pulses at different wavelengths. The reflected multispectral energy is captured through a receiver of the sensor, and the return signal is recorded together with the position and orientation information of the sensor. These recorded data are processed with GNSS/IMU data in further post-processing, forming high-density multispectral 3D point clouds. As the first commercial multispectral airborne Lidar sensor, the Optech Titan system is capable of collecting point cloud data in all three channels: at 532 nm visible (green), at 1064 nm near infrared (NIR) and at 1550 nm intermediate infrared (IR). It has become a new source of data for 3D land cover classification. This paper presents an Object Based Image Analysis (OBIA) approach that uses only multispectral Lidar point cloud datasets for 3D land cover classification. The approach consists of three steps. Firstly, multispectral intensity images are segmented into image objects on the basis of multi-resolution segmentation integrating different scale parameters. Secondly, intensity objects are classified into nine categories by using customized classification-index features and a combination of the multispectral reflectance with the vertical distribution of object features. Finally, accuracy assessment is conducted by comparing random reference sample points from Google imagery tiles with the classification results. The classification results show high overall accuracy for most of the land cover types; over 90% overall accuracy is achieved using multispectral Lidar point clouds for 3D land cover classification.

  13. Super pixel density based clustering automatic image classification method

    Science.gov (United States)

    Xu, Mingxing; Zhang, Chuan; Zhang, Tianxu

    2015-12-01

    Image classification is an important means of image segmentation and data mining, and achieving rapid automated image classification has been a focus of research. In this paper, an automatic image classification and outlier identification method is proposed based on clustering by the density of super-pixel centers. The pixel location coordinates and gray values are used to compute density and distance, enabling automatic image classification and outlier extraction. Because a large number of pixels dramatically increases the computational complexity, the image is preprocessed into a small number of super-pixel sub-blocks before the density and distance calculations. A normalized density-distance discrimination rule is then designed to select cluster centers automatically, whereby the image is automatically classified and outliers are identified. Extensive experiments show that our method requires no human intervention, categorizes images faster than the density clustering algorithm, and performs automated classification and outlier extraction effectively.
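
    The description matches the general shape of density-peak clustering (Rodriguez and Laio, 2014): each point receives a local density rho and a distance delta to the nearest point of higher density, and points where both are large become cluster centers. A compact numpy sketch under that assumption, with the super-pixel preprocessing step omitted:

        import numpy as np

        def density_peaks(X, d_c):
            # X: n x d array of points (e.g., pixel coordinates plus gray value).
            # d_c: cutoff distance controlling the local density estimate.
            D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
            rho = (D < d_c).sum(axis=1) - 1         # neighbors within d_c, minus self
            delta = np.zeros(len(X))
            for i in range(len(X)):
                higher = np.where(rho > rho[i])[0]  # points of strictly higher density
                delta[i] = D[i, higher].min() if len(higher) else D[i].max()
            return rho, delta

        # Points with the largest rho * delta are taken as cluster centers;
        # points with small rho but large delta are natural outlier candidates.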

  14. Instrument classification in polyphonic music based on timbre analysis

    Science.gov (United States)

    Zhang, Tong

    2001-07-01

    While most previous work on musical instrument recognition is focused on the classification of single notes in monophonic music, a scheme is proposed in this paper for the distinction of instruments in continuous music pieces which may contain one or more kinds of instruments. Highlights of the system include music segmentation into notes, harmonic partial estimation in polyphonic sound, note feature calculation and normalization, note classification using a set of neural networks, and music piece categorization with fuzzy logic principles. Example outputs of the system are "the music piece is 100% guitar (with 90% likelihood)" and "the music piece is 60% violin and 40% piano, thus a violin/piano duet". The system has been tested with twelve kinds of musical instruments, and very promising experimental results have been obtained. An accuracy of about 80% is achieved, and the number can be raised to 90% if misindexings within the same instrument family are tolerated (e.g. cello, viola and violin). A demonstration system for musical instrument classification and music timbre retrieval is also presented.

  15. An Efficient ECDSA-Based Signature Scheme for Wireless Networks

    Institute of Scientific and Technical Information of China (English)

    XU Zhong; DAI Guanzhong; YANG Deming

    2006-01-01

    Wired-equivalent security is difficult to provide in wireless networks due to high dynamics, wireless link vulnerability, and decentralization. The Elliptic Curve Digital Signature Algorithm (ECDSA) has been applied to wireless networks because of its low computational cost and short key size, which reduce the overheads in a wireless environment. This study improves the ECDSA scheme by reducing its time complexity. The significant advantage of the algorithm is that our new scheme does not need to calculate a modular inverse in the signature generation and signature verification phases. This improvement makes the proposed scheme more efficient and secure.

  16. A scalable admission control scheme based on time label

    Institute of Scientific and Technical Information of China (English)

    杨松岸; 杨华; 杨宇航

    2004-01-01

    Resource reservation protocols allow communicating hosts to reserve resources such as bandwidth to offer guaranteed service. However, current resource reservation architectures do not scale well for a large number of flows. In this paper, we present a simple reservation protocol and a scalable admission control algorithm, which can provide QoS guarantees to individual flows without per-flow management in the network core. By mapping each flow to a definite time, this scheme addresses the problems that limit the effectiveness of current endpoint admission control schemes. The overall admission control process is described. Analysis is used to explain the rationale of our scheme, and simulation validates its performance.

  17. A Chaos-Based Encryption Scheme for DCT Precoded OFDM-Based Visible Light Communication Systems

    Directory of Open Access Journals (Sweden)

    Zhongpeng Wang

    2016-01-01

    Full Text Available This paper proposes a physical encryption scheme for discrete cosine transform (DCT) precoded OFDM-based visible light communication systems employing chaos scrambling. In the proposed encryption scheme, the logistic map is adopted for the chaos mapping. The chaos scrambling strategy allocates the two scrambling sequences to the real (I) and imaginary (Q) parts of the OFDM frames according to the initial condition, which enhances the confidentiality of the physical layer. The simulation results prove the efficiency of the proposed encryption method for DCT precoded OFDM-based VLC systems and show that the proposed security scheme can protect the DCT precoded OFDM-based VLC from eavesdroppers, while keeping the advantage of the DCT precoding technique, which reduces the PAPR and improves the BER performance of OFDM-based VLC.
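
    A minimal sketch of chaos-based scrambling of an OFDM frame's real and imaginary parts: two logistic-map trajectories, seeded from shared initial conditions, are turned into key-dependent permutations that scramble I and Q independently. This is an illustrative reading of the record, not the authors' exact construction:

        import numpy as np

        def chaotic_permutation(x0, n, r=3.99):
            # The argsort of a chaotic trajectory yields a key-dependent permutation.
            seq, x = np.empty(n), x0
            for i in range(n):
                x = r * x * (1.0 - x)
                seq[i] = x
            return np.argsort(seq)

        def scramble_frame(frame, key_i, key_q):
            # frame: complex OFDM symbols; I and Q parts are permuted separately.
            pi_i = chaotic_permutation(key_i, len(frame))
            pi_q = chaotic_permutation(key_q, len(frame))
            return frame.real[pi_i] + 1j * frame.imag[pi_q]

        # The legitimate receiver regenerates pi_i and pi_q from the shared keys
        # and inverts them with np.argsort(pi_i) and np.argsort(pi_q).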

  18. Noninvasive blood pressure measurement scheme based on optical fiber sensor

    Science.gov (United States)

    Liu, Xianxuan; Yuan, Xueguang; Zhang, Yangan

    2016-10-01

    Optical fiber sensing has many advantages, such as small size, light weight, low loss, and strong resistance to interference. Since the invention of optical fiber sensing technology in 1977, it has been applied in the military, national defense, aerospace, industrial, medical and other fields, and has made a great contribution to parameter measurement in constrained environments. With the rapid development of computers, network systems, intelligent optical fiber sensing technology, and sensor technology, and their combination with computer and communication technology, detection, diagnosis and analysis can be completed automatically and efficiently. In this work, we propose a noninvasive blood pressure detection and analysis scheme that uses an optical fiber sensor. The optical fiber sensing system mainly includes the light source, optical fiber, optical detector, optical modulator, and signal processing module. Optical signals are launched into the optical fiber sensor, and the signals reflected by the body surface are detected. By comparing the test data with blood pressure measured in the traditional way, we can establish models for predicting blood pressure and achieve noninvasive blood pressure measurement using spectrum analysis technology. The proposed method is faster and more convenient than the traditional approach and yields accurate analysis results in a shorter time, efficiently reducing time and manpower costs.

  19. A Rhythm-Based Authentication Scheme for Smart Media Devices

    Directory of Open Access Journals (Sweden)

    Jae Dong Lee

    2014-01-01

    Full Text Available In recent years, ubiquitous computing has rapidly emerged in our lives, and extensive studies have been conducted in a variety of areas related to smart devices, such as tablets, smartphones, smart TVs, smart refrigerators, and smart media devices, as a measure for realizing ubiquitous computing. In particular, smartphones have significantly evolved from traditional feature phones, and increasingly higher-end smartphone models that can perform a range of functions are now available. Smart devices have become widely popular since they provide high efficiency and great convenience not only for private daily activities but also for business endeavors. Rapid advancements have been achieved in smart device technologies to improve the end users' convenience, and consequently many people increasingly rely on smart devices to store their valuable and important data. With this increasing dependence, an important aspect that must be addressed is security. Leaking of private information or sensitive business data due to loss or theft of smart devices could result in exorbitant damage. To mitigate these security threats, basic embedded locking features are provided in smart devices, but these locking features are vulnerable. In this paper, an original security-locking scheme using a rhythm-based locking system (RLS) is proposed to overcome the existing security problems of smart devices. RLS is a user-authenticated system that addresses vulnerability issues in the existing locking features and provides secure confidentiality in addition to convenience.

  20. An Efficient Semantic Model For Concept Based Clustering And Classification

    Directory of Open Access Journals (Sweden)

    SaiSindhu Bandaru

    2012-03-01

    Full Text Available Usually in text mining techniques, basic measures like the term frequency of a term (word or phrase) are computed to determine the importance of the term in the document. But with purely statistical analysis, such measures may not capture the original semantics or exact meaning of the term. To overcome this problem, a new framework has been introduced which relies on a concept-based model and a synonym-based approach. The proposed model can efficiently find significant matching and related concepts between documents according to the concept-based and synonym-based approaches. Large sets of experiments using the proposed model on different datasets for clustering and classification were conducted. Experimental results demonstrate the substantial enhancement of clustering quality using sentence-based, document-based, corpus-based and combined-approach concept analysis. A new similarity measure has been proposed to find the similarity between a document and the existing clusters, which can be used in classifying the document with respect to the existing clusters.

  1. Arbitrated quantum signature scheme based on cluster states

    Science.gov (United States)

    Yang, Yu-Guang; Lei, He; Liu, Zhi-Chao; Zhou, Yi-Hua; Shi, Wei-Min

    2016-06-01

    Cluster states can be exploited for some tasks such as topological one-way computation, quantum error correction, teleportation and dense coding. In this paper, we investigate and propose an arbitrated quantum signature scheme with cluster states. The cluster states are used for quantum key distribution and quantum signature. The proposed scheme can achieve an efficiency of 100%. Finally, we also discuss its security against various attacks.

  2. A Class of Key Predistribution Schemes Based on Orthogonal Arrays

    Institute of Scientific and Technical Information of China (English)

    Jun-Wu Dong; Ding-Yi Pei; Xue-Li Wang

    2008-01-01

    Pairwise key establishment is a fundamental security service in sensor networks; it enables sensor nodes to communicate securely with each other using cryptographic techniques. In order to ensure this security, many approaches have been proposed recently. One of them is to use key predistribution schemes (KPSs) for distributed sensor networks. In this paper, a class of KPSs is constructed based on orthogonal arrays, and the secure connectivity and resilience of the resulting sensor networks are analyzed. The KPS constructed in our paper has better properties than those of the existing schemes.

  3. Object Based and Pixel Based Classification Using RapidEye Satellite Imagery of ETI-OSA, Lagos, Nigeria

    OpenAIRE

    Esther Oluwafunmilayo Makinde; Ayobami Taofeek Salami; James Bolarinwa Olaleye; Oluwapelumi Comfort Okewusi

    2016-01-01

    Several studies have been carried out to find an appropriate method to classify remote sensing data. Traditional classification approaches are all pixel-based and do not utilize the spatial information within an object, which is an important source of information for image classification. Thus, this study compared pixel-based and object-based classification algorithms using a RapidEye satellite image of Eti-Osa LGA, Lagos. In the object-oriented approach, the image was segmented to homog...

  4. Importance Sampling Based Decision Trees for Security Assessment and the Corresponding Preventive Control Schemes: the Danish Case Study

    DEFF Research Database (Denmark)

    Liu, Leo; Rather, Zakir Hussain; Chen, Zhe

    2013-01-01

    Decision Trees (DT) based security assessment helps Power System Operators (PSO) by providing them with the most significant system attributes and guiding them in implementing the corresponding emergency control actions to prevent system insecurity and blackouts. DT is obtained offline from time-domain simulation and the process of data mining, which is then implemented online as guidelines for preventive control schemes. An algorithm named Classification and Regression Trees (CART) is used to train the DT, and the key to this approach lies in the accuracy of the DT. This paper proposes contingency oriented DT and adopts a methodology of importance sampling to maximize the information contained in the database so as to increase the accuracy of DT. Further, this paper also studies the effectiveness of DT by implementing its corresponding preventive control schemes. These approaches are tested on the detailed model...

  5. IDENTITY-BASED MULTISIGNATURE AND AGGREGATE SIGNATURE SCHEMES FROM M-TORSION GROUPS

    Institute of Scientific and Technical Information of China (English)

    Cheng Xiangguo; Liu Jingmei; Guo Lifeng; Wang Xinmei

    2006-01-01

    An identity-based multisignature scheme and an identity-based aggregate signature scheme are proposed in this paper. They are both from m-torsion groups on super-singular elliptic curves or hyper-elliptic curves and based on the recently proposed identity-based signature scheme of Cha and Cheon. Due to the sound properties of m-torsion groups and the base scheme, it turns out that our schemes are very simple and efficient. Both schemes are proven to be secure against adaptive chosen message attack in the random oracle model under the normal security notions with the assumption that the Computational Diffie-Hellman problem is hard in the m-torsion groups.

  6. NIM: A Node Influence Based Method for Cancer Classification

    Directory of Open Access Journals (Sweden)

    Yiwen Wang

    2014-01-01

    Full Text Available The classification of different cancer types is of great significance in the medical field. However, the great majority of existing cancer classification methods are clinical-based and have relatively weak diagnostic ability. With the rapid development of gene expression technology, it has become possible to classify different kinds of cancers using DNA microarrays. Our main idea is to approach the problem of cancer classification using gene expression data from a graph-based view. Based on a new node influence model we propose, this paper presents a novel high-accuracy method for cancer classification, which is composed of four parts: the first is to calculate the similarity matrix of all samples, the second is to compute the node influence of training samples, the third is to obtain the similarity between every test sample and each class using a weighted sum of node influence and the similarity matrix, and the last is to classify each test sample based on its similarity to each class. The data sets used in our experiments are breast cancer, central nervous system, colon tumor, prostate cancer, acute lymphoblastic leukemia, and lung cancer. Experimental results showed that our node influence based method (NIM) is more efficient and robust than the support vector machine, K-nearest neighbor, C4.5, naive Bayes, and CART.
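
    A hedged sketch of the four-step pipeline as the abstract describes it. The record does not specify the node influence model, so influence is approximated below by a training sample's mean similarity to its own class, with cosine similarity as the similarity matrix; both choices are assumptions:

        import numpy as np

        def nim_classify(X_train, y_train, X_test):
            y_train = np.asarray(y_train)
            def cos_sim(A, B):
                A = A / np.linalg.norm(A, axis=1, keepdims=True)
                B = B / np.linalg.norm(B, axis=1, keepdims=True)
                return A @ B.T
            # Step 1: similarity matrix over all training samples.
            S_train = cos_sim(X_train, X_train)
            # Step 2: node influence of each training sample (assumed form).
            influence = np.array([S_train[i, y_train == y_train[i]].mean()
                                  for i in range(len(y_train))])
            # Steps 3-4: weighted-sum similarity to each class, then argmax.
            S_test = cos_sim(X_test, X_train)
            classes = np.unique(y_train)
            scores = np.stack([(S_test[:, y_train == c] * influence[y_train == c]).sum(axis=1)
                               for c in classes], axis=1)
            return classes[np.argmax(scores, axis=1)]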

  7. NIM: a node influence based method for cancer classification.

    Science.gov (United States)

    Wang, Yiwen; Yao, Min; Yang, Jianhua

    2014-01-01

    The classification of different cancer types is of great significance in the medical field. However, the great majority of existing cancer classification methods are clinical-based and have relatively weak diagnostic ability. With the rapid development of gene expression technology, it has become possible to classify different kinds of cancers using DNA microarrays. Our main idea is to approach the problem of cancer classification using gene expression data from a graph-based view. Based on a new node influence model we propose, this paper presents a novel high-accuracy method for cancer classification, which is composed of four parts: the first is to calculate the similarity matrix of all samples, the second is to compute the node influence of training samples, the third is to obtain the similarity between every test sample and each class using a weighted sum of node influence and the similarity matrix, and the last is to classify each test sample based on its similarity to each class. The data sets used in our experiments are breast cancer, central nervous system, colon tumor, prostate cancer, acute lymphoblastic leukemia, and lung cancer. Experimental results showed that our node influence based method (NIM) is more efficient and robust than the support vector machine, K-nearest neighbor, C4.5, naive Bayes, and CART.

  8. DESIGN OF A DIGITAL SIGNATURE SCHEME BASED ON FACTORING AND DISCRETE LOGARITHMS

    Institute of Scientific and Technical Information of China (English)

    杨利英; 覃征; 胡广伍; 王志敏

    2004-01-01

    Objective Focusing on the security problems of authentication and confidentiality in the context of computer networks, a digital signature scheme was proposed based on public key cryptosystems. Methods Firstly, the course of digital signature based on public key cryptosystems was given. Then, the RSA and ElGamal schemes, which form the basis of the proposed scheme, were described, and generalized ElGamal-type signature schemes were listed. After comparing them with each other, the scheme whose signature equation is (m+r)x = j+s mod Φ(p) was adopted in the design. Results Based on two well-known cryptographic assumptions, factorization and discrete logarithms, a digital signature scheme was presented. It is required in the signing procedure that s' not be equal to p'q', because attackers could forge signatures with high probability if discrete logarithms modulo a large prime were solvable. The variable public key "e" is used instead of the invariable parameter "3" in Harn's signature scheme to enhance the security. One generalized ElGamal-type scheme lets the proposed scheme avoid one multiplicative inverse operation in the signing procedure and one modular exponentiation in the verification procedure. Conclusion The presented scheme attains the security that Harn's scheme originally claimed. It is secure if factorization and discrete logarithms are simultaneously unsolvable.

  9. Index-based reactive power compensation scheme for voltage regulation

    Science.gov (United States)

    Dike, Damian Obioma

    2008-10-01

    Increasing demand for electrical power, arising from deregulation and the restrictions posed on the construction of new transmission lines by environmental, socioeconomic, and political issues, has led to higher grid loading. Consequently, voltage instability has become a major concern, and reactive power support is vital to enhance transmission grid performance. Improved reactive power support to a distressed grid is possible through the application of the relatively unfamiliar emerging technologies of Flexible AC Transmission Systems (FACTS) devices and Distributed Energy Resources (DERs). In addition to these infrastructure issues, a lack of situational awareness by system operators can cause major power outages, as evidenced by the August 14, 2003 widespread North American blackout. This and many other recent major outages have highlighted the inadequacies of existing power system indexes. In this work, a novel index-based reactive compensation scheme appropriate for both on-line and off-line computation of grid status has been developed. A new voltage stability index (Ls-index) suitable for long transmission lines was developed, simulated, and compared to the existing two-machine modeled L-index, showing the effect of long distance power wheeling among regional transmission organizations. The dissertation further provides models for index-modulated voltage source converters (VSC) and index-based load flow analysis of both FACTS and microgrid-interconnected power systems, using a Newton-Raphson load flow model incorporating multiple FACTS devices. The developed package has been made user-friendly through an interactive graphical user interface and implemented on the IEEE 14, 30, and 300 bus systems. The results showed that reactive compensation has a system-wide effect, provided readily accessible system status indicators, ensured seamless DER interconnection through new islanding modes, and enhanced VSC utilization. These outcomes may contribute

  10. An improved biometrics-based authentication scheme for telecare medical information systems.

    Science.gov (United States)

    Guo, Dianli; Wen, Qiaoyan; Li, Wenmin; Zhang, Hua; Jin, Zhengping

    2015-03-01

    Telecare medical information systems (TMIS) offer healthcare delivery services, and patients can conveniently acquire their desired medical services through public networks. The protection of patients' privacy and data confidentiality is significant. Very recently, Mishra et al. proposed a biometrics-based authentication scheme for telecare medical information systems. Their scheme can protect user privacy and is believed to resist a range of network attacks. In this paper, we analyze Mishra et al.'s scheme and identify that it is insecure against known session key attacks and impersonation attacks. Thereby, we present a modified biometrics-based authentication scheme for TMIS to eliminate the aforementioned faults. Besides, we demonstrate the completeness of the proposed scheme through BAN logic. Compared to the related schemes, our protocol provides stronger security and is more practical.

  11. CONSTRUCTION OF PROXY BLIND SIGNATURE SCHEME BASED ON MULTI-LINEAR TRANSFORM

    Institute of Scientific and Technical Information of China (English)

    Zhao Zemao; Liu Fengyu

    2004-01-01

    A general method of constructing proxy blind signatures is proposed based on multi-linear transforms. Based on this method, four proxy blind signature schemes are generated with four different signature equations, and each of them has four variant forms by changes of sign; hence there are sixteen signatures in all, and all of them are proxy strongly-blind signature schemes. Furthermore, the two degenerate cases of the multi-linear transform are discussed and their corresponding proxy blind signature schemes are shown, though some schemes derived from these degenerate cases are only proxy weakly-blind signature schemes. The security of the proposed schemes is analyzed in detail. The results indicate that these signature schemes have many good properties, such as unforgeability, distinguishability of the proxy signature, non-repudiation, and wide applicability.

  12. An arbitrated quantum signature scheme based on entanglement swapping with signer anonymity

    Science.gov (United States)

    Li, Wei; Fan, Ming-Yu; Wang, Guang-Wei

    2012-12-01

    In this paper an arbitrated quantum signature scheme based on entanglement swapping is proposed. In this scheme a message to be signed is coded with unitary operators. Combining quantum measurement with quantum encryption, the signer can generate the signature for a given message. Combining the entangled states generated by the TTP's Bell measurement with the signature information, the verifier can verify the authentication of a signature through a single quantum state measurement. Compared with previous schemes, our scheme is more efficient and less complex, furthermore, our scheme can ensure the anonymity of the signer.

  13. A chaotic map-based authentication scheme for telecare medicine information systems.

    Science.gov (United States)

    Hao, Xinhong; Wang, Jiantao; Yang, Qinghai; Yan, Xiaopeng; Li, Ping

    2013-04-01

    With the development of the Internet, patients can enjoy healthcare delivery services at home through telecare medicine information systems (TMIS). To control access to remote medical servers' resources, many authentication schemes using smart cards have been proposed. However, the performance of these schemes is not satisfactory, since modular exponential operations are used. In this paper, we propose a chaotic map-based authentication scheme for telecare medicine information systems. The security and performance analysis shows our scheme is more suitable for TMIS.

  14. Efficient active feedback scheme for image multi-class classification

    Institute of Scientific and Technical Information of China (English)

    刘君; 王银辉; 李黎; 张宇

    2011-01-01

    In order to solve three difficulties in image semantic classification, namely asymmetric training data, the small-sample training problem, and noisy samples, an efficient active feedback scheme for image multi-class classification is proposed, which introduces active selection into multi-class classification. By actively selecting uncertain images for users to label manually, the training set is enlarged and classification accuracy is improved. To validate the scheme, an image multi-class classification algorithm combining active selection, named DDAG SVM (decision directed acyclic graph support vector machine) with voting, is implemented; it uses a new active selection strategy that combines voting with a side-shift mechanism. Experiments show that the algorithm is effective for image multi-class classification and achieves higher accuracy than DDAG SVM and DDAG SVM with an ordinary active selection strategy.
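
    A minimal sketch of the uncertainty-driven active selection loop the record describes, for a multi-class problem, using aggregate one-vs-one SVM margins as the doubt measure; the voting and side-shift details of the authors' strategy are not given in the record, so this is a generic stand-in:

        import numpy as np
        from sklearn.svm import SVC

        def active_learning_loop(X_lab, y_lab, X_pool, oracle, rounds=5, batch=10):
            # oracle(samples) -> labels; stands in for the human annotator.
            for _ in range(rounds):
                clf = SVC(decision_function_shape="ovo").fit(X_lab, y_lab)
                margins = np.abs(clf.decision_function(X_pool))
                # Smallest total pairwise margin = most uncertain = worth labeling.
                pick = np.argsort(margins.sum(axis=1))[:batch]
                X_lab = np.vstack([X_lab, X_pool[pick]])
                y_lab = np.concatenate([y_lab, oracle(X_pool[pick])])
                X_pool = np.delete(X_pool, pick, axis=0)
            return SVC(decision_function_shape="ovo").fit(X_lab, y_lab)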

  15. Vessel-guided airway segmentation based on voxel classification

    DEFF Research Database (Denmark)

    Lo, Pechin Chien Pau; Sporring, Jon; Ashraf, Haseem;

    2008-01-01

    This paper presents a method for improving airway tree segmentation using vessel orientation information. We use the fact that an airway branch is always accompanied by an artery, with both structures having similar orientations. This work is based on a voxel classification airway segmentation...

  16. Hierarchical Real-time Network Traffic Classification Based on ECOC

    Directory of Open Access Journals (Sweden)

    Yaou Zhao

    2013-09-01

    Full Text Available Classification of network traffic is basic and essential for many network research and management tasks. With the rapid development of peer-to-peer (P2P) applications that use dynamic port disguising techniques and encryption to avoid detection, port-based and simple payload-based network traffic classification methods have diminished in effectiveness. Alternative methods based on statistics and machine learning have attracted researchers' attention in recent years. However, most of the proposed algorithms are off-line and usually use a single classifier. In this paper, a new hierarchical real-time model is proposed, comprising a three-tuple (source IP, destination IP, and destination port) look-up table (TT-LUT) part and a layered milestone part. The TT-LUT is used to quickly classify short flows that need not pass through the layered milestone part, and the milestones in the layered milestone part classify the remaining flows in real time with real-time feature selection and statistics. Every milestone is an ECOC (Error-Correcting Output Codes) based model, used to improve classification performance. Experiments showed that the proposed model can improve real-time efficiency to 80%, and multi-class classification accuracy encouragingly to 91.4%, on datasets captured from the backbone router of our campus over a week.
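
    A compact sketch of the two-tier idea under stated assumptions: a dict keyed on the (src_ip, dst_ip, dst_port) three-tuple short-circuits flows already seen, and everything else goes to an ECOC ensemble (scikit-learn's OutputCodeClassifier here, as a stand-in for the authors' layered milestones):

        from sklearn.multiclass import OutputCodeClassifier
        from sklearn.svm import LinearSVC

        class TwoTierTrafficClassifier:
            def __init__(self):
                self.tt_lut = {}   # (src_ip, dst_ip, dst_port) -> application label
                self.ecoc = OutputCodeClassifier(LinearSVC(), code_size=2, random_state=0)

            def fit(self, flows, features, labels):
                # flows: list of (src_ip, dst_ip, dst_port) tuples, aligned with
                # the per-flow statistical feature vectors and labels.
                self.ecoc.fit(features, labels)
                for tup, lab in zip(flows, labels):
                    self.tt_lut[tup] = lab

            def predict_one(self, tup, feature_vec):
                if tup in self.tt_lut:                        # fast path: known tuple
                    return self.tt_lut[tup]
                return self.ecoc.predict([feature_vec])[0]    # slow path: ECOC model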

  17. Classification and Target Group Selection Based Upon Frequent Patterns

    NARCIS (Netherlands)

    W.H.L.M. Pijls (Wim); R. Potharst (Rob)

    2000-01-01

    In this technical report, two new algorithms based upon frequent patterns are proposed. One algorithm is a classification method; the other is an algorithm for target group selection. In both algorithms, first of all, the collection of frequent patterns in the training set is constr

  18. Pulse frequency classification based on BP neural network

    Institute of Scientific and Technical Information of China (English)

    WANG Rui; WANG Xu; YANG Dan; FU Rong

    2006-01-01

    In Traditional Chinese Medicine (TCM), pulse frequency analysis is an important parameter in clinical disease diagnosis. This article identifies pulse types through pulse frequency classification based on back-propagation neural networks (BPNN), according to the eight major essentials of pulse diagnosis. The pulse frequency classes include slow pulse, moderate pulse, rapid pulse, etc. Through research on the feature parameters of pulse frequency analysis, an identification system for pulse frequency features is established. Feature parameters such as period and frequency are extracted from the pulse signal by the detecting system and compared with the standard feature values of each pulse type. The results show that the identification rate reaches 92.5% or above.

  19. Optimizing Mining Association Rules for Artificial Immune System based Classification

    Directory of Open Access Journals (Sweden)

    SAMEER DIXIT

    2011-08-01

    Full Text Available The primary function of a biological immune system is to protect the body from foreign molecules known as antigens. It has great pattern recognition capability that may be used to distinguish between foreign cells entering the body (non-self, or antigen) and the body's own cells (self). Immune systems have many characteristics, such as uniqueness, autonomy, recognition of foreigners, distributed detection, and noise tolerance. Inspired by biological immune systems, Artificial Immune Systems have emerged during the last decade, and many researchers have used them to design and build immune-based models for a variety of application domains. Artificial immune systems can be defined as a computational paradigm that is inspired by theoretical immunology, observed immune functions, principles and mechanisms. Association rule mining is one of the most important and well researched techniques of data mining. The goal of association rules is to extract interesting correlations, frequent patterns, associations or causal structures among sets of items in transaction databases or other data repositories. Association rules are widely used in various areas such as inventory control, telecommunication networks, intelligent decision making, market analysis and risk management. Apriori is the most widely used algorithm for mining association rules; other popular association rule mining algorithms are frequent pattern (FP) growth, Eclat, and dynamic itemset counting (DIC). Associative classification uses association rule mining in the rule discovery process to predict the class labels of the data. This technique has shown great promise over many other classification techniques, and it integrates the processes of rule discovery and classification to build a classifier for the purpose of prediction. The main problem with the associative classification approach is the discovery of high-quality association rules in a very large space of

  20. Fault Diagnosis for Fuel Cell Based on Naive Bayesian Classification

    Directory of Open Access Journals (Sweden)

    Liping Fan

    2013-07-01

    Full Text Available Many kinds of uncertain factors may exist in the process of fault diagnosis and affect diagnostic results. A Bayesian network is one of the most effective theoretical models for uncertain knowledge expression and reasoning. In this paper, the method of naive Bayesian classification is used for fault diagnosis of a proton exchange membrane fuel cell (PEMFC) system. Based on the model of the PEMFC, fault data are obtained through simulation experiments, learning and training of the naive Bayesian classifier are carried out, and some testing samples are selected to validate this method. Simulation results demonstrate that the method is feasible.
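
    A minimal sketch of this kind of diagnosis pipeline with scikit-learn's Gaussian naive Bayes; the feature names and synthetic fault data below are invented for illustration, not taken from the paper's PEMFC model:

        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        rng = np.random.default_rng(0)

        # Synthetic training data: rows are (cell voltage, temperature, pressure drop)
        # under a "normal" condition and a hypothetical "flooding" fault condition.
        X_normal  = rng.normal([0.68, 70.0, 0.10], 0.02, size=(100, 3))
        X_flooded = rng.normal([0.60, 68.0, 0.25], 0.02, size=(100, 3))
        X = np.vstack([X_normal, X_flooded])
        y = np.array(["normal"] * 100 + ["flooding"] * 100)

        clf = GaussianNB().fit(X, y)
        print(clf.predict([[0.61, 68.5, 0.24]]))   # -> likely "flooding"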

  1. Adaptive stellar spectral subclass classification based on Bayesian SVMs

    Science.gov (United States)

    Du, Changde; Luo, Ali; Yang, Haifeng

    2017-02-01

    Stellar spectral classification is one of the most fundamental tasks in survey astronomy. Many automated classification methods have been applied to spectral data, but their main limitation is that the model parameters must be tuned repeatedly to deal with different data sets. In this paper, we utilize Bayesian support vector machines (BSVM) to classify spectral subclass data. Based on Gibbs sampling, BSVM can infer all model parameters adaptively according to different data sets, which allows us to circumvent time-consuming cross validation for the penalty parameter. We explored different normalization methods for stellar spectral data, and the best one is suggested in this study. Finally, experimental results on several stellar spectral subclass classification problems show that the BSVM model not only possesses good adaptability but also provides better prediction performance than traditional methods.

  2. A New Digital Signature Scheme Based on Factoring and Discrete Logarithms

    Directory of Open Access Journals (Sweden)

    E. S. Ismail

    2008-01-01

    Full Text Available Problem statement: A digital signature scheme allows one to sign an electronic message, and later the produced signature can be validated by the owner of the message or by any verifier. Most existing digital signature schemes were developed based on a single hard problem, such as the factoring, discrete logarithm, residuosity or elliptic curve discrete logarithm problems. Although these schemes appear secure, they may one day be broken if a solution to the underlying single hard problem is found. Approach: To overcome this problem, in this study we proposed a new signature scheme based on multiple hard problems, namely factoring and discrete logarithms. We combined the two problems into both the signing and verifying equations, such that the former depends on two secret keys whereas the latter depends on two corresponding public keys. Results: The new scheme was shown to be secure against the five most commonly considered attacks on signature schemes. Our scheme requires only 1203Tmul+Th time complexity for signature generation and 1202Tmul+Th time complexity for verification, and this magnitude of complexity is considered minimal for signature schemes based on multiple hard problems. Conclusions: The new signature scheme based on multiple hard problems provides a longer-lasting and higher security level than schemes based on a single problem, because no enemy can solve multiple hard problems simultaneously.

  3. The Normalization of Citation Counts Based on Classification Systems

    Directory of Open Access Journals (Sweden)

    Andreas Barth

    2013-08-01

    Full Text Available If we want to assess whether a paper in question has had a particularly high or low citation impact compared to other papers, the standard practice in bibliometrics is to normalize citations with respect to the subject category and publication year. A number of proposals for an improved procedure in the normalization of citation impact have been put forward in recent years. Against the background of these proposals, this study describes an ideal solution for the normalization of citation impact: in a first step, the reference set for the publication in question is collated by means of a classification scheme, where every publication is associated with a single principal research field or subfield entry (e.g., via Chemical Abstracts sections) and a publication year. In a second step, percentiles of citation counts are calculated for this set and used to assign the normalized citation impact score to the publications (and also to the publication in question).
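
    A small sketch of the two-step procedure described above: collect the reference set (papers sharing the focal paper's principal field and publication year), then convert the paper's citation count into a percentile within that set. Pure Python; the dict fields and sample numbers are illustrative only:

        def citation_percentile(paper, all_papers):
            # Step 1: reference set = papers with the same single principal
            # field (e.g., a Chemical Abstracts section) and publication year.
            ref = [p["citations"] for p in all_papers
                   if p["field"] == paper["field"] and p["year"] == paper["year"]]
            # Step 2: percentile of the focal paper's citations within that set.
            below = sum(1 for c in ref if c < paper["citations"])
            return 100.0 * below / len(ref)

        papers = [{"field": "CA section 27", "year": 2010, "citations": c}
                  for c in [0, 2, 3, 5, 8, 13, 21, 40]]
        focal = {"field": "CA section 27", "year": 2010, "citations": 13}
        print(citation_percentile(focal, papers))   # -> 62.5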

  4. The normalization of citation counts based on classification systems

    CERN Document Server

    Bornmann, Lutz; Barth, Andreas

    2013-01-01

    If we want to assess whether a paper in question has had a particularly high or low citation impact compared to other papers, the standard practice in bibliometrics is to normalize citations with respect to the subject category and publication year. A number of proposals for an improved procedure in the normalization of citation impact have been put forward in recent years. Against the background of these proposals, this study describes an ideal solution for the normalization of citation impact: in a first step, the reference set for the publication in question is collated by means of a classification scheme, where every publication is associated with a single principal research field or subfield entry (e.g., via Chemical Abstracts sections) and a publication year. In a second step, percentiles of citation counts are calculated for this set and used to assign the normalized citation impact score to the publications (and also to the publication in question).

  5. A training-based scheme for communicating over unknown channels with feedback

    CERN Document Server

    Mahajan, Aditya

    2009-01-01

    We consider communication with noiseless feedback over a channel that is either BSC(p) or BSC(1-p); neither the transmitter nor the receiver knows which one. The parameter p ∈ [0, 1/2] is known to both. We propose a variable-length training-based scheme for this channel. The error exponent of this scheme is within a constant fraction of the best possible error exponent. Thus, contrary to popular belief, variable-length training-based schemes need not have poor error exponents. Moreover, training-based schemes can preserve the main advantage of feedback: an error exponent with non-zero slope at rates close to capacity.
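
    The core trick a training-based scheme relies on can be shown in a few lines: send a known training word, count disagreements at the receiver, and decide whether the channel is BSC(p) or BSC(1-p) by whether the empirical flip rate falls below or above 1/2. A toy simulation of that identification step (the variable-length coding layer is omitted, and all names are illustrative):

        import random

        def identify_channel(p, n_train=200, seed=1):
            # The true channel flips each bit with probability q, where q is
            # either p or 1-p; the receiver knows p but not which case holds.
            random.seed(seed)
            q = random.choice([p, 1.0 - p])
            training = [0] * n_train                    # known all-zero training word
            received = [b ^ (random.random() < q) for b in training]
            flip_rate = sum(received) / n_train
            estimate = p if flip_rate < 0.5 else 1.0 - p
            return q, estimate                          # estimate matches q w.h.p.

        print(identify_channel(0.1))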

  6. An efficient and secure attribute based signcryption scheme with LSSS access structure.

    Science.gov (United States)

    Hong, Hanshu; Sun, Zhixin

    2016-01-01

    Attribute based encryption (ABE) and attribute based signature (ABS) provide flexible access control with authentication for data sharing between users, but realizing both functions together imposes a heavy computational burden. In this paper, we combine the advantages of CP-ABE with ABS and propose a ciphertext-policy attribute based signcryption scheme. In our scheme, only legal receivers can decrypt the ciphertext and verify the signature signed by the data owner. Furthermore, we use a linear secret sharing scheme instead of a tree structure to avoid frequent calls to a recursive algorithm. By security and performance analysis, we show that our scheme is secure as well as more efficient.

  7. One-step discrimination scheme on N-particle Greenberger-Horne-Zeilinger bases

    Institute of Scientific and Technical Information of China (English)

    Wang Xin-Wen; Liu Xiang; Fang Mao-Fa

    2007-01-01

    We present an experimentally feasible one-step discrimination scheme on Bell bases with trapped ions, and then generalize it to the case of N-ion Greenberger-Horne-Zeilinger (GHZ) bases. In the scheme, all the orthogonal and complete N-ion GHZ internal states can be exactly discriminated in only one step, and thus it takes a very short time. Moreover, the scheme is insensitive to thermal motion and does not require individual addressing of the ions. The Bell-state and GHZ-state one-step discrimination scheme can be widely used in quantum information processing based on ion-trap setups.

  8. Integrative disease classification based on cross-platform microarray data

    Directory of Open Access Journals (Sweden)

    Huang Haiyan

    2009-01-01

    Full Text Available Abstract Background Disease classification has been an important application of microarray technology. However, most microarray-based classifiers can only handle data generated within the same study, since microarray data generated by different laboratories or with different platforms cannot be compared directly due to systematic variations. This issue has severely limited the practical use of microarray-based disease classification. Results In this study, we tested the feasibility of disease classification by integrating the large amount of heterogeneous microarray datasets from the public microarray repositories. Cross-platform data compatibility is created by deriving expression log-rank ratios within datasets; one may then compare vectors of log-rank ratios across datasets. In addition, we systematically map textual annotations of datasets to concepts in the Unified Medical Language System (UMLS), permitting quantitative analysis of the phenotype "distance" between datasets and automated construction of disease classes. We design a new classification approach named ManiSVM, which integrates manifold data transformation with SVM learning to exploit the data properties. Using leave-one-dataset-out cross validation, ManiSVM achieved an overall accuracy of 70.7% (68.6% precision and 76.9% recall), with many disease classes achieving accuracies higher than 80%. Conclusion Our results not only demonstrated the feasibility of the integrated disease classification approach, but also showed that the classification accuracy increases with the number of homogeneous training datasets. Thus, the power of the integrative approach will increase with the continuous accumulation of microarray data in public repositories. Our study shows that automated disease diagnosis can be an important and promising application of the enormous amount of costly to generate, yet freely available, public microarray data.
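
    A tiny sketch of the cross-platform trick mentioned above: within each dataset, expression values are reduced to ranks, so that vectors derived from different platforms become comparable. The record does not give the exact log-rank-ratio formula, so the normalization below (log of rank over median rank) is an assumption:

        import numpy as np

        def log_rank_ratios(expr):
            # expr: genes x samples matrix from ONE dataset/platform.
            # Rank genes within each sample, then take log(rank / median rank),
            # which removes platform-specific scale while keeping the ordering.
            ranks = expr.argsort(axis=0).argsort(axis=0) + 1.0
            return np.log(ranks / np.median(ranks, axis=0, keepdims=True))

        a = log_rank_ratios(np.random.rand(1000, 20))   # platform A
        b = log_rank_ratios(np.random.rand(1000, 15))   # platform B
        # Columns of a and b are now on a comparable rank-based scale.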

  9. Feature Extraction and Selection Scheme for Intelligent Engine Fault Diagnosis Based on 2DNMF, Mutual Information, and NSGA-II

    Directory of Open Access Journals (Sweden)

    Peng-yuan Liu

    2016-01-01

    Full Text Available A novel feature extraction and selection scheme is presented for intelligent engine fault diagnosis by utilizing two-dimensional non-negative matrix factorization (2DNMF), mutual information, and the non-dominated sorting genetic algorithm II (NSGA-II). Experiments are conducted on an engine test rig, in which eight different engine operating conditions, including one normal condition and seven fault conditions, are simulated to evaluate the presented feature extraction and selection scheme. In the feature extraction phase, the S-transform technique is first utilized to convert the engine vibration signals to the time-frequency domain, which can provide richer information on engine operating conditions. Then a novel feature extraction technique, named two-dimensional non-negative matrix factorization, is employed for characterizing the time-frequency representations. In the feature selection phase, a hybrid filter-and-wrapper scheme based on mutual information and NSGA-II is utilized to acquire a compact feature subset for engine fault diagnosis. Experimental results with three different classifiers have demonstrated that the proposed feature extraction and selection scheme can achieve very satisfying classification performance with fewer features for engine fault diagnosis.
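
    A brief sketch of the filter half of such a hybrid scheme: rank candidate features by mutual information with the fault label and keep the top k, leaving the NSGA-II wrapper search aside. It uses scikit-learn's mutual_info_classif; the synthetic data is illustrative only:

        import numpy as np
        from sklearn.feature_selection import mutual_info_classif

        rng = np.random.default_rng(0)
        X = rng.normal(size=(400, 30))       # 400 signals x 30 candidate features
        y = rng.integers(0, 8, size=400)     # 8 conditions (1 normal + 7 faults)
        X[:, 3] += y                         # make feature 3 informative on purpose

        mi = mutual_info_classif(X, y, random_state=0)
        top_k = np.argsort(mi)[::-1][:10]    # indices of the 10 best features
        X_selected = X[:, top_k]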

  10. Novel Sequence Number Based Secure Authentication Scheme for Wireless LANs

    Institute of Scientific and Technical Information of China (English)

    Rajeev Singh; Teek Parval Sharma

    2015-01-01

    Authentication per frame is an implicit necessity for security in wireless local area networks (WLANs). We propose a novel per-frame secure authentication scheme which provides authentication to data frames in WLANs. The scheme involves no cryptographic overheads for authentication of frames. It utilizes the sequence number of the frame along with the authentication stream generators for authentication. Hence, it requires no extra bits or messages for the authentication purpose, and no change in the existing frame format is required. The scheme provides authentication by modifying the sequence number of the frame at the sender, and the modification is verified at the receiver. The modified sequence number is protected by using the XOR operation with a random number selected from the random stream. The authentication is lightweight due to the fact that it requires only trivial arithmetic operations like subtraction and XOR.
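
    A toy illustration of the masking and verification steps, assuming 12-bit 802.11 sequence numbers; the SHA-256-based stream generator is an invented stand-in, since the abstract does not specify the generator's construction:

```python
import hashlib

def stream_value(key: bytes, index: int, bits: int = 12) -> int:
    # Invented stand-in for the shared authentication stream generator.
    digest = hashlib.sha256(key + index.to_bytes(8, "big")).digest()
    return int.from_bytes(digest[:2], "big") % (1 << bits)

def protect(seq: int, key: bytes, index: int) -> int:
    # Sender: mask the true sequence number with XOR (no extra bits needed).
    return seq ^ stream_value(key, index)

def verify(recv_seq: int, expected_seq: int, key: bytes, index: int) -> bool:
    # Receiver: unmask and compare against the expected sequence number.
    return recv_seq ^ stream_value(key, index) == expected_seq

key = b"shared-secret"
masked = protect(1035, key, index=7)
assert verify(masked, 1035, key, index=7)
```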

  11. A scalable admission control scheme based on time label

    Institute of Scientific and Technical Information of China (English)

    杨松岸; 杨华; 杨宇航

    2004-01-01

    Resource reservation protocols allow communicating hosts to reserve resources such as bandwidth to offer guaranteed service. However, current resource reservation architectures do not scale well for a large number of flows. In this paper, we present a simple reservation protocol and a scalable admission control algorithm, which can provide QoS guarantees to individual flows without per-flow management in the network core. By mapping each flow to a definite time, this scheme addresses the problems that limit the effectiveness of current endpoint admission control schemes. The overall admission control process is described. Analysis explains the rationale of our scheme, and simulation validates its performance.

  12. Research on road topology based mobility prediction schemes

    Institute of Scientific and Technical Information of China (English)

    Chen Lin; Chen Hongzhong; Jiang Changjun

    2007-01-01

    Geographic routing has been introduced in mobile ad hoc networks and sensor networks, but its performance suffers greatly from mobility-induced location errors that can cause Lost Link (LLNK) and LOOP problems. Various mobility prediction algorithms have been proposed to mitigate these errors, but their prediction errors are sometimes substantial. A novel mobility prediction technique that incorporates both mobile positioning information and road topology knowledge is presented. Furthermore, the performance of the scheme was evaluated via simulations, along with two other schemes, namely Linear Velocity Prediction (LVP) and Weighted Velocity Prediction (WVP), for comparison purposes. The results of simulation under the Manhattan mobility model show that the proposed scheme can track the movement of a node well and hence provides noticeable improvement over LVP and WVP.

  13. Hardware Accelerators Targeting a Novel Group Based Packet Classification Algorithm

    Directory of Open Access Journals (Sweden)

    O. Ahmed

    2013-01-01

    Full Text Available Packet classification is a ubiquitous and key building block for many critical network devices. However, it remains one of the main bottlenecks faced when designing fast network devices. In this paper, we propose a novel Group Based Search packet classification Algorithm (GBSA) that is scalable, fast, and efficient. GBSA consumes an average of 0.4 megabytes of memory for a 10 k rule set. The worst-case classification time per packet is 2 microseconds, and the preprocessing speed is 3 M rules/second on a Xeon processor operating at 3.4 GHz. When compared with other state-of-the-art classification techniques, the results showed that GBSA outperforms the competition with respect to speed, memory usage, and processing time. Moreover, GBSA is amenable to implementation in hardware. Three different hardware implementations are also presented in this paper, including an Application Specific Instruction Set Processor (ASIP) implementation and two pure Register-Transfer Level (RTL) implementations based on Impulse-C and Handel-C flows, respectively. Speedups achieved with these hardware accelerators ranged from 9x to 18x compared with a pure software implementation running on a Xeon processor.

  14. Fast rule-based bioactivity prediction using associative classification mining

    Directory of Open Access Journals (Sweden)

    Yu Pulan

    2012-11-01

    Full Text Available Abstract Relating chemical features to bioactivities is critical in molecular design and is used extensively in the lead discovery and optimization process. A variety of techniques from statistics, data mining and machine learning have been applied to this process. In this study, we utilize a collection of methods, called associative classification mining (ACM), which are popular in the data mining community but so far have not been applied widely in cheminformatics. More specifically, classification based on predictive association rules (CPAR), classification based on multiple association rules (CMAR) and classification based on association rules (CBA) are employed on three datasets using various descriptor sets. Experimental evaluations on anti-tuberculosis (antiTB), mutagenicity and hERG (human Ether-a-go-go-Related Gene) blocker datasets show that these three methods are computationally scalable and appropriate for high speed mining. Additionally, they provide comparable accuracy and efficiency to the commonly used Bayesian and support vector machine (SVM) methods, and produce highly interpretable models.

  15. Sparse Representation Based Binary Hypothesis Model for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Yidong Tang

    2016-01-01

    Full Text Available The sparse representation based classifier (SRC) and its kernel version (KSRC) have been employed for hyperspectral image (HSI) classification. However, the state-of-the-art SRC often aims at extended surface objects with linear mixture in smooth scenes and assumes that the number of classes is given. Considering the small target with complex background, a sparse representation based binary hypothesis (SRBBH) model is established in this paper. In this model, a query pixel is represented in two ways, which are, respectively, by the background dictionary and by the union dictionary. The background dictionary is composed of samples selected from the local dual concentric window centered at the query pixel. Thus, for each pixel the classification issue becomes an adaptive multiclass classification problem, where only the number of desired classes is required. Furthermore, the kernel method is employed to improve the interclass separability. In kernel space, the coding vector is obtained by using the kernel-based orthogonal matching pursuit (KOMP) algorithm. Then the query pixel can be labeled by the characteristics of the coding vectors. Instead of directly using the reconstruction residuals, the different impacts the background dictionary and union dictionary have on reconstruction are used for validation and classification. It enhances the discrimination and hence improves the performance.

  16. New Steganographic Scheme Based on Reed-Solomon Codes

    Directory of Open Access Journals (Sweden)

    DIOP

    2012-04-01

    Full Text Available Modern steganography [1] is a new science that enables secret communication. Using the Matrix Embedding technique in steganographic schemes tends to reduce distortion during insertion. Recently, Fontaine and Galand [2] showed that Reed-Solomon codes can be good tools for the design of a steganographic scheme. In this paper, we present an implementation of the Matrix Embedding technique using Reed-Solomon codes. The advantage of these codes is that they provide an easy way to solve the bounded syndrome decoding problem, which is the basis of the Matrix Embedding technique.

  17. Feedback-based Intra Refresh Scheme for Wireless Video Transmission

    Institute of Scientific and Technical Information of China (English)

    FENG Xiubo; XIE Jianying

    2004-01-01

    This paper investigates intra refresh techniques for video transmission in a wireless scenario. A two-state Markov model is proposed to simulate the fading wireless channel. Taking feedback information into account, we analyze the channel distortion due to burst errors. An improved intra refresh scheme is then proposed for optimal selection of the coding mode for each Macroblock (MB) in a rate-distortion framework. The proposed scheme can stop error propagation and reduce channel distortion effectively. Simulations using TMN10 show that our scheme obtains improved error resilience for wireless video communication.

  18. The Lifting Scheme Based on the Second Generation Wavelets

    Institute of Scientific and Technical Information of China (English)

    FENG Hui; GUO Lanying; XIAO Jinsheng

    2006-01-01

    The lifting scheme is a custom-design construction of biorthogonal wavelets and a fast, efficient way to realize the wavelet transform, which widens the range of application and reduces computing time through its particular structure. This paper introduces the second generation wavelets: it begins with the traditional Mallat algorithm, illustrates the lifting scheme, and sets out the detailed steps in the construction of biorthogonal wavelets. Because the lifting scheme isolates the degrees of freedom that remain after the biorthogonality relations are satisfied, we have full control over the lifting operators to design a wavelet for a particular application, such as increasing the number of vanishing moments.
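
    As a concrete illustration of the split/predict/update pattern on which the lifting scheme rests, here is one Haar lifting step (a minimal sketch, not code from the paper):

```python
def haar_lifting_forward(x):
    # One level of the Haar wavelet via lifting; assumes an even-length signal.
    even, odd = x[0::2], x[1::2]
    detail = [o - e for o, e in zip(odd, even)]         # predict odd from even
    approx = [e + d / 2 for e, d in zip(even, detail)]  # update: preserve the mean
    return approx, detail

def haar_lifting_inverse(approx, detail):
    # Undo the steps in reverse order: perfect reconstruction by construction.
    even = [a - d / 2 for a, d in zip(approx, detail)]
    odd = [e + d for e, d in zip(even, detail)]
    return [v for pair in zip(even, odd) for v in pair]

a, d = haar_lifting_forward([2, 4, 6, 8])
assert haar_lifting_inverse(a, d) == [2, 4, 6, 8]
```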

  19. Cryptanalysis And Further Improvement Of A Biometric-Based Remote User Authentication Scheme Using Smart Cards

    Directory of Open Access Journals (Sweden)

    Ashok Kumar Das

    2011-03-01

    Full Text Available Recently, Li et al. proposed a secure biometric-based remote user authentication scheme using smart cards to withstand the security flaws of Li-Hwang's efficient biometric-based remote user authentication scheme using smart cards. Li et al.'s scheme is based on biometrics verification, smart card and one-way hash function, and it also uses a random nonce rather than a synchronized clock, and thus it is efficient in computational cost and more secure than Li-Hwang's scheme. Unfortunately, in this paper we show that Li et al.'s scheme still has some security weaknesses in its design. In order to withstand those weaknesses, we further propose an improvement of their scheme so that the improved scheme always provides proper authentication and, as a result, establishes a session key between the user and the server at the end of successful user authentication.

  20. A New Digital Signature Scheme Based on Mandelbrot and Julia Fractal Sets

    Directory of Open Access Journals (Sweden)

    M. A. Alia

    2007-01-01

    Full Text Available This paper describes a new cryptographic digital signature scheme based on the Mandelbrot and Julia fractal sets. A fractal-based digital signature scheme is possible due to the strong connection between the Mandelbrot and Julia fractal sets. The link between the two fractal sets is used for the conversion of the private key to the public key. The Mandelbrot fractal function takes the chosen private key as the input parameter and generates the corresponding public key. The Julia fractal function is then used to sign the message with the receiver's public key and verify the received message based on the receiver's private key. The proposed scheme is resistant against attacks, utilizes a small key size and performs comparatively faster than the existing DSA and RSA digital signature schemes. The fractal digital signature scheme is an attractive alternative to traditional number theory based digital signature schemes.

  1. A Provably-Secure ECC-Based Authentication Scheme for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Junghyun Nam

    2014-11-01

    Full Text Available A smart-card-based user authentication scheme for wireless sensor networks (in short, a SUA-WSN scheme) is designed to restrict access to the sensor data only to users who are in possession of both a smart card and the corresponding password. While a significant number of SUA-WSN schemes have been suggested in recent years, their intended security properties lack formal definitions and proofs in a widely-accepted model. One consequence is that SUA-WSN schemes insecure against various attacks have proliferated. In this paper, we devise a security model for the analysis of SUA-WSN schemes by extending the widely-accepted model of Bellare, Pointcheval and Rogaway (2000). Our model provides formal definitions of authenticated key exchange and user anonymity while capturing side-channel attacks, as well as other common attacks. We also propose a new SUA-WSN scheme based on elliptic curve cryptography (ECC), and prove its security properties in our extended model. To the best of our knowledge, our proposed scheme is the first SUA-WSN scheme that provably achieves both authenticated key exchange and user anonymity. Our scheme is also computationally competitive with other ECC-based (non-provably secure) schemes.

  2. A soft-hard combination-based cooperative spectrum sensing scheme for cognitive radio networks.

    Science.gov (United States)

    Do, Nhu Tri; An, Beongku

    2015-02-13

    In this paper, we propose a soft-hard combination scheme, called the SHC scheme, for cooperative spectrum sensing in cognitive radio networks. The SHC scheme deploys a cluster-based network in which Likelihood Ratio Test (LRT)-based soft combination is applied at each cluster, and weighted decision fusion rule-based hard combination is utilized at the fusion center. The novelties of the SHC scheme are as follows: the structure of the SHC scheme reduces the complexity of cooperative detection, which is an inherent limitation of soft combination schemes. By using the LRT, we can detect primary signals in a low signal-to-noise ratio regime (around an average of -15 dB). In addition, the computational complexity of the LRT is reduced since we derive the closed-form expression of the probability density function of the LRT value. The SHC scheme also takes into account the different effects of large-scale fading on different users in a wide area network. The simulation results show that the SHC scheme not only provides better sensing performance than the conventional hard combination schemes, but also reduces sensing overhead in terms of reporting time compared to the conventional soft combination scheme using the LRT.

  3. Vertical diffuse attenuation coefficient (Kd)-based optical classification of IRS-P3 MOS-B satellite ocean colour data

    Indian Academy of Sciences (India)

    R K Sarangi; Prakash Chauhan; S R Nayak

    2002-09-01

    The optical classification of different water types provides vital input for studies related to primary productivity, water clarity and determination of euphotic depth. Image data of the IRS-P3 MOS-B, for Path 90 of 27th February, 1998, were used for deriving the vertical diffuse attenuation coefficient Kd(490), and an optical classification based on Kd values was performed. An atmospheric correction scheme was used for retrieving water-leaving radiances in the blue and green channels of 412, 443, 490 and 550 nm. The upwelling radiances from the 443 nm and 550 nm spectral channels were used for computation of the vertical diffuse attenuation coefficient at 490 nm. The waters off the Gujarat coast were classified into different water types based on the Jerlov classification scheme. The oceanic water types IA (Kd range 0.035-0.040 m-1), IB (0.042-0.065 m-1), II (0.07-0.1 m-1) and III (0.115-0.14 m-1) were identified. For the coastal waters along the Gujarat coast and the Gulf of Kachchh, Kd(490) values ranged between 0.15 m-1 and 0.35 m-1. The depth of 1% of surface light for water types IA, IB, II and III corresponds to 88, 68, 58 and 34 meters respectively. Classification of oceanic and coastal waters based on Kd is useful in understanding the light transmission characteristics for sub-marine navigation and under-water imaging.
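
    The reported Kd(490) ranges translate directly into a lookup-style classifier; the sketch below hard-codes the ranges from the abstract (values falling in the small gaps between ranges are left unclassified):

```python
def jerlov_type(kd490: float) -> str:
    # Classify water type from Kd(490) in 1/m, using the ranges in the abstract.
    ranges = [
        ("oceanic type IA", 0.035, 0.040),
        ("oceanic type IB", 0.042, 0.065),
        ("oceanic type II", 0.070, 0.100),
        ("oceanic type III", 0.115, 0.140),
        ("coastal", 0.150, 0.350),
    ]
    for name, lo, hi in ranges:
        if lo <= kd490 <= hi:
            return name
    return "unclassified"

print(jerlov_type(0.09))   # -> oceanic type II
```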

  4. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    Science.gov (United States)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification-based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.

  5. Emotional textile image classification based on cross-domain convolutional sparse autoencoders with feature selection

    Science.gov (United States)

    Li, Zuhe; Fan, Yangyu; Liu, Weihua; Yu, Zeqi; Wang, Fengqin

    2017-01-01

    We aim to apply sparse autoencoder-based unsupervised feature learning to emotional semantic analysis for textile images. To tackle the problem of limited training data, we present a cross-domain feature learning scheme for emotional textile image classification using convolutional autoencoders. We further propose a correlation-analysis-based feature selection method for the weights learned by sparse autoencoders to reduce the number of features extracted from large size images. First, we randomly collect image patches on an unlabeled image dataset in the source domain and learn local features with a sparse autoencoder. We then conduct feature selection according to the correlation between different weight vectors corresponding to the autoencoder's hidden units. We finally adopt a convolutional neural network including a pooling layer to obtain global feature activations of textile images in the target domain and send these global feature vectors into logistic regression models for emotional image classification. The cross-domain unsupervised feature learning method achieves 65% to 78% average accuracy in the cross-validation experiments corresponding to eight emotional categories and performs better than conventional methods. Feature selection can reduce the computational cost of global feature extraction by about 50% while improving classification performance.
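
    A sketch of correlation-based pruning applied to learned autoencoder weights; the greedy keep-rule and the 0.9 threshold are illustrative assumptions rather than the paper's exact criterion:

```python
import numpy as np

def select_uncorrelated_filters(W, threshold=0.9):
    # W: hidden_units x input_dim weight matrix from a sparse autoencoder.
    # Greedily keep a filter only if its |correlation| with every filter
    # already kept stays below the threshold.
    corr = np.corrcoef(W)  # correlations between weight (row) vectors
    keep = []
    for i in range(W.shape[0]):
        if all(abs(corr[i, j]) < threshold for j in keep):
            keep.append(i)
    return keep

W = np.random.randn(100, 64)   # e.g., 100 learned filters on 8x8 patches
print(len(select_uncorrelated_filters(W)))
```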

  6. Trace elements based classification on clinkers. Application to Spanish clinkers

    OpenAIRE

    Tamás, F. D.; Abonyi, J.; Puertas, F.

    2001-01-01

    The qualitative identification to determine the origin (i.e., manufacturing factory) of Spanish clinkers is described. The classification of clinkers produced in different factories can be based on their trace element content. Approximately fifteen clinker types were analysed, collected from 11 Spanish cement factories, to determine their Mg, Sr, Ba, Mn, Ti, Zr, Zn and V content. An expert system formulated as a binary decision tree is designed based on the collected data. The performance of the...

  7. An adjoint-based scheme for eigenvalue error improvement

    Energy Technology Data Exchange (ETDEWEB)

    Merton, S.R.; Smedley-Stevenson, R.P., E-mail: Simon.Merton@awe.co.uk, E-mail: Richard.Smedley-Stevenson@awe.co.uk [AWE plc, Berkshire (United Kingdom); Pain, C.C.; El-Sheikh, A.H.; Buchan, A.G., E-mail: c.pain@imperial.ac.uk, E-mail: a.el-sheikh@imperial.ac.uk, E-mail: andrew.buchan@imperial.ac.uk [Department of Earth Science and Engineering, Imperial College, London (United Kingdom)

    2011-07-01

    A scheme for improving the accuracy and reducing the error in eigenvalue calculations is presented. Using a first order Taylor series expansion of both the eigenvalue solution and the residual of the governing equation, an approximation to the error in the eigenvalue is derived. This is done using a convolution of the equation residual and adjoint solution, which is calculated in-line with the primal solution. A defect correction on the solution is then performed in which the approximation to the error is used to apply a correction to the eigenvalue. The method is shown to dramatically improve convergence of the eigenvalue. The equation for the eigenvalue is shown to simplify when certain normalisations are applied to the eigenvector. Two such normalisations are considered; the first of these is a fission-source type of normalisation and the second is an eigenvector normalisation. Results are demonstrated on a number of demanding elliptic problems using continuous Galerkin weighted finite elements. Moreover, the correction scheme may also be applied to hyperbolic problems and arbitrary discretization. This is not limited to spatial corrections and may be used throughout the phase space of the discrete equation. The applied correction not only improves fidelity of the calculation, it allows assessment of the reliability of numerical schemes to be made and could be used to guide mesh adaption algorithms or to automate mesh generation schemes. (author)
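
    A matrix-eigenproblem analogue of the adjoint-weighted defect correction described above (the paper treats transport equations; the random matrix, the power-iteration estimates, and all names below are illustrative stand-ins):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((50, 50))       # positive matrix: real dominant eigenvalue

# Crude primal (right) and adjoint (left) eigenvector estimates:
v = rng.random(50)
w = rng.random(50)
for _ in range(3):             # deliberately under-converged
    v = A @ v; v /= np.linalg.norm(v)
    w = A.T @ w; w /= np.linalg.norm(w)
lam = v @ A @ v / (v @ v)      # Rayleigh-quotient eigenvalue estimate

residual = A @ v - lam * v     # equation residual of the primal solve
lam_corrected = lam + (w @ residual) / (w @ v)  # adjoint-weighted defect correction

exact = max(np.linalg.eigvals(A).real)
print(abs(lam - exact), abs(lam_corrected - exact))
```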

  8. Classification of Mental Disorders Based on Temperament

    Directory of Open Access Journals (Sweden)

    Nadi Sakhvidi

    2015-08-01

    Full Text Available Context: Different paradoxical theories are available regarding psychiatric disorders. The current study aimed to establish a more comprehensive overall approach. Evidence Acquisition: This basic study examined ancient medical books. "The Canon" by Avicenna and "Comprehensive Textbook of Psychiatry" by Kaplan and Sadock were the most important and frequently consulted books in this study. Results: Four groups of temperaments were identified: high active, high flexible; high active, low flexible; low active, low flexible; and low active, high flexible. When temperament deteriorates, personality, non-psychotic, and psychotic psychiatric disorders can develop. Conclusions: Temperaments can provide a basis to classify psychiatric disorders. Psychiatric disorders can be placed in a spectrum based on temperaments.

  9. ECC Based Threshold Decryption Scheme and Its Application in Web Security

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xian-feng; ZHANG Feng; QIN Zhi-guang; LIU Jin-de

    2004-01-01

    Threshold cryptography provides a new approach to building intrusion-tolerant applications. In this paper, a threshold decryption scheme based on elliptic curve cryptography is presented. A zero-knowledge test approach based on elliptic curve cryptography is designed. The application of these techniques in Web security is studied. Performance analysis shows that our scheme is characterized by excellent security as well as high efficiency.

  10. Performance Analysis of Virtual MIMO Relaying Schemes Based on Detect–Split–Forward

    KAUST Repository

    Al-Basit, Suhaib M.

    2014-10-29

    Virtual multi-input multi-output (vMIMO) schemes in wireless communication systems improve coverage, throughput, capacity, and quality of service. In this paper, we propose three uplink vMIMO relaying schemes based on detect-split-forward (DSF). In addition, we investigate the effect of several physical parameters such as distance, modulation type and number of relays. Furthermore, an adaptive vMIMO DSF scheme based on VBLAST and STBC is proposed. We also provide analytical tools to evaluate the performance of the proposed vMIMO relaying schemes.

  11. UNIFIED COMPUTATION OF FLOW WITH COMPRESSIBLE AND INCOMPRESSIBLE FLUID BASED ON ROE'S SCHEME

    Institute of Scientific and Technical Information of China (English)

    HUANG Dian-gui

    2006-01-01

    A unified numerical scheme for the solutions of the compressible and incompressible Navier-Stokes equations is investigated based on a time-derivative preconditioning algorithm. The primitive variables are pressure, velocities and temperature. The time integration scheme is used in conjunction with a finite volume discretization. The preconditioning is coupled with a high order implicit upwind scheme based on the definition of a Roe's-type matrix. Computational capabilities are demonstrated through computations of high Mach number, middle Mach number, very low Mach number, and incompressible flow. It is also demonstrated that discontinuities in the flow field can be captured by the implemented Roe's scheme.

  12. A Topic Space Oriented User Group Discovering Scheme in Social Network: A Trust Chain Based Interest Measuring Perspective

    Directory of Open Access Journals (Sweden)

    Wang Dong

    2016-01-01

    Full Text Available Currently, user groups have become an effective platform for information sharing and communication among users in social network sites. In the present work, we propose a single-topic user group discovering scheme, which includes three phases: topic impact evaluation, interest degree measurement, and trust chain based discovering, to enable selecting an influential topic and discovering users into a topic oriented group. Our main works include (1) an overview of the proposed scheme and its related definitions; (2) a topic space construction method based on topic relatedness clustering and the evaluation of its impact (influence degree and popularity degree); (3) a trust chain model that takes user relation network topological information into account from a strength classification perspective; (4) an interest degree (user explicit and implicit interest degree) evaluation method based on trust chains among users; and (5) a topic space oriented user group discovering method to group core users according to their explicit interest degrees and to predict ordinary users from implicit interest and user trust chains. Finally, experimental results are given to explain the effectiveness and feasibility of our scheme.

  13. Identity based Encryption and Biometric Authentication Scheme for Secure Data Access in Cloud Computing

    DEFF Research Database (Denmark)

    Cheng, Hongbing; Rong, Chunming; Tan, Zheng-Hua

    2012-01-01

    Cloud computing will be a main information infrastructure in the future; it consists of many large datacenters which are usually geographically distributed and heterogeneous. How to design a secure data access scheme for a cloud computing platform is a big challenge. In this paper, we propose a secure data access scheme based on identity-based encryption and biometric authentication for cloud computing. Firstly, we describe the security concerns of cloud computing and then propose an integrated data access scheme; the procedure of the proposed scheme includes parameter setup, key distribution, feature template creation, cloud data processing and secure data access control. Finally, we compare the proposed scheme with other schemes through comprehensive analysis and simulation. The results show that the proposed data access scheme is feasible and secure for cloud computing.

  14. A wrapper-based approach to image segmentation and classification.

    Science.gov (United States)

    Farmer, Michael E; Jain, Anil K

    2005-12-01

    The traditional processing flow of segmentation followed by classification in computer vision assumes that the segmentation is able to successfully extract the object of interest from the background image. It is extremely difficult to obtain a reliable segmentation without any prior knowledge about the object that is being extracted from the scene. This is further complicated by the lack of any clearly defined metrics for evaluating the quality of segmentation or for comparing segmentation algorithms. We propose a method of segmentation that addresses both of these issues by using the object classification subsystem as an integral part of the segmentation. This provides contextual information regarding the objects to be segmented, as well as allowing us to use the probability of correct classification as a metric to determine the quality of the segmentation. We view traditional segmentation as a filter operating on the image that is independent of the classifier, much like the filter methods for feature selection. We propose a new paradigm for segmentation and classification that follows the wrapper methods of feature selection. Our method wraps the segmentation and classification together, and uses the classification accuracy as the metric to determine the best segmentation. By using shape as the classification feature, we are able to develop a segmentation algorithm that relaxes the requirement that the object of interest be homogeneous in some low-level image parameter, such as texture, color, or grayscale. This represents an improvement over other segmentation methods that have used classification information only to modify the segmenter parameters, since those algorithms still require an underlying homogeneity in some parameter space. Rather than considering our method as yet another segmentation algorithm, we propose that our wrapper method can be considered as an image segmentation framework, within which existing image segmentation algorithms can be applied.

  15. Similarity-Based Classification in Partially Labeled Networks

    Science.gov (United States)

    Zhang, Qian-Ming; Shang, Ming-Sheng; Lü, Linyuan

    Two main difficulties in the problem of classification in partially labeled networks are the sparsity of the known labeled nodes and the inconsistency of label information. To address these two difficulties, we propose a similarity-based method, where the basic assumption is that two nodes are more likely to be categorized into the same class if they are more similar. In this paper, we introduce ten similarity indices defined on the network structure. Empirical results on the co-purchase network of political books show that the similarity-based method can, to some extent, overcome these two difficulties and give more accurate classification than the relational neighbors method, especially when the labeled nodes are sparse. Furthermore, we find that when the information of known labeled nodes is sufficient, the indices considering only local information can perform as well as the global indices while having much lower computational complexity.
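
    A minimal sketch of the similarity-based rule using one of the local indices (common neighbors); the karate-club graph and the sparse label set here are illustrative, not the book network from the paper:

```python
import networkx as nx

def predict_label(G, node, labels):
    # Score each class by the summed common-neighbors similarity between
    # the unlabeled node and every labeled node, then pick the best class.
    scores = {}
    for other, lab in labels.items():
        if other == node:
            continue
        sim = len(set(G[node]) & set(G[other]))  # common-neighbors index
        scores[lab] = scores.get(lab, 0) + sim
    return max(scores, key=scores.get)

G = nx.karate_club_graph()
labels = {0: "A", 1: "A", 32: "B", 33: "B"}      # sparse known labels
print(predict_label(G, 16, labels))
```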

  16. Object-Based Classification and Change Detection of Hokkaido, Japan

    Science.gov (United States)

    Park, J. G.; Harada, I.; Kwak, Y.

    2016-06-01

    Topography and geology are factors that characterize the distribution of natural vegetation. Topographic contour particularly influences the living conditions of plants, such as soil moisture, sunlight, and windiness. Vegetation associations with similar characteristics are present in locations with similar topographic conditions, unless natural disturbances such as landslides and forest fires or artificial disturbances such as deforestation and man-made plantation bring about changes in those conditions. We developed a vegetation map of Japan using an object-based segmentation approach with topographic information (elevation, slope, slope direction) that is closely related to the distribution of vegetation. The results show that object-based classification is more effective than pixel-based classification for producing a vegetation map.

  17. Twitter content classification

    OpenAIRE

    2010-01-01

    This paper delivers a new Twitter content classification framework based on sixteen existing Twitter studies and a grounded theory analysis of a personal Twitter history. It expands the existing understanding of Twitter as a multifunction tool for personal, professional, commercial and phatic communications with a split-level classification scheme that offers broad categorization and specific subcategories for deeper insight into the real-world application of the service.

  18. Chaos-based partial image encryption scheme based on linear fractional and lifting wavelet transforms

    Science.gov (United States)

    Belazi, Akram; Abd El-Latif, Ahmed A.; Diaconu, Adrian-Viorel; Rhouma, Rhouma; Belghith, Safya

    2017-01-01

    In this paper, a new chaos-based partial image encryption scheme based on Substitution-boxes (S-boxes) constructed by a chaotic system and Linear Fractional Transform (LFT) is proposed. It encrypts only the requisite parts of the sensitive information in the Lifting-Wavelet Transform (LWT) frequency domain based on a hybrid of chaotic maps and a new S-box. In the proposed encryption scheme, the characteristics of confusion and diffusion are accomplished in three phases: block permutation, substitution, and diffusion. Then, we used dynamic keys instead of the fixed keys used in other approaches, to control the encryption process and make any attack impossible. The new S-box was constructed by mixing a chaotic map and the LFT to ensure high confidentiality in the inner encryption of the proposed approach. In addition, the hybrid compound of the S-box and chaotic systems strengthens the whole encryption performance and enlarges the key space required to resist brute force attacks. Extensive experiments were conducted to evaluate the security and efficiency of the proposed approach. In comparison with previous schemes, the proposed cryptosystem showed high performance and great potential for prominent prevalence in cryptographic applications.

  19. A Quantum Proxy Weak Blind Signature Scheme Based on Controlled Quantum Teleportation

    Science.gov (United States)

    Cao, Hai-Jing; Yu, Yao-Feng; Song, Qin; Gao, Lan-Xiang

    2015-04-01

    Proxy blind signatures are applied in electronic payment systems, electronic voting systems, mobile agent systems, Internet security, etc. A quantum proxy weak blind signature scheme is proposed in this paper. It is based on controlled quantum teleportation. A five-qubit entangled state functions as the quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement message blinding, so it can guarantee not only the unconditional security of the scheme but also the anonymity of the message owner.

  20. Error Robust H.264 Video Transmission Schemes Based on Multi-frame

    Institute of Scientific and Technical Information of China (English)

    余红斌; 余松煜; 王慈

    2004-01-01

    Multi-frame coding is supported by the emerging H.264 standard. It is important for the enhancement of both coding efficiency and error robustness. In this paper, error resilient schemes for H.264 based on multi-frame coding were investigated. Error robust H.264 video transmission schemes were introduced for applications with and without a feedback channel. The experimental results demonstrate the effectiveness of the proposed schemes.

  1. Remote sensing image compression method based on lift scheme wavelet transform

    Science.gov (United States)

    Tao, Hongjiu; Tang, Xinjian; Liu, Jian; Tian, Jinwen

    2003-06-01

    Based on the lifting scheme and the construction theorem for integer Haar and biorthogonal wavelets, we propose a new method for constructing integer wavelet transforms, after introducing the construction of application-specific biorthogonal wavelet transforms from the Haar and Lazy wavelets. In this paper, we present the method and algorithm of the lifting scheme, give the mathematical formulation of the method, and report experimental results.

  2. Efficient Identity Based Signcryption Scheme with Public Verifiability and Forward Security

    Institute of Scientific and Technical Information of China (English)

    LEI Fei-yu; CHEN Wen; CHEN Ke-fei; MA Chang-she

    2005-01-01

    In this paper, we point out that Libert and Quisquater's signcryption scheme cannot provide public verifiability. We then present a new identity based signcryption scheme using quadratic residues and pairings over elliptic curves. It combines the functionalities of public verifiability and forward security at the same time. Under the Bilinear Diffie-Hellman and quadratic residue assumptions, we show that the new scheme is more secure and can be somewhat more efficient than Libert and Quisquater's.

  3. Classification data mining method based on dynamic RBF neural networks

    Science.gov (United States)

    Zhou, Lijuan; Xu, Min; Zhang, Zhang; Duan, Luping

    2009-04-01

    With the wide application of databases and the rapid development of the Internet, the capacity to manufacture and collect data using information technology has grown greatly. It is an urgent problem to mine useful information or knowledge from large databases or data warehouses. Therefore, data mining (DM) technology is developing rapidly to meet this need. But DM often faces data that are noisy, disordered and nonlinear. Fortunately, the Artificial Neural Network (ANN) is suitable for solving the aforementioned problems of DM because ANN has such merits as good robustness, adaptability, parallel processing, distributed memory and high error tolerance. This paper gives a detailed discussion of the application of ANN methods in DM based on an analysis of the various data mining technologies, and especially focuses on classification data mining based on RBF neural networks. Pattern classification is an important part of RBF neural network applications. In an on-line environment, the training dataset is variable, so batch learning algorithms (e.g., OLS), which generate plenty of unnecessary retraining, have lower efficiency. This paper derives an incremental learning algorithm (ILA) from the gradient descent algorithm to remove this bottleneck. ILA can adaptively adjust the parameters of RBF networks by minimizing the error cost, without any redundant retraining. Using the method proposed in this paper, an on-line classification system was constructed to solve the IRIS classification problem. Experimental results show that the algorithm has a fast convergence rate and excellent on-line classification performance.
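
    A minimal RBF network with per-sample gradient updates, in the spirit of the ILA described above (the paper's exact update rules, center placement, and parameter adaptation are not reproduced; all names are illustrative):

```python
import numpy as np

class IncrementalRBF:
    def __init__(self, centers, width, lr=0.1):
        self.centers = np.asarray(centers)
        self.width = width
        self.w = np.zeros(len(centers))   # output weights, learned on-line
        self.lr = lr

    def _phi(self, x):
        # Gaussian activations of the hidden units for input x.
        d2 = np.sum((self.centers - x) ** 2, axis=1)
        return np.exp(-d2 / (2 * self.width ** 2))

    def predict(self, x):
        return self._phi(x) @ self.w

    def partial_fit(self, x, y):
        # One gradient step on squared error -- no batch retraining needed.
        err = y - self.predict(x)
        self.w += self.lr * err * self._phi(x)

net = IncrementalRBF(centers=np.random.randn(10, 4), width=1.0)
for x, y in [(np.random.randn(4), 1.0), (np.random.randn(4), 0.0)]:
    net.partial_fit(x, y)   # on-line updates as new samples arrive
```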

  4. Land Cover and Land Use Classification with TWOPAC: towards Automated Processing for Pixel- and Object-Based Image Classification

    Directory of Open Access Journals (Sweden)

    Stefan Dech

    2012-09-01

    Full Text Available We present a novel and innovative automated processing environment for the derivation of land cover (LC) and land use (LU) information. This processing framework, named TWOPAC (TWinned Object and Pixel based Automated classification Chain), enables the standardized, independent, user-friendly, and comparable derivation of LC and LU information, with minimized manual classification labor. TWOPAC allows classification of multi-spectral and multi-temporal remote sensing imagery from different sensor types. TWOPAC enables not only pixel-based classification, but also allows classification based on object-based characteristics. Classification is based on a Decision Tree approach (DT), for which the well-known C5.0 code has been implemented, which builds decision trees based on the concept of information entropy. TWOPAC enables automatic generation of the decision tree classifier based on a C5.0-retrieved ASCII file, as well as fully automatic validation of the classification output via sample-based accuracy assessment. Envisaging the automated generation of standardized land cover products, as well as area-wide classification of large amounts of data in preferably a short processing time, standardized interfaces for process control, Web Processing Services (WPS), as introduced by the Open Geospatial Consortium (OGC), are utilized. TWOPAC's functionality to process geospatial raster or vector data via web resources (server, network) enables TWOPAC's usability independent of any commercial client or desktop software and allows for large-scale data processing on servers. Furthermore, the components of TWOPAC were built using open source code components and are implemented as a plug-in for Quantum GIS software for easy handling of the classification process from the user's perspective.

  5. Rule-Based Classification of Chemical Structures by Scaffold.

    Science.gov (United States)

    Schuffenhauer, Ansgar; Varin, Thibault

    2011-08-01

    Databases of small organic chemical molecules usually contain millions of structures. The screening decks of pharmaceutical companies contain more than a million structures. Nevertheless, chemical substructure searching in these databases can be performed interactively in seconds. Because of this, nobody has really missed structural classification of these databases for the purpose of finding data for individual chemical substructures. However, a full-deck high-throughput screen also produces activity data for more than a million substances. How can this amount of data be analyzed? Which are the active scaffolds identified by an assay? To answer such questions, systematic classifications of molecules by scaffolds are needed. In this review it is described how molecules can be hierarchically classified by their scaffolds. It is explained how such classifications can be used to identify active scaffolds in an HTS data set. Once active classes are identified, they need to be visualized in the context of related scaffolds in order to understand SAR. Consequently, such visualizations are another topic of this review. In addition, scaffold-based diversity measures are discussed and an outlook is given on the potential impact of structural classifications on a chemically aware semantic web.
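
    One common, concrete realization of scaffold-based grouping is the Bemis-Murcko scaffold, available in RDKit; this is an illustrative sketch, not the specific hierarchical scheme discussed in the review:

```python
# Requires RDKit (pip install rdkit); the example SMILES are arbitrary.
from collections import defaultdict
from rdkit import Chem
from rdkit.Chem.Scaffolds import MurckoScaffold

smiles = ["c1ccccc1CCN", "c1ccccc1CCO", "C1CCCCC1N"]
classes = defaultdict(list)
for smi in smiles:
    mol = Chem.MolFromSmiles(smi)
    # Strip side chains, keep ring systems and linkers as the class key:
    scaffold = Chem.MolToSmiles(MurckoScaffold.GetScaffoldForMol(mol))
    classes[scaffold].append(smi)

for scaffold, members in classes.items():
    print(scaffold, members)   # benzene-scaffold pair vs. cyclohexane
```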

  6. Comparison Of Power Quality Disturbances Classification Based On Neural Network

    Directory of Open Access Journals (Sweden)

    Nway Nway Kyaw Win

    2015-07-01

    Full Text Available Abstract Power quality disturbances (PQDs) cause serious problems in the reliability, safety and economy of power system networks. In order to improve electric power quality, PQD events must be detected and classified by type of transient fault. A software methodology based on the wavelet transform with the multiresolution analysis (MRA) algorithm and two feed-forward neural networks, a probabilistic neural network (PNN) and a multilayer feed-forward (MLFF) neural network, for automatic classification of eight types of PQ signals (flicker, harmonics, sag, swell, impulse, fluctuation, notch and oscillatory) will be presented. The wavelet family Db4 is chosen in this system to calculate the values of the detailed energy distributions as input features for classification because it can perform well in detecting and localizing various types of PQ disturbances. This technique classifies the types of PQD events. The classifiers classify and identify the disturbance type according to the energy distribution. The results show that the PNN can analyze different power disturbance types efficiently. Therefore, it can be seen that the PNN has better classification accuracy than the MLFF.
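
    The detail-energy feature extraction lends itself to a compact sketch with PyWavelets; the synthetic sag waveform and the five-level depth are illustrative choices, and the PNN/MLFF classifiers are omitted:

```python
import numpy as np
import pywt

def db4_energy_features(signal, level=5):
    # Db4 multiresolution analysis; return the energy of each detail band.
    coeffs = pywt.wavedec(signal, "db4", level=level)
    details = coeffs[1:]   # detail coefficients D_level .. D1
    return np.array([np.sum(d ** 2) for d in details])

# Synthetic 50 Hz waveform with a voltage sag between 0.05 s and 0.12 s:
t = np.linspace(0, 0.2, 2000)
sag = np.sin(2 * np.pi * 50 * t) * np.where((t > 0.05) & (t < 0.12), 0.6, 1.0)
print(db4_energy_features(sag))   # feature vector fed to the classifier
```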

  7. DWT-Based Robust Color Image Watermarking Scheme

    Institute of Scientific and Technical Information of China (English)

    Liu Lianshan; Li Renhou; Gao Qi

    2005-01-01

    A scheme for embedding an encrypted watermark into the green component of a color image is proposed. The embedding process is implemented in the discrete wavelet transformation (DWT) domain. The original binary watermark image is first encrypted through a scrambling technique, then spread with two orthogonal pseudo-random sequences whose mean values are equal to zero, and finally embedded into the DWT low frequency sub-band of the green component. The coefficients whose energies are larger than the others are selected to hide the watermark, and the hidden watermark strength is determined by the ratio between the selected coefficients' energies and the mean energy of the sub-band. The experimental results demonstrate that the proposed watermarking scheme is very robust against attacks such as additive noise, low-pass filtering, scaling, image cropping, row (or column) deletion, and JPEG compression.
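
    A stripped-down sketch of spread-spectrum embedding and the matching correlation detector in a DWT sub-band, using PyWavelets; the scrambling step, the two orthogonal sequences, and the energy-based coefficient selection of the actual scheme are omitted, and all names are illustrative:

```python
import numpy as np
import pywt

def embed_bit(green, bit, alpha=8.0, seed=42):
    # Embed one watermark bit in the LL sub-band via antipodal PN spreading.
    LL, rest = pywt.dwt2(green.astype(float), "haar")
    rng = np.random.default_rng(seed)
    pn = rng.choice([-1.0, 1.0], size=LL.shape)   # zero-mean PN sequence
    LL = LL + alpha * (pn if bit else -pn)
    return pywt.idwt2((LL, rest), "haar")

def detect_bit(marked, seed=42):
    # Blind detection: correlate the demeaned LL sub-band with the PN sequence.
    LL, _ = pywt.dwt2(marked, "haar")
    rng = np.random.default_rng(seed)
    pn = rng.choice([-1.0, 1.0], size=LL.shape)
    return float(np.sum((LL - LL.mean()) * pn)) > 0

green = np.random.randint(0, 256, (128, 128))
assert detect_bit(embed_bit(green, True))
```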

  8. A continuous and prognostic convection scheme based on buoyancy, PCMT

    Science.gov (United States)

    Guérémy, Jean-François; Piriou, Jean-Marcel

    2016-04-01

    A new and consistent convection scheme (PCMT: Prognostic Condensates Microphysics and Transport), providing a continuous and prognostic treatment of this atmospheric process, is described. The main concept ensuring the consistency of the whole system is buoyancy, the key element of any vertical motion. The buoyancy constitutes the forcing term of the convective vertical velocity, which is then used to define the triggering condition, the mass flux, and the rates of entrainment-detrainment. The buoyancy is also used in its vertically integrated form (CAPE) to determine the closure condition. The continuous treatment of convection, from dry thermals to deep precipitating convection, is achieved with the help of a continuous formulation of the entrainment-detrainment rates (depending on the convective vertical velocity) and of the CAPE relaxation time (depending on the convective over-turning time). The convective tendencies are directly expressed in terms of condensation and transport. Finally, the convective vertical velocity and condensates are fully prognostic, the latter being treated using the same microphysics scheme as for the resolved condensates but considering the convective environment. A Single Column Model (SCM) validation of this scheme is shown, allowing detailed comparisons with observed and explicitly simulated data. Four cases covering the convective spectrum are considered: sensitivity to environmental moisture over ocean (S. Derbyshire), from non-precipitating shallow convection to deep precipitating convection; trade wind shallow convection (BOMEX); strato-cumulus (FIRE); together with an entire continental diurnal cycle of convection (ARM). The emphasis is put on the characteristics of the scheme which enable a continuous treatment of convection. Then, a 3D LAM validation is presented considering an AMMA case with both observations and a CRM simulation using the same initial and lateral conditions as for the parameterized one. Finally, global

  9. Structure-based classification and ontology in chemistry

    Directory of Open Access Journals (Sweden)

    Hastings Janna

    2012-04-01

    Full Text Available Abstract Background Recent years have seen an explosion in the availability of data in the chemistry domain. With this information explosion, however, retrieving relevant results from the available information, and organising those results, become even harder problems. Computational processing is essential to filter and organise the available resources so as to better facilitate the work of scientists. Ontologies encode expert domain knowledge in a hierarchically organised machine-processable format. One such ontology for the chemical domain is ChEBI. ChEBI provides a classification of chemicals based on their structural features and a role or activity-based classification. An example of a structure-based class is 'pentacyclic compound' (compounds containing five-ring structures), while an example of a role-based class is 'analgesic', since many different chemicals can act as analgesics without sharing structural features. Structure-based classification in chemistry exploits elegant regularities and symmetries in the underlying chemical domain. As yet, there has been neither a systematic analysis of the types of structural classification in use in chemistry nor a comparison to the capabilities of available technologies. Results We analyze the different categories of structural classes in chemistry, presenting a list of patterns for features found in class definitions. We compare these patterns of class definition to tools which allow for automation of hierarchy construction within cheminformatics and within logic-based ontology technology, going into detail in the latter case with respect to the expressive capabilities of the Web Ontology Language and recent extensions for modelling structured objects. Finally we discuss the relationships and interactions between cheminformatics approaches and logic-based approaches. Conclusion Systems that perform intelligent reasoning tasks on chemistry data require a diverse set of underlying computational

  10. MPSK Symbol-based Soft-Information-Forwarding Scheme in Rayleigh Fading Channels

    Directory of Open Access Journals (Sweden)

    Huamei Xin

    2014-06-01

    Full Text Available In this paper, we propose a symbol-based multiple phase shift keying (MPSK) soft-information-forwarding (SIF) scheme for a two-hop parallel relay wireless network in a Rayleigh fading channel. First, the binary information streams at the source are mapped into MPSK symbols, and the relays construct the relay processing function by passing the intermediate soft decisions. Then the relays broadcast the processed symbols to the destination. After maximum ratio combining, the received symbols at the destination can be decided by maximum-likelihood (ML) decision. Four MPSK symbol-based forwarding schemes are investigated, and the simulation results show that the soft-information-forwarding scheme has better bit error rate (BER) performance than the existing memoryless forwarding scheme based on MPSK modulation, and it is more practical than the SIF scheme based on BPSK modulation.

  11. Inverse Category Frequency based supervised term weighting scheme for text categorization

    CERN Document Server

    Wang, Deqing; Wu, Wenjun

    2010-01-01

    Unsupervised term weighting schemes, borrowed from the information retrieval field, have been widely used for text categorization, and the most famous one is tf.idf. The intuition behind idf seems less reasonable for the TC task than for the IR task. In this paper, we introduce inverse category frequency into supervised term weighting schemes and propose a novel icf-based method. The method combines icf and relevance frequency (rf) to weight terms in the training dataset. Our experiments have shown that the icf-based supervised term weighting scheme is superior to tf.rf, prob-based supervised term weighting schemes, and tf.idf on two widely used datasets, i.e., the unbalanced Reuters-21578 corpus and the balanced 20 Newsgroups corpus. We also present detailed evaluations of each category of the two datasets among the four term weighting schemes on precision, recall and F1 measure.
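
    The icf component itself is a one-liner over category counts; a minimal sketch (rf, which the paper combines with icf, is omitted for brevity, and the log base is an assumption):

```python
import math
from collections import defaultdict

def inverse_category_frequency(docs, labels):
    # icf(t) = log(|C| / cf(t)), where cf(t) is the number of categories
    # in which term t occurs; terms confined to few categories get high weight.
    categories = set(labels)
    term_cats = defaultdict(set)
    for doc, lab in zip(docs, labels):
        for t in set(doc):
            term_cats[t].add(lab)
    return {t: math.log(len(categories) / len(cats))
            for t, cats in term_cats.items()}

docs = [["cheap", "offer"], ["meeting", "agenda"], ["offer", "agenda"]]
print(inverse_category_frequency(docs, ["junk", "work", "work"]))
# "cheap" and "meeting" (single-category terms) outweigh "offer" (both categories)
```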

  12. Authentication Scheme Based on Principal Component Analysis for Satellite Images

    Directory of Open Access Journals (Sweden)

    Ashraf. K. Helmy

    2009-09-01

    Full Text Available This paper presents a multi-band wavelet image content authentication scheme for satellite images incorporating principal component analysis (PCA). The proposed scheme achieves higher perceptual transparency and stronger robustness. Specifically, the developed watermarking scheme can successfully resist common signal processing such as JPEG compression and geometric distortions such as cropping. In addition, the proposed scheme can be parameterized, resulting in more security: an attacker may not be able to extract the embedded watermark without knowing the parameter. In order to meet these requirements, the host image is transformed to YIQ to decrease the correlation between different bands. Then the multi-band wavelet transform (M-WT) is applied to each channel separately, obtaining one approximation sub-band and fifteen detail sub-bands. PCA is then applied to the coefficients corresponding to the same spatial location in all detail sub-bands. The last principal component band represents an excellent domain for inserting the watermark, since it represents the least correlated features in the high frequency area of the host image. One of the most important aspects of satellite images is the spectral signature, the behavior of different features in different spectral bands; the results of the proposed algorithm show that the spectral signatures of different features are not tainted after inserting the watermark.

  13. An AERONET-based aerosol classification using the Mahalanobis distance

    Science.gov (United States)

    Hamill, Patrick; Giordano, Marco; Ward, Carolyne; Giles, David; Holben, Brent

    2016-09-01

    We present an aerosol classification based on AERONET aerosol data from 1993 to 2012. We used the AERONET Level 2.0 almucantar aerosol retrieval products to define several reference aerosol clusters which are characteristic of the following general aerosol types: Urban-Industrial, Biomass Burning, Mixed Aerosol, Dust, and Maritime. The classification of a particular aerosol observation as one of these aerosol types is determined by its five-dimensional Mahalanobis distance to each reference cluster. We have calculated the fractional aerosol type distribution at 190 AERONET sites, as well as the monthly variation in aerosol type at those locations. The results are presented on a global map and individually in the supplementary material. Our aerosol typing is based on recognizing that different geographic regions exhibit characteristic aerosol types. To generate reference clusters, we only keep data points that lie within a Mahalanobis distance of 2 from the centroid. Our aerosol characterization is based on the AERONET-retrieved quantities; therefore it does not include low optical depth values. The analysis is based on "point sources" (the AERONET sites) rather than globally distributed values. The classifications obtained will be useful in interpreting aerosol retrievals from satellite-borne instruments.
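
    Assigning an observation to the nearest reference cluster by Mahalanobis distance takes only a few lines; the five-dimensional vectors, cluster names, and identity covariances below are illustrative stand-ins for the AERONET-derived reference clusters:

```python
import numpy as np

def classify_aerosol(x, clusters):
    # Pick the reference cluster with the smallest Mahalanobis distance.
    best, best_d = None, np.inf
    for name, (mu, cov) in clusters.items():
        diff = x - mu
        d = np.sqrt(diff @ np.linalg.inv(cov) @ diff)
        if d < best_d:
            best, best_d = name, d
    return best, best_d

rng = np.random.default_rng(0)
clusters = {   # illustrative stand-ins for the reference aerosol types
    "Dust": (rng.normal(size=5), np.eye(5)),
    "Maritime": (rng.normal(size=5), np.eye(5)),
}
print(classify_aerosol(rng.normal(size=5), clusters))
```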

  14. Micro-Doppler Based Classification of Human Aquatic Activities via Transfer Learning of Convolutional Neural Networks

    Directory of Open Access Journals (Sweden)

    Jinhee Park

    2016-11-01

    Full Text Available Accurate classification of human aquatic activities using radar has a variety of potential applications such as rescue operations and border patrols. Nevertheless, the classification of activities on water using radar has not been extensively studied, unlike the case on dry ground, due to its unique challenges. Namely, not only is the radar cross section of a human on water small, but the micro-Doppler signatures are much noisier due to water drops and waves. In this paper, we first investigate whether discriminative signatures can be obtained for activities on water through a simulation study. Then, we show how we can effectively achieve high classification accuracy by applying deep convolutional neural networks (DCNN) directly to the spectrograms of real measurement data. From five-fold cross-validation on our dataset, which consists of five aquatic activities, we report that the conventional feature-based scheme only achieves an accuracy of 45.1%. In contrast, the DCNN trained using only the collected data attains 66.7%, and the transfer-learned DCNN, which takes a DCNN pre-trained on an RGB image dataset and fine-tunes the parameters using the collected data, achieves a much higher 80.3%, which is a significant performance boost.

  15. Vascular bone tumors: a proposal of a classification based on clinicopathological, radiographic and genetic features

    Energy Technology Data Exchange (ETDEWEB)

    Errani, Costantino [Istituto Ortopedico Rizzoli, Ortopedia Generale, Orthopaedic Service, Bagheria (Italy); Struttura Complessa Ortopedia Generale, Dipartimento Rizzoli-Sicilia, Bagheria, PA (Italy); Vanel, Daniel; Gambarotti, Marco; Alberghini, Marco [Istituto Ortopedico Rizzoli, Pathology Service, Bologna (Italy); Picci, Piero [Istituto Ortopedico Rizzoli, Laboratory for Cancer Research, Bologna (Italy); Faldini, Cesare [Istituto Ortopedico Rizzoli, Ortopedia Generale, Orthopaedic Service, Bagheria (Italy)

    2012-12-15

    The classification of vascular bone tumors remains challenging, with considerable morphological overlap spanning across benign to malignant categories. The vast majority of both benign and malignant vascular tumors are readily diagnosed based on their characteristic histological features, such as the formation of vascular spaces and the expression of endothelial markers. However, some vascular tumors have atypical histological features, such as a solid growth pattern, epithelioid change, or spindle cell morphology, which complicates their diagnosis. Pathologically, these tumors are remarkably similar, which makes differentiating them from each other very difficult. For this rare subset of vascular bone tumors, there remains considerable controversy with regard to the terminology and the classification that should be used. Moreover, one of the most confusing issues related to vascular bone tumors is the myriad of names that are used to describe them. Because the clinical behavior and, consequently, treatment and prognosis of vascular bone tumors can vary significantly, it is important to effectively and accurately distinguish them from each other. Upon review of the nomenclature and the characteristic clinicopathological, radiographic and genetic features of vascular bone tumors, we propose a classification scheme that includes hemangioma, hemangioendothelioma, angiosarcoma, and their epithelioid variants. (orig.)

  16. A sixth order hybrid finite difference scheme based on the minimized dispersion and controllable dissipation technique

    Science.gov (United States)

    Sun, Zhen-sheng; Luo, Lei; Ren, Yu-xin; Zhang, Shi-ying

    2014-08-01

    The dispersion and dissipation properties of a scheme are of great importance for the simulation of flow fields that involve a broad range of length scales. In order to improve the spectral properties of the finite difference scheme, the authors previously proposed the idea of optimizing the dispersion and dissipation properties separately, and a fourth order scheme based on the minimized dispersion and controllable dissipation (MDCD) technique was thus constructed [29]. In the present paper, we further investigate this technique and extend it to a sixth order finite difference scheme to solve the Euler and Navier-Stokes equations. The dispersion properties of the scheme are first optimized by minimizing an elaborately designed integrated error function. Then the dispersion-dissipation condition newly derived by Hu and Adams [30] is introduced to supply sufficient dissipation to damp the unresolved wavenumbers. Furthermore, the optimized scheme is blended with an optimized Weighted Essentially Non-Oscillatory (WENO) scheme to enable discontinuity capturing. In this process, the approximate-dispersion-relation (ADR) approach is employed to optimize the spectral properties of the nonlinear scheme so as to yield the true wave propagation behavior of the finite difference scheme. Several benchmark test problems, which include broadband fluctuations and strong shock waves, are solved to validate the high resolution, good discontinuity-capturing capability and high efficiency of the proposed scheme.
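
    As a worked illustration of the dispersion analysis such optimizations rest on, the sketch below computes the modified wavenumber of the standard (non-optimized) sixth order central difference for the first derivative. The MDCD coefficients themselves are not reproduced here, so this shows the analysis tool rather than the paper's scheme.

        import numpy as np

        # Standard sixth order central difference for f'(x):
        #   f' ~ (45(f[1]-f[-1]) - 9(f[2]-f[-2]) + (f[3]-f[-3])) / (60h).
        # Substituting Fourier modes exp(i*k*x) gives the modified wavenumber
        #   (kh)' = (45 sin(kh) - 9 sin(2kh) + sin(3kh)) / 30.
        kh = np.linspace(0.0, np.pi, 400)
        kh_mod = (45*np.sin(kh) - 9*np.sin(2*kh) + np.sin(3*kh)) / 30.0

        # Dispersion error: deviation of the modified wavenumber from the
        # exact one; MDCD-type schemes minimize an integral of this error.
        err = np.abs(kh_mod - kh)
        print("largest kh resolved with error < 1e-2:", kh[err < 1e-2].max())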

  17. Secure Cooperative Spectrum Sensing via a Novel User-Classification Scheme in Cognitive Radios for Future Communication Technologies

    Directory of Open Access Journals (Sweden)

    Muhammad Usman

    2015-05-01

    Full Text Available Future communication networks will be required to deliver data on a far greater scale than is known to us today, thus mandating maximal utilization of the available radio spectrum using cognitive radios. In this paper, we propose a novel cooperative spectrum sensing approach for cognitive radios. In cooperative spectrum sensing, the fusion center relies on the reports of the cognitive users to make a global decision. The global decision is obtained by assigning weights to the reports received from the cognitive users. Computation of such weights requires prior knowledge of the probability of detection and the probability of false alarm, which are not readily available in real scenarios. Furthermore, the cognitive users are conventionally divided into reliable and unreliable categories based on their weighted energy by using some empirical threshold. In this paper, we propose a method to classify the cognitive users into reliable, neutral and unreliable categories without using any pre-defined or empirically obtained threshold. Moreover, the computation of the weights requires neither the detection or false alarm probabilities nor an estimate of these probabilities. Reliable cognitive users are assigned the highest weights; neutral cognitive users are assigned medium weights (less than the reliable and higher than the unreliable cognitive users' weights); and unreliable users are assigned the least weights. We show the performance improvement of our proposed method through simulations by comparing it with the conventional cooperative spectrum sensing scheme through different metrics, like the receiver operating characteristic (ROC) curve and the mean square error. For clarity, we also show the effect of malicious users on the detection probability and false alarm probability individually through simulations.
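
    The three-way reliability split can be sketched as follows. The grouping rule used here (distance of each report from a robust consensus, in units of the median absolute deviation) and the fixed weight values are hypothetical stand-ins for illustration; the paper's own threshold-free criterion is not reproduced in the abstract.

        import numpy as np

        def classify_users(energies):
            """Label cognitive users reliable / neutral / unreliable by how
            far their reported energy lies from the consensus."""
            center = np.median(energies)
            spread = np.median(np.abs(energies - center)) + 1e-12
            dev = np.abs(energies - center) / spread
            return np.where(dev < 1.0, "reliable",
                   np.where(dev < 3.0, "neutral", "unreliable"))

        def fuse(energies, labels):
            """Weighted global decision: reliable users get the highest
            weights, neutral users medium, unreliable users the least."""
            table = {"reliable": 1.0, "neutral": 0.5, "unreliable": 0.1}
            w = np.array([table[str(l)] for l in labels])
            return float(np.sum(w * energies) / np.sum(w))

        reports = np.array([4.9, 5.1, 5.0, 5.2, 9.8])  # last user malicious
        labels = classify_users(reports)
        print(labels, fuse(reports, labels))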

  18. A unified classification of alien species based on the magnitude of their environmental impacts.

    Directory of Open Access Journals (Sweden)

    Tim M Blackburn

    2014-05-01

    Full Text Available Species moved by human activities beyond the limits of their native geographic ranges into areas in which they do not naturally occur (termed aliens) can cause a broad range of significant changes to recipient ecosystems; however, their impacts vary greatly across species and the ecosystems into which they are introduced. There is therefore a critical need for a standardised method to evaluate, compare, and eventually predict the magnitudes of these different impacts. Here, we propose a straightforward system for classifying alien species according to the magnitude of their environmental impacts, based on the mechanisms of impact used to code species in the International Union for Conservation of Nature (IUCN) Global Invasive Species Database, which are presented here for the first time. The classification system uses five semi-quantitative scenarios describing impacts under each mechanism to assign species to different levels of impact, ranging from Minimal to Massive, with assignment corresponding to the highest level of deleterious impact associated with any of the mechanisms. The scheme also includes categories for species that are Not Evaluated, have No Alien Population, or are Data Deficient, and a method for assigning uncertainty to all the classifications. We show how this classification system is applicable at different levels of ecological complexity and different spatial and temporal scales, and embraces existing impact metrics. In fact, the scheme is analogous to the already widely adopted and accepted Red List approach to categorising extinction risk, and so could conceivably be readily integrated with existing practices and policies in many regions.

  19. Multi-robot system learning based on evolutionary classification

    Directory of Open Access Journals (Sweden)

    Manko Sergey

    2016-01-01

    Full Text Available This paper presents a novel machine learning method for the agents of a multi-robot system. The learning process is based on knowledge discovery through continual analysis of robot sensory information. We demonstrate that classification trees and evolutionary forests may form a basis for creating autonomous robots capable of both learning and knowledge exchange with the other agents in a multi-robot system. The results of experimental studies confirm the effectiveness of the proposed approach.

  20. Label-Embedding for Attribute-Based Classification

    OpenAIRE

    Akata, Zeynep; Perronnin, Florent; Harchaoui, Zaid; Schmid, Cordelia

    2013-01-01

    Attributes are an intermediate representation, which enables parameter sharing between classes, a must when training data is scarce. We propose to view attribute-based image classification as a label-embedding problem: each class is embedded in the space of attribute vectors. We introduce a function which measures the compatibility between an image and a label embedding. The parameters of this function are learned on a training set of labeled samples to ensure that, gi...

  1. Hierarchical Classification of Chinese Documents Based on N-grams

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We explore techniques for utilizing N-gram information to categorize Chinese text documents hierarchically, so that the classifier can shake off the burden of large dictionaries and complex segmentation processing and subsequently be domain and time independent. A hierarchical Chinese text classifier is implemented. Experimental results show that hierarchically classifying Chinese text documents based on N-grams can achieve satisfactory performance and outperforms the other traditional Chinese text classifiers.
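
    A minimal sketch of the underlying idea follows: character N-grams need no dictionary or word segmentation, so a classifier can be built directly on N-gram profiles. The tiny centroid classifier below is flat rather than hierarchical, and the cosine measure is an illustrative choice, not the paper's implementation.

        from collections import Counter
        import math

        def ngrams(text, n=2):
            """Character N-grams: no dictionary, no word segmentation."""
            return Counter(text[i:i + n] for i in range(len(text) - n + 1))

        def cosine(a, b):
            dot = sum(a[g] * b[g] for g in a)
            norm = lambda c: math.sqrt(sum(v * v for v in c.values()))
            return dot / (norm(a) * norm(b) + 1e-12)

        def train(labeled_docs):
            """Accumulate one N-gram centroid per class."""
            centroids = {}
            for label, text in labeled_docs:
                centroids.setdefault(label, Counter()).update(ngrams(text))
            return centroids

        def classify(text, centroids):
            profile = ngrams(text)
            return max(centroids, key=lambda c: cosine(profile, centroids[c]))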

  2. MULTIMEDIA DATA TRANSMISSION THROUGH TCP/IP USING HASH BASED FEC WITH AUTO-XOR SCHEME

    Directory of Open Access Journals (Sweden)

    R. Shalin

    2012-09-01

    Full Text Available The most preferred mode for communication of multimedia data is the TCP/IP protocol. On the other hand, TCP/IP suffers from unavoidable packet loss due to network traffic and congestion. In order to provide efficient communication, it is necessary to recover lost packets. The proposed scheme implements hash-based FEC with an auto-XOR scheme for this purpose. The scheme is implemented through forward error correction, MD5, and XOR to provide efficient transmission of multimedia data. The proposed scheme provides high transmission accuracy and throughput with low latency and packet loss.
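
    The abstract does not spell out the packet-level mechanics, but the core combination can be sketched as below: an XOR parity packet recovers any single lost packet in a group, and an MD5 digest verifies the recovered payload. The group size and framing are illustrative assumptions.

        import hashlib
        from functools import reduce

        def xor_bytes(a, b):
            return bytes(x ^ y for x, y in zip(a, b))

        def make_parity(packets):
            """One XOR parity packet per group of equal-length packets."""
            return reduce(xor_bytes, packets)

        def recover(received, parity):
            """Recover the single packet marked None in the group."""
            present = [p for p in received if p is not None]
            return reduce(xor_bytes, present, parity)

        group = [b"AAAA", b"BBBB", b"CCCC"]
        parity = make_parity(group)
        digest = hashlib.md5(group[1]).hexdigest()   # per-packet hash

        lossy = [group[0], None, group[2]]           # packet 1 lost in transit
        repaired = recover(lossy, parity)
        assert hashlib.md5(repaired).hexdigest() == digest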

  3. A novel scheme based on minimum delay at the edges for optical burst switching networks

    Institute of Scientific and Technical Information of China (English)

    Jinhui Yu(于金辉); Yijun Yang(杨毅军); Yuehua Chen(陈月华); Ge Fan(范戈)

    2003-01-01

    This paper proposes a novel scheme based on minimum delay at the edges (MDE) for optical burst switching (OBS) networks. This scheme is designed to overcome the long delay at the edge nodes of OBS networks. The MDE scheme features simultaneous burst assembly, channel scheduling, and pre-transmission of the control packet. It also features the estimated setup and explicit release (ESXR) signaling protocol. The MDE scheme can minimize the delay at the edge nodes for data packets and improve the end-to-end latency performance of OBS networks. In addition, the performance of the MDE scheme is analyzed in this paper in comparison with the conventional scheme.

  4. Admission Control Scheme for Multi-class Services in QoS-based Mobile Cellular Networks

    Institute of Scientific and Technical Information of China (English)

    YINZhiming; XIEJianying

    2004-01-01

    Call admission control (CAC) is one of the key schemes to guarantee quality of service (QoS) in mobile cellular networks. In this paper, we propose an optimal CAC scheme based on semi-Markov decision process (SMDP) theory to support multi-class services in QoS-based wireless networks. A linear programming formulation is used to find the optimal solution, which maximizes channel utilization while meeting the QoS constraints. The numerical results show that our scheme outperforms the DCAC scheme.

  5. Cryptanalysis on AW digital signature scheme based on error-correcting codes

    Institute of Scientific and Technical Information of China (English)

    张振峰; 冯登国; 戴宗锋

    2002-01-01

    In 1993, Alabhadi and Wicker gave a modification to the Xinmei digital signature scheme based on error-correcting codes, usually denoted the AW scheme. In this paper we show that the AW scheme is actually not secure: anyone holding the public keys of the signatory can obtain the equivalent private keys and then successfully forge digital signatures for arbitrary messages. We also point out that one can hardly construct a digital signature scheme of this kind with high-level security, due to the difficulty of decomposing large matrices.

  6. Improvement of Identity-Based Threshold Proxy Signature Scheme with Known Signers

    Institute of Scientific and Technical Information of China (English)

    LI Fagen; HU Yupu; CHEN Jie

    2006-01-01

    In 2006, Bao et al. proposed an identity-based threshold proxy signature scheme with known signers. In this paper, we show that Bao et al.'s scheme is vulnerable to a forgery attack: an adversary can forge a valid threshold proxy signature for any message given knowledge of a previously valid threshold proxy signature. In addition, their scheme suffers from the weakness that the proxy signers might change the threshold value; that is, the proxy signers can arbitrarily modify the threshold strategy without being detected by the original signer or the verifiers, which might violate the original signer's intent. Furthermore, we propose an improved scheme that remedies the weaknesses of Bao et al.'s scheme. The improved scheme satisfies all security requirements for threshold proxy signatures.

  7. A Novel Digital Certificate Based Remote Data Access Control Scheme in WSN

    Directory of Open Access Journals (Sweden)

    Wei Liang

    2015-01-01

    Full Text Available A digital certificate based remote data access control scheme is proposed for the safe authentication of accessors in a wireless sensor network (WSN). The scheme is founded on the access control scheme based on characteristic expressions (the CEB scheme). Data are divided by characteristics, and the encryption key is related to a characteristic expression: only a key matching the characteristic expression can decrypt the data. Meanwhile, three distributed certificate detection methods are designed to prevent certificates from being misappropriated by hostile anonymous users. When a user starts a query, the key access control method can judge whether the query is valid. In this way, the scheme achieves public certification of users and effectively protects query privacy as well. The security analysis and experiments show that the proposed scheme is superior in communication overhead, storage overhead, and detection probability.

  8. Evaluation of Scheme Design of Blast Furnace Based on Artificial Neural Network

    Institute of Scientific and Technical Information of China (English)

    TANG Hong; LI Jing-min; YAO Bi-qiang; LIAO Hong-fu; YAO Jin

    2008-01-01

    Blast furnace scheme design is very important, since it directly affects the performance, cost, and configuration of the blast furnace. An evaluation approach for furnace scheme design based on an artificial neural network was brought forward. Ten independent parameters which determine a scheme design were proposed. An improved three-layer BP network algorithm was used to build the evaluation model, in which the 10 independent parameters were taken as input evaluation indexes and the degree to which the scheme design satisfies the requirements of the blast furnace as the output. The model was trained on the existing samples of scheme designs and the experts' experience, and then tested on other samples so as to develop the evaluation model. As an example, it is found that a good blast furnace scheme design can be chosen by using the proposed evaluation model.

  9. An efficient authentication scheme based on one-way key chain for sensor network

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    To strike a tradeoff between security and the consumption of energy, computing, and communication resources in the nodes, this paper presents an efficient authentication scheme based on a one-way key chain for sensor networks. The scheme can provide immediate authentication to fulfill the latency and storage requirements, and it defends against various attacks such as replay, impersonation, and denial of service. Meanwhile, our scheme possesses low overhead and scalability to large networks. Furthermore, the simple related protocols or algorithms in the scheme and the inexpensive public-key operation required, in view of resource-starved sensor nodes, minimize the storage, computation, and communication overhead and improve the efficiency of our scheme. In addition, the proposed scheme also supports source authentication without precluding in-network processing and passive participation.
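
    A minimal sketch of the one-way key chain primitive such schemes build on: keys are generated by repeated hashing and disclosed in reverse order, so a receiver holding the public commitment can authenticate each newly disclosed key with a single cheap hash. The chain length and the choice of SHA-256 are illustrative assumptions.

        import hashlib

        def H(x):
            return hashlib.sha256(x).digest()

        def generate_chain(seed, length):
            """K_n = seed and K_i = H(K_{i+1}); K_0 is the commitment."""
            chain = [seed]
            for _ in range(length):
                chain.append(H(chain[-1]))
            return list(reversed(chain))   # chain[0] is the commitment K_0

        def verify(disclosed_key, last_verified_key):
            """A disclosed key is authentic if hashing it reproduces the
            most recently verified key: one hash per authentication."""
            return H(disclosed_key) == last_verified_key

        chain = generate_chain(b"node-secret-seed", 100)
        commitment = chain[0]                  # distributed once, e.g. signed
        assert verify(chain[1], commitment)    # first disclosed key checks out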

  10. Comment Fail-Stop Blind Signature Scheme Design Based on Pairings

    Institute of Scientific and Technical Information of China (English)

    HU Xiaoming; HUANG Shangteng

    2006-01-01

    Fail-stop signature schemes provide security for a signer against forgeries by an enemy with unlimited computational power, by enabling the signer to provide a proof of forgery when a forgery happens. Chang et al. proposed a robust fail-stop blind signature scheme based on bilinear pairings. However, in this paper, it will be found that there are several mistakes in Chang et al.'s fail-stop blind signature scheme. Moreover, it will be pointed out that this scheme does not meet the defining property of a fail-stop signature: unconditional security for the signer. In Chang et al.'s scheme, a forger can forge a valid signature that the signer cannot prove to be forged using the "proof of forgery". The scheme also does not possess the unlinkability property of a blind signature.

  11. A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks.

    Science.gov (United States)

    Chen, Huifang; Ge, Linlin; Xie, Lei

    2015-07-14

    The lack of infrastructure support in a wireless ad hoc network (WANET) makes it susceptible to various attacks. Moreover, user authentication is the first safety barrier in a network. Mutual trust is achieved by a protocol which enables the communicating parties to authenticate each other and exchange session keys at the same time. For the resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme based on the self-certified public key system and elliptic curve cryptography for a WANET. Using the proposed scheme, efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that our proposed scheme is resilient to common known attacks. In addition, the performance analysis shows that our proposed scheme performs similarly to or better than some existing user authentication schemes.

  12. Tree-based disease classification using protein data.

    Science.gov (United States)

    Zhu, Hongtu; Yu, Chang-Yung; Zhang, Heping

    2003-09-01

    A reliable and precise classification of diseases is essential for successful diagnosis and treatment. Using mass spectrometry from clinical specimens, scientists may find protein variations among diseases and use this information to improve diagnosis. In this paper, we propose a novel procedure to classify disease status based on protein data from mass spectrometry. Our new tree-based algorithm consists of three steps: projection, selection, and classification tree. The projection step aims to project all observations from specimens onto the same bases so that the projected data have fixed coordinates. Thus, for each specimen, we obtain a large vector of 'coefficients' on the same basis. The purpose of the selection step is data reduction, condensing the large vector from the projection step into a much lower-order informative vector. Finally, using these reduced vectors, we apply recursive partitioning to construct an informative classification tree. This method has been successfully applied to protein data provided by the Department of Radiology and Chemistry at Duke University.

  13. An implicit evolution scheme for active contours and surfaces based on IIR filtering.

    Science.gov (United States)

    Delibasis, Konstantinos K; Asvestas, Pantelis A; Kechriniotis, Aristides I; Matsopoulos, George K

    2014-05-01

    In this work, we present an approach for implementing an implicit scheme for the numerical solution of the partial differential equation of the evolution of an active contour/surface. The proposed scheme is applicable to any variant of the traditional active contour (AC), irrespective of how the image-based force field is calculated, and it is readily applicable to explicitly parameterized active surfaces (AS). The proposed approach is formulated as infinite impulse response (IIR) filtering of the coordinates of the contour/surface points. The poles of the filter are determined by the parameters controlling the shape of the active contour/surface. We show that the proposed IIR-based implicit evolution scheme has very low complexity. Furthermore, the proposed scheme is numerically stable, thus it allows the convergence of the AC/AS with significantly fewer iterations than the explicit evolution scheme. It also possesses the separability property along the two parameters of the AS, thus it may be applied to deformable surfaces without the need to store and invert large sparse matrices. We implemented the proposed IIR-based implicit evolution scheme in the Vector Field Convolution (VFC) AC/AS using synthetic and clinical volumetric data. We compared the segmentation results with those of the explicit AC/AS evolution in terms of accuracy and efficiency. Results show that the VFC AC/AS with the proposed IIR-based implicit evolution scheme achieves the same segmentation results as the explicit scheme, with considerably less computation time.
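
    To illustrate the filtering view of the evolution (not the authors' exact pole placement), the sketch below applies a first-order recursive (IIR) low-pass filter to the coordinate sequences of contour points, run forward and then backward for zero phase; wrap-around at the contour ends is ignored for brevity.

        import numpy as np

        def iir_smooth(coords, a=0.5):
            """First-order IIR filter y[n] = (1-a)*x[n] + a*y[n-1]."""
            def one_pass(x):
                y = np.empty_like(x)
                y[0] = x[0]
                for n in range(1, len(x)):
                    y[n] = (1 - a) * x[n] + a * y[n - 1]
                return y
            forward = one_pass(coords)
            return one_pass(forward[::-1])[::-1]   # backward pass: zero phase

        # Noisy circle: smooth the x and y coordinate sequences independently.
        t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
        x = np.cos(t) + 0.05 * np.random.randn(t.size)
        y = np.sin(t) + 0.05 * np.random.randn(t.size)
        xs, ys = iir_smooth(x), iir_smooth(y)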

  14. Efficient hierarchical identity based encryption scheme in the standard model over lattices

    Institute of Scientific and Technical Information of China (English)

    Feng-he WANG; Chun-xiao WANG; Zhen-hua LIU

    2016-01-01

    Using lattice basis delegation in a fixed dimension, we propose an efficient lattice-based hierarchical identity based encryption (HIBE) scheme in the standard model whose public key size is only (dm^2 + mn) log q bits and whose message-ciphertext expansion factor is only log q, where d is the maximum hierarchical depth and (n, m, q) are public parameters. In our construction, a novel public key assignment rule is used to assign one random public matrix to every two identity bits, which implies that d random public matrices are enough to build the proposed HIBE scheme in the standard model, compared with the scheme proposed at Crypto 2010, which needs 2d such public matrices and whose public key size is (2dm^2 + mn + m) log q. To reduce the message-ciphertext expansion factor of the proposed scheme to log q, the encryption algorithm of this scheme is built on Gentry's encryption scheme, by which m^2 bits of plaintext are encrypted into m^2 log q bits of ciphertext by a one-time encryption operation. Hence, the presented scheme has advantages with respect to not only the public key size but also the message-ciphertext expansion factor. Based on the hardness of the learning with errors problem, we demonstrate that the scheme is secure under selective identity and chosen plaintext attacks.

  15. Dropping out of Ethiopia’s Community Based Health Insurance scheme

    NARCIS (Netherlands)

    A.D. Mebratie (Anagaw); R.A. Sparrow (Robert); Z.Y. Debebe (Zelalem); G. Alemu (Getnet ); A.S. Bedi (Arjun Singh)

    2014-01-01

    Low contract renewal rates have been identified as one of the challenges facing the development of community based health insurance schemes (CBHI). This paper uses longitudinal household survey data to examine dropout in the case of Ethiopia's pilot CBHI scheme, which saw enrolment increase.

  16. Remodulation scheme based on a two-section reflective SOA

    Science.gov (United States)

    Guiying, Jiang; Lirong, Huang

    2014-05-01

    A simple and cost-effective remodulation scheme using a two-section reflective semiconductor optical amplifier (RSOA) is proposed for a colorless optical network unit (ONU). Under proper injection currents, the front section functions as a modulator to upload the upstream signal while the rear section serves as a data eraser for efficient suppression of the downstream data. The dependence of the upstream transmission performance on the section lengths and driving currents of the RSOA, the injected optical power, and the extinction ratio of the downstream signal is investigated. By optimizing these parameters, the downstream data can be more completely suppressed and the upstream transmission performance can be greatly improved.

  17. Scheme of adaptive polarization filtering based on Kalman model

    Institute of Scientific and Technical Information of China (English)

    Song Lizhong; Qi Haiming; Qiao Xiaolin; Meng Xiande

    2006-01-01

    A new adaptive polarization filtering algorithm is presented to suppress angle-cheating interference in active guidance radar. The polarization characteristic of the interference is dynamically tracked by a Kalman estimator in time-varying environments. The polarization filter parameters are designed according to the polarization characteristic of the interference, and the polarization filtering is performed in the target cell. The system scheme of the adaptive polarization filter is studied, and the tracking performance of the polarization filter and the improvement in angle measurement precision are simulated. The research results demonstrate that this technology can effectively suppress angle-cheating interference in guidance radar and is feasible in engineering.
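
    As an illustration of the tracking step (the radar-specific polarization model is not given in the abstract), the sketch below runs a scalar Kalman filter that tracks a slowly drifting interference parameter from noisy measurements; the random-walk model and all noise variances are assumed values.

        import numpy as np

        def kalman_track(measurements, q=1e-4, r=1e-2):
            """Scalar random-walk Kalman filter: x_k = x_{k-1} + w with
            Var(w) = q, and measurement z_k = x_k + v with Var(v) = r."""
            x, p = measurements[0], 1.0
            estimates = []
            for z in measurements:
                p = p + q                  # predict
                k = p / (p + r)            # Kalman gain
                x = x + k * (z - x)        # correct with the innovation
                p = (1 - k) * p
                estimates.append(x)
            return np.array(estimates)

        true = np.cumsum(0.01 * np.random.randn(300))   # drifting parameter
        z = true + 0.1 * np.random.randn(300)           # noisy measurements
        print("RMSE:", np.sqrt(np.mean((kalman_track(z) - true) ** 2)))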

  18. Gene function classification using Bayesian models with hierarchy-based priors

    Directory of Open Access Journals (Sweden)

    Neal Radford M

    2006-10-01

    Full Text Available Background: We investigate whether annotation of gene function can be improved using a classification scheme that is aware that functional classes are organized in a hierarchy. The classifiers look at phylogenetic descriptors, sequence-based attributes, and predicted secondary structure. We discuss three Bayesian models and compare their performance in terms of predictive accuracy. These models are the ordinary multinomial logit (MNL) model, a hierarchical model based on a set of nested MNL models, and an MNL model with a prior that introduces correlations between the parameters for classes that are nearby in the hierarchy. We also provide a new scheme for combining different sources of information. We use these models to predict the functional class of Open Reading Frames (ORFs) from the E. coli genome. Results: The results from all three models show substantial improvement over previous methods, which were based on the C5 decision tree algorithm. The MNL model using a prior based on the hierarchy outperforms both the non-hierarchical MNL model and the nested MNL model. In contrast to previous attempts at combining the three sources of information in this dataset, our new approach to combining data sources produces a higher accuracy rate than applying our models to each data source alone. Conclusion: Together, these results show that gene function can be predicted with higher accuracy than previously achieved, using Bayesian models that incorporate suitable prior information.

  19. Land Cover - Minnesota Land Cover Classification System

    Data.gov (United States)

    Minnesota Department of Natural Resources — Land cover data set based on the Minnesota Land Cover Classification System (MLCCS) coding scheme. This data was produced using a combination of aerial photograph...

  20. The DTW-based representation space for seismic pattern classification

    Science.gov (United States)

    Orozco-Alzate, Mauricio; Castro-Cabrera, Paola Alexandra; Bicego, Manuele; Londoño-Bonilla, John Makario

    2015-12-01

    Distinguishing among the different seismic volcanic patterns is still one of the most important and labor-intensive tasks for volcano monitoring. This task could be lightened and made free from subjective bias by using automatic classification techniques. In this context, a core but often overlooked issue is the choice of an appropriate representation of the data to be classified. Recently, it has been suggested that using a relative representation (i.e. proximities, namely dissimilarities on pairs of objects) instead of an absolute one (i.e. features, namely measurements on single objects) is advantageous to exploit the relational information contained in the dissimilarities to derive highly discriminant vector spaces, where any classifier can be used. According to that motivation, this paper investigates the suitability of a dynamic time warping (DTW) dissimilarity-based vector representation for the classification of seismic patterns. Results show the usefulness of such a representation in the seismic pattern classification scenario, including analyses of potential benefits from recent advances in the dissimilarity-based paradigm such as the proper selection of representation sets and the combination of different dissimilarity representations that might be available for the same data.

  1. Data Classification Based on Confidentiality in Virtual Cloud Environment

    Directory of Open Access Journals (Sweden)

    Munwar Ali Zardari

    2014-10-01

    Full Text Available The aim of this study is to provide suitable security to data based on the security needs of the data. It is very difficult to decide (in the cloud) which data need what security and which data do not need security. However, it becomes easy to decide the security level for data after classifying the data according to their security needs, based on the characteristics of the data. In this study, we have proposed a data classification cloud model to solve the data confidentiality issue in the cloud computing environment. The data are classified into two major classes: sensitive and non-sensitive. The K-Nearest Neighbour (K-NN) classifier is used for data classification, and the Rivest, Shamir and Adleman (RSA) algorithm is used to encrypt sensitive data. After implementing the proposed model, it is found that the confidentiality level of the data is increased, and this model proves to be more cost and memory friendly for the users as well as for the cloud service providers. The data storage service is one of the cloud services, where the data servers of all users are virtualized. In a cloud server, the data are stored in two ways: first, encrypt the received data and store them on the cloud servers; second, store the data on the cloud servers without encryption. Both of these storage methods can face data confidentiality issues, because the data have different values and characteristics that must be identified before being sent to the cloud servers.
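
    A toy sketch of the routing logic follows: a 1-nearest-neighbour rule labels each record sensitive or non-sensitive, and only sensitive records are RSA-encrypted before storage. The two-dimensional feature vectors are invented, and the tiny textbook-RSA primes are for illustration only, not a secure parameter choice.

        # Labeled training records over invented 2-D features.
        train = [((0.9, 0.8), "sensitive"), ((0.1, 0.2), "non-sensitive"),
                 ((0.8, 0.9), "sensitive"), ((0.2, 0.1), "non-sensitive")]

        def knn_label(x):
            """1-NN: return the label of the closest training record."""
            return min(train, key=lambda t: sum((a - b) ** 2
                       for a, b in zip(t[0], x)))[1]

        # Textbook RSA with tiny primes -- illustration only, not secure.
        p, q = 61, 53
        n, phi = p * q, (p - 1) * (q - 1)
        e = 17
        d = pow(e, -1, phi)                      # private exponent

        def store(features, payload_byte):
            """Encrypt only records classified as sensitive."""
            if knn_label(features) == "sensitive":
                return ("enc", pow(payload_byte, e, n))
            return ("plain", payload_byte)

        tag, value = store((0.85, 0.75), 42)
        assert tag == "enc" and pow(value, d, n) == 42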

  2. Changing Histopathological Diagnostics by Genome-Based Tumor Classification

    Directory of Open Access Journals (Sweden)

    Michael Kloth

    2014-05-01

    Full Text Available Traditionally, tumors are classified by histopathological criteria, i.e., based on their specific morphological appearance. Consequently, current therapeutic decisions in oncology are strongly influenced by histology rather than by underlying molecular or genomic aberrations. However, the increase of information on molecular changes, enabled by the Human Genome Project and the International Cancer Genome Consortium as well as the manifold advances in molecular biology and high-throughput sequencing techniques, inaugurated the integration of genomic information into disease classification. Furthermore, in some cases it became evident that former classifications needed major revision and adaptation. Such adaptations are often required when the pathogenesis of a disease is understood from a specific molecular alteration, and this molecular driver is used for targeted and highly effective therapies. Altogether, reclassifications should lead to a higher information content of the underlying diagnoses, reflecting their molecular pathogenesis and resulting in optimized and individual therapeutic decisions. The objective of this article is to summarize some particularly important examples of genome-based classification approaches and the associated therapeutic concepts. In addition to reviewing disease-specific markers, we focus on potentially therapeutic or predictive markers and the relevance of molecular diagnostics in disease monitoring.

  3. An ellipse detection algorithm based on edge classification

    Science.gov (United States)

    Yu, Liu; Chen, Feng; Huang, Jianming; Wei, Xiangquan

    2015-12-01

    In order to enhance the speed and accuracy of ellipse detection, an ellipse detection algorithm based on edge classification is proposed. Redundant edge points are removed by serializing the edges into point sequences and applying a distance constraint between edge points. Effective classification is achieved using the angle between edge points as the criterion, which greatly increases the probability that randomly selected edge points fall on the same ellipse. Ellipse fitting accuracy is significantly improved by optimizing the RED algorithm, using the Euclidean distance from an edge point to the ellipse boundary. Experimental results show that the method detects ellipses well even when edges suffer interference or block each other, and that it has higher detection precision and lower time consumption than the RED algorithm.

  4. Entropy coders for image compression based on binary forward classification

    Science.gov (United States)

    Yoo, Hoon; Jeong, Jechang

    2000-12-01

    Entropy coders, as a noiseless compression method, are widely used as the final compression step for images, and there have been many contributions toward increasing entropy coder performance and reducing entropy coder complexity. In this paper, we propose entropy coders based on binary forward classification (BFC). The BFC requires a classification overhead, but the total amount of classified output information remains equal to the amount of input information, a property we prove in this paper. Using this proved property, we propose entropy coders in which the BFC is followed by Golomb-Rice coders (BFC+GR) and in which the BFC is followed by arithmetic coders (BFC+A). The proposed entropy coders introduce negligible additional complexity due to the BFC. Simulation results also show better performance than other entropy coders of similar complexity.

  5. A novel classification method based on membership function

    Science.gov (United States)

    Peng, Yaxin; Shen, Chaomin; Wang, Lijia; Zhang, Guixu

    2011-03-01

    We propose a method for medical image classification using membership functions. Our aim is to classify the image into several classes based on prior knowledge. For every point, we calculate its membership function, i.e., the probability that the point belongs to each class. The point is finally labeled as the class with the highest value of the membership function. The classification is reduced to a minimization problem of a functional whose arguments are the membership functions. There are three novelties in our paper. First, bias correction and the Rudin-Osher-Fatemi (ROF) model are applied to the input image to enhance the image quality. Second, an unconstrained functional is used: we use variable substitution to avoid the constraints that membership functions should be positive and sum to one. Third, several techniques are used to speed up the computation. The experimental result on the ventricle shows the validity of this approach.

  6. SPEECH/MUSIC CLASSIFICATION USING WAVELET BASED FEATURE EXTRACTION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Thiruvengatanadhan Ramalingam

    2014-01-01

    Full Text Available Audio classification is a fundamental step in coping with the rapid growth of audio data volume. Due to the increasing size of multimedia sources, speech/music classification is one of the most important issues for multimedia information retrieval. In this work, a speech/music discrimination system is developed which utilizes the Discrete Wavelet Transform (DWT) as the acoustic feature. Multi-resolution analysis is among the most significant statistical ways to extract features from an input signal, and in this study a method is deployed to model the extracted wavelet features. Support Vector Machines (SVM), based on the principle of structural risk minimization, are applied to classify audio into the speech and music classes by learning from training data. The proposed method then extends the application of Gaussian Mixture Models (GMM) to estimate the probability density function using maximum-likelihood decision methods. The system shows significant results with an accuracy of 94.5%.
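
    A minimal sketch of the feature-extraction half of such a system: a one-level Haar DWT written directly in NumPy, applied recursively so that subband energies form the feature vector a downstream SVM or GMM would consume. The frame length, number of levels, and energy features are illustrative choices, not the paper's exact configuration.

        import numpy as np

        def haar_dwt(x):
            """One-level Haar DWT: approximation and detail coefficients."""
            x = x[:len(x) // 2 * 2]
            approx = (x[0::2] + x[1::2]) / np.sqrt(2)
            detail = (x[0::2] - x[1::2]) / np.sqrt(2)
            return approx, detail

        def wavelet_features(frame, levels=3):
            """Multi-resolution analysis: energy per subband."""
            feats, current = [], frame
            for _ in range(levels):
                current, detail = haar_dwt(current)
                feats.append(np.mean(detail ** 2))
            feats.append(np.mean(current ** 2))
            return np.array(feats)

        frame = np.sin(2 * np.pi * 440 * np.arange(1024) / 16000)  # 440 Hz tone
        print(wavelet_features(frame))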

  7. A Fuzzy Similarity Based Concept Mining Model for Text Classification

    CERN Document Server

    Puri, Shalini

    2012-01-01

    Text classification is a challenging and very active field with great importance in text categorization applications. A lot of research work has been done in this field, but there is still a need to categorize a collection of text documents into mutually exclusive categories by extracting the concepts or features using a supervised learning paradigm and different classification algorithms. In this paper, a new Fuzzy Similarity Based Concept Mining Model (FSCMM) is proposed to classify a set of text documents into pre-defined Category Groups (CG) by training and preparing them at the sentence, document and integrated corpora levels, along with feature reduction and ambiguity removal at each level to achieve high system performance. A Fuzzy Feature Category Similarity Analyzer (FFCSA) is used to analyze each extracted feature of the Integrated Corpora Feature Vector (ICFV) with the corresponding categories or classes. This model uses a Support Vector Machine Classifier (SVMC) to classify correct...

  8. Pairing-Free ID-Based Key-Insulated Signature Scheme

    Institute of Scientific and Technical Information of China (English)

    Guo-Bin Zhu; Hu Xiong; Zhi-Guang Qin

    2015-01-01

    Abstract: Without the assumption that private keys are kept perfectly secure, cryptographic primitives cannot be deployed in insecure environments where key leakage is inevitable. In order to efficiently reduce the damage caused by key exposure in identity-based (ID-based) signature scenarios, we propose an ID-based key-insulated signature scheme in this paper, which eliminates the expensive bilinear pairing operations. Compared with previous work, our scheme minimizes the computation cost without any extra cost. Under the discrete logarithm (DL) assumption, a security proof of our scheme in the random oracle model is also given.

  9. An Effective Capacity Estimation Scheme in IEEE802.11-based Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    H. Zafar

    2012-11-01

    Full Text Available Capacity estimation is a key component of any admission control scheme required to support quality of service provision in mobile ad hoc networks. A range of schemes have previously been proposed to estimate residual capacity derived from window-based measurements of channel estimation. In this paper, a simple and improved mechanism to estimate residual capacity in IEEE802.11-based ad hoc networks is presented. The scheme proposes the use of a 'forgiveness' factor to weight these previous measurements and is shown through simulation-based evaluation to provide accurate utilization estimation and improved residual-capacity-based admission control.

  10. Efficient enhancing scheme for TCP performance over satellite-based internet

    Institute of Scientific and Technical Information of China (English)

    Wang Lina; Gu Xuemai

    2007-01-01

    Satellite link characteristics drastically degrade Transmission Control Protocol (TCP) performance. An efficient performance enhancing scheme is proposed. The improvement of TCP performance over satellite-based Internet is accomplished by protocol transition gateways at each end of a satellite link. The protocol which runs over the satellite link executes receiver-driven flow control and acknowledgement- and timeout-based error control strategies. The validity of this TCP performance enhancing scheme is verified by a series of simulation experiments. Results show that the proposed scheme can efficiently enhance TCP performance over satellite-based Internet and ensure that the available bandwidth resources of the satellite link are fully utilized.

  11. PERFORMANCE COMPARISON OF CELL-BASED AND PACKET-BASED SWITCHING SCHEMES FOR SHARED MEMORY SWITCHES

    Institute of Scientific and Technical Information of China (English)

    Xi Kang; Ge Ning; Feng Chongxi

    2004-01-01

    Shared Memory (SM) switches are widely used for their high throughput, low delay and efficient use of memory. This paper compares the performance of two prominent switching schemes for SM packet switches: Cell-Based Switching (CBS) and Packet-Based Switching (PBS). Theoretical analysis is carried out to draw qualitative conclusions on the memory requirement, throughput and packet delay of the two schemes. Furthermore, simulations are carried out to get quantitative results of the performance comparison under various system loads, traffic patterns, and memory sizes. Simulation results show that PBS has the advantage of shorter time delay, while CBS has a lower memory requirement and outperforms in throughput when the memory size is limited. The comparison can be used for tradeoffs between performance and complexity in switch design.

  12. Implementation of an energy-efficient scheduling scheme based on pipeline flux leak monitoring networks

    Institute of Scientific and Technical Information of China (English)

    ZHOU Peng; YAO JiangHe; PEI JiuLing

    2009-01-01

    Taking pipeline leakage flow and sudden pipe bursts in pipe networks as the application objects, an energy-efficient real-time scheduling scheme is designed for pipeline leak monitoring, where such schemes are extensively used. The proposed scheme can adaptively adjust the network rate in real time and reduce the cell loss rate, so that it can efficiently avoid traffic congestion. The recent evolution of wireless sensor networks has yielded a demand for improved energy-efficient scheduling algorithms and energy-efficient medium access protocols. This paper proposes an energy-efficient real-time scheduling scheme that reduces power consumption and network errors in pipeline flux leak monitoring networks. The proposed scheme is based on a dynamic modulation scaling scheme, which can scale the number of bits per symbol, and a switching scheme, which can swap the polling schedule between channels. Built on top of the EDF scheduling policy, the proposed scheme enhances the power performance without violating the constraints of real-time streams. The simulation results show that the proposed scheme enhances fault tolerance and reduces power consumption. Furthermore, the network congestion avoidance strategy combined with an energy-efficient real-time scheduling scheme can efficiently improve the bandwidth utilization and TCP friendliness and reduce the packet drop rate in pipeline flux leak monitoring networks.

  13. Security enhancement of a biometric based authentication scheme for telecare medicine information systems with nonce.

    Science.gov (United States)

    Mishra, Dheerendra; Mukhopadhyay, Sourav; Kumari, Saru; Khan, Muhammad Khurram; Chaturvedi, Ankita

    2014-05-01

    Telecare medicine information systems (TMIS) provide a platform to deliver clinical services door to door. The technological advances in mobile computing are enhancing the quality of healthcare, and a user can access these services using a mobile device. However, the user and the Telecare system communicate via public channels in these online services, which increases the security risk. Therefore, it is required to ensure that only an authorized user is accessing the system and that the user is interacting with the correct system. Mutual authentication provides a way to achieve this. Although existing schemes are either vulnerable to attacks or have higher computational cost, a scalable authentication scheme for mobile devices should be secure and efficient. Recently, Awasthi and Srivastava presented a biometric based authentication scheme for TMIS with nonce. Their scheme only requires the computation of hash and XOR functions, and thus it fits TMIS. However, we observe that Awasthi and Srivastava's scheme does not achieve an efficient password change phase. Moreover, their scheme does not resist off-line password guessing attacks. Further, we propose an improvement of Awasthi and Srivastava's scheme with the aim of removing the drawbacks of their scheme.

  14. Cryptanalysis and Performance Evaluation of Enhanced Threshold Proxy Signature Scheme Based on RSA for Known Signers

    Directory of Open Access Journals (Sweden)

    Raman Kumar

    2013-01-01

    Full Text Available These days there are plenty of signature schemes, such as the threshold proxy signature scheme (Kumar and Verma, 2010). The network is a shared medium, so it is exposed to security attacks such as eavesdropping, replay attacks, and modification attacks. Thus, we have to establish a common key for encrypting/decrypting our communications over an insecure network. In this scheme, a threshold proxy signature scheme based on RSA, any t or more proxy signers can cooperatively generate a proxy signature while t-1 or fewer of them cannot do it. The threshold proxy signature scheme uses the RSA cryptosystem to generate the private and public keys of the signers (Rivest et al., 1978). Comparison is done on the basis of time complexity, space complexity, and communication overhead. We compare the performance of four schemes (Hwang et al. (2003), Kuo and Chen (2005), Yong-Jun et al. (2007), and Li et al. (2007)) with the performance of a scheme that has been proposed earlier by the authors of this paper. In the proposed scheme, both the combiner and the secret share holder can verify the correctness of the information that they are receiving from each other. Therefore, the enhanced threshold proxy signature scheme is secure and efficient against notorious conspiracy attacks.
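
    The t-of-n threshold property itself is classically obtained with Shamir secret sharing over a prime field; the sketch below shows that generic construction (any t shares reconstruct the secret, t-1 reveal nothing), not the RSA-specific protocol evaluated in the paper.

        import random

        P = 2**127 - 1   # a Mersenne prime used as the field modulus

        def share(secret, t, n):
            """Evaluate a random degree-(t-1) polynomial at x = 1..n."""
            coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
            f = lambda x: sum(c * pow(x, i, P)
                              for i, c in enumerate(coeffs)) % P
            return [(x, f(x)) for x in range(1, n + 1)]

        def reconstruct(shares):
            """Lagrange interpolation at x = 0 over the prime field."""
            secret = 0
            for i, (xi, yi) in enumerate(shares):
                num = den = 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * (-xj) % P
                        den = den * (xi - xj) % P
                secret = (secret + yi * num * pow(den, -1, P)) % P
            return secret

        shares = share(123456789, t=3, n=5)
        assert reconstruct(shares[:3]) == 123456789   # any 3 of 5 suffice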

  15. A Fair Off-Line Electronic Cash Scheme Based on Restrictive Partially Blind Signature

    Institute of Scientific and Technical Information of China (English)

    王常吉; 吴建平; 段海新

    2004-01-01

    A fair off-line electronic cash scheme based on a provably secure restrictive partially blind signature is presented. The scheme is more efficient than those of previous works, as the expiry date and denomination information are embedded in the electronic cash, which alleviates the storage pressure on the bank when checking double spending; the bank need not use different public keys for different coin values, and shops and users need not carry a list of the bank's public keys in their electronic wallets. The modular exponentiations are reduced for both the user and the bank by letting the trustee publish public values with a different structure from those of previous electronic cash schemes. The scheme's security is based on the random oracle model and the decisional Diffie-Hellman assumption. The scheme can easily be extended to multiple trustees and multiple banks using threshold cryptography.

  16. A kind of signature scheme based on class groups of quadratic fields

    Institute of Scientific and Technical Information of China (English)

    董晓蕾; 曹珍富

    2004-01-01

    A quadratic-field cryptosystem is a cryptosystem built from the discrete logarithm problem in ideal class groups of quadratic fields (CL-DLP). The problem of constructing a digital signature scheme based on ideal class groups of quadratic fields remained open because of the difficulty of computing class numbers of quadratic fields. In this paper, based on our research on quadratic fields, we construct the first digital signature scheme in ideal class groups of quadratic fields, using as the modulus q, which denotes a prime divisor of the ideal class number of the quadratic field. The security of the new signature scheme is based fully on CL-DLP. This paper also investigates the realization of the scheme and proposes a concrete technique. In addition, the technique introduced in the paper can be utilized to realize signature schemes of other kinds.

  17. Cost-based droop scheme with lower generation costs for microgrids

    DEFF Research Database (Denmark)

    Nutkani, I. U.; Loh, Poh Chiang; Blaabjerg, Frede

    2013-01-01

    In an autonomous microgrid where centralized management and communication links are not viable, droop control has been the preferred scheme for power sharing among distributed generators (DGs). At present, although many droop variations have been proposed to achieve proportional power sharing based on the DG kVA ratings, other operating characteristics like generation costs, efficiencies and emission penalties at different loadings have not been considered. This makes existing droop schemes not too well-suited for standalone microgrids without a central management system, where different types of DGs usually exist. As an alternative, this paper proposes a cost-based droop scheme, whose objective is to reduce the generation cost realized with various DG operating characteristics taken into consideration. The proposed droop scheme therefore retains all advantages of the traditional droop schemes, while...

  18. Local fractal dimension based approaches for colonic polyp classification.

    Science.gov (United States)

    Häfner, Michael; Tamaki, Toru; Tanaka, Shinji; Uhl, Andreas; Wimmer, Georg; Yoshida, Shigeto

    2015-12-01

    This work introduces texture analysis methods that are based on computing the local fractal dimension (LFD, also called the local density function) and applies them to colonic polyp classification. The methods are tested on 8 HD-endoscopic image databases, where each database is acquired using a different imaging modality (Pentax's i-Scan technology combined with or without staining the mucosa), and on a zoom-endoscopic image database using narrow band imaging. In this paper, we present three novel extensions to an LFD based approach. These extensions additionally extract shape and/or gradient information from the image to enhance the discriminativity of the original approach. To compare the results of the LFD based approaches with the results of other approaches, five state-of-the-art approaches for colonic polyp classification are applied to the employed databases. Experiments show that LFD based approaches are well suited for colonic polyp classification, especially the three proposed extensions, which are the best performing methods or at least among the best performing methods for each of the employed databases. The methods are additionally tested by means of a public texture image database, the UIUCtex database, with which the viewpoint invariance of the methods is assessed, an important feature for the employed endoscopic image databases. Results imply that most of the LFD based methods are more viewpoint invariant than the other methods. However, the shape, size and orientation adapted LFD approaches (which are especially designed to enhance viewpoint invariance) are in general not more viewpoint invariant than the other LFD based approaches.
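
    The fractal dimension is estimated from how box counts scale with box size; as a compact stand-in for the local (per-pixel) variants compared above, the sketch below computes the classical global box-counting dimension of a binary image.

        import numpy as np

        def box_counting_dimension(img, sizes=(1, 2, 4, 8, 16)):
            """Fit log N(s) ~ -D log s, where N(s) counts the s-by-s boxes
            containing at least one foreground pixel."""
            counts = []
            for s in sizes:
                h, w = img.shape[0] // s * s, img.shape[1] // s * s
                blocks = img[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(int((blocks.sum(axis=(1, 3)) > 0).sum()))
            slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
            return -slope

        # Sanity check: a filled disc is 2-dimensional.
        yy, xx = np.mgrid[:128, :128]
        disc = ((xx - 64) ** 2 + (yy - 64) ** 2 < 50 ** 2).astype(np.uint8)
        print(box_counting_dimension(disc))   # close to 2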

  19. Rule based fuzzy logic approach for classification of fibromyalgia syndrome.

    Science.gov (United States)

    Arslan, Evren; Yildiz, Sedat; Albayrak, Yalcin; Koklukaya, Etem

    2016-06-01

    Fibromyalgia syndrome (FMS) is a chronic muscle and skeletal system disease observed mostly in women, manifesting itself as widespread pain and impairing the individual's quality of life. FMS diagnosis is made based on the American College of Rheumatology (ACR) criteria. However, the employability and sufficiency of the ACR criteria have recently been under debate. In this context, several evaluation methods, including clinical evaluation methods, were proposed by researchers. Accordingly, the ACR had to update its criteria, announced back in 1990, in 2010 and 2011. The proposed rule based fuzzy logic method aims to evaluate FMS from a different angle as well. This method contains a rule base derived from the 1990 ACR criteria and the individual experiences of specialists. The study was conducted using data collected from 60 inpatients and 30 healthy volunteers. Several tests and physical examinations were administered to the participants. The fuzzy logic rule base was structured using the parameters of tender point count, chronic widespread pain period, pain severity, fatigue severity and sleep disturbance level, which were deemed important in FMS diagnosis. It has been observed that the fuzzy predictor was generally 95.56% consistent with at least one of the specialists, who were not creators of the fuzzy rule base. Thus, in the diagnosis classification, where the severity of FMS was classified as well, consistent findings were obtained from the comparison of the interpretations and experiences of the specialists and the fuzzy logic approach. The study proposes a rule base which could eliminate the shortcomings of the 1990 ACR criteria during the FMS evaluation process. Furthermore, the proposed method presents a classification of the severity of the disease, which was not available with the ACR criteria. The study was not limited to disease classification only; at the same time, the probability of occurrence and the severity were classified. In addition, those who were not suffering from FMS were

  20. Dictionary-Based, Clustered Sparse Representation for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Zhen-tao Qin

    2015-01-01

    Full Text Available This paper presents a new, dictionary-based method for hyperspectral image classification, which incorporates both spectral and contextual characteristics of a sample, clustered to obtain a dictionary for each pixel. The resulting pixels display a common sparsity pattern within identical clustered groups. We calculated the image's sparse coefficients using the dictionary approach, which generated the sparse representation features of the remote sensing images. The sparse coefficients are then used to classify the hyperspectral images via a linear SVM. Experiments show that our proposed method of dictionary-based, clustered sparse coefficients can create better representations of hyperspectral images, with greater overall accuracy and a higher Kappa coefficient.

  1. Typology of Digital News Media: Theoretical Bases for their Classification

    Directory of Open Access Journals (Sweden)

    Ramón SALAVERRÍA

    2017-01-01

    Full Text Available Since their beginnings in the 1990s, digital news media have undergone a process of settlement and diversification. As a result, the prolific classification of online media has become increasingly rich and complex. Based on a review of media typologies, this article proposes some theoretical bases for distinguishing online media from previous media and, above all, for differentiating the various types of online media among themselves. With that purpose, nine typological criteria are proposed: 1) platform, 2) temporality, 3) topic, 4) reach, 5) ownership, 6) authorship, 7) focus, 8) economic purpose, and 9) dynamism.

  2. Network Traffic Anomalies Identification Based on Classification Methods

    Directory of Open Access Journals (Sweden)

    Donatas Račys

    2015-07-01

    Full Text Available The problem of network traffic anomaly detection in computer networks is analyzed. An overview of anomaly detection methods is given, and the advantages and disadvantages of the different methods are analyzed. A model for traffic anomaly detection was developed based on IBM SPSS Modeler and is used to analyze SNMP data of a router. Investigation of the traffic anomalies was done using three classification methods and different sets of learning data. Based on the results of the investigation, it was determined that the C5.1 decision tree method has the highest accuracy and performance and can be successfully used for the identification of network traffic anomalies.

  3. Comparative Study between Two Schemes of Active-Control-Based Mechatronic Inerter

    Directory of Open Access Journals (Sweden)

    He Lingduo

    2017-01-01

    Full Text Available Based on the force-current analogy and the velocity-voltage analogy in the theory of electromechanical analogy, the inerter is a device that corresponds exactly to the capacitor and overcomes the natural restriction of mass; moreover, it is significant to improve the ratio of the inerter's inertance to its mass for mechanical network synthesis. According to the principle of the active-control-based mechatronic inerter, we present two implementation schemes: one based on a linear motor, and the other based on a ball screw and a rotary motor. We introduce the implementation methods and establish theoretical models of the two schemes, then compare the ratio of the inerter's inertance to its mass for the two schemes. Finally, we conclude that the scheme based on the ball screw and rotary motor is the better one.

  4. The XMM large scale structure survey: optical vs. X-ray classifications of active galactic nuclei and the unified scheme

    CERN Document Server

    Garcet, O; Gosset, E; Sprimont, P G; Surdej, J; Borkowski, V; Tajer, M; Pacaud, F; Pierre, M; Chiappetti, L; MacCagni, D; Page, M J; Carrera, F J; Tedds, J A; Mateos, S; Krumpe, M; Contini, T; Corral, A; Ebrero, J; Gavignaud, I; Schwope, A; Le Fèvre, O; Polletta, M; Rosen, S; Lonsdale, C; Watson, M; Borczyk, W; Väisänen, P

    2007-01-01

    Our goal is to characterize AGN populations by comparing their X-ray and optical classifications. We present a sample of 99 spectroscopically identified X-ray point sources in the XMM-LSS survey which are significantly detected in the [2-10] keV band and have more than 80 counts. We performed an X-ray spectral analysis for all 99 X-ray sources. Introducing the fourfold point correlation coefficient, we find only a mild correlation between the X-ray and optical classifications, as up to 30% of the sources have differing X-ray and optical classifications. On one hand, 10% of the type 1 sources present broad emission lines in their optical spectra together with strong absorption in the X-rays. These objects are highly luminous AGN lying at high redshift, so dilution effects are totally ruled out; their discrepant nature is an intrinsic property. Their X-ray luminosity and redshift distributions are consistent with those of the unabsorbed X-ray sources with broad emission lines. On the other hand, ...
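
    The fourfold point correlation coefficient for a 2x2 table of optical type versus X-ray type can be computed directly; the counts below are illustrative, not the paper's:

```python
import math

# 2x2 contingency table (assumed counts):
a, b = 60, 10   # optical type 1: X-ray unabsorbed / absorbed
c, d = 19, 10   # optical type 2: X-ray unabsorbed / absorbed

phi = (a * d - b * c) / math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
print(f"phi = {phi:.2f}")   # 0 means no correlation, 1 perfect agreement
```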

  5. Object trajectory-based activity classification and recognition using hidden Markov models.

    Science.gov (United States)

    Bashir, Faisal I; Khokhar, Ashfaq A; Schonfeld, Dan

    2007-07-01

    Motion trajectories provide rich spatiotemporal information about an object's activity. This paper presents novel classification algorithms for recognizing object activity using object motion trajectory. In the proposed classification system, trajectories are segmented at points of change in curvature, and the subtrajectories are represented by their principal component analysis (PCA) coefficients. We first present a framework to robustly estimate the multivariate probability density function based on PCA coefficients of the subtrajectories using Gaussian mixture models (GMMs). We show that GMM-based modeling alone cannot capture the temporal relations and ordering between underlying entities. To address this issue, we use hidden Markov models (HMMs) with a data-driven design in terms of number of states and topology (e.g., left-right versus ergodic). Experiments using a database of over 5700 complex trajectories (obtained from UCI-KDD data archives and Columbia University Multimedia Group) subdivided into 85 different classes demonstrate the superiority of our proposed HMM-based scheme using PCA coefficients of subtrajectories in comparison with other techniques in the literature.
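
    A condensed sketch of the classify-by-likelihood step: one Gaussian HMM per activity class over PCA coefficients of subtrajectories. The curvature-based segmentation is omitted, and the data shapes, class count, and use of the hmmlearn package are assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from hmmlearn.hmm import GaussianHMM   # third-party package, assumed available

rng = np.random.default_rng(1)
# stand-in per-class subtrajectory feature sequences (40 segments x 10 features)
train = {c: rng.normal(loc=c, size=(40, 10)) for c in range(3)}

pca = PCA(n_components=4).fit(np.vstack(list(train.values())))  # shared basis
models = {c: GaussianHMM(n_components=3, covariance_type="diag")
               .fit(pca.transform(f)) for c, f in train.items()}

query = rng.normal(loc=1, size=(8, 10))          # one unseen trajectory
pred = max(models, key=lambda c: models[c].score(pca.transform(query)))
print("predicted class:", pred)                  # class with highest likelihood
```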

  6. Immunophenotype Discovery, Hierarchical Organization, and Template-based Classification of Flow Cytometry Samples

    Directory of Open Access Journals (Sweden)

    Ariful Azad

    2016-08-01

    Full Text Available We describe algorithms for discovering immunophenotypes from large collections of flow cytometry (FC) samples, and for using them to organize the samples into a hierarchy based on phenotypic similarity. The hierarchical organization is helpful for effective and robust cytometry data mining, including the creation of collections of cell populations characteristic of different classes of samples, robust classification, and anomaly detection. We summarize a set of samples belonging to a biological class or category with a statistically derived template for the class. Whereas individual samples are represented in terms of their cell populations (clusters), a template consists of generic meta-populations (groups of homogeneous cell populations obtained from the samples in a class) that describe key phenotypes shared among all those samples. We organize an FC data collection in a hierarchical data structure that supports the identification of immunophenotypes relevant to clinical diagnosis. A robust template-based classification scheme is also developed, but our primary focus is the discovery of phenotypic signatures and inter-sample relationships in an FC data collection. This collective analysis approach is more efficient and robust, since templates describe phenotypic signatures common to cell populations in several samples while ignoring noise and small sample-specific variations. We have applied the template-based scheme to analyze several data sets, including one representing a healthy immune system and one of Acute Myeloid Leukemia (AML) samples. The last task is challenging due to the phenotypic heterogeneity of the several subtypes of AML. However, we identified thirteen immunophenotypes corresponding to subtypes of AML, and were able to distinguish Acute Promyelocytic Leukemia from other subtypes of AML.
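
    A minimal sketch of the template idea under stated assumptions: each sample is summarized by its cluster centers, and centers across samples of one class are merged into meta-populations whose means form the class template. The clustering choices and 2-D marker data are stand-ins, not the paper's algorithms:

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering

rng = np.random.default_rng(5)
# three samples of one class, each with two cell populations in 2-D marker space
samples = [np.vstack([rng.normal(0.0, 0.3, (200, 2)),
                      rng.normal(3.0, 0.3, (150, 2))]) for _ in range(3)]

centers = np.vstack([KMeans(n_clusters=2, n_init=10).fit(s).cluster_centers_
                     for s in samples])            # per-sample cell populations
meta = AgglomerativeClustering(n_clusters=2).fit_predict(centers)
template = np.array([centers[meta == m].mean(axis=0) for m in range(2)])
print("template meta-population means:\n", template.round(2))
```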

  7. Dense Iterative Contextual Pixel Classification using Kriging

    DEFF Research Database (Denmark)

    Ganz, Melanie; Loog, Marco; Brandt, Sami

    2009-01-01

    In medical applications, segmentation has become an ever more important task. One of the competitive schemes to perform such segmentation is by means of pixel classification. Simple pixel-based classification schemes can be improved by incorporating contextual label information. Various methods h...... relatively long range interactions may play a role. We propose a new method based on Kriging that makes it possible to include such long range interactions, while keeping the computations manageable when dealing with large medical images....
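
    A minimal sketch of the kriging machinery such a method builds on, interpolating soft label probabilities at a query pixel from labeled neighbors under a Gaussian covariance; the covariance model, range parameter, and points are assumed toys, not the paper's formulation:

```python
import numpy as np

def cov(a, b, rho=2.0):                        # Gaussian covariance, range rho
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return np.exp(-(d / rho) ** 2)

obs_xy = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [5.0, 5.0]])
obs_label = np.array([1.0, 0.0, 0.0, 1.0])     # soft foreground probabilities
query = np.array([[2.0, 2.0]])

K = cov(obs_xy, obs_xy) + 1e-6 * np.eye(len(obs_xy))   # jitter for stability
weights = np.linalg.solve(K, cov(obs_xy, query))       # kriging weights
print("predicted label prob:", (obs_label @ weights).item())
```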

  8. AN AGENT BASED TRANSACTION PROCESSING SCHEME FOR DISCONNECTED MOBILE NODES

    Directory of Open Access Journals (Sweden)

    J.L. Walter Jeyakumar

    2010-12-01

    Full Text Available We present a mobile transaction framework in which mobile users can share data stored in the cache of a mobile agent. The mobile agent is a special mobile node that coordinates the sharing process. The proposed framework allows mobile affiliation work groups to be formed dynamically from a mobile agent and mobile hosts. Using short-range wireless communication technology, mobile users can simultaneously access data from the cache of the mobile agent. The Data Access Manager module at the mobile agent enforces concurrency control using a cache invalidation technique. The model supports disconnected mobile computing by allowing the mobile agent to move along with the mobile hosts. The proposed transaction framework has been simulated in Java 2, and the performance of this scheme is compared with that of existing frameworks.
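
    A toy sketch of the version-based cache-invalidation idea at the Data Access Manager; the class and method names are illustrative, and this optimistic-commit logic is an assumed simplification, not the framework's actual protocol:

```python
class DataAccessManager:
    """Toy agent-side cache: each item carries a version counter."""

    def __init__(self):
        self.cache, self.version = {}, {}

    def read(self, key):
        return self.cache.get(key), self.version.get(key, 0)

    def commit(self, key, value, read_version):
        # A commit succeeds only against the version that was read;
        # bumping the version invalidates every other host's cached copy.
        if self.version.get(key, 0) != read_version:
            return False                       # stale copy -> transaction aborts
        self.cache[key] = value
        self.version[key] = read_version + 1
        return True

dam = DataAccessManager()
_, v = dam.read("seat42")
print(dam.commit("seat42", "booked", v))       # True: first writer wins
print(dam.commit("seat42", "booked", v))       # False: stale version rejected
```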

  9. Sparse Parallel MRI Based on Accelerated Operator Splitting Schemes

    Science.gov (United States)

    Xie, Weisi; Su, Zhenghang

    2016-01-01

    Recently, the sparsity that is implicit in MR images has been successfully exploited for fast MR imaging with incomplete acquisitions. In this paper, two novel algorithms are proposed to solve the sparse parallel MR imaging problem, which consists of l1 regularization and fidelity terms. The two algorithms combine forward-backward operator splitting with Barzilai-Borwein schemes. Theoretically, the presented algorithms overcome the nondifferentiability of the l1 regularization term. Meanwhile, they are able to treat a general matrix operator that may not be diagonalized by the fast Fourier transform, and to ensure that a well-conditioned optimization system of equations is simply solved. In addition, we build connections between the proposed algorithms and state-of-the-art existing methods, and prove their convergence with a constant stepsize in the Appendix. Numerical results and comparisons with advanced methods demonstrate the efficiency of the proposed algorithms. PMID:27746824
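
    A minimal sketch of forward-backward splitting with a Barzilai-Borwein stepsize on the generic problem min_x 0.5*||Ax - b||^2 + lam*||x||_1; the toy random matrix below stands in for the paper's parallel-MRI operator:

```python
import numpy as np

def soft(x, t):                                  # proximal map of t*||.||_1
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

rng = np.random.default_rng(0)
A = rng.normal(size=(60, 100))                   # stand-in sensing operator
x_true = np.zeros(100)
x_true[:5] = 1.0                                 # 5-sparse ground truth
b = A @ x_true
lam = 0.1

x = np.zeros(100)
g = A.T @ (A @ x - b)
t = 1e-3
for _ in range(200):
    x_new = soft(x - t * g, t * lam)             # forward (gradient) + backward (prox)
    g_new = A.T @ (A @ x_new - b)
    s, yv = x_new - x, g_new - g
    if s @ yv > 1e-12:
        t = (s @ s) / (s @ yv)                   # Barzilai-Borwein stepsize
    x, g = x_new, g_new
print("recovered support:", np.flatnonzero(np.abs(x) > 0.5))
```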

  10. A novel dynamical community detection algorithm based on weighting scheme

    Science.gov (United States)

    Li, Ju; Yu, Kai; Hu, Ke

    2015-12-01

    Network dynamics plays an important role in analyzing the correlation between functional properties and topological structure. In this paper, we propose a novel dynamical iteration (DI) algorithm, which combines the iterative updating of membership vectors with a weighting scheme, i.e. weighting W and tightness T. These new elements can be used to adjust link strength and node compactness, improving the speed and accuracy of community structure detection. To estimate the optimal stopping time of the iteration, we utilize a new stability measure, defined as the auto-covariance of a Markov random walk. The number of communities does not need to be specified in advance, and overlapping communities are naturally supported by associating each node with a membership vector describing its involvement in each community. Theoretical analysis and experiments show that the algorithm can uncover communities effectively and efficiently.
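
    A hedged sketch of the membership-vector mechanism (not the paper's exact DI update): each node carries a distribution over communities that is iteratively averaged over its weighted neighbors, so that nodes in the same dense block end up with similar vectors:

```python
import numpy as np

# toy two-block graph: triangle {0,1,2} and triangle {3,4,5} joined by edge 2-3
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], float)
W = A / A.sum(axis=1, keepdims=True)   # row-normalized link strengths

M = np.eye(6)                          # initial membership vectors, one per node
for _ in range(10):
    M = 0.5 * M + 0.5 * W @ M          # lazy averaging over neighbors
print("dominant community per node:", M.argmax(axis=1))
```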

  11. Analysis on Stability of a Network Based on RED Scheme

    Directory of Open Access Journals (Sweden)

    Shengbo Hu

    2011-04-01

    Full Text Available The RED scheme makes it possible to prevent the global synchronization of sources that occurs with drop-tail buffers; from a control point of view, the drop-tail discipline can lead to strong oscillations and complex system behavior. In this paper, first, the behavior of TCP in high bandwidth-delay product networks is analyzed. Secondly, a model of a TCP network using RED is described, including the RED drop function and the model of the TCP source. Thirdly, a linear analysis of a single-link topology is presented. Finally, a sufficient condition for the stability of a network using RED is given, and an engineering approach is presented for selecting network and protocol parameters that lead to stable operation of the linear feedback control system.
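
    The drop function referenced in the model is the standard RED piecewise-linear profile over the average queue length; a minimal sketch with illustrative threshold values:

```python
def red_drop_prob(avg_q, min_th=20.0, max_th=60.0, max_p=0.1):
    """Standard RED: drop probability rises linearly between the thresholds."""
    if avg_q < min_th:
        return 0.0                                        # no early drops
    if avg_q < max_th:
        return max_p * (avg_q - min_th) / (max_th - min_th)
    return 1.0                                            # force drop above max_th

for q in (10, 30, 50, 70):
    print(q, red_drop_prob(q))
```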

  12. Curve interpolation based on Catmull-Clark subdivision scheme

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    An efficient algorithm for curve interpolation is proposed. The algorithm produces a subdivision surface that interpolates predefined cubic B-spline curves by applying the Catmull-Clark scheme to a polygonal mesh containing "symmetric zonal meshes", which possess some special properties. Many kinds of curve interpolation problems can be handled by this algorithm, such as interpolating a single open or closed curve, or a mesh of nonintersecting or intersecting curves. The interpolating surface is C2 everywhere except at a finite number of points. At the same time, sharp creases can be modeled on the limit subdivision surface by duplicating the vertices of the tagged edges of the initial mesh, i.e. the surface is only C0 along the cubic B-spline curve defined by the tagged edges. Because it is simple and easy to implement, this method can be used for product shape design and graphics software development.
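
    Along a curve, Catmull-Clark reduces to the cubic B-spline subdivision rule: midpoint edge points and (1,6,1)/8 vertex points. A small sketch of that curve-level rule on a closed control polygon (the toy data and iteration count are assumptions):

```python
import numpy as np

def subdivide_closed(pts):
    """One round of cubic B-spline subdivision on a closed control polygon."""
    n = len(pts)
    out = []
    for i in range(n):
        prev, cur, nxt = pts[i - 1], pts[i], pts[(i + 1) % n]
        out.append((prev + 6 * cur + nxt) / 8.0)   # repositioned vertex point
        out.append((cur + nxt) / 2.0)              # new edge midpoint
    return np.array(out)

curve = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])   # unit square
for _ in range(4):
    curve = subdivide_closed(curve)                # converges to the limit curve
print(len(curve), "points on the refined curve")
```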

  13. AN IMPROVED DOS-RESISTANT ID-BASED PASSWORD AUTHENTICATION SCHEME WITHOUT USING SMART CARD

    Institute of Scientific and Technical Information of China (English)

    Wen Fengtong; Li Xuelei; Cui Shenjun

    2011-01-01

    In 2010, Hwang, et al. proposed a "DoS-resistant ID-based password authentication scheme using smart cards" as an improvement on Kim-Lee-Yoo's "ID-based password authentication scheme". In this paper, we cryptanalyze Hwang, et al.'s scheme and point out that the revealed session key could threaten the security of the scheme. We demonstrate that extracting information from the smart cards is equivalent to knowing the session key. Thus, known session key attacks are also effective under the assumption that the adversary can obtain the information stored in the smart cards. We propose an improved scheme, with security analysis, to remedy the weaknesses of Hwang, et al.'s scheme. The new scheme not only keeps all the merits of the original, but also provides several additional phases to improve flexibility. Finally, the improved scheme is more secure, efficient, practical, and convenient, because an elliptic curve cryptosystem is introduced, and the expensive smart cards and synchronized clock system are replaced by mobile devices and nonces.

  14. Classification of body movements based on posturographic data.

    Science.gov (United States)

    Saripalle, Sashi K; Paiva, Gavin C; Cliett, Thomas C; Derakhshani, Reza R; King, Gregory W; Lovelace, Christopher T

    2014-02-01

    The human body, standing on two feet, produces a continuous sway pattern. Intended movements, sensory cues, emotional states, and illnesses can all lead to subtle changes in sway, appearing as alterations in ground reaction forces and the body's center of pressure (COP). The purpose of this study is to demonstrate that carefully selected COP parameters and classification methods can differentiate among specific body movements while standing, providing new prospects in camera-free motion identification. Force platform data were collected from participants performing 11 choreographed postural and gestural movements. Twenty-three different displacement- and frequency-based features were extracted from COP time series and supplied to classification-guided feature extraction modules. For identification of movement type, several linear and nonlinear classifiers were explored, including linear discriminants, nearest neighbor classifiers, and support vector machines. The average classification rates on previously unseen test sets ranged from 67% to 100%. Within the context of this experiment, no single method was able to uniformly outperform the others for all movement types; therefore, a set of movement-specific features and classifiers is recommended.
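
    A condensed sketch of the pipeline: a few displacement- and frequency-based features extracted from a COP time series feed a classifier. The feature subset and synthetic data are assumed stand-ins for the paper's 23 features and force-platform recordings:

```python
import numpy as np
from sklearn.svm import SVC

def cop_features(cop):                         # cop: (N, 2) center-of-pressure series
    step = np.linalg.norm(np.diff(cop, axis=0), axis=1)
    spec = np.abs(np.fft.rfft(cop[:, 0] - cop[:, 0].mean()))
    return [cop[:, 0].std(), cop[:, 1].std(),  # sway amplitude per axis
            step.sum(),                        # total COP path length
            float(spec[1:20].argmax() + 1)]    # dominant low-frequency bin

rng = np.random.default_rng(2)
X = np.array([cop_features(rng.normal(scale=s, size=(1000, 2)))
              for s in rng.uniform(0.5, 2.0, 120)])
y = (X[:, 2] > np.median(X[:, 2])).astype(int)  # stand-in movement labels
print("train acc:", SVC().fit(X, y).score(X, y))
```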

  15. Spectrum-based kernel length estimation for Gaussian process classification.

    Science.gov (United States)

    Wang, Liang; Li, Chuan

    2014-06-01

    Recent studies have shown that Gaussian process (GP) classification, a discriminative supervised learning approach, has achieved competitive performance in real applications compared with most state-of-the-art supervised learning methods. However, the problem of automatic model selection in GP classification, involving the kernel function form and the corresponding parameter values (which are unknown in advance), remains a challenge. To make GP classification a more practical tool, this paper presents a novel spectrum-analysis-based approach for model selection by refining the GP kernel function to match the given input data. Specifically, we target the problem of GP kernel length scale estimation. Spectra are first calculated analytically from the kernel function itself using the autocorrelation theorem, as well as estimated numerically from the training data themselves. The kernel length scale is then automatically estimated by equating the two spectrum values, i.e., the kernel function spectrum equals the estimated training data spectrum. Compared with the classical Bayesian method for kernel length scale estimation via maximizing the marginal likelihood (which is time consuming and can suffer from multiple local optima), extensive experimental results on various data sets show that our proposed method is both efficient and accurate.
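
    The matching idea can be sketched for an RBF kernel, whose 1-D spectral density is a Gaussian of width 1/l, so l can be read off the width of the data periodogram. The moment-matching shortcut below is an assumed simplification of the paper's procedure, and a single-draw periodogram is noisy, so the estimate is rough:

```python
import numpy as np

rng = np.random.default_rng(3)
l_true = 0.5
t = np.linspace(0, 10, 512)
K = np.exp(-(t[:, None] - t[None, :]) ** 2 / (2 * l_true ** 2))  # RBF kernel
y = rng.multivariate_normal(np.zeros(t.size), K + 1e-8 * np.eye(t.size))

w = np.fft.rfftfreq(t.size, d=t[1] - t[0]) * 2 * np.pi   # angular frequencies
power = np.abs(np.fft.rfft(y - y.mean())) ** 2            # data periodogram
w_std = np.sqrt((power * w ** 2).sum() / power.sum())     # spectral width ~ 1/l
print("estimated length scale:", 1.0 / w_std, "(true:", l_true, ")")
```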

  16. Risk Classification and Risk-based Safety and Mission Assurance

    Science.gov (United States)

    Leitner, Jesse A.

    2014-01-01

    Recent activities to revamp and emphasize the need to streamline processes and activities for Class D missions across the agency have led to various interpretations of Class D, including the lumping of a variety of low-cost projects into Class D. Sometimes terms such as Class D minus are used. In this presentation, mission risk classifications will be traced to official requirements and definitions as a measure to ensure that projects and programs align with the guidance and requirements that are commensurate with their defined risk posture. As part of this, the full suite of risk classifications, formal and informal, will be defined, followed by an introduction to the new GPR 8705.4 that is currently under review. GPR 8705.4 lays out guidance for the mission success activities performed at Classes A-D for NPR 7120.5 projects as well as for projects not under NPR 7120.5. Furthermore, the trends in stepping from Class A into higher risk posture classifications will be discussed. The talk will conclude with a discussion of risk-based safety and mission assurance at GSFC.

  17. Geographical classification of apple based on hyperspectral imaging

    Science.gov (United States)

    Guo, Zhiming; Huang, Wenqian; Chen, Liping; Zhao, Chunjiang; Peng, Yankun

    2013-05-01

    The attribution of an apple to its geographical origin is often recognized and appreciated by consumers, and is usually an important factor in determining the price of a commercial product. In this work, hyperspectral imaging technology and supervised pattern recognition were used to discriminate apples according to geographical origin. Hyperspectral images of 207 Fuji apple samples were collected with a hyperspectral camera (400-1000 nm). Principal component analysis (PCA) was performed on the hyperspectral imaging data to determine the main efficient wavelength images, and characteristic variables were then extracted by texture analysis based on the gray level co-occurrence matrix (GLCM) from the dominant waveband image. All characteristic variables were obtained by fusing the data of images in the efficient spectra. A support vector machine (SVM) was used to construct the classification model and showed excellent performance, with high classification accuracies of 92.75% on the training set and 89.86% on the prediction set. The overall results demonstrate that hyperspectral imaging coupled with an SVM classifier can be efficiently used to discriminate Fuji apples according to geographical origin.
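
    A minimal sketch of the GLCM texture step feeding an SVM, using scikit-image and scikit-learn; the images below are synthetic and the four statistics are an assumed subset of the paper's feature set:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

rng = np.random.default_rng(4)

def glcm_features(img):
    g = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                     levels=256, symmetric=True, normed=True)
    return [graycoprops(g, p).mean() for p in
            ("contrast", "homogeneity", "energy", "correlation")]

# stand-in dominant-waveband image patches and origin labels
X = np.array([glcm_features(rng.integers(0, 256, (32, 32), dtype=np.uint8))
              for _ in range(60)])
y = rng.integers(0, 2, 60)
print("train acc:", SVC().fit(X, y).score(X, y))
```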

  18. Spectral classification of stars based on LAMOST spectra

    CERN Document Server

    Liu, Chao; Zhang, Bo; Wan, Jun-Chen; Deng, Li-Cai; Hou, Yonghui; Wang, Yuefei; Yang, Ming; Zhang, Yong

    2015-01-01

    In this work, we select high signal-to-noise ratio stellar spectra from the LAMOST data and map their MK classes to the spectral features. The equivalent widths of the prominent spectral lines, playing a role similar to multi-color photometry, form a clean stellar locus well ordered by MK class. The advantage of the stellar locus in line indices is that it gives a natural and continuous classification of stars consistent both with the broadly used MK classes and with the stellar astrophysical parameters. We also employ an SVM-based classification algorithm to assign MK classes to the LAMOST stellar spectra. We find that the completeness of the classification is up to 90% for A and G type stars, while it drops to about 50% for OB and K type stars. About 40% of the OB and K type stars are misclassified as A and G type stars, respectively. This is likely because the differences in spectral features between late B and early A type stars, or between late G and early K type stars, are very we...
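
    As an illustration of the line indices used here, a minimal sketch of an equivalent-width computation, integrating (1 - flux/continuum) over wavelength; the Gaussian line profile and flat continuum are synthetic stand-ins for a LAMOST spectrum:

```python
import numpy as np

wave = np.linspace(4800, 4920, 600)                  # wavelength grid [Angstrom]
continuum = np.ones_like(wave)                       # flat pseudo-continuum
line = 0.6 * np.exp(-0.5 * ((wave - 4861.3) / 2.0) ** 2)   # H-beta-like dip
flux = continuum - line

ew = np.trapz(1.0 - flux / continuum, wave)          # equivalent width [Angstrom]
print(f"EW(H-beta) = {ew:.2f} A")
```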

  19. Content-based and Algorithmic Classifications of Journals: Perspectives on the Dynamics of Scientific Communication and Indexer Effects

    CERN Document Server

    Rafols, Ismael

    2008-01-01

    The aggregated journal-journal citation matrix, based on the Journal Citation Reports (JCR) of the Science Citation Index, can be decomposed by indexers and/or algorithmically. In this study, we test the results of two recently available algorithms for the decomposition of large matrices against two content-based classifications of journals: the ISI Subject Categories and the field/subfield classification of Glaenzel & Schubert (2003). The content-based schemes allow for the attribution of more than a single category to a journal, whereas the algorithms maximize the ratio of within-category citations over between-category citations in the aggregated category-category citation matrix. By adding categories, indexers generate between-category citations, which may enrich the database, for example, in the case of interdisciplinary developments. The consequent indexer effects are more significant in sparse areas of the matrix than in denser ones. Algorithmic decompositions, on the other hand, are more heavily ...
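
    A small sketch of the algorithmic objective mentioned above, computing the within- to between-category citation ratio for a given journal partition; the counts and the partition are toy assumptions:

```python
import numpy as np

C = np.array([[30,  2,  1],
              [ 3, 25,  2],
              [ 1,  1, 20]])        # journal-journal citation counts (assumed)
cat = np.array([0, 0, 1])           # category per journal (assumed partition)

n = len(cat)
within = sum(C[i, j] for i in range(n) for j in range(n) if cat[i] == cat[j])
between = C.sum() - within
print("within/between citation ratio:", within / between)
```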

  20. An Approach for Leukemia Classification Based on Cooperative Game Theory

    Directory of Open Access Journals (Sweden)

    Atefeh Torkaman

    2011-01-01

    Full Text Available Hematological malignancies are the types of cancer that affect the blood, bone marrow, and lymph nodes. As these tissues are naturally connected through the immune system, a disease affecting one of them will often affect the others as well. The hematological malignancies include leukemia, lymphoma, and multiple myeloma. Among them, leukemia is a serious malignancy that starts in blood-forming tissues, especially the bone marrow, where blood is made. Research shows that leukemia is one of the most common cancers in the world, so an emphasis on diagnostic techniques and the best treatments can provide a better prognosis and survival for patients. In this paper, an automatic diagnosis recommender system for classifying leukemia based on cooperative game theory is presented. Throughout this research, we analyze flow cytometry data toward the classification of leukemia into eight classes. We work on a real data set of different types of leukemia collected at the Iran Blood Transfusion Organization (IBTO). In total, the data set contains 400 samples taken from human leukemic bone marrow. This study uses a cooperative game for classification according to the different weights assigned to the markers. The proposed method is versatile, as there are no constraints on what the input or output represent; this means it can be used to classify a population according to their contributions, and it applies equally to other groups of data. The experimental results show an accuracy rate of 93.12% for classification, compared with 90.16% for a decision tree (C4.5). The results demonstrate that cooperative game theory is very promising for direct use in the classification of leukemia as part of an active medical decision support system for interpreting flow cytometry readouts. Such a system could assist clinical hematologists to properly recognize different kinds of leukemia by preparing suggestions, and this could improve the treatment
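
    The marker-weighting idea can be made concrete with Shapley values from cooperative game theory; a toy sketch in Python, where the marker names and the coalition worth function (an accuracy proxy per marker subset) are assumptions, not the paper's data:

```python
from itertools import combinations
from math import factorial

markers = ("CD45", "CD34", "CD117")     # hypothetical marker set
worth = {frozenset(): 0.0,              # assumed accuracy proxy per coalition
         frozenset({"CD45"}): 0.5, frozenset({"CD34"}): 0.4,
         frozenset({"CD117"}): 0.3,
         frozenset({"CD45", "CD34"}): 0.8, frozenset({"CD45", "CD117"}): 0.7,
         frozenset({"CD34", "CD117"}): 0.6, frozenset(markers): 0.93}

def shapley(i):
    """Average marginal contribution of marker i over all coalitions."""
    n, val = len(markers), 0.0
    rest = [m for m in markers if m != i]
    for k in range(len(rest) + 1):
        for S in combinations(rest, k):
            w = factorial(k) * factorial(n - k - 1) / factorial(n)
            val += w * (worth[frozenset(S) | {i}] - worth[frozenset(S)])
    return val

for m in markers:                        # Shapley values become marker weights
    print(m, round(shapley(m), 3))
```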