WorldWideScience

Sample records for robust tissue classification

  1. Robust tissue classification for reproducible wound assessment in telemedicine environments

    Science.gov (United States)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple hand-held digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labelings, and segmentation-driven classification based on support vector machines. The tool thus developed remains stable under lighting, viewpoint, and camera changes, achieving accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3%, against 69.1% for a single expert, after mapping onto the medical reference developed from image labeling by a college of experts.
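
    A minimal sketch of the final, segmentation-driven step, assuming each segmented region has already been color-corrected and summarized by a small descriptor vector (the feature layout and class codes below are illustrative, not the paper's):

      # Hypothetical sketch: SVM labeling of segmented wound regions.
      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      # Stand-in for per-region descriptors: mean RGB + 3 texture statistics.
      X = rng.normal(size=(300, 6))
      y = rng.integers(0, 3, size=300)  # 0=granulation, 1=slough, 2=necrosis

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
      clf.fit(X, y)
      print(clf.predict(X[:5]))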

  2. Automating the expert consensus paradigm for robust lung tissue classification

    Science.gov (United States)

    Rajagopalan, Srinivasan; Karwoski, Ronald A.; Raghunath, Sushravya; Bartholmai, Brian J.; Robb, Richard A.

    2012-03-01

    Clinicians confirm the efficacy of dynamic multidisciplinary interactions in diagnosing lung disease/wellness from CT scans. However, routine clinical practice cannot readily accommodate such interactions. Current schemes for automating lung tissue classification are based on a single elusive disease-differentiating metric; this undermines their reliability in routine diagnosis. We propose a computational workflow that uses a collection (#: 15) of probability density function (pdf)-based similarity metrics to automatically cluster pattern-specific (#patterns: 5) volumes of interest (#VOI: 976) extracted from the lung CT scans of 14 patients. The resultant clusters are refined for intra-partition compactness and subsequently aggregated into a super cluster using a cluster ensemble technique. The super clusters were validated against the consensus agreement of four clinical experts, and the aggregations correlated strongly with expert consensus. By effectively mimicking the expertise of physicians, the proposed workflow could make automation of lung tissue classification a clinical reality.
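
    An illustrative consensus-clustering sketch in the spirit of the workflow: the same pdf-like descriptors are clustered under several similarity metrics, and the partitions are aggregated through a co-association matrix (metric choices and sizes are placeholders, not the paper's 15 metrics):

      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from sklearn.cluster import AgglomerativeClustering

      rng = np.random.default_rng(1)
      X = rng.random((100, 16))
      X /= X.sum(axis=1, keepdims=True)  # treat each row as a discrete pdf

      co = np.zeros((len(X), len(X)))
      for metric in ("jensenshannon", "cityblock", "chebyshev"):
          D = squareform(pdist(X, metric=metric))
          labels = AgglomerativeClustering(
              n_clusters=5, metric="precomputed", linkage="average"
          ).fit_predict(D)
          co += (labels[:, None] == labels[None, :])

      co /= 3.0  # co-association: fraction of partitions agreeing on each pair
      final = AgglomerativeClustering(
          n_clusters=5, metric="precomputed", linkage="average"
      ).fit_predict(1.0 - co)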

  3. Robust multi-site MR data processing: iterative optimization of bias correction, tissue classification, and registration.

    Science.gov (United States)

    Young Kim, Eun; Johnson, Hans J

    2013-01-01

    A robust multi-modal tool for automated registration, bias correction, and tissue classification has been implemented for large-scale, heterogeneous, multi-site longitudinal MR data analysis. This work focused on improving an iterative optimization framework between bias correction, registration, and tissue classification, inspired by previous work. The primary contributions are robustness improvements from the incorporation of four elements: (1) use of multi-modal and repeated scans, (2) incorporation of highly deformable registration, (3) an extended set of tissue definitions, and (4) multi-modal-aware intensity-context priors. The benefits of these enhancements were investigated through a series of experiments with both a simulated brain data set (BrainWeb) and highly heterogeneous data from a 32-site imaging study, with quality assessed through expert visual inspection. The implementation of this tool is tailored for, but not limited to, large-scale data processing with great data variation, and provides a flexible interface. In this paper, we describe enhancements to joint registration, bias correction, and tissue classification that improve the generalizability and robustness of processing multi-modal longitudinal MR scans collected at multiple sites. The tool was evaluated using both simulated and human subject MRI images. With these enhancements, the results showed improved robustness for large-scale heterogeneous MRI processing.

  4. Tissue Classification

    DEFF Research Database (Denmark)

    Van Leemput, Koen; Puonti, Oula

    2015-01-01

    Computational methods for automatically segmenting magnetic resonance images of the brain have seen tremendous advances in recent years. So-called tissue classification techniques, aimed at extracting the three main brain tissue classes (white matter, gray matter, and cerebrospinal fluid), are now well established. In their simplest form, these methods classify voxels independently based on their intensity alone, although much more sophisticated models are typically used in practice. This article aims to give an overview of often-used computational techniques for brain tissue classification…

  5. A Robust Geometric Model for Argument Classification

    Science.gov (United States)

    Giannone, Cristina; Croce, Danilo; Basili, Roberto; de Cao, Diego

    Argument classification is the task of assigning semantic roles to syntactic structures in natural language sentences. Supervised learning techniques for frame semantics have recently been shown to benefit from rich sets of syntactic features. However, argument classification is also highly dependent on the semantics of the lexical items involved. Empirical studies have shown that the domain dependence of lexical information causes large performance drops in out-of-domain tests. In this paper, a distributional approach is proposed to improve the robustness of the learning model against out-of-domain lexical phenomena.

  6. Subcutaneous adipose tissue classification

    Directory of Open Access Journals (Sweden)

    A. Sbarbati

    2010-11-01

    Developments in technologies based on the use of autologous adipose tissue have drawn attention to minor depots as possible sampling areas, some of which have never been studied in detail. The present study was performed on subcutaneous adipose depots sampled in different areas with the aim of characterizing their morphology, particularly with regard to stem niches. The results demonstrated that three different types of white adipose tissue (WAT) can be differentiated on the basis of structural and ultrastructural features: deposit WAT (dWAT), structural WAT (sWAT) and fibrous WAT (fWAT). dWAT is found essentially in large fatty depots in the abdominal (periumbilical) area. In dWAT, cells are tightly packed and linked by a weak net of isolated collagen fibers. Collagenous components are very poor, cells are large, and few blood vessels are present. The deep portion appears more fibrous than the superficial one. The microcirculation is formed by thin-walled capillaries with rare stem niches. Reinforcing pericyte elements are rarely evident. The sWAT is more stromal; it is located in some areas of the limbs and in the hips. The stroma is fairly well represented, with good vascularity and adequate stemness. Cells are wrapped by a basket of collagen fibers. The fatty depots of the knees and of the trochanteric areas have quite loose meshes. The fWAT has a noteworthy fibrous component and is found in areas subject to severe mechanical stress. Adipocytes have an individual thick fibrous shell. In conclusion, the present study demonstrates evident differences among subcutaneous WAT depots, suggesting that in regenerative procedures based on autologous adipose tissue the sampling area should not be chosen randomly, but should be guided by evidence-based evaluations. The structural peculiarities of the sWAT, and particularly of its microcirculation, suggest that it could represent a privileged source for…

  7. Chance constrained uncertain classification via robust optimization

    NARCIS (Netherlands)

    Ben-Tal, A.; Bhadra, S.; Bhattacharayya, C.; Saketha Nat, J.

    2011-01-01

    This paper studies the problem of constructing robust classifiers when the training data is plagued with uncertainty. The problem is posed as a Chance-Constrained Program (CCP), which ensures that the uncertain data points are classified correctly with high probability. Unfortunately, such a CCP turns out…

  8. Robust classification using mixtures of dependency networks

    DEFF Research Database (Denmark)

    Gámez, José A.; Mateo, Juan L.; Nielsen, Thomas Dyhre

    2008-01-01

    Dependency networks have previously been proposed as alternatives to e.g. Bayesian networks by supporting fast algorithms for automatic learning. Recently dependency networks have also been proposed as classification models, but as with e.g. general probabilistic inference, the reported speed-ups are often obtained at the expense of accuracy. In this paper we try to address this issue through the use of mixtures of dependency networks. To reduce learning time and improve robustness when dealing with data-sparse classes, we outline methods for reusing calculations across mixture components. Finally…

  9. Integrating Globality and Locality for Robust Representation Based Classification

    Directory of Open Access Journals (Sweden)

    Zheng Zhang

    2014-01-01

    The representation-based classification method (RBCM) has shown huge potential for face recognition since it first emerged. The linear regression classification (LRC) method and the collaborative representation classification (CRC) method are two well-known RBCMs. LRC and CRC exploit the training samples of each class and all the training samples, respectively, to represent the testing sample, and subsequently conduct classification on the basis of the representation residual. The LRC method can be viewed as a "locality representation" method because it uses only the training samples of each class to represent the testing sample, so it cannot embody the effectiveness of "globality representation." Conversely, the CRC method does not enjoy the locality benefit of the general RBCM. We therefore propose to integrate CRC and LRC to perform more robust representation-based classification. Experimental results on benchmark face databases substantially demonstrate that the proposed method achieves high classification accuracy.
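
    A minimal numpy sketch of the two residuals the paper integrates, with a naive sum of normalized residuals standing in for the paper's actual fusion rule:

      import numpy as np

      def class_residuals_lrc(X_by_class, y_probe):
          # LRC ("locality"): represent the probe with each class separately.
          res = []
          for Xc in X_by_class:  # Xc: (d, n_c) samples of one class as columns
              beta, *_ = np.linalg.lstsq(Xc, y_probe, rcond=None)
              res.append(np.linalg.norm(y_probe - Xc @ beta))
          return np.array(res)

      def class_residuals_crc(X, labels, y_probe, lam=1e-3):
          # CRC ("globality"): collaborative ridge coding over ALL samples.
          alpha = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y_probe)
          res = []
          for c in np.unique(labels):
              mask = labels == c
              res.append(np.linalg.norm(y_probe - X[:, mask] @ alpha[mask]))
          return np.array(res)

      rng = np.random.default_rng(2)
      X = rng.normal(size=(64, 30))          # 30 training faces, 3 classes
      labels = np.repeat(np.arange(3), 10)
      probe = X[:, 4] + 0.05 * rng.normal(size=64)

      r = class_residuals_crc(X, labels, probe)
      l = class_residuals_lrc([X[:, labels == c] for c in range(3)], probe)
      score = r / r.sum() + l / l.sum()      # simple illustrative integration
      print("predicted class:", int(np.argmin(score)))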

  10. Robust multi-tissue gene panel for cancer detection

    Directory of Open Access Journals (Sweden)

    Talantov Dmitri

    2010-06-01

    Background: We have identified a set of genes whose relative mRNA expression levels in various solid tumors can be used to robustly distinguish cancer from matching normal tissue. Our current feature set consists of 113 gene probes for 104 unique genes, originally identified as differentially expressed in solid primary tumors in microarray data on the Affymetrix HG-U133A platform in five tissue types: breast, colon, lung, prostate and ovary. For each dataset, we first identified a set of genes significantly differentially expressed in tumor vs. normal tissue at p-value = 0.05 using an experimentally derived error model. Our common cancer gene panel is the intersection of these sets of significantly dysregulated genes and can distinguish tumors from normal tissue in all five tissue types. Methods: Frozen tumor specimens were obtained from two commercial vendors, Clinomics (Pittsfield, MA) and Asterand (Detroit, MI). Biotinylated targets were prepared using published methods (Affymetrix, CA) and hybridized to Affymetrix U133A GeneChips (Affymetrix, CA). Expression values for each gene were calculated using the Affymetrix GeneChip analysis software MAS 5.0. We then used a software package called Genes@Work for differential expression discovery, and SVM light with a linear kernel for building classification models. Results: We validated the predictability of this gene list on several publicly available data sets generated on the same platform. Of note, when analysing the lung cancer data set of Spira et al., using an SVM linear kernel classifier, our gene panel had 94.7% leave-one-out accuracy compared to 87.8% using the gene panel in the original paper. In addition, we performed high-throughput validation on the Dana Farber Cancer Institute GCOD database and several GEO datasets. Conclusions: Our results showed the potential of this panel as a robust classification tool for multiple tumor types on the Affymetrix platform, as well as other whole-genome arrays.
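
    A hedged sketch of the validation style described (leave-one-out accuracy of a linear SVM restricted to a fixed gene panel), with a synthetic expression matrix standing in for the real data:

      import numpy as np
      from sklearn.model_selection import LeaveOneOut, cross_val_score
      from sklearn.svm import SVC

      rng = np.random.default_rng(3)
      n_samples, n_genes = 60, 500
      X = rng.normal(size=(n_samples, n_genes))
      y = rng.integers(0, 2, size=n_samples)                # 0 = normal, 1 = tumor
      panel = rng.choice(n_genes, size=113, replace=False)  # stand-in "gene panel"

      acc = cross_val_score(SVC(kernel="linear"), X[:, panel], y,
                            cv=LeaveOneOut()).mean()
      print(f"leave-one-out accuracy: {acc:.3f}")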

  11. Robust electrocardiogram (ECG) beat classification using discrete wavelet transform

    International Nuclear Information System (INIS)

    Minhas, Fayyaz-ul-Amir Afsar; Arif, Muhammad

    2008-01-01

    This paper presents a robust technique for the classification of six types of heartbeats from the electrocardiogram (ECG). Features extracted from the QRS complex of the ECG using a wavelet transform, along with the instantaneous RR interval, are used for beat classification. The wavelet transform utilized for feature extraction can also be employed for QRS delineation, reducing overall system complexity, as no separate feature extraction stage would be required in a practical implementation. Only 11 features are used for beat classification, with a classification accuracy of ∼99.5% using a KNN classifier. Another main advantage of this method is its robustness to noise, which is illustrated through experimental results. Furthermore, principal component analysis (PCA) has been used for feature reduction, reducing the number of features from 11 to 6 while retaining high beat classification accuracy. Due to its reduced computational complexity (using six features, the time required is ∼4 ms per beat), simple classifier, and noise robustness (at 10 dB signal-to-noise ratio, accuracy is 95%), this method offers substantial advantages over previous techniques for implementation in a practical ECG analyzer.
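
    An illustrative pipeline following the abstract's recipe of wavelet features plus the RR interval, PCA to six components, and a KNN classifier; the wavelet family, decomposition level, and window size are assumptions:

      import numpy as np
      import pywt
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      def beat_features(beat, rr):
          # 10 wavelet statistics + the RR interval = 11 features.
          coeffs = pywt.wavedec(beat, "db4", level=4)
          stats = [f(c) for c in coeffs for f in (np.mean, np.std)]
          return np.array(stats + [rr])

      rng = np.random.default_rng(4)
      beats = rng.normal(size=(200, 128))        # placeholder beat windows
      rrs = rng.uniform(0.6, 1.0, size=200)
      X = np.array([beat_features(b, r) for b, r in zip(beats, rrs)])
      y = rng.integers(0, 6, size=200)           # six beat types

      clf = make_pipeline(StandardScaler(), PCA(n_components=6),
                          KNeighborsClassifier(n_neighbors=3))
      clf.fit(X, y)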

  12. A Dirichlet process mixture model for brain MRI tissue classification.

    Science.gov (United States)

    Ferreira da Silva, Adelino R

    2007-04-01

    Accurate classification of magnetic resonance images according to tissue type or region of interest has become a critical requirement in diagnosis, treatment planning, and cognitive neuroscience. Several authors have shown that finite mixture models give excellent results in the automated segmentation of MR images of the human normal brain. However, performance and robustness of finite mixture models deteriorate when the models have to deal with a variety of anatomical structures. In this paper, we propose a nonparametric Bayesian model for tissue classification of MR images of the brain. The model, known as Dirichlet process mixture model, uses Dirichlet process priors to overcome the limitations of current parametric finite mixture models. To validate the accuracy and robustness of our method we present the results of experiments carried out on simulated MR brain scans, as well as on real MR image data. The results are compared with similar results from other well-known MRI segmentation methods.
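
    A compact stand-in using scikit-learn's truncated Dirichlet-process mixture (variational inference, not necessarily the paper's estimator) to cluster intensities without fixing the number of tissue components in advance:

      import numpy as np
      from sklearn.mixture import BayesianGaussianMixture

      rng = np.random.default_rng(5)
      # Synthetic 1-D "intensities" drawn from three tissue-like modes.
      x = np.concatenate([rng.normal(0.2, 0.03, 400),
                          rng.normal(0.5, 0.05, 400),
                          rng.normal(0.8, 0.04, 400)]).reshape(-1, 1)

      dpmm = BayesianGaussianMixture(
          n_components=10,                      # truncation level, not class count
          weight_concentration_prior_type="dirichlet_process",
          weight_concentration_prior=0.1,
      ).fit(x)
      print("effective components:", (dpmm.weights_ > 0.01).sum())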

  13. Pathological Bases for a Robust Application of Cancer Molecular Classification

    Directory of Open Access Journals (Sweden)

    Salvador J. Diaz-Cano

    2015-04-01

    Any robust classification system depends on its purpose and must refer to accepted standards, its strength relying on predictive values and a careful consideration of known factors that can affect its reliability. In this context, a molecular classification of human cancer must refer to the current gold standard (histological classification) and try to improve it with key prognosticators for metastatic potential, staging and grading. Although organ-specific examples have been published based on proteomic, transcriptomic and genomic evaluations, the most popular approach uses gene expression analysis as a direct correlate of cellular differentiation, which represents the key feature of the histological classification. RNA is a labile molecule that varies significantly according to the preservation protocol, its transcription reflects the adaptation of the tumor cells to the microenvironment, it can be passed on through mechanisms of intercellular transfer of genetic information (exosomes), and it is exposed to epigenetic modifications. More robust classifications should be based on stable molecules, represented at the genetic level by DNA, to improve reliability, and their analysis must deal with the concept of intratumoral heterogeneity, which is at the origin of tumor progression and is the byproduct of the selection process during the clonal expansion and progression of neoplasms. The simultaneous analysis of multiple DNA targets and next-generation sequencing offer the best practical approach for an analytical genomic classification of tumors.

  14. Median Robust Extended Local Binary Pattern for Texture Classification.

    Science.gov (United States)

    Liu, Li; Lao, Songyang; Fieguth, Paul W; Guo, Yulan; Wang, Xiaogang; Pietikäinen, Matti

    2016-03-01

    Local binary patterns (LBP) are considered among the most computationally efficient high-performance texture features. However, the LBP method is very sensitive to image noise and is unable to capture macrostructure information. To address these disadvantages, we introduce a novel descriptor for texture classification, the median robust extended LBP (MRELBP). Unlike the traditional LBP and many LBP variants, MRELBP compares regional image medians rather than raw image intensities. A multiscale LBP-type descriptor is computed by efficiently comparing image medians over a novel sampling scheme, which can capture both microstructure and macrostructure texture information. A comprehensive evaluation on benchmark data sets reveals MRELBP's high performance: it is robust to gray-scale variations, rotation changes, and noise, at a low computational cost. MRELBP produces the best classification scores of 99.82%, 99.38%, and 99.77% on three popular Outex test suites. More importantly, MRELBP is shown to be highly robust to image noise, including Gaussian noise, Gaussian blur, salt-and-pepper noise, and random pixel corruption.
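
    A hedged illustration of the core idea, computing LBP codes on median-filtered responses instead of raw intensities; this is a simplification of the full multiscale MRELBP descriptor:

      import numpy as np
      from scipy.ndimage import median_filter
      from skimage.feature import local_binary_pattern

      rng = np.random.default_rng(6)
      img = (rng.random((128, 128)) * 255).astype(np.uint8)
      noisy = np.clip(img + rng.normal(0, 20, img.shape), 0, 255).astype(np.uint8)

      def lbp_hist(image, P=8, R=2):
          codes = local_binary_pattern(image, P, R, method="uniform")
          hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
          return hist

      h_raw = lbp_hist(noisy)                         # classic LBP on raw pixels
      h_med = lbp_hist(median_filter(noisy, size=3))  # median-robust variant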

  15. Robust Semi-Supervised Manifold Learning Algorithm for Classification

    Directory of Open Access Journals (Sweden)

    Mingxia Chen

    2018-01-01

    In recent years, manifold learning methods have been widely used in data classification to tackle the curse of dimensionality, since they can discover the potential intrinsic low-dimensional structure of high-dimensional data. Given partially labeled data, semi-supervised manifold learning algorithms predict the labels of the unlabeled points, taking label information into account. However, these algorithms are not robust against noisy points, especially when the labeled data contain noise. In this paper, we propose a framework for robust semi-supervised manifold learning (RSSML) to address this problem. The noise levels of the labeled points are first predicted, and a regularization term is then constructed to reduce the impact of labeled points containing noise. A new robust semi-supervised optimization model is proposed by adding the regularization term to the traditional semi-supervised optimization model. Numerical experiments are given to show the improvement and efficiency of RSSML on noisy data sets.
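
    Not the paper's RSSML model, only a runnable graph-based semi-supervised baseline of the kind it improves on, with a few labels deliberately flipped to mimic label noise:

      import numpy as np
      from sklearn.datasets import make_moons
      from sklearn.semi_supervised import LabelSpreading

      X, y = make_moons(n_samples=300, noise=0.1, random_state=7)
      labels = np.full(300, -1)               # -1 marks unlabeled points
      idx = np.random.default_rng(7).choice(300, size=30, replace=False)
      labels[idx] = y[idx]
      labels[idx[:3]] ^= 1                    # inject label noise on 3 points

      model = LabelSpreading(kernel="knn", n_neighbors=7, alpha=0.2).fit(X, labels)
      print("accuracy on all points:", (model.transduction_ == y).mean())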

  16. A robust probabilistic collaborative representation based classification for multimodal biometrics

    Science.gov (United States)

    Zhang, Jing; Liu, Huanxi; Ding, Derui; Xiao, Jianli

    2018-04-01

    Most traditional biometric recognition systems perform recognition with a single biometric indicator. These systems suffer from noisy data, interclass variations, unacceptable error rates, forged identities, and so on. Because of these inherent problems, attempts to enhance the performance of unimodal biometric systems based on single features face fundamental limits; multimodal biometrics has therefore been investigated to reduce some of these defects. This paper proposes a new multimodal biometric recognition approach that fuses faces and fingerprints. For more recognizable features, the proposed method extracts block local binary pattern features for all modalities and then combines them into a single framework. For better classification, it employs a robust probabilistic collaborative representation based classifier to recognize individuals. Experimental results indicate that the proposed method improves recognition accuracy compared to unimodal biometrics.

  17. Towards an efficient and robust foot classification from pedobarographic images.

    Science.gov (United States)

    Oliveira, Francisco P M; Sousa, Andreia; Santos, Rubim; Tavares, João Manuel R S

    2012-01-01

    This paper presents a new computational framework for automatic foot classification from digital plantar pressure images. It classifies the foot as left or right and simultaneously calculates two well-known footprint indices: Cavanagh's arch index (AI) and the modified AI. The accuracy of the framework was evaluated using a set of plantar pressure images from two common pedobarographic devices. The results were outstanding, as all feet under analysis were correctly classified as left or right, and no significant differences were observed between the footprint indices calculated by the computational solution and by the traditional manual method. The robustness of the proposed framework to arbitrary foot orientations and to the acquisition device was also tested and confirmed.
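
    A worked sketch of Cavanagh's arch index on a binary footprint mask (toes excluded): the foot is divided into thirds along its length and the middle-third contact area is taken over the total; axis conventions here are assumptions:

      import numpy as np

      def arch_index(mask):
          rows = np.flatnonzero(mask.any(axis=1))   # rows touched by the foot
          top, bottom = rows[0], rows[-1] + 1
          thirds = np.array_split(np.arange(top, bottom), 3)
          areas = [mask[t].sum() for t in thirds]   # hindfoot, midfoot, forefoot
          return areas[1] / float(sum(areas))

      mask = np.zeros((90, 40), dtype=bool)
      mask[5:85, 10:30] = True                      # toy rectangular "footprint"
      print(f"AI = {arch_index(mask):.3f}")         # ~1/3 for a flat rectangle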

  18. Robust Pedestrian Classification Based on Hierarchical Kernel Sparse Representation

    Directory of Open Access Journals (Sweden)

    Rui Sun

    2016-08-01

    Vision-based pedestrian detection has become an active topic in computer vision and autonomous vehicles. It aims at detecting pedestrians appearing ahead of the vehicle using a camera, so that autonomous vehicles can assess the danger and take action. Due to varied illumination and appearance, complex backgrounds, and occlusion, pedestrian detection in outdoor environments is a difficult problem. In this paper, we propose a novel hierarchical feature extraction and weighted kernel sparse representation model for pedestrian classification. Initially, hierarchical feature extraction based on a CENTRIST descriptor is used to capture discriminative structures, with a max-pooling operation enhancing invariance to varying appearance. Then, a kernel sparse representation model is proposed to fully exploit the discriminative information embedded in the hierarchical local features, with a Gaussian weight function as the measure to effectively handle occlusion in pedestrian images. Extensive experiments conducted on benchmark databases, including INRIA, Daimler, an artificially generated dataset and a real occluded dataset, demonstrate the more robust performance of the proposed method compared to state-of-the-art pedestrian classification methods.

  19. Classification of mislabelled microarrays using robust sparse logistic regression.

    Science.gov (United States)

    Bootkrajang, Jakramate; Kabán, Ata

    2013-04-01

    Previous studies reported that labelling errors are not uncommon in microarray datasets. In such cases, the training set may become misleading, and the ability of classifiers to make reliable inferences from the data is compromised. Yet, few methods are currently available in the bioinformatics literature to deal with this problem. The few existing methods focus on data cleansing alone, without reference to classification, and their performance crucially depends on some tuning parameters. In this article, we develop a new method to detect mislabelled arrays simultaneously with learning a sparse logistic regression classifier. Our method may be seen as a label-noise robust extension of the well-known and successful Bayesian logistic regression classifier. To account for possible mislabelling, we formulate a label-flipping process as part of the classifier. The regularization parameter is automatically set using Bayesian regularization, which not only saves the computation time that cross-validation would take, but also eliminates any unwanted effects of label noise when setting the regularization parameter. Extensive experiments with both synthetic data and real microarray datasets demonstrate that our approach counters the bad effects of labelling errors in terms of predictive performance, is effective at identifying marker genes, and simultaneously detects mislabelled arrays to high accuracy. The code is available from http://cs.bham.ac.uk/∼jxb008. Supplementary data are available at Bioinformatics online.
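
    A minimal label-flipping logistic regression in the abstract's spirit: the observed label is assumed flipped with fixed rates, and the flip model enters the likelihood. Fixed rates and an L2 penalty are simplifications; the paper learns the noise model and uses Bayesian regularization with sparsity:

      import numpy as np
      from scipy.optimize import minimize

      def fit_robust_lr(X, y, rho0=0.1, rho1=0.1, lam=1.0):
          def nll(w):
              p = 1.0 / (1.0 + np.exp(-X @ w))       # P(true label = 1)
              q = (1 - rho1) * p + rho0 * (1 - p)    # P(observed label = 1)
              q = np.clip(q, 1e-9, 1 - 1e-9)
              return -(y * np.log(q) + (1 - y) * np.log(1 - q)).sum() + lam * (w @ w)
          return minimize(nll, np.zeros(X.shape[1]), method="L-BFGS-B").x

      rng = np.random.default_rng(8)
      X = rng.normal(size=(200, 20))
      w_true = np.zeros(20); w_true[:3] = 2.0
      y = (X @ w_true + rng.normal(size=200) > 0).astype(float)
      flip = rng.random(200) < 0.1
      y[flip] = 1 - y[flip]                          # mislabel 10% of "arrays"
      w_hat = fit_robust_lr(X, y)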

  20. Robust Speech/Non-Speech Classification in Heterogeneous Multimedia Content

    NARCIS (Netherlands)

    Huijbregts, M.A.H.; de Jong, Franciska M.G.

    In this paper we present a speech/non-speech classification method that allows high-quality classification without the need to know in advance what kinds of audible non-speech events are present in an audio recording, and that does not require a single parameter to be tuned on in-domain data. Because…

  1. Robust classification of traffic signs using multi-view cues

    NARCIS (Netherlands)

    Hazelhoff, L.; Creusen, I.M.; With, de P.H.N.

    2012-01-01

    Traffic sign inventories are created for road safety and maintenance based on street-level panoramic images. Due to the large capturing interval, large viewpoint deviations occur between the different captures. These viewpoint variations complicate the classification procedure, which aims at the…

  2. Projected estimators for robust semi-supervised classification

    NARCIS (Netherlands)

    Krijthe, J.H.; Loog, M.

    2017-01-01

    For semi-supervised techniques to be applied safely in practice we at least want methods to outperform their supervised counterparts. We study this question for classification using the well-known quadratic surrogate loss function. Unlike other approaches to semi-supervised learning, the…

  3. Projected estimators for robust semi-supervised classification

    DEFF Research Database (Denmark)

    Krijthe, Jesse H.; Loog, Marco

    2017-01-01

    For semi-supervised techniques to be applied safely in practice we at least want methods to outperform their supervised counterparts. We study this question for classification using the well-known quadratic surrogate loss function. Unlike other approaches to semi-supervised learning, the procedure… specifically, we prove that, measured on the labeled and unlabeled training data, this semi-supervised procedure never gives a lower quadratic loss than the supervised alternative. To our knowledge this is the first approach that offers such strong, albeit conservative, guarantees for improvement over… the supervised solution. The characteristics of our approach are explicated using benchmark datasets to further understand the similarities and differences between the quadratic loss criterion used in the theoretical results and the classification accuracy typically considered in practice.

  4. Active relearning for robust supervised classification of pulmonary emphysema

    Science.gov (United States)

    Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Bartholmai, Brian J.; Robb, Richard A.

    2012-03-01

    Radiologists are adept at recognizing the appearance of lung parenchymal abnormalities in CT scans. However, the inconsistent differential diagnosis, due to subjective aggregation, mandates supervised classification. Towards optimizing emphysema classification, we introduce a physician-in-the-loop feedback approach to minimize the uncertainty in the selected training samples. Using multi-view inductive learning with the training samples, an ensemble of Support Vector Machine (SVM) models, each based on a specific pair-wise dissimilarity metric, was constructed in less than six seconds. In the active relearning phase, conflicts between the ensemble and expert labels were resolved by an expert. This just-in-time feedback with unoptimized SVMs yielded a 15% increase in classification accuracy and a 25% reduction in the number of support vectors. The generality of relearning was assessed in the optimized parameter space of six different classifiers across seven dissimilarity metrics, where the average accuracy improvement rose to 21%. The co-operative feedback method proposed here could enhance both diagnostic and staging throughput efficiency in chest radiology practice.

  5. Towards an efficient and robust foot classification from pedobarographic images

    OpenAIRE

    Oliveira, Francisco; Sousa, Andreia S. P.; Santos, Rubim; Tavares, João Manuel

    2012-01-01

    The attached document is the post-print version (the version corrected by the publisher). This paper presents a new computational framework for automatic foot classification from digital plantar pressure images. It classifies the foot as left or right and simultaneously calculates two well-known footprint indices: Cavanagh's arch index and the modified arch index. The accuracy of the framework was evaluated using a set of plantar pressure images from two common pedobarographic devices. The…

  6. Support vector machine classification and validation of cancer tissue samples using microarray expression data.

    Science.gov (United States)

    Furey, T S; Cristianini, N; Duffy, N; Bednarski, D W; Schummer, M; Haussler, D

    2000-10-01

    DNA microarray experiments, generating thousands of gene expression measurements, are being used to gather information from tissue and cell samples regarding gene expression differences that will be useful in diagnosing disease. We have developed a new method to analyse this kind of data using support vector machines (SVMs). This analysis consists of both classification of the tissue samples and an exploration of the data for mis-labeled or questionable tissue results. We demonstrate the method in detail on samples consisting of ovarian cancer tissues, normal ovarian tissues, and other normal tissues. The dataset consists of expression experiment results for 97,802 cDNAs for each tissue. As a result of computational analysis, a tissue sample is discovered and confirmed to be wrongly labeled. Upon correction of this mistake and the removal of an outlier, perfect classification of tissues is achieved, but not with high confidence. We identify and analyse a subset of genes from the ovarian dataset whose expression is highly differentiated between the types of tissues. To show the robustness of the SVM method, two previously published datasets from other types of tissues or cells are analysed. The results are comparable to those previously obtained. We show that other machine learning methods also perform comparably to the SVM on many of those datasets. The SVM software is available at http://www.cs.columbia.edu/~bgrundy/svm.

  7. Learning features for tissue classification with the classification restricted Boltzmann machine

    DEFF Research Database (Denmark)

    van Tulder, Gijs; de Bruijne, Marleen

    2014-01-01

    Performance of automated tissue classification in medical imaging depends on the choice of descriptive features. In this paper, we show how restricted Boltzmann machines (RBMs) can be used to learn features that are especially suited for texture-based tissue classification. We introduce the convolutional classification RBM, whose learned features outperform conventional RBM-based feature learning, which is unsupervised and uses only a generative learning objective, as well as often-used filter banks. We show that a mixture of generative and discriminative learning can produce filters that give a higher classification accuracy.

  8. Robust classification of neonatal apnoea-related desaturations

    International Nuclear Information System (INIS)

    Monasterio, Violeta; Burgess, Fred; Clifford, Gari D

    2012-01-01

    Respiratory signals monitored in neonatal intensive care units are usually ignored due to the high prevalence of noise and false alarms (FA). Apnoeic events are therefore generally indicated by a pulse oximeter alarm reacting to the subsequent desaturation. However, the high FA rate in the photoplethysmogram may desensitize staff, reducing their reaction speed. The main reason for the high FA rates of critical care monitors is their unimodal analysis behaviour. In this work, we propose a multimodal analysis framework to reduce the FA rate in neonatal apnoea monitoring. Information about oxygen saturation, heart rate, respiratory rate and signal quality was extracted from the electrocardiogram, impedance pneumogram and photoplethysmographic signals, for a total of 20 features in the 5 min interval before a desaturation event. 1616 desaturation events from 27 neonatal admissions were annotated by two independent reviewers as true (physiologically relevant) or false (noise-related). Patients were divided into two independent groups for training and validation, and a support vector machine was trained to classify the events as true or false. The best classification performance was achieved with a combination of 13 features, with sensitivity, specificity and accuracy of 100% on the training set, and a sensitivity of 86%, a specificity of 91% and an accuracy of 90% on the validation set.

  9. Implementation of several mathematical algorithms to breast tissue density classification

    International Nuclear Information System (INIS)

    Quintana, C.; Redondo, M.; Tirao, G.

    2014-01-01

    The accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, as dense breast tissue can hide lesions, causing cancer to be detected at later stages. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. This paper presents the implementation and performance of different mathematical algorithms designed to standardize the categorization of mammographic images according to the American College of Radiology classifications. These mathematical techniques are based on intrinsic property calculations and on comparison with an ideal homogeneous image (joint entropy, mutual information, normalized cross-correlation and index Q) as categorization parameters. The algorithms were evaluated on 100 cases from the mammographic data sets provided by the Ministerio de Salud de la Provincia de Córdoba, Argentina—Programa de Prevención del Cáncer de Mama (Department of Public Health, Córdoba, Argentina, Breast Cancer Prevention Program). The obtained breast classifications were compared with expert medical diagnoses, showing good performance. The implemented algorithms revealed high potential to classify breasts into tissue density categories. - Highlights: • Breast density classification can be obtained by suitable mathematical algorithms. • Mathematical processing helps radiologists obtain the BI-RADS classification. • Entropy and joint entropy show high performance for density classification.
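
    A worked example of two of the categorization parameters named above, joint entropy and mutual information between an image and an ideal homogeneous reference, estimated from a joint gray-level histogram:

      import numpy as np

      def joint_entropy_mi(a, b, bins=64):
          h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
          p = h / h.sum()
          nz = p > 0
          hj = -(p[nz] * np.log2(p[nz])).sum()          # joint entropy H(A,B)
          pa, pb = p.sum(axis=1), p.sum(axis=0)
          ha = -(pa[pa > 0] * np.log2(pa[pa > 0])).sum()
          hb = -(pb[pb > 0] * np.log2(pb[pb > 0])).sum()
          return hj, ha + hb - hj                       # MI = H(A)+H(B)-H(A,B)

      rng = np.random.default_rng(9)
      patch = rng.normal(0.5, 0.15, size=(256, 256))    # stand-in mammogram patch
      reference = np.full_like(patch, 0.5)              # ideal homogeneous image
      print(joint_entropy_mi(patch, reference))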

  10. Automated Tissue Classification Framework for Reproducible Chronic Wound Assessment

    Directory of Open Access Journals (Sweden)

    Rashmi Mukherjee

    2014-01-01

    The aim of this paper was to develop a computer-assisted tissue classification (granulation, necrotic, and slough) scheme for chronic wound (CW) evaluation using medical image processing and statistical machine learning techniques. The red-green-blue (RGB) wound images, captured with a normal digital camera, were first transformed into the HSI (hue, saturation, and intensity) color space, and the "S" component of the HSI color channels was selected as it provided higher contrast. Wound areas from 6 different types of CW were segmented from whole images using fuzzy-divergence-based thresholding, by minimizing edge ambiguity. A set of color and textural features describing granulation, necrotic, and slough tissues in the segmented wound area were extracted using various mathematical techniques. Finally, statistical learning algorithms, namely Bayesian classification and the support vector machine (SVM), were trained and tested for wound tissue classification in different CW images. The performance of the wound area segmentation protocol was further validated against ground truth images labeled by clinical experts. It was observed that the SVM with a 3rd-order polynomial kernel provided the highest accuracies, that is, 86.94%, 90.47%, and 75.53% for classifying granulation, slough, and necrotic tissues, respectively. The proposed automated tissue classification technique achieved the highest overall accuracy, that is, 87.61%, with the highest kappa statistic value (0.793).
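
    A hedged sketch of the front of this pipeline: take the saturation channel (HSV here as a stand-in for HSI), threshold it to obtain a wound mask, and train a 3rd-order polynomial SVM on per-pixel features; Otsu thresholding replaces the paper's fuzzy-divergence thresholding, and the features and labels are toy placeholders:

      import numpy as np
      from skimage.color import rgb2hsv
      from skimage.filters import threshold_otsu
      from sklearn.svm import SVC

      rng = np.random.default_rng(10)
      rgb = rng.random((64, 64, 3))
      s = rgb2hsv(rgb)[..., 1]                   # "S" channel, higher contrast
      mask = s > threshold_otsu(s)               # wound-area segmentation stand-in

      feats = rgb[mask]                          # per-pixel RGB as toy features
      labels = rng.integers(0, 3, size=feats.shape[0])  # granulation/slough/necrotic
      svm = SVC(kernel="poly", degree=3).fit(feats, labels)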

  11. FAST AND ROBUST SEGMENTATION AND CLASSIFICATION FOR CHANGE DETECTION IN URBAN POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    X. Roynard

    2016-06-01

    Change detection is an important issue in city monitoring to analyse street furniture, road works, car parking, etc. For example, parking surveys are needed but are currently a laborious task involving sending operators into the streets to identify the changes in car locations. In this paper, we propose a method that performs fast and robust segmentation and classification of urban point clouds, which can be used for change detection. We apply this method to detect cars, as a particular object class, in order to perform parking surveys automatically. A recently proposed method already addresses the need for fast segmentation and classification of urban point clouds, using elevation images. The advantage of working on images is that processing is much faster, proven and robust. However, there may be a loss of information in complex 3D cases: for example, when objects are one above the other, typically a car under a tree or a pedestrian under a balcony. In this paper we propose a method that retains the three-dimensional information while preserving fast computation times and improving segmentation and classification accuracy. It is based on fast region growing using an octree for the segmentation, and on specific descriptors with Random Forest for the classification. Experiments have been performed on large urban point clouds acquired by Mobile Laser Scanning. They show that the method is as fast as the state of the art, and that it gives more robust results in complex 3D cases.
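
    A schematic of the two stages on a voxelized cloud, with connected-component labeling on an occupancy grid as a simple stand-in for the octree-based region growing, and a Random Forest over crude per-segment descriptors; sizes, features, and labels are all placeholders:

      import numpy as np
      from scipy import ndimage
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(11)
      pts = rng.random((2000, 3)) * [25.0, 25.0, 5.0]   # toy urban points (meters)
      voxel = 0.5
      shape = (50, 50, 10)
      grid = np.zeros(shape, dtype=bool)
      idx = np.minimum((pts / voxel).astype(int), np.array(shape) - 1)
      grid[tuple(idx.T)] = True

      seg, n_seg = ndimage.label(grid)           # connected-component "region growing"

      def descriptor(k):                         # crude per-segment features
          zs = np.nonzero(seg == k)[2]
          return [zs.size, zs.mean(), zs.max() - zs.min()]

      X = np.array([descriptor(k) for k in range(1, n_seg + 1)])
      y = rng.integers(0, 2, size=n_seg)         # placeholder labels, e.g. car / other
      rf = RandomForestClassifier(n_estimators=100).fit(X, y)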

  12. Fast and Robust Segmentation and Classification for Change Detection in Urban Point Clouds

    Science.gov (United States)

    Roynard, X.; Deschaud, J.-E.; Goulette, F.

    2016-06-01

    Change detection is an important issue in city monitoring to analyse street furniture, road works, car parking, etc. For example, parking surveys are needed but are currently a laborious task involving sending operators into the streets to identify the changes in car locations. In this paper, we propose a method that performs fast and robust segmentation and classification of urban point clouds, which can be used for change detection. We apply this method to detect cars, as a particular object class, in order to perform parking surveys automatically. A recently proposed method already addresses the need for fast segmentation and classification of urban point clouds, using elevation images. The advantage of working on images is that processing is much faster, proven and robust. However, there may be a loss of information in complex 3D cases: for example, when objects are one above the other, typically a car under a tree or a pedestrian under a balcony. In this paper we propose a method that retains the three-dimensional information while preserving fast computation times and improving segmentation and classification accuracy. It is based on fast region growing using an octree for the segmentation, and on specific descriptors with Random Forest for the classification. Experiments have been performed on large urban point clouds acquired by Mobile Laser Scanning. They show that the method is as fast as the state of the art, and that it gives more robust results in complex 3D cases.

  13. A classification of the mechanisms producing pathological tissue changes.

    Science.gov (United States)

    Grippo, John O; Oh, Daniel S

    2013-05-01

    The objectives are to present a classification of the mechanisms that can produce pathological changes in body tissues and fluids, and to clarify and define the term biocorrosion, which has hitherto had a singular use in engineering. Considering the emerging field of biomedical engineering, it is essential to use precise definitions in the lexicons of engineering, bioengineering and related sciences such as medicine, dentistry and veterinary medicine. The mechanisms of stress, friction and biocorrosion and their pathological effects on tissues are described. Biocorrosion refers to the chemical, biochemical and electrochemical changes, by degradation or induced growth, of living body tissues and fluids. The various agents that can affect living tissues, causing biocorrosion, are enumerated; these support the necessity for, and justify the use of, this encompassing and more precise definition of biocorrosion. A distinction is made between the mechanisms of corrosion and biocorrosion.

  14. Interactive classification and content-based retrieval of tissue images

    Science.gov (United States)

    Aksoy, Selim; Marchisio, Giovanni B.; Tusk, Carsten; Koperski, Krzysztof

    2002-11-01

    We describe a system for interactive classification and retrieval of microscopic tissue images. Our system models tissues at the pixel, region and image levels. Pixel-level features are generated using unsupervised clustering of color and texture values. Region-level features include shape information and statistics of pixel-level feature values. Image-level features include statistics and spatial relationships of regions. To reduce the gap between low-level features and high-level expert knowledge, we define the concept of prototype regions. The system learns the prototype regions in an image collection using model-based clustering and density estimation. Different tissue types are modeled using the spatial relationships of these regions, represented by fuzzy membership functions. The system automatically selects significant relationships from training data and builds models, which can also be updated using user relevance feedback. A Bayesian framework is used to classify tissues based on these models. Preliminary experiments show that the spatial relationship models we developed provide a flexible and powerful framework for classification and retrieval of tissue images.

  15. Automated Detection of Connective Tissue by Tissue Counter Analysis and Classification and Regression Trees

    Directory of Open Access Journals (Sweden)

    Josef Smolle

    2001-01-01

    Objective: To evaluate the feasibility of the CART (Classification and Regression Tree) procedure for the recognition of microscopic structures in tissue counter analysis. Methods: Digital microscopic images of H&E-stained slides of normal human skin and of primary malignant melanoma were overlaid with regularly distributed square measuring masks (elements), and grey value, texture and colour features within each mask were recorded. In the learning set, elements were interactively labeled as representing either connective tissue of the reticular dermis, other tissue components, or background. Subsequently, CART models were based on these data sets. Results: Implementation of the CART classification rules into the image analysis program showed that, in an independent test set, 94.1% of elements classified as connective tissue of the reticular dermis were correctly labeled. Automated measurements of the total amount of tissue and of the amount of connective tissue within a slide showed high reproducibility (r = 0.97 and r = 0.94, respectively; p < 0.001). Conclusions: The CART procedure in tissue counter analysis yields simple and reproducible classification rules for tissue elements.
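
    An illustrative tissue-counter setup: tile the image with square elements, record simple gray-value features per element, and fit a CART model (scikit-learn's DecisionTreeClassifier implements CART); the labels here are random placeholders for the interactive labeling step:

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(12)
      image = rng.random((256, 256))
      size = 32                                     # element (mask) edge length

      tiles = [image[r:r + size, c:c + size]
               for r in range(0, 256, size) for c in range(0, 256, size)]
      X = np.array([[t.mean(), t.std(), np.median(t)] for t in tiles])
      y = rng.integers(0, 3, size=len(X))  # connective tissue / other / background

      cart = DecisionTreeClassifier(max_depth=4).fit(X, y)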

  16. Implementation of several mathematical algorithms to breast tissue density classification

    Science.gov (United States)

    Quintana, C.; Redondo, M.; Tirao, G.

    2014-02-01

    The accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, as dense breast tissue can hide lesions, causing cancer to be detected at later stages. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. This paper presents the implementation and performance of different mathematical algorithms designed to standardize the categorization of mammographic images according to the American College of Radiology classifications. These mathematical techniques are based on intrinsic property calculations and on comparison with an ideal homogeneous image (joint entropy, mutual information, normalized cross-correlation and index Q) as categorization parameters. The algorithms were evaluated on 100 cases from the mammographic data sets provided by the Ministerio de Salud de la Provincia de Córdoba, Argentina—Programa de Prevención del Cáncer de Mama (Department of Public Health, Córdoba, Argentina, Breast Cancer Prevention Program). The obtained breast classifications were compared with expert medical diagnoses, showing good performance. The implemented algorithms revealed high potential to classify breasts into tissue density categories.

  17. Towards precise classification of cancers based on robust gene functional expression profiles

    Directory of Open Access Journals (Sweden)

    Zhu Jing

    2005-03-01

    Background: Development of robust and efficient methods for analyzing and interpreting high-dimension gene expression profiles continues to be a focus in computational biology. The accumulated experimental evidence supports the assumption that genes express and perform their functions in modular fashion in cells. Therefore, there is an open space for development of timely and relevant computational algorithms that use robust functional expression profiles towards precise classification of complex human diseases at the modular level. Results: Inspired by the insight that genes act as modules to carry out highly integrated cellular functions, we define a low-dimension functional expression profile for data reduction. After annotating each individual gene to functional categories defined in a proper gene function classification system, such as the Gene Ontology applied in this study, we identify those functional categories enriched with differentially expressed genes. For each functional category, or functional module, we compute a summary measure (s) of the raw expression values of the annotated genes to capture the overall activity level of the module. In this way, we can treat the gene expressions within a functional module as an integrative data point to replace the multiple values of individual genes. We compare the classification performance of decision trees based on functional expression profiles with that of conventional gene expression profiles using four publicly available datasets, which indicates that precise classification of tumour types and improved interpretation can be achieved with the reduced functional expression profiles. Conclusion: This modular approach is demonstrated to be a powerful alternative for analyzing high-dimension microarray data and is robust to the high measurement noise and intrinsic biological variance inherent in microarray data. Furthermore, efficient integration with current biological knowledge…
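
    A toy version of the module-level reduction described: each functional module is replaced by one summary value s per sample (a mean z-score here, which is an assumption about the summary measure), and a decision tree is fit on the reduced profile:

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(16)
      n_samples, n_genes, n_modules = 80, 1000, 25
      X = rng.normal(size=(n_samples, n_genes))
      module_of_gene = rng.integers(0, n_modules, size=n_genes)
      y = rng.integers(0, 2, size=n_samples)

      Z = (X - X.mean(axis=0)) / X.std(axis=0)           # per-gene z-scores
      S = np.stack([Z[:, module_of_gene == m].mean(axis=1)
                    for m in range(n_modules)], axis=1)  # one s per module

      tree = DecisionTreeClassifier(max_depth=3).fit(S, y)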

  18. Neutral face classification using personalized appearance models for fast and robust emotion detection.

    Science.gov (United States)

    Chiranjeevi, Pojala; Gopalakrishnan, Viswanath; Moogi, Pratibha

    2015-09-01

    Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, and so on, in a limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral the majority of the time in usual applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage and thereby bypassing those frames from emotion classification saves computational power. In this paper, we propose a light-weight neutral-versus-emotion classification engine, which acts as a pre-processor to traditional supervised emotion classification approaches. It dynamically learns the neutral appearance at key emotion (KE) points using a statistical texture model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the statistical texture model. Robustness to dynamic shifts of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. As a result, the proposed method improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.

  19. Improving Cross-Day EEG-Based Emotion Classification Using Robust Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Yuan-Pin Lin

    2017-07-01

    Constructing a robust emotion-aware analytical framework using non-invasively recorded electroencephalogram (EEG) signals has gained intensive attention in recent years. However, in moving from laboratory-oriented proof-of-concept studies toward real-world applications, researchers face an ecological challenge: the EEG patterns recorded in real life change substantially across days (i.e., day-to-day variability), arguably making a pre-defined predictive model vulnerable to the EEG signals of a separate day. The present work addressed how to mitigate the inter-day EEG variability of emotional responses, with an attempt to facilitate cross-day emotion classification, which has received little attention in the literature. This study proposed a robust principal component analysis (RPCA)-based signal filtering strategy and validated its neurophysiological validity and machine-learning practicability on a binary emotion classification task (happiness vs. sadness), using a five-day EEG dataset of 12 subjects who participated in a music-listening task. The empirical results showed that the RPCA-decomposed sparse signals (RPCA-S) enabled filtering off the background EEG activity that contributed more to the inter-day variability, and predominately captured the EEG oscillations of emotional responses that behaved relatively consistently across days. Applying a realistic add-day-in classification validation scheme, the RPCA-S progressively exploited more informative features (from 12.67 ± 5.99 to 20.83 ± 7.18) and improved the cross-day binary emotion-classification accuracy (from 58.31 ± 12.33% to 64.03 ± 8.40%) as the EEG signals from one to four recording days were used for training and tested against one unseen subsequent day. The original EEG features (prior to RPCA processing) neither achieved cross-day classification (the accuracy was around chance level) nor replicated the encouraging improvement due to the inter-day EEG variability. This result…
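
    A minimal principal component pursuit (inexact ALM) implementation separating a matrix into a low-rank part L and a sparse part S, as in the RPCA decomposition described; the parameter defaults below are common choices, not the paper's settings:

      import numpy as np

      def rpca(M, max_iter=500, tol=1e-7):
          m, n = M.shape
          lam = 1.0 / np.sqrt(max(m, n))
          Y = M / max(np.linalg.norm(M, 2), np.abs(M).max() / lam)
          mu, rho = 1.25 / np.linalg.norm(M, 2), 1.5
          S = np.zeros_like(M)
          for _ in range(max_iter):
              # L update: singular value thresholding of M - S + Y/mu.
              U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
              L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
              # S update: elementwise soft thresholding.
              T = M - L + Y / mu
              S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
              Z = M - L - S
              Y += mu * Z
              mu *= rho
              if np.linalg.norm(Z, "fro") <= tol * np.linalg.norm(M, "fro"):
                  break
          return L, S

      rng = np.random.default_rng(13)
      low = rng.normal(size=(60, 8)) @ rng.normal(size=(8, 100))   # rank-8 part
      sparse = (rng.random((60, 100)) < 0.05) * rng.normal(0, 5, (60, 100))
      L, S = rpca(low + sparse)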

  20. Neonatal Brain Tissue Classification with Morphological Adaptation and Unified Segmentation

    Directory of Open Access Journals (Sweden)

    Richard Beare

    2016-03-01

    Measuring the distribution of brain tissue types (tissue classification) in neonates is necessary for studying typical and atypical brain development, such as that associated with preterm birth, and may provide biomarkers for neurodevelopmental outcomes. Compared with magnetic resonance images of adults, neonatal images present specific challenges that require the development of specialized, population-specific methods. This paper introduces MANTiS (Morphologically Adaptive Neonatal Tissue Segmentation), which extends the unified segmentation approach to tissue classification implemented in the Statistical Parametric Mapping (SPM) software to neonates. MANTiS utilizes a combination of unified segmentation, template adaptation via morphological segmentation tools, and topological filtering to segment the neonatal brain into eight tissue classes: cortical gray matter, white matter, deep nuclear gray matter, cerebellum, brainstem, cerebrospinal fluid (CSF), hippocampus and amygdala. We evaluated the performance of MANTiS using two independent datasets. The first dataset, provided by the NeoBrainS12 challenge, consisted of coronal T2-weighted images of preterm infants (born ≤30 weeks' gestation) acquired at 30 weeks' corrected gestational age (n = 5), coronal T2-weighted images of preterm infants acquired at 40 weeks' corrected gestational age (n = 5), and axial T2-weighted images of preterm infants acquired at 40 weeks' corrected gestational age (n = 5). The second dataset, provided by the Washington University NeuroDevelopmental Research (WUNDeR) group, consisted of T2-weighted images of preterm infants (born <30 weeks' gestation) acquired shortly after birth (n = 12), preterm infants acquired at term-equivalent age (n = 12), and healthy term-born infants (born ≥38 weeks' gestation) acquired within the first nine days of life (n = 12). For the NeoBrainS12 dataset, mean Dice scores comparing MANTiS with manual segmentations were all above 0.7, except for…

  1. Extreme Sparse Multinomial Logistic Regression: A Fast and Robust Framework for Hyperspectral Image Classification

    Science.gov (United States)

    Cao, Faxian; Yang, Zhijing; Ren, Jinchang; Ling, Wing-Kuen; Zhao, Huimin; Marshall, Stephen

    2017-12-01

    Although sparse multinomial logistic regression (SMLR) has provided a useful tool for sparse classification, it suffers from inefficacy in dealing with high-dimensional features and manually set initial regressor values. This has significantly constrained its applications for hyperspectral image (HSI) classification. In order to tackle these two drawbacks, an extreme sparse multinomial logistic regression (ESMLR) is proposed for effective classification of HSI. First, the HSI dataset is projected into a new feature space with randomly generated weights and biases. Second, an optimization model is established via the Lagrange multiplier method and the dual principle to automatically determine a good initial regressor for SMLR by minimizing the training error and the regressor value. Furthermore, extended multi-attribute profiles (EMAPs) are utilized for extracting both spectral and spatial features. A combinational linear multiple features learning (MFL) method is proposed to further enhance the features extracted by ESMLR and EMAPs. Finally, logistic regression via variable splitting and augmented Lagrangian (LORSAL) is adopted in the proposed framework to reduce the computational time. Experiments conducted on two well-known HSI datasets, namely the Indian Pines dataset and the Pavia University dataset, demonstrate the fast and robust performance of the proposed ESMLR framework.
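
    A rough sketch of the randomized front end the abstract describes: features are projected through a random hidden layer, and a multinomial logistic regression (scikit-learn here, not SMLR/LORSAL) is fit on the projected space; all sizes are toy assumptions:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(14)
      X = rng.normal(size=(400, 103))            # toy hyperspectral pixels x bands
      y = rng.integers(0, 9, size=400)

      W = rng.normal(size=(103, 256))            # randomly generated weights
      b = rng.normal(size=256)                   # randomly generated bias
      H = np.tanh(X @ W + b)                     # projected feature space

      clf = LogisticRegression(max_iter=1000).fit(H, y)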

  2. Effects of MR image compression on tissue classification quality

    International Nuclear Information System (INIS)

    Santalla, H; Meschino, G; Ballarin, V

    2007-01-01

    It is known that image compression is required to optimize storage in memory. Moreover, transmission speed can be significantly improved. Lossless compression is used without controversy in medicine, though its benefits are limited. If we compress images lossily, the image cannot be totally recovered; we can only recover an approximation. At this point, the definition of 'quality' is essential. What do we understand by 'quality'? How can we evaluate a compressed image? Quality in images is an attribute with several definitions and interpretations, which actually depend on the use we subsequently want to give them. This work proposes a quantitative analysis of quality for lossy compressed Magnetic Resonance (MR) images, and of its influence on automatic tissue classification accomplished with these images.

  3. Robust

    DEFF Research Database (Denmark)

    2017-01-01

    'Robust – Reflections on Resilient Architecture' is a scientific publication following the conference of the same name in November 2017. Researchers and PhD fellows associated with the master's programme Cultural Heritage, Transformation and Restoration (Transformation) at The Royal Danish...

  4. Classification of coronary artery tissues using optical coherence tomography imaging in Kawasaki disease

    Science.gov (United States)

    Abdolmanafi, Atefeh; Prasad, Arpan Suravi; Duong, Luc; Dahdah, Nagib

    2016-03-01

    Intravascular imaging modalities such as Optical Coherence Tomography (OCT) nowadays allow improved diagnosis, treatment, follow-up, and even prevention of coronary artery disease in adults. OCT has recently been used in children following Kawasaki disease (KD), the most prevalent acquired coronary artery disease during childhood, with devastating complications. The assessment of coronary artery layers with OCT and early detection of coronary sequelae secondary to KD is a promising tool for preventing myocardial infarction in this population. More importantly, OCT is promising for tissue quantification of the inner vessel wall, including neo-intimal luminal myofibroblast proliferation, calcification, and fibrous scar deposits. The goal of this study is to classify the coronary artery layers in OCT imaging obtained from a series of KD patients. Our approach is focused on developing a robust Random Forest classifier, built on the idea of randomly selecting a subset of features at each node, and based on second- and higher-order statistical texture analysis, which estimates the gray-level spatial distribution of images by specifying the local features of each pixel and extracting statistics from their distribution. The average classification accuracies for the intima and media are 76.36% and 73.72%, respectively. A random forest classifier with texture analysis thus shows promise for the classification of coronary artery tissue.
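
    A hedged sketch of this general approach, second-order (GLCM) texture features feeding a random forest that samples a feature subset at each node, is given below. It assumes scikit-image's graycomatrix/graycoprops API (spelled greycomatrix in older releases) and uses placeholder patches and labels rather than OCT data.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

def glcm_features(patch):
    """Second-order texture statistics from a grey-level co-occurrence matrix."""
    glcm = graycomatrix(patch, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

# Placeholder uint8 patches and tissue labels (e.g. 0 = intima, 1 = media).
rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(100, 32, 32), dtype=np.uint8)
labels = rng.integers(0, 2, size=100)

X = np.array([glcm_features(p) for p in patches])
forest = RandomForestClassifier(n_estimators=200, max_features="sqrt",
                                random_state=0)  # random feature subset per node
forest.fit(X, labels)
```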

  5. Quantum Cascade Laser-Based Infrared Microscopy for Label-Free and Automated Cancer Classification in Tissue Sections.

    Science.gov (United States)

    Kuepper, Claus; Kallenbach-Thieltges, Angela; Juette, Hendrik; Tannapfel, Andrea; Großerueschkamp, Frederik; Gerwert, Klaus

    2018-05-16

    A feasibility study using a quantum cascade laser-based infrared microscope for the rapid and label-free classification of colorectal cancer tissues is presented. Infrared imaging is a reliable, robust, automated, and operator-independent tissue classification method that has been used for the differential classification of tissue thin sections, identifying tumorous regions. However, the long acquisition times of the FT-IR-based microscopes used so far have hampered the clinical translation of this technique. Here, the quantum cascade laser-based microscope provides infrared images for precise tissue classification within a few minutes. We analyzed 110 patients with UICC-Stage II and III colorectal cancer, showing 96% sensitivity and 100% specificity of this label-free method as compared to histopathology, the gold standard in routine clinical diagnostics. The main hurdle for the clinical translation of IR imaging is now overcome by the short acquisition time for high-quality diagnostic images, which is in the same time range as frozen-section analysis by pathologists.

  6. Automatic classification of tissue malignancy for breast carcinoma diagnosis.

    Science.gov (United States)

    Fondón, Irene; Sarmiento, Auxiliadora; García, Ana Isabel; Silvestre, María; Eloy, Catarina; Polónia, António; Aguiar, Paulo

    2018-05-01

    Breast cancer is the second leading cause of cancer death among women. Its early diagnosis is extremely important to prevent avoidable deaths. However, malignancy assessment of tissue biopsies is complex and dependent on observer subjectivity. Moreover, hematoxylin and eosin (H&E)-stained histological images exhibit a highly variable appearance, even within the same malignancy level. In this paper, we propose a computer-aided diagnosis (CAD) tool for automated malignancy assessment of breast tissue samples based on the processing of histological images. We provide four malignancy levels as the output of the system: normal, benign, in situ and invasive. The method is based on the calculation of three sets of features related to nuclei, colour regions and textures considering local characteristics and global image properties. By taking advantage of well-established image processing techniques, we build a feature vector for each image that serves as an input to an SVM (Support Vector Machine) classifier with a quadratic kernel. The method has been rigorously evaluated, first with a 5-fold cross-validation within an initial set of 120 images, second with an external set of 30 different images and third with images with artefacts included. Accuracy levels range from 75.8% when the 5-fold cross-validation was performed to 75% with the external set of new images and 61.11% when the extremely difficult images were added to the classification experiment. The experimental results indicate that the proposed method is capable of distinguishing between four malignancy levels with high accuracy. Our results are close to those obtained with recent deep learning-based methods. Moreover, it performs better than other state-of-the-art methods based on feature extraction, and it can help improve the CAD of breast cancer. Copyright © 2018 Elsevier Ltd. All rights reserved.
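
    A quadratic kernel corresponds to a polynomial kernel of degree 2, so a classifier of the kind described here can be sketched with scikit-learn as below; the feature vectors and labels are synthetic placeholders, not the paper's nuclei, colour-region, and texture features.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Placeholder feature vectors and the four malignancy levels:
# 0 = normal, 1 = benign, 2 = in situ, 3 = invasive.
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 30))
y = rng.integers(0, 4, size=120)

# Quadratic kernel == polynomial kernel of degree 2.
svm = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=2, C=1.0))
scores = cross_val_score(svm, X, y, cv=5)   # 5-fold CV as in the paper
print("mean CV accuracy:", scores.mean())
```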

  7. Robust volume assessment of brain tissues for 3-dimensional fourier transformation MRI via a novel multispectral technique.

    Directory of Open Access Journals (Sweden)

    Jyh-Wen Chai

    Full Text Available A new TRIO algorithm integrating three different algorithms is proposed to perform brain MRI segmentation in the native coordinate space, with no need for transformation to a standard coordinate space or for probability maps for segmentation. The method is a simple voxel-based algorithm, derived from multispectral remote sensing techniques, and only requires minimal operator input to depict GM, WM, and CSF tissue clusters to complete classification of 3D high-resolution multislice multispectral MRI data. Results showed very high accuracy and reproducibility in the classification of GM, WM, and CSF in multislice multispectral synthetic MRI data. The similarity indexes, expressing overlap between classification results and the ground truth, were 0.951, 0.962, and 0.956 for GM, WM, and CSF classification in image data with a 3% noise level and 0% intensity non-uniformity. The method performs particularly well for CSF, with accuracy, sensitivity, and specificity of 0.994, 0.961, and 0.996 in image data with a 3% noise level and 0% intensity non-uniformity, a tissue class that has seldom been classified well in previous studies. As for clinical MRI data, the quantitative brain tissue volumes aligned closely with brain morphometrics in three different study groups of young adults, elderly volunteers, and dementia patients. The results also showed very low rates of intra- and inter-operator variability in measurements of the absolute volumes and volume fractions of cerebral GM, WM, and CSF in the three study groups. The mean coefficients of variation of GM, WM, and CSF volume measurements were in the range of 0.03% to 0.30% for intra-operator measurements and 0.06% to 0.45% for inter-operator measurements. In conclusion, the TRIO algorithm exhibits a remarkable ability in robust classification of multislice multispectral brain MR images, which would be potentially applicable for clinical brain volumetric analysis and explicitly promising

  8. Data-driven classification of ventilated lung tissues using electrical impedance tomography

    International Nuclear Information System (INIS)

    Gómez-Laberge, Camille; Hogan, Matthew J; Elke, Gunnar; Weiler, Norbert; Frerichs, Inéz; Adler, Andy

    2011-01-01

    Current methods for identifying ventilated lung regions utilizing electrical impedance tomography images rely on dividing the image into arbitrary regions of interest (ROI), manually delineating ROI, or forming ROI with pixels whose signal properties surpass an arbitrary threshold. In this paper, we propose a novel application of a data-driven classification method to identify ventilated lung ROI based on forming k clusters from pixels with correlated signals. A standard first-order model for lung mechanics is then applied to determine which ROI correspond to ventilated lung tissue. We applied the method in an experimental study of 16 mechanically ventilated swine in the supine position, which underwent changes in positive end-expiratory pressure (PEEP) and fraction of inspired oxygen (FiO2). In each stage of the experimental protocol, the method performed best with k = 4 and consistently identified 3 lung tissue ROI and 1 boundary tissue ROI in 15 of the 16 subjects. When testing for changes from baseline in lung position, tidal volume, and respiratory system compliance, we found that PEEP displaced the ventilated lung region dorsally by 2 cm, decreased tidal volume by 1.3%, and increased the respiratory system compliance time constant by 0.3 s. FiO2 decreased tidal volume by 0.7%. All effects were tested at p < 0.05 with n = 16. These findings suggest that the proposed ROI detection method is robust and sensitive to ventilation dynamics in the experimental setting
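
    The clustering step can be illustrated with a generic sketch: normalise each pixel's impedance time course so that Euclidean distance reflects signal correlation, then form k = 4 clusters. This assumes scikit-learn and synthetic signals; the first-order lung-mechanics fit that decides which clusters are ventilated lung is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans

# Placeholder EIT data: one impedance time series per image pixel.
rng = np.random.default_rng(0)
n_pixels, n_frames = 1024, 300
signals = rng.standard_normal((n_pixels, n_frames))

# For z-scored rows, squared Euclidean distance is proportional to
# (1 - correlation), so k-means groups pixels with correlated signals.
z = signals - signals.mean(axis=1, keepdims=True)
z /= z.std(axis=1, keepdims=True)

roi_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(z)
# roi_labels assigns each pixel to one of k = 4 candidate ROI; a first-order
# lung-mechanics model would then decide which ROI are ventilated lung.
```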

  9. A Robust and Device-Free System for the Recognition and Classification of Elderly Activities.

    Science.gov (United States)

    Li, Fangmin; Al-Qaness, Mohammed Abdulaziz Aide; Zhang, Yong; Zhao, Bihai; Luan, Xidao

    2016-12-01

    Human activity recognition, tracking and classification is an essential trend in assisted living systems that can help support elderly people with their daily activities. Traditional activity recognition approaches depend on vision-based or sensor-based techniques. Nowadays, a novel promising technique has obtained more attention, namely device-free human activity recognition, which neither requires the target object to wear or carry a device nor cameras to be installed in the perceived area. The device-free technique for activity recognition uses only the signals of common wireless local area network (WLAN) devices available everywhere. In this paper, we present a novel elderly activity recognition system that leverages the fluctuation of wireless signals caused by human motion. We present an efficient method to select the correct data from the Channel State Information (CSI) streams that were neglected in previous approaches. We apply a Principal Component Analysis method that exposes the useful information in raw CSI. Thereafter, Forest Decision (FD) is adopted to classify the proposed activities, achieving a high accuracy rate. Extensive experiments have been conducted in an indoor environment to test the feasibility of the proposed system with a total of five volunteer users. The evaluation shows that the proposed system is applicable and robust to electromagnetic noise.
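
    A minimal sketch of this processing chain, PCA to expose motion-related structure in raw CSI followed by a decision-forest classifier, is given below, assuming scikit-learn, with a random forest standing in for the paper's Forest Decision classifier and placeholder CSI windows in place of real measurements; the CSI stream selection step is not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Placeholder CSI amplitude windows (one row per activity sample).
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 90))      # e.g. 30 subcarriers x 3 antennas
y = rng.integers(0, 5, size=500)        # five activity classes

# PCA exposes the motion-related structure in raw CSI; a decision-forest
# classifier then labels each activity sample.
model = make_pipeline(PCA(n_components=10),
                      RandomForestClassifier(n_estimators=200, random_state=0))
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```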

  10. A strategy for tissue self-organization that is robust to cellular heterogeneity and plasticity.

    Science.gov (United States)

    Cerchiari, Alec E; Garbe, James C; Jee, Noel Y; Todhunter, Michael E; Broaders, Kyle E; Peehl, Donna M; Desai, Tejal A; LaBarge, Mark A; Thomson, Matthew; Gartner, Zev J

    2015-02-17

    Developing tissues contain motile populations of cells that can self-organize into spatially ordered tissues based on differences in their interfacial surface energies. However, it is unclear how self-organization by this mechanism remains robust when interfacial energies become heterogeneous in either time or space. The ducts and acini of the human mammary gland are prototypical heterogeneous and dynamic tissues comprising two concentrically arranged cell types. To investigate the consequences of cellular heterogeneity and plasticity on cell positioning in the mammary gland, we reconstituted its self-organization from aggregates of primary cells in vitro. We find that self-organization is dominated by the interfacial energy of the tissue-ECM boundary, rather than by differential homo- and heterotypic energies of cell-cell interaction. Surprisingly, interactions with the tissue-ECM boundary are binary, in that only one cell type interacts appreciably with the boundary. Using mathematical modeling and cell-type-specific knockdown of key regulators of cell-cell cohesion, we show that this strategy of self-organization is robust to severe perturbations affecting cell-cell contact formation. We also find that this mechanism of self-organization is conserved in the human prostate. Therefore, a binary interfacial interaction with the tissue boundary provides a flexible and generalizable strategy for forming and maintaining the structure of two-component tissues that exhibit abundant heterogeneity and plasticity. Our model also predicts that mutations affecting binary cell-ECM interactions are catastrophic and could contribute to loss of tissue architecture in diseases such as breast cancer.

  11. Artificial neural net system for interactive tissue classification with MR imaging and image segmentation

    International Nuclear Information System (INIS)

    Clarke, L.P.; Silbiger, M.; Naylor, C.; Brown, K.

    1990-01-01

    This paper reports on the development of interactive methods for MR tissue classification that permit mathematically rigorous methods for three-dimensional image segmentation and automatic organ/tumor contouring, as required for surgical and RTP planning. The authors investigate a number of image-intensity-based tissue-classification methods that make no implicit assumptions about the MR parameters and hence are not limited by the image data set. Similarly, they have trained artificial neural net (ANN) systems for both supervised and unsupervised tissue classification

  12. 75 FR 68972 - Medical Devices; General and Plastic Surgery Devices; Classification of Tissue Adhesive With...

    Science.gov (United States)

    2010-11-10

    [Docket No. FDA-2010-N-0512] Medical Devices; General and Plastic Surgery Devices; Classification of Tissue Adhesive. The notice identifies risks to health for this device type, including adhesive running to unintended areas, wound dehiscence, and adverse tissue reaction and chemical burns, together with corresponding mitigation measures such as biocompatibility and animal testing, clinical studies, and labeling.

  13. Joint learning and weighting of visual vocabulary for bag-of-feature based tissue classification

    KAUST Repository

    Wang, Jim Jing-Yan; Bensmail, Halima; Gao, Xin

    2013-01-01

    Automated classification of tissue types of Region of Interest (ROI) in medical images has been an important application in Computer-Aided Diagnosis (CAD). Recently, bag-of-feature methods which treat each ROI as a set of local features have shown their power in this field. Two important issues of the bag-of-feature strategy for tissue classification are investigated in this paper: the visual vocabulary learning and weighting, which are always considered independently in traditional methods by neglecting the inner relationship between the visual words and their weights.

  14. Mechanically robust cryogels with injectability and bioprinting supportability for adipose tissue engineering.

    Science.gov (United States)

    Qi, Dianjun; Wu, Shaohua; Kuss, Mitchell A; Shi, Wen; Chung, Soonkyu; Deegan, Paul T; Kamenskiy, Alexey; He, Yini; Duan, Bin

    2018-05-26

    Bioengineered adipose tissues have gained increased interest as a promising alternative to autologous tissue flaps and synthetic adipose fillers for soft tissue augmentation and defect reconstruction in the clinic. Although many scaffolding materials and biofabrication methods have been investigated for adipose tissue engineering in the last decades, there are still challenges in recapitulating the appropriate adipose tissue microenvironment, maintaining volume stability, and inducing vascularization to achieve long-term function and integration. In the present research, we fabricated cryogels consisting of methacrylated gelatin, methacrylated hyaluronic acid, and 4-arm poly(ethylene glycol) acrylate (PEG-4A) by using cryopolymerization. The cryogels were repeatedly injectable and stretchable, and the addition of PEG-4A improved the robustness and mechanical properties. The cryogels supported human adipose progenitor cell (HWA) and adipose-derived mesenchymal stromal cell adhesion, proliferation, and adipogenic differentiation and maturation, regardless of the addition of PEG-4A. The HWA-laden cryogels facilitated the co-culture of human umbilical vein endothelial cells (HUVEC) and capillary-like network formation, which in turn also promoted adipogenesis. We further combined cryogels with 3D bioprinting to generate handleable adipose constructs with clinically relevant size. 3D bioprinting enabled the deposition of multiple bioinks onto the cryogels. The bioprinted flap-like constructs had an integrated structure without delamination and supported vascularization. Adipose tissue engineering is promising for the reconstruction of soft tissue defects, but it remains challenging to restore and maintain soft tissue volume and shape and to achieve vascularization and integration. In this study, we fabricated cryogels with mechanical robustness, injectability, and stretchability by using cryopolymerization. The cryogels promoted cell adhesion, proliferation, and adipogenic

  15. Robust immunohistochemical staining of several classes of proteins in tissues subjected to autolysis.

    Science.gov (United States)

    Maleszewski, Joseph; Lu, Jie; Fox-Talbot, Karen; Halushka, Marc K

    2007-06-01

    Despite the common use of immunohistochemistry in autopsy tissues, the stability of most proteins over extended time periods is unknown. The robustness of signal for 16 proteins (MMP1, MMP2, MMP3, MMP9, TIMP1, TIMP2, TIMP3, AGER, MSR, SCARB1, OLR1, CD36, LTF, LGALS3, LYZ, and DDOST) and two measures of advanced glycation end products (AGE, CML) was evaluated. Two formalin-fixed, paraffin-embedded human tissue arrays containing 16 tissues each were created to evaluate 48 hr of autolysis in a warm or cold environment. For these classes of proteins, matrix metalloproteinases and their inhibitors, scavenger receptors, and advanced glycation end product receptors, we saw no systematic diminution of signal intensity during a period of 24 hr. Analysis was performed by two independent observers and confirmed for a subset of proteins by digital analysis and Western blotting. We conclude that these classes of proteins degrade slowly and faithfully maintain their immunohistochemistry characteristics over at least a 24-hr time interval in devitalized tissues. This study supports the use of autopsy tissues with short postmortem intervals for immunohistochemical studies for diseases such as diabetic vascular disease, cancer, Alzheimer's disease, atherosclerosis, and other pathological states. This manuscript contains online supplemental material at http://www.jhc.org. Please visit this article online to view these materials.

  16. Robust cell tracking in epithelial tissues through identification of maximum common subgraphs.

    Science.gov (United States)

    Kursawe, Jochen; Bardenet, Rémi; Zartman, Jeremiah J; Baker, Ruth E; Fletcher, Alexander G

    2016-11-01

    Tracking of cells in live-imaging microscopy videos of epithelial sheets is a powerful tool for investigating fundamental processes in embryonic development. Characterizing cell growth, proliferation, intercalation and apoptosis in epithelia helps us to understand how morphogenetic processes such as tissue invagination and extension are locally regulated and controlled. Accurate cell tracking requires correctly resolving cells entering or leaving the field of view between frames, cell neighbour exchanges, cell removals and cell divisions. However, current tracking methods for epithelial sheets are not robust to large morphogenetic deformations and require significant manual interventions. Here, we present a novel algorithm for epithelial cell tracking, exploiting the graph-theoretic concept of a 'maximum common subgraph' to track cells between frames of a video. Our algorithm does not require the adjustment of tissue-specific parameters, and scales in sub-quadratic time with tissue size. It does not rely on precise positional information, permitting large cell movements between frames and enabling tracking in datasets acquired at low temporal resolution due to experimental constraints such as phototoxicity. To demonstrate the method, we perform tracking on the Drosophila embryonic epidermis and compare cell-cell rearrangements to previous studies in other tissues. Our implementation is open source and generally applicable to epithelial tissues. © 2016 The Authors.
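
    The graph-theoretic core of the method can be illustrated with networkx, whose ISMAGS implementation can enumerate largest common subgraphs between two small cell-adjacency graphs. This is a toy sketch under that API assumption; the authors' own sub-quadratic algorithm is not reproduced here, and ISMAGS search becomes expensive on large graphs.

```python
import networkx as nx
from networkx.algorithms.isomorphism import ISMAGS

# Cell-adjacency graphs for two consecutive frames: nodes are cells,
# edges join neighbouring cells. Frame 2 has lost cell 4 and gained cell 5.
frame1 = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0), (2, 4)])
frame2 = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0), (1, 5)])

# ISMAGS can enumerate maximum common subgraphs; each result is a node
# mapping between frames, i.e. a candidate tracking of cells.
ismags = ISMAGS(frame1, frame2)
for mapping in ismags.largest_common_subgraph():
    print(mapping)
    break  # keep the demo short; real tracking would score all candidates
```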

  17. ATMAD: robust image analysis for Automatic Tissue MicroArray De-arraying.

    Science.gov (United States)

    Nguyen, Hoai Nam; Paveau, Vincent; Cauchois, Cyril; Kervrann, Charles

    2018-04-19

    Over the last two decades, an innovative technology called Tissue Microarray (TMA), which combines multi-tissue and DNA microarray concepts, has been widely used in the field of histology. It consists of a collection of several (up to 1000 or more) tissue samples that are assembled onto a single support - typically a glass slide - according to a design grid (array) layout, in order to allow multiplex analysis by treating numerous samples under identical and standardized conditions. However, during the TMA manufacturing process, the sample positions can be highly distorted from the design grid due to the imprecision when assembling tissue samples and the deformation of the embedding waxes. Consequently, these distortions may lead to severe errors of (histological) assay results when the sample identities are mismatched between the design and its manufactured output. The development of a robust method for de-arraying TMA, which localizes and matches TMA samples with their design grid, is therefore crucial to overcome the bottleneck of this prominent technology. In this paper, we propose an Automatic, fast and robust TMA De-arraying (ATMAD) approach dedicated to images acquired with brightfield and fluorescence microscopes (or scanners). First, tissue samples are localized in the large image by applying a locally adaptive thresholding on the isotropic wavelet transform of the input TMA image. To reduce false detections, a parametric shape model is considered for segmenting ellipse-shaped objects at each detected position. Segmented objects that do not meet the size and the roundness criteria are discarded from the list of tissue samples before being matched with the design grid. Sample matching is performed by estimating the TMA grid deformation under the thin-plate model. Finally, thanks to the estimated deformation, the true tissue samples that were preliminary rejected in the early image processing step are recognized by running a second segmentation step. We

  18. A Robust Method to Generate Mechanically Anisotropic Vascular Smooth Muscle Cell Sheets for Vascular Tissue Engineering.

    Science.gov (United States)

    Backman, Daniel E; LeSavage, Bauer L; Shah, Shivem B; Wong, Joyce Y

    2017-06-01

    In arterial tissue engineering, mimicking native structure and mechanical properties is essential because compliance mismatch can lead to graft failure and further disease. With bottom-up tissue engineering approaches, designing tissue components with proper microscale mechanical properties is crucial to achieve the necessary macroscale properties in the final implant. This study develops a thermoresponsive cell culture platform for growing aligned vascular smooth muscle cell (VSMC) sheets by photografting N-isopropylacrylamide (NIPAAm) onto micropatterned poly(dimethylsiloxane) (PDMS). The grafting process is experimentally and computationally optimized to produce PNIPAAm-PDMS substrates optimal for VSMC attachment. To allow long-term VSMC sheet culture and increase the rate of VSMC sheet formation, PNIPAAm-PDMS surfaces were further modified with 3-aminopropyltriethoxysilane, yielding a robust, thermoresponsive cell culture platform for culturing VSMC sheets. VSMC sheets cultured on patterned thermoresponsive substrates exhibit cellular and collagen alignment in the direction of the micropattern. Mechanical characterization of patterned, single-layer VSMC sheets reveals increased stiffness in the aligned direction compared to the perpendicular direction, whereas nonpatterned cell sheets exhibit no directional dependence. The structural and mechanical anisotropy of aligned, single-layer VSMC sheets makes this platform an attractive microstructural building block for engineering a vascular graft to match the in vivo mechanical properties of native arterial tissue. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Joint learning and weighting of visual vocabulary for bag-of-feature based tissue classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2013-12-01

    Automated classification of tissue types of Region of Interest (ROI) in medical images has been an important application in Computer-Aided Diagnosis (CAD). Recently, bag-of-feature methods which treat each ROI as a set of local features have shown their power in this field. Two important issues of the bag-of-feature strategy for tissue classification are investigated in this paper: the visual vocabulary learning and weighting, which are always considered independently in traditional methods by neglecting the inner relationship between the visual words and their weights. To overcome this problem, we develop a novel algorithm, Joint-ViVo, which learns the vocabulary and visual word weights jointly. A unified objective function based on large margin is defined for learning of both visual vocabulary and visual word weights, and optimized alternately in the iterative algorithm. We test our algorithm on three tissue classification tasks: classifying breast tissue density in mammograms, classifying lung tissue in High-Resolution Computed Tomography (HRCT) images, and identifying brain tissue type in Magnetic Resonance Imaging (MRI). The results show that Joint-ViVo outperforms the state-of-the-art methods on tissue classification problems. © 2013 Elsevier Ltd.
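
    For orientation, the conventional baseline that Joint-ViVo improves upon, a k-means visual vocabulary with uniform word weights followed by a large-margin classifier, can be sketched as below with scikit-learn and placeholder local descriptors; the joint vocabulary/weight optimization itself is not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Placeholder local descriptors: each ROI is a bag of 64 patch features.
rois = [rng.standard_normal((64, 16)) for _ in range(200)]
labels = rng.integers(0, 2, size=200)

# Learn a visual vocabulary from all local features pooled together.
vocab = KMeans(n_clusters=32, n_init=10, random_state=0)
vocab.fit(np.vstack(rois))

def encode(bag):
    """Histogram of visual-word assignments (uniform word weights)."""
    words = vocab.predict(bag)
    hist = np.bincount(words, minlength=32).astype(float)
    return hist / hist.sum()

X = np.array([encode(bag) for bag in rois])
clf = LinearSVC(C=1.0).fit(X, labels)  # large-margin classifier on histograms
```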

  20. The method of diagnosis and classification of the gingival line defects of the teeth hard tissues

    Directory of Open Access Journals (Sweden)

    Olena Bulbuk

    2017-06-01

    Full Text Available In solving the problem of diagnosing and treating defects of the hard dental tissues, the choice of treatment tactics plays a significant role, particularly for defects located at the gingival line of any tooth. This work studies the diagnosis and classification of gingival-line defects of the hard dental tissues, contributing to objective, differentiated diagnostic and therapeutic approaches to the dental treatment of the various clinical variants of defect localization. The objective of the study was to develop an anatomical-functional classification for the differentiated assessment of hard tissue defects in the gingival region, as the basis for applying differentiated diagnostic and therapeutic approaches to the dental treatment of hard tissue defects located at the gingival line of any tooth. Materials and methods: 48 patients with hard tissue defects located at the gingival line were examined. A periodontal probe and X-ray examination were used to assess the extent of gingival-line destruction. Results: As a result of the performed research, a classification of gingival-line defects of the hard tissues was proposed, based on an exponent whose value is an integer expressing, in millimeters, the distance from the epithelial attachment to the bottom of the defect cavity. Conclusions: The proposed classification fills an obvious gap in academic descriptions of hard tissue defects located at the gingival line of any tooth. It also offers the prospect of consensus on differentiated diagnostic and therapeutic approaches for the different clinical variants of localization, and builds a methodological "bridge of continuity" between therapeutic and prosthetic dentistry in the treatment of gingival-line defects of the dental hard tissues.

  1. Mass Spectrometry Imaging for the Classification of Tumor Tissue

    NARCIS (Netherlands)

    Mascini, N.E.

    2016-01-01

    Mass spectrometry imaging (MSI) can detect and identify many different molecules without the need for labeling. In addition, it can provide their spatial distributions as ‘molecular maps’. These features make MSI well suited for studying the molecular makeup of tumor tissue. Currently, there is an

  2. Efficacy of hidden markov model over support vector machine on multiclass classification of healthy and cancerous cervical tissues

    Science.gov (United States)

    Mukhopadhyay, Sabyasachi; Kurmi, Indrajit; Pratiher, Sawon; Mukherjee, Sukanya; Barman, Ritwik; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2018-02-01

    In this paper, a comparative study between SVM and HMM has been carried out for the multiclass classification of healthy and cancerous cervical tissues. In our study, the HMM methodology proved more promising, producing higher classification accuracy.

  3. Incorporation of support vector machines in the LIBS toolbox for sensitive and robust classification amidst unexpected sample and system variability.

    Science.gov (United States)

    Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P; Kumar Gundawar, Manoj

    2012-03-20

    Despite its intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real-world applications, e.g., quality assurance and process monitoring. Specifically, variability in sample, system, and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a nonlinear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that the application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), because of its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples, as well as in related areas of forensic and biological sample analysis.

  4. A simple and robust method for automated photometric classification of supernovae using neural networks

    Science.gov (United States)

    Karpenka, N. V.; Feroz, F.; Hobson, M. P.

    2013-02-01

    A method is presented for automated photometric classification of supernovae (SNe) as Type Ia or non-Ia. A two-step approach is adopted in which (i) the SN light curve flux measurements in each observing filter are fitted separately to an analytical parametrized function that is sufficiently flexible to accommodate virtually all types of SNe and (ii) the fitted function parameters and their associated uncertainties, along with the number of flux measurements, the maximum-likelihood value of the fit and the Bayesian evidence for the model, are used as the input feature vector to a classification neural network that outputs the probability that the SN under consideration is of Type Ia. The method is trained and tested using data released following the Supernova Photometric Classification Challenge (SNPCC), consisting of light curves for 20,895 SNe in total. We consider several random divisions of the data into training and testing sets: for instance, for our sample D_1 (D_4), a total of 10 (40) per cent of the data are involved in training the algorithm and the remainder used for blind testing of the resulting classifier; we make no selection cuts. Assigning a canonical threshold probability of p_th = 0.5 on the network output to classify an SN as Type Ia, for the sample D_1 (D_4) we obtain a completeness of 0.78 (0.82), purity of 0.77 (0.82) and SNPCC figure of merit of 0.41 (0.50). Including the SN host-galaxy redshift and its uncertainty as additional inputs to the classification network results in a modest 5-10 per cent increase in these values. We find that the quality of the classification does not vary significantly with SN redshift. Moreover, our probabilistic classification method allows one to calculate the expected completeness, purity and figure of merit (or other measures of classification quality) as a function of the threshold probability p_th, without knowing the true classes of the SNe in the testing sample, as is the case in the classification of real SNe.
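
    The two-step recipe can be sketched as follows: fit a flexible parametrized flux model to each light curve with scipy, then feed the fitted parameters to a small neural-network classifier. The functional form, data, and labels below are illustrative placeholders, not the paper's exact model or the SNPCC data.

```python
import numpy as np
from scipy.optimize import curve_fit
from sklearn.neural_network import MLPClassifier

def light_curve(t, A, t0, t_rise, t_fall):
    """Illustrative rise/decay flux model (not the paper's exact form)."""
    return A * np.exp(-(t - t0) / t_fall) / (1.0 + np.exp(-(t - t0) / t_rise))

def features(t, flux):
    """Step (i): fitted parameters for one filter's light curve."""
    p0 = [flux.max(), t[np.argmax(flux)], 5.0, 20.0]
    params, _ = curve_fit(light_curve, t, flux, p0=p0, maxfev=5000)
    return params

# Placeholder: one filter per SN; real use concatenates all filters plus
# fit quality, number of points, and (optionally) host-galaxy redshift.
rng = np.random.default_rng(0)
t = np.linspace(-20, 80, 40)
X, y = [], []
for _ in range(100):
    true = [rng.uniform(5, 10), rng.uniform(-5, 5),
            rng.uniform(2, 8), rng.uniform(10, 40)]
    flux = light_curve(t, *true) + 0.1 * rng.standard_normal(t.size)
    X.append(features(t, flux))
    y.append(rng.integers(0, 2))  # placeholder label: 1 = Type Ia, 0 = non-Ia

# Step (ii): network outputs P(Type Ia); threshold at p_th = 0.5.
net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
net.fit(np.array(X), np.array(y))
p_ia = net.predict_proba(np.array(X))[:, 1]
```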

  5. Modular design of artificial tissue homeostasis: robust control through synthetic cellular heterogeneity.

    Directory of Open Access Journals (Sweden)

    Miles Miller

    Full Text Available Synthetic biology efforts have largely focused on small engineered gene networks, yet understanding how to integrate multiple synthetic modules and interface them with endogenous pathways remains a challenge. Here we present the design, system integration, and analysis of several large scale synthetic gene circuits for artificial tissue homeostasis. Diabetes therapy represents a possible application for engineered homeostasis, where genetically programmed stem cells maintain a steady population of β-cells despite continuous turnover. We develop a new iterative process that incorporates modular design principles with hierarchical performance optimization targeted for environments with uncertainty and incomplete information. We employ theoretical analysis and computational simulations of multicellular reaction/diffusion models to design and understand system behavior, and find that certain features often associated with robustness (e.g., multicellular synchronization and noise attenuation) are actually detrimental for tissue homeostasis. We overcome these problems by engineering a new class of genetic modules for 'synthetic cellular heterogeneity' that function to generate beneficial population diversity. We design two such modules (an asynchronous genetic oscillator and a signaling throttle mechanism), demonstrate their capacity for enhancing robust control, and provide guidance for experimental implementation with various computational techniques. We found that designing modules for synthetic heterogeneity can be complex, and in general requires a framework for non-linear and multifactorial analysis. Consequently, we adapt a 'phenotypic sensitivity analysis' method to determine how functional module behaviors combine to achieve optimal system performance. We ultimately combine this analysis with Bayesian network inference to extract critical, causal relationships between a module's biochemical rate-constants, its high level functional behavior in

  6. Modular design of artificial tissue homeostasis: robust control through synthetic cellular heterogeneity.

    Science.gov (United States)

    Miller, Miles; Hafner, Marc; Sontag, Eduardo; Davidsohn, Noah; Subramanian, Sairam; Purnick, Priscilla E M; Lauffenburger, Douglas; Weiss, Ron

    2012-01-01

    Synthetic biology efforts have largely focused on small engineered gene networks, yet understanding how to integrate multiple synthetic modules and interface them with endogenous pathways remains a challenge. Here we present the design, system integration, and analysis of several large scale synthetic gene circuits for artificial tissue homeostasis. Diabetes therapy represents a possible application for engineered homeostasis, where genetically programmed stem cells maintain a steady population of β-cells despite continuous turnover. We develop a new iterative process that incorporates modular design principles with hierarchical performance optimization targeted for environments with uncertainty and incomplete information. We employ theoretical analysis and computational simulations of multicellular reaction/diffusion models to design and understand system behavior, and find that certain features often associated with robustness (e.g., multicellular synchronization and noise attenuation) are actually detrimental for tissue homeostasis. We overcome these problems by engineering a new class of genetic modules for 'synthetic cellular heterogeneity' that function to generate beneficial population diversity. We design two such modules (an asynchronous genetic oscillator and a signaling throttle mechanism), demonstrate their capacity for enhancing robust control, and provide guidance for experimental implementation with various computational techniques. We found that designing modules for synthetic heterogeneity can be complex, and in general requires a framework for non-linear and multifactorial analysis. Consequently, we adapt a 'phenotypic sensitivity analysis' method to determine how functional module behaviors combine to achieve optimal system performance. We ultimately combine this analysis with Bayesian network inference to extract critical, causal relationships between a module's biochemical rate-constants, its high level functional behavior in isolation, and

  7. Naïve and Robust: Class-Conditional Independence in Human Classification Learning

    Science.gov (United States)

    Jarecki, Jana B.; Meder, Björn; Nelson, Jonathan D.

    2018-01-01

    Humans excel in categorization. Yet from a computational standpoint, learning a novel probabilistic classification task involves severe computational challenges. The present paper investigates one way to address these challenges: assuming class-conditional independence of features. This feature independence assumption simplifies the inference…
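
    Class-conditional independence is exactly the assumption behind the naive Bayes classifier, so the idea can be illustrated in a few lines of scikit-learn on synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Naive Bayes treats features as independent given the class label:
# P(x1, ..., xd | c) = P(x1 | c) * ... * P(xd | c).
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

nb = GaussianNB().fit(X_tr, y_tr)
print("test accuracy under the independence assumption:", nb.score(X_te, y_te))
```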

  8. Using robust principal component analysis to alleviate day-to-day variability in EEG based emotion classification.

    Science.gov (United States)

    Ping-Keng Jao; Yuan-Pin Lin; Yi-Hsuan Yang; Tzyy-Ping Jung

    2015-08-01

    An emerging challenge for emotion classification using electroencephalography (EEG) is how to effectively alleviate day-to-day variability in raw data. This study employed robust principal component analysis (RPCA) to address the problem, with the posed hypothesis that background or emotion-irrelevant EEG perturbations lead to certain variability across days and somehow submerge emotion-related EEG dynamics. The empirical results of this study clearly validated our hypothesis and demonstrated the RPCA's feasibility through the analysis of a five-day dataset of 12 subjects. The RPCA allowed tackling the sparse emotion-relevant EEG dynamics apart from the accompanying background perturbations across days. Subsequently, leveraging the RPCA-purified EEG trials from more days appeared to improve the emotion-classification performance steadily, which was not found in the case using the raw EEG features. Therefore, incorporating the RPCA with existing emotion-aware machine-learning frameworks on a longitudinal dataset of each individual may shed light on the development of a robust affective brain-computer interface (ABCI) that can alleviate ecological inter-day variability.
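
    For reference, a compact, generic principal component pursuit implementation (inexact augmented-Lagrangian iteration with singular-value thresholding) decomposes a data matrix into a low-rank part plus a sparse part, mirroring the background-versus-emotion-related split described above. This is a textbook sketch on toy data, not the authors' exact algorithm or EEG pipeline.

```python
import numpy as np

def rpca(M, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Principal component pursuit: M = L (low rank) + S (sparse)."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or 0.25 * m * n / np.abs(M).sum()
    shrink = lambda X, tau: np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(max_iter):
        # Singular-value thresholding step for the low-rank component.
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(shrink(sig, 1.0 / mu)) @ Vt
        # Soft-thresholding step for the sparse component.
        S = shrink(M - L + Y / mu, lam / mu)
        Y += mu * (M - L - S)
        if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
            break
    return L, S

# Toy "EEG feature matrix": low-rank background plus sparse events.
rng = np.random.default_rng(0)
background = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 60))
events = (rng.random((100, 60)) < 0.02) * 5.0
L, S = rpca(background + events)
```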

  9. Robust Sounds of Activities of Daily Living Classification in Two-Channel Audio-Based Telemonitoring

    Directory of Open Access Journals (Sweden)

    David Maunder

    2013-01-01

    Full Text Available Despite recent advances in the area of home telemonitoring, the challenge of automatically detecting the sound signatures of activities of daily living of an elderly patient using nonintrusive and reliable methods remains. This paper investigates the classification of eight typical sounds of daily life from arbitrarily positioned two-microphone sensors under realistic noisy conditions. In particular, the role of several source separation and sound activity detection methods is considered. Evaluations on a new four-microphone database collected under four realistic noise conditions reveal that effective sound activity detection can produce significant gains in classification accuracy and that further gains can be made using source separation methods based on independent component analysis. Encouragingly, the results show that recognition accuracies in the range 70%–100% can be consistently obtained using different microphone-pair positions, under all but the most severe noise conditions.

  10. IARC use of oxidative stress as key mode of action characteristic for facilitating cancer classification: Glyphosate case example illustrating a lack of robustness in interpretative implementation.

    Science.gov (United States)

    Bus, James S

    2017-06-01

    The International Agency for Research on Cancer (IARC) has formulated 10 key characteristics of human carcinogens to incorporate mechanistic data into cancer hazard classifications. The analysis used glyphosate as a case example to examine the robustness of IARC's determination of oxidative stress as "strong" evidence supporting a plausible cancer mechanism in humans. The IARC analysis primarily relied on 14 human/mammalian studies; 19 non-mammalian studies were uninformative of human cancer given the broad spectrum of test species and extensive use of formulations and aquatic testing. The mammalian studies had substantial experimental limitations for informing cancer mechanism including use of: single doses and time points; cytotoxic/toxic test doses; tissues not identified as potential cancer targets; glyphosate formulations or mixtures; technically limited oxidative stress biomarkers. The doses were many orders of magnitude higher than human exposures determined in human biomonitoring studies. The glyphosate case example reveals that the IARC evaluation fell substantially short of "strong" supporting evidence of oxidative stress as a plausible human cancer mechanism, and suggests that other IARC monographs relying on the 10 key characteristics approach should be similarly examined for a lack of robust data integration fundamental to reasonable mode of action evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Breast tissue classification using x-ray scattering measurements and multivariate data analysis

    Science.gov (United States)

    Ryan, Elaine A.; Farquharson, Michael J.

    2007-11-01

    This study utilized two radiation scatter interactions in order to differentiate malignant from non-malignant breast tissue. These two interactions were Compton scatter, used to measure the electron density of the tissues, and coherent scatter to obtain a measure of structure. Measurements of these parameters were made using a laboratory experimental set-up comprising an x-ray tube and HPGe detector. The breast tissue samples investigated comprised five different tissue classifications: adipose, malignancy, fibroadenoma, normal fibrous tissue and tissue that had undergone fibrocystic change. The coherent scatter spectra were analysed using a peak fitting routine, and a technique involving multivariate analysis was used to combine the peak fitted scatter profile spectra and the electron density values into a tissue classification model. The number of variables used in the model was refined by finding the sensitivity and specificity of each model and concentrating on differentiating between two tissues at a time. The best model that was formulated had a sensitivity of 54% and a specificity of 100%.

  12. Breast tissue classification using x-ray scattering measurements and multivariate data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, Elaine A; Farquharson, Michael J [School of Allied Health Sciences, City University, Charterhouse Square, London EC1M 6PA (United Kingdom)

    2007-11-21

    This study utilized two radiation scatter interactions in order to differentiate malignant from non-malignant breast tissue. These two interactions were Compton scatter, used to measure the electron density of the tissues, and coherent scatter to obtain a measure of structure. Measurements of these parameters were made using a laboratory experimental set-up comprising an x-ray tube and HPGe detector. The breast tissue samples investigated comprised five different tissue classifications: adipose, malignancy, fibroadenoma, normal fibrous tissue and tissue that had undergone fibrocystic change. The coherent scatter spectra were analysed using a peak fitting routine, and a technique involving multivariate analysis was used to combine the peak fitted scatter profile spectra and the electron density values into a tissue classification model. The number of variables used in the model was refined by finding the sensitivity and specificity of each model and concentrating on differentiating between two tissues at a time. The best model that was formulated had a sensitivity of 54% and a specificity of 100%.

  13. Visualization and tissue classification of human breast cancer images using ultrahigh-resolution OCT (Conference Presentation)

    Science.gov (United States)

    Yao, Xinwen; Gan, Yu; Chang, Ernest W.; Hibshoosh, Hanina; Feldman, Sheldon; Hendon, Christine P.

    2017-02-01

    We employed a home-built ultrahigh resolution (UHR) OCT system at 800 nm to image human breast cancer samples ex vivo. The system has an axial resolution of 2.72 µm and a lateral resolution of 5.52 µm with an extended imaging range of 1.78 mm. Over 900 UHR OCT volumes were generated on specimens from 23 breast cancer cases. With better spatial resolution, detailed structures in the breast tissue were better defined. Different types of breast cancer as well as healthy breast tissue can be well delineated from the UHR OCT images. To quantitatively evaluate the advantages of UHR OCT imaging of breast cancer, features derived from OCT intensity images were used as inputs to a machine learning model, the relevance vector machine. A trained machine learning model was employed to evaluate the performance of tissue classification based on UHR OCT images for differentiating tissue types in the breast samples, including adipose tissue, healthy stroma and cancerous regions. For adipose tissue, grid-based local features were extracted from OCT intensity data, including standard deviation, entropy, and homogeneity. We showed that it was possible to enhance the classification performance in distinguishing fat tissue from non-fat tissue by using the UHR images when compared with the results based on OCT images from a commercial 1300 nm OCT system. For invasive ductal carcinoma (IDC) and normal stroma differentiation, the classification was based on frame-based features that portray signal penetration depth and tissue reflectivity. The confusion matrix indicated a sensitivity of 97.5% and a specificity of 77.8%.

  14. Multiplatform analysis of 12 cancer types reveals molecular classification within and across tissues of origin

    DEFF Research Database (Denmark)

    Hoadley, Katherine A; Yau, Christina; Wolf, Denise M

    2014-01-01

    Recent genomic analyses of pathologically defined tumor types identify "within-a-tissue" disease subtypes. However, the extent to which genomic signatures are shared across tissues is still unclear. We performed an integrative analysis using five genome-wide platforms and one proteomic platform on 3,527 specimens from 12 cancer types, revealing a unified classification into 11 major subtypes. Five subtypes were nearly identical to their tissue-of-origin counterparts, but several distinct cancer types were found to converge into common subtypes. Lung squamous, head and neck, and a subset

  15. Classification between normal and tumor tissues based on the pair-wise gene expression ratio

    International Nuclear Information System (INIS)

    Yap, YeeLeng; Zhang, XueWu; Ling, MT; Wang, XiangHong; Wong, YC; Danchin, Antoine

    2004-01-01

    Precise classification of cancer types is critically important for early cancer diagnosis and treatment. Numerous efforts have been made to use gene expression profiles to improve the precision of tumor classification. However, reliable cancer-related signals are generally lacking. Using recent datasets on colon and prostate cancer, a data transformation procedure from single gene expression to pair-wise gene expression ratio is proposed. Making use of the internal consistency of each expression profiling dataset, this transformation improves the signal-to-noise ratio of the dataset and uncovers new relevant cancer-related signals (features). The efficiency of using the transformed dataset to perform normal/tumor classification was investigated using feature partitioning with informative features (gene annotation) as discriminating axes (single gene expression or pair-wise gene expression ratio). Classification results were compared to the original datasets for up to 10-feature model classifiers. 82 and 262 genes that have high correlation to tissue phenotype were selected from the colon and prostate datasets respectively. Remarkably, data transformation of the highly noisy expression data successfully lowered the coefficient of variation (CV) for the within-class samples as well as improved the correlation with tissue phenotypes. The transformed dataset exhibited lower CV when compared to that of single gene expression. In the colon cancer set, the minimum CV decreased from 45.3% to 16.5%. In prostate cancer, comparable CV was achieved with and without transformation. This improvement in CV, coupled with the improved correlation between the pair-wise gene expression ratio and tissue phenotypes, yielded higher classification efficiency, especially with the colon dataset – from 87.1% to 93.5%. Over 90% of the top ten discriminating axes in both datasets showed significant improvement after data transformation. The high classification efficiency achieved suggested
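
    The transformation itself is straightforward to sketch: replace single-gene expression values by all pairwise ratios and compare the within-class coefficient of variation before and after. The data below are synthetic placeholders; gene selection and classifier construction are omitted.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Placeholder expression matrix (samples x genes) and tissue phenotype.
expr = rng.lognormal(mean=2.0, sigma=0.5, size=(40, 6))
phenotype = np.array([0] * 20 + [1] * 20)   # 0 = normal, 1 = tumour

# Transform single-gene expression into pair-wise expression ratios.
pairs = list(combinations(range(expr.shape[1]), 2))
ratios = np.column_stack([expr[:, i] / expr[:, j] for i, j in pairs])

def within_class_cv(features, labels):
    """Coefficient of variation within each class, averaged over classes."""
    cvs = [features[labels == c].std(axis=0) / features[labels == c].mean(axis=0)
           for c in np.unique(labels)]
    return np.mean(cvs)

print("CV, single-gene features   :", within_class_cv(expr, phenotype))
print("CV, pair-wise ratio features:", within_class_cv(ratios, phenotype))
```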

  16. Robust Automatic Modulation Classification Technique for Fading Channels via Deep Neural Network

    Directory of Open Access Journals (Sweden)

    Jung Hwan Lee

    2017-08-01

    Full Text Available In this paper, we propose a deep neural network (DNN)-based automatic modulation classification (AMC) scheme for digital communications. While conventional AMC techniques perform well for additive white Gaussian noise (AWGN) channels, classification accuracy degrades for fading channels, where the amplitude and phase of the channel gain change in time. The key contributions of this paper come in two phases. First, we analyze the effectiveness of a variety of statistical features for the AMC task in fading channels. We reveal that the features that are shown to be effective for fading channels are different from those known to be good for AWGN channels. Second, we introduce a new enhanced AMC technique based on the DNN method. We use the extensive and diverse set of statistical features found in our study for the DNN-based classifier. The fully connected feedforward network with four hidden layers is trained to classify the modulation class for several fading scenarios. Numerical evaluation shows that the proposed technique offers significant performance gain over existing AMC methods in fading channels.

  17. Robust automated classification of first-motion polarities for focal mechanism determination with machine learning

    Science.gov (United States)

    Ross, Z. E.; Meier, M. A.; Hauksson, E.

    2017-12-01

    Accurate first-motion polarities are essential for determining earthquake focal mechanisms, but are difficult to measure automatically because of picking errors and signal to noise issues. Here we develop an algorithm for reliable automated classification of first-motion polarities using machine learning algorithms. A classifier is designed to identify whether the first-motion polarity is up, down, or undefined by examining the waveform data directly. We first improve the accuracy of automatic P-wave onset picks by maximizing a weighted signal/noise ratio for a suite of candidate picks around the automatic pick. We then use the waveform amplitudes before and after the optimized pick as features for the classification. We demonstrate the method's potential by training and testing the classifier on tens of thousands of hand-made first-motion picks by the Southern California Seismic Network. The classifier assigned the same polarity as chosen by an analyst in more than 94% of the records. We show that the method is generalizable to a variety of learning algorithms, including neural networks and random forest classifiers. The method is suitable for automated processing of large seismic waveform datasets, and can potentially be used in real-time applications, e.g. for improving the source characterizations of earthquake early warning algorithms.

  18. Robust through-the-wall radar image classification using a target-model alignment procedure.

    Science.gov (United States)

    Smith, Graeme E; Mobasseri, Bijan G

    2012-02-01

    A through-the-wall radar image (TWRI) bears little resemblance to the equivalent optical image, making it difficult to interpret. To maximize the intelligence that may be obtained, it is desirable to automate the classification of targets in the image to support human operators. This paper presents a technique for classifying stationary targets based on the high-range resolution profile (HRRP) extracted from 3-D TWRIs. The dependence of the image on the target location is discussed using a system point spread function (PSF) approach. It is shown that the position dependence will cause a classifier to fail, unless the image to be classified is aligned to a classifier-training location. A target image alignment technique based on deconvolution of the image with the system PSF is proposed. Comparison of the aligned target images with measured images shows the alignment process introducing normalized mean squared error (NMSE) ≤ 9%. The HRRP extracted from aligned target images are classified using a naive Bayesian classifier supported by principal component analysis. The classifier is tested using a real TWRI of canonical targets behind a concrete wall and shown to obtain correct classification rates ≥ 97%. © 2011 IEEE

  19. A Robust Classification of Galaxy Spectra: Dealing with Noisy and Incomplete Data

    Science.gov (United States)

    Connolly, A. J.; Szalay, A. S.

    1999-05-01

    Over the next few years new spectroscopic surveys (from the optical surveys of the Sloan Digital Sky Survey and the 2dF Galaxy Survey through to space-based ultraviolet satellites such as GALEX) will provide the opportunity and challenge of understanding how galaxies of different spectral type evolve with redshift. Techniques have been developed to classify galaxies based on their continuum and line spectra. Some of the most promising of these have used the Karhunen & Loève transform (or principal component analysis) to separate galaxies into distinct classes. Their limitation has been that they assume that the spectral coverage and quality of the spectra are constant for all galaxies within a given sample. In this paper we develop a general formalism that accounts for the missing data within the observed spectra (such as the removal of sky lines or the effect of sampling different intrinsic rest-wavelength ranges due to the redshift of a galaxy). We demonstrate that by correcting for these gaps we can recover an almost redshift-independent classification scheme. From this classification we can derive an optimal interpolation that reconstructs the underlying galaxy spectral energy distributions in the regions of missing data. This provides a simple and effective mechanism for building galaxy spectral energy distributions directly from data that may be noisy, incomplete, or drawn from a number of different sources.

  20. Principal coordinate analysis assisted chromatographic analysis of bacterial cell wall collection: A robust classification approach.

    Science.gov (United States)

    Kumar, Keshav; Cava, Felipe

    2018-04-10

    In the present work, principal coordinate analysis (PCoA) is introduced to develop a robust model to classify chromatographic data sets of peptidoglycan samples. PCoA captures the heterogeneity present in the data sets by using the dissimilarity matrix as input. Thus, in principle, it can capture even subtle differences in bacterial peptidoglycan composition and can provide a more robust and fast approach for classifying bacterial collections and identifying novel cell wall targets for further biological and clinical studies. The utility of the proposed approach is successfully demonstrated by analysing two different kinds of bacterial collections. The first set comprised peptidoglycan samples belonging to different subclasses of Alphaproteobacteria, whereas the second set, relatively more intricate for chemometric analysis, consisted of wild-type Vibrio cholerae and mutants with subtle differences in their peptidoglycan composition. The present work proposes a useful approach that can classify chromatographic data sets of peptidoglycan samples having subtle differences, and suggests that PCoA can be a method of choice in many data analysis workflows. Copyright © 2018 Elsevier Inc. All rights reserved.
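    For reference, PCoA (classical multidimensional scaling) amounts to double-centring the squared dissimilarity matrix and eigendecomposing it; the sample coordinates are the leading eigenvectors scaled by the square roots of their eigenvalues. A minimal sketch, assuming a symmetric dissimilarity matrix D:

        import numpy as np

        def pcoa(D, n_components=2):
            """Principal coordinate analysis of an n x n dissimilarity matrix."""
            n = D.shape[0]
            J = np.eye(n) - np.ones((n, n)) / n          # centring matrix
            B = -0.5 * J @ (D ** 2) @ J                  # Gower's double centring
            vals, vecs = np.linalg.eigh(B)
            idx = np.argsort(vals)[::-1][:n_components]  # largest eigenvalues first
            vals, vecs = vals[idx], vecs[:, idx]
            return vecs * np.sqrt(np.maximum(vals, 0))   # sample coordinates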

  1. Breast tissue classification in digital tomosynthesis images based on global gradient minimization and texture features

    Science.gov (United States)

    Qin, Xulei; Lu, Guolan; Sechopoulos, Ioannis; Fei, Baowei

    2014-03-01

    Digital breast tomosynthesis (DBT) is a pseudo-three-dimensional x-ray imaging modality proposed to decrease the effect of tissue superposition present in mammography, potentially resulting in an increase in clinical performance for the detection and diagnosis of breast cancer. Tissue classification in DBT images can be useful in risk assessment, computer-aided detection, and radiation dosimetry, among other applications. However, classifying breast tissue in DBT is challenging because DBT images include complicated structures, image noise, and out-of-plane artifacts due to limited angular tomographic sampling. In this project, we propose an automatic method to classify fatty and glandular tissue in DBT images. First, the DBT images are pre-processed to enhance the tissue structures and to decrease image noise and artifacts. Second, a global smoothing filter based on L0 gradient minimization is applied to eliminate detailed structures and enhance large-scale ones. Third, similar structure regions are extracted and labeled by fuzzy C-means (FCM) classification, and texture features are calculated at the same time. Finally, each region is classified into a tissue type based on both intensity and texture features. The proposed method is validated on DBT images of five patients, with manual segmentation as the gold standard. Dice scores and the confusion matrix are used to evaluate the classified results. The evaluation results demonstrate the feasibility of the proposed method for classifying breast glandular and fat tissue in DBT images.

  2. Tissue classification and segmentation of pressure injuries using convolutional neural networks.

    Science.gov (United States)

    Zahia, Sofia; Sierra-Sosa, Daniel; Garcia-Zapirain, Begonya; Elmaghraby, Adel

    2018-06-01

    This paper presents a new approach for automatic tissue classification in pressure injuries. These wounds are localized skin damage requiring frequent diagnosis and treatment; reliable and accurate systems for segmentation and tissue type identification are therefore needed to achieve better treatment results. Our proposed system is based on a Convolutional Neural Network (CNN) devoted to performing optimized segmentation of the different tissue types present in pressure injuries (granulation, slough, and necrotic tissues). A preprocessing step removes flash-light artifacts and creates a set of 5x5 sub-images that are used as input to the CNN. The network classifies every sub-image of the validation set into one of the three classes studied. The metrics used to evaluate our approach show an overall average classification accuracy of 92.01%, an average total weighted Dice Similarity Coefficient of 91.38%, and an average precision per class of 97.31% for granulation tissue, 96.59% for necrotic tissue, and 77.90% for slough tissue. Our system demonstrates that recognition of complicated structures in biomedical images is feasible. Copyright © 2018 Elsevier B.V. All rights reserved.
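    To make the patch-wise setup concrete, here is a bare-bones PyTorch module over 5x5 RGB sub-images with three output classes (granulation, slough, necrotic). The layer sizes and the single linear head are illustrative assumptions, not the paper's architecture:

        import torch
        import torch.nn as nn

        class PatchCNN(nn.Module):
            """Tiny CNN classifying 5x5 sub-images into three tissue types."""
            def __init__(self, n_classes=3):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, kernel_size=3),   # 5x5 -> 3x3
                    nn.ReLU(),
                    nn.Conv2d(16, 32, kernel_size=3),  # 3x3 -> 1x1
                    nn.ReLU(),
                )
                self.classifier = nn.Linear(32, n_classes)

            def forward(self, x):                      # x: (batch, 3, 5, 5)
                return self.classifier(self.features(x).flatten(1))

        logits = PatchCNN()(torch.randn(8, 3, 5, 5))   # -> (8, 3) class scores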

  3. Combining extreme learning machines using support vector machines for breast tissue classification.

    Science.gov (United States)

    Daliri, Mohammad Reza

    2015-01-01

    In this paper, we present a new approach for breast tissue classification using features derived from electrical impedance spectroscopy. The method is composed of a feature extraction step, a feature selection phase and a classification step. The feature extraction phase derives the features from the electrical impedance spectra. The extracted features consist of the impedivity at zero frequency (I0), the phase angle at 500 kHz, the high-frequency slope of the phase angle, the impedance distance between spectral ends, the area under the spectrum, the normalised area, the maximum of the spectrum, the distance between the impedivity at I0 and the real part of the maximum frequency point, and the length of the spectral curve. The system uses an information-theoretic criterion as the strategy for feature selection and combines extreme learning machines (ELMs) in the classification phase. The outputs of several ELMs are combined using a support vector machine classifier, and the result of classification is reported as a measure of the performance of the system. The results indicate that the proposed system achieves high accuracy in the classification of breast tissues using electrical impedance spectroscopy.
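    A compact sketch of the combining scheme, with the feature selection step omitted and tanh hidden units and integer class labels assumed: each ELM maps inputs through a fixed random hidden layer and a least-squares readout, and an SVM is then trained on the stacked ELM outputs.

        import numpy as np
        from sklearn.svm import SVC

        class ELM:
            """Basic extreme learning machine: random hidden layer, least-squares readout."""
            def __init__(self, n_hidden=50, seed=0):
                self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

            def fit(self, X, y):
                self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
                H = np.tanh(X @ self.W)
                Y = np.eye(int(y.max()) + 1)[y]        # one-hot targets
                self.beta = np.linalg.pinv(H) @ Y
                return self

            def scores(self, X):
                return np.tanh(X @ self.W) @ self.beta

        def fit_elm_svm(X, y, n_elms=10):
            """Train several ELMs, then an SVM on their concatenated outputs."""
            elms = [ELM(seed=i).fit(X, y) for i in range(n_elms)]
            meta = np.hstack([e.scores(X) for e in elms])
            return elms, SVC(kernel="linear").fit(meta, y)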

  4. MRI Brain Images Healthy and Pathological Tissues Classification with the Aid of Improved Particle Swarm Optimization and Neural Network

    Science.gov (United States)

    Sheejakumari, V.; Sankara Gomathi, B.

    2015-01-01

    The advantages of magnetic resonance imaging (MRI) over other diagnostic imaging modalities are its higher spatial resolution and its better discrimination of soft tissue. In a previous tissue classification method, healthy and pathological tissues were classified from MRI brain images using HGANN, but that method performs inadequately in terms of sensitivity and accuracy. To avoid these drawbacks, a new classification method is proposed in this paper, using an improved particle swarm optimization (IPSO) technique to classify healthy and pathological tissues from the given MRI images. Our proposed classification method comprises the same four stages, namely tissue segmentation, feature extraction, heuristic feature selection, and tissue classification. The method is implemented and the results are analyzed in terms of various statistical performance measures. The results show the effectiveness of the proposed method in classifying the tissues, with improvements in sensitivity and accuracy. Furthermore, the performance of the proposed technique is evaluated by comparing it with other segmentation methods. PMID:25977706

  5. The use of Compton scattering to differentiate between classifications of normal and diseased breast tissue

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, Elaine A; Farquharson, Michael J; Flinton, David M [School of Allied Health Sciences, City University, Charterhouse Square, London EC1M 6PA (United Kingdom)

    2005-07-21

    This study describes a technique for measuring the electron density of breast tissue utilizing Compton scattered photons. The Kα2 line from a tungsten target industrial x-ray tube (57.97 keV) was used and the scattered x-rays collected at an angle of 30°. At this angle the Compton and coherent photon peaks can be resolved using an energy dispersive detector and a peak fitting algorithm. The system was calibrated using solutions of known electron density. The results obtained from a pilot study of 22 tissues are presented. The tissue samples investigated comprise four different tissue classifications: adipose, malignancy, fibroadenoma and fibrocystic change (FCC). It is shown that there is a difference between adipose and malignant tissue, to a value of 9.0%, and between adipose and FCC, to a value of 12.7%. These figures are found to be significant by statistical analysis. The differences between adipose and fibroadenoma tissues (2.2%) and between malignancy and FCC (3.4%) are not significant. It is hypothesized that the alteration in glucose uptake within malignant cells may cause these tissues to have an elevated electron density. The fibrotic nature of tissue that has undergone FCC gives the highest measure of all tissue types.
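    The claim that the two peaks are resolvable at 30° can be checked with the textbook Compton shift formula (a standard relation, not taken from the paper):

        E' = E / (1 + (E / m_e c^2)(1 - cos θ))

    With E = 57.97 keV, m_e c² = 511 keV and θ = 30° (so 1 − cos θ ≈ 0.134), E' ≈ 57.97 / (1 + 0.1134 × 0.134) ≈ 57.1 keV: the Compton peak sits roughly 0.9 keV below the unshifted coherent peak, a separation an energy-dispersive detector can resolve.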

  6. The use of Compton scattering to differentiate between classifications of normal and diseased breast tissue

    Science.gov (United States)

    Ryan, Elaine A.; Farquharson, Michael J.; Flinton, David M.

    2005-07-01

    This study describes a technique for measuring the electron density of breast tissue utilizing Compton scattered photons. The Kα2 line from a tungsten target industrial x-ray tube (57.97 keV) was used and the scattered x-rays collected at an angle of 30°. At this angle the Compton and coherent photon peaks can be resolved using an energy dispersive detector and a peak fitting algorithm. The system was calibrated using solutions of known electron density. The results obtained from a pilot study of 22 tissues are presented. The tissue samples investigated comprise four different tissue classifications: adipose, malignancy, fibroadenoma and fibrocystic change (FCC). It is shown that there is a difference between adipose and malignant tissue, to a value of 9.0%, and between adipose and FCC, to a value of 12.7%. These figures are found to be significant by statistical analysis. The differences between adipose and fibroadenoma tissues (2.2%) and between malignancy and FCC (3.4%) are not significant. It is hypothesized that the alteration in glucose uptake within malignant cells may cause these tissues to have an elevated electron density. The fibrotic nature of tissue that has undergone FCC gives the highest measure of all tissue types.

  7. The use of Compton scattering to differentiate between classifications of normal and diseased breast tissue

    International Nuclear Information System (INIS)

    Ryan, Elaine A; Farquharson, Michael J; Flinton, David M

    2005-01-01

    This study describes a technique for measuring the electron density of breast tissue utilizing Compton scattered photons. The Kα2 line from a tungsten target industrial x-ray tube (57.97 keV) was used and the scattered x-rays collected at an angle of 30°. At this angle the Compton and coherent photon peaks can be resolved using an energy dispersive detector and a peak fitting algorithm. The system was calibrated using solutions of known electron density. The results obtained from a pilot study of 22 tissues are presented. The tissue samples investigated comprise four different tissue classifications: adipose, malignancy, fibroadenoma and fibrocystic change (FCC). It is shown that there is a difference between adipose and malignant tissue, to a value of 9.0%, and between adipose and FCC, to a value of 12.7%. These figures are found to be significant by statistical analysis. The differences between adipose and fibroadenoma tissues (2.2%) and between malignancy and FCC (3.4%) are not significant. It is hypothesized that the alteration in glucose uptake within malignant cells may cause these tissues to have an elevated electron density. The fibrotic nature of tissue that has undergone FCC gives the highest measure of all tissue types.

  8. JACOP: A simple and robust method for the automated classification of protein sequences with modular architecture

    Directory of Open Access Journals (Sweden)

    Pagni Marco

    2005-08-01

    Full Text Available Abstract Background Whole-genome sequencing projects are rapidly producing an enormous number of new sequences. Consequently almost every family of proteins now contains hundreds of members. It has thus become necessary to develop tools that classify protein sequences automatically, quickly and reliably. The difficulty of this task is intimately linked to the mechanism by which protein sequences diverge, i.e. by simultaneous residue substitutions, insertions and/or deletions and whole domain reorganisations (duplications/swapping/fusion). Results Here we present a novel approach based on random sampling of sub-sequences (probes) out of a set of input sequences. The probes are compared to the input sequences after a normalisation step; the results are used to partition the input sequences into homogeneous groups of proteins. In addition, this method provides information on diagnostic parts of the proteins. The performance of this method is challenged with two data sets. The first one contains the sequences of prokaryotic lyases that could be arranged as a multiple sequence alignment. The second one contains all proteins from Swiss-Prot Release 36 with at least one Src homology 2 (SH2) domain – a classical example for proteins with modular architecture. Conclusion The outcome of our method is robust and highly reproducible, as shown using bootstrap and resampling validation procedures. The results are essentially coherent with the biology. This method depends solely on well-established publicly available software and algorithms.

  9. Systematic bias in genomic classification due to contaminating non-neoplastic tissue in breast tumor samples.

    Science.gov (United States)

    Elloumi, Fathi; Hu, Zhiyuan; Li, Yan; Parker, Joel S; Gulley, Margaret L; Amos, Keith D; Troester, Melissa A

    2011-06-30

    Genomic tests are available to predict breast cancer recurrence and to guide clinical decision making. These predictors provide recurrence risk scores along with a measure of uncertainty, usually a confidence interval. The confidence interval conveys random error and not systematic bias. Standard tumor sampling methods make this problematic, as it is common to have a substantial proportion (typically 30-50%) of a tumor sample comprised of histologically benign tissue. This "normal" tissue could represent a source of non-random error or systematic bias in genomic classification. To assess the sensitivity of genomic classification to systematic error from normal contamination, we collected 55 tumor samples and paired tumor-adjacent normal tissue. Using genomic signatures from the tumor and paired normal, we evaluated how increasing normal contamination altered recurrence risk scores for various genomic predictors. Simulations of normal tissue contamination caused misclassification of tumors in all predictors evaluated, but different breast cancer predictors showed different types of vulnerability to normal tissue bias. While two predictors had an unpredictable direction of bias (either higher or lower risk of relapse resulted from normal contamination), one signature showed a predictable direction of normal tissue effects. Due to this predictable direction of effect, this signature (the PAM50) was adjusted for normal tissue contamination, and these corrections improved sensitivity and negative predictive value. For all three assays, quality control standards and/or appropriate bias adjustment strategies can be used to improve assay reliability. Normal tissue sampled concurrently with tumor is an important source of bias in breast genomic predictors. All genomic predictors show some sensitivity to normal tissue contamination, and ideal strategies for mitigating this bias vary depending upon the particular genes and computational methods used in the predictor.
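    The bias mechanism can be probed in silico by linearly mixing each tumor profile with its paired normal profile at increasing fractions and tracking the drift of the recurrence score. A hedged sketch, where score_fn is a hypothetical stand-in for whichever genomic predictor is under test:

        import numpy as np

        def contaminate(tumor_expr, normal_expr, fraction):
            """Simulate normal-tissue contamination as a linear mixture of paired
            tumor and tumor-adjacent normal expression profiles."""
            return (1 - fraction) * tumor_expr + fraction * normal_expr

        def risk_drift(tumor_expr, normal_expr, score_fn,
                       fractions=np.linspace(0.0, 0.5, 11)):
            """Recurrence scores over the 0-50% contamination range typical of
            standard tumor sampling."""
            return [score_fn(contaminate(tumor_expr, normal_expr, f))
                    for f in fractions]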

  10. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of a robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). The tissue-to-plasma ratio was estimated from paired/unpaired experimental data using an independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by the different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naïve data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
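    The core of such a sampling scheme can be sketched briefly: at each time point draw one subject, keep the tissue/plasma pairing, integrate both pseudoprofiles, and form the AUC ratio. This single-phase sketch omits the paper's two-phase convergence logic; the data layout and names are assumptions.

        import numpy as np

        def tissue_plasma_ratio_samples(t, tissue, plasma, n_boot=1000, seed=0):
            """tissue[i] and plasma[i] hold paired per-subject values at time t[i];
            drawing the same subject index for both preserves the pairing."""
            rng = np.random.default_rng(seed)
            ratios = np.empty(n_boot)
            for b in range(n_boot):
                idx = [rng.integers(len(subjects)) for subjects in tissue]
                ti = np.array([tissue[i][j] for i, j in enumerate(idx)])
                pi = np.array([plasma[i][j] for i, j in enumerate(idx)])
                ratios[b] = np.trapz(ti, t) / np.trapz(pi, t)
            return ratios   # summarize with the median and a percentile interval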

  11. Automated Analysis and Classification of Histological Tissue Features by Multi-Dimensional Microscopic Molecular Profiling.

    Directory of Open Access Journals (Sweden)

    Daniel P Riordan

    Full Text Available Characterization of the molecular attributes and spatial arrangements of cells and features within complex human tissues provides a critical basis for understanding processes involved in development and disease. Moreover, the ability to automate steps in the analysis and interpretation of histological images that currently require manual inspection by pathologists could revolutionize medical diagnostics. Toward this end, we developed a new imaging approach called multidimensional microscopic molecular profiling (MMMP) that can measure several independent molecular properties in situ at subcellular resolution for the same tissue specimen. MMMP involves repeated cycles of antibody or histochemical staining, imaging, and signal removal, which ultimately can generate information analogous to a multidimensional flow cytometry analysis on intact tissue sections. We performed a MMMP analysis on a tissue microarray containing a diverse set of 102 human tissues using a panel of 15 informative antibody and 5 histochemical stains plus DAPI. Large-scale unsupervised analysis of MMMP data, and visualization of the resulting classifications, identified molecular profiles that were associated with functional tissue features. We then directly annotated H&E images from this MMMP series such that canonical histological features of interest (e.g., blood vessels, epithelium, red blood cells) were individually labeled. By integrating image annotation data, we identified molecular signatures that were associated with specific histological annotations and we developed statistical models for automatically classifying these features. The classification accuracy for automated histology labeling was objectively evaluated using a cross-validation strategy, and significant accuracy (with a median per-pixel rate of 77% per feature from 15 annotated samples) was obtained for de novo feature prediction. These results suggest that high-dimensional profiling may advance the

  12. Tissue classification and diagnostics using a fiber probe for combined Raman and fluorescence spectroscopy

    Science.gov (United States)

    Cicchi, Riccardo; Anand, Suresh; Crisci, Alfonso; Giordano, Flavio; Rossari, Susanna; De Giorgi, Vincenzo; Maio, Vincenza; Massi, Daniela; Nesi, Gabriella; Buccoliero, Anna Maria; Guerrini, Renzo; Pimpinelli, Nicola; Pavone, Francesco S.

    2015-07-01

    Two different optical fiber probes for combined Raman and fluorescence spectroscopic measurements were designed, developed and used for tissue diagnostics. Two visible laser diodes were used for fluorescence spectroscopy, whereas a laser diode emitting in the NIR was used for Raman spectroscopy. The two probes were based on fiber bundles with a central multimode optical fiber, used for delivering light to the tissue, and 24 surrounding optical fibers for signal collection. Both fluorescence and Raman spectra were acquired using the same detection unit, based on a cooled CCD camera connected to a spectrograph. The two probes were successfully employed for diagnostic purposes on various tissues, in good agreement with common routine histology. This study included skin, brain and bladder tissues, and in particular the classification of: malignant melanoma against melanocytic lesions and healthy skin; urothelial carcinoma against healthy bladder mucosa; and brain tumor against dysplastic brain tissue. The diagnostic capabilities were determined using a cross-validation method with a leave-one-out approach, finding very high sensitivity and specificity for all the examined tissues. The obtained results demonstrate that the multimodal approach is crucial for improving diagnostic capabilities. The system presented here can improve diagnostic capabilities on a broad range of tissues and has the potential to be used for endoscopic inspections in the near future.

  13. Robust Seismic Normal Modes Computation in Radial Earth Models and A Novel Classification Based on Intersection Points of Waveguides

    Science.gov (United States)

    Ye, J.; Shi, J.; De Hoop, M. V.

    2017-12-01

    We develop a robust algorithm to compute seismic normal modes in a spherically symmetric, non-rotating Earth. A well-known problem is the cross-contamination of modes near "intersections" of dispersion curves for separate waveguides. Our novel computational approach completely avoids artificial degeneracies by guaranteeing orthonormality among the eigenfunctions. We extend Wiggins' and Buland's work, and reformulate the Sturm-Liouville problem as a generalized eigenvalue problem with the Rayleigh-Ritz Galerkin method. A special projection operator incorporating the gravity terms proposed by de Hoop and a displacement/pressure formulation are utilized in the fluid outer core to project out the essential spectrum. Moreover, the weak variational form enables us to achieve high accuracy across the solid-fluid boundary, especially for Stoneley modes, which have exponentially decaying behavior. We also employ the mixed finite element technique to avoid spurious pressure modes arising from discretization schemes, and a numerical inf-sup test is performed following Bathe's work. In addition, the self-gravitation terms are reformulated to avoid computations outside the Earth, thanks to the domain decomposition technique. Our package enables us to study the physical properties of intersection points of waveguides. According to Okal's classification theory, the group velocities should be continuous within a branch of the same mode family. However, we have found that there will be a small "bump" near intersection points, which is consistent with Miropol'sky's observation. In fact, we can loosely regard Earth's surface and the CMB as independent waveguides. For those modes that are far from the intersection points, their eigenfunctions are localized in the corresponding waveguides. However, those that are close to intersection points will have physical features of both waveguides, which means they cannot be classified in either family. Our results improve on Okal's classification.

  14. Spatial cluster analysis of nanoscopically mapped serotonin receptors for classification of fixed brain tissue

    Science.gov (United States)

    Sams, Michael; Silye, Rene; Göhring, Janett; Muresan, Leila; Schilcher, Kurt; Jacak, Jaroslaw

    2014-01-01

    We present a cluster spatial analysis method using nanoscopic dSTORM images to determine changes in protein cluster distributions within brain tissue. Such methods are suitable to investigate human brain tissue and will help to achieve a deeper understanding of brain disease along with aiding drug development. Human brain tissue samples are usually treated postmortem via standard fixation protocols, which are established in clinical laboratories. Therefore, our localization microscopy-based method was adapted to characterize protein density and protein cluster localization in samples fixed using different protocols followed by common fluorescent immunohistochemistry techniques. The localization microscopy allows nanoscopic mapping of serotonin 5-HT1A receptor groups within a two-dimensional image of a brain tissue slice. These nanoscopically mapped proteins can be confined to clusters by applying the proposed statistical spatial analysis. Selected features of such clusters were subsequently used to characterize and classify the tissue. Samples were obtained from different types of patients, fixed with different preparation methods, and finally stored in a human tissue bank. To verify the proposed method, samples of a cryopreserved healthy brain have been compared with epitope-retrieved and paraffin-fixed tissues. Furthermore, samples of healthy brain tissues were compared with data obtained from patients suffering from mental illnesses (e.g., major depressive disorder). Our work demonstrates the applicability of localization microscopy and image analysis methods for comparison and classification of human brain tissues at a nanoscopic level. Furthermore, the presented workflow marks a unique technological advance in the characterization of protein distributions in brain tissue sections.

  15. Classification of fibroglandular tissue distribution in the breast based on radiotherapy planning CT

    International Nuclear Information System (INIS)

    Juneja, Prabhjot; Evans, Philip; Windridge, David; Harris, Emma

    2016-01-01

    Accurate segmentation of breast tissues is required for a number of applications such as model-based deformable registration in breast radiotherapy. The accuracy of breast tissue segmentation is affected by the spatial distribution (or pattern) of fibroglandular tissue (FT). The goal of this study was to develop and evaluate texture features, determined from planning computed tomography (CT) data, to classify the spatial distribution of FT in the breast. Planning CT data of 23 patients were evaluated in this study. Texture features were derived from the radial glandular fraction (RGF), which described the distribution of FT within three breast regions (posterior, middle, and anterior). Using visual assessment, experts grouped patients according to FT spatial distribution: sparse or non-sparse. Differences in the features between the two groups were investigated using the Wilcoxon rank test. Classification performance of the features was evaluated for a range of support vector machine (SVM) classifiers. Experts found that eight patients had sparse and 15 patients had non-sparse spatial distributions of FT. A large proportion of features (>9 of 13) from the individual breast regions had significant differences (p < 0.05) between the sparse and non-sparse groups. The features from the middle region had the most significant differences and gave the highest classification accuracy for all the SVM kernels investigated. Overall, the features from the middle breast region achieved the highest accuracy (91%) with the linear SVM kernel. This study found that features based on the radial glandular fraction provide a means for discriminating between fibroglandular tissue distributions and could achieve a classification accuracy of 91%.

  16. SU-F-T-187: Quantifying Normal Tissue Sparing with 4D Robust Optimization of Intensity Modulated Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Newpower, M; Ge, S; Mohan, R [UT MD Anderson Cancer Center, Houston, TX (United States)

    2016-06-15

    Purpose: To report an approach to quantify the normal tissue sparing of 4D robustly-optimized versus PTV-optimized IMPT plans. Methods: We generated two sets of 90 DVHs from a patient's 10-phase 4D CT set: one by conventional PTV-based optimization in the Eclipse treatment planning system, and the other by an in-house robust optimization algorithm. The 90 DVHs were created for the following scenarios in each of the ten phases of the 4DCT: ±5 mm shifts along x, y, z; ±3.5% range uncertainty; and a nominal scenario. A Matlab function written by Gay and Niemierko was modified to calculate the EUD for each DVH for the following structures: esophagus, heart, ipsilateral lung and spinal cord. An F-test determined whether the variances of each structure's DVHs were statistically different, and a t-test then determined whether the average EUDs for the two optimization algorithms were significantly different. Results: The t-test showed that each structure had a statistically significant difference in average EUD between robust and PTV-based optimization. Under robust optimization, all structures except the spinal cord received lower EUDs than under PTV-based optimization. With robust optimization the average EUDs decreased by 1.45% for the esophagus, 1.54% for the heart and 5.45% for the ipsilateral lung. The average EUD to the spinal cord increased by 24.86% but was still well below tolerance. Conclusion: This work helps quantify a qualitative relationship noted earlier in our work: that robust optimization leads to plans with greater normal tissue sparing than PTV-based optimization. Except for the spinal cord, all structures received a lower EUD under robust optimization, and these results are statistically significant. While the average EUD to the spinal cord increased to 25.06 Gy under robust optimization, it is still well under the TD50 value of 66.5 Gy from Emami et al. Supported in part by the NCI U19 CA021239.
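    The quantity being compared is the generalized EUD of Niemierko, which the authors computed with a modified Matlab routine; a plain Python re-expression of the standard formula (not the authors' code) is:

        import numpy as np

        def gEUD(doses, volumes, a):
            """EUD = (sum_i v_i * D_i**a)**(1/a) over a differential DVH.
            Large positive a stresses hot spots (serial organs such as the
            spinal cord); a -> 1 approaches the mean dose."""
            v = np.asarray(volumes, dtype=float)
            v = v / v.sum()                     # fractional volumes
            d = np.asarray(doses, dtype=float)
            return (v * d ** a).sum() ** (1.0 / a)

        # e.g. gEUD(doses=[10, 20, 45], volumes=[0.5, 0.3, 0.2], a=13)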

  17. Breast tissue classification in digital breast tomosynthesis images using texture features: a feasibility study

    Science.gov (United States)

    Kontos, Despina; Berger, Rachelle; Bakic, Predrag R.; Maidment, Andrew D. A.

    2009-02-01

    Mammographic breast density is a known breast cancer risk factor. Studies have shown the potential to automate breast density estimation by using computerized texture-based segmentation of the dense tissue in mammograms. Digital breast tomosynthesis (DBT) is a tomographic x-ray breast imaging modality that could allow volumetric breast density estimation. We evaluated the feasibility of distinguishing between dense and fatty breast regions in DBT using computer-extracted texture features. Our long-term hypothesis is that DBT texture analysis can be used to develop 3D dense tissue segmentation algorithms for estimating volumetric breast density. DBT images from 40 women were analyzed. The dense tissue area was delineated within each central source projection (CSP) image using a thresholding technique (Cumulus, Univ. Toronto). Two (2.5 cm)² ROIs were manually selected: one within the dense tissue region and another within the fatty region. Corresponding (2.5 cm)³ ROIs were placed within the reconstructed DBT images. Texture features, previously used for mammographic dense tissue segmentation, were computed. Receiver operating characteristic (ROC) curve analysis was performed to evaluate feature classification performance. Different texture features appeared to perform best in the 3D reconstructed DBT compared to the 2D CSP images. Fractal dimension was superior in DBT (AUC=0.90), while contrast was best in CSP images (AUC=0.92). We attribute these differences to the effects of tissue superimposition in CSP and the volumetric visualization of the breast tissue in DBT. Our results suggest that novel approaches, different than those conventionally used in projection mammography, need to be investigated in order to develop DBT dense tissue segmentation algorithms for estimating volumetric breast density.

  18. Classification of cardiovascular tissues using LBP based descriptors and a cascade SVM.

    Science.gov (United States)

    Mazo, Claudia; Alegre, Enrique; Trujillo, Maria

    2017-08-01

    Histological images have characteristics, such as texture, shape, colour and spatial structure, that permit the differentiation of each fundamental tissue and organ. Texture is one of the most discriminative features. The automatic classification of tissues and organs based on histology images is an open problem, due to the lack of automatic solutions when treating tissues without pathologies. In this paper, we demonstrate that it is possible to automatically classify cardiovascular tissues using texture information and Support Vector Machines (SVM). Additionally, we found that it is feasible to recognise several cardiovascular organs following the same process. The texture of histological images was described using Local Binary Patterns (LBP), LBP Rotation Invariant (LBPri), Haralick features and different concatenations between them, representing in this way their content. Using an SVM with linear kernel, we selected the most appropriate descriptor which, for this problem, was a concatenation of LBP and LBPri. Due to the small number of images available, we could not follow an approach based on deep learning, so we selected the classifier that yielded the highest performance by comparing SVM with Random Forest and Linear Discriminant Analysis. Once SVM was selected as the classifier with the highest area under the curve, representing both higher recall and precision, we tuned it by evaluating different kernels, finding that a linear SVM allowed us to accurately separate four classes of tissues: (i) cardiac muscle of the heart, (ii) smooth muscle of the muscular artery, (iii) loose connective tissue, and (iv) smooth muscle of the large vein and the elastic artery. The experimental validation was conducted using 3000 blocks of 100 × 100 pixels, with 600 blocks per class, and the classification was assessed using 10-fold cross-validation. Using LBP concatenated with LBPri as the descriptor and an SVM with linear kernel, the four main classes of tissues were accurately separated.

  19. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  20. Polarimetry based partial least square classification of ex vivo healthy and basal cell carcinoma human skin tissues.

    Science.gov (United States)

    Ahmad, Iftikhar; Ahmad, Manzoor; Khan, Karim; Ikram, Masroor

    2016-06-01

    Optical polarimetry was employed for assessment of ex vivo healthy and basal cell carcinoma (BCC) tissue samples from human skin. Polarimetric analyses revealed that depolarization and retardance for the healthy tissue group were significantly higher (p < 0.05) than for the BCC group. Polarimetry together with PLS statistics holds promise for automated pathology classification. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Automatic classification of prostate stromal tissue in histological images using Haralick descriptors and Local Binary Patterns

    International Nuclear Information System (INIS)

    Oliveira, D L L; Batista, V R; Duarte, Y A S; Nascimento, M Z; Neves, L A; Godoy, M F; Jacomini, R S; Arruda, P F F; Neto, D S

    2014-01-01

    In this paper we present a classification system that uses a combination of texture features from stromal regions: Haralick features and Local Binary Patterns (LBP) in the wavelet domain. The system has five steps for classification of the tissues. First, the stromal regions were detected and extracted using segmentation techniques based on thresholding and the RGB colour space. Second, the wavelet decomposition was applied to the extracted regions to obtain the wavelet coefficients. Third, the Haralick and LBP features were extracted from the coefficients. Fourth, relevant features were selected using the ANOVA statistical method. The classification (fifth step) was performed with Radial Basis Function (RBF) networks. The system was tested on 105 prostate images, which were divided into three groups of 35 images: normal, hyperplastic and cancerous. The system performance was evaluated using the area under the ROC curve and resulted in 0.98 for normal versus cancer, 0.95 for hyperplasia versus cancer and 0.96 for normal versus hyperplasia. Our results suggest that texture features can be used as discriminators of stromal tissue in prostate images. Furthermore, the system was effective in classifying prostate images, especially the hyperplastic class, which is the most difficult type in diagnosis and prognosis.
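    The LBP descriptor named above reduces to thresholding each pixel's eight neighbours against the centre and packing the results into a byte; the wavelet-domain and rotation-invariant variants build on the same comparison. A minimal sketch:

        import numpy as np

        def lbp_image(gray):
            """8-neighbour LBP codes for the interior pixels of a grayscale image."""
            g = np.asarray(gray, dtype=float)
            c = g[1:-1, 1:-1]                                # centre pixels
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),   # clockwise from top-left
                       (1, 1), (1, 0), (1, -1), (0, -1)]
            code = np.zeros(c.shape, dtype=np.uint8)
            for bit, (dy, dx) in enumerate(offsets):
                nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
                code |= (nb >= c).astype(np.uint8) << bit
            return code          # its histogram is the texture feature vector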

  2. Chronic radiation effects on dental hard tissue ("radiation caries"). Classification and therapeutic strategies

    International Nuclear Information System (INIS)

    Groetz, K.A.; Brahm, R.; Al-Nawas, B.; Wagner, W.; Riesenbeck, D.; Willich, N.; Seegenschmiedt, M.H.

    2001-01-01

    Objectives: Since the first description of rapid destruction of dental hard tissues following head and neck radiotherapy 80 years ago, 'radiation caries' has been an established clinical finding. The internationally accepted RTOG/EORTC clinical evaluation score, however, lacks a classification of this frequent radiogenic alteration. Material and Methods: Medical records, data and images of radiation effects on the teeth of more than 1,500 patients, who underwent periradiotherapeutic care, were analyzed. Macroscopic alterations regarding the grade of late lesions of tooth crowns were used for a classification into 4 grades according to the RTOG/EORTC guidelines. Results: No early radiation effects were found by macroscopic inspection. In the first 90 days following radiotherapy, one third of the patients complained of reversible hypersensitivity, which may be related to a temporary hyperemia of the pulp. It was possible to classify radiation caries as a late radiation effect on a graded scale as known from the RTOG/EORTC scores for other organ systems. This is a prerequisite for the integration of radiation caries into the international nomenclature of the RTOG/EORTC classification. Conclusions: The documentation of early radiation effects on dental hard tissues seems to be negligible. On the other hand, the documentation of late radiation effects has a high clinical impact. The identification of an initial lesion at the high-risk areas of the neck and incisal part of the tooth can lead to a successful therapy as a major prerequisite for orofacial rehabilitation. An internationally standardized documentation is a basis for the evaluation of the side effects of radiooncologic therapy as well as the effectiveness of protective and supportive procedures. (orig.)

  3. Spiral wave classification using normalized compression distance: Towards atrial tissue spatiotemporal electrophysiological behavior characterization.

    Science.gov (United States)

    Alagoz, Celal; Guez, Allon; Cohen, Andrew; Bullinga, John R

    2015-08-01

    Analysis of electrical activation patterns such as re-entries during atrial fibrillation (Afib) is crucial in understanding arrhythmic mechanisms and in the assessment of diagnostic measures. Spiral waves are a phenomenon that provides an intuitive basis for re-entries occurring in cardiac tissue. Distinct spiral wave behaviors such as stable spiral waves, meandering spiral waves, and spiral wave break-up may have distinct electrogram manifestations on a mapping catheter. Hence, it is desirable to have an automated classification of spiral wave behavior based on catheter recordings for a qualitative characterization of spatiotemporal electrophysiological activity on atrial tissue. In this study, we propose a method for classification of the spatiotemporal characteristics of simulated atrial activation patterns in terms of distinct spiral wave behaviors during Afib using two different techniques: normalized compression distance (NCD) and normalized FFT distance (NFFTD). We use a phenomenological model of cardiac electrical propagation to produce various simulated spiral wave behaviors on a 2D grid, labeled as stable, meandering, or breakup. By mimicking commonly used catheter types, star-shaped and circular, both of which take local readings from the atrial wall, monopolar and bipolar intracardiac electrograms are simulated. Virtual catheters are positioned at different locations on the grid. The classification performance for different catheter locations, types, and for monopolar or bipolar readings is also compared. We observed that the performance for each case differed slightly, but found that NCD performance is superior to NFFTD. Through the simulation study, we showed the theoretical validation of the proposed method. Our findings suggest that qualitative wavefront activation patterns can be assessed during Afib without the need for highly invasive mapping techniques such as multisite simultaneous electrogram recordings.
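    The NCD side of the comparison is compact enough to sketch directly with a general-purpose compressor (zlib is assumed here; the abstract does not mandate a particular compressor). Electrogram segments are first quantized to bytes, and smaller distances mean more shared structure:

        import zlib

        def ncd(x: bytes, y: bytes) -> float:
            """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
            with C(.) the compressed length."""
            cx = len(zlib.compress(x))
            cy = len(zlib.compress(y))
            cxy = len(zlib.compress(x + y))
            return (cxy - min(cx, cy)) / max(cx, cy)

        # A recording is then labeled stable/meandering/breakup by its nearest
        # labeled exemplar under ncd (e.g., after 8-bit quantization of the signal).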

  4. Automated classification of immunostaining patterns in breast tissue from the human protein atlas.

    Science.gov (United States)

    Swamidoss, Issac Niwas; Kårsnäs, Andreas; Uhlmann, Virginie; Ponnusamy, Palanisamy; Kampf, Caroline; Simonsson, Martin; Wählby, Carolina; Strand, Robin

    2013-01-01

    The Human Protein Atlas (HPA) is an effort to map the location of all human proteins (http://www.proteinatlas.org/). It contains a large number of histological images of sections from human tissue. Tissue micro arrays (TMA) are imaged by a slide scanning microscope, and each image represents a thin slice of a tissue core with a dark brown antibody specific stain and a blue counter stain. When generating antibodies for protein profiling of the human proteome, an important step in the quality control is to compare staining patterns of different antibodies directed towards the same protein. This comparison is an ultimate control that the antibody recognizes the right protein. In this paper, we propose and evaluate different approaches for classifying sub-cellular antibody staining patterns in breast tissue samples. The proposed methods include the computation of various features including gray level co-occurrence matrix (GLCM) features, complex wavelet co-occurrence matrix (CWCM) features, and weighted neighbor distance using compound hierarchy of algorithms representing morphology (WND-CHARM)-inspired features. The extracted features are used in two different multivariate classifiers (a support vector machine (SVM) and a linear discriminant analysis (LDA) classifier). Before extracting features, we use color deconvolution to separate different tissue components, such as the brown-stained positive regions and the blue cellular regions, in the immuno-stained TMA images of breast tissue. We present classification results based on combinations of feature measurements. The proposed complex wavelet features and the WND-CHARM features have accuracy similar to that of a human expert. Both human experts and the proposed automated methods have difficulties discriminating between nuclear and cytoplasmic staining patterns. This is to a large extent due to mixed staining of nucleus and cytoplasm. Methods for quantification of staining patterns in histopathology have many

  5. Automated classification of immunostaining patterns in breast tissue from the human protein Atlas

    Directory of Open Access Journals (Sweden)

    Issac Niwas Swamidoss

    2013-01-01

    Full Text Available Background: The Human Protein Atlas (HPA) is an effort to map the location of all human proteins (http://www.proteinatlas.org/). It contains a large number of histological images of sections from human tissue. Tissue micro arrays (TMA) are imaged by a slide scanning microscope, and each image represents a thin slice of a tissue core with a dark brown antibody specific stain and a blue counter stain. When generating antibodies for protein profiling of the human proteome, an important step in the quality control is to compare staining patterns of different antibodies directed towards the same protein. This comparison is an ultimate control that the antibody recognizes the right protein. In this paper, we propose and evaluate different approaches for classifying sub-cellular antibody staining patterns in breast tissue samples. Materials and Methods: The proposed methods include the computation of various features including gray level co-occurrence matrix (GLCM) features, complex wavelet co-occurrence matrix (CWCM) features, and weighted neighbor distance using compound hierarchy of algorithms representing morphology (WND-CHARM)-inspired features. The extracted features are used in two different multivariate classifiers (a support vector machine (SVM) and a linear discriminant analysis (LDA) classifier). Before extracting features, we use color deconvolution to separate different tissue components, such as the brown-stained positive regions and the blue cellular regions, in the immuno-stained TMA images of breast tissue. Results: We present classification results based on combinations of feature measurements. The proposed complex wavelet features and the WND-CHARM features have accuracy similar to that of a human expert. Conclusions: Both human experts and the proposed automated methods have difficulties discriminating between nuclear and cytoplasmic staining patterns. This is to a large extent due to mixed staining of nucleus and cytoplasm. Methods for

  6. Classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    This article presents and discusses definitions of the term “classification” and the related concepts “concept/conceptualization,” “categorization,” “ordering,” “taxonomy” and “typology.” It further presents and discusses theories of classification, including the influences of Aristotle and Wittgenstein. It presents different views on forming classes, including logical division, numerical taxonomy, historical classification, and hermeneutical and pragmatic/critical views. Finally, issues related to artificial versus natural classification and taxonomic monism versus taxonomic pluralism are briefly discussed.

  7. Robust microwave-assisted extraction protocol for determination of total mercury and methylmercury in fish tissues

    Energy Technology Data Exchange (ETDEWEB)

    Reyes, L. Hinojosa; Rahman, G.M. Mizanur [Department of Chemistry and Biochemistry, Duquesne University, Pittsburgh, PA 15282 (United States); Kingston, H.M. Skip [Department of Chemistry and Biochemistry, Duquesne University, Pittsburgh, PA 15282 (United States)], E-mail: kingston@duq.edu

    2009-01-12

    A rapid and efficient closed vessel microwave-assisted extraction (MAE) method based on acidic leaching was developed and optimized for the extraction of total mercury (Hg), inorganic mercury (Hg²⁺) and methylmercury (CH₃Hg⁺) from fish tissues. The quantitative extraction of total Hg and mercury species from biological samples was achieved using 5 mol L⁻¹ HCl and 0.25 mol L⁻¹ NaCl for 10 min at 60 °C. Total Hg content was determined using inductively coupled plasma mass spectrometry (ICP-MS). Mercury species were measured by liquid chromatography hyphenated with inductively coupled plasma mass spectrometry (LC-ICP-MS). The method was validated using the biological certified reference materials ERM-CE464, DOLT-3, and NIST SRM-1946. The analytical results were in good agreement with the certified reference values of total Hg and CH₃Hg⁺ at a 95% confidence level. Further accuracy validation using speciated isotope-dilution mass spectrometry (SIDMS, as described in EPA Method 6800) was carried out. SIDMS was also applied to study and correct for unwanted species transformation reactions during and/or after the sample preparation steps. For the studied reference materials, no statistically significant transformation between mercury species was observed during the extraction and determination procedures. The proposed method was successfully applied to fish tissues, with good agreement between SIDMS results and external calibration (EC) results. Interspecies transformations in fish tissues were slightly higher than in certified reference materials due to differences in matrix composition. Depending on the type of fish tissue, up to 10.24% of Hg²⁺ was methylated and up to 1.75% of CH₃Hg⁺ was demethylated to Hg²⁺.

  8. Tissue classifications in Monte Carlo simulations of patient dose for photon beam tumor treatments

    Science.gov (United States)

    Lin, Mu-Han; Chao, Tsi-Chian; Lee, Chung-Chi; Tung-Chieh Chang, Joseph; Tung, Chuan-Jong

    2010-07-01

    The purpose of this work was to study the calculated dose uncertainties induced by the material classification that determined the interaction cross-sections and the water-to-material stopping-power ratios. Calculations were made for a head- and neck-cancer patient treated with five intensity-modulated radiotherapy fields using 6 MV photon beams. The patient's CT images were reconstructed into two voxelized patient phantoms based on different CT-to-material classification schemes. Comparisons of the depth-dose curve of the anterior-to-posterior field and the dose-volume-histogram of the treatment plan were used to evaluate the dose uncertainties from such schemes. The results indicated that any misassignment of tissue materials could lead to a substantial dose difference, which would affect the treatment outcome. To assure an appropriate material assignment, it is desirable to have different conversion tables for various parts of the body. The assignment of stopping-power ratio should be based on the chemical composition and the density of the material.
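    The conversion-table issue is easy to picture with a toy CT-number-to-material mapping of the kind the study varies. The thresholds below are illustrative assumptions only, not either of the paper's schemes; a real scheme would also assign a mass density and a stopping-power ratio per material, ideally per body region:

        def hu_to_material(hu: float) -> str:
            """Toy piecewise assignment of a tissue class from a CT number (HU)."""
            if hu < -400:
                return "lung"
            if hu < -100:
                return "adipose"
            if hu < 100:
                return "soft tissue"
            if hu < 300:
                return "cartilage/dense tissue"
            return "bone"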

  9. Tissue classifications in Monte Carlo simulations of patient dose for photon beam tumor treatments

    International Nuclear Information System (INIS)

    Lin, Mu-Han; Chao, Tsi-Chian; Lee, Chung-Chi; Tung-Chieh Chang, Joseph; Tung, Chuan-Jong

    2010-01-01

    The purpose of this work was to study the calculated dose uncertainties induced by the material classification that determined the interaction cross-sections and the water-to-material stopping-power ratios. Calculations were made for a head- and neck-cancer patient treated with five intensity-modulated radiotherapy fields using 6 MV photon beams. The patient's CT images were reconstructed into two voxelized patient phantoms based on different CT-to-material classification schemes. Comparisons of the depth-dose curve of the anterior-to-posterior field and the dose-volume-histogram of the treatment plan were used to evaluate the dose uncertainties from such schemes. The results indicated that any misassignment of tissue materials could lead to a substantial dose difference, which would affect the treatment outcome. To assure an appropriate material assignment, it is desirable to have different conversion tables for various parts of the body. The assignment of stopping-power ratio should be based on the chemical composition and the density of the material.

  10. Classification of breast tissue using a laboratory system for small-angle x-ray scattering (SAXS)

    International Nuclear Information System (INIS)

    Sidhu, S; Siu, K K W; Falzon, G; Hart, S A; Fox, J G; Lewis, R A

    2011-01-01

    Structural changes in breast tissue at the nanometre scale have been shown to differentiate between tissue types using synchrotron SAXS techniques. Classification of breast tissues using information acquired from a laboratory SAXS camera source could possibly provide a means of adopting SAXS as a viable diagnostic procedure. Tissue samples were obtained from surgical waste from 66 patients and structural components of the tissues were examined between q = 0.25 and 2.3 nm⁻¹. Principal component analysis showed that the amplitude of the fifth-order axial Bragg peak, the magnitude of the integrated intensity and the full-width at half-maximum of the fat peak were significantly different between tissue types. A discriminant analysis showed that excellent classification can be achieved; however, only 30% of the tissue samples provided the 16 variables required for classification. This suggests that the presence of disease is represented by a combination of factors, rather than one specific trait. A closer examination of the amorphous scattering intensity showed not only a trend of increased scattering intensity with disease severity, but also a corresponding decrease in the size of the scatterers contributing to this intensity.

  11. A robust, efficient and flexible method for staining myelinated axons in blocks of brain tissue.

    Science.gov (United States)

    Wahlsten, Douglas; Colbourne, Frederick; Pleus, Richard

    2003-03-15

    Previous studies have demonstrated the utility of the gold chloride method for en bloc staining of a bisected brain in mice and rats. The present study explores several variations in the method, assesses its reliability, and extends the limits of its application. We conclude that the method is very efficient, highly robust, sufficiently accurate for most purposes, and adaptable to many morphometric measures. We obtained acceptable staining of commissures in every brain, despite a wide variety of fixation methods. One-half could be stained 24 h after the brain was extracted and the other half could be stained months later. When staining failed because of an exhausted solution, the brain could be stained successfully in fresh solution. Relatively small changes were found in the sizes of commissures several weeks after initial fixation or staining. A half brain stained to reveal the mid-sagittal section could then be sectioned coronally and stained again in either gold chloride for myelin or cresyl violet for Nissl substance. Uncertainty arising from pixelation of digitized images was far less than errors arising from human judgments about the histological limits of major commissures. Useful data for morphometric analysis were obtained by scanning the surface of a gold chloride stained block of brain with an inexpensive flatbed scanner.

  12. Murine Neonates Infected with Yersinia enterocolitica Develop Rapid and Robust Proinflammatory Responses in Intestinal Lymphoid Tissues

    Science.gov (United States)

    Siefker, David T.; Echeverry, Andrea; Brambilla, Roberta; Fukata, Masayuki; Schesser, Kurt

    2014-01-01

    Neonatal animals are generally very susceptible to infection with bacterial pathogens. However, we recently reported that neonatal mice are highly resistant to orogastric infection with Yersinia enterocolitica. Here, we show that proinflammatory responses greatly exceeding those in adults arise very rapidly in the mesenteric lymph nodes (MLN) of neonates. High-level induction of proinflammatory gene expression occurred in the neonatal MLN as early as 18 h postinfection. Marked innate phagocyte recruitment was subsequently detected at 24 h postinfection. Enzyme-linked immunosorbent spot assay (ELISPOT) analyses indicated that enhanced inflammation in neonatal MLN is contributed to, in part, by an increased frequency of proinflammatory cytokine-secreting cells. Moreover, both CD11b+ and CD11b− cell populations appeared to play a role in proinflammatory gene expression. The level of inflammation in neonatal MLN was also dependent on key bacterial components. Y. enterocolitica lacking the virulence plasmid failed to induce innate phagocyte recruitment. In contrast, tumor necrosis factor alpha (TNF-α) protein expression and neutrophil recruitment were strikingly higher in neonatal MLN after infection with a yopP-deficient strain than with wild-type Y. enterocolitica, whereas only modest increases occurred in adults. This hyperinflammatory response was associated with greater colonization of the spleen and higher mortality in neonates, while there was no difference in mortality among adults. This model highlights the dynamic levels of inflammation in the intestinal lymphoid tissues and reveals the protective (wild-type strain) versus harmful (yopP-deficient strain) consequences of inflammation in neonates. Moreover, these results reveal that the neonatal intestinal lymphoid tissues have great potential to rapidly mobilize innate components in response to infection with bacterial enteropathogens. PMID:24478090

  13. A discriminative model-constrained EM approach to 3D MRI brain tissue classification and intensity non-uniformity correction

    International Nuclear Information System (INIS)

    Wels, Michael; Hornegger, Joachim; Zheng Yefeng; Comaniciu, Dorin; Huber, Martin

    2011-01-01

    We describe a fully automated method for tissue classification, which is the segmentation into cerebral gray matter (GM), cerebral white matter (WM), and cerebrospinal fluid (CSF), and intensity non-uniformity (INU) correction in brain magnetic resonance imaging (MRI) volumes. It combines supervised MRI modality-specific discriminative modeling and unsupervised statistical expectation maximization (EM) segmentation into an integrated Bayesian framework. While both the parametric observation models and the non-parametrically modeled INUs are estimated via EM during segmentation itself, a Markov random field (MRF) prior model regularizes segmentation and parameter estimation. Firstly, the regularization takes into account knowledge about spatial and appearance-related homogeneity of segments in terms of pairwise clique potentials of adjacent voxels. Secondly and more importantly, patient-specific knowledge about the global spatial distribution of brain tissue is incorporated into the segmentation process via unary clique potentials. They are based on a strong discriminative model provided by a probabilistic boosting tree (PBT) for classifying image voxels. It relies on the surrounding context and alignment-based features derived from a probabilistic anatomical atlas. The context considered is encoded by 3D Haar-like features of reduced INU sensitivity. Alignment is carried out fully automatically by means of an affine registration algorithm minimizing cross-correlation. Both types of features do not immediately use the observed intensities provided by the MRI modality but instead rely on specifically transformed features, which are less sensitive to MRI artifacts. Detailed quantitative evaluations on standard phantom scans and standard real-world data show the accuracy and robustness of the proposed method. They also demonstrate relative superiority in comparison to other state-of-the-art approaches to this kind of computational task: our method achieves average
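
    As a rough, self-contained illustration of the unsupervised core of such an approach, the Python sketch below fits a three-class Gaussian mixture to voxel intensities with EM and relabels the components as CSF/GM/WM by mean intensity. It is a minimal stand-in, not the authors' method: the PBT discriminative prior, the MRF regularization, and the INU correction are all omitted, and the data are synthetic.

      # EM fitting of a 3-class Gaussian mixture over (synthetic) voxel
      # intensities; PBT prior, MRF regularization and INU correction omitted.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      voxels = np.concatenate([
          rng.normal(30, 5, 2000),    # CSF-like intensities
          rng.normal(80, 8, 5000),    # GM-like intensities
          rng.normal(120, 6, 4000),   # WM-like intensities
      ]).reshape(-1, 1)

      gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
      labels = gmm.fit_predict(voxels)          # EM iterates internally to convergence

      # Reorder component labels by mean intensity so 0 = CSF, 1 = GM, 2 = WM.
      order = np.argsort(gmm.means_.ravel())    # component indices sorted by mean
      tissue = np.argsort(order)[labels]        # rank of each voxel's component
      print("estimated class means:", np.sort(gmm.means_.ravel()))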

  14. Robustness analysis of a green chemistry-based model for the classification of silver nanoparticles synthesis processes

    Science.gov (United States)

    This paper proposes a robustness analysis based on Multiple Criteria Decision Aiding (MCDA). The ensuing model was used to assess the implementation of green chemistry principles in the synthesis of silver nanoparticles. Its recommendations were also compared to an earlier develo...

  15. Classification of mass and normal breast tissue: A convolution neural network classifier with spatial domain and texture images

    International Nuclear Information System (INIS)

    Sahiner, B.; Chan, H.P.; Petrick, N.; Helvie, M.A.; Adler, D.D.; Goodsitt, M.M.; Wei, D.

    1996-01-01

    The authors investigated the classification of regions of interest (ROIs) on mammograms as either mass or normal tissue using a convolution neural network (CNN). A CNN is a back-propagation neural network with two-dimensional (2-D) weight kernels that operate on images. A generalized, fast, and stable implementation of the CNN was developed. The input images to the CNN were obtained from the ROIs using two techniques. The first technique employed averaging and subsampling. The second technique employed texture feature extraction methods applied to small subregions inside the ROI. Features computed over different subregions were arranged as texture images, which were subsequently used as CNN inputs. The effects of CNN architecture and texture feature parameters on classification accuracy were studied. Receiver operating characteristic (ROC) methodology was used to evaluate the classification accuracy. A data set consisting of 168 ROIs containing biopsy-proven masses and 504 ROIs containing normal breast tissue was extracted from 168 mammograms by radiologists experienced in mammography. This data set was used for training and testing the CNN. With the best combination of CNN architecture and texture feature parameters, the area under the test ROC curve reached 0.87, which corresponded to a true-positive fraction of 90% at a false-positive fraction of 31%. The results demonstrate the feasibility of using a CNN for classification of masses and normal tissue on mammograms.
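
    As a modern, minimal analogue of such a network (a sketch only, not the authors' 1996 architecture), the PyTorch model below classifies grayscale ROI images into mass vs. normal tissue; the 64x64 input size and layer widths are illustrative assumptions.

      # A small 2-D CNN for binary ROI classification (mass vs. normal tissue).
      import torch
      import torch.nn as nn

      class ROIClassifier(nn.Module):
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 8, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(8, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
              )
              self.classifier = nn.Linear(16 * 16 * 16, 2)  # assumes 64x64 inputs

          def forward(self, x):
              x = self.features(x)            # (N, 16, 16, 16) for 64x64 inputs
              return self.classifier(x.flatten(1))

      model = ROIClassifier()
      logits = model(torch.randn(4, 1, 64, 64))  # a batch of 4 grayscale ROIs
      print(logits.shape)                        # torch.Size([4, 2])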

  16. A new classification method for MALDI imaging mass spectrometry data acquired on formalin-fixed paraffin-embedded tissue samples.

    Science.gov (United States)

    Boskamp, Tobias; Lachmund, Delf; Oetjen, Janina; Cordero Hernandez, Yovany; Trede, Dennis; Maass, Peter; Casadonte, Rita; Kriegsmann, Jörg; Warth, Arne; Dienemann, Hendrik; Weichert, Wilko; Kriegsmann, Mark

    2017-07-01

    Matrix-assisted laser desorption/ionization imaging mass spectrometry (MALDI IMS) shows high potential for applications in histopathological diagnosis, and in particular for supporting tumor typing and subtyping. The development of such applications requires the extraction of spectral fingerprints that are relevant for the given tissue and the identification of biomarkers associated with these spectral patterns. We propose a novel data analysis method based on the extraction of characteristic spectral patterns (CSPs) that allows automated generation of classification models for spectral data. Formalin-fixed, paraffin-embedded (FFPE) tissue samples from N = 445 patients, assembled on 12 tissue microarrays, were analyzed. The method was applied to discriminate primary lung and pancreatic cancer, as well as adenocarcinoma and squamous cell carcinoma of the lung. Classification accuracies of 100% and 82.8%, respectively, were achieved at the core level, as assessed by cross-validation. The method outperformed the more conventional classification method based on the extraction of individual m/z values in the first application, while achieving a comparable accuracy in the second. LC-MS/MS peptide identification demonstrated that the spectral features present in selected CSPs correspond to peptides relevant for the respective classification. This article is part of a Special Issue entitled: MALDI Imaging, edited by Dr. Corinna Henkel and Prof. Peter Hoffmann. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. β-Tricalcium phosphate/poly(glycerol sebacate) scaffolds with robust mechanical property for bone tissue engineering.

    Science.gov (United States)

    Yang, Kai; Zhang, Jing; Ma, Xiaoyu; Ma, Yifan; Kan, Chao; Ma, Haiyan; Li, Yulin; Yuan, Yuan; Liu, Changsheng

    2015-11-01

    Despite good biocompatibility and osteoconductivity, porous β-TCP scaffolds still lack structural stability and mechanical robustness, which greatly limits their application in the field of bone regeneration. Hybridization of β-TCP with the conventional synthetic biodegradable polymers PLA and PCL has produced only a limited toughening effect, due to the intrinsic plasticity of these polymers. In this study, a β-TCP/poly(glycerol sebacate) scaffold (β-TCP/PGS) with a well-interconnected porous structure and robust mechanical properties was prepared. A porous β-TCP scaffold was first prepared with a polyurethane sponge as template and then impregnated with a PGS pre-polymer solution of moderate viscosity, followed by in situ heat crosslinking and freeze-drying. The results indicated that freeze-drying under vacuum could further facilitate crosslinking of PGS and the formation of Ca²⁺–COO⁻ ionic complexes, and thus synergistically improved the mechanical strength of the heat-crosslinked β-TCP/PGS. In particular, the β-TCP/PGS with 15% PGS content, after heat crosslinking at 130 °C and freeze-drying at −50 °C under vacuum, exhibited an elongation at break of 375 ± 25% and a compressive strength of 1.73 MPa, a 3.7-fold and 200-fold enhancement over the β-TCP, respectively. After the abrupt drop of compressive load, the β-TCP/PGS scaffolds fully recovered their original shape. More importantly, the PGS polymer in the β-TCP/PGS scaffolds could direct the biomineralization of Ca/P from a particulate shape into a nanofiber-interweaved structure. Furthermore, the β-TCP/PGS scaffolds allowed cell penetration and proliferation, indicating good cytocompatibility. It is believed that β-TCP/PGS scaffolds have great potential for application in rigid tissue regeneration. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Improved classification and visualization of healthy and pathological hard dental tissues by modeling specular reflections in NIR hyperspectral images

    Science.gov (United States)

    Usenik, Peter; Bürmen, Miran; Fidler, Aleš; Pernuš, Franjo; Likar, Boštjan

    2012-03-01

    Despite major improvements in dental healthcare and technology, dental caries remains one of the most prevalent chronic diseases of modern society. The initial stages of dental caries are characterized by demineralization of enamel crystals, commonly known as white spots, which are difficult to diagnose. Near-infrared (NIR) hyperspectral imaging is a promising new technique for early detection of demineralization which can classify healthy and pathological dental tissues. However, due to non-ideal illumination of the tooth surface, the hyperspectral images can exhibit specular reflections, in particular around the edges and ridges of the teeth. These reflections significantly affect the performance of automated classification and visualization methods. A cross-polarized imaging setup can effectively remove the specular reflections but is not always feasible, due to its complexity and other imaging setup limitations. In this paper, we propose an alternative approach based on modeling the specular reflections of hard dental tissues, which significantly improves the classification accuracy in the presence of specular reflections. The method was evaluated on five extracted human teeth with a corresponding gold standard for six different healthy and pathological hard dental tissues: enamel, dentin, calculus, dentin caries, enamel caries, and demineralized regions. Principal component analysis (PCA) was used for multivariate local modeling of healthy and pathological dental tissues. The classification was performed by employing multiple discriminant analysis. Based on the obtained results, we believe the proposed method can be considered an effective alternative to complex cross-polarized imaging setups.
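
    The per-pixel classification stage described above (dimensionality reduction of each NIR spectrum by PCA, followed by discriminant analysis) can be sketched in a few lines of Python with scikit-learn. All shapes, class counts, and data below are illustrative stand-ins, not the study's data.

      # PCA + linear discriminant analysis over per-pixel NIR spectra.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(1)
      n_bands = 256                                  # spectral channels per pixel
      X_train = rng.normal(size=(3000, n_bands))     # labeled training spectra
      y_train = rng.integers(0, 6, size=3000)        # 6 tissue classes

      clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
      clf.fit(X_train, y_train)

      cube = rng.normal(size=(128, 128, n_bands))    # hyperspectral image cube
      tissue_map = clf.predict(cube.reshape(-1, n_bands)).reshape(128, 128)
      print(tissue_map.shape)                        # per-pixel tissue labels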

  19. Automated classification and visualization of healthy and pathological dental tissues based on near-infrared hyper-spectral imaging

    Science.gov (United States)

    Usenik, Peter; Bürmen, Miran; Vrtovec, Tomaž; Fidler, Aleš; Pernuš, Franjo; Likar, Boštjan

    2011-03-01

    Despite major improvements in dental healthcare and technology, dental caries remains one of the most prevalent chronic diseases of modern society. The initial stages of dental caries are characterized by demineralization of enamel crystals, commonly known as white spots, which are difficult to diagnose. If detected early enough, such demineralization can be arrested and reversed by non-surgical means through well-established dental treatments (fluoride therapy, anti-bacterial therapy, low-intensity laser irradiation). Near-infrared (NIR) hyper-spectral imaging is a promising new technique for early detection of demineralization based on distinct spectral features of healthy and pathological dental tissues. In this study, we apply NIR hyper-spectral imaging to classify and visualize healthy and pathological dental tissues, including enamel, dentin, calculus, dentin caries, enamel caries, and demineralized areas. For this purpose, a standardized teeth database was constructed, consisting of 12 extracted human teeth with different degrees of natural dental lesions imaged by an NIR hyper-spectral system, X-ray, and a digital color camera. The color and X-ray images of the teeth were presented to a clinical expert for localization and classification of the dental tissues, thereby obtaining the gold standard. Principal component analysis was used for multivariate local modeling of healthy and pathological dental tissues. Finally, the dental tissues were classified by employing multiple discriminant analysis. High agreement was observed between the resulting classification and the gold standard, with the classification sensitivity and specificity exceeding 85 % and 97 %, respectively. This study demonstrates that NIR hyper-spectral imaging has considerable diagnostic potential for imaging hard dental tissues.

  20. β-Tricalcium phosphate/poly(glycerol sebacate) scaffolds with robust mechanical property for bone tissue engineering

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Kai [The State Key Laboratory of Bioreactor Engineering, East China University of Science and Technology, Shanghai 200237 (China); Engineering Research Centre for Biomedical Materials of Ministry of Education, East China University of Science and Technology, Shanghai 200237 (China); Zhang, Jing; Ma, Xiaoyu; Ma, Yifan; Kan, Chao [Key Laboratory for Ultrafine Materials of Ministry of Education, East China University of Science and Technology, Shanghai 200237 (China); Engineering Research Centre for Biomedical Materials of Ministry of Education, East China University of Science and Technology, Shanghai 200237 (China); Ma, Haiyan [Engineering Research Centre for Biomedical Materials of Ministry of Education, East China University of Science and Technology, Shanghai 200237 (China); Li, Yulin, E-mail: yulinli@ecust.edu.cn [Engineering Research Centre for Biomedical Materials of Ministry of Education, East China University of Science and Technology, Shanghai 200237 (China); Yuan, Yuan, E-mail: yyuan@ecust.edu.cn [The State Key Laboratory of Bioreactor Engineering, East China University of Science and Technology, Shanghai 200237 (China); Engineering Research Centre for Biomedical Materials of Ministry of Education, East China University of Science and Technology, Shanghai 200237 (China); Liu, Changsheng, E-mail: liucs@ecust.edu.cn [The State Key Laboratory of Bioreactor Engineering, East China University of Science and Technology, Shanghai 200237 (China); Key Laboratory for Ultrafine Materials of Ministry of Education, East China University of Science and Technology, Shanghai 200237 (China); Engineering Research Centre for Biomedical Materials of Ministry of Education, East China University of Science and Technology, Shanghai 200237 (China)

    2015-11-01

    Despite good biocompatibility and osteoconductivity, porous β-TCP scaffolds still lack structural stability and mechanical robustness, which greatly limits their application in the field of bone regeneration. Hybridization of β-TCP with the conventional synthetic biodegradable polymers PLA and PCL has produced only a limited toughening effect, due to the intrinsic plasticity of these polymers. In this study, a β-TCP/poly(glycerol sebacate) scaffold (β-TCP/PGS) with a well-interconnected porous structure and robust mechanical properties was prepared. A porous β-TCP scaffold was first prepared with a polyurethane sponge as template and then impregnated with a PGS pre-polymer solution of moderate viscosity, followed by in situ heat crosslinking and freeze-drying. The results indicated that freeze-drying under vacuum could further facilitate crosslinking of PGS and the formation of Ca²⁺–COO⁻ ionic complexes, and thus synergistically improved the mechanical strength of the heat-crosslinked β-TCP/PGS. In particular, the β-TCP/PGS with 15% PGS content, after heat crosslinking at 130 °C and freeze-drying at −50 °C under vacuum, exhibited an elongation at break of 375 ± 25% and a compressive strength of 1.73 MPa, a 3.7-fold and 200-fold enhancement over the β-TCP, respectively. After the abrupt drop of compressive load, the β-TCP/PGS scaffolds fully recovered their original shape. More importantly, the PGS polymer in the β-TCP/PGS scaffolds could direct the biomineralization of Ca/P from a particulate shape into a nanofiber-interweaved structure. Furthermore, the β-TCP/PGS scaffolds allowed cell penetration and proliferation, indicating good cytocompatibility. It is believed that β-TCP/PGS scaffolds have great potential for application in rigid tissue regeneration. - Graphical abstract: Robust β-TCP/PGS porous scaffolds are developed by incorporation of poly(glycerol sebacate) (PGS, a flexible

  1. MLVA Based Classification of Mycobacterium tuberculosis Complex Lineages for a Robust Phylogeographic Snapshot of Its Worldwide Molecular Diversity

    Science.gov (United States)

    Hill, Véronique; Zozio, Thierry; Sadikalay, Syndia; Viegas, Sofia; Streit, Elisabeth; Kallenius, Gunilla; Rastogi, Nalin

    2012-01-01

    Multiple-locus variable-number tandem repeat analysis (MLVA) is useful to establish transmission routes and sources of infections for various microorganisms, including the Mycobacterium tuberculosis complex (MTC). The recently released SITVITWEB database contains 12-loci Mycobacterial Interspersed Repetitive Units – Variable Number of Tandem DNA Repeats (MIRU-VNTR) profiles and spoligotype patterns for thousands of MTC strains; it uses MIRU International Types (MIT) and Spoligotype International Types (SIT) to designate clustered patterns worldwide. Considering existing doubts on the ability of spoligotyping alone to reveal exact phylogenetic relationships between MTC strains, we developed an MLVA-based classification for MTC genotypic lineages. We studied 6 different subsets of MTC isolates encompassing 7793 strains worldwide. Minimum spanning trees (MST) were constructed to identify major lineages, and the most common representative, located as a central node, was taken as the prototype defining the different phylogenetic groups. A total of 7 major lineages with their respective prototypes were identified: Indo-Oceanic/MIT57, East Asian and African Indian/MIT17, Euro American/MIT116, West African-I/MIT934, West African-II/MIT664, M. bovis/MIT49, and M. canettii/MIT60. Further MST subdivision identified an additional 34 sublineage MIT prototypes. The phylogenetic relationships among the 37 newly defined MIRU-VNTR lineages were inferred using a classification algorithm based on a Bayesian approach. This information was used to construct an updated phylogenetic and phylogeographic snapshot of worldwide MTC diversity, studied at the regional, sub-regional, and country level according to the United Nations specifications. We also looked for IS6110 insertional events that are known to modify the results of spoligotyping in specific circumstances, and showed that a fair portion of convergence leading to the currently observed bias in phylogenetic classification of strains may
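
    As a toy illustration of the MST construction (not the authors' software or data), the sketch below builds a minimum spanning tree over synthetic 12-locus MIRU-VNTR profiles with SciPy and takes the most connected node as a crude central "prototype"; real prototype selection also weighs how common each pattern is.

      # Minimum spanning tree over 12-locus MIRU-VNTR copy-number profiles.
      import numpy as np
      from scipy.sparse.csgraph import minimum_spanning_tree
      from scipy.spatial.distance import pdist, squareform

      rng = np.random.default_rng(2)
      profiles = rng.integers(1, 8, size=(50, 12))   # 50 strains x 12 MIRU loci

      dist = squareform(pdist(profiles, metric="cityblock"))  # repeat-count distance
      mst = minimum_spanning_tree(dist).toarray()
      mst = mst + mst.T                              # symmetrize for degree counting

      degree = (mst > 0).sum(axis=1)
      prototype = int(np.argmax(degree))             # most connected (central) strain
      print("prototype strain index:", prototype, "with degree", degree[prototype])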

  2. The synchronous neural interactions test as a functional neuromarker for post-traumatic stress disorder (PTSD): a robust classification method based on the bootstrap

    Science.gov (United States)

    Georgopoulos, A. P.; Tan, H.-R. M.; Lewis, S. M.; Leuthold, A. C.; Winskowski, A. M.; Lynch, J. K.; Engdahl, B.

    2010-02-01

    Traumatic experiences can produce post-traumatic stress disorder (PTSD), which is a debilitating condition and for which no biomarker currently exists (Institute of Medicine (US) 2006 Posttraumatic Stress Disorder: Diagnosis and Assessment (Washington, DC: National Academies)). Here we show that the synchronous neural interactions (SNI) test, which assesses the functional interactions among neural populations derived from magnetoencephalographic (MEG) recordings (Georgopoulos A P et al 2007 J. Neural Eng. 4 349-55), can successfully differentiate PTSD patients from healthy control subjects. Externally cross-validated, bootstrap-based analyses yielded >90% overall classification accuracy. In addition, all but one of 18 patients who were not receiving medications for their disease were correctly classified. Altogether, these findings document robust differences in brain function between the PTSD and control groups that can be used for differential diagnosis and that possess the potential for assessing and monitoring disease progression and effects of therapy.
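
    A bootstrap-based accuracy estimate of this kind can be sketched generically as below: resample subjects with replacement, train on the resample, and score on the held-out ("out-of-bag") subjects. The features are random stand-ins for the SNI measures, and the classifier is a plain linear discriminant, not the authors' exact procedure.

      # Bootstrap out-of-bag estimate of classification accuracy.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(3)
      X = rng.normal(size=(120, 20))            # 120 subjects x 20 features
      y = (rng.random(120) > 0.5).astype(int)   # PTSD vs control (synthetic labels)

      accs = []
      for _ in range(200):                      # 200 bootstrap replicates
          idx = rng.integers(0, len(y), len(y))        # resample with replacement
          oob = np.setdiff1d(np.arange(len(y)), idx)   # held-out subjects
          clf = LinearDiscriminantAnalysis().fit(X[idx], y[idx])
          accs.append(clf.score(X[oob], y[oob]))
      print(f"out-of-bag accuracy: {np.mean(accs):.2f} +/- {np.std(accs):.2f}")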

  3. A statistically harmonized alignment-classification in image space enables accurate and robust alignment of noisy images in single particle analysis.

    Science.gov (United States)

    Kawata, Masaaki; Sato, Chikara

    2007-06-01

    In determining the three-dimensional (3D) structure of macromolecular assemblies in single particle analysis, a large representative dataset of two-dimensional (2D) average images derived from a huge number of raw images is key to high resolution. Because alignments prior to averaging are computationally intensive, currently available multireference alignment (MRA) software does not survey every possible alignment. This leads to misaligned images, creating blurred averages and reducing the quality of the final 3D reconstruction. We present a new method in which multireference alignment is harmonized with classification (multireference multiple alignment: MRMA). This method enables a statistical comparison of multiple alignment peaks, reflecting the similarities between each raw image and a set of reference images. Among the selected alignment candidates for each raw image, misaligned images are statistically excluded, based on the principle that aligned raw images of similar projections have a dense distribution around the correctly aligned coordinates in image space. This newly developed method was examined for accuracy and speed using model image sets with various signal-to-noise ratios, and with electron microscope images of the Transient Receptor Potential C3 channel and the sodium channel. In every data set, the newly developed method outperformed conventional methods in robustness against noise and in speed, creating 2D average images of higher quality. This statistically harmonized alignment-classification combination should greatly improve the quality of single particle analysis.

  4. Multiplex coherent anti-Stokes Raman scattering microspectroscopy of brain tissue with higher ranking data classification for biomedical imaging

    Science.gov (United States)

    Pohling, Christoph; Bocklitz, Thomas; Duarte, Alex S.; Emmanuello, Cinzia; Ishikawa, Mariana S.; Dietzeck, Benjamin; Buckup, Tiago; Uckermann, Ortrud; Schackert, Gabriele; Kirsch, Matthias; Schmitt, Michael; Popp, Jürgen; Motzkus, Marcus

    2017-06-01

    Multiplex coherent anti-Stokes Raman scattering (MCARS) microscopy was carried out to map a solid tumor in mouse brain tissue. The border between normal and tumor tissue was visualized using support vector machines (SVM) as a higher-ranking type of data classification. Training data were collected separately in both tissue types, and the image contrast is based on the class affiliation of the individual spectra. Color coding in the image generated by SVM is then related to pathological information instead of single spectral intensities or spectral differences within the data set. The results show good agreement with the H&E-stained reference and spontaneous Raman microscopy, demonstrating the validity of the MCARS approach in combination with SVM.
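
    In outline, the SVM step amounts to training on spectra collected in each tissue type and then color-coding every image pixel by the predicted class. The sketch below does this with scikit-learn on synthetic spectra; it stands in for, and greatly simplifies, the actual MCARS processing.

      # SVM classification of per-pixel spectra into normal vs. tumor tissue.
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(4)
      normal = rng.normal(0.0, 1.0, size=(500, 300))  # training spectra, normal tissue
      tumor = rng.normal(0.3, 1.0, size=(500, 300))   # training spectra, tumor tissue
      X = np.vstack([normal, tumor])
      y = np.array([0] * 500 + [1] * 500)

      svm = SVC(kernel="rbf").fit(X, y)

      pixels = rng.normal(0.1, 1.0, size=(64 * 64, 300))  # one spectrum per pixel
      class_map = svm.predict(pixels).reshape(64, 64)     # 0/1 map for color coding
      print("fraction labeled tumor:", class_map.mean())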

  5. Detecting epileptic seizure with different feature extracting strategies using robust machine learning classification techniques by applying advance parameter optimization approach.

    Science.gov (United States)

    Hussain, Lal

    2018-06-01

    Epilepsy is a neurological disorder caused by abnormal excitability of neurons in the brain. Brain activity is monitored through the electroencephalogram (EEG) of patients in order to detect epileptic seizures, and the performance of EEG-based seizure detection depends on the feature extraction strategy. In this research, we extracted features using several strategies based on time- and frequency-domain characteristics, nonlinear measures, wavelet-based entropy, and a few statistical features. A deeper study was undertaken using machine learning classifiers while considering multiple factors. The support vector machine kernels were evaluated based on the multiclass kernel and the box constraint level. Likewise, for K-nearest neighbors (KNN), we compared different distance metrics, neighbor weights, and numbers of neighbors; for decision trees, we tuned the parameters controlling the maximum number of splits and the split criterion; and the ensemble classifiers were evaluated over different ensemble methods and learning rates. For training/testing, ten-fold cross-validation was employed, and performance was evaluated in the form of TPR, NPR, PPV, accuracy, and AUC. The support vector machine with a linear kernel and KNN with the city block distance metric gave the overall highest accuracy of 99.5%, which was higher than that obtained using the default parameters for these classifiers. Moreover, the highest separation (AUC = 0.9991, 0.9990) was obtained at different kernel scales using SVM. Additionally, K-nearest neighbors with the inverse squared distance weight gave higher performance across different numbers of neighbors. Finally, in distinguishing postictal heart rate oscillations from those of epileptic ictal subjects, a performance of 100% was obtained using different machine learning classifiers.
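
    The kind of parameter sweep described above can be reproduced generically with a cross-validated grid search. The sketch below tunes an SVM (kernel, box constraint C) and a KNN (distance metric, neighbor count, weighting) with ten-fold cross-validation on synthetic stand-ins for the EEG features; it illustrates the tuning pattern, not the paper's exact grids.

      # Grid search with 10-fold CV over SVM and KNN hyperparameters.
      import numpy as np
      from sklearn.model_selection import GridSearchCV
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.svm import SVC

      rng = np.random.default_rng(5)
      X = rng.normal(size=(400, 30))        # EEG-derived feature vectors (synthetic)
      y = rng.integers(0, 2, size=400)      # seizure vs non-seizure labels

      svm_grid = GridSearchCV(SVC(), {"kernel": ["linear", "rbf"],
                                      "C": [0.1, 1, 10]}, cv=10)
      knn_grid = GridSearchCV(KNeighborsClassifier(),
                              {"metric": ["cityblock", "euclidean"],
                               "n_neighbors": [1, 5, 11],
                               "weights": ["uniform", "distance"]}, cv=10)

      for name, grid in [("SVM", svm_grid), ("KNN", knn_grid)]:
          grid.fit(X, y)
          print(name, grid.best_params_, f"CV accuracy = {grid.best_score_:.3f}")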

  6. Hunter-gatherer postcranial robusticity relative to patterns of mobility, climatic adaptation, and selection for tissue economy.

    Science.gov (United States)

    Stock, J T

    2006-10-01

    Human skeletal robusticity is influenced by a number of factors, including habitual behavior, climate, and physique. Conflicting evidence as to the relative importance of these factors complicates our ability to interpret variation in robusticity in the past. It remains unclear how the pattern of robusticity in the skeleton relates to adaptive constraints on skeletal morphology. This study investigates variation in robusticity in claviculae, humeri, ulnae, femora, and tibiae among human foragers, relative to climate and habitual behavior. Cross-sectional geometric properties of the diaphyses are compared among hunter-gatherers from southern Africa (n = 83), the Andaman Islands (n = 32), Tierra del Fuego (n = 34), and the Great Lakes region (n = 15). The robusticity of both proximal and distal limb segments correlates negatively with climate and positively with patterns of terrestrial and marine mobility among these groups. However, the relative correspondence between robusticity and these factors varies throughout the body. In the lower limb, partial correlations between polar second moment of area (J(0.73)) and climate decrease from proximal to distal section locations, while this relationship increases from proximal to distal in the upper limb. Patterns of correlation between robusticity and mobility, either terrestrial or marine, generally increase from proximal to distal in the lower and upper limbs, respectively. This suggests that there may be a stronger relationship between observed patterns of diaphyseal hypertrophy and behavioral differences between populations in distal elements. Despite this trend, strength circularity indices at the femoral midshaft show the strongest correspondence with terrestrial mobility, particularly among males.

  7. Three-dimensional analysis and classification of arteries in the skin and subcutaneous adipofascial tissue by computer graphics imaging.

    Science.gov (United States)

    Nakajima, H; Minabe, T; Imanishi, N

    1998-09-01

    To develop new types of surgical flaps that utilize portions of the skin and subcutaneous tissue (e.g., a thin flap or an adipofascial flap), three-dimensional investigation of the vasculature in the skin and subcutaneous tissue has been anticipated. In the present study, total-body arterial injection and three-dimensional imaging of the arteries by computer graphics were performed. The full-thickness skin and subcutaneous adipofascial tissue samples, which were obtained from fresh human cadavers injected with radio-opaque medium, were divided into three distinct layers. Angiograms of each layer were introduced into a personal computer to construct three-dimensional images. On a computer monitor, each artery was shown color-coded according to the three portions: the deep adipofascial layer, superficial adipofascial layer, and dermis. Three-dimensional computerized images of each artery in the skin and subcutaneous tissue revealed the components of each vascular plexus and permitted their classification into six types. The distribution of types in the body correlated with the tissue mobility of each area. Clinically, appreciation of the three-dimensional structure of the arteries allowed the development of several new kinds of flaps.

  8. Multispectral imaging burn wound tissue classification system: a comparison of test accuracies between several common machine learning algorithms

    Science.gov (United States)

    Squiers, John J.; Li, Weizhi; King, Darlene R.; Mo, Weirong; Zhang, Xu; Lu, Yang; Sellke, Eric W.; Fan, Wensheng; DiMaio, J. Michael; Thatcher, Jeffrey E.

    2016-03-01

    The clinical judgment of expert burn surgeons is currently the standard on which diagnostic and therapeutic decision-making regarding burn injuries is based. Multispectral imaging (MSI) has the potential to increase the accuracy of burn depth assessment and the intraoperative identification of viable wound bed during surgical debridement of burn injuries. A highly accurate classification model must be developed using machine-learning techniques in order to translate MSI data into clinically relevant information. An animal burn model was developed to build an MSI training database and to study the burn tissue classification ability of several models trained via common machine-learning algorithms. The algorithms tested, from least to most complex, were: K-nearest neighbors (KNN), decision tree (DT), linear discriminant analysis (LDA), weighted linear discriminant analysis (W-LDA), quadratic discriminant analysis (QDA), ensemble linear discriminant analysis (EN-LDA), ensemble K-nearest neighbors (EN-KNN), and ensemble decision tree (EN-DT). After the ground-truth database of six tissue types (healthy skin, wound bed, blood, hyperemia, partial injury, full injury) was generated by histopathological analysis, we used 10-fold cross-validation to compare the algorithms' performances based on their accuracies in classifying data against the ground truth, and each algorithm was tested 100 times. The mean test accuracies of the algorithms were: KNN, 68.3%; DT, 61.5%; LDA, 70.5%; W-LDA, 68.1%; QDA, 68.9%; EN-LDA, 56.8%; EN-KNN, 49.7%; and EN-DT, 36.5%. LDA had the highest test accuracy, reflecting the bias-variance tradeoff over the range of complexities inherent to the algorithms tested. Several algorithms were able to match the current standard in burn tissue classification, the clinical judgment of expert burn surgeons. These results will guide further development of an MSI burn tissue classification system. Given that there are few surgeons and facilities specializing in burn care
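
    Comparing such a ladder of classifiers on a common labeled feature set is straightforward with cross-validation; the sketch below scores four of the tested model families with 10-fold cross-validation on synthetic stand-ins for the per-pixel multispectral features (a pattern illustration, not the study's data or exact protocol).

      # 10-fold cross-validated comparison of several classifier families.
      import numpy as np
      from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                                 QuadraticDiscriminantAnalysis)
      from sklearn.model_selection import cross_val_score
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(6)
      X = rng.normal(size=(600, 8))         # 8 spectral bands per pixel (synthetic)
      y = rng.integers(0, 6, size=600)      # 6 tissue classes

      models = {
          "KNN": KNeighborsClassifier(),
          "DT": DecisionTreeClassifier(),
          "LDA": LinearDiscriminantAnalysis(),
          "QDA": QuadraticDiscriminantAnalysis(),
      }
      for name, model in models.items():
          scores = cross_val_score(model, X, y, cv=10)
          print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")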

  9. The classification of benign and malignant human prostate tissue by multivariate analysis of ¹H magnetic resonance spectra

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, P.; Smith, I.; Leboldus, L.; Littman, C.; Somorjai, L.; Bezabeh, T. [Institute for Biodiagnostic, National Research Council, Manitoba (Canada)

    1998-04-01

    ¹H magnetic resonance spectroscopy studies (360 MHz) were performed on specimens of benign (n = 66) and malignant (n = 21) human prostate tissue from 50 patients, and the spectral data were subjected to multivariate analysis, specifically linear-discriminant analysis. On the basis of histopathological assessments, an overall classification accuracy of 96.6 % was achieved, with a sensitivity of 100 % and a specificity of 95.5 %, in distinguishing benign prostatic hyperplasia from prostatic cancer. Resonances due to citrate, glutamate, and taurine were among the six spectral subregions identified by our algorithm as having diagnostic potential. Significantly higher levels of citrate were observed in glandular than in stromal benign prostatic hyperplasia (P < 0.05). This method shows excellent promise for the possibility of in vivo assessment of prostate tissue by magnetic resonance. (author)

  10. Automatic multi-modal MR tissue classification for the assessment of response to bevacizumab in patients with glioblastoma

    International Nuclear Information System (INIS)

    Liberman, Gilad; Louzoun, Yoram; Aizenstein, Orna; Blumenthal, Deborah T.; Bokstein, Felix; Palmon, Mika; Corn, Benjamin W.; Ben Bashat, Dafna

    2013-01-01

    Background: Current methods for evaluation of treatment response in glioblastoma are inaccurate, limited and time-consuming. This study aimed to develop a multi-modal MRI automatic classification method to improve accuracy and efficiency of treatment response assessment in patients with recurrent glioblastoma (GB). Materials and methods: A modification of the k-Nearest-Neighbors (kNN) classification method was developed and applied to 59 longitudinal MR data sets of 13 patients with recurrent GB undergoing bevacizumab (anti-angiogenic) therapy. Changes in the enhancing tumor volume were assessed using the proposed method and compared with Macdonald's criteria and with manual volumetric measurements. The edema-like area was further subclassified into peri- and non-peri-tumoral edema, using both the kNN method and an unsupervised method, to monitor longitudinal changes. Results: Automatic classification using the modified kNN method was applicable in all scans, even when the tumors were infiltrative with unclear borders. The enhancing tumor volume obtained using the automatic method was highly correlated with manual measurements (N = 33, r = 0.96, p < 0.0001), while standard radiographic assessment based on Macdonald's criteria matched manual delineation and automatic results in only 68% of cases. A graded pattern of tumor infiltration within the edema-like area was revealed by both automatic methods, showing high agreement. All classification results were confirmed by a senior neuro-radiologist and validated using MR spectroscopy. Conclusion: This study emphasizes the important role of automatic tools based on a multi-modal view of the tissue in monitoring therapy response in patients with high-grade gliomas, specifically under anti-angiogenic therapy.

  11. High throughput assessment of cells and tissues: Bayesian classification of spectral metrics from infrared vibrational spectroscopic imaging data.

    Science.gov (United States)

    Bhargava, Rohit; Fernandez, Daniel C; Hewitt, Stephen M; Levin, Ira W

    2006-07-01

    Vibrational spectroscopy allows a visualization of tissue constituents based on intrinsic chemical composition and provides a potential route to obtaining diagnostic markers of diseases. Characterizations utilizing infrared vibrational spectroscopy, in particular, are conventionally low throughput in data acquisition and generally lacking in spatial resolution, with the resulting data requiring intensive numerical computations to extract information. These factors impair the ability of infrared spectroscopic measurements to represent accurately the spatial heterogeneity in tissue, to incorporate robustly the diversity introduced by patient cohorts or preparative artifacts, and to validate developed protocols in large population studies. In this manuscript, we demonstrate a combination of Fourier transform infrared (FTIR) spectroscopic imaging, tissue microarrays (TMAs), and fast numerical analysis as a paradigm for the rapid analysis, development, and validation of high throughput spectroscopic characterization protocols. We provide an extended description of the data treatment algorithm and a discussion of various factors that may influence decision-making using this approach. Finally, a number of prostate tissue biopsies, arranged in an array modality, are employed to examine the efficacy of this approach in histologic recognition of epithelial cell polarization in patients displaying a variety of normal, malignant and hyperplastic conditions. An index of epithelial cell polarization, derived from a combined spectral and morphological analysis, is determined to be a potentially useful diagnostic marker.

  12. Biased visualization of hypoperfused tissue by computed tomography due to short imaging duration: improved classification by image down-sampling and vascular models

    Energy Technology Data Exchange (ETDEWEB)

    Mikkelsen, Irene Klaerke; Ribe, Lars Riisgaard; Bekke, Susanne Lise; Tietze, Anna; Oestergaard, Leif; Mouridsen, Kim [Aarhus University Hospital, Center of Functionally Integrative Neuroscience, Aarhus C (Denmark); Jones, P.S.; Alawneh, Josef [University of Cambridge, Department of Clinical Neurosciences, Cambridge (United Kingdom); Puig, Josep; Pedraza, Salva [Dr. Josep Trueta Girona University Hospitals, Department of Radiology, Girona Biomedical Research Institute, Girona (Spain); Gillard, Jonathan H. [University of Cambridge, Department of Radiology, Cambridge (United Kingdom); Warburton, Elisabeth A. [Cambrigde University Hospitals, Addenbrooke, Stroke Unit, Cambridge (United Kingdom); Baron, Jean-Claude [University of Cambridge, Department of Clinical Neurosciences, Cambridge (United Kingdom); Centre Hospitalier Sainte Anne, INSERM U894, Paris (France)

    2015-07-15

    Lesion detection in acute stroke by computed tomography perfusion (CTP) can be affected by incomplete bolus coverage in veins and hypoperfused tissue, so-called bolus truncation (BT), and by a low contrast-to-noise ratio (CNR). We examined the BT frequency and hypothesized that image down-sampling and a vascular model (VM) for perfusion calculation would improve the classification of normo- and hypoperfused tissue. CTP datasets from 40 acute stroke patients were retrospectively analysed for BT. In 16 patients with hypoperfused tissue but no BT, repeated 2-by-2 image down-sampling and uniform filtering were performed, comparing CNR to perfusion-MRI levels and tissue classification to that of the unprocessed data. By simulating reduced scan duration, the minimum scan duration at which estimated lesion volumes came within 10 % of their true volume was compared for VM and state-of-the-art algorithms. BT in veins and hypoperfused tissue was observed in 9/40 (22.5 %) and 17/40 patients (42.5 %), respectively. Down-sampling to 128 × 128 resolution yielded CNR comparable to MR data and improved tissue classification (p = 0.0069). VM reduced the minimum scan duration, providing reliable maps of cerebral blood flow and mean transit time: 5 s (p = 0.03) and 7 s (p < 0.0001), respectively. BT is not uncommon in stroke CTP with 40-s scan duration. Applying image down-sampling and VM improves tissue classification. (orig.)
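
    The down-sampling step trades spatial resolution for CNR: averaging each non-overlapping 2-by-2 block halves the image dimensions and, for independent noise, roughly halves the noise standard deviation. A minimal NumPy sketch on a synthetic perfusion-like map (illustrative data, not the study's):

      # 2x2 block-average down-sampling and its effect on CNR.
      import numpy as np

      rng = np.random.default_rng(7)
      signal = np.zeros((256, 256))
      signal[96:160, 96:160] = 1.0                        # hypoperfused "lesion"
      image = signal + rng.normal(0, 0.5, signal.shape)   # noisy CTP-like map

      def downsample2x2(img):
          """Average non-overlapping 2x2 blocks, halving each dimension."""
          h, w = img.shape
          return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

      def cnr(img, scale):
          """Lesion-to-background contrast over background noise std."""
          lesion = img[96 // scale:160 // scale, 96 // scale:160 // scale]
          background = img[:64 // scale, :64 // scale]
          return (lesion.mean() - background.mean()) / background.std()

      small = downsample2x2(image)                        # 128 x 128
      print(f"CNR original:    {cnr(image, 1):.2f}")
      print(f"CNR downsampled: {cnr(small, 2):.2f}")      # roughly doubled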

  13. Biased visualization of hypoperfused tissue by computed tomography due to short imaging duration: improved classification by image down-sampling and vascular models

    International Nuclear Information System (INIS)

    Mikkelsen, Irene Klaerke; Ribe, Lars Riisgaard; Bekke, Susanne Lise; Tietze, Anna; Oestergaard, Leif; Mouridsen, Kim; Jones, P.S.; Alawneh, Josef; Puig, Josep; Pedraza, Salva; Gillard, Jonathan H.; Warburton, Elisabeth A.; Baron, Jean-Claude

    2015-01-01

    Lesion detection in acute stroke by computed tomography perfusion (CTP) can be affected by incomplete bolus coverage in veins and hypoperfused tissue, so-called bolus truncation (BT), and by a low contrast-to-noise ratio (CNR). We examined the BT frequency and hypothesized that image down-sampling and a vascular model (VM) for perfusion calculation would improve the classification of normo- and hypoperfused tissue. CTP datasets from 40 acute stroke patients were retrospectively analysed for BT. In 16 patients with hypoperfused tissue but no BT, repeated 2-by-2 image down-sampling and uniform filtering were performed, comparing CNR to perfusion-MRI levels and tissue classification to that of the unprocessed data. By simulating reduced scan duration, the minimum scan duration at which estimated lesion volumes came within 10 % of their true volume was compared for VM and state-of-the-art algorithms. BT in veins and hypoperfused tissue was observed in 9/40 (22.5 %) and 17/40 patients (42.5 %), respectively. Down-sampling to 128 × 128 resolution yielded CNR comparable to MR data and improved tissue classification (p = 0.0069). VM reduced the minimum scan duration, providing reliable maps of cerebral blood flow and mean transit time: 5 s (p = 0.03) and 7 s (p < 0.0001), respectively. BT is not uncommon in stroke CTP with 40-s scan duration. Applying image down-sampling and VM improves tissue classification. (orig.)

  14. The differentiation of malignant and benign human breast tissue at surgical margins and biopsy using x-ray interaction data and Bayesian classification

    International Nuclear Information System (INIS)

    Mersov, A.; Mersov, G.; Al-Ebraheem, A.; Cornacchi, S.; Gohla, G.; Lovrics, P.; Farquharson, M.J.

    2014-01-01

    Worldwide, about 1.3 million women are diagnosed with breast cancer annually, with an estimated 465,000 deaths. Accordingly, there is a need for high accuracy and speed in the diagnosis of lesions suspected of being cancerous. This study assesses the interaction data collected from low-energy x-rays within breast tissue samples. Trace element concentrations are assessed using x-ray fluorescence, as well as electron density and molecular structure, which are examined using incoherent and coherent scatter, respectively. Our work to date has shown that such data can provide a quantitative measure of certain tissue-characterising parameters and hence, through appropriate modelling, could be used to classify samples for uses such as surgical margin detection and biopsy examination. The parameters used in this study for comparing the normal and tumour tissue sample populations are: the levels of the elements Ca, Cu, Fe, Br, Zn, Rb, and K; the area, FWHM, and amplitude of peaks fitted to the coherent scatter profile that are associated with fat, fibre, and water content; and the ratio of the Compton and coherent scatter peak area, FWHM, and amplitude from the incoherent scatter profile. The novelty of this work lies in the fact that the classification process does not rely on one source of data but combines several measurements, which in this application are modelled using a method based on Bayesian classification. The reliability of the classifications was assessed by applying them to diagnostically known data that were not themselves included in determining the thresholds. The results of the classification of over 70 breast tissue samples are presented in this study. Bayesian modelling was carried out using selected significant parameters for classification, resulting in 71% of normal tissue samples (n=35) and 66% of tumour tissue samples (n=35) being correctly classified when using all the samples. Bayesian classification using the same variables on all
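
    As a generic stand-in for the combined Bayesian model (the paper's exact formulation is not reproduced here), the sketch below feeds several measured tissue parameters at once into a Gaussian naive Bayes classifier and estimates accuracy on held-out samples; all feature values are synthetic.

      # Gaussian naive Bayes over several tissue-characterising features.
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.naive_bayes import GaussianNB

      rng = np.random.default_rng(8)
      # Columns: e.g. element levels plus scatter-peak features (synthetic).
      normal = rng.normal(loc=[1.0, 0.5, 0.8, 2.0], scale=0.3, size=(35, 4))
      tumour = rng.normal(loc=[1.4, 0.7, 1.1, 1.6], scale=0.3, size=(35, 4))
      X = np.vstack([normal, tumour])
      y = np.array([0] * 35 + [1] * 35)                    # 0 = normal, 1 = tumour

      scores = cross_val_score(GaussianNB(), X, y, cv=5)   # held-out evaluation
      print(f"cross-validated accuracy: {scores.mean():.2f}")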

  15. Automated tissue classification of pediatric brains from magnetic resonance images using age-specific atlases

    Science.gov (United States)

    Metzger, Andrew; Benavides, Amanda; Nopoulos, Peg; Magnotta, Vincent

    2016-03-01

    The goal of this project was to develop two age-appropriate atlases (neonatal and one year old) that account for the rapid growth and maturational changes that occur during early development. Tissue maps for this age group were initially created by manually correcting the tissue maps that resulted from applying an expectation maximization (EM) algorithm and an adult atlas to pediatric subjects. The EM algorithm classified each voxel into one of ten possible tissue types, including several subcortical structures. This was followed by a novel level set segmentation designed to improve differentiation between distal cortical gray matter and white matter. To minimize the required manual corrections, the adult atlas was registered to the pediatric scans using high-dimensional, symmetric image normalization (SyN) registration. The subject images were then mapped to an age-specific atlas space, again using SyN registration, and the resulting transformation was applied to the manually corrected tissue maps. The individual maps were averaged in the age-specific atlas space and blurred to generate the age-appropriate anatomical priors. The resulting anatomical priors were then used by the EM algorithm to re-segment the initial training set as well as an independent testing set. The results from the adult and age-specific anatomical priors were compared to the manually corrected results. The age-appropriate atlas provided superior results as compared to the adult atlas. The image analysis pipeline used in this work was built using the open source software package BRAINSTools.

  16. Segmentation methodology for automated classification and differentiation of soft tissues in multiband images of high-resolution ultrasonic transmission tomography.

    Science.gov (United States)

    Jeong, Jeong-Won; Shin, Dae C; Do, Synho; Marmarelis, Vasilis Z

    2006-08-01

    This paper presents a novel segmentation methodology for automated classification and differentiation of soft tissues using multiband data obtained with the newly developed system of high-resolution ultrasonic transmission tomography (HUTT) for imaging biological organs. This methodology extends and combines two existing approaches: the L-level set active contour (AC) segmentation approach and the agglomerative hierarchical k-means approach for unsupervised clustering (UC). To prevent the trapping of the iterative minimization AC algorithm in a local minimum, we introduce a multiresolution approach that applies the level set functions at successively increasing resolutions of the image data. The resulting AC clusters are subsequently rearranged by the UC algorithm, which seeks the optimal set of clusters yielding the minimum within-cluster distances in the feature space. The presented results from Monte Carlo simulations and experimental animal-tissue data demonstrate that the proposed methodology outperforms other existing methods without depending on heuristic parameters and provides a reliable means for soft tissue differentiation in HUTT images.

  17. Lichenoid tissue reaction/interface dermatitis: Recognition, classification, etiology, and clinicopathological overtones

    Directory of Open Access Journals (Sweden)

    Virendra N Sehgal

    2011-01-01

    Lichenoid tissue reaction or interface dermatitis embraces several clinical conditions, the prototype of which is lichen planus and its variants; others include drug-induced lichenoid dermatitis, special forms of lichenoid dermatitis, lichenoid dermatitis in lupus erythematosus, and miscellaneous disorders showing lichenoid dermatitis. The salient clinical and histological features of each are described to facilitate diagnosis, and the background of the lichenoid reaction pattern is briefly outlined for those interested in this entity.

  18. Influence of keratinized tissue on spontaneous exposure of submerged implants: classification and clinical observations

    Directory of Open Access Journals (Sweden)

    G. Mendoza

    2014-10-01

    Aim: The reasons for spontaneous early exposure (SEE) of dental implants during healing have not yet been established. The objective of this study was to assess whether the width of keratinized tissue (KT) and other site-related conditions could be associated with implants' SEE. Materials and methods: Data from 500 implants placed in 138 non-smoking patients between September 2009 and June 2010 were evaluated. Implants were submerged and allowed to heal for 3 to 6 months. At baseline, the following conditions were documented: the presence of a keratinized tissue width > 2 mm; the type of implant site (i.e., fresh extraction socket or edentulous alveolar ridge); and the concomitant use of guided tissue regeneration. During the healing period, the occurrence of partial or total implant SEE was recorded, and a mixed-effects logistic regression analysis was performed to investigate the association between implant site conditions and implant exposure. Results: One hundred and eighty-five implants (37.0%) remained submerged after healing and were classified as Class I, whereas 215 (43.0%) showed partial spontaneous early exposure (SEE) at the first week after implant placement (Class II), and 100 implants (20.0%) developed more extensive exposures (Class III). The variables baseline width of KT (p = 0.18), fresh extraction socket (p = 0.88), and guided tissue regeneration (GTR) plus bone substitutes (p = 0.42) were not found to be correlated with implants' SEE, with odds ratios (OR) of 1.29 (95% confidence interval: -0.12–0.63), 1.03 (95% confidence interval: -0.46–0.53), and 1.22 (95% confidence interval: -0.29–0.68), respectively. Conclusion: It was not possible to establish an association between SEE and these implant-related factors; therefore, further investigations focused on the reasons for implants' SEE are needed.

  19. Identification of immune cell infiltration in hematoxylin-eosin stained breast cancer samples: texture-based classification of tissue morphologies

    Science.gov (United States)

    Turkki, Riku; Linder, Nina; Kovanen, Panu E.; Pellinen, Teijo; Lundin, Johan

    2016-03-01

    The characteristics of immune cells in the tumor microenvironment of breast cancer capture clinically important information. Despite the heterogeneity of tumor-infiltrating immune cells, it has been shown that the degree of infiltration assessed by visual evaluation of hematoxylin-eosin (H&E) stained samples has prognostic and possibly predictive value. However, quantification of the infiltration in H&E-stained tissue samples currently depends on visual scoring by an expert. Computer vision enables automated characterization of the components of the tumor microenvironment, and texture-based methods have successfully been used to discriminate between different tissue morphologies and cell phenotypes. In this study, we evaluate whether local binary pattern texture features with superpixel segmentation and classification with a support vector machine can be utilized to identify immune cell infiltration in H&E-stained breast cancer samples. Guided by the pan-leukocyte CD45 marker, we annotated training and test sets from 20 primary breast cancer samples. In the training set of arbitrarily sized image regions (n=1,116), a 3-fold cross-validation resulted in 98% accuracy and an area under the receiver-operating characteristic curve (AUC) of 0.98 in discriminating between immune cell-rich and -poor areas. In the test set (n=204), we achieved an accuracy of 96% and an AUC of 0.99 in labeling cropped tissue regions correctly into immune cell-rich and -poor categories. The obtained results demonstrate strong discrimination between immune cell-rich and -poor tissue morphologies. The proposed method can provide a quantitative measurement of the degree of immune cell infiltration and can be applied to digitally scanned H&E-stained breast cancer samples for diagnostic purposes.
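
    The texture pipeline above (local binary pattern histograms per region, classified with an SVM) can be sketched with scikit-image and scikit-learn. The regions below are synthetic noise patches standing in for superpixels; the parameters P and R are ordinary LBP settings, not the study's.

      # LBP histograms per region + linear SVM (synthetic stand-in regions).
      import numpy as np
      from skimage.feature import local_binary_pattern
      from sklearn.svm import SVC

      rng = np.random.default_rng(9)

      def lbp_histogram(gray, P=8, R=1.0):
          """Normalized histogram of uniform LBP codes for one region."""
          codes = local_binary_pattern(gray, P, R, method="uniform")
          hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
          return hist

      def make_region(noise_std):
          """Synthetic grayscale patch; higher noise mimics busier texture."""
          return rng.normal(128, noise_std, (64, 64)).clip(0, 255).astype(np.uint8)

      smooth = [make_region(5) for _ in range(50)]     # immune cell-poor stand-ins
      busy = [make_region(60) for _ in range(50)]      # immune cell-rich stand-ins
      X = np.array([lbp_histogram(r) for r in smooth + busy])
      y = np.array([0] * 50 + [1] * 50)

      clf = SVC(kernel="linear").fit(X[::2], y[::2])   # train on every other region
      print("held-out accuracy:", clf.score(X[1::2], y[1::2]))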

  20. Machine Learning of Human Pluripotent Stem Cell-Derived Engineered Cardiac Tissue Contractility for Automated Drug Classification

    Directory of Open Access Journals (Sweden)

    Eugene K. Lee

    2017-11-01

    Accurately predicting the cardioactive effects of new molecular entities for therapeutics remains a daunting challenge. Immense research effort has been focused toward creating new screening platforms that utilize human pluripotent stem cell (hPSC)-derived cardiomyocytes and three-dimensional engineered cardiac tissue constructs to better recapitulate human heart function and drug responses. As these new platforms become increasingly sophisticated and high throughput, the drug screens result in larger multidimensional datasets. Improved automated analysis methods must therefore be developed in parallel to fully comprehend the cellular response across a multidimensional parameter space. Here, we describe the use of machine learning to comprehensively analyze 17 functional parameters derived from force readouts of hPSC-derived ventricular cardiac tissue strips (hvCTS) electrically paced at a range of frequencies and exposed to a library of compounds. The generated metric is effective for determining the cardioactivity of a given drug. Furthermore, we demonstrate a classification model that can automatically predict the mechanistic action of an unknown cardioactive drug.

  1. Segmentation and labeling of the ventricular system in normal pressure hydrocephalus using patch-based tissue classification and multi-atlas labeling

    Science.gov (United States)

    Ellingsen, Lotta M.; Roy, Snehashis; Carass, Aaron; Blitz, Ari M.; Pham, Dzung L.; Prince, Jerry L.

    2016-03-01

    Normal pressure hydrocephalus (NPH) affects older adults and is thought to be caused by obstruction of the normal flow of cerebrospinal fluid (CSF). NPH typically presents with cognitive impairment, gait dysfunction, and urinary incontinence, and may account for more than five percent of all cases of dementia. Unlike most other causes of dementia, NPH can potentially be treated and the neurological dysfunction reversed by shunt surgery or endoscopic third ventriculostomy (ETV), which drain excess CSF. However, a major diagnostic challenge remains to robustly identify shunt-responsive NPH patients from patients with enlarged ventricles due to other neurodegenerative diseases. Currently, radiologists grade the severity of NPH by detailed examination and measurement of the ventricles based on stacks of 2D magnetic resonance images (MRIs). Here we propose a new method to automatically segment and label different compartments of the ventricles in NPH patients from MRIs. While this task has been achieved in healthy subjects, the ventricles in NPH are both enlarged and deformed, causing current algorithms to fail. Here we combine a patch-based tissue classification method with a registration-based multi-atlas labeling method to generate a novel algorithm that labels the lateral, third, and fourth ventricles in subjects with ventriculomegaly. The method is also applicable to other neurodegenerative diseases such as Alzheimer's disease, a condition considered in the differential diagnosis of NPH. Comparisons with state-of-the-art segmentation techniques demonstrate substantial improvements in labeling the enlarged ventricles, indicating that this strategy may be a viable option for the diagnosis and characterization of NPH.

  2. A robust approach to identifying tissue-specific gene expression regulatory variants using personalized human induced pluripotent stem cells.

    Directory of Open Access Journals (Sweden)

    Je-Hyuk Lee

    2009-11-01

    Normal variation in gene expression due to regulatory polymorphisms is often masked by biological and experimental noise. In addition, some regulatory polymorphisms may become apparent only in specific tissues. We derived human induced pluripotent stem (iPS) cells from adult skin primary fibroblasts and attempted to detect tissue-specific cis-regulatory variants using in vitro cell differentiation. We used padlock probes and high-throughput sequencing for digital RNA allelotyping and measured allele-specific gene expression in primary fibroblasts, lymphoblastoid cells, iPS cells, and their differentiated derivatives. We show that allele-specific expression is both cell-type and genotype-dependent, but the majority of detectable allele-specific expression loci remain consistent despite large changes in cell type or experimental condition following iPS reprogramming, except on the X-chromosome. We show that our approach to mapping cis-regulatory variants reduces in vitro experimental noise and reveals additional tissue-specific variants using skin-derived human iPS cells.

  3. Classification of trace elements in tissues from organic and conventional French pig production.

    Science.gov (United States)

    Parinet, Julien; Royer, Eric; Saint-Hilaire, Mailie; Chafey, Claude; Noël, Laurent; Minvielle, Brice; Dervilly-Pinel, Gaud; Engel, Erwan; Guérin, Thierry

    2018-07-01

    This study assesses the impact of the farming system on the levels of copper, zinc, arsenic, cadmium, lead and mercury in pig tissues from three types of production (Organic (n = 28), Label Rouge (n = 12) and Conventional (n = 30)) randomly sampled in different slaughterhouses. All the concentrations were below regulatory limits. In muscles, Cu, Zn and As were measured at slightly higher levels in organic samples, but no differences between organic and Label Rouge were observed. Livers from conventional and Label Rouge pig farms exhibited higher Zn and Cd contents than the organic ones, probably due to different practices in zinc or phytase supplementation of fattening diets. Principal component analysis indicated a correlation between Cu and As concentrations in liver and carcass weight, and between Zn and Cd liver levels and lean meat percentage. The linear discriminant analysis succeeded in predicting the farming process on the basis of the lean meat percentage and the liver Cd level.
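
    The discriminant step can be pictured with a small scikit-learn sketch that fits a linear discriminant on two predictors analogous to those named above; all values are synthetic stand-ins, not the study's data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(6)
        # Hypothetical lean meat percentage and liver Cd (mg/kg) per carcass.
        lean = np.r_[rng.normal(58, 2, 28), rng.normal(61, 2, 42)]
        cd = np.r_[rng.normal(0.05, 0.01, 28), rng.normal(0.09, 0.02, 42)]
        y = np.r_[np.zeros(28), np.ones(42)]   # 0 = organic, 1 = other systems
        X = np.c_[lean, cd]

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("training accuracy:", lda.score(X, y).round(2))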

  4. Experiments on automatic classification of tissue malignancy in the field of digital pathology

    Science.gov (United States)

    Pereira, J.; Barata, R.; Furtado, Pedro

    2017-06-01

    Automated analysis of histological images helps diagnose and further classify breast cancer. Fully automated approaches can be used to pinpoint images for further analysis by the medical doctor. But tissue images are especially challenging for both manual and automated approaches, due to mixed patterns and textures, where malignant regions are sometimes difficult to detect unless they are in very advanced stages. Some of the major challenges are related to irregular and very diffuse patterns, as well as the difficulty of defining winning features and classifier models. Although the diffuse nature also makes it hard to segment the image correctly into regions, it is still crucial to compute low-level features over individual regions instead of the whole image, and to select the features with the best outcomes. In this paper we report on our experiments building a region classifier with a simple subspace division and a feature selection model that improves results over image-wide and/or limited feature sets. Experimental results show modest accuracy for a set of classifiers applied over the whole image, while the combination of image division, per-region extraction of low-level features, and feature selection, together with a neural network classifier, achieved the best accuracy for the dataset and settings we used in the experiments. Future work involves deep learning techniques, adding structure semantics, and embedding the approach as a tumor-finding helper in a practical medical imaging application.
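
    A minimal sketch of the per-region pipeline described above, with a hypothetical three-feature set and scikit-learn's multilayer perceptron standing in for the neural network classifier (synthetic data; not the authors' features or architecture):

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def region_features(img, grid=4):
            """Split a grayscale image into grid x grid sub-regions and compute
            simple low-level features (mean, std, gradient energy) per region."""
            h, w = img.shape
            feats = []
            for i in range(grid):
                for j in range(grid):
                    patch = img[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
                    gy, gx = np.gradient(patch.astype(float))
                    feats.append([patch.mean(), patch.std(), (gx**2 + gy**2).mean()])
            return np.array(feats)   # one feature row per region

        # Toy training data: random "images" with per-region malignancy labels.
        rng = np.random.default_rng(0)
        X = np.vstack([region_features(rng.random((64, 64))) for _ in range(20)])
        y = rng.integers(0, 2, size=len(X))
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                            random_state=0).fit(X, y)
        print(clf.predict(X[:5]))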

  5. Classification of Laser Induced Fluorescence Spectra from Normal and Malignant bladder tissues using Learning Vector Quantization Neural Network in Bladder Cancer Diagnosis

    DEFF Research Database (Denmark)

    Karemore, Gopal Raghunath; Mascarenhas, Kim Komal; Patil, Choudhary

    2008-01-01

    In the present work we discuss the potential of a recently developed classification algorithm, Learning Vector Quantization (LVQ), for the analysis of Laser Induced Fluorescence (LIF) spectra recorded from normal and malignant bladder tissue samples. The algorithm is prototype based and inherently...
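
    For readers unfamiliar with the algorithm, the sketch below implements the basic LVQ1 update rule on synthetic "spectra"; it is a didactic illustration, not the study's implementation.

        import numpy as np

        def lvq1_train(X, y, prototypes, proto_labels, lr=0.05, epochs=20):
            """LVQ1: move the nearest prototype toward a same-class sample,
            and away from a different-class sample."""
            P = prototypes.copy()
            for _ in range(epochs):
                for xi, yi in zip(X, y):
                    k = np.argmin(((P - xi) ** 2).sum(axis=1))  # nearest prototype
                    if proto_labels[k] == yi:
                        P[k] += lr * (xi - P[k])
                    else:
                        P[k] -= lr * (xi - P[k])
            return P

        def lvq_predict(X, P, proto_labels):
            d = ((X[:, None, :] - P[None, :, :]) ** 2).sum(axis=2)
            return proto_labels[d.argmin(axis=1)]

        # Two synthetic classes of 8-channel "spectra".
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(3, 1, (50, 8))])
        y = np.array([0] * 50 + [1] * 50)
        labels = np.array([0, 1])
        P = lvq1_train(X, y, X[[0, 50]].copy(), labels)
        print("training accuracy:", (lvq_predict(X, P, labels) == y).mean())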

  6. Sugar beet and volunteer potato classification using Bag-of-Visual-Words model, Scale-Invariant Feature Transform, or Speeded Up Robust Feature descriptors and crop row information

    NARCIS (Netherlands)

    Suh, Hyun K.; Hofstee, Jan Willem; IJsselmuiden, Joris; Henten, van Eldert J.

    2018-01-01

    One of the most important steps in vision-based weed detection systems is the classification of weeds growing amongst crops. In the EU SmartBot project it was required to effectively control more than 95% of volunteer potatoes and ensure less than 5% damage to sugar beet. Classification features

  7. Feasibility of a novel deformable image registration technique to facilitate classification, targeting, and monitoring of tumor and normal tissue

    International Nuclear Information System (INIS)

    Brock, Kristy K.; Dawson, Laura A.; Sharpe, Michael B.; Moseley, Douglas J.; Jaffray, David A.

    2006-01-01

    Purpose: To investigate the feasibility of a biomechanical-based deformable image registration technique for the integration of multimodality imaging, image guided treatment, and response monitoring. Methods and Materials: A multiorgan deformable image registration technique based on finite element modeling (FEM) and surface projection alignment of selected regions of interest with biomechanical material and interface models has been developed. FEM also provides an inherent method for directly tracking specified regions through treatment and follow-up. Results: The technique was demonstrated on 5 liver cancer patients. Differences of up to 1 cm of motion were seen between the diaphragm and the tumor center of mass after deformable image registration of exhale and inhale CT scans. Spatial differences of 5 mm or more were observed for up to 86% of the surface of the defined tumor after deformable image registration of the computed tomography (CT) and magnetic resonance images. Up to 6.8 mm of motion was observed for the tumor after deformable image registration of the CT and cone-beam CT scan after rigid registration of the liver. Deformable registration of the CT to the follow-up CT allowed a more accurate assessment of tumor response. Conclusions: This biomechanical-based deformable image registration technique incorporates classification, targeting, and monitoring of tumor and normal tissue using one methodology.

  8. The application of the central limit theorem and the law of large numbers to facial soft tissue depths: T-Table robustness and trends since 2008.

    Science.gov (United States)

    Stephan, Carl N

    2014-03-01

    By pooling independent study means (x¯), the T-Tables use the central limit theorem and law of large numbers to average out study-specific sampling bias and instrument errors and, in turn, triangulate upon human population means (μ). Since their first publication in 2008, new data from >2660 adults have been collected (c.30% of the original sample) making a review of the T-Table's robustness timely. Updated grand means show that the new data have negligible impact on the previously published statistics: maximum change = 1.7 mm at gonion; and ≤1 mm at 93% of all landmarks measured. This confirms the utility of the 2008 T-Table as a proxy to soft tissue depth population means and, together with updated sample sizes (8851 individuals at pogonion), earmarks the 2013 T-Table as the premier mean facial soft tissue depth standard for craniofacial identification casework. The utility of the T-Table, in comparison with shorths and 75-shormaxes, is also discussed.
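
    The pooling arithmetic behind such grand means can be illustrated in a few lines; the numbers below are invented for illustration only, not taken from the 2008 or 2013 T-Tables.

        # Pooling independent study means into a grand mean, weighted by
        # each study's sample size (law-of-large-numbers intuition).
        studies = [(11.2, 120), (10.8, 300), (11.5, 80)]  # (mean depth in mm, n)
        grand_mean = sum(m * n for m, n in studies) / sum(n for _, n in studies)
        print(round(grand_mean, 2))  # 11.01 mm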

  9. Plexiform neurofibroma tissue classification

    Science.gov (United States)

    Weizman, L.; Hoch, L.; Ben Sira, L.; Joskowicz, L.; Pratt, L.; Constantini, S.; Ben Bashat, D.

    2011-03-01

    Plexiform Neurofibroma (PN) is a major complication of Neurofibromatosis-1 (NF1), a common genetic disease involving the nervous system. PNs are peripheral nerve sheath tumors extending along the length of the nerve in various parts of the body. Treatment decisions are based on tumor volume assessment using MRI, which is currently time consuming and error prone, with limited semi-automatic segmentation support. We present in this paper a new method for the segmentation and tumor mass quantification of PN from STIR MRI scans. The method starts with a user-based delineation of the tumor area in a single slice and automatically detects the PN lesions in the entire image based on the tumor connectivity. Experimental results on seven datasets yield a mean volume overlap difference of 25% as compared to manual segmentation by an expert radiologist, with a mean computation and interaction time of 12 minutes vs. over an hour for manual annotation. Since the user interaction in the segmentation process is minimal, our method has the potential to successfully become part of the clinical workflow.

  10. Classification of breast tissue for protocols development in mammography exam;Classificacao dos tecidos mamarios para elaboracao de protocolos em exames de mamografia

    Energy Technology Data Exchange (ETDEWEB)

    Teixeira, M. Ines; Caldas, Linda V.E. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Chohfi, Adriana C.S.; Figueiredo, Zenaide A. [Universidade Nove de Julho (UNINOVE), Sao Paulo, SP (Brazil). Dept. da Saude. Radiologia Medica

    2009-07-01

    Several factors must be considered when performing a mammogram, such as the equipment used and the characteristics of the patient's breast. Breast density differs from patient to patient, which often hampers early diagnosis of breast cancer. There is significant variation in the breast to be radiographed, depending on several factors presented by the breast tissue, such as the presence or absence of implants or prostheses. The main objective of this work was to survey the factors that influence the classification of breast tissues. With this survey and classification, protocols can be better adjusted to make the mammography exam more efficient, avoiding excess radiation dose and stress for the patient while providing savings in raw material, quality control, and rapid image acquisition. (author)

  11. Spectral multi-energy CT texture analysis with machine learning for tissue classification: an investigation using classification of benign parotid tumours as a testing paradigm.

    Science.gov (United States)

    Al Ajmi, Eiman; Forghani, Behzad; Reinhold, Caroline; Bayat, Maryam; Forghani, Reza

    2018-06-01

    There is a rich amount of quantitative information in spectral datasets generated from dual-energy CT (DECT). In this study, we compare the performance of texture analysis performed on multi-energy datasets to that of virtual monochromatic images (VMIs) at 65 keV only, using classification of the two most common benign parotid neoplasms as a testing paradigm. Forty-two patients with pathologically proven Warthin tumour (n = 25) or pleomorphic adenoma (n = 17) were evaluated. Texture analysis was performed on VMIs ranging from 40 to 140 keV in 5-keV increments (multi-energy analysis) or 65-keV VMIs only, which is typically considered equivalent to single-energy CT. Random forest (RF) models were constructed for outcome prediction using separate randomly selected training and testing sets or the entire patient set. Using multi-energy texture analysis, tumour classification in the independent testing set had accuracy, sensitivity, specificity, positive predictive value, and negative predictive value of 92%, 86%, 100%, 100%, and 83%, compared to 75%, 57%, 100%, 100%, and 63%, respectively, for single-energy analysis. Multi-energy texture analysis demonstrates superior performance compared to single-energy texture analysis of VMIs at 65 keV for classification of benign parotid tumours. • We present and validate a paradigm for texture analysis of DECT scans. • Multi-energy dataset texture analysis is superior to single-energy dataset texture analysis. • DECT texture analysis has high accuracy for diagnosis of benign parotid tumours. • DECT texture analysis with machine learning can enhance non-invasive diagnostic tumour evaluation.
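
    A minimal sketch of the modelling step, with a made-up multi-energy feature matrix standing in for the textures computed on the VMIs (not the authors' code or data):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        # Hypothetical layout: 42 tumours x (21 energies x 5 texture features).
        X = rng.random((42, 21 * 5))
        y = rng.integers(0, 2, size=42)   # 0 = pleomorphic adenoma, 1 = Warthin
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        rf.fit(X_tr, y_tr)
        print("testing-set accuracy:", rf.score(X_te, y_te))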

  12. Tissue

    Directory of Open Access Journals (Sweden)

    David Morrissey

    2012-01-01

    Full Text Available Purpose. In vivo gene therapy directed at tissues of mesenchymal origin could potentially augment healing. We aimed to assess the duration and magnitude of transgene expression in vivo in mice and ex vivo in human tissues. Methods. Using bioluminescence imaging, plasmid and adenoviral vector-based transgene expression in murine quadriceps in vivo was examined. Temporal control was assessed using a doxycycline-inducible system. An ex vivo model was developed and optimised using murine tissue, and applied in ex vivo human tissue. Results. In vivo plasmid-based transgene expression did not silence in murine muscle, unlike in liver. Although maximum luciferase expression was higher in muscle with adenoviral delivery compared with plasmid, expression reduced over time. The inducible promoter cassette successfully regulated gene expression with maximum levels a factor of 11 greater than baseline. Expression was re-induced to a similar level on a temporal basis. Luciferase expression was readily detected ex vivo in human muscle and tendon. Conclusions. Plasmid constructs resulted in long-term in vivo gene expression in skeletal muscle, in a controllable fashion utilising an inducible promoter in combination with oral agents. Successful plasmid gene transfection in human ex vivo mesenchymal tissue was demonstrated for the first time.

  13. Combining multiple hypothesis testing and affinity propagation clustering leads to accurate, robust and sample size independent classification on gene expression data

    Directory of Open Access Journals (Sweden)

    Sakellariou Argiris

    2012-10-01

    Full Text Available Abstract Background A feature selection method in microarray gene expression data should be independent of platform, disease and dataset size. Our hypothesis is that among the statistically significant ranked genes in a gene list, there should be clusters of genes that share similar biological functions related to the investigated disease. Thus, instead of keeping N top ranked genes, it would be more appropriate to define and keep a number of gene cluster exemplars. Results We propose a hybrid FS method (mAP-KL), which combines multiple hypothesis testing and the affinity propagation (AP) clustering algorithm along with the Krzanowski & Lai cluster quality index, to select a small yet informative subset of genes. We applied mAP-KL on real microarray data, as well as on simulated data, and compared its performance against 13 other feature selection approaches. Across a variety of diseases and number of samples, mAP-KL presents competitive classification results, particularly in neuromuscular diseases, where its overall AUC score was 0.91. Furthermore, mAP-KL generates concise yet biologically relevant and informative N-gene expression signatures, which can serve as a valuable tool for diagnostic and prognostic purposes, as well as a source of potential disease biomarkers in a broad range of diseases. Conclusions mAP-KL is a data-driven and classifier-independent hybrid feature selection method, which applies to any disease classification problem based on microarray data, regardless of the available samples. Combining multiple hypothesis testing and AP leads to subsets of genes, which classify unknown samples from both small and large patient cohorts with high accuracy.
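
    The two core steps of mAP-KL, ranking genes by hypothesis testing and keeping cluster exemplars, might be sketched as follows; the data are synthetic, and the Krzanowski & Lai index used to choose the number of clusters is omitted here.

        import numpy as np
        from scipy.stats import ttest_ind
        from sklearn.cluster import AffinityPropagation

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 40))          # genes x samples (made up)
        y = np.array([0] * 20 + [1] * 20)       # sample class labels

        # Step 1: multiple hypothesis testing; rank genes by t-test p-value.
        _, pvals = ttest_ind(X[:, y == 0], X[:, y == 1], axis=1)
        top = np.argsort(pvals)[:100]

        # Step 2: cluster the top-ranked genes, keep one exemplar per cluster.
        ap = AffinityPropagation(random_state=0).fit(X[top])
        exemplars = top[ap.cluster_centers_indices_]
        print(len(exemplars), "exemplar genes selected")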

  14. Determining the Most Robust Dimensional Structure of Categories from the International Classification of Functioning, Disability and Health Across Subgroups of Persons With Spinal Cord Injury to Build the Basis for Future Clinical Measures

    DEFF Research Database (Denmark)

    Ballert, Carolina S; Stucki, Gerold; Biering-Sørensen, Fin

    2014-01-01

    OBJECTIVE: To determine the most robust dimensional structure of the International Classification of Functioning, Disability and Health (ICF) categories relevant to spinal cord injury (SCI) across subgroups of lesion level, health care context, sex, age, and resources of the country. DESIGN......: A multidimensional between-item response Rasch model was used. The choice of the dimensions was conceptually driven using the ICF components from the functioning chapters and splits of the activity and participation component described in the ICF. SETTING: Secondary analysis of data from an international, cross....... The model fit improvement from the unidimensional to the 2-dimensional and from the 2-dimensional to the 3-dimensional model was significant in all groups (P

  15. Development and validation of Raman spectroscopic classification models to discriminate tongue squamous cell carcinoma from non-tumorous tissue

    NARCIS (Netherlands)

    F.L.J. Cals (Froukje); S. Koljenović (Senada); J.A.U. Hardillo (José); R.J. Baatenburg de Jong (Robert Jan); T.C. Bakker Schut (Tom); G.J. Puppels (Gerwin)

    2016-01-01

    Background: Currently, up to 85% of the oral resection specimens have inadequate resection margins, of which the majority is located in the deeper soft tissue layers. The prognosis of patients with oral cavity squamous cell carcinoma (OCSCC) of the tongue is negatively affected by

  16. Robust Scientists

    DEFF Research Database (Denmark)

    Gorm Hansen, Birgitte

    their core interests, 2) developing a self-supply of industry interests by becoming entrepreneurs and thus creating their own compliant industry partner and 3) balancing resources within a larger collective of researchers, thus countering changes in the influx of funding caused by shifts in political...... knowledge", Danish research policy seems to have helped develop politically and economically "robust scientists". Scientific robustness is acquired by way of three strategies: 1) tasting and discriminating between resources so as to avoid funding that erodes academic profiles and push scientists away from...

  17. SU-F-R-40: Robustness Test of Computed Tomography Textures of Lung Tissues to Varying Scanning Protocols Using a Realistic Phantom Environment

    International Nuclear Information System (INIS)

    Lee, S; Markel, D; Hegyi, G; El Naqa, I

    2016-01-01

    Purpose: The reliability of computed tomography (CT) textures is an important element of radiomics analysis. This study investigates the dependency of lung CT textures on different breathing phases and changes in CT image acquisition protocols in a realistic phantom setting. Methods: We investigated 11 CT texture features for radiation-induced lung disease from 3 categories (first-order, grey level co-occurrence matrix (GLCM), and Laws’ filter). A biomechanical swine lung phantom was scanned at two breathing phases (inhale/exhale) and two scanning protocols set for PET/CT and diagnostic CT scanning. Lung volumes acquired from the CT images were divided into 2-dimensional sub-regions with a grid spacing of 31 mm. The distributions of the evaluated texture features from these sub-regions were compared between the two scanning protocols and two breathing phases. The significance of each factor on feature values was tested at the 95% significance level using an analysis of covariance (ANCOVA) model with interaction terms included. Robustness of a feature to a scanning factor was defined as non-significant dependence on the factor. Results: Three GLCM textures (variance, sum entropy, difference entropy) were robust to breathing changes. Two GLCM (variance, sum entropy) and 3 Laws’ filter textures (S5L5, E5L5, W5L5) were robust to scanner changes. Moreover, the two GLCM textures (variance, sum entropy) were consistent across all 4 scanning conditions. First-order features, especially Hounsfield unit intensity features, presented the most drastic variation, up to 39%. Conclusion: Amongst the studied features, GLCM and Laws’ filter texture features were more robust than first-order features. However, the majority of the features were modified by either breathing phase or scanner changes, suggesting a need for calibration when retrospectively comparing scans obtained at different conditions. Further investigation is necessary to identify the sensitivity of individual image

  18. SU-F-R-40: Robustness Test of Computed Tomography Textures of Lung Tissues to Varying Scanning Protocols Using a Realistic Phantom Environment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S; Markel, D; Hegyi, G [Medical Physics Unit, McGill University, Montreal, Quebec (Canada); El Naqa, I [University of Michigan, Ann Arbor, MI (United States)

    2016-06-15

    Purpose: The reliability of computed tomography (CT) textures is an important element of radiomics analysis. This study investigates the dependency of lung CT textures on different breathing phases and changes in CT image acquisition protocols in a realistic phantom setting. Methods: We investigated 11 CT texture features for radiation-induced lung disease from 3 categories (first-order, grey level co-occurrence matrix (GLCM), and Laws’ filter). A biomechanical swine lung phantom was scanned at two breathing phases (inhale/exhale) and two scanning protocols set for PET/CT and diagnostic CT scanning. Lung volumes acquired from the CT images were divided into 2-dimensional sub-regions with a grid spacing of 31 mm. The distributions of the evaluated texture features from these sub-regions were compared between the two scanning protocols and two breathing phases. The significance of each factor on feature values was tested at the 95% significance level using an analysis of covariance (ANCOVA) model with interaction terms included. Robustness of a feature to a scanning factor was defined as non-significant dependence on the factor. Results: Three GLCM textures (variance, sum entropy, difference entropy) were robust to breathing changes. Two GLCM (variance, sum entropy) and 3 Laws’ filter textures (S5L5, E5L5, W5L5) were robust to scanner changes. Moreover, the two GLCM textures (variance, sum entropy) were consistent across all 4 scanning conditions. First-order features, especially Hounsfield unit intensity features, presented the most drastic variation, up to 39%. Conclusion: Amongst the studied features, GLCM and Laws’ filter texture features were more robust than first-order features. However, the majority of the features were modified by either breathing phase or scanner changes, suggesting a need for calibration when retrospectively comparing scans obtained at different conditions. Further investigation is necessary to identify the sensitivity of individual image
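
    The GLCM features named in the two records above can be sketched as follows using recent scikit-image versions; the exact feature definitions and quantization used in the study may differ.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        def glcm_features(patch, levels=32):
            # Quantize the patch to `levels` gray levels, as GLCMs require.
            bins = np.linspace(patch.min(), patch.max(), levels)
            q = (np.digitize(patch, bins) - 1).astype(np.uint8)
            glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                                levels=levels, normed=True)
            p = glcm.mean(axis=(2, 3))            # average over offsets
            i, _ = np.indices(p.shape)
            mu = (i * p).sum()
            variance = ((i - mu) ** 2 * p).sum()  # GLCM variance
            entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()
            return {"variance": variance, "entropy": entropy,
                    "contrast": graycoprops(glcm, "contrast").mean()}

        patch = np.random.default_rng(3).normal(-800, 60, (31, 31))  # lung-like HU
        print(glcm_features(patch))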

  19. Facial Symmetry in Robust Anthropometrics

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 57, č. 3 (2012), s. 691-698 ISSN 0022-1198 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : forensic science * anthropology * robust image analysis * correlation analysis * multivariate data * classification Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.244, year: 2012

  20. Classification of 27 Tumor-Associated Antigens by Histochemical Analysis of 36 Freshly Resected Lung Cancer Tissues

    Directory of Open Access Journals (Sweden)

    Gene Kurosawa

    2016-11-01

    Full Text Available In previous studies, we identified 29 tumor-associated antigens (TAAs) and isolated 488 human monoclonal antibodies (mAbs) that specifically bind to one of the 29 TAAs. In the present study, we performed histochemical analysis of 36 freshly resected lung cancer tissues by using 60 mAbs against 27 TAAs. Comparison of the staining patterns of tumor cells, bronchial epithelial cells, and normal pulmonary alveolus cells and interalveolar septum allowed us to determine the type and location of cells that express target molecules, as well as the degree of expression. The patterns were classified into 7 categories. While multiple Abs were used against certain TAAs, the differences observed among them should be derived from differences in the binding activity and/or the epitope. Thus, such data indicate the versatility of respective clones as anti-cancer drugs. Although the information obtained was limited to the lung and bronchial tube, bronchial epithelial cells represent normal growing cells, and therefore, the data are informative. The results indicate that 9 of the 27 TAAs are suitable targets for therapeutic Abs. These 9 Ags include EGFR, HER2, TfR, and integrin α6β4. Based on our findings, a pharmaceutical company has started to develop anti-cancer drugs by using Abs to TfR and integrin α6β4. HGFR, PTP-LAR, CD147, CDCP1, and integrin αvβ3 are also appropriate targets for therapeutic purposes.

  1. Machine-learning-based classification of real-time tissue elastography for hepatic fibrosis in patients with chronic hepatitis B.

    Science.gov (United States)

    Chen, Yang; Luo, Yan; Huang, Wei; Hu, Die; Zheng, Rong-Qin; Cong, Shu-Zhen; Meng, Fan-Kun; Yang, Hong; Lin, Hong-Jun; Sun, Yan; Wang, Xiu-Yan; Wu, Tao; Ren, Jie; Pei, Shu-Fang; Zheng, Ying; He, Yun; Hu, Yu; Yang, Na; Yan, Hongmei

    2017-10-01

    Hepatic fibrosis is a common middle stage of the pathological processes of chronic liver diseases. Clinical intervention during the early stages of hepatic fibrosis can slow the development of liver cirrhosis and reduce the risk of developing liver cancer. Performing a liver biopsy, the gold standard for viral liver disease management, has drawbacks such as invasiveness and a relatively high sampling error rate. Real-time tissue elastography (RTE), one of the most recently developed technologies, might be a promising imaging technology because it is both noninvasive and provides accurate assessments of hepatic fibrosis. However, determining the stage of liver fibrosis from RTE images in a clinic is a challenging task. In this study, in contrast to the previous liver fibrosis index (LFI) method, which predicts the stage of diagnosis using RTE images and multiple regression analysis, we employed four classical classifiers (i.e., Support Vector Machine, Naïve Bayes, Random Forest and K-Nearest Neighbor) to build a decision-support system to improve the hepatitis B stage diagnosis performance. Eleven RTE image features were obtained from 513 subjects who underwent liver biopsies in this multicenter collaborative research. The experimental results showed that the adopted classifiers significantly outperformed the LFI method and that the Random Forest (RF) classifier provided the highest average accuracy among the four machine-learning algorithms. This result suggests that sophisticated machine-learning methods can be powerful tools for evaluating the stage of hepatic fibrosis and show promise for clinical applications.
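
    The four-classifier comparison described above maps naturally onto a cross-validation loop like the following; the feature matrix is a made-up stand-in for the 11 RTE features, with none of the authors' data or tuning.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC
        from sklearn.naive_bayes import GaussianNB
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(7)
        X = rng.random((513, 11))              # 11 RTE image features (made up)
        y = rng.integers(0, 2, size=513)       # e.g. significant fibrosis or not

        for name, clf in [("SVM", SVC()), ("Naive Bayes", GaussianNB()),
                          ("Random Forest", RandomForestClassifier(random_state=0)),
                          ("KNN", KNeighborsClassifier())]:
            print(name, cross_val_score(clf, X, y, cv=5).mean().round(3))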

  2. Real-time target tracking of soft tissues in 3D ultrasound images based on robust visual information and mechanical simulation.

    Science.gov (United States)

    Royer, Lucas; Krupa, Alexandre; Dardenne, Guillaume; Le Bras, Anthony; Marchand, Eric; Marchal, Maud

    2017-01-01

    In this paper, we present a real-time approach that allows tracking deformable structures in 3D ultrasound sequences. Our method consists in obtaining the target displacements by combining robust dense motion estimation and mechanical model simulation. We evaluate our method on simulated data, phantom data, and real data. Results demonstrate that this novel approach has the advantage of providing correct motion estimation in the presence of different ultrasound shortcomings, including speckle noise, large shadows, and ultrasound gain variation. Furthermore, we show the good performance of our method with respect to state-of-the-art techniques by testing on the 3D databases provided by the MICCAI CLUST'14 and CLUST'15 challenges.

  3. Classifying Classifications

    DEFF Research Database (Denmark)

    Debus, Michael S.

    2017-01-01

    This paper critically analyzes seventeen game classifications. The classifications were chosen on the basis of diversity, ranging from pre-digital classifications (e.g. Murray 1952), through game studies classifications (e.g. Elverdam & Aarseth 2007), to classifications of drinking games (e.g. LaBrie et al. 2013). The analysis aims at three goals: the classifications’ internal consistency, the abstraction of classification criteria and the identification of differences in classification across fields and/or time. Especially the abstraction of classification criteria can be used in future endeavors...... into the topic of game classifications....

  4. Fourier Transform Infrared (FT-IR) and Laser Ablation Inductively Coupled Plasma-Mass Spectrometry (LA-ICP-MS) Imaging of Cerebral Ischemia: Combined Analysis of Rat Brain Thin Cuts Toward Improved Tissue Classification.

    Science.gov (United States)

    Balbekova, Anna; Lohninger, Hans; van Tilborg, Geralda A F; Dijkhuizen, Rick M; Bonta, Maximilian; Limbeck, Andreas; Lendl, Bernhard; Al-Saad, Khalid A; Ali, Mohamed; Celikic, Minja; Ofner, Johannes

    2018-02-01

    Microspectroscopic techniques are widely used to complement histological studies. Due to recent developments in the field of chemical imaging, combined chemical analysis has become attractive. This technique facilitates a deeper analysis compared to single techniques or side-by-side analysis. In this study, rat brains harvested one week after induction of photothrombotic stroke were investigated. Adjacent thin cuts from rats' brains were imaged using Fourier transform infrared (FT-IR) microspectroscopy and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS). The LA-ICP-MS data were normalized using an internal standard (a thin gold layer). The acquired hyperspectral data cubes were fused and subjected to multivariate analysis. Brain regions affected by stroke as well as unaffected gray and white matter were identified and classified using a model based on either partial least squares discriminant analysis (PLS-DA) or random decision forest (RDF) algorithms. The RDF algorithm demonstrated the best results for classification. Improved classification was observed in the case of fused data in comparison to individual data sets (either FT-IR or LA-ICP-MS). Variable importance analysis demonstrated that both molecular and elemental content contribute to the improved RDF classification. Univariate spectral analysis identified biochemical properties of the assigned tissue types. Classification of multisensor hyperspectral data sets using an RDF algorithm allows access to a novel and in-depth understanding of biochemical processes and solid chemical allocation of different brain regions.
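
    Low-level fusion of the two modalities followed by random-forest classification might be sketched as follows, with synthetic stand-ins for the co-registered FT-IR and LA-ICP-MS cubes; this is not the authors' pipeline.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.preprocessing import StandardScaler

        # Two co-registered hyperspectral cubes flattened to (pixels, channels):
        # a molecular (FT-IR) block and an elemental (LA-ICP-MS) block, made up.
        rng = np.random.default_rng(5)
        ftir = rng.random((1000, 200))
        laicpms = rng.random((1000, 12))
        labels = rng.integers(0, 3, size=1000)   # e.g. stroke / gray / white matter

        # Low-level fusion: scale each modality separately, then concatenate.
        fused = np.hstack([StandardScaler().fit_transform(ftir),
                           StandardScaler().fit_transform(laicpms)])
        rdf = RandomForestClassifier(n_estimators=300, random_state=0)
        rdf.fit(fused[:800], labels[:800])
        print("held-out accuracy:", rdf.score(fused[800:], labels[800:]))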

  5. The analysis of image feature robustness using cometcloud

    Directory of Open Access Journals (Sweden)

    Xin Qi

    2012-01-01

    Full Text Available The robustness of image features is a very important consideration in quantitative image analysis. The objective of this paper is to investigate the robustness of a range of image texture features using hematoxylin stained breast tissue microarray slides, which are assessed while simulating different imaging challenges including defocus, changes in magnification, and variations in illumination, noise, compression, distortion, and rotation. We employed five texture analysis methods and tested them while introducing all of the challenges listed above. The texture features that were evaluated include co-occurrence matrix, center-symmetric auto-correlation, texture feature coding method, local binary pattern, and texton. Due to the independence of each transformation and texture descriptor, a network structured combination was proposed and deployed on the Rutgers private cloud. The experiments utilized 20 randomly selected tissue microarray cores. All the combinations of the image transformations and deformations were calculated, and the whole feature extraction procedure was completed in 70 minutes using a cloud equipped with 20 nodes. Center-symmetric auto-correlation outperformed all the other four texture descriptors but also required the longest computational time; it is roughly 10 times slower than local binary pattern and texton. From a speed perspective, both the local binary pattern and texton features provided excellent performance for classification and content-based image retrieval.
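
    One of the robustness checks described above, comparing a texture descriptor before and after a deformation, can be sketched for the local binary pattern case (synthetic image, rotation only; not the paper's cloud workflow):

        import numpy as np
        from skimage.feature import local_binary_pattern
        from skimage.transform import rotate

        def lbp_hist(im):
            # Rotation-invariant "uniform" LBP codes take values 0..9 for P=8.
            im8 = np.clip(im * 255, 0, 255).astype(np.uint8)
            lbp = local_binary_pattern(im8, P=8, R=1, method="uniform")
            hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
            return hist

        rng = np.random.default_rng(4)
        img = rng.random((128, 128))
        h0 = lbp_hist(img)
        h1 = lbp_hist(rotate(img, 15, mode="reflect"))   # one tested deformation
        print("L1 distance after rotation:", np.abs(h0 - h1).sum())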

  6. Methods for robustness programming

    NARCIS (Netherlands)

    Olieman, N.J.

    2008-01-01

    Robustness of an object is defined as the probability that an object will have properties as required. Robustness Programming (RP) is a mathematical approach for Robustness estimation and Robustness optimisation. An example in the context of designing a food product is finding the best composition

  7. Robustness in laying hens

    NARCIS (Netherlands)

    Star, L.

    2008-01-01

    The aim of the project ‘The genetics of robustness in laying hens’ was to investigate the nature and regulation of robustness in laying hens under sub-optimal conditions and the possibility to increase robustness by using animal breeding without loss of production. At the start of the project, a robust

  8. S.E. Mitchell Vascular Anomalies Flow Chart (SEMVAFC): A visual pathway combining clinical and imaging findings for classification of soft-tissue vascular anomalies

    International Nuclear Information System (INIS)

    Tekes, A.; Koshy, J.; Kalayci, T.O.; Puttgen, K.; Cohen, B.; Redett, R.; Mitchell, S.E.

    2014-01-01

    Classification of vascular anomalies (VAs) is challenging due to overlapping clinical symptoms, confusing terminology in the literature and unfamiliarity with this complex entity. It is important to recognize that VAs include two distinct entities, vascular tumours (VTs) and vascular malformations (VaMs). In this article, we describe the SE Mitchell Vascular Anomalies Flow Chart (SEMVAFC), which arises from a multidisciplinary approach that incorporates clinical symptoms, physical examination and magnetic resonance imaging (MRI) findings to establish an International Society for the Study of Vascular Anomalies (ISSVA)-based classification of the VAs. SEMVAFC provides a clear visual pathway for physicians to accurately diagnose VAs, which is important as treatment, management, and prognosis differ between VTs and VaMs.

  9. Magnetic resonance imaging-based cerebral tissue classification reveals distinct spatiotemporal patterns of changes after stroke in non-human primates.

    Science.gov (United States)

    Bouts, Mark J R J; Westmoreland, Susan V; de Crespigny, Alex J; Liu, Yutong; Vangel, Mark; Dijkhuizen, Rick M; Wu, Ona; D'Arceuil, Helen E

    2015-12-15

    Spatial and temporal changes in brain tissue after acute ischemic stroke are still poorly understood. Aims of this study were three-fold: (1) to determine unique temporal magnetic resonance imaging (MRI) patterns at the acute, subacute and chronic stages after stroke in macaques by combining quantitative T2 and diffusion MRI indices into MRI 'tissue signatures', (2) to evaluate temporal differences in these signatures between transient (n = 2) and permanent (n = 2) middle cerebral artery occlusion, and (3) to correlate histopathology findings in the chronic stroke period to the acute and subacute MRI derived tissue signatures. An improved iterative self-organizing data analysis algorithm was used to combine T2, apparent diffusion coefficient (ADC), and fractional anisotropy (FA) maps across seven successive timepoints (1, 2, 3, 24, 72, 144, 240 h) which revealed five temporal MRI signatures, that were different from the normal tissue pattern (P MRI signatures associated with specific tissue fates may further aid in assessing and monitoring the efficacy of novel pharmaceutical treatments for stroke in a pre-clinical and clinical setting.

  10. Influence of Structure and Composition on Dynamic Viscoelastic Property of Cartilaginous Tissue: Criteria for Classification between Hyaline Cartilage and Fibrocartilage Based on Mechanical Function

    Science.gov (United States)

    Miyata, Shogo; Tateishi, Tetsuya; Furukawa, Katsuko; Ushida, Takashi

    Recently, many types of methodologies have been developed to regenerate articular cartilage. It is important to assess whether the reconstructed cartilaginous tissue has the appropriate mechanical functions to qualify as hyaline (articular) cartilage. In some cases, the reconstructed tissue may become fibrocartilage and not hyaline cartilage. In this study, we determined the dynamic viscoelastic properties of these two types of cartilage by using compression and shear tests. Hyaline cartilage specimens were harvested from the articular surface of bovine knee joints, and fibrocartilage specimens were harvested from the meniscus tissue of the same joints. The results of this study revealed that the compressive energy dissipation of hyaline cartilage showed a strong dependence on testing frequency at low frequencies, while that of fibrocartilage did not. Therefore, the compressive energy dissipation, as indicated by the loss tangent, could become the criterion for the in vitro assessment of the mechanical function of regenerated cartilage.
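
    The loss tangent used as the criterion above is the tangent of the phase lag between a sinusoidal strain input and the stress response; a toy computation with made-up values, not cartilage data:

        import numpy as np

        f = 1.0                                     # test frequency, Hz
        t = np.linspace(0.0, 2.0, 2000)             # two full cycles
        strain = 0.01 * np.sin(2 * np.pi * f * t)
        delta = np.deg2rad(12)                      # assumed phase lag
        stress = 500.0 * np.sin(2 * np.pi * f * t + delta)

        # Recover the phase lag by demodulating both signals at f.
        carrier = np.exp(-2j * np.pi * f * t)
        phase = np.angle((stress * carrier).sum() / (strain * carrier).sum())
        print("tan delta:", round(np.tan(phase), 3))   # approx. tan(12 deg) = 0.213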

  11. Perceptual Robust Design

    DEFF Research Database (Denmark)

    Pedersen, Søren Nygaard

    The research presented in this PhD thesis has focused on a perceptual approach to robust design. The results of the research and the original contribution to knowledge is a preliminary framework for understanding, positioning, and applying perceptual robust design. Product quality is a topic...... been presented. Therefore, this study set out to contribute to the understanding and application of perceptual robust design. To achieve this, a state-of-the-art and current practice review was performed. From the review two main research problems were identified. Firstly, a lack of tools...... for perceptual robustness was found to overlap with the optimum for functional robustness and at most approximately 2.2% out of the 14.74% could be ascribed solely to the perceptual robustness optimisation. In conclusion, the thesis has offered a new perspective on robust design by merging robust design...

  12. Magnetic resonance imaging-based cerebral tissue classification reveals distinct spatiotemporal patterns of changes after stroke in non-human primates

    NARCIS (Netherlands)

    Bouts, Mark. J. R. J.; Westmoreland, Susan. V.; de Crespigny, Alex J.; Liu, Yutong; Vangel, Mark; Dijkhuizen, Rick M.; Wu, Ona; D'Arceuil, Helen E.

    2015-01-01

    Background: Spatial and temporal changes in brain tissue after acute ischemic stroke are still poorly understood. Aims of this study were three-fold: (1) to determine unique temporal magnetic resonance imaging (MRI) patterns at the acute, subacute and chronic stages after stroke in macaques by

  13. How to Reduce Dimensionality of Data: Robustness Point of View

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Rensová, D.

    2015-01-01

    Roč. 10, č. 1 (2015), s. 131-140 ISSN 1452-4864 R&D Projects: GA ČR GA13-17187S Institutional support: RVO:67985807 Keywords : data analysis * dimensionality reduction * robust statistics * principal component analysis * robust classification analysis Subject RIV: BB - Applied Statistics, Operational Research

  14. Robustness of Structural Systems

    DEFF Research Database (Denmark)

    Canisius, T.D.G.; Sørensen, John Dalsgaard; Baker, J.W.

    2007-01-01

    The importance of robustness as a property of structural systems has been recognised following several structural failures, such as that at Ronan Point in 1968, where the consequences were deemed unacceptable relative to the initiating damage. A variety of research efforts in the past decades have...... attempted to quantify aspects of robustness such as redundancy and identify design principles that can improve robustness. This paper outlines the progress of recent work by the Joint Committee on Structural Safety (JCSS) to develop comprehensive guidance on assessing and providing robustness in structural...... systems. Guidance is provided regarding the assessment of robustness in a framework that considers potential hazards to the system, vulnerability of system components, and failure consequences. Several proposed methods for quantifying robustness are reviewed, and guidelines for robust design...

  15. Robust multivariate analysis

    CERN Document Server

    J Olive, David

    2017-01-01

    This text presents methods that are robust to the assumption of a multivariate normal distribution or methods that are robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large sample theory is given. The text develops some of the first practical robust regression and robust multivariate location and dispersion estimators backed by theory. The robust techniques are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis. A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...

  16. Is overall similarity classification less effortful than single-dimension classification?

    Science.gov (United States)

    Wills, Andy J; Milton, Fraser; Longmore, Christopher A; Hester, Sarah; Robinson, Jo

    2013-01-01

    It is sometimes argued that the implementation of an overall similarity classification is less effortful than the implementation of a single-dimension classification. In the current article, we argue that the evidence securely in support of this view is limited, and report additional evidence in support of the opposite proposition--overall similarity classification is more effortful than single-dimension classification. Using a match-to-standards procedure, Experiments 1A, 1B and 2 demonstrate that concurrent load reduces the prevalence of overall similarity classification, and that this effect is robust to changes in the concurrent load task employed, the level of time pressure experienced, and the short-term memory requirements of the classification task. Experiment 3 demonstrates that participants who produced overall similarity classifications from the outset have larger working memory capacities than those who produced single-dimension classifications initially, and Experiment 4 demonstrates that instructions to respond meticulously increase the prevalence of overall similarity classification.

  17. Robustness of Structures

    DEFF Research Database (Denmark)

    Faber, Michael Havbro; Vrouwenvelder, A.C.W.M.; Sørensen, John Dalsgaard

    2011-01-01

    In 2005, the Joint Committee on Structural Safety (JCSS) together with Working Commission (WC) 1 of the International Association of Bridge and Structural Engineering (IABSE) organized a workshop on robustness of structures. Two important decisions resulted from this workshop, namely the development of a joint European project on structural robustness under the COST (European Cooperation in Science and Technology) programme and the decision to develop a more elaborate document on structural robustness in collaboration between experts from the JCSS and the IABSE. Accordingly, a project titled ‘COST TU0601: Robustness of Structures’ was initiated in February 2007, aiming to provide a platform for exchanging and promoting research in the area of structural robustness and to provide a basic framework, together with methods, strategies and guidelines enhancing robustness of structures...

  18. Robust Growth Determinants

    OpenAIRE

    Doppelhofer, Gernot; Weeks, Melvyn

    2011-01-01

    This paper investigates the robustness of determinants of economic growth in the presence of model uncertainty, parameter heterogeneity and outliers. The robust model averaging approach introduced in the paper uses a flexible and parsimonious mixture modeling that allows for fat-tailed errors compared to the normal benchmark case. Applying robust model averaging to growth determinants, the paper finds that eight out of eighteen variables found to be significantly related to economic growth ...

  19. Robust Programming by Example

    OpenAIRE

    Bishop, Matt; Elliott, Chip

    2011-01-01

    Part 2: WISE 7; International audience; Robust programming lies at the heart of the type of coding called “secure programming”. Yet it is rarely taught in academia. More commonly, the focus is on how to avoid creating well-known vulnerabilities. While important, that misses the point: a well-structured, robust program should anticipate where problems might arise and compensate for them. This paper discusses one view of robust programming and gives an example of how it may be taught.

  20. Influence of Nucleoshuttling of the ATM Protein in the Healthy Tissues Response to Radiation Therapy: Toward a Molecular Classification of Human Radiosensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Granzotto, Adeline [INSERM, UMR1052, Cancer Research Centre of Lyon, Lyon (France); Benadjaoud, Mohamed Amine [INSERM UMRS 1018, Institut Gustave-Roussy, Villejuif (France); Vogin, Guillaume [INSERM, UMR1052, Cancer Research Centre of Lyon, Lyon (France); Institut de Cancérologie de Lorraine, Vandoeuvre-les-Nancy (France); Devic, Clément; Ferlazzo, Mélanie L. [INSERM, UMR1052, Cancer Research Centre of Lyon, Lyon (France); Bodgi, Larry [INSERM, UMR1052, Cancer Research Centre of Lyon, Lyon (France); Université Saint-Joseph, Beirut (Lebanon); Pereira, Sandrine; Sonzogni, Laurène [INSERM, UMR1052, Cancer Research Centre of Lyon, Lyon (France); Forcheron, Fabien [Institut de Recherche Biomédicale des Armées, Brétigny-sur-Orge (France); Viau, Muriel; Etaix, Aurélie; Malek, Karim; Mengue-Bindjeme, Laurence [INSERM, UMR1052, Cancer Research Centre of Lyon, Lyon (France); Escoffier, Clémence; Rouvet, Isabelle; Zabot, Marie-Thérèse [Centre de Biotechnologie Cellulaire et Biothèque, Groupement Hospitalier Est, Hospices Civils de Lyon, Bron (France); Joubert, Aurélie [Institut de Radioprotection et de Sûreté Nucléaire, Fontenay-aux-Roses (France); Vincent, Anne; Venezia, Nicole Dalla [INSERM, UMR1052, Cancer Research Centre of Lyon, Lyon (France); Bourguignon, Michel [Institut de Radioprotection et de Sûreté Nucléaire, Fontenay-aux-Roses (France); and others

    2016-03-01

    Purpose: Whereas post–radiation therapy overreactions (OR) represent a clinical and societal issue, there is still no consensual radiobiological endpoint to predict clinical radiosensitivity. Since 2003, skin biopsy specimens have been collected from patients treated by radiation therapy against different tumor localizations and showing a wide range of OR. Here, we aimed to establish quantitative links between radiobiological factors and OR severity grades that would be relevant to radioresistant and genetic hyperradiosensitive cases. Methods and Materials: Immunofluorescence experiments were performed on a collection of skin fibroblasts from 12 radioresistant, 5 hyperradiosensitive, and 100 OR patients irradiated at 2 Gy. The numbers of micronuclei, γH2AX, and pATM foci that reflect different steps of DNA double-strand breaks (DSB) recognition and repair were assessed from 10 minutes to 24 hours after irradiation and plotted against the severity grades established by the Common Terminology Criteria for Adverse Events and the Radiation Therapy Oncology Group. Results: OR patients did not necessarily show a gross DSB repair defect but a systematic delay in the nucleoshuttling of the ATM protein required for complete DSB recognition. Among the radiobiological factors, the maximal number of pATM foci provided the best discrimination among OR patients and a significant correlation with each OR severity grade, independently of tumor localization and of the early or late nature of reactions. Conclusions: Our results are consistent with a general classification of human radiosensitivity based on 3 groups: radioresistance (group I); moderate radiosensitivity caused by delay of nucleoshuttling of ATM, which includes OR patients (group II); and hyperradiosensitivity caused by a gross DSB repair defect, which includes fatal cases (group III).

  1. Robust procedures in chemometrics

    DEFF Research Database (Denmark)

    Kotwa, Ewelina

    properties of the analysed data. The broad theoretical background of robust procedures was given as a very useful supplement to the classical methods, and a new tool, based on robust PCA, aiming at identifying Rayleigh and Raman scatters in excitation-emission (EEM) data was developed. The results show...

  2. Robustness Beamforming Algorithms

    Directory of Open Access Journals (Sweden)

    Sajad Dehghani

    2014-04-01

    Full Text Available Adaptive beamforming methods are known to degrade in the presence of steering vector and covariance matrix uncertainty. In this paper, a new approach is presented that makes adaptive minimum variance distortionless response (MVDR) beamforming robust against uncertainties in both the steering vector and the covariance matrix. This method minimizes an optimization problem that contains a quadratic objective function and a quadratic constraint. The optimization problem is nonconvex but is converted into a convex optimization problem in this paper. It is solved by the interior-point method, and the optimum weight vector for robust beamforming is achieved.
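
    The classical MVDR weights are w = R^-1 a / (a^H R^-1 a); the sketch below computes them with simple diagonal loading as the robustification, which is a generic device and an assumption here, not the method proposed in the record.

        import numpy as np

        def robust_mvdr_weights(R, a, loading=1e-2):
            """MVDR weights w = R^-1 a / (a^H R^-1 a), with diagonal loading
            of R as a simple, generic robustification."""
            Rl = R + loading * (np.trace(R).real / len(a)) * np.eye(len(a))
            Ri_a = np.linalg.solve(Rl, a)
            return Ri_a / (a.conj() @ Ri_a)

        # Toy 8-element half-wavelength uniform linear array.
        n = 8
        steer = lambda th: np.exp(2j * np.pi * 0.5 * np.arange(n) * np.sin(th))
        a = steer(0.0)                                   # look direction
        jam = steer(np.deg2rad(40))                      # interferer
        R = 100 * np.outer(jam, jam.conj()) + np.eye(n)  # interference + noise
        w = robust_mvdr_weights(R, a)
        print("gain toward target:", abs(w.conj() @ a))   # ~1 (distortionless)
        print("gain toward jammer:", abs(w.conj() @ jam)) # ~0 (nulled)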

  3. Deep learning for image classification

    Science.gov (United States)

    McCoppin, Ryan; Rizki, Mateen

    2014-06-01

    This paper provides an overview of deep learning and introduces the several subfields of deep learning including a specific tutorial of convolutional neural networks. Traditional methods for learning image features are compared to deep learning techniques. In addition, we present our preliminary classification results, our basic implementation of a convolutional restricted Boltzmann machine on the Modified National Institute of Standards and Technology database (MNIST), and we explain how to use deep learning networks to assist in our development of a robust gender classification system.
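
    The convolution, nonlinearity, and pooling building blocks of such networks can be illustrated in plain numpy; this is a didactic forward pass on one MNIST-sized input, not the paper's restricted Boltzmann machine implementation.

        import numpy as np

        def conv2d(img, kernel):
            """Valid 2-D convolution: the core operation of a convolutional layer."""
            kh, kw = kernel.shape
            h, w = img.shape
            out = np.empty((h - kh + 1, w - kw + 1))
            for i in range(out.shape[0]):
                for j in range(out.shape[1]):
                    out[i, j] = (img[i:i+kh, j:j+kw] * kernel).sum()
            return out

        def relu(x):
            return np.maximum(x, 0.0)

        def max_pool(x, s=2):
            h, w = x.shape[0] // s * s, x.shape[1] // s * s
            return x[:h, :w].reshape(h // s, s, w // s, s).max(axis=(1, 3))

        img = np.random.default_rng(0).random((28, 28))    # MNIST-sized input
        sobel = np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]], float)
        print(max_pool(relu(conv2d(img, sobel))).shape)    # (13, 13) feature map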

  4. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...

  5. Robustness of Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2008-01-01

    This paper describes the background of the robustness requirements implemented in the Danish Code of Practice for Safety of Structures and in the Danish National Annex to the Eurocode 0, see (DS-INF 146, 2003), (DS 409, 2006), (EN 1990 DK NA, 2007) and (Sørensen and Christensen, 2006). More...... frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure combined with increased requirements to efficiency in design and execution followed by increased risk of human errors has made the need of requirements to robustness of new structures essential....... According to Danish design rules robustness shall be documented for all structures in high consequence class. The design procedure to document sufficient robustness consists of: 1) Review of loads and possible failure modes / scenarios and determination of acceptable collapse extent; 2) Review...

  6. Robustness of structures

    DEFF Research Database (Denmark)

    Vrouwenvelder, T.; Sørensen, John Dalsgaard

    2009-01-01

    After the collapse of the World Trade Centre towers in 2001 and a number of collapses of structural systems in the beginning of the century, robustness of structural systems has gained renewed interest. Despite many significant theoretical, methodical and technological advances, structural...... of robustness for structural design such requirements are not substantiated in more detail, nor has the engineering profession been able to agree on an interpretation of robustness which facilitates its quantification. A European COST action TU 601 on ‘Robustness of structures' has started in 2007...... by a group of members of the CSS. This paper describes the ongoing work in this action, with emphasis on the development of a theoretical and risk based quantification and optimization procedure on the one side and a practical pre-normative guideline on the other....

  7. Robust Approaches to Forecasting

    OpenAIRE

    Jennifer Castle; David Hendry; Michael P. Clements

    2014-01-01

    We investigate alternative robust approaches to forecasting, using a new class of robust devices, contrasted with equilibrium correction models. Their forecasting properties are derived facing a range of likely empirical problems at the forecast origin, including measurement errors, impulses, omitted variables, unanticipated location shifts and incorrectly included variables that experience a shift. We derive the resulting forecast biases and error variances, and indicate when the methods ar...

  8. Robustness - theoretical framework

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Rizzuto, Enrico; Faber, Michael H.

    2010-01-01

    More frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure combined with increased requirements to efficiency in design and execution followed by increased risk of human errors has made the need of requirements to robustness of new struct...... of this fact sheet is to describe a theoretical and risk based framework to form the basis for quantification of robustness and for pre-normative guidelines....

  9. Qualitative Robustness in Estimation

    Directory of Open Access Journals (Sweden)

    Mohammed Nasser

    2012-07-01

    Full Text Available Qualitative robustness, influence function, and breakdown point are three main concepts to judge an estimator from the viewpoint of robust estimation. It is important as well as interesting to study the relation among them. This article attempts to present the concept of qualitative robustness as forwarded by its first proponents and its later development. It illustrates intricacies of qualitative robustness and its relation with consistency, and also tries to remove commonly believed misunderstandings about the relation between influence function and qualitative robustness, citing some examples from the literature and providing a new counter-example. At the end it presents a finite and a simulated version of the qualitative robustness index (QRI). In order to assess the performance of the proposed measures, we have compared fifteen estimators of correlation coefficient using simulated as well as real data sets.

  10. Fuzzy One-Class Classification Model Using Contamination Neighborhoods

    Directory of Open Access Journals (Sweden)

    Lev V. Utkin

    2012-01-01

    Full Text Available A fuzzy classification model is studied in the paper. It is based on the contaminated (robust) model which produces fuzzy expected risk measures characterizing classification errors. Optimal classification parameters of the models are derived by minimizing the fuzzy expected risk. It is shown that an algorithm for computing the classification parameters is reduced to a set of standard support vector machine tasks with weighted data points. Experimental results with synthetic data illustrate the proposed fuzzy model.
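
    The reduction to weighted support vector machine tasks mentioned above corresponds to per-sample weights in a standard solver; below is a minimal sketch using scikit-learn's sample_weight on synthetic data, not the paper's exact formulation.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(2)
        X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
        y = np.array([0] * 50 + [1] * 50)
        # Per-point weights, e.g. as they might come from a fuzzy/contaminated model.
        w = rng.uniform(0.2, 1.0, size=100)

        clf = SVC(kernel="linear").fit(X, y, sample_weight=w)
        print("training accuracy:", clf.score(X, y))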

  11. Gender classification under extended operating conditions

    Science.gov (United States)

    Rude, Howard N.; Rizki, Mateen

    2014-06-01

    Gender classification is a critical component of a robust image security system. Many techniques exist to perform gender classification using facial features. In contrast, this paper explores gender classification using body features extracted from clothed subjects. Several of the most effective types of features for gender classification identified in literature were implemented and applied to the newly developed Seasonal Weather And Gender (SWAG) dataset. SWAG contains video clips of approximately 2000 samples of human subjects captured over a period of several months. The subjects are wearing casual business attire and outer garments appropriate for the specific weather conditions observed in the Midwest. The results from a series of experiments are presented that compare the classification accuracy of systems that incorporate various types and combinations of features applied to multiple looks at subjects at different image resolutions to determine a baseline performance for gender classification.

  12. Highly Robust Statistical Methods in Medical Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 32, č. 2 (2012), s. 3-16 ISSN 0208-5216 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust statistics * classification * faces * robust image analysis * forensic science Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.208, year: 2012 http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf

  13. Ensemble of randomized soft decision trees for robust classification

    Indian Academy of Sciences (India)

    G KISHOR KUMAR

    Department of Information Technology, Rajeev Gandhi Memorial College of Engineering and Technology, Nandyal ... Data Mining is a process which discovers knowledge from ... for many applications like diagnosis systems [23], video.

  14. Transporter Classification Database (TCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Transporter Classification Database details a comprehensive classification system for membrane transport proteins known as the Transporter Classification (TC)...

  15. Robustness in econometrics

    CERN Document Server

    Sriboonchitta, Songsak; Huynh, Van-Nam

    2017-01-01

    This book presents recent research on robustness in econometrics. Robust data processing techniques – i.e., techniques that yield results minimally affected by outliers – and their applications to real-life economic and financial situations are the main focus of this book. The book also discusses applications of more traditional statistical techniques to econometric problems. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. In day-to-day data, we often encounter outliers that do not reflect the long-term economic trends, e.g., unexpected and abrupt fluctuations. As such, it is important to develop robust data processing techniques that can accommodate these fluctuations.

  16. Robust Manufacturing Control

    CERN Document Server

    2013-01-01

    This contributed volume collects research papers, presented at the CIRP Sponsored Conference Robust Manufacturing Control: Innovative and Interdisciplinary Approaches for Global Networks (RoMaC 2012, Jacobs University, Bremen, Germany, June 18th-20th 2012). These research papers present the latest developments and new ideas focusing on robust manufacturing control for global networks. Today, Global Production Networks (i.e. the nexus of interconnected material and information flows through which products and services are manufactured, assembled and distributed) are confronted with and expected to adapt to: sudden and unpredictable large-scale changes of important parameters which are occurring more and more frequently, event propagation in networks with high degree of interconnectivity which leads to unforeseen fluctuations, and non-equilibrium states which increasingly characterize daily business. These multi-scale changes deeply influence logistic target achievement and call for robust planning and control ...

  17. Robust plasmonic substrates

    DEFF Research Database (Denmark)

    Kostiučenko, Oksana; Fiutowski, Jacek; Tamulevicius, Tomas

    2014-01-01

    Robustness is a key issue for the applications of plasmonic substrates such as tip-enhanced Raman spectroscopy, surface-enhanced spectroscopies, enhanced optical biosensing, optical and optoelectronic plasmonic nanosensors and others. A novel approach for the fabrication of robust plasmonic substrates is presented, which relies on the coverage of gold nanostructures with diamond-like carbon (DLC) thin films of thicknesses 25, 55 and 105 nm. DLC thin films were grown by direct hydrocarbon ion beam deposition. In order to find the optimum balance between optical and mechanical properties...

  18. Robust Self Tuning Controllers

    DEFF Research Database (Denmark)

    Poulsen, Niels Kjølstad

    1985-01-01

    The present thesis concerns robustness properties of adaptive controllers. It is addressed to methods for robustifying self tuning controllers with respect to abrupt changes in the plant parameters. In the thesis an algorithm for estimating abruptly changing parameters is presented. The estimator has several operation modes and a detector for controlling the mode. A special self tuning controller has been developed to regulate plants with changing time delay.

  19. Robust surgery loading

    NARCIS (Netherlands)

    Hans, Elias W.; Wullink, Gerhard; van Houdenhoven, Mark; Kazemier, Geert

    2008-01-01

    We consider the robust surgery loading problem for a hospital’s operating theatre department, which concerns assigning surgeries and sufficient planned slack to operating room days. The objective is to maximize capacity utilization and minimize the risk of overtime, and thus cancelled patients. This

  20. Robustness Envelopes of Networks

    NARCIS (Netherlands)

    Trajanovski, S.; Martín-Hernández, J.; Winterbach, W.; Van Mieghem, P.

    2013-01-01

    We study the robustness of networks under node removal, considering random node failure, as well as targeted node attacks based on network centrality measures. Whilst both of these have been studied in the literature, existing approaches tend to study random failure in terms of average-case

  1. Robust boosting via convex optimization

    Science.gov (United States)

    Rätsch, Gunnar

    2001-12-01

    In this work we consider statistical learning problems. A learning machine aims to extract information from a set of training examples such that it is able to predict the associated label on unseen examples. We consider the case where the resulting classification or regression rule is a combination of simple rules - also called base hypotheses. The so-called boosting algorithms iteratively find a weighted linear combination of base hypotheses that predict well on unseen data. We address the following issues:
    - The statistical learning theory framework for analyzing boosting methods. We study learning theoretic guarantees on the prediction performance on unseen examples. Recently, large margin classification techniques emerged as a practical result of the theory of generalization, in particular Boosting and Support Vector Machines. A large margin implies a good generalization performance. Hence, we analyze how large the margins in boosting are and find an improved algorithm that is able to generate the maximum margin solution.
    - How can boosting methods be related to mathematical optimization techniques? To analyze the properties of the resulting classification or regression rule, it is of high importance to understand whether and under which conditions boosting converges. We show that boosting can be used to solve large scale constrained optimization problems, whose solutions are well characterizable. To show this, we relate boosting methods to methods known from mathematical optimization, and derive convergence guarantees for a quite general family of boosting algorithms.
    - How to make boosting noise robust? One of the problems of current boosting techniques is that they are sensitive to noise in the training sample. In order to make boosting robust, we transfer the soft margin idea from support vector learning to boosting. We develop theoretically motivated regularized algorithms that exhibit a high noise robustness.
    - How to adapt boosting to regression problems
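    The noise-sensitivity issue named in the third item is easy to reproduce. The sketch below is a minimal AdaBoost with an optional cap on the sample weights, a crude stand-in for the soft-margin regularization developed in the thesis; it is not Rätsch's actual algorithm, and the cap value is arbitrary.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, rounds=50, weight_cap=None):
    """Minimal AdaBoost on labels in {-1, +1}; optionally cap the sample
    weights, a crude soft-margin-style regularization for noisy labels."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    stumps, alphas = [], []
    for _ in range(rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = max(w[pred != y].sum(), 1e-10)
        if err >= 0.5:
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        w = w * np.exp(-alpha * y * pred)
        if weight_cap is not None:
            w = np.minimum(w, weight_cap)   # keep noisy points from dominating
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return lambda Xt: np.sign(sum(a * s.predict(Xt) for a, s in zip(alphas, stumps)))

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
y = 2 * y - 1                                    # labels in {-1, +1}
flip = np.random.default_rng(0).random(len(y)) < 0.1
y_noisy = np.where(flip, -y, y)                  # 10% label noise in training
for cap in (None, 0.01):
    f = adaboost(X[:400], y_noisy[:400], weight_cap=cap)
    print(f"weight_cap={cap}: test accuracy={(f(X[400:]) == y[400:]).mean():.3f}")
```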

  2. Design of Robust Neural Network Classifiers

    DEFF Research Database (Denmark)

    Larsen, Jan; Andersen, Lars Nonboe; Hintz-Madsen, Mads

    1998-01-01

    This paper addresses a new framework for designing robust neural network classifiers. The network is optimized using the maximum a posteriori technique, i.e., the cost function is the sum of the log-likelihood and a regularization term (prior). In order to perform robust classification, we present a modified likelihood function which incorporates the potential risk of outliers in the data. This leads to the introduction of a new parameter, the outlier probability. Designing the neural classifier involves optimization of network weights as well as the outlier probability and regularization parameters. We suggest adapting the outlier probability and regularization parameters by minimizing the error on a validation set, and a simple gradient descent scheme is derived. In addition, the framework allows for constructing a simple outlier detector. Experiments with artificial data demonstrate the potential...

  3. Robust photometric stereo using structural light sources

    Science.gov (United States)

    Han, Tian-Qi; Cheng, Yue; Shen, Hui-Liang; Du, Xin

    2014-05-01

    We propose a robust photometric stereo method by using structural arrangement of light sources. In the arrangement, light sources are positioned on a planar grid and form a set of collinear combinations. The shadow pixels are detected by adaptive thresholding. The specular highlight and diffuse pixels are distinguished according to their intensity deviations of the collinear combinations, thanks to the special arrangement of light sources. The highlight detection problem is cast as a pattern classification problem and is solved using support vector machine classifiers. Considering the possible misclassification of highlight pixels, the ℓ1 regularization is further employed in normal map estimation. Experimental results on both synthetic and real-world scenes verify that the proposed method can robustly recover the surface normal maps in the case of heavy specular reflection and outperforms the state-of-the-art techniques.
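    Under a Lambertian model, each pixel's normal n satisfies I = Ln for the stacked unit light directions L. The per-pixel robust estimation step can be sketched with iteratively reweighted least squares standing in for the paper's ℓ1 term; the light geometry, the corruption, and the IRLS substitution are all our assumptions.

```python
import numpy as np

# Light directions on a planar grid (rows are unit vectors), loosely echoing
# a structural source arrangement; the exact geometry here is invented.
L = np.array([[0, 0, 1], [1, 0, 1], [-1, 0, 1],
              [0, 1, 1], [0, -1, 1]], dtype=float)
L /= np.linalg.norm(L, axis=1, keepdims=True)

n_true = np.array([0.3, -0.2, 1.0])
n_true /= np.linalg.norm(n_true)
I = L @ n_true
I[1] += 0.8            # one corrupted observation (e.g. a specular highlight)

# Plain least squares vs. IRLS approximating an L1 data term.
n_ls = np.linalg.lstsq(L, I, rcond=None)[0]
n = n_ls.copy()
for _ in range(20):
    w = 1.0 / np.maximum(np.abs(I - L @ n), 1e-6)    # L1 reweighting
    W = np.diag(w)
    n = np.linalg.solve(L.T @ W @ L, L.T @ W @ I)

for name, v in [("true", n_true), ("lstsq", n_ls), ("irls-l1", n)]:
    print(name, np.round(v / np.linalg.norm(v), 3))
```

    The least-squares normal is dragged toward the corrupted measurement, while the reweighted solution effectively discounts it, which is the role the ℓ1 term plays in the paper's normal map estimation.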

  4. Competition improves robustness against loss of information

    Directory of Open Access Journals (Sweden)

    Arash Kermani Kolankeh

    2015-03-01

    Full Text Available A substantial number of works have aimed at modeling the receptive field properties of the primary visual cortex (V1). Their evaluation criterion is usually the similarity of the model response properties to the recorded responses from biological organisms. However, as several algorithms were able to demonstrate some degree of similarity to biological data based on the existing criteria, we focus on robustness against loss of information in the form of occlusions as an additional constraint for better understanding the algorithmic level of early vision in the brain. We investigate the influence of competition mechanisms on this robustness. Therefore, we compared four methods employing different competition mechanisms, namely independent component analysis, non-negative matrix factorization with sparseness constraint, predictive coding/biased competition, and a Hebbian neural network with lateral inhibitory connections. Each of these methods is known to be capable of developing receptive fields comparable to those of V1 simple cells. Since measuring the robustness of methods having simple-cell-like receptive fields against occlusion is difficult, we measure the robustness using the classification accuracy on the MNIST handwritten digit dataset: we trained all methods on the MNIST training set and tested them on a MNIST test set with different levels of occlusion. We observe that methods which employ competitive mechanisms have higher robustness against loss of information. The kind of competition mechanism also plays an important role in robustness. Global feedback inhibition as employed in predictive coding/biased competition has an advantage compared to local lateral inhibition learned by an anti-Hebb rule.
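    The evaluation protocol (learn features, then classify under occlusion) can be sketched with scikit-learn. Here plain NMF on the small 8x8 digits set stands in for the sparseness-constrained NMF and the full MNIST data used in the paper; component count and occlusion model are our choices.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import NMF
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)              # 8x8 digits, MNIST stand-in
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Learn parts-based features, then a linear read-out on top of them.
nmf = NMF(n_components=40, max_iter=500, random_state=0).fit(Xtr)
clf = LogisticRegression(max_iter=2000).fit(nmf.transform(Xtr), ytr)

rng = np.random.default_rng(0)
for frac in (0.0, 0.25, 0.5):
    Xocc = Xte.copy()
    Xocc[rng.random(Xocc.shape) < frac] = 0.0    # random pixel occlusion
    print(f"occlusion={frac:.2f}  "
          f"accuracy={clf.score(nmf.transform(Xocc), yte):.3f}")
```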

  5. A robust classic.

    Science.gov (United States)

    Kutzner, Florian; Vogel, Tobias; Freytag, Peter; Fiedler, Klaus

    2011-01-01

    In the present research, we argue for the robustness of illusory correlations (ICs, Hamilton & Gifford, 1976) regarding two boundary conditions suggested in previous research. First, we argue that ICs are maintained under extended experience. Using simulations, we derive conflicting predictions. Whereas noise-based accounts predict ICs to be maintained (Fiedler, 2000; Smith, 1991), a prominent account based on discrepancy-reducing feedback learning predicts ICs to disappear (Van Rooy et al., 2003). An experiment involving 320 observations with majority and minority members supports the claim that ICs are maintained. Second, we show that actively using the stereotype to make predictions that are met with reward and punishment does not eliminate the bias. In addition, participants' operant reactions afford a novel online measure of ICs. In sum, our findings highlight the robustness of ICs that can be explained as a result of unbiased but noisy learning.

  6. Bread crumb classification using fractal and multifractal features

    OpenAIRE

    Baravalle, Rodrigo Guillermo; Delrieux, Claudio Augusto; Gómez, Juan Carlos

    2017-01-01

    Adequate image descriptors are fundamental in image classification and object recognition. Main requirements for image features are robustness and low dimensionality which would lead to low classification errors in a variety of situations and with a reasonable computational cost. In this context, the identification of materials poses a significant challenge, since typical (geometric and/or differential) feature extraction methods are not robust enough. Texture features based on Fourier or wav...

  7. Robust Airline Schedules

    OpenAIRE

    Eggenberg, Niklaus; Salani, Matteo; Bierlaire, Michel

    2010-01-01

    Due to economic pressure industries, when planning, tend to focus on optimizing the expected profit or the yield. The consequence of highly optimized solutions is an increased sensitivity to uncertainty. This generates additional "operational" costs, incurred by possible modifications of the original plan to be performed when reality does not reflect what was expected in the planning phase. The modern research trend focuses on "robustness" of solutions instead of yield or profit. Although ro...

  8. Robust rooftop extraction from visible band images using higher order CRF

    KAUST Repository

    Li, Er; Femiani, John; Xu, Shibiao; Zhang, Xiaopeng; Wonka, Peter

    2015-01-01

    In this paper, we propose a robust framework for building extraction in visible band images. We first get an initial classification of the pixels based on an unsupervised presegmentation. Then, we develop a novel conditional random field (CRF

  9. The Crane Robust Control

    Directory of Open Access Journals (Sweden)

    Marek Hicar

    2004-01-01

    Full Text Available The article concerns a control design for the complete structure of a crane: crab, bridge and crane uplift. The most important unknown parameters for the simulations are the burden weight and the length of the hanging rope. We use robust control for the crab and bridge to ensure adaptivity to burden weight and rope length. Robust control is designed for the current control of the crab and bridge; it is necessary to know the range of the unknown parameters. The whole robust interval is split into subintervals, and after correct identification of the unknown parameters the most suitable robust controllers are chosen. The most important condition for the crab and bridge motion is avoiding burden swinging in the final position. The crab and bridge drive is implemented by an asynchronous motor fed from a frequency converter. We use the crane uplift with a burden weight observer in combination with the uplift, crab and bridge drive, with cooperation of their parameters: burden weight, rope length, and crab and bridge position. The controllers are designed by the state control method. We preferably use a disturbance observer which identifies the burden weight as a disturbance. The system works in both modes, with an empty hook as well as at maximum load: burden uplifting and dropping down.

  10. Classification of first branchial cleft anomalies: is it clinically relevant ...

    African Journals Online (AJOL)

    Background: There are three classification systems for first branchial cleft anomalies currently in use. The Arnot, Work and Olsen classifications describe these lesions on the basis of morphology, tissue of origin and clinical appearance. However, the clinical relevance of these classifications is debated, as they may not be ...

  11. Engineering Robust Nanocomposite Networks

    Science.gov (United States)

    2014-12-16

    Mucous secreted by the marine worm Chaetopterus sp. behaves rheologically as a yield stress gel. Measurements of the early-stage aggregation process were modeled by collision-driven growth of porous structures.

  12. Robust surface roughness indices and morphological interpretation

    Science.gov (United States)

    Trevisani, Sebastiano; Rocca, Michele

    2016-04-01

    Geostatistical image/surface texture indices based on the variogram (Atkinson and Lewis, 2000; Herzfeld and Higginson, 1996; Trevisani et al., 2012) and on its robust variant MAD (median absolute differences, Trevisani and Rocca, 2015) offer powerful tools for the analysis and interpretation of surface morphology (potentially not limited to the solid earth). In particular, the proposed robust index (Trevisani and Rocca, 2015), with its implementation based on local kernels, permits the derivation of a wide set of robust and customizable geomorphometric indices capable of outlining specific aspects of surface texture. The stability of MAD in the presence of signal noise and abrupt changes in spatial variability is well suited for the analysis of high-resolution digital terrain models. Moreover, the implementation of MAD by means of a pixel-centered perspective based on local kernels, with some analogies to the local binary pattern approach (Lucieer and Stein, 2005; Ojala et al., 2002), permits the creation of custom roughness indices capable of outlining different aspects of surface roughness (Grohmann et al., 2011; Smith, 2015). In the proposed poster, some potentialities of the new indices in the context of geomorphometry and landscape analysis will be presented. At the same time, challenges and future developments related to the proposed indices will be outlined. References: Atkinson, P.M., Lewis, P., 2000. Geostatistical classification for remote sensing: an introduction. Computers & Geosciences 26, 361-371. Grohmann, C.H., Smith, M.J., Riccomini, C., 2011. Multiscale analysis of topographic surface roughness in the Midland Valley, Scotland. IEEE Transactions on Geoscience and Remote Sensing 49, 1200-1213. Herzfeld, U.C., Higginson, C.A., 1996. Automated geostatistical seafloor classification - principles, parameters, feature vectors, and discrimination criteria. Computers and Geosciences 22 (1), 35-52. Lucieer, A., Stein, A., 2005. Texture-based landform segmentation of LiDAR imagery
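    A much-simplified, kernel-based MAD roughness index can be sketched as follows. The window size, the synthetic terrain, and the exact MAD definition used here (absolute differences of the neighbourhood from the central cell) are our simplifications of the index described above, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import generic_filter

def local_mad(values):
    """Median absolute difference of a pixel's neighbourhood from the
    central cell, a simplified kernel-based roughness measure."""
    center = values[len(values) // 2]
    return np.median(np.abs(values - center))

rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(0, 4, 200), np.linspace(0, 4, 200))
dem = np.sin(x) + 0.05 * rng.standard_normal(x.shape)   # smooth, slightly noisy relief
dem[:, 100:] += 0.5 * rng.standard_normal((200, 100))   # rough right half

roughness = generic_filter(dem, local_mad, size=3)
print("smooth half median roughness:", np.median(roughness[:, :100]).round(3))
print("rough  half median roughness:", np.median(roughness[:, 100:]).round(3))
```

    Because the median absolute difference ignores a minority of extreme values in each window, the index stays stable under isolated spikes, which is the property that makes it attractive for noisy high-resolution terrain models.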

  13. Lauren classification and individualized chemotherapy in gastric cancer

    OpenAIRE

    MA, JUNLI; SHEN, HONG; KAPESA, LINDA; ZENG, SHAN

    2016-01-01

    Gastric cancer is one of the most common malignancies worldwide. During the last 50 years, the histological classification of gastric carcinoma has been largely based on Lauren's criteria, in which gastric cancer is classified into two major histological subtypes, namely intestinal type and diffuse type adenocarcinoma. This classification was introduced in 1965, and remains currently widely accepted and employed, since it constitutes a simple and robust classification approach. The two histol...

  14. Classification in context

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper surveys classification research literature, discusses various classification theories, and shows that the focus has traditionally been on establishing a scientific foundation for classification research. This paper argues that a shift has taken place, and suggests that contemporary classification research focus on contextual information as the guide for the design and construction of classification schemes.

  15. Classification of the web

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. This paper suggests that the classification of the Web meets challenges that call for inquiries into the theoretical foundation of bibliographic classification theory.

  16. Robust efficient video fingerprinting

    Science.gov (United States)

    Puri, Manika; Lubin, Jeffrey

    2009-02-01

    We have developed a video fingerprinting system with robustness and efficiency as the primary and secondary design criteria. In extensive testing, the system has shown robustness to cropping, letter-boxing, sub-titling, blur, drastic compression, frame rate changes, size changes and color changes, as well as to the geometric distortions often associated with camcorder capture in cinema settings. Efficiency is afforded by a novel two-stage detection process in which a fast matching process first computes a number of likely candidates, which are then passed to a second slower process that computes the overall best match with minimal false alarm probability. One key component of the algorithm is a maximally stable volume computation - a three-dimensional generalization of maximally stable extremal regions - that provides a content-centric coordinate system for subsequent hash function computation, independent of any affine transformation or extensive cropping. Other key features include an efficient bin-based polling strategy for initial candidate selection, and a final SIFT feature-based computation for final verification. We describe the algorithm and its performance, and then discuss additional modifications that can provide further improvement to efficiency and accuracy.
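    The two-stage detection idea, a fast vote-based candidate poll followed by an exact similarity check on the shortlist, can be sketched as follows. Random vectors stand in for the actual fingerprints, and the sign-agreement poll is a generic substitute for the bin-based polling strategy described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Database of unit-norm fingerprint vectors (random stand-ins).
db = rng.normal(size=(10000, 32)).astype(np.float32)
db /= np.linalg.norm(db, axis=1, keepdims=True)

query = db[1234] + 0.05 * rng.normal(size=32).astype(np.float32)  # distorted copy
query /= np.linalg.norm(query)

# Stage 1 (fast): coarse candidate polling by sign-pattern agreement.
votes = ((db > 0) == (query > 0)).sum(axis=1)
candidates = np.argsort(votes)[-50:]          # top-50 likely candidates

# Stage 2 (slow, exact): full similarity only on the candidate set.
sims = db[candidates] @ query
best = candidates[np.argmax(sims)]
print("best match:", best, "similarity:", round(float(np.max(sims)), 3))
```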

  17. When machine vision meets histology: A comparative evaluation of model architecture for classification of histology sections.

    Science.gov (United States)

    Zhong, Cheng; Han, Ju; Borowsky, Alexander; Parvin, Bahram; Wang, Yunfu; Chang, Hang

    2017-01-01

    Classification of histology sections in large cohorts, in terms of distinct regions of microanatomy (e.g., stromal) and histopathology (e.g., tumor, necrosis), enables the quantification of tumor composition and the construction of predictive models of genomics and clinical outcome. To tackle the large technical variations and biological heterogeneities, which are intrinsic in large cohorts, emerging systems utilize either prior knowledge from pathologists or unsupervised feature learning for invariant representation of the underlying properties in the data. However, to a large degree, the architecture for tissue histology classification remains unexplored and requires urgent systematic investigation. This paper is the first attempt to provide insights into three fundamental questions in tissue histology classification: I. Is unsupervised feature learning preferable to human-engineered features? II. Does cellular saliency help? III. Does the sparse feature encoder contribute to recognition? We show that (a) in I, both the Cellular Morphometric Feature and features from unsupervised feature learning lead to superior performance when compared to SIFT and [Color, Texture]; (b) in II, cellular saliency incorporation impairs the performance for systems built upon pixel-/patch-level features; and (c) in III, the effect of the sparse feature encoder is correlated with the robustness of features, and the performance can be consistently improved by the multi-stage extension of systems built upon both the Cellular Morphometric Feature and features from unsupervised feature learning. These insights are validated with two cohorts of Glioblastoma Multiforme (GBM) and Kidney Clear Cell Carcinoma (KIRC).

  18. Training echo state networks for rotation-invariant bone marrow cell classification.

    Science.gov (United States)

    Kainz, Philipp; Burgsteiner, Harald; Asslaber, Martin; Ahammer, Helmut

    2017-01-01

    The main principle of diagnostic pathology is the reliable interpretation of individual cells in the context of the tissue architecture. A confident examination of bone marrow specimens in particular depends on a valid classification of myeloid cells. In this work, we propose a novel rotation-invariant learning scheme for multi-class echo state networks (ESNs), which achieves very high performance in automated bone marrow cell classification. Based on representing static images as a temporal sequence of rotations, we show how ESNs robustly recognize cells of arbitrary rotations by taking advantage of their short-term memory capacity. The performance of our approach is compared to a random forest classifier that learns rotation invariance in a conventional way, by exhaustively training on multiple rotations of individual samples. The methods were evaluated on a human bone marrow image database consisting of granulopoietic and erythropoietic cells in different maturation stages. Our ESN approach to cell classification does not rely on segmentation of cells or manual feature extraction and can therefore be applied directly to image data.
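    The core trick, presenting a static image to the reservoir as a temporal sequence of its rotations, can be sketched with a minimal ESN. The reservoir size, rotation step, input scaling, and the 8x8 digits used as a stand-in for bone marrow cell images are all our assumptions, not the authors' configuration.

```python
import numpy as np
from scipy.ndimage import rotate
from sklearn.datasets import load_digits
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)   # 8x8 images, stand-in for cell images
n_in, n_res = 64, 300

W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius below 1

def reservoir_state(img):
    """Drive the reservoir with a sequence of rotated copies of one image;
    the final state is the ESN's short-term memory of the whole sweep."""
    state = np.zeros(n_res)
    for angle in range(0, 360, 30):
        u = rotate(img.reshape(8, 8), angle, reshape=False).ravel() / 16.0
        state = np.tanh(W_in @ u + W @ state)
    return state

states = np.array([reservoir_state(img) for img in X])
Str, Ste, ytr, yte = train_test_split(states, y, random_state=0)
print("accuracy:", round(RidgeClassifier().fit(Str, ytr).score(Ste, yte), 3))
```

    Because every image is swept through the same full circle of rotations before the readout sees it, the final reservoir state changes little when the input arrives pre-rotated, which is the source of the rotation invariance.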

  19. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility.

  20. Robust automated knowledge capture.

    Energy Technology Data Exchange (ETDEWEB)

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications, and in particular, the individual characteristics that underlie adaptive thinking.

  1. Passion, Robustness and Perseverance

    DEFF Research Database (Denmark)

    Lim, Miguel Antonio; Lund, Rebecca

    2016-01-01

    Evaluation and merit in the measured university are increasingly based on taken-for-granted assumptions about the "ideal academic". We suggest that the scholar now needs to show that she is passionate about her work and that she gains pleasure from pursuing her craft. We suggest that passion and pleasure achieve an exalted status as something compulsory. The scholar ought to feel passionate about her work and signal that she takes pleasure also in the difficult moments. Passion has become a signal of robustness and perseverance in a job market characterised by funding shortages, increased pressure...... way to demonstrate their potential and, crucially, their passion for their work. Drawing on the literature on technologies of governance, we reflect on what is captured and what is left out by these two evaluation instruments. We suggest that bibliometric analysis at the individual level is deeply...

  2. Robust Optical Flow Estimation

    Directory of Open Access Journals (Sweden)

    Javier Sánchez Pérez

    2013-10-01

    Full Text Available In this work, we describe an implementation of the variational method proposed by Brox et al. in 2004, which yields accurate optical flows with low running times. It has several benefits with respect to the method of Horn and Schunck: it is more robust to the presence of outliers, produces piecewise-smooth flow fields, and can cope with constant brightness changes. This method relies on the brightness and gradient constancy assumptions, using the information of the image intensities and the image gradients to find correspondences. It also generalizes the use of continuous L1 functionals, which help mitigate the effect of outliers and create a Total Variation (TV) regularization. Additionally, it introduces a simple temporal regularization scheme that enforces a continuous temporal coherence of the flow fields.

  3. Robust Multimodal Dictionary Learning

    Science.gov (United States)

    Cao, Tian; Jojic, Vladimir; Modla, Shannon; Powell, Debbie; Czymmek, Kirk; Niethammer, Marc

    2014-01-01

    We propose a robust multimodal dictionary learning method for multimodal images. Joint dictionary learning for both modalities may be impaired by lack of correspondence between image modalities in training data, for example due to areas of low quality in one of the modalities. Dictionaries learned with such non-corresponding data will induce uncertainty about image representation. In this paper, we propose a probabilistic model that accounts for image areas that are poorly corresponding between the image modalities. We cast the problem of learning a dictionary in presence of problematic image patches as a likelihood maximization problem and solve it with a variant of the EM algorithm. Our algorithm iterates identification of poorly corresponding patches and refinements of the dictionary. We tested our method on synthetic and real data. We show improvements in image prediction quality and alignment accuracy when using the method for multimodal image registration. PMID:24505674
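    The alternation between flagging poorly corresponding patches and refining the dictionary can be caricatured in a single modality with scikit-learn's dictionary learner. The synthetic patches, the residual criterion, and the 90th-percentile cutoff are invented for illustration; the paper's EM algorithm is probabilistic rather than this hard-thresholding loop.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)

# Synthetic patches: most follow a low-dimensional model, some are corrupted
# (a stand-in for areas that do not correspond between modalities).
basis = rng.normal(size=(5, 64))
codes = rng.normal(size=(500, 5))
patches = codes @ basis + 0.01 * rng.normal(size=(500, 64))
patches[:50] = rng.normal(size=(50, 64))        # "poorly corresponding" patches

good = np.ones(len(patches), dtype=bool)
for it in range(3):   # EM-like alternation, much simplified
    dico = MiniBatchDictionaryLearning(n_components=8, alpha=0.5,
                                       random_state=0).fit(patches[good])
    code = dico.transform(patches)
    resid = np.linalg.norm(patches - code @ dico.components_, axis=1)
    good = resid < np.percentile(resid, 90)     # drop the worst 10% of patches
    print(f"iter {it}: flagged {int((~good).sum())} patches, "
          f"median residual {np.median(resid):.3f}")
```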

  4. Robust snapshot interferometric spectropolarimetry.

    Science.gov (United States)

    Kim, Daesuk; Seo, Yoonho; Yoon, Yonghee; Dembele, Vamara; Yoon, Jae Woong; Lee, Kyu Jin; Magnusson, Robert

    2016-05-15

    This Letter describes a Stokes vector measurement method based on a snapshot interferometric common-path spectropolarimeter. The proposed scheme, which employs an interferometric polarization-modulation module, can extract the spectral polarimetric parameters Ψ(k) and Δ(k) of a transmissive anisotropic object, from which an accurate Stokes vector can be calculated in the spectral domain. It is inherently robust to 3D pose variation of the object, since it is designed so that the measured object can be placed outside the interferometric module. Experiments are conducted to verify the feasibility of the proposed system. The proposed snapshot scheme enables us to extract the spectral Stokes vector of a transmissive anisotropic object within tens of milliseconds with high accuracy.

  5. International Conference on Robust Statistics

    CERN Document Server

    Filzmoser, Peter; Gather, Ursula; Rousseeuw, Peter

    2003-01-01

    Aspects of Robust Statistics are important in many areas. Based on the International Conference on Robust Statistics 2001 (ICORS 2001) in Vorau, Austria, this volume discusses future directions of the discipline, bringing together leading scientists, experienced researchers and practitioners, as well as younger researchers. The papers cover a multitude of different aspects of Robust Statistics. For instance, the fundamental problem of data summary (weights of evidence) is considered and its robustness properties are studied. Further theoretical subjects include e.g.: robust methods for skewness, time series, longitudinal data, multivariate methods, and tests. Some papers deal with computational aspects and algorithms. Finally, the aspects of application and programming tools complete the volume.

  6. Robustness Analyses of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Hald, Frederik

    2013-01-01

    The robustness of structural systems has obtained a renewed interest arising from a much more frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure. In order to minimise the likelihood of such disproportionate structural failures, many modern building codes consider the need for the robustness of structures and provide strategies and methods to obtain robustness. Therefore, a structural engineer may take necessary steps to design robust structures that are insensitive to accidental circumstances. The present paper summarises issues with respect to robustness of timber structures and will discuss the consequences of such robustness issues related to the future development of timber structures.

  7. Weakly supervised classification in high energy physics

    International Nuclear Information System (INIS)

    Dery, Lucio Mwinmaarong; Nachman, Benjamin; Rubbo, Francesco; Schwartzman, Ariel

    2017-01-01

    As machine learning algorithms become increasingly sophisticated to exploit subtle features of the data, they often become more dependent on simulations. This paper presents a new approach called weakly supervised classification in which class proportions are the only input into the machine learning algorithm. Using one of the most challenging binary classification tasks in high energy physics — quark versus gluon tagging — we show that weakly supervised classification can match the performance of fully supervised algorithms. Furthermore, by design, the new algorithm is insensitive to any mis-modeling of discriminating features in the data by the simulation. Weakly supervised classification is a general procedure that can be applied to a wide variety of learning problems to boost performance and robustness when detailed simulations are not reliable or not available.
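    A toy version of the idea: when two mixed samples ("bags") have different class proportions, a classifier trained merely to distinguish the bags tends to rank individual events by class likelihood. This is a simplified stand-in for the proportion-based loss in the paper; the feature model, fractions, and evaluation are all invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_bag(n, frac_signal):
    """Mixture bag: 'signal' and 'background' differ only in feature means."""
    y = rng.random(n) < frac_signal
    X = rng.normal(loc=np.where(y[:, None], 1.0, 0.0), size=(n, 3))
    return X, y

(X1, y1), (X2, y2) = make_bag(2000, 0.8), make_bag(2000, 0.3)

# Weak supervision: each sample carries only its bag's identity/proportion.
X = np.vstack([X1, X2])
bag_label = np.concatenate([np.ones(len(X1)), np.zeros(len(X2))])
clf = LogisticRegression().fit(X, bag_label)    # bag id as surrogate target

# The surrogate classifier still ranks individual samples by signal-ness.
Xt, yt = make_bag(2000, 0.5)
scores = clf.decision_function(Xt)
acc = ((scores > np.median(scores)) == yt).mean()
print("per-sample accuracy from proportion-level supervision:", round(acc, 3))
```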

  8. Weakly supervised classification in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Dery, Lucio Mwinmaarong [Physics Department, Stanford University,Stanford, CA, 94305 (United States); Nachman, Benjamin [Physics Division, Lawrence Berkeley National Laboratory,1 Cyclotron Rd, Berkeley, CA, 94720 (United States); Rubbo, Francesco; Schwartzman, Ariel [SLAC National Accelerator Laboratory, Stanford University,2575 Sand Hill Rd, Menlo Park, CA, 94025 (United States)

    2017-05-29

    As machine learning algorithms become increasingly sophisticated to exploit subtle features of the data, they often become more dependent on simulations. This paper presents a new approach called weakly supervised classification in which class proportions are the only input into the machine learning algorithm. Using one of the most challenging binary classification tasks in high energy physics — quark versus gluon tagging — we show that weakly supervised classification can match the performance of fully supervised algorithms. Furthermore, by design, the new algorithm is insensitive to any mis-modeling of discriminating features in the data by the simulation. Weakly supervised classification is a general procedure that can be applied to a wide variety of learning problems to boost performance and robustness when detailed simulations are not reliable or not available.

  9. Dynamics robustness of cascading systems.

    Directory of Open Access Journals (Sweden)

    Jonathan T Young

    2017-03-01

    Full Text Available A most important property of biochemical systems is robustness. Static robustness, e.g., homeostasis, is the insensitivity of a state against perturbations, whereas dynamics robustness, e.g., homeorhesis, is the insensitivity of a dynamic process. In contrast to the extensively studied static robustness, dynamics robustness, i.e., how a system creates an invariant temporal profile against perturbations, is little explored, despite transient dynamics being crucial for cellular fates and being reported to be experimentally robust. For example, the duration of a stimulus elicits different phenotypic responses, and signaling networks process and encode temporal information. Hence, robustness in time courses will be necessary for functional biochemical networks. Based on dynamical systems theory, we uncovered a general mechanism to achieve dynamics robustness. Using a three-stage linear signaling cascade as an example, we found that the temporal profiles and response duration post-stimulus are robust to perturbations of certain parameters. Then, analyzing the linearized model, we elucidated the criteria for when signaling cascades will display dynamics robustness. We found that changes in the upstream modules are masked in the cascade, and that the response duration is mainly controlled by the rate-limiting module and the organization of the cascade's kinetics. Specifically, we found two necessary conditions for dynamics robustness in signaling cascades: (1) a constraint on the rate-limiting process: the phosphatase activity in the perturbed module is not the slowest; and (2) constraints on the initial conditions: the kinase activity needs to be fast enough such that each module is saturated even with fast phosphatase activity and upstream changes are attenuated. We discussed the relevance of such robustness to several biological examples and the validity of the above conditions therein. Given the applicability of dynamics robustness to a variety of systems, it
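    The masking of upstream changes can be reproduced with a minimal linear three-stage cascade; the equations and rate constants below are a caricature, not the authors' model. Scaling the upstream rate k1 rescales the output amplitude, but by linearity the width at half maximum of the downstream response is unchanged.

```python
import numpy as np
from scipy.integrate import solve_ivp

def cascade(t, x, k):
    """Three-stage linear cascade driven by a decaying stimulus; each stage
    activates the next and relaxes (a caricature of a signaling cascade)."""
    k1, k2, k3 = k
    s = np.exp(-t)                          # transient stimulus
    return [k1 * s - x[0], k2 * x[0] - x[1], k3 * x[1] - x[2]]

t = np.linspace(0.0, 15.0, 500)
for scale in (0.5, 1.0, 2.0):               # perturb only the upstream rate k1
    sol = solve_ivp(cascade, (0.0, 15.0), [0.0, 0.0, 0.0],
                    t_eval=t, args=([scale, 1.0, 0.2],))
    x3 = sol.y[2]
    above = t[x3 > 0.5 * x3.max()]           # width at half maximum
    print(f"k1 x{scale}: peak={x3.max():.3f}, "
          f"duration~{above[-1] - above[0]:.2f}")
```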

  10. Robust continuous clustering.

    Science.gov (United States)

    Shah, Sohil Atul; Koltun, Vladlen

    2017-09-12

    Clustering is a fundamental procedure in the analysis of scientific data. It is used ubiquitously across the sciences. Despite decades of research, existing clustering algorithms have limited effectiveness in high dimensions and often require tuning parameters for different domains and datasets. We present a clustering algorithm that achieves high accuracy across multiple domains and scales efficiently to high dimensions and large datasets. The presented algorithm optimizes a smooth continuous objective, which is based on robust statistics and allows heavily mixed clusters to be untangled. The continuous nature of the objective also allows clustering to be integrated as a module in end-to-end feature learning pipelines. We demonstrate this by extending the algorithm to perform joint clustering and dimensionality reduction by efficiently optimizing a continuous global objective. The presented approach is evaluated on large datasets of faces, hand-written digits, objects, newswire articles, sensor readings from the Space Shuttle, and protein expression levels. Our method achieves high accuracy across all datasets, outperforming the best prior algorithm by a factor of 3 in average rank.
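    A loose caricature of such a smooth robust-clustering objective: representatives are pulled toward their data points while a Geman-McClure penalty over a kNN graph lets nearby pairs merge and distant pairs decouple. This is our own simplification for illustration, not the published algorithm, and the step size, scales, and linkage threshold are arbitrary.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.datasets import make_blobs
from sklearn.neighbors import kneighbors_graph

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.6, random_state=0)
E = np.array(kneighbors_graph(X, 5).nonzero()).T       # kNN edges (i, j)

U = X.copy()                                           # one representative per point
lam, mu = 1.0, 1.0                                     # pairwise weight, GM scale
for _ in range(200):
    diff = U[E[:, 0]] - U[E[:, 1]]
    d2 = np.sum(diff ** 2, axis=1)
    w = (mu / (mu + d2)) ** 2        # Geman-McClure reweighting: far pairs decouple
    grad = U - X                     # data term pulls toward the input points
    np.add.at(grad, E[:, 0], lam * w[:, None] * diff)
    np.add.at(grad, E[:, 1], -lam * w[:, None] * diff)
    U -= 0.05 * grad                 # gradient step on the smooth objective
labels = fcluster(linkage(U, "single"), t=1.0, criterion="distance")
print("recovered clusters:", len(np.unique(labels)))
```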

  11. Robust Trust in Expert Testimony

    Directory of Open Access Journals (Sweden)

    Christian Dahlman

    2015-05-01

    Full Text Available The standard of proof in criminal trials should require that the evidence presented by the prosecution is robust. This requirement of robustness says that it must be unlikely that additional information would change the probability that the defendant is guilty. Robustness is difficult for a judge to estimate, as it requires the judge to assess the possible effect of information that he or she does not have. This article is concerned with expert witnesses and proposes a method for reviewing the robustness of expert testimony. According to the proposed method, the robustness of expert testimony is estimated with regard to competence, motivation, external strength, internal strength and relevance. The danger of trusting non-robust expert testimony is illustrated with an analysis of the Thomas Quick Case, a Swedish legal scandal where a patient at a mental institution was wrongfully convicted for eight murders.

  12. Functional Basis of Microorganism Classification.

    Science.gov (United States)

    Zhu, Chengsheng; Delmont, Tom O; Vogel, Timothy M; Bromberg, Yana

    2015-08-01

    Correctly identifying nearest "neighbors" of a given microorganism is important in industrial and clinical applications where close relationships imply similar treatment. Microbial classification based on similarity of physiological and genetic organism traits (polyphasic similarity) is experimentally difficult and, arguably, subjective. Evolutionary relatedness, inferred from phylogenetic markers, facilitates classification but does not guarantee functional identity between members of the same taxon or lack of similarity between different taxa. Using over thirteen hundred sequenced bacterial genomes, we built a novel function-based microorganism classification scheme, functional-repertoire similarity-based organism network (FuSiON; flattened to fusion). Our scheme is phenetic, based on a network of quantitatively defined organism relationships across the known prokaryotic space. It correlates significantly with the current taxonomy, but the observed discrepancies reveal both (1) the inconsistency of functional diversity levels among different taxa and (2) an (unsurprising) bias towards prioritizing, for classification purposes, relatively minor traits of particular interest to humans. Our dynamic network-based organism classification is independent of the arbitrary pairwise organism similarity cut-offs traditionally applied to establish taxonomic identity. Instead, it reveals natural, functionally defined organism groupings and is thus robust in handling organism diversity. Additionally, fusion can use organism meta-data to highlight the specific environmental factors that drive microbial diversification. Our approach provides a complementary view to cladistic assignments and holds important clues for further exploration of microbial lifestyles. Fusion is a more practical fit for biomedical, industrial, and ecological applications, as many of these rely on understanding the functional capabilities of the microbes in their environment and are less concerned with

  13. Functional Basis of Microorganism Classification

    Science.gov (United States)

    Zhu, Chengsheng; Delmont, Tom O.; Vogel, Timothy M.; Bromberg, Yana

    2015-01-01

    Correctly identifying nearest “neighbors” of a given microorganism is important in industrial and clinical applications where close relationships imply similar treatment. Microbial classification based on similarity of physiological and genetic organism traits (polyphasic similarity) is experimentally difficult and, arguably, subjective. Evolutionary relatedness, inferred from phylogenetic markers, facilitates classification but does not guarantee functional identity between members of the same taxon or lack of similarity between different taxa. Using over thirteen hundred sequenced bacterial genomes, we built a novel function-based microorganism classification scheme, functional-repertoire similarity-based organism network (FuSiON; flattened to fusion). Our scheme is phenetic, based on a network of quantitatively defined organism relationships across the known prokaryotic space. It correlates significantly with the current taxonomy, but the observed discrepancies reveal both (1) the inconsistency of functional diversity levels among different taxa and (2) an (unsurprising) bias towards prioritizing, for classification purposes, relatively minor traits of particular interest to humans. Our dynamic network-based organism classification is independent of the arbitrary pairwise organism similarity cut-offs traditionally applied to establish taxonomic identity. Instead, it reveals natural, functionally defined organism groupings and is thus robust in handling organism diversity. Additionally, fusion can use organism meta-data to highlight the specific environmental factors that drive microbial diversification. Our approach provides a complementary view to cladistic assignments and holds important clues for further exploration of microbial lifestyles. Fusion is a more practical fit for biomedical, industrial, and ecological applications, as many of these rely on understanding the functional capabilities of the microbes in their environment and are less concerned

  14. Classification across gene expression microarray studies

    Directory of Open Access Journals (Sweden)

    Kuner Ruprecht

    2009-12-01

    Full Text Available Abstract Background The increasing number of gene expression microarray studies represents an important resource in biomedical research. As a result, gene expression based diagnosis has entered clinical practice for patient stratification in breast cancer. However, the integration and combined analysis of microarray studies still remains a challenge. We assessed the potential benefit of data integration on the classification accuracy and systematically evaluated the generalization performance of selected methods on four breast cancer studies comprising almost 1000 independent samples. To this end, we introduced an evaluation framework which aims to establish good statistical practice and a graphical way to monitor differences. The classification goal was to correctly predict estrogen receptor status (negative/positive) and histological grade (low/high) of each tumor sample in an independent study which was not used for the training. For the classification we chose support vector machines (SVM), predictive analysis of microarrays (PAM), random forest (RF) and k-top scoring pairs (kTSP). Guided by considerations relevant for classification across studies we developed a generalization of kTSP which we evaluated in addition. Our derived version (DV) aims to improve the robustness of the intrinsic invariance of kTSP with respect to technologies and preprocessing. Results For each individual study the generalization error was benchmarked via complete cross-validation and was found to be similar for all classification methods. The misclassification rates were substantially higher in classification across studies, when each single study was used as an independent test set while all remaining studies were combined for the training of the classifier. However, with increasing number of independent microarray studies used in the training, the overall classification performance improved. DV performed better than the average and showed slightly less variance. In
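    The rank-based pair scoring at the heart of kTSP fits in a few lines; the toy expression data below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy expression matrix: 40 samples x 6 genes; genes 0 and 1 swap their
# ordering between the two classes, the remaining genes are noise.
y = np.repeat([0, 1], 20)
X = rng.normal(size=(40, 6))
X[y == 0, 0] += 2.0
X[y == 1, 1] += 2.0

def tsp_scores(X, y):
    """Score each gene pair by |P(X_i < X_j | class 0) - P(X_i < X_j | class 1)|."""
    scores = {}
    for i in range(X.shape[1]):
        for j in range(i + 1, X.shape[1]):
            p0 = np.mean(X[y == 0, i] < X[y == 0, j])
            p1 = np.mean(X[y == 1, i] < X[y == 1, j])
            scores[(i, j)] = abs(p0 - p1)
    return scores

scores = tsp_scores(X, y)
best = max(scores, key=scores.get)
print("top scoring pair:", best, "score:", round(scores[best], 3))
```

    Because the score uses only within-sample gene orderings, it is unchanged by any monotone per-sample transformation, which is the kind of invariance across platforms and preprocessing that the derived version above aims to strengthen.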

  15. ACCUWIND - Methods for classification of cup anemometers

    Energy Technology Data Exchange (ETDEWEB)

    Dahlberg, J.Aa.; Friis Pedersen, T.; Busche, P.

    2006-05-15

    Errors associated with the measurement of wind speed are the major sources of uncertainties in power performance testing of wind turbines. Field comparisons of well-calibrated anemometers show significant and unacceptable differences. The European CLASSCUP project set out to quantify the errors associated with the use of cup anemometers and to develop a classification system for quantification of the systematic errors of cup anemometers. This classification system has now been implemented in the IEC 61400-12-1 standard on power performance measurements in annexes I and J. The classification of cup anemometers requires general external climatic operational ranges to be applied for the analysis of systematic errors. A Class A category classification is connected to reasonably flat sites, and another Class B category is connected to complex terrain. General classification indices are the result of assessment of systematic deviations. The present report focuses on methods that can be applied for assessment of such systematic deviations. A new alternative method for torque coefficient measurements at inclined flow has been developed, which has then been applied and compared to the existing methods developed in the CLASSCUP project and earlier. A number of approaches, including the use of two cup anemometer models, two methods of torque coefficient measurement, two angular response measurements, and inclusion and exclusion of the influence of friction, have been implemented in the classification process in order to assess the robustness of the methods. The results of the analysis are presented as classification indices, which are compared and discussed. (au)

  16. Robust Robot Grasp Detection in Multimodal Fusion

    Directory of Open Access Journals (Sweden)

    Zhang Qiang

    2017-01-01

    Full Text Available Accurate robot grasp detection for model-free objects plays an important role in robotics. With the development of RGB-D sensors, object perception technology has made great progress. Rich feature expression from the colour and the depth data is a critical problem that needs to be addressed in order to accomplish the grasping task. To solve the data fusion problem, this paper proposes a convolutional neural network (CNN) based approach combined with regression and classification. In the CNN model, the colour and the depth modal data are deeply fused together to achieve accurate feature expression. Additionally, the Welsch function is introduced into the approach to enhance the robustness of the training process. Experimental results demonstrate the superiority of the proposed method.

  17. A robust dataset-agnostic heart disease classifier from Phonocardiogram.

    Science.gov (United States)

    Banerjee, Rohan; Dutta Choudhury, Anirban; Deshpande, Parijat; Bhattacharya, Sakyajit; Pal, Arpan; Mandana, K M

    2017-07-01

    Automatic classification of normal and abnormal heart sounds is a popular area of research. However, building a robust algorithm unaffected by signal quality and patient demography is a challenge. In this paper we analyse a wide range of phonocardiogram (PCG) features in the time and frequency domains, along with morphological and statistical features, to construct a robust and discriminative feature set for dataset-agnostic classification of normal and cardiac patients. The large and open-access database made available in the PhysioNet 2016 challenge was used for feature selection, internal validation and creation of training models. A second dataset of 41 PCG segments, collected using our in-house smartphone-based digital stethoscope from an Indian hospital, was used for performance evaluation. Our proposed methodology yielded sensitivity and specificity scores of 0.76 and 0.75 respectively on the test dataset in classifying cardiovascular diseases. The methodology also outperformed three popular prior-art approaches when applied to the same dataset.
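    The flavour of such a time- and frequency-domain feature set can be sketched with numpy/scipy; the synthetic signal and the particular features below are placeholders of our choosing, not the paper's selected list.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import kurtosis, skew

fs = 2000                                     # Hz, a typical PCG sampling rate
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic heart-sound-like signal: periodic bursts plus noise (placeholder).
pcg = np.sin(2 * np.pi * 1.2 * t) ** 20 * np.sin(2 * np.pi * 60 * t)
pcg += 0.05 * rng.standard_normal(t.shape)

f, pxx = welch(pcg, fs=fs, nperseg=1024)
features = {
    "rms": float(np.sqrt(np.mean(pcg ** 2))),         # time domain
    "skewness": float(skew(pcg)),                     # morphology/statistics
    "kurtosis": float(kurtosis(pcg)),
    "dominant_freq_hz": float(f[np.argmax(pxx)]),     # frequency domain
    "spectral_centroid": float(np.sum(f * pxx) / np.sum(pxx)),
}
print(features)
```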

  18. High-dimensional data in economics and their (robust) analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2017-01-01

    Roč. 12, č. 1 (2017), s. 171-183 ISSN 1452-4864 R&D Projects: GA ČR GA17-07384S Institutional support: RVO:67985556 Keywords : econometrics * high-dimensional data * dimensionality reduction * linear regression * classification analysis * robustness Subject RIV: BA - General Mathematics OBOR OECD: Business and management http://library.utia.cas.cz/separaty/2017/SI/kalina-0474076.pdf

  19. High-dimensional Data in Economics and their (Robust) Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2017-01-01

    Roč. 12, č. 1 (2017), s. 171-183 ISSN 1452-4864 R&D Projects: GA ČR GA17-07384S Grant - others:GA ČR(CZ) GA13-01930S Institutional support: RVO:67985807 Keywords : econometrics * high-dimensional data * dimensionality reduction * linear regression * classification analysis * robustness Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability

  20. SAW Classification Algorithm for Chinese Text Classification

    OpenAIRE

    Xiaoli Guo; Huiyu Sun; Tiehua Zhou; Ling Wang; Zhaoyang Qu; Jiannan Zang

    2015-01-01

    Considering the explosive growth of data, the increasing amount of text data places ever higher demands on the performance of text categorization, demands that existing classification methods cannot satisfy. Based on a study of existing text classification technology and semantics, this paper puts forward a Chinese-text-classification-oriented SAW (Structural Auxiliary Word) algorithm. The algorithm uses the special space effect of Chinese text where words...

  1. Wagner classification and culture analysis of diabetic foot infection

    Directory of Open Access Journals (Sweden)

    Fatma Bozkurt

    2011-03-01

    Full Text Available The aim of this study was to determine the concordance ratio between microorganisms isolated from deep tissue cultures and those from superficial cultures in patients with diabetic foot, according to Wagner's wound classification method. Materials and methods: A total of 63 patients with diabetic foot infection, who were admitted to Dicle University Hospital between October 2006 and November 2007, were included in the study. Wagner's classification method was used for wound classification. For microbiologic studies, superficial and deep tissue specimens were obtained from each patient and were rapidly sent to the laboratory for aerobic and anaerobic cultures. Microbiologic data were analyzed and interpreted in line with the sensitivity and specificity formulas. Results: Thirty-eight (60%) of the patients were Wagner classification ≤2, while 25 (40%) patients were Wagner classification ≥3. According to our culture results, 66 (69%) Gram-positive and 30 (31%) Gram-negative microorganisms grew in Wagner classification ≤2 patients, while in Wagner classification ≥3, 25 (35%) Gram-positive and 46 (65%) Gram-negative microorganisms grew. Microorganisms grew in 89% of superficial cultures and 64% of deep tissue cultures in patients with Wagner classification ≤2, while microorganisms grew in 64% of those with Wagner classification ≥3. Conclusion: In ulcers of diabetic foot infections, initial treatment should be started according to the result of a sterile superficial culture, but a deep tissue culture should be taken if the patient is unresponsive to initial treatment.
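    For reference, the sensitivity and specificity computation mentioned in the methods reduces to two ratios; the counts below are placeholders, not the study's data.

```python
# Sensitivity and specificity of superficial culture against deep-tissue
# culture as the reference standard (hypothetical counts).
tp, fn = 30, 5     # reference positive: superficial culture agrees / misses
tn, fp = 20, 8     # reference negative: superficial culture agrees / false alarm

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```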

  2. Robust and distributed hypothesis testing

    CERN Document Server

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers as well as modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation, and the theory of robust distributed detection is extended to classes of distributions which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...

  3. Robustness of IPTV business models

    NARCIS (Netherlands)

    Bouwman, H.; Zhengjia, M.; Duin, P. van der; Limonard, S.

    2008-01-01

    The final stage in the STOF method is an evaluation of the robustness of the design, for which the method provides some guidelines. For many innovative services, the future holds numerous uncertainties, which makes evaluating the robustness of a business model a difficult task. In this chapter, we

  4. Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    2009-01-01

    Robustness of structural systems has obtained a renewed interest due to a much more frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure....

  5. Gynecomastia Classification for Surgical Management: A Systematic Review and Novel Classification System.

    Science.gov (United States)

    Waltho, Daniel; Hatchell, Alexandra; Thoma, Achilleas

    2017-03-01

    Gynecomastia is a common deformity of the male breast, where certain cases warrant surgical management. There are several surgical options, which vary depending on the breast characteristics. To guide surgical management, several classification systems for gynecomastia have been proposed. A systematic review was performed to (1) identify all classification systems for the surgical management of gynecomastia, and (2) determine the adequacy of these classification systems to appropriately categorize the condition for surgical decision-making. The search yielded 1012 articles, and 11 articles were included in the review. Eleven classification systems in total were ascertained, and a total of 10 unique features were identified: (1) breast size, (2) skin redundancy, (3) breast ptosis, (4) tissue predominance, (5) upper abdominal laxity, (6) breast tuberosity, (7) nipple malposition, (8) chest shape, (9) absence of sternal notch, and (10) breast skin elasticity. On average, classification systems included two or three of these features. Breast size and ptosis were the most commonly included features. Based on their review of the current classification systems, the authors believe the ideal classification system should be universal and cater to all causes of gynecomastia; be surgically useful and easy to use; and should include a comprehensive set of clinically appropriate patient-related features, such as breast size, breast ptosis, tissue predominance, and skin redundancy. None of the current classification systems appears to fulfill these criteria.

  6. Robust Fringe Projection Profilometry via Sparse Representation.

    Science.gov (United States)

    Budianto; Lun, Daniel P K

    2016-04-01

    In this paper, a robust fringe projection profilometry (FPP) algorithm using sparse dictionary learning and sparse coding techniques is proposed. When reconstructing the 3D model of objects, traditional FPP systems often fail to perform if the captured fringe images contain a complex scene, such as multiple or occluded objects. This introduces great difficulty to the phase unwrapping process of an FPP system and can result in serious distortion in the final reconstructed 3D model. The proposed algorithm encodes the period order information, which is essential to phase unwrapping, into texture patterns and embeds them in the projected fringe patterns. When the encoded fringe image is captured, a modified morphological component analysis and a sparse classification procedure are performed to decode and identify the embedded period order information, which is then used to assist the phase unwrapping process in dealing with the different artifacts in the fringe images. Experimental results show that the proposed algorithm significantly improves the robustness of an FPP system: it performs equally well whether the fringe images contain a simple or a complex scene, or are affected by the ambient lighting of the working environment.
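
    A generic sketch of the sparse dictionary learning and sparse coding building blocks the abstract mentions, here used for residual-based classification with scikit-learn. The data, dictionary sizes, and sparsity level are illustrative, and this is not the paper's full decoding pipeline.

      import numpy as np
      from sklearn.decomposition import DictionaryLearning, SparseCoder

      rng = np.random.default_rng(1)
      # Hypothetical training patches (rows) from two embedded texture classes.
      class0 = rng.normal(0.0, 1.0, (200, 64))
      class1 = rng.normal(0.0, 1.0, (200, 64)) + np.sin(np.arange(64))

      dicts = []
      for data in (class0, class1):
          dl = DictionaryLearning(n_components=32, max_iter=20, random_state=0).fit(data)
          dicts.append(dl.components_)

      def classify(patch):
          """Assign the class whose learned dictionary reconstructs the patch best."""
          errors = []
          for D in dicts:
              coder = SparseCoder(dictionary=D, transform_algorithm='omp',
                                  transform_n_nonzero_coefs=5)
              code = coder.transform(patch[None, :])
              errors.append(np.linalg.norm(patch - code @ D))
          return int(np.argmin(errors))

      print(classify(class1[0]))   # expected: 1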

  7. Robust statistical methods with R

    CERN Document Server

    Jureckova, Jana

    2005-01-01

    Robust statistical methods were developed to supplement the classical procedures when the data violate classical assumptions. They are ideally suited to applied research across a broad spectrum of study, yet most books on the subject are narrowly focused, overly theoretical, or simply outdated. Robust Statistical Methods with R provides a systematic treatment of robust procedures with an emphasis on practical application. The authors work from underlying mathematical tools to implementation, paying special attention to the computational aspects. They cover the whole range of robust methods, including differentiable statistical functions, distance measures, influence functions, and asymptotic distributions, in a rigorous yet approachable manner. Highlighting hands-on problem solving, many examples and computational algorithms using the R software supplement the discussion. The book examines the characteristics of robustness, estimators of real parameters, large sample properties, and goodness-of-fit tests. It...

  8. Robust Short-Lag Spatial Coherence Imaging.

    Science.gov (United States)

    Nair, Arun Asokan; Tran, Trac Duy; Bell, Muyinatu A Lediju

    2018-03-01

    Short-lag spatial coherence (SLSC) imaging displays the spatial coherence between backscattered ultrasound echoes instead of their signal amplitudes and is more robust to noise and clutter artifacts when compared with traditional delay-and-sum (DAS) B-mode imaging. However, SLSC imaging does not consider the content of images formed with different lags, and thus does not exploit the differences in tissue texture at each short-lag value. Our proposed method improves SLSC imaging by weighting the addition of lag values (i.e., M-weighting) and by applying robust principal component analysis (RPCA) to search for a low-dimensional subspace for projecting coherence images created with different lag values. The RPCA-based projections are considered to be denoised versions of the originals that are then weighted and added across lags to yield a final robust SLSC (R-SLSC) image. Our approach was tested on simulation, phantom, and in vivo liver data. Relative to DAS B-mode images, the mean contrast, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) improvements with R-SLSC images are 21.22 dB, 2.54, and 2.36, respectively, when averaged over simulated, phantom, and in vivo data and over all lags considered, which corresponds to mean improvements of 96.4%, 121.2%, and 120.5%, respectively. When compared with SLSC images, the corresponding mean improvements with R-SLSC images were 7.38 dB, 1.52, and 1.30, respectively (i.e., mean improvements of 14.5%, 50.5%, and 43.2%, respectively). Results show great promise for smoothing out the tissue texture of SLSC images and enhancing anechoic or hypoechoic target visibility at higher lag values, which could be useful in clinical tasks such as breast cyst visualization, liver vessel tracking, and obese patient imaging.
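
    A simplified numerical sketch of the combination step described above: project the per-lag coherence stack onto a low-rank subspace and form a weighted sum across lags. Truncated SVD is used here as a stand-in for the paper's robust PCA step, and the uniform weights are a placeholder for the learned M-weighting.

      import numpy as np

      def robust_slsc(lag_images, rank=2, weights=None):
          """Simplified R-SLSC-style combination: project the stack of per-lag
          coherence images onto a low-rank subspace (truncated SVD here, as a
          stand-in for robust PCA), then form a weighted sum across lags."""
          M, (h, w) = len(lag_images), lag_images[0].shape
          X = np.stack([img.ravel() for img in lag_images], axis=1)  # pixels x lags
          U, s, Vt = np.linalg.svd(X, full_matrices=False)
          X_lowrank = (U[:, :rank] * s[:rank]) @ Vt[:rank]           # denoised lags
          weights = np.ones(M) / M if weights is None else weights   # placeholder weighting
          return (X_lowrank @ weights).reshape(h, w)

      lag_images = [np.random.rand(64, 64) for _ in range(10)]
      print(robust_slsc(lag_images).shape)   # (64, 64)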

  9. Asteroid taxonomic classifications

    International Nuclear Information System (INIS)

    Tholen, D.J.

    1989-01-01

    This paper reports on three taxonomic classification schemes developed and applied to the body of available color and albedo data. Asteroid taxonomic classifications according to two of these schemes are reproduced

  10. Robust loss functions for boosting.

    Science.gov (United States)

    Kanamori, Takafumi; Takenouchi, Takashi; Eguchi, Shinto; Murata, Noboru

    2007-08-01

    Boosting is known as a gradient descent algorithm over loss functions. It is often pointed out that the typical boosting algorithm, Adaboost, is highly affected by outliers. In this letter, loss functions for robust boosting are studied. Based on the concept of robust statistics, we propose a transformation of loss functions that makes boosting algorithms robust against extreme outliers. Next, the truncation of loss functions is applied to contamination models that describe the occurrence of mislabels near decision boundaries. Numerical experiments illustrate that the proposed loss functions derived from the contamination models are useful for handling highly noisy data in comparison with other loss functions.
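
    A small numerical illustration of the underlying idea: the exponential loss used by AdaBoost assigns unbounded weight to gross outliers, while a truncated (linearized) variant, in the spirit of the losses derived from contamination models, keeps that weight bounded. The truncation threshold below is illustrative, not the paper's construction.

      import numpy as np

      def exp_loss(margin):
          """AdaBoost's exponential loss: the sample weight exp(-margin)
          grows without bound as the margin y*f(x) becomes very negative."""
          return np.exp(-margin)

      def truncated_exp_loss(margin, t=-2.0):
          """Truncated variant: below threshold t the loss is linearized, so
          its gradient (the sample weight) stays constant for extreme outliers."""
          return np.where(margin >= t, np.exp(-margin),
                          np.exp(-t) * (1.0 + (t - margin)))

      margins = np.array([2.0, 0.0, -2.0, -10.0])
      print(exp_loss(margins))            # outlier at -10 gets weight ~22026
      print(truncated_exp_loss(margins))  # outlier loss grows only linearly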

  11. Theoretical Framework for Robustness Evaluation

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2011-01-01

    This paper presents a theoretical framework for evaluation of robustness of structural systems, incl. bridges and buildings. Typically modern structural design codes require that ‘the consequence of damages to structures should not be disproportional to the causes of the damages’. However, although...... the importance of robustness for structural design is widely recognized the code requirements are not specified in detail, which makes the practical use difficult. This paper describes a theoretical and risk based framework to form the basis for quantification of robustness and for pre-normative guidelines...

  12. Robustness of airline route networks

    Science.gov (United States)

    Lordan, Oriol; Sallan, Jose M.; Escorihuela, Nuria; Gonzalez-Prieto, David

    2016-03-01

    Airlines shape their route networks by defining their routes through supply and demand considerations, paying little attention to network performance indicators such as network robustness. However, the collapse of an airline network can produce high financial costs for the airline and for its entire geographical area of influence. The aim of this study is to analyze the topology and robustness of the route networks of airlines following the Low-Cost Carrier (LCC) and Full-Service Carrier (FSC) business models. Results show that FSC hubs are more central than LCC bases in their route networks. As a result, LCC route networks are more robust than FSC networks.
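
    A sketch of the kind of topological robustness analysis described, using networkx: remove the most central nodes one by one and track the giant component. The two synthetic graphs are rough stand-ins for a hub-dominated (FSC-like) and a more decentralized (LCC-like) route network, not real airline data.

      import networkx as nx

      def robustness_curve(G, fraction=0.2):
          """Targeted-attack robustness: repeatedly remove the node with the
          highest betweenness centrality and record the size of the largest
          connected component after each removal."""
          G = G.copy()
          sizes = []
          for _ in range(int(fraction * G.number_of_nodes())):
              centrality = nx.betweenness_centrality(G)
              G.remove_node(max(centrality, key=centrality.get))
              sizes.append(len(max(nx.connected_components(G), key=len)))
          return sizes

      fsc = nx.barabasi_albert_graph(100, 2, seed=1)   # hub-dominated stand-in
      lcc = nx.random_regular_graph(4, 100, seed=1)    # no-dominant-hub stand-in
      # Hub-dominated networks typically fragment faster under targeted attack.
      print(robustness_curve(fsc)[-1], robustness_curve(lcc)[-1])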

  13. Hand eczema classification

    DEFF Research Database (Denmark)

    Diepgen, T L; Andersen, Klaus Ejner; Brandao, F M

    2008-01-01

    ...of the disease is rarely evidence based, and a classification system for different subdiagnoses of hand eczema is not agreed upon. Randomized controlled trials investigating the treatment of hand eczema are called for. For this, as well as for clinical purposes, a generally accepted classification system...... A classification system for hand eczema is proposed. Conclusions: It is suggested that this classification be used in clinical work and in clinical trials....

  14. Classification with support hyperplanes

    NARCIS (Netherlands)

    G.I. Nalbantov (Georgi); J.C. Bioch (Cor); P.J.F. Groenen (Patrick)

    2006-01-01

    A new classification method is proposed, called Support Hyperplanes (SHs). To solve the binary classification task, SHs consider the set of all hyperplanes that do not make classification mistakes, referred to as semi-consistent hyperplanes. A test object is classified using

  15. Standard classification: Physics

    International Nuclear Information System (INIS)

    1977-01-01

    This is a draft standard classification of physics. The conception is based on the physics part of the systematic catalogue of the Bayerische Staatsbibliothek and on the classification given in standard textbooks. The ICSU-AB classification now used worldwide by physics information services was not taken into account. (BJ) [de

  16. Robust Portfolio Optimization Using Pseudodistances.

    Science.gov (United States)

    Toma, Aida; Leoni-Aubin, Samuela

    2015-01-01

    The presence of outliers in financial asset returns is a frequently occurring phenomenon which may lead to unreliable mean-variance optimized portfolios. This fact is due to the unbounded influence that outliers can have on the mean returns and covariance estimators that are inputs in the optimization procedure. In this paper we present robust estimators of mean and covariance matrix obtained by minimizing an empirical version of a pseudodistance between the assumed model and the true model underlying the data. We prove and discuss theoretical properties of these estimators, such as affine equivariance, B-robustness, asymptotic normality and asymptotic relative efficiency. These estimators can be easily used in place of the classical estimators, thereby providing robust optimized portfolios. A Monte Carlo simulation study and applications to real data show the advantages of the proposed approach. We study both in-sample and out-of-sample performance of the proposed robust portfolios comparing them with some other portfolios known in literature.
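
    A minimal sketch of the plug-in use the authors describe: replace the sample moments with robust estimates before portfolio optimization. The Minimum Covariance Determinant estimator is used here as a readily available stand-in for the paper's pseudodistance-based estimators; it is not their method.

      import numpy as np
      from sklearn.covariance import MinCovDet

      def robust_min_variance_weights(returns):
          """Global minimum-variance weights from a robust (MCD) covariance
          estimate, in place of the outlier-sensitive sample covariance."""
          mcd = MinCovDet(random_state=0).fit(returns)
          cov_inv = np.linalg.inv(mcd.covariance_)
          w = cov_inv @ np.ones(returns.shape[1])
          return w / w.sum()

      rng = np.random.default_rng(2)
      returns = rng.normal(0.001, 0.02, (500, 5))   # hypothetical daily returns
      returns[::50] += 0.5                          # inject outlying return days
      print(robust_min_variance_weights(returns).round(3))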

  17. Classification and clinical assessment

    Directory of Open Access Journals (Sweden)

    F. Cantini

    2012-06-01

    Full Text Available There are at least nine classification criteria for psoriatic arthritis (PsA) that have been proposed and used in clinical studies. With the exception of the ESSG and Bennett rules, all of the other criteria sets have a good performance in identifying PsA patients. As the CASPAR criteria are based on a robust study methodology, they are considered the current reference standard. However, while there seems to be no doubt that they are very good at classifying PsA patients (very high specificity), they might not be sensitive enough to diagnose patients with unknown early PsA. The vast clinical heterogeneity of PsA makes its assessment very challenging. Peripheral joint involvement is measured by 78/76 joint counts, spine involvement by the instruments used for ankylosing spondylitis (AS), dactylitis by involved digit count or by the Leeds dactylitis index, enthesitis by the number of affected entheses (several indices available), and psoriasis by the Psoriasis Area and Severity Index (PASI). Peripheral joint damage can be assessed by a modified van der Heijde-Sharp scoring system and axial damage by the methods used for AS or by the Psoriatic Arthritis Spondylitis Radiology Index (PASRI). As in other arthritides, global evaluation of disease activity and severity by patient and physician and assessment of disability and quality of life are widely used. Finally, composite indices that capture several clinical manifestations of PsA have been proposed, and a new instrument, the Psoriatic ARthritis Disease Activity Score (PASDAS), is currently being developed.

  18. Robust methods for data reduction

    CERN Document Server

    Farcomeni, Alessio

    2015-01-01

    Robust Methods for Data Reduction gives a non-technical overview of robust data reduction techniques, encouraging the use of these important and useful methods in practical applications. The main areas covered include principal components analysis, sparse principal component analysis, canonical correlation analysis, factor analysis, clustering, double clustering, and discriminant analysis.The first part of the book illustrates how dimension reduction techniques synthesize available information by reducing the dimensionality of the data. The second part focuses on cluster and discriminant analy

  19. Classification of refrigerants; Classification des fluides frigorigenes

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    This document is based on the US standard ANSI/ASHRAE 34, published in 2001 and entitled 'Designation and safety classification of refrigerants'. This classification makes it possible to organize clearly, in an internationally consistent way, all the refrigerants used in the world, thanks to a codification of refrigerants that corresponds to their chemical composition. This note explains the codification: prefix, suffixes (hydrocarbons and derived fluids, azeotropic and non-azeotropic mixtures, various organic compounds, non-organic compounds), and safety classification (toxicity, flammability, the case of mixtures). (J.S.)
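
    A small illustration of the codification for the methane/ethane series, under a simplified reading of ANSI/ASHRAE 34 (hundreds digit = carbons − 1, tens digit = hydrogens + 1, units digit = fluorines, with chlorine filling the remaining valences); isomer suffix letters and the mixture and inorganic series are deliberately ignored here.

      def decode_refrigerant(code):
          """Decode a methane/ethane-series refrigerant number into atom
          counts (simplified; 400/500 mixture and 700 inorganic series,
          as well as isomer suffix letters, are not handled)."""
          digits = f"{int(code):03d}"          # e.g. 22 -> "022"
          c = int(digits[0]) + 1               # carbons   = 1st digit + 1
          h = int(digits[1]) - 1               # hydrogens = 2nd digit - 1
          f = int(digits[2])                   # fluorines = 3rd digit
          cl = 2 * c + 2 - h - f               # chlorine fills remaining bonds
          return {"C": c, "H": h, "F": f, "Cl": cl}

      print(decode_refrigerant(134))   # R-134: {'C': 2, 'H': 2, 'F': 4, 'Cl': 0}
      print(decode_refrigerant(22))    # R-22:  {'C': 1, 'H': 1, 'F': 2, 'Cl': 1}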

  20. Classification, disease, and diagnosis.

    Science.gov (United States)

    Jutel, Annemarie

    2011-01-01

    Classification shapes medicine and guides its practice. Understanding classification must be part of the quest to better understand the social context and implications of diagnosis. Classifications are part of the human work that provides a foundation for the recognition and study of illness: deciding how the vast expanse of nature can be partitioned into meaningful chunks, stabilizing and structuring what is otherwise disordered. This article explores the aims of classification, their embodiment in medical diagnosis, and the historical traditions of medical classification. It provides a brief overview of the aims and principles of classification and their relevance to contemporary medicine. It also demonstrates how classifications operate as social framing devices that enable and disable communication, assert and refute authority, and are important items for sociological study.

  1. Robust design optimization using the price of robustness, robust least squares and regularization methods

    Science.gov (United States)

    Bukhari, Hassan J.

    2017-12-01

    In this paper a framework for robust optimization of mechanical design problems and process systems that have parametric uncertainty is presented, using three different approaches. Robust optimization problems are formulated so that the optimal solution is robust, meaning it is minimally sensitive to any perturbations in parameters. The first method uses the price of robustness approach, which assumes the uncertain parameters to be symmetric and bounded; the robustness of the design can be controlled by limiting the number of parameters that are allowed to perturb. The second method uses the robust least squares method to determine the optimal parameters when the data itself is subject to perturbations instead of the parameters. The last method manages uncertainty by restricting the perturbation on parameters to improve sensitivity, similar to Tikhonov regularization. The methods are implemented on two sets of problems, one linear and the other non-linear, and the methodology is compared with a prior method using multiple Monte Carlo simulation runs; the comparison shows that the approach presented in this paper results in better performance.
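
    A minimal sketch of the third approach, Tikhonov-style regularization of least squares, showing how the penalty damps the solution's sensitivity to data perturbations. The near-collinear toy system is illustrative, not one of the paper's test problems.

      import numpy as np

      def regularized_lsq(A, b, lam):
          """Tikhonov-regularized least squares: min ||Ax - b||^2 + lam*||x||^2.
          The penalty bounds the solution's sensitivity to perturbations in b."""
          n = A.shape[1]
          return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

      rng = np.random.default_rng(3)
      A = rng.normal(size=(20, 5))
      A[:, 4] = A[:, 3] + 1e-4            # near-collinear columns: ill-conditioned
      b = A @ np.ones(5)
      for lam in (0.0, 1e-2):
          x1 = regularized_lsq(A, b, lam)
          x2 = regularized_lsq(A, b + rng.normal(0, 1e-3, 20), lam)  # perturbed data
          print(lam, np.linalg.norm(x1 - x2))   # regularization shrinks the shift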

  2. Robust medical image segmentation for hyperthermia treatment planning

    International Nuclear Information System (INIS)

    Neufeld, E.; Chavannes, N.; Kuster, N.; Samaras, T.

    2005-01-01

    Full text: This work is part of an ongoing effort to develop a comprehensive hyperthermia treatment planning (HTP) tool. The goal is to unify all the steps necessary to perform treatment planning - from image segmentation to optimization of the energy deposition pattern - in a single tool. The basis of the HTP software is the routines and know-how developed in our TRINTY project, which resulted in the commercial EM platform SEMCAD-X. It incorporates the non-uniform finite-difference time-domain (FDTD) method, permitting the simulation of highly detailed models. Subsequently, in order to create highly resolved patient models, a powerful and robust segmentation tool is needed. A toolbox has been created that allows the flexible combination of various segmentation methods as well as several pre- and post-processing functions. It works primarily with CT and MRI images, which it can read in various formats. A wide variety of segmentation methods has been implemented. This includes thresholding techniques (k-means classification, expectation maximization and modal histogram analysis for automatic threshold detection, multi-dimensional if required), region-growing methods (with hysteretic behavior and simultaneous competitive growing), an interactive marker-based watershed transformation, level-set methods (homogeneity- and edge-based, fast-marching), a flexible live-wire implementation, as well as fuzzy connectedness. Due to the large number of tissues that need to be segmented for HTP, no methods that rely on prior knowledge have been implemented. Various edge extraction routines, distance transforms, smoothing techniques (convolutions, anisotropic diffusion, sigma filter...), connected component analysis, topologically flexible interpolation, image algebra and morphological operations are available. Moreover, contours or surfaces can be extracted, simplified and exported. Using these different techniques on several samples, the following conclusions have been drawn: Due to the
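
    As a concrete illustration of one of the thresholding techniques listed above, a k-means intensity-clustering sketch; this is a stand-in for one component of the toolbox, not the tool itself.

      import numpy as np
      from sklearn.cluster import KMeans

      def kmeans_tissue_labels(image, n_tissues=4):
          """Automatic multi-threshold segmentation by k-means on voxel
          intensities. Real HTP segmentation combines this with region
          growing, watershed, level sets, etc."""
          intensities = image.reshape(-1, 1).astype(float)
          km = KMeans(n_clusters=n_tissues, n_init=10, random_state=0)
          labels = km.fit_predict(intensities)
          # Reorder labels so they increase with mean intensity (reproducibility).
          order = np.argsort(km.cluster_centers_.ravel())
          remap = np.empty(n_tissues, dtype=int)
          remap[order] = np.arange(n_tissues)
          return remap[labels].reshape(image.shape)

      slice_ = np.random.randint(0, 255, (128, 128))   # stand-in for a CT slice
      print(np.unique(kmeans_tissue_labels(slice_)))   # [0 1 2 3]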

  3. Security classification of information

    Energy Technology Data Exchange (ETDEWEB)

    Quist, A.S.

    1993-04-01

    This document is the second of a planned four-volume work that comprehensively discusses the security classification of information. The main focus of Volume 2 is on the principles for classification of information. Included herein are descriptions of the two major types of information that governments classify for national security reasons (subjective and objective information), guidance to use when determining whether information under consideration for classification is controlled by the government (a necessary requirement for classification to be effective), information disclosure risks and benefits (the benefits and costs of classification), standards to use when balancing information disclosure risks and benefits, guidance for assigning classification levels (Top Secret, Secret, or Confidential) to classified information, guidance for determining how long information should be classified (classification duration), classification of associations of information, classification of compilations of information, and principles for declassifying and downgrading information. Rules or principles of certain areas of our legal system (e.g., trade secret law) are sometimes mentioned to provide added support to some of those classification principles.

  4. Computational Modeling in Tissue Engineering

    CERN Document Server

    2013-01-01

    One of the major challenges in tissue engineering is the translation of biological knowledge on complex cell and tissue behavior into a predictive and robust engineering process. Mastering this complexity is an essential step towards clinical applications of tissue engineering. This volume discusses computational modeling tools that allow studying the biological complexity in a more quantitative way. More specifically, computational tools can help in:  (i) quantifying and optimizing the tissue engineering product, e.g. by adapting scaffold design to optimize micro-environmental signals or by adapting selection criteria to improve homogeneity of the selected cell population; (ii) quantifying and optimizing the tissue engineering process, e.g. by adapting bioreactor design to improve quality and quantity of the final product; and (iii) assessing the influence of the in vivo environment on the behavior of the tissue engineering product, e.g. by investigating vascular ingrowth. The book presents examples of each...

  5. Visualization and classification in biomedical terahertz pulsed imaging

    International Nuclear Information System (INIS)

    Loeffler, Torsten; Siebert, Karsten; Czasch, Stephanie; Bauer, Tobias; Roskos, Hartmut G

    2002-01-01

    'Visualization' in imaging is the process of extracting useful information from raw data in such a way that meaningful physical contrasts are developed. 'Classification' is the subsequent process of defining parameter ranges which allow us to identify elements of images such as different tissues or different objects. In this paper, we explore techniques for visualization and classification in terahertz pulsed imaging (TPI) for biomedical applications. For archived (formalin-fixed, alcohol-dehydrated and paraffin-mounted) test samples, we investigate both time- and frequency-domain methods based on bright- and dark-field TPI. Successful tissue classification is demonstrated

  6. Robust pattern decoding in shape-coded structured light

    Science.gov (United States)

    Tang, Suming; Zhang, Xu; Song, Zhan; Song, Lifang; Zeng, Hai

    2017-09-01

    Decoding is a challenging and complex problem in a coded structured light system. In this paper, a robust pattern decoding method is proposed for shape-coded structured light, in which the pattern is designed as a grid shape with embedded geometrical shapes. In our decoding method, advancements are made in three steps. First, a multi-template feature detection algorithm is introduced to detect the feature points, which are the intersections of each two orthogonal grid-lines. Second, pattern element identification is modelled as a supervised classification problem, and the deep neural network technique is applied for the accurate classification of pattern elements; beforehand, a training dataset is established which contains a mass of pattern elements with various blurrings and distortions. Third, an error correction mechanism based on epipolar, coplanarity and topological constraints is presented to reduce false matches. In the experiments, several complex objects including a human hand are chosen to test the accuracy and robustness of the proposed method. The experimental results show that our decoding method not only has high decoding accuracy, but also exhibits strong robustness to surface color and complex textures.

  7. Advances in robust fractional control

    CERN Document Server

    Padula, Fabrizio

    2015-01-01

    This monograph presents design methodologies for (robust) fractional control systems. It shows the reader how to take advantage of the superior flexibility of fractional control systems compared with integer-order systems in achieving more challenging control requirements. There is a high degree of current interest in fractional systems and fractional control arising from both academia and industry and readers from both milieux are catered to in the text. Different design approaches having in common a trade-off between robustness and performance of the control system are considered explicitly. The text generalizes methodologies, techniques and theoretical results that have been successfully applied in classical (integer) control to the fractional case. The first part of Advances in Robust Fractional Control is the more industrially-oriented. It focuses on the design of fractional controllers for integer processes. In particular, it considers fractional-order proportional-integral-derivative controllers, becau...

  8. Robustness of digital artist authentication

    DEFF Research Database (Denmark)

    Jacobsen, Robert; Nielsen, Morten

    In many cases it is possible to determine the authenticity of a painting from digital reproductions of the paintings; this has been demonstrated for a variety of artists and with different approaches. Common to all these methods in digital artist authentication is that the potential of the method...... is in focus, while the robustness has not been considered, i.e. the degree to which the data collection process influences the decision of the method. However, in order for an authentication method to be successful in practice, it needs to be robust to plausible error sources from the data collection....... In this paper we investigate the robustness of the newly proposed authenticity method introduced by the authors based on second generation multiresolution analysis. This is done by modelling a number of realistic factors that can occur in the data collection....

  9. Attractive ellipsoids in robust control

    CERN Document Server

    Poznyak, Alexander; Azhmyakov, Vadim

    2014-01-01

    This monograph introduces a newly developed robust-control design technique for a wide class of continuous-time dynamical systems called the “attractive ellipsoid method.” Along with a coherent introduction to the proposed control design and related topics, the monograph studies nonlinear affine control systems in the presence of uncertainty and presents a constructive and easily implementable control strategy that guarantees certain stability properties. The authors discuss linear-style feedback control synthesis in the context of the above-mentioned systems. The development and physical implementation of high-performance robust-feedback controllers that work in the absence of complete information is addressed, with numerous examples to illustrate how to apply the attractive ellipsoid method to mechanical and electromechanical systems. While theorems are proved systematically, the emphasis is on understanding and applying the theory to real-world situations. Attractive Ellipsoids in Robust Control will a...

  10. Robustness of holonomic quantum gates

    International Nuclear Information System (INIS)

    Solinas, P.; Zanardi, P.; Zanghi, N.

    2005-01-01

    Full text: If the driving field fluctuates during the quantum evolution, this produces errors in the applied operator. The holonomic (and geometric) quantum gates are believed to be robust against some kinds of noise. Because of their geometrical dependence, the holonomic operators can be robust against this kind of noise; in fact, if the fluctuations are fast enough they cancel out, leaving the final operator unchanged. I present numerical studies of holonomic quantum gates subject to this parametric noise; the fidelity between the noisy and the ideal evolution is calculated for different noise correlation times. The holonomic quantum gates seem robust not only for fast fluctuating fields but also for slow fluctuating fields. These results can be explained as due to the geometrical features of the holonomic operator: for fast fluctuating fields the fluctuations are canceled out, while for slow fluctuating fields the fluctuations do not perturb the loop in the parameter space. (author)

  11. Robust estimation and hypothesis testing

    CERN Document Server

    Tiku, Moti L

    2004-01-01

    In statistical theory and practice, a certain distribution is usually assumed and then optimal solutions sought. Since deviations from an assumed distribution are very common, one cannot feel comfortable with assuming a particular distribution and believing it to be exactly correct. That brings the robustness issue into focus. In this book, we have given statistical procedures which are robust to plausible deviations from an assumed model. The method of modified maximum likelihood estimation is used in formulating these procedures. The modified maximum likelihood estimators are explicit functions of sample observations and are easy to compute. They are asymptotically fully efficient and are as efficient as the maximum likelihood estimators for small sample sizes. The maximum likelihood estimators have computational problems and are, therefore, elusive. A broad range of topics is covered in this book. Solutions are given which are easy to implement and are efficient. The solutions are also robust to data anomali...

  12. Robustness in Railway Operations (RobustRailS)

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo; Nielsen, Otto Anker

    This study considers the problem of enhancing railway timetable robustness without adding slack time, hence increasing the travel time. The approach integrates a transit assignment model to assess how passengers adapt their behaviour whenever operations are changed. First, the approach considers...

  13. Catchment Classification: Connecting Climate, Structure and Function

    Science.gov (United States)

    Sawicz, K. A.; Wagener, T.; Sivapalan, M.; Troch, P. A.; Carrillo, G. A.

    2010-12-01

    Hydrology does not yet possess a generally accepted catchment classification framework. Such a classification framework needs to: [1] give names to things, i.e. the main classification step, [2] permit transfer of information, i.e. regionalization of information, [3] permit development of generalizations, i.e. to develop new theory, and [4] provide a first order environmental change impact assessment, i.e., the hydrologic implications of climate, land use and land cover change. One strategy is to create a catchment classification framework based on the notion of catchment functions (partitioning, storage, and release). Results of an empirical study presented here connect climate and structure to catchment function (in the form of select hydrologic signatures), based on analyzing over 300 US catchments. Initial results indicate a wide assortment of signature relationships with properties of climate, geology, and vegetation. The uncertainty in the different regionalized signatures varies widely, and therefore there is variability in the robustness of classifying ungauged basins. This research provides insight into the controls on the hydrologic behavior of a catchment, and enables a classification framework applicable to gauged and ungauged basins across the study domain. This study sheds light on what we can expect to achieve in mapping climate, structure and function in a top-down manner. Results of this study complement work done using a bottom-up physically-based modeling framework to generalize this approach (Carrillo et al., this session).

  14. Robust tumor morphometry in multispectral fluorescence microscopy

    Science.gov (United States)

    Tabesh, Ali; Vengrenyuk, Yevgen; Teverovskiy, Mikhail; Khan, Faisal M.; Sapir, Marina; Powell, Douglas; Mesa-Tejada, Ricardo; Donovan, Michael J.; Fernandez, Gerardo

    2009-02-01

    Morphological and architectural characteristics of primary tissue compartments, such as epithelial nuclei (EN) and cytoplasm, provide important cues for cancer diagnosis, prognosis, and therapeutic response prediction. We propose two feature sets for the robust quantification of these characteristics in multiplex immunofluorescence (IF) microscopy images of prostate biopsy specimens. To enable feature extraction, EN and cytoplasm regions were first segmented from the IF images. Then, feature sets consisting of the characteristics of the minimum spanning tree (MST) connecting the EN and the fractal dimension (FD) of gland boundaries were obtained from the segmented compartments. We demonstrated the utility of the proposed features in prostate cancer recurrence prediction on a multi-institution cohort of 1027 patients. Univariate analysis revealed that both FD and one of the MST features were highly effective for predicting cancer recurrence (p <= 0.0001). In multivariate analysis, an MST feature was selected for a model incorporating clinical and image features. The model achieved a concordance index (CI) of 0.73 on the validation set, which was significantly higher than the CI of 0.69 for the standard multivariate model based solely on clinical features currently used in clinical practice (p < 0.0001). The contributions of this work are twofold. First, it is the first demonstration of the utility of the proposed features in morphometric analysis of IF images. Second, this is the largest scale study of the efficacy and robustness of the proposed features in prostate cancer prognosis.
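
    A sketch of the MST feature-extraction step, assuming nucleus centroids have already been segmented. The summary statistics below are generic stand-ins for the paper's exact MST characteristics.

      import numpy as np
      from scipy.spatial import distance_matrix
      from scipy.sparse.csgraph import minimum_spanning_tree

      def mst_features(centroids):
          """MST-based architectural features over epithelial-nucleus centroids:
          build the Euclidean MST and summarize its edge lengths."""
          D = distance_matrix(centroids, centroids)
          mst = minimum_spanning_tree(D)        # sparse matrix holding MST edges
          edges = mst.data                      # the nonzero edge lengths
          return {"mean_edge": edges.mean(),
                  "std_edge": edges.std(),
                  "max_edge": edges.max()}

      rng = np.random.default_rng(4)
      nuclei = rng.uniform(0, 100, (60, 2))     # hypothetical centroid coordinates
      print(mst_features(nuclei))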

  15. A Robust Design Applicability Model

    DEFF Research Database (Denmark)

    Ebro, Martin; Lars, Krogstie; Howard, Thomas J.

    2015-01-01

    This paper introduces a model for assessing the applicability of Robust Design (RD) in a project or organisation. The intention of the Robust Design Applicability Model (RDAM) is to provide support for decisions by engineering management considering the relevant level of RD activities...... to be applicable in organisations assigning a high importance to one or more factors that are known to be impacted by RD, while also experiencing a high level of occurrence of this factor. The RDAM supplements existing maturity models and metrics to provide a comprehensive set of data to support management...

  16. Ins-Robust Primitive Words

    OpenAIRE

    Srivastava, Amit Kumar; Kapoor, Kalpesh

    2017-01-01

    Let Q be the set of primitive words over a finite alphabet with at least two symbols. We characterize a class of primitive words, Q_I, referred to as ins-robust primitive words, which remain primitive on insertion of any letter from the alphabet, and present some properties that characterize words in the set Q_I. It is shown that the language Q_I is dense. We prove that the language of primitive words that are not ins-robust is not context-free. We also present a linear time algorithm to reco...
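
    The definitions translate into a compact brute-force check; the primitivity test uses the classical fact that a word w is a proper power exactly when it occurs inside (w + w) with both end positions excluded. The paper's linear-time recognition algorithm is not reproduced here.

      def is_primitive(w):
          """A word is primitive iff it is not a power of a shorter word,
          which holds exactly when w does not occur inside (w + w)[1:-1]."""
          return w not in (w + w)[1:-1]

      def is_ins_robust(w, alphabet="ab"):
          """Brute-force ins-robustness: w stays primitive after inserting
          any letter of the alphabet at any position."""
          return all(is_primitive(w[:i] + a + w[i:])
                     for i in range(len(w) + 1) for a in alphabet)

      print(is_primitive("abab"))      # False: abab = (ab)^2
      print(is_ins_robust("aab"))      # False: inserting 'b' yields abab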

  17. Classification of Flotation Frothers

    Directory of Open Access Journals (Sweden)

    Jan Drzymala

    2018-02-01

    Full Text Available In this paper, a scheme of flotation frother classification is presented. The scheme first indicates the physical system in which a frother is present; four such systems are distinguished: pure state, aqueous solution, aqueous solution/gas, and aqueous solution/gas/solid. As a result, there are numerous possible classifications of flotation frothers, which can be organized into the scheme described in detail in this paper. It follows that a meaningful classification of frothers relies on first choosing the physical system and then the feature, trend, or parameter(s) according to which the classification is performed. The proposed classification can play a useful role in characterizing and evaluating flotation frothers.

  18. Fault Tolerant Neural Network for ECG Signal Classification Systems

    Directory of Open Access Journals (Sweden)

    MERAH, M.

    2011-08-01

    Full Text Available The aim of this paper is to apply a new robust hardware Artificial Neural Network (ANN) to ECG classification systems. This ANN includes a penalization criterion which improves its performance in terms of robustness. Specifically, in this method, the ANN weights are normalized using the auto-prune method. Simulations performed on the MIT-BIH ECG signals have shown that significant robustness improvements are obtained regarding potential hardware artificial neuron failures. Moreover, we show that the proposed design achieves better generalization performance, compared to the standard back-propagation algorithm.

  19. A New Method for Solving Supervised Data Classification Problems

    Directory of Open Access Journals (Sweden)

    Parvaneh Shabanzadeh

    2014-01-01

    Full Text Available Supervised data classification is one of the techniques used to extract nontrivial information from data. Classification is a widely used technique in various fields, including data mining, industry, medicine, science, and law. This paper considers a new algorithm for supervised data classification problems associated with cluster analysis. The mathematical formulation of this algorithm is based on nonsmooth, nonconvex optimization, and a new algorithm for solving this optimization problem is utilized. The new algorithm uses a derivative-free technique, offering robustness and efficiency. To improve classification performance and efficiency in generating the classification model, a new feature selection algorithm based on techniques of convex programming is suggested. The proposed methods are tested on real-world datasets. Results of the numerical experiments are presented which demonstrate the effectiveness of the proposed algorithms.

  20. Tissue engineering

    CERN Document Server

    Fisher, John P; Bronzino, Joseph D

    2007-01-01

    Increasingly viewed as the future of medicine, the field of tissue engineering is still in its infancy. As evidenced in both the scientific and popular press, there exists considerable excitement surrounding the strategy of regenerative medicine. To achieve its highest potential, a series of technological advances must be made. Putting the numerous breakthroughs made in this field into a broad context, Tissue Engineering disseminates current thinking on the development of engineered tissues. Divided into three sections, the book covers the fundamentals of tissue engineering, enabling technologies, and tissue engineering applications. It examines the properties of stem cells, primary cells, growth factors, and extracellular matrix as well as their impact on the development of tissue engineered devices. Contributions focus on those strategies typically incorporated into tissue engineered devices or utilized in their development, including scaffolds, nanocomposites, bioreactors, drug delivery systems, and gene t...

  1. Ontologies vs. Classification Systems

    DEFF Research Database (Denmark)

    Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2009-01-01

    What is an ontology compared to a classification system? Is a taxonomy a kind of classification system or a kind of ontology? These are questions that we meet when working with people from industry and public authorities, who need methods and tools for concept clarification, for developing metadata sets or for obtaining advanced search facilities. In this paper we will present an attempt at answering these questions. We will give a presentation of various types of ontologies and briefly introduce terminological ontologies. Furthermore we will argue that classification systems, e.g. product classification systems and metadata taxonomies, should be based on ontologies....

  2. Definition, classification, and epidemiology of pulmonary arterial hypertension.

    Science.gov (United States)

    Hoeper, Marius M

    2009-08-01

    Pulmonary arterial hypertension (PAH) is a distinct subgroup of pulmonary hypertension that comprises idiopathic PAH, familial/heritable forms, and PAH associated with connective tissue disease, congenital heart disease, portal hypertension, human immunodeficiency virus (HIV) infection, and some other conditions. The hemodynamic definition of PAH was recently revised: PAH is now defined by a mean pulmonary artery pressure at rest ≥25 mm Hg in the presence of a pulmonary capillary wedge pressure ≤15 mm Hg. The exercise criterion (mean pulmonary artery pressure >30 mm Hg during exercise) that was used in the old definition of PAH has been removed, because there are no robust data that would allow defining an upper limit of normal for the pulmonary pressure during exercise. The revised classification of pulmonary hypertension still consists of five major groups: (1) PAH, (2) pulmonary hypertension due to left heart disease, (3) pulmonary hypertension due to chronic lung disease and/or hypoxia, (4) chronic thromboembolic pulmonary hypertension, and (5) miscellaneous forms. Modifications have been made in some of these groups, such as the addition of schistosomiasis-related pulmonary hypertension and pulmonary hypertension in patients with chronic hemolytic anemia to group 1.

  3. Essays on robust asset pricing

    NARCIS (Netherlands)

    Horváth, Ferenc

    2017-01-01

    The central concept of this doctoral dissertation is robustness. I analyze how model and parameter uncertainty affect financial decisions of investors and fund managers, and what their equilibrium consequences are. Chapter 1 gives an overview of the most important concepts and methodologies used in

  4. Robust visual hashing via ICA

    International Nuclear Information System (INIS)

    Fournel, Thierry; Coltuc, Daniela

    2010-01-01

    Designed to maximize information transmission in the presence of noise, independent component analysis (ICA) could appear in certain circumstances as a statistics-based tool for robust visual hashing. Several ICA-based scenarios can attempt to reach this goal. A first one is here considered.

  5. Robustness of raw quantum tomography

    Science.gov (United States)

    Asorey, M.; Facchi, P.; Florio, G.; Man'ko, V. I.; Marmo, G.; Pascazio, S.; Sudarshan, E. C. G.

    2011-01-01

    We scrutinize the effects of non-ideal data acquisition on the tomograms of quantum states. The presence of a weight function, schematizing the effects of a finite window or equivalently noise, only affects the state reconstruction procedure by a normalization constant. The results are extended to a discrete mesh and show that quantum tomography is robust under incomplete and approximate knowledge of tomograms.

  6. Robustness of raw quantum tomography

    Energy Technology Data Exchange (ETDEWEB)

    Asorey, M. [Departamento de Fisica Teorica, Facultad de Ciencias, Universidad de Zaragoza, 50009 Zaragoza (Spain); Facchi, P. [Dipartimento di Matematica, Universita di Bari, I-70125 Bari (Italy); INFN, Sezione di Bari, I-70126 Bari (Italy); MECENAS, Universita Federico II di Napoli and Universita di Bari (Italy); Florio, G. [Dipartimento di Fisica, Universita di Bari, I-70126 Bari (Italy); INFN, Sezione di Bari, I-70126 Bari (Italy); MECENAS, Universita Federico II di Napoli and Universita di Bari (Italy); Man'ko, V.I., E-mail: manko@lebedev.r [P.N. Lebedev Physical Institute, Leninskii Prospect 53, Moscow 119991 (Russian Federation); Marmo, G. [Dipartimento di Scienze Fisiche, Universita di Napoli 'Federico II', I-80126 Napoli (Italy); INFN, Sezione di Napoli, I-80126 Napoli (Italy); MECENAS, Universita Federico II di Napoli and Universita di Bari (Italy); Pascazio, S. [Dipartimento di Fisica, Universita di Bari, I-70126 Bari (Italy); INFN, Sezione di Bari, I-70126 Bari (Italy); MECENAS, Universita Federico II di Napoli and Universita di Bari (Italy); Sudarshan, E.C.G. [Department of Physics, University of Texas, Austin, TX 78712 (United States)

    2011-01-31

    We scrutinize the effects of non-ideal data acquisition on the tomograms of quantum states. The presence of a weight function, schematizing the effects of a finite window or equivalently noise, only affects the state reconstruction procedure by a normalization constant. The results are extended to a discrete mesh and show that quantum tomography is robust under incomplete and approximate knowledge of tomograms.

  7. Aspects of robust linear regression

    NARCIS (Netherlands)

    Davies, P.L.

    1993-01-01

    Section 1 of the paper contains a general discussion of robustness. In Section 2 the influence function of the Hampel-Rousseeuw least median of squares estimator is derived. Linearly invariant weak metrics are constructed in Section 3. It is shown in Section 4 that $S$-estimators satisfy an exact

  8. Manipulation Robustness of Collaborative Filtering

    OpenAIRE

    Benjamin Van Roy; Xiang Yan

    2010-01-01

    A collaborative filtering system recommends to users products that similar users like. Collaborative filtering systems influence purchase decisions and hence have become targets of manipulation by unscrupulous vendors. We demonstrate that nearest neighbors algorithms, which are widely used in commercial systems, are highly susceptible to manipulation and introduce new collaborative filtering algorithms that are relatively robust.

  9. Robustness Regions for Dichotomous Decisions.

    Science.gov (United States)

    Vijn, Pieter; Molenaar, Ivo W.

    1981-01-01

    In the case of dichotomous decisions, the total set of all assumptions/specifications for which the decision would have been the same is the robustness region. Inspection of this (data-dependent) region is a form of sensitivity analysis which may lead to improved decision making. (Author/BW)

  10. Theoretical Framework for Robustness Evaluation

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2011-01-01

    This paper presents a theoretical framework for evaluation of robustness of structural systems, incl. bridges and buildings. Typically modern structural design codes require that ‘the consequence of damages to structures should not be disproportional to the causes of the damages’. However, althou...

  11. Robust control design with MATLAB

    CERN Document Server

    Gu, Da-Wei; Konstantinov, Mihail M

    2013-01-01

    Robust Control Design with MATLAB® (second edition) helps the student to learn how to use well-developed advanced robust control design methods in practical cases. To this end, several realistic control design examples, from teaching-laboratory experiments such as a two-wheeled, self-balancing robot, to complex systems like a flexible-link manipulator, are given detailed presentation. All of these exercises are conducted using MATLAB® Robust Control Toolbox 3, Control System Toolbox and Simulink®. By sharing their experiences in industrial cases with minimum recourse to complicated theories and formulae, the authors convey essential ideas and useful insights into robust industrial control systems design using major H-infinity optimization and related methods, allowing readers quickly to move on with their own challenges. The hands-on tutorial style of this text rests on an abundance of examples and features for the second edition: rewritten and simplified presentation of theoretical and meth...

  12. Robust Portfolio Optimization Using Pseudodistances

    Science.gov (United States)

    2015-01-01

    The presence of outliers in financial asset returns is a frequently occurring phenomenon which may lead to unreliable mean-variance optimized portfolios. This fact is due to the unbounded influence that outliers can have on the mean returns and covariance estimators that are inputs in the optimization procedure. In this paper we present robust estimators of mean and covariance matrix obtained by minimizing an empirical version of a pseudodistance between the assumed model and the true model underlying the data. We prove and discuss theoretical properties of these estimators, such as affine equivariance, B-robustness, asymptotic normality and asymptotic relative efficiency. These estimators can be easily used in place of the classical estimators, thereby providing robust optimized portfolios. A Monte Carlo simulation study and applications to real data show the advantages of the proposed approach. We study both in-sample and out-of-sample performance of the proposed robust portfolios comparing them with some other portfolios known in literature. PMID:26468948

  13. Sparse and Robust Factor Modelling

    NARCIS (Netherlands)

    C. Croux (Christophe); P. Exterkate (Peter)

    2011-01-01

    Factor construction methods are widely used to summarize a large panel of variables by means of a relatively small number of representative factors. We propose a novel factor construction procedure that enjoys the properties of robustness to outliers and of sparsity; that is, having

  14. Robust distributed cognitive relay beamforming

    KAUST Repository

    Pandarakkottilil, Ubaidulla; Aissa, Sonia

    2012-01-01

    The design takes into account a parameter of the error in the channel state information (CSI) to render the performance of the beamformer robust in the presence of imperfect CSI. Though the original problem is non-convex, we show that the proposed design can

  15. Approximability of Robust Network Design

    NARCIS (Netherlands)

    Olver, N.K.; Shepherd, F.B.

    2014-01-01

    We consider robust (undirected) network design (RND) problems where the set of feasible demands may be given by an arbitrary convex body. This model, introduced by Ben-Ameur and Kerivin [Ben-Ameur W, Kerivin H (2003) New economical virtual private networks. Comm. ACM 46(6):69-73], generalizes the

  16. Robust optical sensors for safety critical automotive applications

    Science.gov (United States)

    De Locht, Cliff; De Knibber, Sven; Maddalena, Sam

    2008-02-01

    Optical sensors for the automotive industry need to be robust, high performing and low cost. This paper focuses on the impact of automotive requirements on optical sensor design and packaging. Main strategies to lower optical sensor entry barriers in the automotive market include: sensor calibration and tuning performed by the sensor manufacturer, on-chip sensor test modes to guarantee functional integrity during operation, and key package technology. As a conclusion, optical sensor applications are growing in automotive. Optical sensor robustness has matured to the level of safety-critical applications such as Electrical Power Assisted Steering (EPAS) and Drive-by-Wire, based on optical linear arrays, and Automated Cruise Control (ACC), Lane Change Assist and Driver Classification/Smart Airbag Deployment, based on camera imagers.

  17. Robust Image Hashing Using Radon Transform and Invariant Features

    Directory of Open Access Journals (Sweden)

    Y.L. Liu

    2016-09-01

    Full Text Available A robust image hashing method based on the radon transform and invariant features is proposed for image authentication, image retrieval, and image detection. Specifically, an input image is first converted into a counterpart with a normalized size. Then the invariant centroid algorithm is applied to obtain the invariant feature point and the surrounding circular area, and the radon transform is employed to acquire the mapping coefficient matrix of the area. Finally, the hashing sequence is generated by combining the feature vectors and the invariant moments calculated from the coefficient matrix. Experimental results show that this method can resist not only normal image processing operations but also some geometric distortions. Comparisons of receiver operating characteristic (ROC) curves indicate that the proposed method outperforms some existing methods in the trade-off between perceptual robustness and discrimination.
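
    A rough sketch of the hashing chain described (size normalization, radon transform, invariant moments), using scikit-image. The invariant-centroid circular cropping step is omitted, and Hu moments stand in for the paper's exact feature and moment combination.

      import numpy as np
      from skimage.transform import radon, resize
      from skimage.measure import moments_central, moments_normalized, moments_hu

      def radon_hash(image, n_angles=36):
          """Size-normalize, take the radon transform, then derive Hu's
          invariant moments from the sinogram as a compact hash vector."""
          img = resize(image, (128, 128), anti_aliasing=True)
          theta = np.linspace(0.0, 180.0, n_angles, endpoint=False)
          sinogram = radon(img, theta=theta, circle=False)
          nu = moments_normalized(moments_central(sinogram))
          hu = moments_hu(nu)
          # Log-scale for numerical stability; the sign keeps discrimination.
          return np.sign(hu) * np.log1p(np.abs(hu))

      img = np.random.rand(200, 160)        # stand-in for an input image
      print(radon_hash(img).round(2))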

  18. Classification of radiological procedures

    International Nuclear Information System (INIS)

    1989-01-01

    A classification for departments in Danish hospitals which use radiological procedures. The classification codes consist of 4 digits, where the first 2 are the codes for the main groups. The first digit represents the procedure's topographical object and the second the techniques. The last 2 digits describe individual procedures. (CLS)

  19. Colombia: Territorial classification

    International Nuclear Information System (INIS)

    Mendoza Morales, Alberto

    1998-01-01

    The article covers approaches to territorial classification, thematic axes, principles of land management and territorial occupation, political-administrative units, and administrative regions, among other topics. Territorial classification is understood as the spatial distribution across the country's territory of geographical configurations, human communities, political-administrative units, and urban and rural land uses, both existing and proposed

  20. Munitions Classification Library

    Science.gov (United States)

    2016-04-04

    members of the community to make their own additions to any, or all, of the classification libraries. The next phase entailed data collection over less... (Final Report, August 2014 - August 2015; Craig Murray, Parsons; Dr. Thomas H. Bell, Leidos)

  1. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.

  2. Library Classification 2020

    Science.gov (United States)

    Harris, Christopher

    2013-01-01

    In this article the author explores how a new library classification system might be designed using some aspects of the Dewey Decimal Classification (DDC) and ideas from other systems to create something that works for school libraries in the year 2020. By examining what works well with the Dewey Decimal System, what features should be carried…

  3. Spectroscopic classification of transients

    DEFF Research Database (Denmark)

    Stritzinger, M. D.; Fraser, M.; Hummelmose, N. N.

    2017-01-01

    We report the spectroscopic classification of several transients based on observations taken with the Nordic Optical Telescope (NOT) equipped with ALFOSC, over the nights 23-25 August 2017....

  4. Robust Reliability or reliable robustness? - Integrated consideration of robustness and reliability aspects

    DEFF Research Database (Denmark)

    Kemmler, S.; Eifler, Tobias; Bertsche, B.

    2015-01-01

    ...products are and vice versa. For a comprehensive understanding and to use existing synergies between both domains, this paper discusses the basic principles of Reliability- and Robust Design theory. The development of a comprehensive model will enable an integrated consideration of both domains...

  5. DOE LLW classification rationale

    International Nuclear Information System (INIS)

    Flores, A.Y.

    1991-01-01

    This report presents the rationale behind the US Department of Energy's low-level radioactive waste (LLW) classification, which is based on the Nuclear Regulatory Commission's classification system. DOE site operators met to review the qualifications and characteristics of the classification systems. They evaluated performance objectives, developed waste classification tables, and compiled dose limits for the waste. A goal of the LLW classification system was to allow each disposal site the freedom to set limits on radionuclide inventories and concentrations according to its own site-specific characteristics. This goal was achieved with the adoption of a performance-objectives system based on a performance assessment incorporating site-specific environmental conditions and engineered disposal systems.

  6. Constructing criticality by classification

    DEFF Research Database (Denmark)

    Machacek, Erika

    2017-01-01

    " in the bureaucratic practice of classification: Experts construct material criticality in assessments as they allot information on the materials to the parameters of the assessment framework. In so doing, they ascribe a new set of connotations to the materials, namely supply risk, and their importance to clean energy......, legitimizing a criticality discourse.Specifically, the paper introduces a typology delineating the inferences made by the experts from their produced recommendations in the classification of rare earth element criticality. The paper argues that the classification is a specific process of constructing risk....... It proposes that the expert bureaucratic practice of classification legitimizes (i) the valorisation that was made in the drafting of the assessment framework for the classification, and (ii) political operationalization when enacted that might have (non-)distributive implications for the allocation of public...

  7. Tissue types (image)

    Science.gov (United States)

    There are 4 basic types of tissue: connective tissue, epithelial tissue, muscle tissue, and nervous tissue. Connective tissue supports other tissues and binds them together (bone, blood, and lymph tissues). Epithelial tissue provides a covering (skin, the linings of the ...

  8. Robust Watermarking of Video Streams

    Directory of Open Access Journals (Sweden)

    T. Polyák

    2006-01-01

    Full Text Available In the past few years there has been an explosion in the use of digital video data. Many people have personal computers at home, and with the help of the Internet users can easily share video files on their computers. This makes possible the unauthorized use of digital media, and without adequate protection systems the authors and distributors have no means to prevent it. Digital watermarking techniques can help these systems to be more effective by embedding secret data right into the video stream. This makes minor changes in the frames of the video, but these changes are almost imperceptible to the human visual system. The embedded information can involve copyright data, access control etc. A robust watermark is resistant to various distortions of the video, so it cannot be removed without affecting the quality of the host medium. In this paper I propose a video watermarking scheme that fulfills the requirements of a robust watermark.
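
    To make the embedding idea concrete, here is a minimal, hedged sketch of least-significant-bit (LSB) embedding in a single luminance frame with NumPy. Plain LSB embedding is fragile, whereas the paper targets robustness against distortions; this only illustrates the embedding step, and the function names are our own.

    ```python
    import numpy as np

    def embed_bits(frame: np.ndarray, bits: np.ndarray) -> np.ndarray:
        """Embed watermark bits into the least-significant bit of pixel values.

        frame: 2-D uint8 luminance plane; bits: 1-D array of 0/1 values.
        Changes at most 1 intensity level per pixel, imperceptible to viewers.
        """
        flat = frame.flatten()                       # flatten() returns a copy
        flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite LSBs
        return flat.reshape(frame.shape)

    def extract_bits(frame: np.ndarray, n: int) -> np.ndarray:
        """Recover the first n embedded bits from the frame's LSB plane."""
        return frame.flatten()[:n] & 1

    frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
    payload = np.random.randint(0, 2, 128, dtype=np.uint8)  # e.g. copyright data
    marked = embed_bits(frame, payload)
    assert np.array_equal(extract_bits(marked, 128), payload)
    ```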

  9. Robust Decentralized Formation Flight Control

    Directory of Open Access Journals (Sweden)

    Zhao Weihua

    2011-01-01

    Full Text Available Motivated by the idea of multiplexed model predictive control (MMPC), this paper introduces a new framework for unmanned aerial vehicle (UAV) formation flight and coordination. Formulated using the MMPC approach, the whole centralized formation flight system is considered as a linear periodic system with the control inputs of each UAV subsystem as its periodic inputs. Divided into decentralized subsystems, the whole formation flight system is guaranteed stable if proper terminal cost and terminal constraints are added to each decentralized MPC formulation of the UAV subsystem. The decentralized robust MPC formulation for each UAV subsystem with bounded input disturbances and model uncertainties is also presented. Furthermore, an obstacle avoidance control scheme for obstacles of any shape and size, including those not known a priori, is integrated under the unified MPC framework. The results from simulations demonstrate that the proposed framework can successfully achieve robust collision-free formation flights.

  10. Inefficient but robust public leadership.

    OpenAIRE

    Matsumura, Toshihiro; Ogawa, Akira

    2014-01-01

    We investigate endogenous timing in a mixed duopoly in a differentiated product market. We find that private leadership is better than public leadership from a social welfare perspective if the private firm is domestic, regardless of the degree of product differentiation. Nevertheless, the public leadership equilibrium is risk-dominant, and it is thus robust if the degree of product differentiation is high. We also find that regardless of the degree of product differentiation, the public lead...

  11. Testing Heteroscedasticity in Robust Regression

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2011-01-01

    Vol. 1, No. 4 (2011), pp. 25-28. ISSN 2045-3345. Grant - others: GA ČR(CZ) GA402/09/0557. Institutional research plan: CEZ:AV0Z10300504. Keywords: robust regression; heteroscedasticity; regression quantiles; diagnostics. Subject RIV: BB - Applied Statistics, Operational Research. http://www.researchjournals.co.uk/documents/Vol4/06%20Kalina.pdf

  12. Robust power system frequency control

    CERN Document Server

    Bevrani, Hassan

    2014-01-01

    This updated edition of the industry standard reference on power system frequency control provides practical, systematic and flexible algorithms for regulating load frequency, offering new solutions to the technical challenges introduced by the escalating role of distributed generation and renewable energy sources in smart electric grids. The author emphasizes the physical constraints and practical engineering issues related to frequency in a deregulated environment, while fostering a conceptual understanding of frequency regulation and robust control techniques. The resulting control strategies...

  13. A dynamical classification of the cosmic web

    Science.gov (United States)

    Forero-Romero, J. E.; Hoffman, Y.; Gottlöber, S.; Klypin, A.; Yepes, G.

    2009-07-01

    In this paper, we propose a new dynamical classification of the cosmic web. Each point in space is classified as one of four possible web types: voids, sheets, filaments and knots. The classification is based on the evaluation of the deformation tensor (i.e. the Hessian of the gravitational potential) on a grid, counting the number of eigenvalues above a certain threshold, λth, at each grid point, where the case of zero, one, two or three such eigenvalues corresponds to a void, sheet, filament or knot grid point. The collection of neighbouring grid points, friends of friends, of the same web type constitutes voids, sheets, filaments and knots as extended web objects. A simple dynamical consideration of the emergence of the web suggests that the threshold should not be null, as in previous implementations of the algorithm. A detailed dynamical analysis would have found different threshold values for the collapse of sheets, filaments and knots. Short of such an analysis, a phenomenological approach has been opted for, looking for a single threshold to be determined by analysing numerical simulations. Our cosmic web classification has been applied and tested against a suite of large (dark matter only) cosmological N-body simulations. In particular, the dependence of the volume and mass filling fractions on λth and on the resolution has been calculated for the four web types. We also study the percolation properties of voids and filaments. Our main findings are as follows. (i) Already at λth = 0.1 the resulting web classification reproduces the visual impression of the cosmic web. (ii) Between 0.2 ... net of interconnected filaments. This suggests a reasonable choice for λth as the parameter that defines the cosmic web. (iii) The dynamical nature of the suggested classification provides a robust framework for incorporating environmental information into galaxy formation models, and in particular into semi-analytical models.
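
    A hedged sketch of the eigenvalue-counting rule described above (not the authors' code): classify each grid point by how many eigenvalues of the deformation tensor exceed λth. The Hessian field here is a random symmetric toy input.

    ```python
    import numpy as np

    # Classify each grid point of a (Nx, Ny, Nz, 3, 3) Hessian field by the
    # number of eigenvalues above lam_th: 0 -> void, 1 -> sheet,
    # 2 -> filament, 3 -> knot.
    WEB_TYPES = {0: "void", 1: "sheet", 2: "filament", 3: "knot"}

    def classify_web(hessian: np.ndarray, lam_th: float = 0.1) -> np.ndarray:
        eigvals = np.linalg.eigvalsh(hessian)     # (..., 3), sorted ascending
        return (eigvals > lam_th).sum(axis=-1)    # 0..3 -> void..knot

    rng = np.random.default_rng(0)
    A = rng.normal(size=(8, 8, 8, 3, 3))
    H = 0.5 * (A + np.swapaxes(A, -1, -2))        # symmetrize the toy field
    labels = classify_web(H, lam_th=0.1)
    print({WEB_TYPES[k]: int((labels == k).sum()) for k in WEB_TYPES})
    ```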

  14. Classification of sports types from tracklets

    DEFF Research Database (Denmark)

    Gade, Rikke; Moeslund, Thomas B.

    Automatic analysis of video is important in order to process and exploit large amounts of data, e.g. for sports analysis. Classification of sports types is one of the first steps towards a fully automatic analysis of the activities performed at sports arenas. In this work we test the idea that sports types can be classified from features extracted from short trajectories of the players. From tracklets created by a Kalman filter tracker we extract four robust features: total distance, lifespan, distance span and mean speed. For classification we use a quadratic discriminant analysis. In our experiments we use 30 two-minute thermal video sequences from each of five different sports types. By applying a 10-fold cross validation we obtain a correct classification rate of 94.5%.
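
    A hedged sketch of this pipeline: the four features are implemented as we read them from the abstract (the authors' exact definitions may differ), classified with scikit-learn's quadratic discriminant analysis on synthetic tracklets.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # A tracklet is an (N, 2) array of player positions over N frames at `fps`.
    def tracklet_features(track, fps=25.0):
        steps = np.linalg.norm(np.diff(track, axis=0), axis=1)
        total_distance = steps.sum()
        lifespan = len(track) / fps                    # seconds tracked
        distance_span = np.linalg.norm(track.max(0) - track.min(0))
        mean_speed = total_distance / lifespan
        return np.array([total_distance, lifespan, distance_span, mean_speed])

    rng = np.random.default_rng(1)
    labels = rng.integers(0, 5, 150)                   # 5 sports types (toy)
    tracks = [rng.normal(scale=0.2 + 0.2 * c,
                         size=(rng.integers(100, 300), 2)).cumsum(0)
              for c in labels]
    X = np.stack([tracklet_features(t) for t in tracks])
    qda = QuadraticDiscriminantAnalysis()
    print("10-fold CV accuracy:", cross_val_score(qda, X, labels, cv=10).mean())
    ```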

  15. Signal classification for acoustic neutrino detection

    International Nuclear Information System (INIS)

    Neff, M.; Anton, G.; Enzenhöfer, A.; Graf, K.; Hößl, J.; Katz, U.; Lahmann, R.; Richardt, C.

    2012-01-01

    This article focuses on signal classification for deep-sea acoustic neutrino detection. In the deep sea, the background of transient signals is very diverse. Approaches like matched filtering are not sufficient to distinguish between neutrino-like signals and other transient signals with similar signatures, which form the acoustic background for neutrino detection in the deep-sea environment. A classification system based on machine learning algorithms is analysed with the goal of finding a robust and effective way to perform this task. For a well-trained model, a testing error on the level of 1% is achieved for strong classifiers like Random Forest and Boosting Trees, using the extracted features of the signal as input and utilising dense clusters of sensors instead of single sensors.
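
    A hedged sketch of the classifier stage only, on synthetic stand-in features (the actual extracted-feature set is not given in the record): both classifier families named above, trained and scored on a held-out split.

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for extracted waveform features and signal/background
    # labels; the real pipeline would feed features extracted from transients.
    rng = np.random.default_rng(2)
    X = rng.normal(size=(2000, 8))                 # 8 extracted features
    y = (X[:, :2].sum(axis=1) + 0.3 * rng.normal(size=2000) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
    for clf in (RandomForestClassifier(n_estimators=200, random_state=0),
                GradientBoostingClassifier(random_state=0)):
        err = 1.0 - clf.fit(X_tr, y_tr).score(X_te, y_te)
        print(type(clf).__name__, f"testing error: {err:.3f}")
    ```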

  16. General regression and representation model for classification.

    Directory of Open Access Journals (Sweden)

    Jianjun Qian

    Full Text Available Recently, regularized coding-based classification methods (e.g. SRC and CRC) have shown great potential for pattern classification. However, most existing coding methods assume that the representation residuals are uncorrelated. In real-world applications, this assumption does not hold. In this paper, we take account of the correlations of the representation residuals and develop a general regression and representation model (GRR) for classification. GRR not only has the advantages of CRC, but also makes full use of prior information (e.g. the correlations between representation residuals and representation coefficients) and specific information (the weight matrix of image pixels) to enhance classification performance. GRR uses generalized Tikhonov regularization and K nearest neighbors to learn the prior information from the training data. Meanwhile, the specific information is obtained by using an iterative algorithm to update the feature (or image pixel) weights of the test sample. With the proposed model as a platform, we design two classifiers: the basic general regression and representation classifier (B-GRR) and the robust general regression and representation classifier (R-GRR). The experimental results demonstrate the performance advantages of the proposed methods over state-of-the-art algorithms.

  17. Classification of movement disorders.

    Science.gov (United States)

    Fahn, Stanley

    2011-05-01

    The classification of movement disorders has evolved. Even the terminology has shifted, from an anatomical one of extrapyramidal disorders to a phenomenological one of movement disorders. The history of how this shift came about is described. The history of both the definitions and the classifications of the various neurologic conditions is then reviewed. First is a review of movement disorders as a group; then, the evolving classifications for 3 of them--parkinsonism, dystonia, and tremor--are covered in detail. Copyright © 2011 Movement Disorder Society.

  18. Robustness Analysis of Timber Truss Structure

    DEFF Research Database (Denmark)

    Rajčić, Vlatka; Čizmar, Dean; Kirkegaard, Poul Henning

    2010-01-01

    The present paper discusses robustness of structures in general and the robustness requirements given in the codes. Robustness of timber structures is also an issue, as it is closely related to Working Group 3 (Robustness of systems) of the COST E55 project. Finally, an example of a robustness evaluation of a wide-span timber truss structure is presented. This structure was built a few years ago near Zagreb and has a span of 45 m. Reliability analysis of the main members and the system is conducted, and based on this a robustness analysis is performed.

  19. Robust Face Recognition Via Gabor Feature and Sparse Representation

    Directory of Open Access Journals (Sweden)

    Hao Yu-Juan

    2016-01-01

    Full Text Available Sparse representation based on compressed sensing theory has been widely used in the field of face recognition and has achieved good recognition results, but face feature extraction based on sparse representation is too simple, and the resulting coefficients are not truly sparse. In this paper, we improve the classification algorithm based on the fusion of sparse representation and Gabor features. The improved algorithm for the Gabor feature overcomes the problem of the large dimension of the feature vector, reduces the computation and storage cost, and enhances the robustness of the algorithm to changes in the environment. Since the classification efficiency of sparse representation is determined by the collaborative representation, we simplify the L1-norm sparse constraint to a least-squares constraint, which constrains the coefficients to be positive and reduces the complexity of the algorithm. Experimental results show that the proposed method is robust to illumination, facial expression and pose variations in face recognition, and that the recognition rate of the algorithm is improved.
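
    A hedged sketch of the least-squares relaxation the record describes, in the style of collaborative-representation classification: solve a ridge-regularized least-squares problem in closed form and assign the class whose training columns best reconstruct the test sample. The dictionary here is random; a real system would fill it with (Gabor-)features of training faces.

    ```python
    import numpy as np

    def crc_classify(D, labels, y, lam=0.01):
        # coefficients alpha = argmin ||y - D a||^2 + lam ||a||^2 (closed form)
        G = D.T @ D + lam * np.eye(D.shape[1])
        alpha = np.linalg.solve(G, D.T @ y)
        # class-wise reconstruction residuals; smallest residual wins
        residuals = {c: np.linalg.norm(y - D[:, labels == c] @ alpha[labels == c])
                     for c in np.unique(labels)}
        return min(residuals, key=residuals.get)

    rng = np.random.default_rng(3)
    D = rng.normal(size=(120, 40))              # 40 training samples, 120-D features
    labels = np.repeat(np.arange(8), 5)         # 8 subjects, 5 images each
    y = D[:, 12] + 0.05 * rng.normal(size=120)  # noisy copy of a subject-2 image
    print(crc_classify(D, labels, y))           # expected: 2
    ```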

  20. Update on diabetes classification.

    Science.gov (United States)

    Thomas, Celeste C; Philipson, Louis H

    2015-01-01

    This article highlights the difficulties in creating a definitive classification of diabetes mellitus in the absence of a complete understanding of the pathogenesis of the major forms. This brief review shows the evolving nature of the classification of diabetes mellitus. No classification scheme is ideal, and all have some overlap and inconsistencies. The only form of diabetes that can be accurately diagnosed by DNA sequencing, monogenic diabetes, remains undiagnosed in more than 90% of the individuals who have diabetes caused by one of the known gene mutations. The point of classification, or taxonomy, of disease should be to give insight into both pathogenesis and treatment. It remains a source of frustration that all schemes of diabetes mellitus continue to fall short of this goal. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Learning Apache Mahout classification

    CERN Document Server

    Gupta, Ashish

    2015-01-01

    If you are a data scientist who has some experience with the Hadoop ecosystem and machine learning methods and want to try out classification on large datasets using Mahout, this book is ideal for you. Knowledge of Java is essential.

  2. CLASSIFICATION OF VIRUSES

    Indian Academy of Sciences (India)

    CLASSIFICATION OF VIRUSES. On basis of morphology. On basis of chemical composition. On basis of structure of genome. On basis of mode of replication.

  3. Pitch Based Sound Classification

    DEFF Research Database (Denmark)

    Nielsen, Andreas Brinch; Hansen, Lars Kai; Kjems, U

    2006-01-01

    A sound classification model is presented that can classify signals into music, noise and speech. The model extracts the pitch of the signal using the harmonic product spectrum. Based on the pitch estimate and a pitch error measure, features are created and used in a probabilistic model with a soft-max output function. Both linear and quadratic inputs are used. The model is trained on 2 hours of sound and tested on publicly available data. A test classification error below 0.05 with 1 s classification windows is achieved. Furthermore, it is shown that linear input performs as well as quadratic input, and that even though classification gets marginally better, not much is achieved by increasing the window size beyond 1 s.
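
    A minimal sketch of the harmonic product spectrum pitch estimator named in the record: downsample the magnitude spectrum by integer factors and multiply, so energy at the fundamental and its harmonics piles up at the fundamental bin. The feature construction and probabilistic model of the paper are not reproduced here.

    ```python
    import numpy as np

    def hps_pitch(signal: np.ndarray, fs: float, n_harmonics: int = 4) -> float:
        spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
        hps = spectrum.copy()
        for h in range(2, n_harmonics + 1):
            # multiply by the spectrum downsampled by factor h
            hps[: len(spectrum) // h] *= spectrum[::h][: len(spectrum) // h]
        peak = np.argmax(hps[1:]) + 1              # skip the DC bin
        return peak * fs / len(signal)

    fs = 16000.0
    t = np.arange(0, 1.0, 1 / fs)
    tone = sum(np.sin(2 * np.pi * 220 * k * t) / k for k in range(1, 4))
    print(f"estimated pitch: {hps_pitch(tone, fs):.1f} Hz")   # ~220 Hz
    ```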

  4. Soliton robustness in optical fibers

    International Nuclear Information System (INIS)

    Menyuk, C.R.

    1993-01-01

    Simulations and experiments indicate that solitons in optical fibers are robust in the presence of Hamiltonian deformations such as higher-order dispersion and birefringence but are destroyed in the presence of non-Hamiltonian deformations such as attenuation and the Raman effect. Two hypotheses are introduced that generalize these observations and give a recipe for when deformations will be Hamiltonian. Concepts from nonlinear dynamics are used to make these two hypotheses plausible. Soliton stabilization with frequency filtering is also briefly discussed from this point of view

  5. Robust and Sparse Factor Modelling

    DEFF Research Database (Denmark)

    Croux, Christophe; Exterkate, Peter

    Factor construction methods are widely used to summarize a large panel of variables by means of a relatively small number of representative factors. We propose a novel factor construction procedure that enjoys the properties of robustness to outliers and of sparsity; that is, having relatively few nonzero factor loadings. Compared to the traditional factor construction method, we find that this procedure leads to a favorable forecasting performance in the presence of outliers and to better interpretable factors. We investigate the performance of the method in a Monte Carlo experiment...

  6. Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; čizmar, D.

    2010-01-01

    The present paper outlines results from working group 3 (WG3) in the EU COST Action E55 – ‘Modelling of the performance of timber structures’. The objectives of the project are related to the three main research activities: the identification and modelling of relevant load and environmental exposure scenarios, the improvement of knowledge concerning the behaviour of timber structural elements and the development of a generic framework for the assessment of the life-cycle vulnerability and robustness of timber structures.

  7. Sustainable Resilient, Robust & Resplendent Enterprises

    DEFF Research Database (Denmark)

    Edgeman, Rick

    ... to their impact. Resplendent enterprises are introduced, with resplendence referring not to some sort of public or private façade, but to organizations marked by dual brilliance and nobility of strategy, governance and comportment that yield superior and sustainable triple bottom line performance. Herein resilience, robustness, and resplendence (R3) are integrated with sustainable enterprise excellence (Edgeman and Eskildsen, 2013) or SEE and social-ecological innovation (Eskildsen and Edgeman, 2012) to aid the progress of a firm toward producing continuously relevant performance that proceeds from...

  8. Towards secondary fingerprint classification

    CSIR Research Space (South Africa)

    Msiza, IS

    2011-07-01

    Full Text Available ... an accuracy figure of 76.8%. This small difference between the two figures is indicative of the validity of the proposed secondary classification module. Keywords: fingerprint core; fingerprint delta; primary classification; secondary classification. ... namely, the fingerprint core and the fingerprint delta. Forensically, a fingerprint core is defined as the innermost turning point where the fingerprint ridges form a loop, while the fingerprint delta is defined as the point where these ridges form a...

  9. Expected Classification Accuracy

    Directory of Open Access Journals (Sweden)

    Lawrence M. Rudner

    2005-08-01

    Full Text Available Every time we make a classification based on a test score, we should expect some number of misclassifications. Some examinees whose true ability is within a score range will have observed scores outside of that range. A procedure for providing a classification table of true and expected scores is developed for polytomously scored items under item response theory and applied to state assessment data. A simplified procedure for estimating the table entries is also presented.
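
    A hedged illustration of the underlying idea (not the paper's IRT-based procedure): simulate true scores and error-perturbed observed scores, then tabulate how often the observed classification matches the true one at a cut score. All numbers below are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    true = rng.normal(500, 100, 100000)             # hypothetical scale scores
    observed = true + rng.normal(0, 30, true.size)  # SEM of 30 score points
    cut = 550                                        # pass/fail cut score

    # rows: true class (0=fail, 1=pass); cols: observed class
    table = np.zeros((2, 2), dtype=int)
    np.add.at(table, ((true >= cut).astype(int),
                      (observed >= cut).astype(int)), 1)
    print(table)
    print("expected classification accuracy:", np.trace(table) / table.sum())
    ```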

  10. Latent classification models

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre

    2005-01-01

    ... parametric family of distributions. In this paper we propose a new set of models for classification in continuous domains, termed latent classification models. The latent classification model can roughly be seen as combining the Naive Bayes model with a mixture of factor analyzers, thereby relaxing the assumptions ... classification model, and we demonstrate empirically that the accuracy of the proposed model is significantly higher than the accuracy of other probabilistic classifiers.

  11. 78 FR 68983 - Cotton Futures Classification: Optional Classification Procedure

    Science.gov (United States)

    2013-11-18

    ...-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing... regulations to allow for the addition of an optional cotton futures classification procedure--identified and... response to requests from the U.S. cotton industry and ICE, AMS will offer a futures classification option...

  12. Supernova Photometric Lightcurve Classification

    Science.gov (United States)

    Zaidi, Tayeb; Narayan, Gautham

    2016-01-01

    This is a preliminary report on photometric supernova classification. We first explore the properties of supernova light curves and attempt to restructure the unevenly sampled and sparse data from assorted datasets to allow for processing and classification. The data were primarily drawn from the Dark Energy Survey (DES) simulated data, created for the Supernova Photometric Classification Challenge. This poster shows a method for producing a non-parametric representation of the light curve data and applying a Random Forest classifier algorithm to distinguish between supernova types. We examine the impact of Principal Component Analysis to reduce the dimensionality of the dataset for future classification work. The classification code will be used in a stage of the ANTARES pipeline, created for use on the Large Synoptic Survey Telescope alert data and other wide-field surveys. The final figure-of-merit for the DES data in the r band was 60% for binary classification (Type I vs II). Zaidi was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program which is funded by the National Science Foundation Research Experiences for Undergraduates Program (AST-1262829).

  13. Robust Instrumentation[Water treatment for power plant]; Robust Instrumentering

    Energy Technology Data Exchange (ETDEWEB)

    Wik, Anders [Vattenfall Utveckling AB, Stockholm (Sweden)

    2003-08-01

    Cementa Slite Power Station is a heat recovery steam generator (HRSG) plant with moderate steam data: 3.0 MPa and 420 deg C. The heat is recovered from Cementa, a cement industry, without any use of auxiliary fuel. The power station commenced operation in 2001. The layout of the plant is unusual; there are no similar plants in Sweden and very few world-wide, so the operational experience is limited. In connection with the commissioning of the power plant, an R and D project was identified with the objective of minimising the manpower needed for chemistry management of the plant. The lean chemistry management is based on robust instrumentation and a chemical-free water treatment plant. The robust instrumentation concept consists of on-line instrumentation chosen for minimal operation and maintenance, together with chemical-free water treatment. The parameters are specific conductivity, cation conductivity, oxygen and pH. In addition, two fairly new on-line instruments were included: corrosion monitors and differential pH calculated from specific and cation conductivity. The chemical-free water treatment plant consists of softening, reverse osmosis and electro-deionisation. The operational experience shows that the cycle chemistry is not within the guidelines due to major problems with the operation of the power plant. These problems have made it impossible to reach steady state and thereby not viable to fully verify and validate the robust instrumentation concept. From readings on the panel of the on-line analysers some conclusions may be drawn, e.g. the differential pH measurements have fulfilled expectations. The other on-line analysers have worked satisfactorily apart from contamination with turbine oil, which has been noticed at least twice. The corrosion monitors seem to be working, but the lack of trend curves from the mainframe computer system makes it hard to draw any clear conclusions. The chemical-free water treatment has met all...

  14. MO-DE-207B-03: Improved Cancer Classification Using Patient-Specific Biological Pathway Information Via Gene Expression Data

    Energy Technology Data Exchange (ETDEWEB)

    Young, M; Craft, D [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States)

    2016-06-15

    Purpose: To develop an efficient, pathway-based classification system using network biology statistics to assist in patient-specific response predictions to radiation and drug therapies across multiple cancer types. Methods: We developed PICS (Pathway Informed Classification System), a novel two-step cancer classification algorithm. In PICS, a matrix m of mRNA expression values for a patient cohort is collapsed into a matrix p of biological pathways. The entries of p, which we term pathway scores, are obtained from either principal component analysis (PCA), normal tissue centroid (NTC), or gene expression deviation (GED). The pathway score matrix is clustered using both k-means and hierarchical clustering, and a clustering is judged by how well it groups patients into distinct survival classes. The most effective pathway scoring/clustering combination, per clustering p-value, thus generates various ‘signatures’ for conventional and functional cancer classification. Results: PICS successfully regularized large dimension gene data, separated normal and cancerous tissues, and clustered a large patient cohort spanning six cancer types. Furthermore, PICS clustered patient cohorts into distinct, statistically-significant survival groups. For a suboptimally-debulked ovarian cancer set, the pathway-classified Kaplan-Meier survival curve (p = .00127) showed significant improvement over that of a prior gene expression-classified study (p = .0179). For a pancreatic cancer set, the pathway-classified Kaplan-Meier survival curve (p = .00141) showed significant improvement over that of a prior gene expression-classified study (p = .04). Pathway-based classification confirmed biomarkers for the pyrimidine, WNT-signaling, glycerophosphoglycerol, beta-alanine, and panthothenic acid pathways for ovarian cancer. Despite its robust nature, PICS requires significantly less run time than current pathway scoring methods. Conclusion: This work validates the PICS method to improve
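
    A hedged sketch of the first PICS step as described: collapse a genes-by-patients expression matrix m into a pathways-by-patients score matrix p, here with PCA scoring (one of the three options named), then cluster patients with k-means. The pathway gene sets below are random stand-ins, not curated pathways.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(5)
    m = rng.normal(size=(1000, 60))                 # 1000 genes, 60 patients
    pathways = [rng.choice(1000, size=30, replace=False) for _ in range(25)]

    # PCA pathway score: first principal component of each pathway's gene block,
    # evaluated across patients.
    p = np.stack([PCA(n_components=1).fit_transform(m[genes].T).ravel()
                  for genes in pathways])           # (25 pathways, 60 patients)

    clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(p.T)
    print(np.bincount(clusters))                    # patient group sizes
    ```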

  15. Mechanical design in embryos: mechanical signalling, robustness and developmental defects.

    Science.gov (United States)

    Davidson, Lance A

    2017-05-19

    Embryos are shaped by the precise application of force against the resistant structures of multicellular tissues. Forces may be generated, guided and resisted by cells, extracellular matrix, interstitial fluids, and how they are organized and bound within the tissue's architecture. In this review, we summarize our current thoughts on the multiple roles of mechanics in direct shaping, mechanical signalling and robustness of development. Genetic programmes of development interact with environmental cues to direct the composition of the early embryo and endow cells with active force production. Biophysical advances now provide experimental tools to measure mechanical resistance and collective forces during morphogenesis and are allowing integration of this field with studies of signalling and patterning during development. We focus this review on concepts that highlight this integration, and how the unique contributions of mechanical cues and gradients might be tested side by side with conventional signalling systems. We conclude with speculation on the integration of large-scale programmes of development, and how mechanical responses may ensure robust development and serve as constraints on programmes of tissue self-assembly. This article is part of the themed issue 'Systems morphodynamics: understanding the development of tissue hardware'. © 2017 The Author(s).

  16. Classification of parotidectomy: a proposed modification to the European Salivary Gland Society classification system.

    Science.gov (United States)

    Wong, Wai Keat; Shetty, Subhaschandra

    2017-08-01

    Parotidectomy remains the mainstay of treatment for both benign and malignant lesions of the parotid gland. A wide range of surgical options exists in parotidectomy in terms of the extent of parotid tissue removed, and there is an increasing need for uniformity of terminology resulting from growing interest in modifications of the conventional parotidectomy. It is, therefore, of paramount importance to have a standardized classification system for describing the extent of parotidectomy. Recently, the European Salivary Gland Society (ESGS) proposed a novel classification system for parotidectomy. The aim of this study is to evaluate this system. The classification system proposed by the ESGS was critically re-evaluated and modified to increase its accuracy and acceptability. Modifications mainly focused on subdividing Levels I and II into IA, IB, IIA, and IIB. From June 2006 to June 2016, 126 patients underwent 130 parotidectomies at our hospital, and the classification system was tested in that cohort of patients. While the ESGS classification system is comprehensive, it does not cover all possibilities. The addition of Sublevels IA, IB, IIA, and IIB may help to address some of the clinical situations seen and is clinically relevant. We aim to test the modified classification system for partial parotidectomy to address some of the challenges mentioned.

  17. Perspective: Evolution and detection of genetic robustness

    NARCIS (Netherlands)

    Visser, de J.A.G.M.; Hermisson, J.; Wagner, G.P.; Ancel Meyers, L.; Bagheri-Chaichian, H.; Blanchard, J.L.; Chao, L.; Cheverud, J.M.; Elena, S.F.; Fontana, W.; Gibson, G.; Hansen, T.F.; Krakauer, D.; Lewontin, R.C.; Ofria, C.; Rice, S.H.; Dassow, von G.; Wagner, A.; Whitlock, M.C.

    2003-01-01

    Robustness is the invariance of phenotypes in the face of perturbation. The robustness of phenotypes appears at various levels of biological organization, including gene expression, protein folding, metabolic flux, physiological homeostasis, development, and even organismal fitness. The mechanisms

  18. Robust Lyapunov controller for uncertain systems

    KAUST Repository

    Laleg-Kirati, Taous-Meriem; Elmetennani, Shahrazed

    2017-01-01

    Various examples of systems and methods are provided for Lyapunov control for uncertain systems. In one example, a system includes a process plant and a robust Lyapunov controller configured to control an input of the process plant. The robust

  19. Robust distributed cognitive relay beamforming

    KAUST Repository

    Pandarakkottilil, Ubaidulla

    2012-05-01

    In this paper, we present a distributed relay beamformer design for a cognitive radio network in which a cognitive (or secondary) transmit node communicates with a secondary receive node assisted by a set of cognitive non-regenerative relays. The secondary nodes share the spectrum with a licensed primary user (PU) node, and each node is assumed to be equipped with a single transmit/receive antenna. The interference to the PU resulting from the transmission from the cognitive nodes is kept below a specified limit. The proposed robust cognitive relay beamformer design seeks to minimize the total relay transmit power while ensuring that the transceiver signal-to-interference- plus-noise ratio and PU interference constraints are satisfied. The proposed design takes into account a parameter of the error in the channel state information (CSI) to render the performance of the beamformer robust in the presence of imperfect CSI. Though the original problem is non-convex, we show that the proposed design can be reformulated as a tractable convex optimization problem that can be solved efficiently. Numerical results are provided and illustrate the performance of the proposed designs for different network operating conditions and parameters. © 2012 IEEE.

  20. A New Classification Approach Based on Multiple Classification Rules

    OpenAIRE

    Zhongmei Zhou

    2014-01-01

    A good classifier can correctly predict new data for which the class label is unknown, so it is important to construct a high accuracy classifier. Hence, classification techniques are much useful in ubiquitous computing. Associative classification achieves higher classification accuracy than some traditional rule-based classification approaches. However, the approach also has two major deficiencies. First, it generates a very large number of association classification rules, especially when t...

  1. Robust canonical correlations: A comparative study

    OpenAIRE

    Branco, JA; Croux, Christophe; Filzmoser, P; Oliveira, MR

    2005-01-01

    Several approaches for robust canonical correlation analysis will be presented and discussed. A first method is based on the definition of canonical correlation analysis as looking for linear combinations of two sets of variables having maximal (robust) correlation. A second method is based on alternating robust regressions. These methods are discussed in detail and compared with the more traditional approach to robust canonical correlation via covariance matrix estimates. A simulation study ...

  2. Robust adaptive synchronization of general dynamical networks ...

    Indian Academy of Sciences (India)

    A robust adaptive synchronization scheme for these general complex networks with multiple delays and uncertainties is established by employing the robust adaptive control principle and the Lyapunov stability theory. We choose ...

  3. Robust portfolio selection under norm uncertainty

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2016-06-01

    Full Text Available Abstract In this paper, we consider the robust portfolio selection problem which has a data uncertainty described by the (p,w)-norm in the objective function. We show that the robust formulation of this problem is equivalent to a linear optimization problem. Moreover, we present some numerical results concerning our robust portfolio selection problem.

  4. A robust standard deviation control chart

    NARCIS (Netherlands)

    Schoonhoven, M.; Does, R.J.M.M.

    2012-01-01

    This article studies the robustness of Phase I estimators for the standard deviation control chart. A Phase I estimator should be efficient in the absence of contaminations and resistant to disturbances. Most of the robust estimators proposed in the literature are robust against either diffuse

  5. Methodology in robust and nonparametric statistics

    CERN Document Server

    Jurecková, Jana; Picek, Jan

    2012-01-01

    Contents: Introduction and Synopsis (Introduction; Synopsis). Preliminaries (Introduction; Inference in Linear Models; Robustness Concepts; Robust and Minimax Estimation of Location; Clippings from Probability and Asymptotic Theory; Problems). Robust Estimation of Location and Regression (Introduction; M-Estimators; L-Estimators; R-Estimators; Minimum Distance and Pitman Estimators; Differentiable Statistical Functions; Problems). Asymptotic Representations for L-Estimators...

  6. Robust linear discriminant analysis with distance based estimators

    Science.gov (United States)

    Lim, Yai-Fung; Yahaya, Sharipah Soaad Syed; Ali, Hazlina

    2017-11-01

    Linear discriminant analysis (LDA) is a supervised classification technique concerning the relationship between a categorical variable and a set of continuous variables. The main objective of LDA is to create a function to distinguish between populations and to allocate future observations to previously defined populations. Under the assumptions of normality and homoscedasticity, LDA yields the optimal linear discriminant rule (LDR) between two or more groups. However, the optimality of LDA relies heavily on the sample mean and pooled sample covariance matrix, which are known to be sensitive to outliers. To alleviate this, a new robust LDA using distance-based estimators known as the minimum variance vector (MVV) is proposed in this study. The MVV estimators substitute the classical sample mean and classical sample covariance to form a robust linear discriminant rule (RLDR). A simulation and real-data study were conducted to examine the performance of the proposed RLDR, measured in terms of misclassification error rates. The computational results showed that the proposed RLDR is better than the classical LDR and comparable with the existing robust LDR.
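
    A hedged sketch of a robust linear discriminant rule in which robust location/scatter estimates replace the classical mean and pooled covariance; scikit-learn's MinCovDet is used as a stand-in for the MVV estimator, which is not available in standard libraries.

    ```python
    import numpy as np
    from sklearn.covariance import MinCovDet

    def robust_ldr(X0, X1):
        # robust location and scatter per group, then a pooled scatter
        mcd0 = MinCovDet(random_state=0).fit(X0)
        mcd1 = MinCovDet(random_state=0).fit(X1)
        pooled = ((len(X0) * mcd0.covariance_ + len(X1) * mcd1.covariance_)
                  / (len(X0) + len(X1)))
        w = np.linalg.solve(pooled, mcd0.location_ - mcd1.location_)
        c = w @ (mcd0.location_ + mcd1.location_) / 2
        return lambda x: np.where(x @ w - c > 0, 0, 1)  # allocate to group 0 or 1

    rng = np.random.default_rng(6)
    X0 = rng.normal(0, 1, (100, 3)); X0[:5] += 20       # group 0 with outliers
    X1 = rng.normal(2, 1, (100, 3))
    rule = robust_ldr(X0, X1)
    print("misclassified:", (rule(X0) != 0).sum() + (rule(X1) != 1).sum())
    ```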

  7. Supervised linear dimensionality reduction with robust margins for object recognition

    Science.gov (United States)

    Dornaika, F.; Assoum, A.

    2013-01-01

    Linear Dimensionality Reduction (LDR) techniques have become increasingly important in computer vision and pattern recognition since they permit a relatively simple mapping of data onto a lower-dimensional subspace, leading to simple and computationally efficient classification strategies. Recently, many linear discriminant methods have been developed in order to reduce the dimensionality of visual data and to enhance the discrimination between different groups or classes. Many existing linear embedding techniques rely on the use of local margins to obtain good discrimination performance. However, dealing with outliers and within-class diversity has not been addressed by margin-based embedding methods. In this paper, we explore the use of different margin-based linear embedding methods. More precisely, we propose to use the concepts of median miss and median hit to build robust margin-based criteria. Based on such margins, we seek the projection directions (linear embedding) such that the sum of local margins is maximized. Our proposed approach has been applied to the problem of appearance-based face recognition. Experiments performed on four public face databases show that the proposed approach can give better generalization performance than the classic Average Neighborhood Margin Maximization (ANMM). Moreover, thanks to the use of robust margins, the proposed method degrades gracefully when label outliers contaminate the training data set. In particular, we show that the concept of median hit is crucial for robust performance in the presence of outliers.
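
    A hedged sketch of median-based robust margins as we read them from the abstract (the paper's exact criterion may differ): for each sample, the margin is the median distance to other-class points (median miss) minus the median distance to same-class points (median hit), so medians damp the influence of outliers.

    ```python
    import numpy as np

    def robust_margins(X, y):
        D = np.linalg.norm(X[:, None] - X[None], axis=-1)  # pairwise distances
        margins = []
        for i in range(len(X)):
            same = D[i, (y == y[i]) & (np.arange(len(X)) != i)]  # median hit pool
            other = D[i, y != y[i]]                              # median miss pool
            margins.append(np.median(other) - np.median(same))
        return np.array(margins)

    rng = np.random.default_rng(9)
    X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
    y = np.repeat([0, 1], 20)
    print(robust_margins(X, y).round(2))   # mostly positive: classes separate
    ```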

  8. Advances in the classification and treatment of mastocytosis

    DEFF Research Database (Denmark)

    Valent, Peter; Akin, Cem; Hartmann, Karin

    2017-01-01

    Mastocytosis is a term used to denote a heterogeneous group of conditions defined by the expansion and accumulation of clonal (neoplastic) tissue mast cells in various organs. The classification of the World Health Organization (WHO) divides the disease into cutaneous mastocytosis, systemic ... leukemia. The clinical impact and prognostic value of this classification have been confirmed in numerous studies, and its basic concept remains valid. However, refinements have recently been proposed by the consensus group, the WHO, and the European Competence Network on Mastocytosis. In addition, new ... of mastocytosis, with emphasis on classification, prognostication, and emerging new treatment options in advanced systemic mastocytosis.

  9. Lauren classification and individualized chemotherapy in gastric cancer.

    Science.gov (United States)

    Ma, Junli; Shen, Hong; Kapesa, Linda; Zeng, Shan

    2016-05-01

    Gastric cancer is one of the most common malignancies worldwide. During the last 50 years, the histological classification of gastric carcinoma has been largely based on Lauren's criteria, in which gastric cancer is classified into two major histological subtypes, namely intestinal type and diffuse type adenocarcinoma. This classification was introduced in 1965, and remains currently widely accepted and employed, since it constitutes a simple and robust classification approach. The two histological subtypes of gastric cancer proposed by the Lauren classification exhibit a number of distinct clinical and molecular characteristics, including histogenesis, cell differentiation, epidemiology, etiology, carcinogenesis, biological behaviors and prognosis. Gastric cancer exhibits varied sensitivity to chemotherapy drugs and significant heterogeneity; therefore, the disease may be a target for individualized therapy. The Lauren classification may provide the basis for individualized treatment for advanced gastric cancer, which is increasingly gaining attention in the scientific field. However, few studies have investigated individualized treatment that is guided by pathological classification. The aim of the current review is to analyze the two major histological subtypes of gastric cancer, as proposed by the Lauren classification, and to discuss the implications of this for personalized chemotherapy.

  10. Container Materials, Fabrication And Robustness

    International Nuclear Information System (INIS)

    Dunn, K.; Louthan, M.; Rawls, G.; Sindelar, R.; Zapp, P.; Mcclard, J.

    2009-01-01

    The multi-barrier 3013 container used to package plutonium-bearing materials is robust and thereby highly resistant to identified degradation modes that might cause failure. The only viable degradation mechanisms identified by a panel of technical experts were pressurization within and corrosion of the containers. Evaluations of the container materials and the fabrication processes and resulting residual stresses suggest that the multi-layered containers will mitigate the potential for degradation of the outer container and prevent the release of the container contents to the environment. Additionally, the ongoing surveillance programs and laboratory studies should detect any incipient degradation of containers in the 3013 storage inventory before an outer container is compromised.

  11. Robust matching for voice recognition

    Science.gov (United States)

    Higgins, Alan; Bahler, L.; Porter, J.; Blais, P.

    1994-10-01

    This paper describes an automated method of comparing a voice sample of an unknown individual with samples from known speakers in order to establish or verify the individual's identity. The method is based on a statistical pattern matching approach that employs a simple training procedure, requires no human intervention (transcription, word or phonetic marking, etc.), and makes no assumptions regarding the expected form of the statistical distributions of the observations. The content of the speech material (vocabulary, grammar, etc.) is not assumed to be constrained in any way. An algorithm is described which incorporates frame pruning and channel equalization processes designed to achieve robust performance with reasonable computational resources. An experimental implementation demonstrating the feasibility of the concept is described.

  12. Robustness Assessment of Spatial Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    2012-01-01

    Robustness of structural systems has obtained a renewed interest due to a much more frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure. In order to minimise the likelihood of such disproportionate structural failures, many modern building codes consider the need for robustness of structures and provide strategies and methods to obtain robustness. Therefore a structural engineer may take necessary steps to design robust structures that are insensitive to accidental circumstances. The present paper summarises issues with respect to robustness of spatial timber structures and discusses the consequences of such robustness issues for the future development of timber structures.

  13. Traffic sign classification with dataset augmentation and convolutional neural network

    Science.gov (United States)

    Tang, Qing; Kurnianggoro, Laksono; Jo, Kang-Hyun

    2018-04-01

    This paper presents a method for traffic sign classification using a convolutional neural network (CNN). In this method, we first convert a color image to grayscale and then normalize it to the range (-1, 1) as the preprocessing step. To increase the robustness of the classification model, we apply a dataset augmentation algorithm and create new images to train the model. To avoid overfitting, we utilize a dropout module before the last fully connected layer. To assess the performance of the proposed method, the German traffic sign recognition benchmark (GTSRB) dataset is utilized. Experimental results show that the method is effective in classifying traffic signs.
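
    A hedged sketch of the described preprocessing and network structure in PyTorch; the architecture details below (filter counts, layer sizes) are illustrative choices, not the paper's exact configuration, and augmentation (e.g. random rotations/translations) would be applied to the training set before this stage.

    ```python
    import numpy as np
    import torch
    import torch.nn as nn

    def preprocess(rgb: np.ndarray) -> torch.Tensor:
        # grayscale conversion, then scale [0, 255] -> (-1, 1)
        gray = rgb.astype(np.float32) @ np.array([0.299, 0.587, 0.114], np.float32)
        gray = gray / 127.5 - 1.0
        return torch.from_numpy(gray).unsqueeze(0)     # (1, H, W)

    class SignNet(nn.Module):
        def __init__(self, n_classes: int = 43):       # GTSRB has 43 classes
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(), nn.Linear(32 * 8 * 8, 128), nn.ReLU(),
                nn.Dropout(0.5),        # dropout before the last FC layer
                nn.Linear(128, n_classes),
            )

        def forward(self, x):
            return self.classifier(self.features(x))

    x = preprocess(np.random.randint(0, 256, (32, 32, 3), dtype=np.uint8))
    print(SignNet()(x.unsqueeze(0)).shape)             # torch.Size([1, 43])
    ```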

  14. Classification of Gait Types Based on the Duty-factor

    DEFF Research Database (Denmark)

    Fihl, Preben; Moeslund, Thomas B.

    2007-01-01

    ... on the speed of the human, the camera setup etc., and hence a robust descriptor for gait classification. The duty-factor is basically a matter of measuring the ground support of the feet with respect to the stride. We estimate this by comparing the incoming silhouettes to a database of silhouettes with known ground support. Silhouettes are extracted using the Codebook method and represented using Shape Contexts. The matching with database silhouettes is done using the Hungarian method. While manually estimated duty-factors show a clear classification, the presented system contains misclassifications due...
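
    A hedged sketch of the duty-factor itself, computed from a hypothetical per-frame ground-support series for one foot (the silhouette matching that estimates this series is not reproduced here). Walking gaits have duty factors above 0.5, running gaits below, which is what makes it a useful gait descriptor.

    ```python
    import numpy as np

    def duty_factor(support: np.ndarray) -> float:
        # fraction of stride frames during which the foot touches the ground
        return support.mean()

    walk = np.array([1] * 30 + [0] * 20)   # long ground contact -> 0.6
    run = np.array([1] * 15 + [0] * 35)    # short ground contact -> 0.3
    print(duty_factor(walk), duty_factor(run))  # walking > 0.5 > running
    ```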

  15. Cellular image classification

    CERN Document Server

    Xu, Xiang; Lin, Feng

    2017-01-01

    This book introduces new techniques for cellular image feature extraction, pattern recognition and classification. The authors use the antinuclear antibodies (ANAs) in patient serum as the subjects and the Indirect Immunofluorescence (IIF) technique as the imaging protocol to illustrate the applications of the described methods. Throughout the book, the authors provide evaluations for the proposed methods on two publicly available human epithelial (HEp-2) cell datasets: ICPR2012 dataset from the ICPR'12 HEp-2 cell classification contest and ICIP2013 training dataset from the ICIP'13 Competition on cells classification by fluorescent image analysis. First, the reading of imaging results is significantly influenced by one’s qualification and reading systems, causing high intra- and inter-laboratory variance. The authors present a low-order LP21 fiber mode for optical single cell manipulation and imaging staining patterns of HEp-2 cells. A focused four-lobed mode distribution is stable and effective in optical...

  16. Bosniak classification system

    DEFF Research Database (Denmark)

    Graumann, Ole; Osther, Susanne Sloth; Karstoft, Jens

    2016-01-01

    BACKGROUND: The Bosniak classification was originally based on computed tomographic (CT) findings. Magnetic resonance (MR) and contrast-enhanced ultrasonography (CEUS) imaging may demonstrate findings that are not depicted at CT, and there may not always be a clear correlation between the findings...... at MR and CEUS imaging and those at CT. PURPOSE: To compare diagnostic accuracy of MR, CEUS, and CT when categorizing complex renal cystic masses according to the Bosniak classification. MATERIAL AND METHODS: From February 2011 to June 2012, 46 complex renal cysts were prospectively evaluated by three...... readers. Each mass was categorized according to the Bosniak classification and CT was chosen as gold standard. Kappa was calculated for diagnostic accuracy and data was compared with pathological results. RESULTS: CT images found 27 BII, six BIIF, seven BIII, and six BIV. Forty-three cysts could...

  17. Bosniak Classification system

    DEFF Research Database (Denmark)

    Graumann, Ole; Osther, Susanne Sloth; Karstoft, Jens

    2014-01-01

    Background: The Bosniak classification is a diagnostic tool for the differentiation of cystic changes in the kidney. The process of categorizing renal cysts may be challenging, involving a series of decisions that may affect the final diagnosis and clinical outcome such as surgical management. Purpose: To investigate the inter- and intra-observer agreement among experienced uroradiologists when categorizing complex renal cysts according to the Bosniak classification. Material and Methods: The original categories of 100 cystic renal masses were chosen as “Gold Standard” (GS), established ... According to the calculated weighted κ, all readers performed “very good” for both inter-observer and intra-observer variation. Most variation was seen in cysts categorized as Bosniak II, IIF, and III. These results show that radiologists who evaluate complex renal cysts routinely may apply the Bosniak classification...

  18. Robust path planning for flexible needle insertion using Markov decision processes.

    Science.gov (United States)

    Tan, Xiaoyu; Yu, Pengqian; Lim, Kah-Bin; Chui, Chee-Kong

    2018-05-11

    The flexible needle has the potential to navigate accurately to a treatment region in the least invasive manner. We propose a new planning method using Markov decision processes (MDPs) for flexible needle navigation that can perform robust path planning and steering under the circumstance of complex tissue-needle interactions. This method enhances the robustness of flexible needle steering from three different perspectives. First, the method considers the problem caused by soft tissue deformation. The method then resolves the common needle penetration failure caused by patterns of targets, while the last solution addresses the uncertainty issues in flexible needle motion due to complex and unpredictable tissue-needle interaction. Computer simulation and phantom experimental results show that the proposed method can perform robust planning and generate a secure control policy for flexible needle steering. Compared with a traditional method using MDPs, the proposed method achieves higher accuracy and probability of success in avoiding obstacles under complicated and uncertain tissue-needle interactions. Future work will involve experiments with biological tissue in vivo. The proposed robust path planning method can securely steer the flexible needle within soft phantom tissues and achieve high adaptability in computer simulation.
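
    A hedged sketch of the MDP machinery on a toy 2-D tissue grid with stochastic motion (the paper's state, action and interaction models are far richer): value iteration produces a value function whose greedy policy steers around penalized obstacles toward a rewarded target even when moves can slip.

    ```python
    import numpy as np

    H, W = 10, 10
    target, obstacles = (9, 9), {(4, 4), (4, 5), (5, 4)}
    actions = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    gamma, slip = 0.95, 0.2          # discount; chance the needle deflects

    def step(s, a):
        # move within grid bounds
        r = min(max(s[0] + a[0], 0), H - 1)
        c = min(max(s[1] + a[1], 0), W - 1)
        return (r, c)

    V = np.zeros((H, W))
    for _ in range(200):              # value iteration to convergence
        for s in np.ndindex(H, W):
            if s == target:
                continue
            q = []
            for a in actions:
                val = 0.0
                for a2 in actions:    # intended move w.p. 1-slip, else slip
                    p = (1 - slip) if a2 == a else slip / (len(actions) - 1)
                    s2 = step(s, a2)
                    r = 1.0 if s2 == target else (-1.0 if s2 in obstacles else -0.01)
                    val += p * (r + gamma * V[s2])
                q.append(val)
            V[s] = max(q)
    print(np.round(V, 2))             # greedy policy w.r.t. V reaches the target
    ```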

  19. Acoustic classification of dwellings

    DEFF Research Database (Denmark)

    Berardi, Umberto; Rasmussen, Birgit

    2014-01-01

    Schemes for the classification of dwellings according to different building performances have been proposed in the last years worldwide. The general idea behind these schemes relates to the positive impact a higher label, and thus a better performance, should have. In particular, focusing on sound insulation performance, national schemes for sound classification of dwellings have been developed in several European countries. These schemes define acoustic classes according to different levels of sound insulation. Due to the lack of coordination among countries, a significant diversity in terms ... exchanging experiences about constructions fulfilling different classes, reducing trade barriers, and finally increasing the sound insulation of dwellings.

  20. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

    This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners will also find in the book a detailed presentation of practical data classifiers using MEE. These include multi-layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using an MEE-like concept is also presented. Examples, tests, evaluation experiments and comparisons with similar machines using classic approaches complement the descriptions.

  1. Classification of iconic images

    OpenAIRE

    Zrianina, Mariia; Kopf, Stephan

    2016-01-01

    Iconic images represent an abstract topic and use a presentation that is intuitively understood within a certain cultural context. For example, the abstract topic “global warming” may be represented by a polar bear standing alone on an ice floe. Such images are widely used in media and their automatic classification can help to identify high-level semantic concepts. This paper presents a system for the classification of iconic images. It uses a variation of the Bag of Visual Words approach wi...

  2. Casemix classification systems.

    Science.gov (United States)

    Fetter, R B

    1999-01-01

    The idea of using casemix classification to manage hospital services is not new, but has been limited by available technology. It was not until after the introduction of Medicare in the United States in 1965 that serious attempts were made to measure hospital production in order to contain spiralling costs. This resulted in a system of casemix classification known as diagnosis related groups (DRGs). This paper traces the development of DRGs and their evolution from the initial version to the All Patient Refined DRGs developed in 1991.

  3. ILAE Classification of the Epilepsies Position Paper of the ILAE Commission for Classification and Terminology

    Science.gov (United States)

    Scheffer, Ingrid E; Berkovic, Samuel; Capovilla, Giuseppe; Connolly, Mary B; French, Jacqueline; Guilhoto, Laura; Hirsch, Edouard; Jain, Satish; Mathern, Gary W.; Moshé, Solomon L; Nordli, Douglas R; Perucca, Emilio; Tomson, Torbjörn; Wiebe, Samuel; Zhang, Yue-Hua; Zuberi, Sameer M

    2017-01-01

    Summary The ILAE Classification of the Epilepsies has been updated to reflect our gain in understanding of the epilepsies and their underlying mechanisms following the major scientific advances which have taken place since the last ratified classification in 1989. As a critical tool for the practising clinician, epilepsy classification must be relevant and dynamic to changes in thinking, yet robust and translatable to all areas of the globe. Its primary purpose is for diagnosis of patients, but it is also critical for epilepsy research, development of antiepileptic therapies and communication around the world. The new classification originates from a draft document submitted for public comments in 2013 which was revised to incorporate extensive feedback from the international epilepsy community over several rounds of consultation. It presents three levels, starting with seizure type where it assumes that the patient is having epileptic seizures as defined by the new 2017 ILAE Seizure Classification. After diagnosis of the seizure type, the next step is diagnosis of epilepsy type, including focal epilepsy, generalized epilepsy, combined generalized and focal epilepsy, and also an unknown epilepsy group. The third level is that of epilepsy syndrome where a specific syndromic diagnosis can be made. The new classification incorporates etiology along each stage, emphasizing the need to consider etiology at each step of diagnosis as it often carries significant treatment implications. Etiology is broken into six subgroups, selected because of their potential therapeutic consequences. New terminology is introduced such as developmental and epileptic encephalopathy. The term benign is replaced by the terms self-limited and pharmacoresponsive, to be used where appropriate. It is hoped that this new framework will assist in improving epilepsy care and research in the 21st century. PMID:28276062

  4. Information gathering for CLP classification

    Directory of Open Access Journals (Sweden)

    Ida Marcello

    2011-01-01

    Regulation 1272/2008 includes provisions for two types of classification: harmonised classification and self-classification. The harmonised classification of substances is decided at Community level, and a list of harmonised classifications is included in Annex VI of the classification, labelling and packaging (CLP) Regulation. If a chemical substance is not included in the harmonised classification list, it must be self-classified, based on available information, according to the requirements of Annex I of the CLP Regulation. CLP provides that harmonised classification will be performed for carcinogenic, mutagenic or toxic to reproduction (CMR) substances and for respiratory sensitisers category 1, and for other hazard classes on a case-by-case basis. The first step of classification is the gathering of available and relevant information. This paper presents the procedure for gathering information and obtaining data. Data quality is also discussed.

  5. The paradox of atheoretical classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2016-01-01

    A distinction can be made between “artificial classifications” and “natural classifications,” where artificial classifications may adequately serve some limited purposes, but natural classifications are overall most fruitful by allowing inference and thus many different purposes. There is strong support for the view that a natural classification should be based on a theory (and, of course, that the most fruitful theory provides the most fruitful classification). Nevertheless, atheoretical (or “descriptive”) classifications are often produced. Paradoxically, atheoretical classifications may be very successful. The best example of a successful “atheoretical” classification is probably the prestigious Diagnostic and Statistical Manual of Mental Disorders (DSM) since its third edition from 1980. Based on such successes one may ask: Should the claim that classifications ideally are natural...

  6. Hyperspectral image classification using Support Vector Machine

    International Nuclear Information System (INIS)

    Moughal, T A

    2013-01-01

    Classification of land cover hyperspectral images is a very challenging task due to the unfavourable ratio between the number of spectral bands and the number of training samples. The focus in many applications is to investigate an effective classifier in terms of accuracy. Conventional multiclass classifiers have the ability to map the class of interest, but considerable effort and large training sets are required to fully describe the classes spectrally. The Support Vector Machine (SVM) is suggested in this paper to deal with the multiclass problem of hyperspectral imagery. The attraction of this method is that it locates the optimal hyperplane between the class of interest and the rest of the classes, separating them in a new high-dimensional feature space while taking into account only the training samples that lie on the edge of the class distributions, known as support vectors; the use of kernel functions makes the classifier more flexible by making it robust against outliers. A comparative study was undertaken to find an effective classifier by comparing the SVM to two other well-known classifiers, i.e. Maximum Likelihood (ML) and the Spectral Angle Mapper (SAM). First, the Minimum Noise Fraction (MNF) transform was applied to extract the best possible features from the hyperspectral imagery, and the resulting subset of features was then passed to the classifiers. Experimental results illustrate that the integration of MNF and SVM significantly reduces classification complexity and improves classification accuracy.
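
    To make the pipeline concrete, the sketch below follows the same two stages under stated assumptions: MNF is not available in scikit-learn, so PCA stands in for the feature-reduction step, and an RBF-kernel SVM then separates the classes. The data shapes, class count and hyperparameters are illustrative, not the paper's settings.

    ```python
    # Hedged sketch: PCA stands in for MNF; shapes and classes are illustrative.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 200))     # 1000 pixels, 200 spectral bands
    y = rng.integers(0, 6, size=1000)    # 6 land-cover classes

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    # Reduce the band space first, then separate classes with an RBF-kernel SVM.
    clf = make_pipeline(StandardScaler(), PCA(n_components=20),
                        SVC(kernel="rbf", C=10.0, gamma="scale"))
    clf.fit(X_tr, y_tr)
    print("accuracy:", clf.score(X_te, y_te))
    ```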

  7. Morphological classification of plant cell deaths.

    Science.gov (United States)

    van Doorn, W G; Beers, E P; Dangl, J L; Franklin-Tong, V E; Gallois, P; Hara-Nishimura, I; Jones, A M; Kawai-Yamada, M; Lam, E; Mundy, J; Mur, L A J; Petersen, M; Smertenko, A; Taliansky, M; Van Breusegem, F; Wolpert, T; Woltering, E; Zhivotovsky, B; Bozhkov, P V

    2011-08-01

    Programmed cell death (PCD) is an integral part of plant development and of responses to abiotic stress or pathogens. Although the morphology of plant PCD is, in some cases, well characterised and molecular mechanisms controlling plant PCD are beginning to emerge, there is still confusion about the classification of PCD in plants. Here we suggest a classification based on morphological criteria. According to this classification, the use of the term 'apoptosis' is not justified in plants, but at least two classes of PCD can be distinguished: vacuolar cell death and necrosis. During vacuolar cell death, the cell contents are removed by a combination of autophagy-like process and release of hydrolases from collapsed lytic vacuoles. Necrosis is characterised by early rupture of the plasma membrane, shrinkage of the protoplast and absence of vacuolar cell death features. Vacuolar cell death is common during tissue and organ formation and elimination, whereas necrosis is typically found under abiotic stress. Some examples of plant PCD cannot be ascribed to either major class and are therefore classified as separate modalities. These are PCD associated with the hypersensitive response to biotrophic pathogens, which can express features of both necrosis and vacuolar cell death, PCD in starchy cereal endosperm and during self-incompatibility. The present classification is not static, but will be subject to further revision, especially when specific biochemical pathways are better defined.

  8. Noise-robust speech triage.

    Science.gov (United States)

    Bartos, Anthony L; Cipr, Tomas; Nelson, Douglas J; Schwarz, Petr; Banowetz, John; Jerabek, Ladislav

    2018-04-01

    A method is presented in which conventional speech algorithms are applied, with no modifications, to improve their performance in extremely noisy environments. It has been demonstrated that, for eigen-channel algorithms, pre-training multiple speaker identification (SID) models at a lattice of signal-to-noise-ratio (SNR) levels and then performing SID using the appropriate SNR dependent model was successful in mitigating noise at all SNR levels. In those tests, it was found that SID performance was optimized when the SNR of the testing and training data were close or identical. In this current effort multiple i-vector algorithms were used, greatly improving both processing throughput and equal error rate classification accuracy. Using identical approaches in the same noisy environment, performance of SID, language identification, gender identification, and diarization were significantly improved. A critical factor in this improvement is speech activity detection (SAD) that performs reliably in extremely noisy environments, where the speech itself is barely audible. To optimize SAD operation at all SNR levels, two algorithms were employed. The first maximized detection probability at low levels (-10 dB ≤ SNR < +10 dB) using just the voiced speech envelope, and the second exploited features extracted from the original speech to improve overall accuracy at higher quality levels (SNR ≥ +10 dB).
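
    A minimal sketch of the SNR-matched-model idea described above, not the authors' i-vector system: one classifier is pre-trained per SNR level, and each test segment is routed to the model whose training SNR is closest to its estimated SNR. The logistic-regression classifier, the lattice values and all names are illustrative placeholders.

    ```python
    # Hedged sketch: train one model per SNR bin, route by estimated SNR.
    from sklearn.linear_model import LogisticRegression

    SNR_LATTICE = [-10, -5, 0, 5, 10, 20]     # dB levels used for pre-training

    def train_snr_models(features_by_snr, labels_by_snr):
        """features_by_snr[snr] -> (n_samples, n_dims) array for that SNR."""
        return {snr: LogisticRegression(max_iter=1000)
                     .fit(features_by_snr[snr], labels_by_snr[snr])
                for snr in SNR_LATTICE}

    def classify(models, features, estimated_snr):
        # Pick the model trained at the SNR closest to the test condition.
        snr = min(SNR_LATTICE, key=lambda s: abs(s - estimated_snr))
        return models[snr].predict(features)
    ```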

  9. Ecosystem classification, Chapter 2

    Science.gov (United States)

    M.J. Robin-Abbott; L.H. Pardo

    2011-01-01

    The ecosystem classification in this report is based on the ecoregions developed through the Commission for Environmental Cooperation (CEC) for North America (CEC 1997). Only ecosystems that occur in the United States are included. CEC ecoregions are described, with slight modifications, below (CEC 1997) and shown in Figures 2.1 and 2.2. We chose this ecosystem...

  10. The classification of phocomelia.

    Science.gov (United States)

    Tytherleigh-Strong, G; Hooper, G

    2003-06-01

    We studied 24 patients with 44 phocomelic upper limbs. Only 11 limbs could be grouped in the classification system of Frantz and O'Rahilly. The non-classifiable limbs were further studied and their characteristics identified. It is confirmed that phocomelia is not an intercalary defect.

  11. Principles for ecological classification

    Science.gov (United States)

    Dennis H. Grossman; Patrick Bourgeron; Wolf-Dieter N. Busch; David T. Cleland; William Platts; G. Ray; C. Robins; Gary Roloff

    1999-01-01

    The principal purpose of any classification is to relate common properties among different entities to facilitate understanding of evolutionary and adaptive processes. In the context of this volume, it is to facilitate ecosystem stewardship, i.e., to help support ecosystem conservation and management objectives.

  12. Mimicking human texture classification

    NARCIS (Netherlands)

    Rogowitz, B.E.; van Rikxoort, Eva M.; van den Broek, Egon; Pappas, T.N.; Schouten, Theo E.; Daly, S.J.

    2005-01-01

    In an attempt to mimic human (colorful) texture classification with a clustering algorithm, three lines of research were pursued, using as a test set 180 texture images (both their color and gray-scale equivalents) drawn from the OuTex and VisTex databases. First, a k-means algorithm was

  13. Classification, confusion and misclassification

    African Journals Online (AJOL)

    The classification of objects and phenomena in science and nature has fascinated academics since Carl Linnaeus, the Swedish botanist and zoologist, created his binomial description of living things in the 1700s and probably long before in accounts of others in textbooks long since gone. It must have concerned human ...

  14. Classifications in popular music

    NARCIS (Netherlands)

    van Venrooij, A.; Schmutz, V.; Wright, J.D.

    2015-01-01

    The categorical system of popular music, such as genre categories, is a highly differentiated and dynamic classification system. In this article we present work that studies different aspects of these categorical systems in popular music. Following the work of Paul DiMaggio, we focus on four

  15. Shark Teeth Classification

    Science.gov (United States)

    Brown, Tom; Creel, Sally; Lee, Velda

    2009-01-01

    On a recent autumn afternoon at Harmony Leland Elementary in Mableton, Georgia, students in a fifth-grade science class investigated the essential process of classification--the act of putting things into groups according to some common characteristics or attributes. While they may have honed these skills earlier in the week by grouping their own…

  16. Text document classification

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana

    č. 62 (2005), s. 53-54 ISSN 0926-4981 R&D Projects: GA AV ČR IAA2075302; GA AV ČR KSK1019101; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : document representation * categorization * classification Subject RIV: BD - Theory of Information

  17. Classification in Medical Imaging

    DEFF Research Database (Denmark)

    Chen, Chen

    Classification is extensively used in the context of medical image analysis for the purpose of diagnosis or prognosis. In order to classify image content correctly, one needs to extract efficient features with discriminative properties and build classifiers based on these features. In addition ... on characterizing human faces and emphysema disease in lung CT images.

  18. Improving Student Question Classification

    Science.gov (United States)

    Heiner, Cecily; Zachary, Joseph L.

    2009-01-01

    Students in introductory programming classes often articulate their questions and information needs incompletely. Consequently, the automatic classification of student questions to provide automated tutorial responses is a challenging problem. This paper analyzes 411 questions from an introductory Java programming course by reducing the natural…

  19. NOUN CLASSIFICATION IN ESAHIE

    African Journals Online (AJOL)

    The present work deals with noun classification in Esahie (Kwa, Niger ... phonological information influences the noun (form) class system of Esahie. ... between noun classes and (grammatical) Gender is interrogated (in the light of ... the (A) argument precedes the verb and the (P) argument follows the verb in a simple...

  20. Dynamic Latent Classification Model

    DEFF Research Database (Denmark)

    Zhong, Shengtong; Martínez, Ana M.; Nielsen, Thomas Dyhre

    as possible. Motivated by this problem setting, we propose a generative model for dynamic classification in continuous domains. At each time point the model can be seen as combining a naive Bayes model with a mixture of factor analyzers (FA). The latent variables of the FA are used to capture the dynamics...

  1. Classification of myocardial infarction

    DEFF Research Database (Denmark)

    Saaby, Lotte; Poulsen, Tina Svenstrup; Hosbond, Susanne Elisabeth

    2013-01-01

    The classification of myocardial infarction into 5 types was introduced in 2007 as an important component of the universal definition. In contrast to the plaque rupture-related type 1 myocardial infarction, type 2 myocardial infarction is considered to be caused by an imbalance between demand...

  2. Event Classification using Concepts

    NARCIS (Netherlands)

    Boer, M.H.T. de; Schutte, K.; Kraaij, W.

    2013-01-01

    The semantic gap is one of the challenges in the GOOSE project. In this paper a Semantic Event Classification (SEC) system is proposed as an initial step in tackling the semantic gap challenge in the GOOSE project. This system uses semantic text analysis, multiple feature detectors using the BoW

  3. Robust holographic storage system design.

    Science.gov (United States)

    Watanabe, Takahiro; Watanabe, Minoru

    2011-11-21

    Demand is increasing daily for large data storage systems that are useful for applications in spacecraft, space satellites, and space robots, which are all exposed to the radiation-rich space environment. As candidates for use in space embedded systems, holographic storage systems are promising because they can easily provide the demanded large storage capability. In particular, holographic storage systems with no rotation mechanism are in demand because they are virtually maintenance-free. Although a holographic memory itself is an extremely robust device even in a space radiation environment, its associated lasers and drive circuit devices are vulnerable. Such vulnerabilities sometimes engender severe problems that prevent reading of the entire contents of the holographic memory, which is a turn-off failure mode of a laser array. This paper therefore presents a proposal for a recovery method for the turn-off failure mode of a laser array on a holographic storage system, and describes results of an experimental demonstration.

  4. Efficient robust conditional random fields.

    Science.gov (United States)

    Song, Dongjin; Liu, Wei; Zhou, Tianyi; Tao, Dacheng; Meyer, David A

    2015-10-01

    Conditional random fields (CRFs) are a flexible yet powerful probabilistic approach and have shown advantages for popular applications in various areas, including text analysis, bioinformatics, and computer vision. Traditional CRF models, however, are incapable of selecting relevant features as well as suppressing noise from noisy original features. Moreover, conventional optimization methods often converge slowly in solving the training procedure of CRFs, and will degrade significantly for tasks with a large number of samples and features. In this paper, we propose robust CRFs (RCRFs) to simultaneously select relevant features. An optimal gradient method (OGM) is further designed to train RCRFs efficiently. Specifically, the proposed RCRFs employ the l1 norm of the model parameters to regularize the objective used by traditional CRFs, therefore enabling discovery of the relevant unary features and pairwise features of CRFs. In each iteration of OGM, the gradient direction is determined jointly by the current gradient together with the historical gradients, and the Lipschitz constant is leveraged to specify the proper step size. We show that an OGM can tackle the RCRF model training very efficiently, achieving the optimal convergence rate O(1/k²) (where k is the number of iterations). This convergence rate is theoretically superior to the convergence rate O(1/k) of previous first-order optimization methods. Extensive experiments performed on three practical image segmentation tasks demonstrate the efficacy of OGM in training our proposed RCRFs.
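
    The essence of an optimal (Nesterov-type) gradient method for an l1-regularized objective can be sketched as an accelerated proximal-gradient loop with soft-thresholding as the proximal step, which attains the O(1/k²) rate. This is a generic sketch under stated assumptions, not the authors' OGM implementation; `grad_nll` is a hypothetical stand-in for the gradient of the CRF negative log-likelihood.

    ```python
    # Hedged sketch of accelerated proximal gradient (FISTA-style) for an
    # l1-regularized smooth loss; not the paper's code.
    import numpy as np

    def soft_threshold(w, t):
        return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

    def accelerated_l1(grad_nll, dim, lam, lipschitz, iters=200):
        w, v, t = np.zeros(dim), np.zeros(dim), 1.0
        for _ in range(iters):
            # Gradient step on the smooth part, then l1 proximal step.
            w_next = soft_threshold(v - grad_nll(v) / lipschitz, lam / lipschitz)
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            v = w_next + ((t - 1.0) / t_next) * (w_next - w)   # momentum step
            w, t = w_next, t_next
        return w
    ```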

  5. NEW CLASSIFICATION OF ECOPOLICES

    Directory of Open Access Journals (Sweden)

    VOROBYOV V. V.

    2016-09-01

    Problem statement. Ecopolices are the newest stage of urban planning. They have to be considered as material, energy and informational structures included in the dynamic-evolutionary matrix networks of exchange processes in ecosystems. However, ecopolice classifications developed on the basis of such approaches do not yet exist, which determines the topicality of this article. An analysis of publications on theoretical and applied aspects of ecopolice formation showed that work on them is conducted mainly in the context of the latest scientific and technological achievements in various fields of knowledge. These settlements are technocratic. They are tied to the morphology of space and to the network structures of regional and local natural ecosystems; lacking independent stability, they cannot exist without continuous human support. In other words, they do not live up to the idea of an ecopolice. An objective, symbiotic search for the ecopolice concept, together with the development of classifications, is therefore urgently needed. Purpose statement: to develop an objective rationale for ecopolices and to propose their new classification. Conclusion. At the base of an ecopolice classification should lie the idea of correlating the elements of their general plans and the types of human activity with the natural mechanisms of receiving, processing and transmitting matter, energy and information between geo-ecosystems, the planet, man, the material part of the ecopolice and the Cosmos. A new ecopolice classification should be based on the principles of multi-dimensional, time-spaced symbiotic coherence with ecosystem exchange networks. With this approach, the ecopolice functions proceed not from a subjective anthropocentric economy but from the holistic objective paradigm of Genesis; in other words, not from the Consequence, but from the Cause.

  6. Efficient Fingercode Classification

    Science.gov (United States)

    Sun, Hong-Wei; Law, Kwok-Yan; Gollmann, Dieter; Chung, Siu-Leung; Li, Jian-Bin; Sun, Jia-Guang

    In this paper, we present an efficient fingerprint classification algorithm which is an essential component in many critical security application systems, e.g. systems in the e-government and e-finance domains. Fingerprint identification is one of the most important security requirements in homeland security systems such as personnel screening and anti-money laundering. The problem of fingerprint identification involves searching (matching) the fingerprint of a person against each of the fingerprints of all registered persons. To enhance performance and reliability, a common approach is to reduce the search space by firstly classifying the fingerprints and then performing the search in the respective class. Jain et al. proposed a fingerprint classification algorithm based on a two-stage classifier, which uses a K-nearest neighbor classifier in its first stage. The fingerprint classification algorithm is based on the fingercode representation, which is an encoding of fingerprints that has been demonstrated to be an effective fingerprint biometric scheme because of its ability to capture both local and global details in a fingerprint image. We enhance this approach by improving the efficiency of the K-nearest neighbor classifier for fingercode-based fingerprint classification. Our research firstly investigates the various fast search algorithms in vector quantization (VQ) and their potential application in fingerprint classification, and then proposes two efficient algorithms based on the pyramid-based search algorithms in VQ. Experimental results on DB1 of FVC 2004 demonstrate that our algorithms can outperform the full search algorithm and the original pyramid-based search algorithms in terms of computational efficiency without sacrificing accuracy.
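
    The following sketch illustrates the general family of fast-search ideas behind such VQ-style nearest-neighbor classifiers, using partial-distance elimination (early abandoning); it is not the cited pyramid-based algorithm, and all names are illustrative.

    ```python
    # Hedged sketch: partial-distance elimination abandons a candidate as
    # soon as its running squared distance exceeds the best found so far.
    import numpy as np

    def nearest_with_pde(query, codebook):
        best_idx, best_d2 = -1, np.inf
        for i, code in enumerate(codebook):
            d2 = 0.0
            for q, c in zip(query, code):
                d2 += (q - c) ** 2
                if d2 >= best_d2:      # early abandon: cannot beat current best
                    break
            else:
                best_idx, best_d2 = i, d2
        return best_idx, best_d2
    ```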

  7. Differential Classification of Dementia

    Directory of Open Access Journals (Sweden)

    E. Mohr

    1995-01-01

    In the absence of biological markers, dementia classification remains complex both in terms of characterization as well as early detection of the presence or absence of dementing symptoms, particularly in diseases with possible secondary dementia. An empirical, statistical approach using neuropsychological measures was therefore developed to distinguish demented from non-demented patients and to identify differential patterns of cognitive dysfunction in neurodegenerative disease. Age-scaled neurobehavioral test results (Wechsler Adult Intelligence Scale-Revised and Wechsler Memory Scale) from Alzheimer's (AD) and Huntington's (HD) patients, matched for intellectual disability, as well as normal controls were used to derive a classification formula. Stepwise discriminant analysis accurately (99% correct) distinguished controls from demented patients, and separated the two patient groups (79% correct). Variables discriminating between HD and AD patient groups consisted of complex psychomotor tasks, visuospatial function, attention and memory. The reliability of the classification formula was demonstrated with a new, independent sample of AD and HD patients which yielded virtually identical results (classification accuracy for dementia: 96%; AD versus HD: 78%). To validate the formula, the discriminant function was applied to Parkinson's (PD) patients, 38% of whom were classified as demented. The validity of the classification was demonstrated by significant PD subgroup differences on measures of dementia not included in the discriminant function. Moreover, a majority of demented PD patients (65%) were classified as having an HD-like pattern of cognitive deficits, in line with previous reports of the subcortical nature of PD dementia. This approach may thus be useful in classifying presence or absence of dementia and in discriminating between dementia subtypes in cases of secondary or coincidental dementia.
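
    As a rough illustration of the statistical machinery described above (not the authors' formula), stepwise feature selection can be wrapped around a linear discriminant classifier; the data below are random placeholders and the feature count is illustrative.

    ```python
    # Hedged sketch: forward stepwise selection around an LDA classifier.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.feature_selection import SequentialFeatureSelector

    rng = np.random.default_rng(1)
    scores = rng.normal(size=(120, 12))     # age-scaled test scores (placeholder)
    group = rng.integers(0, 2, size=120)    # 0 = AD-like, 1 = HD-like (placeholder)

    lda = LinearDiscriminantAnalysis()
    selector = SequentialFeatureSelector(lda, n_features_to_select=5,
                                         direction="forward")
    selector.fit(scores, group)
    lda.fit(selector.transform(scores), group)
    print("selected features:", np.flatnonzero(selector.get_support()))
    ```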

  8. Robust AIC with High Breakdown Scale Estimate

    Directory of Open Access Journals (Sweden)

    Shokrya Saleh

    2014-01-01

    Akaike Information Criterion (AIC) based on least squares (LS) regression minimizes the sum of the squared residuals; LS is sensitive to outlier observations. Alternative criteria, which are less sensitive to outlying observations, have been proposed; examples are robust AIC (RAIC), robust Mallows Cp (RCp), and robust Bayesian information criterion (RBIC). In this paper, we propose a robust AIC by replacing the scale estimate with a high breakdown point estimate of scale. The robustness of the proposed method is studied through its influence function. We show that the proposed robust AIC is effective in selecting accurate models in the presence of outliers and high leverage points, through simulated and real data examples.
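
    A minimal sketch of the idea, assuming the median absolute deviation (MAD) as the high-breakdown scale estimate (the paper's exact estimator may differ): the robust AIC keeps the classical form but plugs the robust scale of the residuals into the log-likelihood term.

    ```python
    # Hedged sketch: classical AIC with a high-breakdown scale (MAD) plugged in.
    import numpy as np

    def robust_aic(residuals, n_params):
        n = len(residuals)
        # 1.4826 makes the MAD consistent for the Gaussian standard deviation.
        mad = 1.4826 * np.median(np.abs(residuals - np.median(residuals)))
        return n * np.log(mad ** 2) + 2 * n_params
    ```

    Candidate fits would then be compared by choosing the one with the smallest robust AIC value.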

  9. 78 FR 54970 - Cotton Futures Classification: Optional Classification Procedure

    Science.gov (United States)

    2013-09-09

    ... Service 7 CFR Part 27 [AMS-CN-13-0043] RIN 0581-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing Service, USDA. ACTION: Proposed rule. SUMMARY: The... optional cotton futures classification procedure--identified and known as ``registration'' by the U.S...

  10. Joint Concept Correlation and Feature-Concept Relevance Learning for Multilabel Classification.

    Science.gov (United States)

    Zhao, Xiaowei; Ma, Zhigang; Li, Zhi; Li, Zhihui

    2018-02-01

    In recent years, multilabel classification has attracted significant attention in multimedia annotation. However, most multilabel classification methods focus only on the inherent correlations existing among multiple labels and concepts, and ignore the relevance between features and the target concepts. To obtain more robust multilabel classification results, we propose a new multilabel classification method aiming to capture the correlations among multiple concepts by leveraging a hypergraph, which has been shown to be beneficial for relational learning. Moreover, we consider mining feature-concept relevance, which is often overlooked by many multilabel learning algorithms. To better show the feature-concept relevance, we impose a sparsity constraint on the proposed method. We compare the proposed method with several other multilabel classification methods and evaluate the classification performance by mean average precision on several data sets. The experimental results show that the proposed method outperforms the state-of-the-art methods.

  11. Automated Feature Design for Time Series Classification by Genetic Programming

    OpenAIRE

    Harvey, Dustin Yewell

    2014-01-01

    Time series classification (TSC) methods discover and exploit patterns in time series and other one-dimensional signals. Although many accurate, robust classifiers exist for multivariate feature sets, general approaches are needed to extend machine learning techniques to make use of signal inputs. Numerous applications of TSC can be found in structural engineering, especially in the areas of structural health monitoring and non-destructive evaluation. Additionally, the fields of process contr...

  12. Multimodel Robust Control for Hydraulic Turbine

    OpenAIRE

    Osuský, Jakub; Števo, Stanislav

    2014-01-01

    The paper deals with the multimodel and robust control system design and their combination based on M-Δ structure. Controller design will be done in the frequency domain with nominal performance specified by phase margin. Hydraulic turbine model is analyzed as system with unstructured uncertainty, and robust stability condition is included in controller design. Multimodel and robust control approaches are presented in detail on hydraulic turbine model. Control design approaches are compared a...

  13. Forecasting exchange rates: a robust regression approach

    OpenAIRE

    Preminger, Arie; Franck, Raphael

    2005-01-01

    The least squares (LS) estimation method, as well as other ordinary estimation methods for regression models, can be severely affected by a small number of outliers, thus providing poor out-of-sample forecasts. This paper suggests a robust regression approach, based on the S-estimation method, to construct forecasting models that are less sensitive to data contamination by outliers. A robust linear autoregressive (RAR) and a robust neural network (RNN) model are estimated to study the predictabil...

  14. 32 CFR 2700.22 - Classification guides.

    Science.gov (United States)

    2010-07-01

    ... SECURITY INFORMATION REGULATIONS Derivative Classification § 2700.22 Classification guides. OMSN shall... direct derivative classification, shall identify the information to be protected in specific and uniform...

  15. Robustness: confronting lessons from physics and biology.

    Science.gov (United States)

    Lesne, Annick

    2008-11-01

    The term robustness is encountered in very different scientific fields, from engineering and control theory to dynamical systems to biology. The main question addressed herein is whether the notion of robustness and its correlates (stability, resilience, self-organisation) developed in physics are relevant to biology, or whether specific extensions and novel frameworks are required to account for the robustness properties of living systems. To clarify this issue, the different meanings covered by this unique term are discussed; it is argued that they crucially depend on the kind of perturbations that a robust system should by definition withstand. Possible mechanisms underlying robust behaviours are examined, either encountered in all natural systems (symmetries, conservation laws, dynamic stability) or specific to biological systems (feedbacks and regulatory networks). Special attention is devoted to the (sometimes counterintuitive) interrelations between robustness and noise. A distinction between dynamic selection and natural selection in the establishment of a robust behaviour is underlined. It is finally argued that nested notions of robustness, relevant to different time scales and different levels of organisation, allow one to reconcile the seemingly contradictory requirements for robustness and adaptability in living systems.

  16. Robustness of Long Span Reciprocal Timber Structures

    DEFF Research Database (Denmark)

    Balfroid, Nathalie; Kirkegaard, Poul Henning

    2011-01-01

    Robustness of structural systems has obtained a renewed interest due to a much more frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure. The interest has also been facilitated by recent severe structural failures ... the engineer may take necessary steps to design robust structures that are insensitive to accidental circumstances. The present paper discusses such robustness issues related to the future development of reciprocal timber structures. The paper concludes that these kinds of structures can have a potential as long-span timber structures in real projects if they are carefully designed with respect to the overall robustness strategies.

  17. Biogeographic classification of the Caspian Sea

    Science.gov (United States)

    Fendereski, F.; Vogt, M.; Payne, M. R.; Lachkar, Z.; Gruber, N.; Salmanmahiny, A.; Hosseini, S. A.

    2014-11-01

    Like other inland seas, the Caspian Sea (CS) has been influenced by climate change and anthropogenic disturbance during recent decades, yet the scientific understanding of this water body remains poor. In this study, an eco-geographical classification of the CS based on physical information derived from space and in situ data is developed and tested against a set of biological observations. We used a two-step classification procedure, consisting of (i) a data reduction with self-organizing maps (SOMs) and (ii) a synthesis of the most relevant features into a reduced number of marine ecoregions using the hierarchical agglomerative clustering (HAC) method. From an initial set of 12 potential physical variables, 6 independent variables were selected for the classification algorithm, i.e., sea surface temperature (SST), bathymetry, sea ice, seasonal variation of sea surface salinity (DSSS), total suspended matter (TSM) and its seasonal variation (DTSM). The classification results reveal a robust separation between the northern and the middle/southern basins as well as a separation of the shallow nearshore waters from those offshore. The observed patterns in ecoregions can be attributed to differences in climate and geochemical factors such as distance from river, water depth and currents. A comparison of the annual and monthly mean Chl a concentrations between the different ecoregions shows significant differences (one-way ANOVA, P < 0.05). A qualitative evaluation of differences in community composition based on recorded presence-absence patterns of 25 different species of plankton, fish and benthic invertebrates also confirms the relevance of the ecoregions as proxies for habitats with common biological characteristics.
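
    A compact sketch of the two-step procedure (data reduction with a SOM, then HAC on the node prototypes), assuming the third-party minisom package, which is not part of the study; variable counts, grid size and cluster number are illustrative.

    ```python
    # Hedged sketch: SOM data reduction followed by hierarchical clustering.
    import numpy as np
    from minisom import MiniSom
    from scipy.cluster.hierarchy import fcluster, linkage

    rng = np.random.default_rng(0)
    pixels = rng.normal(size=(5000, 6))   # SST, depth, ice, DSSS, TSM, DTSM

    som = MiniSom(10, 10, input_len=6, sigma=1.0, learning_rate=0.5,
                  random_seed=0)
    som.train_random(pixels, num_iteration=10000)

    prototypes = som.get_weights().reshape(-1, 6)            # 100 SOM nodes
    tree = linkage(prototypes, method="ward")
    node_region = fcluster(tree, t=4, criterion="maxclust")  # 4 ecoregions

    # Map each pixel to its best-matching node, then to that node's ecoregion.
    winners = [som.winner(p) for p in pixels]
    pixel_region = np.array([node_region[i * 10 + j] for i, j in winners])
    ```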

  18. Robust optimization based upon statistical theory.

    Science.gov (United States)

    Sobotta, B; Söhn, M; Alber, M

    2010-08-01

    Organ movement is still the biggest challenge in cancer treatment despite advances in online imaging. Due to the resulting geometric uncertainties, the delivered dose cannot be predicted precisely at treatment planning time. Consequently, all associated dose metrics (e.g., EUD and maxDose) are random variables with a patient-specific probability distribution. The method that the authors propose makes these distributions the basis of the optimization and evaluation process. The authors start from a model of motion derived from patient-specific imaging. On a multitude of geometry instances sampled from this model, a dose metric is evaluated. The resulting pdf of this dose metric is termed outcome distribution. The approach optimizes the shape of the outcome distribution based on its mean and variance. This is in contrast to the conventional optimization of a nominal value (e.g., PTV EUD) computed on a single geometry instance. The mean and variance allow for an estimate of the expected treatment outcome along with the residual uncertainty. Besides being applicable to the target, the proposed method also seamlessly includes the organs at risk (OARs). The likelihood that a given value of a metric is reached in the treatment is predicted quantitatively. This information reveals potential hazards that may occur during the course of the treatment, thus helping the expert to find the right balance between the risk of insufficient normal tissue sparing and the risk of insufficient tumor control. By feeding this information to the optimizer, outcome distributions can be obtained where the probability of exceeding a given OAR maximum and that of falling short of a given target goal can be minimized simultaneously. The method is applicable to any source of residual motion uncertainty in treatment delivery. Any model that quantifies organ movement and deformation in terms of probability distributions can be used as basis for the algorithm. Thus, it can generate dose
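
    The following toy sketch captures the outcome-distribution idea: a dose metric is evaluated on many geometry instances sampled from a motion model, and the objective scores the distribution by its mean and spread rather than a single nominal value. The metric, motion model and risk weight are hypothetical placeholders, not the authors' implementation.

    ```python
    # Hedged sketch: optimize the shape of the dose-metric outcome distribution.
    import numpy as np

    def outcome_objective(plan, sample_geometry, dose_metric,
                          n_samples=500, risk_weight=1.0, rng=None):
        if rng is None:
            rng = np.random.default_rng()
        samples = np.array([dose_metric(plan, sample_geometry(rng))
                            for _ in range(n_samples)])
        # Reward a high expected metric, penalize residual uncertainty,
        # instead of scoring a single nominal geometry instance.
        return -(samples.mean() - risk_weight * samples.std())
    ```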

  19. A new circulation type classification based upon Lagrangian air trajectories

    Directory of Open Access Journals (Sweden)

    Alexandre M. Ramos

    2014-10-01

    A new classification method of the large-scale circulation characteristic for a specific target area (NW Iberian Peninsula) is presented, based on the analysis of 90-h backward trajectories arriving in this area, calculated with the 3-D Lagrangian particle dispersion model FLEXPART. A cluster analysis is applied to separate the backward trajectories into up to five representative air streams for each day. Specific measures are then used to characterise the distinct air streams (e.g., curvature of the trajectories, cyclonic or anticyclonic flow, moisture evolution, origin and length of the trajectories). The robustness of the presented method is demonstrated in comparison with the Eulerian Lamb weather type classification. A case study of the 2003 heatwave is discussed in terms of the new Lagrangian circulation and the Lamb weather type classifications. It is shown that the new classification method adds valuable information about the pertinent meteorological conditions, which are missing in an Eulerian approach. The new method is climatologically evaluated for the five-year time period from December 1999 to November 2004. The ability of the method to capture the inter-seasonal circulation variability in the target region is shown. Furthermore, the multi-dimensional character of the classification is briefly discussed, in particular with respect to inter-seasonal differences. Finally, the relationship between the new Lagrangian classification and the precipitation in the target area is studied.
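
    A minimal sketch of the clustering step, assuming trajectories are already available as position arrays (FLEXPART input/output handling is omitted); k-means is used here as a generic stand-in for the paper's cluster analysis, and the array shapes are illustrative.

    ```python
    # Hedged sketch: group one day's backward trajectories into air streams.
    import numpy as np
    from sklearn.cluster import KMeans

    def cluster_air_streams(trajectories, n_streams=5):
        """trajectories: (n_traj, n_steps, 3) array of lon, lat, pressure."""
        n_traj = trajectories.shape[0]
        flat = trajectories.reshape(n_traj, -1)   # one vector per trajectory
        labels = KMeans(n_clusters=min(n_streams, n_traj),
                        n_init=10, random_state=0).fit_predict(flat)
        return labels
    ```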

  20. IAEA Classification of Uranium Deposits

    International Nuclear Information System (INIS)

    Bruneton, Patrice

    2014-01-01

    Classifications of uranium deposits follow two general approaches, focusing either on descriptive features such as the geotectonic position, the host rock type and the orebody morphology ("geologic classification"), or on genetic aspects ("genetic classification").

  1. Classification of Osteogenesis Imperfecta revisited

    NARCIS (Netherlands)

    van Dijk, F. S.; Pals, G.; van Rijn, R. R.; Nikkels, P. G. J.; Cobben, J. M.

    2010-01-01

    In 1979 Sillence proposed a classification of Osteogenesis Imperfecta (OI) in OI types I, II, III and IV. In 2004 and 2007 this classification was expanded with OI types V-VIII because of distinct clinical features and/or different causative gene mutations. We propose a revised classification of OI

  2. The future of general classification

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2013-01-01

    Discusses problems related to accessing multiple collections using a single retrieval language. Surveys the concepts of interoperability and switching language. Finds that mapping between multiple indexing languages will always be an approximation. Surveys the issues related to general classification and contrasts them with special classifications. Argues for the use of general classifications to provide access to collections nationally and internationally.

  3. Proposal of a new classification scheme for periocular injuries

    Directory of Open Access Journals (Sweden)

    Devi Prasad Mohapatra

    2017-01-01

    Background: Eyelids are important structures that protect the globe from trauma and brightness, maintain the integrity of the tear film, move tears towards the lacrimal drainage system, and contribute to the aesthetic appearance of the face. Ophthalmic trauma is an important cause of morbidity among individuals and is also responsible for additional healthcare costs. Periocular trauma involving the eyelids and adjacent structures has increased recently, probably due to the increased pace of life and increased dependence on machinery. A comprehensive classification of periocular trauma would help in stratifying these injuries as well as in studying outcomes. Material and Methods: This study was carried out at our institute from June 2015 to December 2015. We searched multiple English-language databases for existing classification systems for periocular trauma and designed a system of classification of periocular soft tissue injuries based on clinico-anatomical presentations. This classification was applied prospectively to patients presenting with periocular soft tissue injuries to our department. Results: A comprehensive classification scheme was designed, consisting of five types of periocular injuries. A total of 38 eyelid injuries in 34 patients were evaluated in this study. According to the System for Peri-Ocular Trauma (SPOT) classification, Type V injuries were most common; SPOT Type II injuries were the most common isolated injuries among all zones. Discussion: Classification systems are necessary in order to provide a framework in which to scientifically study the etiology, pathogenesis, and treatment of diseases in an orderly fashion. The SPOT classification takes into account periocular soft tissue injuries, i.e., upper eyelid, lower eyelid, and medial and lateral canthus injuries, based on observed clinico-anatomical patterns of eyelid injuries. Conclusion: The SPOT classification seems to be a reliable

  4. Remote Sensing Image Classification Based on Stacked Denoising Autoencoder

    Directory of Open Access Journals (Sweden)

    Peng Liang

    2017-12-01

    To address the accuracy bottleneck of conventional remote sensing image classification methods, a new classification method inspired by deep learning is proposed, based on the Stacked Denoising Autoencoder. First, the deep network model is built by stacking Denoising Autoencoder layers. Then, with noised inputs, an unsupervised greedy layer-wise training algorithm is used to train each layer in turn to obtain more robust representations; features are then refined by supervised learning with a Back Propagation (BP) neural network, and the whole network is optimized by error back propagation. Finally, Gaofen-1 satellite (GF-1) remote sensing data are used for evaluation, and the total accuracy and kappa accuracy reach 95.7% and 0.955, respectively, which are higher than those of the Support Vector Machine and the BP neural network. The experimental results show that the proposed method can effectively improve the accuracy of remote sensing image classification.
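
    A condensed sketch of greedy layer-wise denoising pretraining, written in PyTorch as an assumption (the paper does not name a framework); layer sizes, the noise level and epoch count are illustrative, and the supervised BP fine-tuning stage is omitted.

    ```python
    # Hedged sketch: each layer reconstructs clean inputs from corrupted ones.
    import torch
    import torch.nn as nn

    def pretrain_dae_layer(data, in_dim, hid_dim, noise=0.2, epochs=20):
        enc, dec = nn.Linear(in_dim, hid_dim), nn.Linear(hid_dim, in_dim)
        opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()))
        for _ in range(epochs):
            noisy = data + noise * torch.randn_like(data)    # corrupt input
            recon = torch.sigmoid(dec(torch.sigmoid(enc(noisy))))
            loss = nn.functional.mse_loss(recon, data)       # reconstruct clean
            opt.zero_grad(); loss.backward(); opt.step()
        return enc

    # Greedy stacking: pretrain each layer on the previous layer's codes;
    # supervised fine-tuning of the whole stack would follow (omitted here).
    x = torch.rand(1024, 64)
    encoders, dims = [], [64, 32, 16]
    for d_in, d_out in zip(dims[:-1], dims[1:]):
        enc = pretrain_dae_layer(x, d_in, d_out)
        encoders.append(enc)
        with torch.no_grad():
            x = torch.sigmoid(enc(x))
    ```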

  5. Completed Local Ternary Pattern for Rotation Invariant Texture Classification

    Directory of Open Access Journals (Sweden)

    Taha H. Rassem

    2014-01-01

    Despite the fact that the two texture descriptors, the Completed modeling of the Local Binary Pattern (CLBP) and the Completed Local Binary Count (CLBC), have achieved remarkable accuracy for rotation invariant texture classification, they inherit some drawbacks of the Local Binary Pattern (LBP). The LBP is sensitive to noise, and different patterns of LBP may be classified into the same class, which reduces its discriminating property. Although the Local Ternary Pattern (LTP) was proposed to be more robust to noise than LBP, this weakness may appear with the LTP as well. In this paper, a novel completed modeling of the Local Ternary Pattern (LTP) operator is proposed to overcome both LBP drawbacks, and an associated Completed Local Ternary Pattern (CLTP) scheme is developed for rotation invariant texture classification. The experimental results using four different texture databases show that the proposed CLTP achieves impressive classification accuracy compared to the CLBP and CLBC descriptors.
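
    For orientation, the basic LTP split that CLTP builds on can be sketched as follows: neighbours within ±t of the centre pixel map to 0, and the resulting ternary code is divided into "upper" and "lower" binary patterns. The completed sign/magnitude/centre components of CLTP itself are not shown; the threshold t and the 3×3 neighbourhood are illustrative choices.

    ```python
    # Hedged sketch of the basic Local Ternary Pattern (upper/lower split).
    import numpy as np

    def ltp_codes(img, t=5):
        h, w = img.shape
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]   # 8-neighbourhood
        upper = np.zeros((h - 2, w - 2), dtype=np.uint8)
        lower = np.zeros((h - 2, w - 2), dtype=np.uint8)
        centre = img[1:-1, 1:-1].astype(np.int32)
        for bit, (dy, dx) in enumerate(offsets):
            nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx].astype(np.int32)
            upper |= ((nb >= centre + t).astype(np.uint8) << bit)  # +1 states
            lower |= ((nb <= centre - t).astype(np.uint8) << bit)  # -1 states
        return upper, lower
    ```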

  6. Generic and robust method for automatic segmentation of PET images using an active contour model

    Energy Technology Data Exchange (ETDEWEB)

    Zhuang, Mingzan [Department of Nuclear Medicine and Molecular Imaging, University of Groningen, University Medical Center Groningen, 9700 RB Groningen (Netherlands)

    2016-08-15

    Purpose: Although positron emission tomography (PET) images have shown potential to improve the accuracy of targeting in radiation therapy planning and assessment of response to treatment, the boundaries of tumors are not easily distinguishable from surrounding normal tissue owing to the low spatial resolution and inherent noisy characteristics of PET images. The objective of this study is to develop a generic and robust method for automatic delineation of tumor volumes using an active contour model and to evaluate its performance using phantom and clinical studies. Methods: MASAC, a method for automatic segmentation using an active contour model, incorporates histogram fuzzy C-means clustering and localized and textural information to constrain the active contour to detect boundaries in an accurate and robust manner. Moreover, the lattice Boltzmann method is used as an alternative approach for solving the level set equation to make it faster and suitable for parallel programming. Twenty simulated phantom studies and 16 clinical studies, including six cases of pharyngolaryngeal squamous cell carcinoma and ten cases of nonsmall cell lung cancer, were included to evaluate its performance. In addition, the proposed method was compared with the contourlet-based active contour algorithm (CAC) and Schaefer's thresholding method (ST). The relative volume error (RE), Dice similarity coefficient (DSC), and classification error (CE) metrics were used to analyze the results quantitatively. Results: For the simulated phantom studies (PSs), MASAC and CAC provide similar segmentations of the different lesions, while ST fails to achieve reliable results. For the clinical datasets (CSs; 2 cases with connected high-uptake regions excluded), CAC provides the lowest mean RE (−8.38% ± 27.49%), while MASAC achieves the best mean DSC (0.71 ± 0.09) and mean CE (53.92% ± 12.65%). MASAC could reliably quantify different types of lesions assessed in this work

  7. [Headache: classification and diagnosis].

    Science.gov (United States)

    Carbaat, P A T; Couturier, E G M

    2016-11-01

    There are many types of headache and, moreover, many people have different types of headache at the same time. Adequate treatment is possible only on the basis of the correct diagnosis. Technically and in terms of content, the current diagnostic process for headache is based on the 'International Classification of Headache Disorders' (ICHD-3-beta), produced under the auspices of the International Headache Society. This classification is based on a distinction between primary and secondary headaches. The most common primary headache types are tension-type headache, migraine and cluster headache. Application of uniform diagnostic concepts is essential to arrive at the most appropriate treatment of the various types of headache.

  8. Classification of hand eczema

    DEFF Research Database (Denmark)

    Agner, T; Aalto-Korte, K; Andersen, K E

    2015-01-01

    BACKGROUND: Classification of hand eczema (HE) is mandatory in epidemiological and clinical studies, and also important in clinical work. OBJECTIVES: The aim was to test a recently proposed classification system of HE in clinical practice in a prospective multicentre study. METHODS: Patients were recruited from nine different tertiary referral centres. All patients underwent examination by specialists in dermatology and were checked using relevant allergy testing. Patients were classified into one of the six diagnostic subgroups of HE: allergic contact dermatitis, irritant contact dermatitis, atopic ... The classification system investigated in the present study was useful, being able to give an appropriate main diagnosis for 89% of HE patients, and for another 7% when using two main diagnoses. The fact that more than half of the patients had one or more additional diagnoses illustrates that HE is a multifactorial disease.

  9. Sound classification of dwellings

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2012-01-01

    National schemes for sound classification of dwellings exist in more than ten countries in Europe, typically published as national standards. The schemes define quality classes reflecting different levels of acoustical comfort. Main criteria concern airborne and impact sound insulation between dwellings, facade sound insulation and installation noise. The schemes have been developed, implemented and revised gradually since the early 1990s. However, due to lack of coordination between countries, there are significant discrepancies, and new standards and revisions continue to increase the diversity. ... is needed, and a European COST Action TU0901 "Integrating and Harmonizing Sound Insulation Aspects in Sustainable Urban Housing Constructions" has been established and runs 2009-2013, one of the main objectives being to prepare a proposal for a European sound classification scheme with a number of quality...

  10. Definitions, Criteria and Global Classification of Mast Cell Disorders with Special Reference to Mast Cell Activation Syndromes: A Consensus Proposal

    Science.gov (United States)

    Valent, Peter; Akin, Cem; Arock, Michel; Brockow, Knut; Butterfield, Joseph H.; Carter, Melody C.; Castells, Mariana; Escribano, Luis; Hartmann, Karin; Lieberman, Philip; Nedoszytko, Boguslaw; Orfao, Alberto; Schwartz, Lawrence B.; Sotlar, Karl; Sperr, Wolfgang R.; Triggiani, Massimo; Valenta, Rudolf; Horny, Hans-Peter; Metcalfe, Dean D.

    2012-01-01

    Activation of tissue mast cells (MCs) and their abnormal growth and accumulation in various organs are typically found in primary MC disorders also referred to as mastocytosis. However, increasing numbers of patients are now being informed that their clinical findings are due to MC activation (MCA) that is neither associated with mastocytosis nor with a defined allergic or inflammatory reaction. In other patients with MCA, MCs appear to be clonal cells, but criteria for diagnosing mastocytosis are not met. A working conference was organized in 2010 with the aim to define criteria for diagnosing MCA and related disorders, and to propose a global unifying classification of all MC disorders and pathologic MC reactions. This classification includes three types of ‘MCA syndromes’ (MCASs), namely primary MCAS, secondary MCAS and idiopathic MCAS. MCA is now defined by robust and generally applicable criteria, including (1) typical clinical symptoms, (2) a substantial transient increase in serum total tryptase level or an increase in other MC-derived mediators, such as histamine or prostaglandin D2, or their urinary metabolites, and (3) a response of clinical symptoms to agents that attenuate the production or activities of MC mediators. These criteria should assist in the identification and diagnosis of patients with MCAS, and in avoiding misdiagnoses or overinterpretation of clinical symptoms in daily practice. Moreover, the MCAS concept should stimulate research in order to identify and exploit new molecular mechanisms and therapeutic targets. PMID:22041891

  11. Granular loess classification based

    International Nuclear Information System (INIS)

    Browzin, B.S.

    1985-01-01

    This paper discusses how loess might be identified by two index properties: the granulometric composition and the dry unit weight. These two indices are necessary but not always sufficient for identification of loess. On the basis of analyses of samples from three continents, it was concluded that the 0.01-0.5-mm fraction deserves the name loessial fraction. Based on the loessial fraction concept, a granulometric classification of loess is proposed. A triangular chart is used to classify loess

  12. Classification and regression trees

    CERN Document Server

    Breiman, Leo; Olshen, Richard A; Stone, Charles J

    1984-01-01

    The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.

  13. CLASSIFICATION OF CRIMINAL GROUPS

    OpenAIRE

    Natalia Romanova

    2013-01-01

    New types of criminal groups are emerging in modern society. These types have their own special criminal subculture. The research objective is to develop new parameters for the classification of modern criminal groups, create a new typology of criminal groups and identify some features of their subculture. The research methodology is based on the system approach, which includes using the method of analysis of documentary sources (materials of a criminal case) and the method of conversations with the members of the...

  14. Decimal Classification Editions

    Directory of Open Access Journals (Sweden)

    Zenovia Niculescu

    2009-01-01

    The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology, reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme from the point of view of improving the informational offer, through basic index terms, revised and developed, as well as valuing the auxiliary notations.

  15. Classifications of track structures

    International Nuclear Information System (INIS)

    Paretzke, H.G.

    1984-01-01

    When ionizing particles interact with matter they produce random topological structures of primary activations which represent the initial boundary conditions for all subsequent physical, chemical and/or biological reactions. There are two important aspects of research on such track structures, namely their experimental or theoretical determination on one hand and the quantitative classification of these complex structures, which is a basic prerequisite for the understanding of mechanisms of radiation actions. This paper deals only with the latter topic, i.e. the problems encountered in, and possible approaches to, quantitative ordering and grouping of these multidimensional objects by their degrees of similarity with respect to their efficiency in producing certain final radiation effects, i.e. to their "radiation quality". Various attempts at taxonometric classification with respect to radiation efficiency have been made in basic and applied radiation research, including macro- and microdosimetric concepts as well as track entities and stopping-power-based theories. In this paper no review of those well-known approaches is given, but rather an outline and discussion of alternative methods new to this field of radiation research which have some very promising features and which could possibly solve at least some major classification problems

  16. Neuromuscular disease classification system

    Science.gov (United States)

    Sáez, Aurora; Acha, Begoña; Montero-Sánchez, Adoración; Rivas, Eloy; Escudero, Luis M.; Serrano, Carmen

    2013-06-01

    Diagnosis of neuromuscular diseases is based on subjective visual assessment of biopsies from patients by the pathologist specialist. A system for objective analysis and classification of muscular dystrophies and neurogenic atrophies through muscle biopsy images of fluorescence microscopy is presented. The procedure starts with an accurate segmentation of the muscle fibers using mathematical morphology and a watershed transform. A feature extraction step is carried out in two parts: 24 features that pathologists take into account to diagnose the diseases and 58 structural features that the human eye cannot see, based on the assumption that the biopsy is considered as a graph, where the nodes are represented by each fiber, and two nodes are connected if two fibers are adjacent. A feature selection using sequential forward selection and sequential backward selection methods, a classification using a Fuzzy ARTMAP neural network, and a study of grading the severity are performed on these two sets of features. A database consisting of 91 images was used: 71 images for the training step and 20 as the test. A classification error of 0% was obtained. It is concluded that the addition of features undetectable by the human visual inspection improves the categorization of atrophic patterns.

  17. An automated cirrus classification

    Science.gov (United States)

    Gryspeerdt, Edward; Quaas, Johannes; Goren, Tom; Klocke, Daniel; Brueck, Matthias

    2018-05-01

    Cirrus clouds play an important role in determining the radiation budget of the earth, but many of their properties remain uncertain, particularly their response to aerosol variations and to warming. Part of the reason for this uncertainty is the dependence of cirrus cloud properties on the cloud formation mechanism, which itself is strongly dependent on the local meteorological conditions. In this work, a classification system (Identification and Classification of Cirrus or IC-CIR) is introduced to identify cirrus clouds by the cloud formation mechanism. Using reanalysis and satellite data, cirrus clouds are separated into four main types: orographic, frontal, convective and synoptic. Through a comparison to convection-permitting model simulations and back-trajectory-based analysis, it is shown that these observation-based regimes can provide extra information on the cloud-scale updraughts and the frequency of occurrence of liquid-origin ice, with the convective regime having higher updraughts and a greater occurrence of liquid-origin ice compared to the synoptic regimes. Despite having different cloud formation mechanisms, the radiative properties of the regimes are not distinct, indicating that retrieved cloud properties alone are insufficient to completely describe them. This classification is designed to be easily implemented in GCMs, helping improve future model-observation comparisons and leading to improved parametrisations of cirrus cloud processes.

  19. Implicitly Weighted Methods in Robust Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 44, č. 3 (2012), s. 449-462 ISSN 0924-9907 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robustness * high breakdown point * outlier detection * robust correlation analysis * template matching * face recognition Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.767, year: 2012

  20. What is it to be sturdy (robust)?

    DEFF Research Database (Denmark)

    Nielsen, Niss Skov; Zwisler, Lars Pagter; Bojsen, Ann Kristina Mikkelsen

    Purpose: This paper intends to give a first insight into the concept of being "sturdy/robust"; To develop and test a Danish model of how to measure sturdiness/robustness; To test the scale's ability to identify people in emergency situations who have high risk of developing psychological illness....

  1. Structural Robustness Evaluation of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Giuliani, Luisa; Bontempi, Franco

    2010-01-01

    in the framework of a safe design: it depends on different factors, like exposure, vulnerability and robustness. In particular, the requirements of structural vulnerability and robustness are discussed in this paper and a numerical application is presented, in order to evaluate the effects of a ship collision...

  2. In Silico Design of Robust Bolalipid Membranes

    NARCIS (Netherlands)

    Bulacu, Monica; Periole, Xavier; Marrink, Siewert J.

    The robustness of microorganisms used in industrial fermentations is essential for the efficiency and yield of the production process. A viable tool to increase the robustness is through engineering of the cell membrane and especially by incorporating lipids from species that survive under harsh

  3. Assessment of Process Robustness for Mass Customization

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev

    2013-01-01

    robustness and their capability to develop it. Through a literature study and an analysis of robust process design characteristics, a number of metrics are described which can be used for assessment. The metrics are evaluated and analyzed to be applied as KPIs to help MC companies prioritize efforts in business...

  4. Applying Robust Design in an Industrial Context

    DEFF Research Database (Denmark)

    Christensen, Martin Ebro

    mechanical architectures. Furthermore a set of 15 robust design principles for reducing the variation in functional performance is compiled in a format directly supporting the work of the design engineer. With these foundational methods in place, the existing tools, methods and KPIs of Robust Design...

  5. The importance of robust design methodology

    DEFF Research Database (Denmark)

    Eifler, Tobias; Howard, Thomas J.

    2018-01-01

    infamous recalls in automotive history, that of the GM ignition switch, from the perspective of Robust Design. It is investigated if available Robust Design methods such as sensitivity analysis, tolerance stack-ups, design clarity, etc. would have been suitable to account for the performance variation...

  6. Robust Control Charts for Time Series Data

    NARCIS (Netherlands)

    Croux, C.; Gelper, S.; Mahieu, K.

    2010-01-01

    This article presents a control chart for time series data, based on the one-step-ahead forecast errors of the Holt-Winters forecasting method. We use robust techniques to prevent outliers from affecting the estimation of the control limits of the chart. Moreover, robustness is important to maintain
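
    A minimal sketch of a chart of this kind, assuming statsmodels for the Holt-Winters fit; the record does not specify the authors' robust estimators, so median/MAD control limits are used here as one common robust choice:

        import numpy as np
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        def robust_hw_chart(y, seasonal_periods=12, k=3.0):
            """One-step-ahead Holt-Winters errors with median/MAD control limits."""
            fit = ExponentialSmoothing(y, trend="add", seasonal="add",
                                       seasonal_periods=seasonal_periods).fit()
            errors = y - fit.fittedvalues        # one-step-ahead forecast errors
            center = np.median(errors)           # robust to outlying errors
            scale = 1.4826 * np.median(np.abs(errors - center))  # MAD scaled to sigma
            out_of_control = np.abs(errors - center) > k * scale
            return errors, (center - k * scale, center + k * scale), out_of_control

    Because the center and scale come from the median and MAD rather than the mean and standard deviation, a few large outlying errors widen the limits far less than they would on a classical Shewhart-style chart.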

  7. Efficient reanalysis techniques for robust topology optimization

    DEFF Research Database (Denmark)

    Amir, Oded; Sigmund, Ole; Lazarov, Boyan Stefanov

    2012-01-01

    efficient robust topology optimization procedures based on reanalysis techniques. The approach is demonstrated on two compliant mechanism design problems where robust design is achieved by employing either a worst case formulation or a stochastic formulation. It is shown that the time spent on finite...

  8. Extending the Scope of Robust Quadratic Optimization

    NARCIS (Netherlands)

    Marandi, Ahmadreza; Ben-Tal, A.; den Hertog, Dick; Melenberg, Bertrand

    In this paper, we derive tractable reformulations of the robust counterparts of convex quadratic and conic quadratic constraints with concave uncertainties for a broad range of uncertainty sets. For quadratic constraints with convex uncertainty, it is well-known that the robust counterpart is, in

  9. Security and robustness for collaborative monitors

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2016-01-01

    Decentralized monitors can be subject to robustness and security risks. Robustness risks include attacks on the monitor’s infrastructure in order to disable parts of its functionality. Security risks include attacks that try to extract information from the monitor and thereby possibly leak sensitive

  10. EULAR points to consider in the development of classification and diagnostic criteria in systemic vasculitis

    DEFF Research Database (Denmark)

    Basu, Neil; Watts, Richard; Bajema, Ingeborg

    2010-01-01

    The systemic vasculitides are multiorgan diseases where early diagnosis and treatment can significantly improve outcomes. Robust nomenclature reduces diagnostic delay. However, key aspects of current nomenclature are widely perceived to be out of date; these include disease definitions, classification and diagnostic criteria. Therefore, the aim of the present work was to identify deficiencies and provide contemporary points to consider for the development of future definitions and criteria in systemic vasculitis....

  11. Tissue irradiator

    International Nuclear Information System (INIS)

    Hungate, F.P.; Riemath, W.F.; Bunnell, L.R.

    1975-01-01

    A tissue irradiator is provided for the in-vivo irradiation of body tissue. The irradiator comprises a radiation source material contained and completely encapsulated within vitreous carbon. An embodiment for use as an in-vivo blood irradiator comprises a cylindrical body having an axial bore therethrough. A radioisotope is contained within a first portion of vitreous carbon cylindrically surrounding the axial bore, and a containment portion of vitreous carbon surrounds the radioisotope-containing portion, the two portions of vitreous carbon being integrally formed as a single unit. Connecting means are provided at each end of the cylindrical body to permit connections to blood-carrying vessels and to provide for passage of blood through the bore. In a preferred embodiment, the radioisotope is thulium-170, which is present in the irradiator in the form of thulium oxide. A method of producing the preferred blood irradiator is also provided, whereby nonradioactive thulium-169 is dispersed within a polyfurfuryl alcohol resin which is carbonized and fired to form the integral vitreous carbon body, and the device is activated by neutron bombardment of the thulium-169 to produce the beta-emitting thulium-170.

  12. Classification of multiple sclerosis lesions using adaptive dictionary learning.

    Science.gov (United States)

    Deshpande, Hrishikesh; Maurel, Pierre; Barillot, Christian

    2015-12-01

    This paper presents a sparse representation and adaptive dictionary learning based method for automated classification of multiple sclerosis (MS) lesions in magnetic resonance (MR) images. Manual delineation of MS lesions is a time-consuming task, requiring neuroradiology experts to analyze huge volumes of MR data. This, together with the high intra- and inter-observer variability, necessitates automated MS lesion classification methods. Among the many image representation models and classification methods that can be used for this purpose, we investigate the use of sparse modeling. In recent years, sparse representation has evolved as a tool for modeling data using a few basis elements of an over-complete dictionary and has found applications in many image processing tasks, including classification. We propose a supervised classification approach that learns dictionaries specific to the lesions and to the individual healthy brain tissues: white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF). The size of the dictionary learned for each class plays a major role in data representation, but it is an even more crucial element in the case of competitive classification. Our approach adapts the size of the dictionary for each class, depending on the complexity of the underlying data. The algorithm is validated using 52 multi-sequence MR images acquired from 13 MS patients. The results demonstrate the effectiveness of our approach in MS lesion classification. Copyright © 2015 Elsevier Ltd. All rights reserved.
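
    A minimal sketch of class-specific dictionary classification by reconstruction error, assuming patches are flattened to vectors; scikit-learn's DictionaryLearning is used here, and the paper's adaptive per-class size selection is replaced by fixed, hypothetical sizes passed in by the caller:

        import numpy as np
        from sklearn.decomposition import DictionaryLearning

        def train_class_dictionaries(patches_by_class, sizes):
            """Learn one dictionary per class (e.g. WM, GM, CSF, lesion)."""
            dicts = {}
            for label, patches in patches_by_class.items():
                dicts[label] = DictionaryLearning(
                    n_components=sizes[label],       # class-specific size (fixed here, adapted in the paper)
                    transform_algorithm="omp",
                    transform_n_nonzero_coefs=5,
                ).fit(patches)
            return dicts

        def classify_patch(patch, dicts):
            """Assign the class whose dictionary reconstructs the patch best."""
            errs = {}
            for label, d in dicts.items():
                code = d.transform(patch.reshape(1, -1))   # sparse code over the dictionary
                recon = code @ d.components_
                errs[label] = np.linalg.norm(patch - recon.ravel())
            return min(errs, key=errs.get)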

  13. Molecular Classification of Melanoma

    Science.gov (United States)

    Tissue-based analyses of precursors, melanoma tumors and metastases within existing study populations to further understanding of the heterogeneity of melanoma and determine a predictive pattern of progression for dysplastic nevi.

  14. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    Science.gov (United States)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
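
    A minimal sketch of the block-wise threshold map with interpolation described above. The edge-guided, saliency-weighted histogram threshold of the record is replaced here by a per-block Otsu threshold as a simple stand-in, so the function illustrates only the block/interpolation machinery:

        import numpy as np
        from scipy.ndimage import zoom
        from skimage.filters import threshold_otsu

        def blockwise_threshold(image, block=64):
            """Per-block thresholds (Otsu stand-in), interpolated to a per-pixel map."""
            h, w = image.shape
            nby, nbx = (h + block - 1) // block, (w + block - 1) // block
            t = np.empty((nby, nbx))
            global_t = threshold_otsu(image)
            for i in range(nby):
                for j in range(nbx):
                    blk = image[i*block:(i+1)*block, j*block:(j+1)*block]
                    # fall back to the global threshold in flat blocks
                    t[i, j] = threshold_otsu(blk) if np.ptp(blk) > 0 else global_t
            # bilinear interpolation of block thresholds to full resolution
            tmap = zoom(t, (h / nby, w / nbx), order=1)
            return image > tmap              # foreground mask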

  15. How Robust is Your System Resilience?

    Science.gov (United States)

    Homayounfar, M.; Muneepeerakul, R.

    2017-12-01

    Robustness and resilience are concepts in systems thinking that have grown in importance and popularity. For many complex social-ecological systems, however, robustness and resilience are difficult to quantify, and the connections and trade-offs between them are difficult to study. Most studies have either focused on qualitative approaches to discuss their connections or considered only one of them under particular classes of disturbances. In this study, we present an analytical framework to address the linkage between robustness and resilience more systematically. Our analysis is based on a stylized dynamical model that operationalizes a widely used conceptual framework for social-ecological systems. The model enables us to rigorously define robustness and resilience and consequently investigate their connections. The results reveal the tradeoffs among performance, robustness, and resilience. They also show how the nature of such tradeoffs varies with the choices of certain policies (e.g., taxation and investment in public infrastructure), internal stresses and external disturbances.

  16. A Survey on Robustness in Railway Planning

    DEFF Research Database (Denmark)

    Lusby, Richard Martin; Larsen, Jesper; Bull, Simon Henry

    2018-01-01

    Planning problems in passenger railway range from long term strategic decision making to the detailed planning of operations. Operations research methods have played an increasing role in this planning process. However, recently more attention has been given to considerations of robustness...... in the quality of solutions to individual planning problems, and of operations in general. Robustness in general is the capacity for some system to absorb or resist changes. In the context of railway robustness it is often taken to be the capacity for operations to continue at some level when faced...... with a disruption such as delay or failure. This has resulted in more attention given to the inclusion of robustness measures and objectives in individual planning problems, and to the providing of tools to ensure operations continue under disrupted situations. In this paper we survey the literature on robustness...

  17. International Conference on Robust Statistics 2015

    CERN Document Server

    Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta

    2016-01-01

    This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015) held in Kolkata during 12–16 January, 2015. The book explores the applicability of robust methods in non-traditional areas, including the use of new techniques such as skew and mixture-of-skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas include circular data models and the prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of the statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...

  18. A vocabulary for the identification and delineation of teratoma tissue components in hematoxylin and eosin-stained samples

    Directory of Open Access Journals (Sweden)

    Ramamurthy Bhagavatula

    2014-01-01

    Full Text Available We propose a methodology for the design of features mimicking the visual cues used by pathologists when identifying tissues in hematoxylin and eosin (H&E)-stained samples. Background: H&E staining is the gold standard in clinical histology; it is cheap and universally used, producing a vast number of histopathological samples. While pathologists accurately and consistently identify tissues and their pathologies, it is a time-consuming and expensive task, establishing the need for automated algorithms for improved throughput and robustness. Methods: We use an iterative feedback process to design a histopathology vocabulary (HV), a concise set of features that mimic the visual cues used by pathologists, e.g. "cytoplasm color" or "nucleus density." These features are based in histology and understood by both pathologists and engineers. We compare our HV to several generic texture-feature sets in a pixel-level classification algorithm. Results: Results on delineating and identifying tissues in teratoma tumor samples validate our expert knowledge-based approach. Conclusions: The HV can be an effective tool for identifying and delineating teratoma components from images of H&E-stained tissue samples.

  19. Three-class classification in computer-aided diagnosis of breast cancer by support vector machine

    Science.gov (United States)

    Sun, Xuejun; Qian, Wei; Song, Dansheng

    2004-05-01

    The design of the classifier in a computer-aided diagnosis (CAD) scheme for breast cancer plays an important role in its overall sensitivity and specificity. Classification of a detected object as malignant lesion, benign lesion, or normal tissue on a mammogram is a typical three-class pattern recognition problem. This paper presents a three-class classification approach using a two-stage classifier combined with support vector machine (SVM) learning for classification of breast cancer on mammograms. The first classification stage separates abnormal areas from normal breast tissue, and the second stage classifies the detected abnormal objects as malignant or benign. A series of spatial, morphology and texture features have been extracted from detected object areas. Using a genetic algorithm (GA), different feature groups for each classification stage have been investigated. Computerized free-response receiver operating characteristic (FROC) and receiver operating characteristic (ROC) analyses have been employed at the different classification stages. Results show an obvious performance improvement in both sensitivity and specificity with the proposed classification approach compared with conventional two-class approaches, indicating its effectiveness in the classification of breast cancer on mammograms.
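
    A minimal sketch of the two-stage cascade described in this record, assuming feature vectors are already extracted; the GA feature selection is omitted, and the per-stage feature subsets are simply passed in separately, so the class below shows only the cascade logic:

        import numpy as np
        from sklearn.svm import SVC

        class TwoStageClassifier:
            """Stage 1: abnormal vs normal tissue; stage 2: malignant vs benign."""
            def __init__(self):
                self.stage1 = SVC(kernel="rbf")   # detected object vs normal tissue
                self.stage2 = SVC(kernel="rbf")   # malignant vs benign, abnormal cases only

            def fit(self, X1, y_abnormal, X2, y_malignant):
                # X1/X2 may hold different (e.g. GA-selected) feature subsets per stage
                self.stage1.fit(X1, y_abnormal)
                self.stage2.fit(X2, y_malignant)
                return self

            def predict(self, x1, x2):
                if self.stage1.predict(x1.reshape(1, -1))[0] == 0:
                    return "normal"
                is_malignant = self.stage2.predict(x2.reshape(1, -1))[0] == 1
                return "malignant" if is_malignant else "benign"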

  20. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-09-07

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced as much as possible by knowing its classification response. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
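
    A minimal sketch of an MI-regularized linear classifier in the spirit of this record, not the authors' algorithm: the mutual information term is estimated from soft (sigmoid) responses via the empirical joint distribution, and scipy's general-purpose minimizer replaces the paper's iterative gradient descent; the weights lam and gamma are arbitrary:

        import numpy as np
        from scipy.optimize import minimize

        def mi_soft(p, y, eps=1e-9):
            """MI between a soft binary response p = P(response=1|x) and label y."""
            joint = np.empty((2, 2))
            for c in (0, 1):
                joint[1, c] = p[y == c].sum() / len(y)          # P(response=1, y=c)
                joint[0, c] = (1.0 - p[y == c]).sum() / len(y)  # P(response=0, y=c)
            outer = joint.sum(axis=1, keepdims=True) @ joint.sum(axis=0, keepdims=True)
            return float((joint * np.log((joint + eps) / (outer + eps))).sum())

        def objective(w, X, y, lam=0.1, gamma=0.5):
            p = 1.0 / (1.0 + np.exp(-(X @ w)))                  # linear classifier response
            logloss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
            return logloss + lam * (w @ w) - gamma * mi_soft(p, y)   # minus: maximize MI

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 5))
        y = (X @ rng.normal(size=5) + 0.3 * rng.normal(size=200) > 0).astype(float)
        w = minimize(objective, np.zeros(5), args=(X, y), method="BFGS").x
        print("train accuracy:", np.mean((X @ w > 0) == (y == 1)))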

  1. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yi; Zhao, Shiguang; Gao, Xin

    2014-01-01

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced as much as possible by knowing its classification response. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.

  2. Assessment And Testing of Industrial Devices Robustness Against Cyber Security Attacks

    CERN Document Server

    Tilaro, F

    2011-01-01

    CERN (European Organization for Nuclear Research), like any organization, needs to achieve the conflicting objectives of connecting its operational network to the Internet while at the same time keeping its industrial control systems secure from external and internal cyber attacks. With this in mind, the ISA-99 international cyber security standard has been adopted at CERN as a reference model to define a set of guidelines and security robustness criteria applicable to any network device. Device robustness represents a key link in the defense-in-depth concept, as some attacks will inevitably penetrate security boundaries and thus require further protection measures. When assessing the cyber security robustness of devices we have singled out control system-relevant attack patterns derived from the well-known CAPEC classification. Once a vulnerability is identified, it needs to be documented, prioritized and reproduced at will in a dedicated test environment for debugging purposes. CERN - in collaboration ...

  3. Robust lane detection and tracking using multiple visual cues under stochastic lane shape conditions

    Science.gov (United States)

    Huang, Zhi; Fan, Baozheng; Song, Xiaolin

    2018-03-01

    As one of the essential components of environment perception for an intelligent vehicle, lane detection is confronted with challenges including robustness against complicated disturbance and illumination, as well as adaptability to stochastic lane shapes. To overcome these issues, we proposed a robust lane detection method that applies a classification-generation-growth-based (CGG) operator to the detected lines, whereby the linear lane markings are identified by synergizing multiple visual cues with a priori knowledge and spatial-temporal information. According to the quality of the linear lane fitting, the linear and linear-parabolic models are dynamically switched to describe the actual lane. A Kalman filter with adaptive noise covariance and region-of-interest (ROI) tracking are applied to improve robustness and efficiency. Experiments were conducted on images covering various challenging scenarios. The experimental results demonstrate the effectiveness of the presented method under complicated disturbance, illumination, and stochastic lane shapes.

  4. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    International Nuclear Information System (INIS)

    McGowan, S E; Albertini, F; Lomax, A J; Thomas, S J

    2015-01-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The robustness of 16 skull base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all the plans and by defining metrics used to build protocols aiding the plan assessment. Additionally, an example of how to clinically use the defined robustness database is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve plan robustness was analysed. Using the ebDD it was found that range errors had a smaller effect on the dose distribution than the corresponding set-up error in a single fraction, and that organs at risk were most robust to the range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in terms of plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed for the identification of a specific patient who may have benefited from a treatment of greater individuality. A new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. The process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to determine plans that, although delivering a dosimetrically adequate dose distribution, have sub-optimal robustness to these uncertainties. For such cases the use of different beam start conditions may improve plan robustness to set-up and range uncertainties. (paper)
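
    The record does not give the exact construction of the error-bar dose distribution; a minimal numpy surrogate capturing the idea (per-voxel dose spread over the evaluated error scenarios, summarized per structure) might look like the following, where the function names and the max-minus-min width definition are assumptions for illustration:

        import numpy as np

        def error_bar_dose(dose_scenarios):
            """Per-voxel dose error bars over the evaluated error scenarios.

            dose_scenarios: array (n_scenarios, ...) holding the nominal dose and
            the doses recomputed under each range/set-up error."""
            low = dose_scenarios.min(axis=0)
            high = dose_scenarios.max(axis=0)
            return low, high, high - low     # the width maps plan robustness

        def volume_above(width, structure_mask, tolerance):
            """Fraction of a structure whose error-bar width exceeds a tolerance,
            the kind of metric a robustness protocol could set limits on."""
            return float((width[structure_mask] > tolerance).mean())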

  5. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    Science.gov (United States)

    McGowan, S. E.; Albertini, F.; Thomas, S. J.; Lomax, A. J.

    2015-04-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The robustness of 16 skull base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all the plans and by defining metrics used to build protocols aiding the plan assessment. Additionally, an example of how to clinically use the defined robustness database is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve plan robustness was analysed. Using the ebDD it was found that range errors had a smaller effect on the dose distribution than the corresponding set-up error in a single fraction, and that organs at risk were most robust to the range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in terms of plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed for the identification of a specific patient who may have benefited from a treatment of greater individuality. A new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. The process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to determine plans that, although delivering a dosimetrically adequate dose distribution, have sub-optimal robustness to these uncertainties. For such cases the use of different beam start conditions may improve plan robustness to set-up and range uncertainties.

  6. Robust vehicle detection in aerial images based on salient region selection and superpixel classification

    Science.gov (United States)

    Sahli, Samir; Duval, Pierre-Luc; Sheng, Yunlong; Lavigne, Daniel A.

    2011-05-01

    For detecting vehicles in large-scale aerial images, we first used a non-parametric method recently proposed by Rosin to define the regions of interest, where the vehicles appear with dense edges. The saliency map is a sum of distance transforms (DT) of a set of edge maps, which are obtained by a threshold decomposition of the gradient image with a set of thresholds. A binary mask highlighting the regions of interest is then obtained by moment-preserving thresholding of the normalized saliency map. Secondly, the regions of interest were over-segmented into SLIC superpixels, recently proposed by Achanta et al., to cluster pixels into color-constant sub-regions. In aerial images of 11.2 cm/pixel resolution, the vehicles in general do not exceed 20 x 40 pixels. We introduced a size constraint to guarantee that no superpixel exceeds the size of a vehicle. The superpixels were then classified as vehicle or non-vehicle by a Support Vector Machine (SVM), using Scale Invariant Feature Transform (SIFT) features and Local Binary Pattern (LBP) texture features. Both features were extracted at two scales with two patch sizes. The small patches capture local structures and the larger patches include neighborhood information. Preliminary results show a significant gain in detection. The vehicles were detected with a dense concentration of vehicle-class superpixels. Even dark-colored cars were successfully detected. A validation process will follow to reduce the presence of isolated false alarms in the background.
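
    A minimal sketch of the superpixel feature stage, assuming scikit-image for SLIC and LBP; the SIFT features, two-scale patches and saliency step of the record are omitted, and the segment count standing in for the size constraint is an arbitrary choice:

        import numpy as np
        from skimage.segmentation import slic
        from skimage.color import rgb2gray
        from skimage.feature import local_binary_pattern

        def superpixel_features(image, n_segments=2000, compactness=10):
            """SLIC over-segmentation with an LBP histogram per superpixel."""
            labels = slic(image, n_segments=n_segments, compactness=compactness,
                          start_label=0)   # many segments keep superpixels below vehicle size
            gray = rgb2gray(image)
            lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")  # 10 uniform codes
            feats = []
            for s in range(labels.max() + 1):
                mask = labels == s
                hist, _ = np.histogram(lbp[mask], bins=10, range=(0, 10), density=True)
                color = image[mask].mean(axis=0)     # mean RGB as a crude colour cue
                feats.append(np.concatenate([hist, color]))
            return labels, np.asarray(feats)

        # feats can then be fed to sklearn.svm.SVC with vehicle/non-vehicle labels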

  7. Robust nuclear lamina-based cell classification of aging and senescent cells.

    Science.gov (United States)

    Righolt, Christiaan H; van 't Hoff, Merel L R; Vermolen, Bart J; Young, Ian T; Raz, Vered

    2011-12-01

    Changes in the shape of the nuclear lamina are exhibited in senescent cells, as well as in cells expressing mutations in lamina genes. To identify cells with defects in the nuclear lamina we developed an imaging method that quantifies the intensity and curvature of the nuclear lamina. We show that this method accurately describes changes in the nuclear lamina. Spatial changes in the nuclear lamina coincide with redistribution of lamin A proteins and local reduction in protein mobility in senescent cells. We suggest that local accumulation of lamin A in the nuclear envelope leads to bending of the structure. A quantitative distinction of nuclear lamina shape in cell populations was found between fresh and senescent cells, and between primary myoblasts from young and old donors. Moreover, with this method cells with mutations in lamina genes were significantly distinct from cells with wild-type genes. We suggest that this method can be applied to identify abnormal cells during aging, in in vitro propagation, and in lamina disorders.

  8. Robust classification of motor imagery EEG signals using statistical time–domain features

    International Nuclear Information System (INIS)

    Khorshidtalab, A; Salami, M J E; Hamedi, M

    2013-01-01

    The tradeoff between computational complexity and speed, in addition to growing demands for real-time BMI (brain–machine interface) systems, exposes the necessity of applying methods with the least possible complexity. Willison amplitude (WAMP) and slope sign change (SSC) are two promising time–domain features, but only if the right threshold value is defined for them. To overcome the drawback of going through trial and error to determine a suitable threshold value, a modified WAMP and a modified SSC are proposed in this paper. In addition, a comprehensive assessment of statistical time–domain features is presented, in which their effectiveness is evaluated with a support vector machine (SVM). To ensure the accuracy of the results obtained by the SVM, the performance of each feature is reassessed with supervised fuzzy C-means. The general assessment shows that every subject had at least one performance near or greater than 80%. The obtained results prove that for BMI applications, in which a few errors can be tolerated, these feature–classifier combinations are suitable. Moreover, features that performed satisfactorily were selected for feature combination. Combinations of the selected features were evaluated with the SVM, and they significantly improved the results, in some cases up to full accuracy. (paper)
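
    The standard (unmodified) WAMP and SSC features mentioned above are simple enough to state in a few lines of Python; the adaptive threshold below is not from the paper, only one plausible data-driven choice:

        import numpy as np

        def wamp(x, threshold):
            """Willison amplitude: count of consecutive-sample differences above threshold."""
            return int(np.sum(np.abs(np.diff(x)) > threshold))

        def ssc(x, threshold):
            """Slope sign changes whose amplitude product exceeds the threshold."""
            return int(np.sum((x[1:-1] - x[:-2]) * (x[1:-1] - x[2:]) > threshold))

        def adaptive_threshold(x, k=0.5):
            """Hypothetical data-driven threshold: a multiple of the signal's MAD."""
            return k * 1.4826 * np.median(np.abs(x - np.median(x)))

    Both features count threshold-exceeding events per analysis window, which is why the threshold choice dominates their usefulness, as the record notes.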

  9. Histomorphometry and cortical robusticity of the adult human femur.

    Science.gov (United States)

    Miszkiewicz, Justyna Jolanta; Mahoney, Patrick

    2018-01-13

    Recent quantitative analyses of human bone microanatomy, as well as theoretical models that propose bone microstructure and gross anatomical associations, have started to reveal insights into biological links that may facilitate remodeling processes. However, relationships between bone size and the underlying cortical bone histology remain largely unexplored. The goal of this study is to determine the extent to which static indicators of bone remodeling and vascularity, measured using histomorphometric techniques, relate to femoral midshaft cortical width and robusticity. Using previously published and new quantitative data from 450 adult human male (n = 233) and female (n = 217) femora, we determine if these aspects of femoral size relate to bone microanatomy. Scaling relationships are explored and interpreted within the context of tissue form and function. Analyses revealed that the area and diameter of Haversian canals and secondary osteons, and densities of secondary osteons and osteocyte lacunae from the sub-periosteal region of the posterior midshaft femur cortex were significantly, but not consistently, associated with femoral size. Cortical width and bone robusticity were correlated with osteocyte lacunae density and scaled with positive allometry. Diameter and area of osteons and Haversian canals decreased as the width of cortex and bone robusticity increased, revealing a negative allometric relationship. These results indicate that microscopic products of cortical bone remodeling and vascularity are linked to femur size. Allometric relationships between more robust human femora with thicker cortical bone and histological products of bone remodeling correspond with principles of bone functional adaptation. Future studies may benefit from exploring scaling relationships between bone histomorphometric data and measurements of bone macrostructure.

  10. Quality assurance: The 10-Group Classification System (Robson classification), induction of labor, and cesarean delivery.

    LENUS (Irish Health Repository)

    Robson, Michael

    2015-10-01

    Quality assurance in labor and delivery is needed. The method must be simple and consistent, and be of universal value. It needs to be clinically relevant, robust, and prospective, and must incorporate epidemiological variables. The 10-Group Classification System (TGCS) is a simple method providing a common starting point for further detailed analysis within which all perinatal events and outcomes can be measured and compared. The system is demonstrated in the present paper using data for 2013 from the National Maternity Hospital in Dublin, Ireland. Interpretation of the classification can be easily taught. The standard table can provide much insight into the philosophy of care in the population of women studied and also provide information on data quality. With standardization of audit of events and outcomes, any differences in either sizes of groups, events or outcomes can be explained only by poor data collection, significant epidemiological variables, or differences in practice. In April 2015, WHO proposed that the TGCS (also known as the Robson classification) is used as a global standard for assessing, monitoring, and comparing cesarean delivery rates within and between healthcare facilities.

  11. Classification of IRAS asteroids

    International Nuclear Information System (INIS)

    Tedesco, E.F.; Matson, D.L.; Veeder, G.J.

    1989-01-01

    Albedos and spectral reflectances are essential for classifying asteroids. For example, classes E, M and P are indistinguishable without albedo data. Colorimetric data are available for about 1000 asteroids but, prior to IRAS, albedo data were available for only about 200. IRAS broke this bottleneck by providing albedo data on nearly 2000 asteroids. Hence, excepting absolute magnitudes, the albedo and size are now the most commonly known asteroid physical parameters. In this chapter the authors present the results of analyses of IRAS-derived asteroid albedos, discuss their application to asteroid classification, and mention several studies which might be done to exploit this data set further.

  12. SPORT FOOD ADDITIVE CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    I. P. Prokopenko

    2015-01-01

    Full Text Available Correctly organized nutritive and pharmacological support is an important component of an athlete's preparation for competition, maintenance of optimal shape, and fast recovery and rehabilitation after trauma and fatigue. Special products of enhanced biological value (BAS) for athletes' nutrition are used for this purpose. They supply the athlete's body with easy-to-use energy sources, building materials, and biologically active substances that regulate and activate metabolic reactions which proceed with difficulty during certain kinds of physical training. The article presents a classification of sport supplements that can be used before warm-up and training, after training, and during breaks in competition.

  13. Radioactive facilities classification criteria

    International Nuclear Information System (INIS)

    Briso C, H.A.; Riesle W, J.

    1992-01-01

    Appropriate classification of radioactive facilities into groups of comparable risk is one of the problems faced by most regulatory bodies. Regarding radiological risk, the main factors to be considered are the radioactive inventory and the processes to which these radionuclides are subjected. Normally, operations are governed by strict safety procedures; thus, the total activity of the radionuclides existing in a given facility is the varying feature that defines its risk. In order to rely on a quantitative criterion, and considering that the Annual Limits on Intake are widely accepted references, an index based on these limits is proposed to support decisions related to radioactive facilities. (author)

  14. Robust recognition via information theoretic learning

    CERN Document Server

    He, Ran; Yuan, Xiaotong; Wang, Liang

    2014-01-01

    This Springer Brief presents a comprehensive review of information theoretic methods for robust recognition. A variety of information theoretic methods have been proffered in the past decade, in a large variety of computer vision applications; this work brings them together and attempts to impart the theory, optimization and usage of information entropy. The authors resort to a new information theoretic concept, correntropy, as a robust measure and apply it to solve robust face recognition and object recognition problems. For computational efficiency, the brief introduces the additive and multip...
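
    Sample correntropy has a compact, standard definition (the mean of a Gaussian kernel applied to the componentwise differences), sketched here with an arbitrary kernel width:

        import numpy as np

        def correntropy(x, y, sigma=1.0):
            """Sample correntropy with a Gaussian kernel: mean_i k_sigma(x_i - y_i)."""
            d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
            return float(np.mean(np.exp(-d**2 / (2.0 * sigma**2))))

    Because the Gaussian kernel saturates for large differences, gross outliers contribute almost nothing, which is the source of the robustness the brief exploits.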

  15. Robust statistics and geochemical data analysis

    International Nuclear Information System (INIS)

    Di, Z.

    1987-01-01

    The advantages of robust procedures over ordinary least-squares procedures in geochemical data analysis are demonstrated using NURE data from the Hot Springs Quadrangle, South Dakota, USA. Robust principal components analysis with 5% multivariate trimming successfully guarded the analysis against perturbations by outliers and increased the number of interpretable factors. Regression with SINE estimates significantly increased the goodness-of-fit of the regression and improved the correspondence of delineated anomalies with known uranium prospects. Because of the ubiquitous existence of outliers in geochemical data, robust statistical procedures are suggested as routine replacements for ordinary least-squares procedures.

  16. Design Robust Controller for Rotary Kiln

    Directory of Open Access Journals (Sweden)

    Omar D. Hernández-Arboleda

    2013-11-01

    Full Text Available This paper presents the design of a robust controller for a rotary kiln. The designed controller is a combination of a fractional PID and a linear quadratic regulator (LQR), a combination not previously used to control the kiln. In addition, robustness criteria (gain margin, phase margin, strength gain, rejection of high-frequency noise, and sensitivity) are evaluated for the entire controller-plant model, obtaining good results over a frequency range of 0.020 to 90 rad/s, which contributes to the robustness of the system.

  17. Towards distortion-free robust image authentication

    International Nuclear Information System (INIS)

    Coltuc, D

    2007-01-01

    This paper investigates a general framework for distortion-free robust image authentication by multiple marking. First, a subsampled version of the image edges is embedded by robust watermarking. Then, the information needed to recover the original image is inserted by reversible watermarking. The hiding capacity of the reversible watermarking is the essential requirement for this approach. Thus, in the absence of attacks, not only is the image authenticated but the original is also exactly recovered. In case of attacks, reversibility is lost, but the image can still be authenticated. Preliminary results providing very good robustness against JPEG compression are presented

  18. An Overview of the Adaptive Robust DFT

    Directory of Open Access Journals (Sweden)

    Djurović Igor

    2010-01-01

    Full Text Available This paper overviews the basic principles and applications of the robust DFT (RDFT) approach, which is used for robust processing of frequency-modulated (FM) signals embedded in non-Gaussian heavy-tailed noise. In particular, we concentrate on the spectral analysis and filtering of signals corrupted by impulsive distortions using adaptive and nonadaptive robust estimators. Several adaptive estimators of the location parameter are considered, and it is shown that their application is preferable to that of their nonadaptive counterparts. This fact is demonstrated by an efficiency comparison of adaptive and nonadaptive RDFT methods for different noise environments.
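
    A minimal sketch of one common nonadaptive form of the robust DFT, in which the sample medians of the demodulated real and imaginary components replace the means implicit in the standard DFT sum; scaling by N and the choice of the median as location estimator are assumptions of this sketch:

        import numpy as np

        def robust_dft(x):
            """Median-based robust DFT of a real sequence x."""
            x = np.asarray(x, dtype=float)
            N = len(x)
            n = np.arange(N)
            X = np.empty(N, dtype=complex)
            for k in range(N):
                z = x * np.exp(-2j * np.pi * k * n / N)      # demodulated samples
                # standard DFT coefficient would be z.sum() = N * z.mean();
                # the median suppresses impulsive samples instead
                X[k] = N * (np.median(z.real) + 1j * np.median(z.imag))
            return X

    A single impulsive sample corrupts every coefficient of the ordinary DFT, whereas the per-coefficient medians leave the spectrum nearly unchanged.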

  19. A robust interpretation of duration calculus

    DEFF Research Database (Denmark)

    Franzle, M.; Hansen, Michael Reichhardt

    2005-01-01

    We transfer the concept of robust interpretation from arithmetic first-order theories to metric-time temporal logics. The idea is that the interpretation of a formula is robust iff its truth value does not change under small variation of the constants in the formula. Exemplifying this on Duration...... Calculus (DC), our findings are that the robust interpretation of DC is equivalent to a multi-valued interpretation that uses the real numbers as semantic domain and assigns Lipschitz-continuous interpretations to all operators of DC. Furthermore, this continuity permits approximation between discrete...

  20. REINA at CLEF 2007 Robust Task

    OpenAIRE

    Zazo Rodríguez, Ángel Francisco; Figuerola, Carlos G.; Alonso Berrocal, José Luis

    2007-01-01

    This paper describes our work at CLEF 2007 Robust Task. We have participated in the monolingual (English, French and Portuguese) and the bilingual (English to French) subtask. At CLEF 2006 our research group obtained very good results applying local query expansion using windows of terms in the robust task. This year we have used the same expansion technique, but taking into account some criteria of robustness: MAP, GMAP, MMR, GS@10, P@10, number of failed topics, number of topics below 0.1 ...

  1. REINA at CLEF 2007 Robust Track (2007)

    OpenAIRE

    Zazo, Ángel F.; G.-Figuerola, Carlos; Alonso-Berrocal, José-Luis

    2007-01-01

    This paper describes our work at CLEF 2007 Robust Task. We have participated in the monolingual (English, French and Portuguese) and the bilingual (English to French) subtask. At CLEF 2006 our research group obtained very good results applying local query expansion using windows of terms in the robust task. This year we have used the same expansion technique, but taking into account some criteria of robustness: MAP, GMAP, MMR, GS@10, P@10, number of failed topics, number of topics below 0.1 ...

  2. Danish Requirements for Robustness of Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Christensen, H. H.

    2006-01-01

    This paper describes the background of the revised robustness requirements implemented in the Danish Code of Practice for Safety of Structures in 2003 [1, 2, 3]. According to the Danish design rules, robustness shall be documented for all structures where the consequences of failure are serious. This paper...... describes the background of the design procedure in the Danish codes, which shall be followed in order to document sufficient robustness in the following steps: Step 1: review of loads and possible failure modes/scenarios and determination of acceptable collapse extent. Step 2: review of the structural...

  3. Robustness-related issues in speaker recognition

    CERN Document Server

    Zheng, Thomas Fang

    2017-01-01

    This book presents an overview of speaker recognition technologies with an emphasis on dealing with robustness issues. Firstly, the book gives an overview of speaker recognition, such as the basic system framework, categories under different criteria, performance evaluation and its development history. Secondly, with regard to robustness issues, the book presents three categories, including environment-related issues, speaker-related issues and application-oriented issues. For each category, the book describes the current hot topics, existing technologies, and potential research focuses in the future. The book is a useful reference book and self-learning guide for early researchers working in the field of robust speech recognition.

  4. Robustness and structure of complex networks

    Science.gov (United States)

    Shao, Shuai

    This dissertation covers the two major parts of my PhD research on statistical physics and complex networks: i) modeling a new type of attack -- localized attack, and investigating robustness of complex networks under this type of attack; ii) discovering the clustering structure in complex networks and its influence on the robustness of coupled networks. Complex networks appear in every aspect of our daily life and are widely studied in Physics, Mathematics, Biology, and Computer Science. One important property of complex networks is their robustness under attacks, which depends crucially on the nature of attacks and the structure of the networks themselves. Previous studies have focused on two types of attack: random attack and targeted attack, which, however, are insufficient to describe many real-world damages. Here we propose a new type of attack -- localized attack, and study the robustness of complex networks under this type of attack, both analytically and via simulation. On the other hand, we also study the clustering structure in the network, and its influence on the robustness of a complex network system. In the first part, we propose a theoretical framework to study the robustness of complex networks under localized attack based on percolation theory and generating function method. We investigate the percolation properties, including the critical threshold of the phase transition p_c and the size of the giant component P_∞. We compare localized attack with random attack and find that while random regular (RR) networks are more robust against localized attack, Erdős-Rényi (ER) networks are equally robust under both types of attacks. As for scale-free (SF) networks, their robustness depends crucially on the degree exponent λ. The simulation results show perfect agreement with theoretical predictions. We also test our model on two real-world networks: a peer-to-peer computer network and an airline network, and find that the real-world networks
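
    A minimal simulation sketch of a localized attack of the kind described above, assuming networkx: a "ball" of the nodes closest (by hop distance) to a randomly chosen seed is removed, and the relative size of the surviving giant component plays the role of P_∞; the network size, mean degree and removed fraction are arbitrary:

        import networkx as nx

        def localized_attack(G, fraction, seed=None):
            """Remove the given fraction of nodes closest to a seed node."""
            if seed is None:
                seed = next(iter(G.nodes))
            order = sorted(nx.single_source_shortest_path_length(G, seed).items(),
                           key=lambda kv: kv[1])          # nodes by hop distance
            to_remove = [v for v, _ in order[:int(fraction * G.number_of_nodes())]]
            H = G.copy()
            H.remove_nodes_from(to_remove)
            return H

        G = nx.erdos_renyi_graph(10000, 0.0005, seed=1)   # ER network, <k> = 5
        H = localized_attack(G, fraction=0.4)
        giant = max(nx.connected_components(H), key=len) if H.number_of_nodes() else set()
        print("relative giant component:", len(giant) / G.number_of_nodes())

    Sweeping the removed fraction and recording the giant-component size traces out the percolation transition that the generating-function analysis predicts.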

  5. Robust Structured Control Design via LMI Optimization

    DEFF Research Database (Denmark)

    Adegas, Fabiano Daher; Stoustrup, Jakob

    2011-01-01

    This paper presents a new procedure for discrete-time robust structured control design. Parameter-dependent nonconvex conditions for stabilizable and induced L2-norm performance controllers are solved by an iterative linear matrix inequality (LMI) optimization. A wide class of controller structures, including decentralized of any order, fixed-order dynamic output feedback, and static output feedback, can be designed robust to polytopic uncertainties. Stability is proven by a parameter-dependent Lyapunov function. Numerical examples on robust stability margins show that the proposed procedure can...

  6. AUTOMATIC APPROACH TO VHR SATELLITE IMAGE CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    P. Kupidura

    2016-06-01

    preliminary step of recalculation of pixel DNs to reflectance is required. Thanks to this, the proposed approach is in theory universal, and might be applied to images from different satellite systems with different acquisition dates. The test data consist of 3 Pleiades images captured on different dates. The research allowed optimal index values to be determined. Using the same parameters, we obtained a very good accuracy of extraction of 5 land cover/use classes: water, low vegetation, bare soil, wooded area and built-up area in all the test images (kappa from 87% to 96%). Importantly, even significant changes in parameter values did not cause a significant decline in classification accuracy, which demonstrates how robust the proposed method is.

  7. Atmospheric circulation classification comparison based on wildfires in Portugal

    Science.gov (United States)

    Pereira, M. G.; Trigo, R. M.

    2009-04-01

    Atmospheric circulation classifications are not a simple description of atmospheric states but a tool to understand and interpret atmospheric processes and to model the relation between atmospheric circulation and surface climate and other related variables (Huth et al., 2008). Classifications were initially developed for weather forecasting purposes; however, with the progress in computer processing capability, new and more robust objective methods were developed and applied to large datasets, making atmospheric circulation classification one of the most important fields in synoptic and statistical climatology. Classification studies have been extensively used in climate change studies (e.g. reconstructed past climates, recent observed changes and future climates), in bioclimatological research (e.g. relating human mortality to climatic factors) and in a wide variety of synoptic climatological applications (e.g. comparison between datasets, air pollution, snow avalanches, wine quality, fish captures and forest fires). Likewise, atmospheric circulation classifications are important for the study of the role of weather in wildfire occurrence in Portugal, because daily synoptic variability is the most important driver of local weather conditions (Pereira et al., 2005). In particular, the objective classification scheme developed by Trigo and DaCamara (2000) to classify the atmospheric circulation affecting Portugal has proved quite useful in discriminating the occurrence and development of wildfires, as well as the distribution over Portugal of surface climatic variables with impact on wildfire activity, such as maximum and minimum temperature and precipitation. This work aims to present: (i) an overview of the existing circulation classifications for the Iberian Peninsula, and (ii) the results of a comparison study between these atmospheric circulation classifications based on their relation with wildfires and relevant meteorological

  8. Pancreatic neuroendocrine tumours: correlation between MSCT features and pathological classification

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Yanji; Dong, Zhi; Li, Zi-Ping; Feng, Shi-Ting [The First Affiliated Hospital, Sun Yat-Sen University, Department of Radiology, Guangzhou, Guangdong (China); Chen, Jie [The First Affiliated Hospital, Sun Yat-Sen University, Department of Gastroenterology, Guangzhou, Guangdong (China); Chan, Tao; Chen, Minhu [Union Hospital, Hong Kong, Medical Imaging Department, Shatin, N.T. (China); Lin, Yuan [The First Affiliated Hospital, Sun Yat-Sen University, Department of Pathology, Guangzhou, Guangdong (China)

    2014-11-15

    We aimed to evaluate the multi-slice computed tomography (MSCT) features of pancreatic neuroendocrine neoplasms (P-NENs) and analyse the correlation between the MSCT features and pathological classification of P-NENs. Forty-one patients, preoperatively investigated by MSCT and subsequently operated on with a histological diagnosis of P-NENs, were included. Various MSCT features of the primary tumour, lymph node, and distant metastasis were analysed. The relationship between MSCT features and pathologic classification of P-NENs was analysed with univariate and multivariate models. Contrast-enhanced images showed significant differences among the three grades of tumours in the absolute enhancement (P = 0.013) and relative enhancement (P = 0.025) at the arterial phase. Univariate analysis revealed statistically significant differences among the tumours of different grades (based on World Health Organization [WHO] 2010 classification) in tumour size (P = 0.001), tumour contour (P < 0.001), cystic necrosis (P = 0.001), tumour boundary (P = 0.003), dilatation of the main pancreatic duct (P = 0.001), peripancreatic tissue or vascular invasion (P < 0.001), lymphadenopathy (P = 0.011), and distant metastasis (P = 0.012). Multivariate analysis suggested that only peripancreatic tissue or vascular invasion (HR 3.934, 95 % CI, 0.426-7.442, P = 0.028) was significantly associated with WHO 2010 pathological classification. MSCT is helpful in evaluating the pathological classification of P-NENs. (orig.)

  9. Advanced Steel Microstructural Classification by Deep Learning Methods.

    Science.gov (United States)

    Azimi, Seyed Majid; Britz, Dominik; Engstler, Michael; Fritz, Mario; Mücklich, Frank

    2018-02-01

    The inner structure of a material is called its microstructure. It stores the genesis of a material and determines all its physical and chemical properties. While microstructural characterization is widespread and well known, microstructural classification is mostly done manually by human experts, which gives rise to uncertainties due to subjectivity. Since the microstructure can be a combination of different phases or constituents with complex substructures, its automatic classification is very challenging and only a few prior studies exist. Prior works focused on features designed and engineered by experts and classified microstructures separately from the feature extraction step. Recently, Deep Learning methods have shown strong performance in vision applications by learning the features from data together with the classification step. In this work, we propose a Deep Learning method for microstructural classification in the examples of certain microstructural constituents of low carbon steel. This novel method employs pixel-wise segmentation via a Fully Convolutional Neural Network (FCNN) accompanied by a max-voting scheme. Our system achieves 93.94% classification accuracy, drastically outperforming the state-of-the-art method's 48.89% accuracy. Beyond the strong performance of our method, this line of research offers a more robust and, above all, objective way to approach the difficult task of steel quality appreciation.
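
    The max-voting step layered on top of the pixel-wise segmentation amounts to a majority vote over the per-pixel class map within each object; a minimal sketch, where pixel_classes is assumed to be the argmax output of a segmentation network:

        import numpy as np

        def vote_object_class(pixel_classes, object_mask):
            """Max-voting: assign an object the majority class of its pixels."""
            votes = np.bincount(pixel_classes[object_mask].ravel())
            return int(votes.argmax())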

  10. Magnetic resonance imaging texture analysis classification of primary breast cancer

    International Nuclear Information System (INIS)

    Waugh, S.A.; Lerski, R.A.; Purdie, C.A.; Jordan, L.B.; Vinnicombe, S.; Martin, P.; Thompson, A.M.

    2016-01-01

    Patient-tailored treatments for breast cancer are based on histological and immunohistochemical (IHC) subtypes. Magnetic Resonance Imaging (MRI) texture analysis (TA) may be useful in non-invasive lesion subtype classification. Women with newly diagnosed primary breast cancer underwent pre-treatment dynamic contrast-enhanced breast MRI. TA was performed using co-occurrence matrix (COM) features, by creating a model on retrospective training data, then prospectively applying to a test set. Analyses were blinded to breast pathology. Subtype classifications were performed using a cross-validated k-nearest-neighbour (k = 3) technique, with accuracy relative to pathology assessed and receiver operator curve (AUROC) calculated. Mann-Whitney U and Kruskal-Wallis tests were used to assess raw entropy feature values. Histological subtype classifications were similar across training (n = 148 cancers) and test sets (n = 73 lesions) using all COM features (training: 75 %, AUROC = 0.816; test: 72.5 %, AUROC = 0.823). Entropy features were significantly different between lobular and ductal cancers (p < 0.001; Mann-Whitney U). IHC classifications using COM features were also similar for training and test data (training: 57.2 %, AUROC = 0.754; test: 57.0 %, AUROC = 0.750). Hormone receptor positive and negative cancers demonstrated significantly different entropy features. Entropy features alone were unable to create a robust classification model. Textural differences on contrast-enhanced MR images may reflect underlying lesion subtypes, which merits testing against treatment response. (orig.)
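
    A minimal sketch of the COM-feature pipeline described above, assuming scikit-image and scikit-learn; the distances, angles and the pooled entropy feature are illustrative choices, not the study's exact configuration:

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops
        from sklearn.neighbors import KNeighborsClassifier

        def com_features(img_uint8):
            """Co-occurrence matrix texture features for one lesion ROI."""
            glcm = graycomatrix(img_uint8, distances=[1], angles=[0, np.pi / 2],
                                levels=256, symmetric=True, normed=True)
            feats = [graycoprops(glcm, p).mean()
                     for p in ("contrast", "correlation", "energy", "homogeneity")]
            p = glcm[glcm > 0]
            feats.append(float(-(p * np.log2(p)).sum()))  # entropy pooled over both angles
            return np.array(feats)

        # knn = KNeighborsClassifier(n_neighbors=3).fit(train_feats, train_subtypes)
        # mirrors the cross-validated k = 3 nearest-neighbour step of the record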

  11. Classification of Strawberry Fruit Shape by Machine Learning

    Science.gov (United States)

    Ishikawa, T.; Hayashi, A.; Nagamatsu, S.; Kyutoku, Y.; Dan, I.; Wada, T.; Oku, K.; Saeki, Y.; Uto, T.; Tanabata, T.; Isobe, S.; Kochi, N.

    2018-05-01

    Shape is one of the most important traits of agricultural products due to its relationships with the quality, quantity, and value of the products. For strawberries, nine types of fruit shape were defined and classified by humans based on sample patterns of the nine types. In this study, we tested the classification of strawberry shapes by machine learning in order to increase the accuracy of the classification, and we introduce the concept of computerization into this field. Four types of descriptors were extracted from digital images of strawberries: (1) Measured Values (MVs), including the length of the contour line, the area, the fruit length and width, and the fruit width/length ratio; (2) the Ellipse Similarity Index (ESI); (3) Elliptic Fourier Descriptors (EFDs); and (4) Chain Code Subtraction (CCS). We used these descriptors for the classification test along with the random forest approach, and eight of the nine shape types were classified with combinations of MVs + CCS + EFDs. CCS is a descriptor that adds human knowledge to the chain codes, and it showed higher robustness in classification than the other descriptors. Our results suggest that machine learning can classify fruit shapes with high accuracy. We will attempt to increase the classification accuracy further and apply the machine learning methods to other plant species.
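
    A minimal sketch of the MV-style measurements plus the random forest step, assuming scikit-image and scikit-learn; the EFD and CCS vectors are assumed to be precomputed elsewhere, and all parameter values are illustrative:

        import numpy as np
        from skimage.measure import label, regionprops
        from sklearn.ensemble import RandomForestClassifier

        def measured_values(binary_mask):
            """MV-style descriptors from a segmented fruit silhouette."""
            r = regionprops(label(binary_mask))[0]
            length, width = r.major_axis_length, r.minor_axis_length
            return np.array([r.perimeter, r.area, length, width, width / length])

        # X = np.vstack([np.concatenate([measured_values(m), efd, ccs])
        #                for m, efd, ccs in samples])       # per-fruit descriptor rows
        # clf = RandomForestClassifier(n_estimators=500).fit(X, shape_labels)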

  12. Magnetic resonance imaging texture analysis classification of primary breast cancer

    Energy Technology Data Exchange (ETDEWEB)

    Waugh, S.A.; Lerski, R.A. [Ninewells Hospital and Medical School, Department of Medical Physics, Dundee (United Kingdom); Purdie, C.A.; Jordan, L.B. [Ninewells Hospital and Medical School, Department of Pathology, Dundee (United Kingdom); Vinnicombe, S. [University of Dundee, Division of Imaging and Technology, Ninewells Hospital and Medical School, Dundee (United Kingdom); Martin, P. [Ninewells Hospital and Medical School, Department of Clinical Radiology, Dundee (United Kingdom); Thompson, A.M. [University of Texas MD Anderson Cancer Center, Department of Surgical Oncology, Houston, TX (United States)

    2016-02-15

    Patient-tailored treatments for breast cancer are based on histological and immunohistochemical (IHC) subtypes. Magnetic Resonance Imaging (MRI) texture analysis (TA) may be useful in non-invasive lesion subtype classification. Women with newly diagnosed primary breast cancer underwent pre-treatment dynamic contrast-enhanced breast MRI. TA was performed using co-occurrence matrix (COM) features, by creating a model on retrospective training data, then prospectively applying to a test set. Analyses were blinded to breast pathology. Subtype classifications were performed using a cross-validated k-nearest-neighbour (k = 3) technique, with accuracy relative to pathology assessed and receiver operator curve (AUROC) calculated. Mann-Whitney U and Kruskal-Wallis tests were used to assess raw entropy feature values. Histological subtype classifications were similar across training (n = 148 cancers) and test sets (n = 73 lesions) using all COM features (training: 75 %, AUROC = 0.816; test: 72.5 %, AUROC = 0.823). Entropy features were significantly different between lobular and ductal cancers (p < 0.001; Mann-Whitney U). IHC classifications using COM features were also similar for training and test data (training: 57.2 %, AUROC = 0.754; test: 57.0 %, AUROC = 0.750). Hormone receptor positive and negative cancers demonstrated significantly different entropy features. Entropy features alone were unable to create a robust classification model. Textural differences on contrast-enhanced MR images may reflect underlying lesion subtypes, which merits testing against treatment response. (orig.)

  13. SU-E-T-625: Robustness Evaluation and Robust Optimization of IMPT Plans Based on Per-Voxel Standard Deviation of Dose Distributions.

    Science.gov (United States)

    Liu, W; Mohan, R

    2012-06-01

    Proton dose distributions, and IMPT dose distributions in particular, are highly sensitive to setup and range uncertainties. We report a novel method, based on the per-voxel standard deviation (SD) of dose distributions, to evaluate the robustness of proton plans and to robustly optimize IMPT plans so as to render them less sensitive to uncertainties. For each optimization iteration, nine dose distributions are computed - the nominal one, and one each for ± setup uncertainties along the x, y and z axes and for ± range uncertainty. The SD of dose in each voxel is used to create an SD-volume histogram (SVH) for each structure. The SVH may be considered a quantitative representation of the robustness of the dose distribution. For optimization, the desired robustness may be specified in terms of an SD-volume (SV) constraint on the CTV and incorporated as a term in the objective function. Results of optimization with and without this constraint were compared in terms of plan optimality and robustness using the so-called 'worst case' dose distributions, which are obtained by assigning the lowest among the nine doses to each voxel in the clinical target volume (CTV) and the highest to normal tissue voxels outside the CTV. The SVH curve and the area under it for each structure were used as quantitative measures of robustness. The penalty parameter of the SV constraint may be varied to control the tradeoff between robustness and plan optimality. We applied these methods to one case each of H&N and lung. In both cases, we found that imposing the SV constraint improved plan robustness, but at the cost of normal tissue sparing. SVH-based optimization and evaluation is an effective tool for robustness evaluation and robust optimization of IMPT plans. Studies need to be conducted to test the methods for larger cohorts of patients and for other sites. This research is supported by National Cancer Institute (NCI) grant P01CA021239, the University Cancer Foundation via the Institutional Research Grant program at the University of Texas MD
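
    The per-voxel SD and SVH computation lends itself to a compact sketch (an illustration assuming numpy; the array shapes and structure dictionary are hypothetical):

```python
import numpy as np

def svh(dose_stack, structures, sd_thresholds):
    """SD-volume histogram: for each structure, the fraction of its voxels
    whose per-voxel dose standard deviation is at least each threshold.

    dose_stack: (9, nx, ny, nz) array - the nominal dose distribution plus
                the eight setup-/range-perturbed ones.
    structures: dict mapping structure name -> boolean voxel mask.
    """
    per_voxel_sd = dose_stack.std(axis=0)
    return {name: [(per_voxel_sd[mask] >= t).mean() for t in sd_thresholds]
            for name, mask in structures.items()}

# Toy example on a small grid (doses in Gy)
rng = np.random.default_rng(0)
doses = 60.0 + rng.normal(scale=2.0, size=(9, 20, 20, 20))
ctv = np.zeros((20, 20, 20), dtype=bool)
ctv[5:15, 5:15, 5:15] = True
print(svh(doses, {'CTV': ctv}, sd_thresholds=[0.5, 1.0, 2.0, 3.0]))
```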

  14. Adaptive partial volume classification of MRI data

    International Nuclear Information System (INIS)

    Chiverton, John P; Wells, Kevin

    2008-01-01

    Tomographic biomedical images are commonly affected by an imaging artefact known as the partial volume (PV) effect. The PV effect produces voxels composed of a mixture of tissues in anatomical magnetic resonance imaging (MRI) data, resulting in a continuum between these tissue classes. Anatomical MRI data typically consist of a number of contiguous regions of tissues or even contiguous regions of PV voxels. Furthermore, discontinuities exist between the boundaries of these contiguous image regions. The work presented here probabilistically models the PV effect using spatial regularization in the form of continuous Markov random fields (MRFs) to classify anatomical MRI brain data, both simulated and real. A unique approach is used to adaptively control the amount of spatial regularization imposed by the MRF. Spatially derived image gradient magnitude is used to identify the discontinuities between image regions of contiguous tissue voxels and PV voxels, imposing variable amounts of regularization determined by simulation. Markov chain Monte Carlo (MCMC) is used to simulate the posterior distribution of the probabilistic image model. Promising quantitative results are presented for PV classification of simulated and real MRI data of the human brain.
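
    The adaptive-regularization idea can be illustrated as follows (a sketch only: the exponential mapping from gradient magnitude to regularization weight is an assumed form, whereas the paper determines the amounts of regularization by simulation):

```python
import numpy as np
from scipy.ndimage import gaussian_gradient_magnitude

def adaptive_mrf_weights(image, beta_max=1.0, sigma=1.0, scale=None):
    """Per-voxel MRF regularisation weight that relaxes smoothing across
    strong edges: beta(v) = beta_max * exp(-|grad I(v)| / scale).
    The exponential form is an assumed illustration; the paper derives its
    regularisation amounts by simulation."""
    grad = gaussian_gradient_magnitude(image.astype(float), sigma=sigma)
    if scale is None:
        scale = grad.mean() + 1e-12
    return beta_max * np.exp(-grad / scale)

# Two "tissues" with a sharp boundary down the middle
img = np.zeros((32, 32))
img[:, 16:] = 1.0
beta = adaptive_mrf_weights(img)
print(beta[16, 2], beta[16, 16])  # near beta_max in flat regions, small at the edge
```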

  15. Detection and Classification of Whale Acoustic Signals

    Science.gov (United States)

    Xian, Yin

    This dissertation focuses on two vital challenges in relation to whale acoustic signals: detection and classification. In detection, we evaluated the influence of the uncertain ocean environment on the spectrogram-based detector, and derived the likelihood ratio of the proposed Short Time Fourier Transform detector. Experimental results showed that the proposed detector outperforms detectors based on the spectrogram. The proposed detector is more sensitive to environmental changes because it includes phase information. In classification, our focus is on finding a robust and sparse representation of whale vocalizations. Because whale vocalizations can be modeled as polynomial phase signals, we can represent the whale calls by their polynomial phase coefficients. In this dissertation, we used the Weyl transform to capture chirp rate information, and used a two-dimensional feature set to represent whale vocalizations globally. Experimental results showed that our Weyl feature set outperforms chirplet coefficients and MFCC (Mel Frequency Cepstral Coefficients) when applied to our collected data. Since whale vocalizations can be represented by polynomial phase coefficients, it is plausible that the signals lie on a manifold parameterized by these coefficients. We also studied the intrinsic structure of high-dimensional whale data by exploiting its geometry. Experimental results showed that nonlinear mappings such as Laplacian Eigenmap and ISOMAP outperform linear mappings such as PCA and MDS, suggesting that the whale acoustic data is nonlinear. We also explored deep learning algorithms on whale acoustic data. We built each layer as convolutions with either a PCA filter bank (PCANet) or a DCT filter bank (DCTNet). With the DCT filter bank, each layer has a different time-frequency scale representation, and from this, one can extract different physical information. Experimental results showed that our PCANet and DCTNet achieve high classification rate on the whale
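
    The linear-versus-nonlinear embedding comparison is easy to reproduce in outline (a sketch with scikit-learn on a synthetic nonlinear dataset standing in for the high-dimensional call features; SpectralEmbedding is scikit-learn's Laplacian Eigenmap implementation):

```python
from sklearn.datasets import make_s_curve
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap, SpectralEmbedding  # SpectralEmbedding = Laplacian Eigenmap

# Synthetic nonlinear data standing in for high-dimensional call features
X, _ = make_s_curve(n_samples=800, random_state=0)

embeddings = {
    'PCA (linear)': PCA(n_components=2).fit_transform(X),
    'Isomap (nonlinear)': Isomap(n_components=2, n_neighbors=12).fit_transform(X),
    'Laplacian Eigenmap (nonlinear)':
        SpectralEmbedding(n_components=2, n_neighbors=12).fit_transform(X),
}
for name, Y in embeddings.items():
    print(name, '->', Y.shape)   # each maps the 3-D curve to 2-D coordinates
```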

  16. Reservoir Identification: Parameter Characterization or Feature Classification

    Science.gov (United States)

    Cao, J.

    2017-12-01

    The ultimate goal of oil and gas exploration is to find oil or gas reservoirs with industrial mining value. Therefore, the core task of modern oil and gas exploration is to identify oil or gas reservoirs on seismic profiles. Traditionally, the reservoir is identified by seismic inversion of a series of physical parameters such as porosity, saturation, permeability, formation pressure, and so on. Due to the heterogeneity of the geological medium, the approximations of the inversion model, and the incompleteness and noisiness of the data, the inversion results are highly uncertain and must be calibrated or corrected with well data. In areas where there are few or no wells, reservoir identification based on seismic inversion is high-risk. Reservoir identification is essentially a classification issue. In the identification process, the underground rocks are divided into reservoirs with industrial mining value and host rocks without industrial mining value. In addition to the traditional classification by physical parameters, the classification may be achieved using one or a few comprehensive features. By introducing the concept of the seismic-print, we have developed a new reservoir identification method based on seismic-print analysis. Furthermore, we explore the possibility of using deep learning to discover the seismic-print characteristics of oil and gas reservoirs. Preliminary experiments have shown that deep learning on seismic data can distinguish gas reservoirs from host rocks. The combination of seismic-print analysis and seismic deep learning is expected to yield a more robust reservoir identification method. The work was supported by NSFC under grant No. 41430323 and No. U1562219, and the National Key Research and Development Program under Grant No. 2016YFC0601

  17. Supply chain planning classification

    Science.gov (United States)

    Hvolby, Hans-Henrik; Trienekens, Jacques; Bonde, Hans

    2001-10-01

    Industry experiences a need to shift focus from internal production planning towards planning in the supply network. In this respect, customer-oriented thinking becomes almost a common good amongst companies in the supply network. An increase in the use of information technology is needed to enable companies to better tune their production planning with customers and suppliers. Information technology opportunities and supply chain planning systems enable companies to monitor and control their supplier network. In spite of these developments, most links in today's supply chains make individual plans, because real demand information is not available throughout the chain. The current systems and processes of the supply chains are not designed to meet the requirements now placed upon them. For long-term relationships with suppliers and customers, an integrated decision-making process is needed in order to obtain a satisfactory result for all parties, especially when customized production and short lead times are in focus. An effective value chain makes inventory available and visible among the value chain members, minimizes response time and optimizes the total inventory value held throughout the chain. In this paper a supply chain planning classification grid is presented, based on current manufacturing classifications and supply chain planning initiatives.

  18. Waste classification sampling plan

    International Nuclear Information System (INIS)

    Landsman, S.D.

    1998-01-01

    The purpose of this sampling plan is to explain the method used to collect and analyze the data necessary to verify and/or determine the radionuclide content of the B-Cell decontamination and decommissioning waste stream, so that the correct waste classification for the waste stream can be made, and to collect samples for studies of decontamination methods that could be used to remove the fixed contamination present on the waste. The scope of this plan is to establish the technical basis for collecting samples and compiling quantitative data on the radioactive constituents present in waste generated during deactivation activities in B-Cell. Sampling and radioisotopic analysis will be performed on the fixed layers of contamination present on structural material and internal surfaces of process piping and tanks. In addition, dose rate measurements on existing waste material will be performed to determine the fractions of the dose rate attributable to removable and to fixed contamination. Samples will also be collected to support studies of decontamination methods that are effective in removing the fixed contamination present on the waste. Sampling performed under this plan will meet criteria established in BNF-2596, Data Quality Objectives for the B-Cell Waste Stream Classification Sampling, J. M. Barnett, May 1998

  19. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster-based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases...... the accuracy at the same time. The test example is classified using a simpler and smaller model. The training examples in a particular cluster share a common vocabulary. At the time of clustering, we do not take into account the labels of the training examples. After the clusters have been created......, the classifier is trained on each cluster, having reduced dimensionality and fewer examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups...
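
    A minimal sketch of the cluster-then-classify idea (an illustrative reconstruction, not the authors' code: the cluster count, TF-IDF features, and naive Bayes base classifier are assumptions, and the 20 Newsgroups fetch requires network access):

```python
import numpy as np
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.naive_bayes import MultinomialNB

# Cluster the training set without using labels, then train one small
# classifier per cluster; a test document is routed to its nearest
# cluster and classified by that cluster's model.
cats = ['sci.space', 'rec.autos']
train = fetch_20newsgroups(subset='train', categories=cats)
test = fetch_20newsgroups(subset='test', categories=cats)

vec = TfidfVectorizer(max_features=5000)
Xtr, Xte = vec.fit_transform(train.data), vec.transform(test.data)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(Xtr)
models = {c: MultinomialNB().fit(Xtr[km.labels_ == c],
                                 train.target[km.labels_ == c])
          for c in range(4)}

pred = np.array([models[c].predict(x)[0]
                 for c, x in zip(km.predict(Xte), Xte)])
print('accuracy:', (pred == test.target).mean())
```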

  20. Classification of smooth Fano polytopes

    DEFF Research Database (Denmark)

    Øbro, Mikkel

    A simplicial lattice polytope containing the origin in its interior is called a smooth Fano polytope if the vertices of every facet form a basis of the lattice. The study of smooth Fano polytopes is motivated by their connection to toric varieties. The thesis concerns the classification of smooth...... Fano polytopes up to isomorphism. A smooth Fano d-polytope can have at most 3d vertices. In the case of 3d vertices an explicit classification is known. The thesis contains the classification in the case of 3d-1 vertices. Classifications of smooth Fano d-polytopes for fixed d exist only for small d. In the thesis an algorithm...... for the classification of smooth Fano d-polytopes for any given d is presented. The algorithm has been implemented and used to obtain the complete classification for d ≤ 8.

  1. Small-scale classification schemes

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2004-01-01

    Small-scale classification schemes are used extensively in the coordination of cooperative work. This study investigates the creation and use of a classification scheme for handling the system requirements during the redevelopment of a nation-wide information system. This requirements...... classification inherited a lot of its structure from the existing system and rendered requirements that transcended the framework laid out by the existing system almost invisible. As a result, the requirements classification became a defining element of the requirements-engineering process, though its main...... effects remained largely implicit. The requirements classification contributed to constraining the requirements-engineering process by supporting the software engineers in maintaining some level of control over the process. This way, the requirements classification provided the software engineers...

  2. Robust synthesis for real-time systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Legay, Axel; Traonouez, Luois-Marie

    2014-01-01

    Specification theories for real-time systems allow reasoning about interfaces and their implementation models, using a set of operators that includes satisfaction, refinement, logical and parallel composition. To make such theories applicable throughout the entire design process from an abstract specification to an implementation, we need to reason about the possibility to effectively implement the theoretical specifications on physical systems, despite their limited precision. In the literature, this implementation problem has been linked to the robustness problem that analyzes the consequences of introducing small perturbations into formal models. We address this problem of robust implementations in timed specification theories. We first consider a fixed perturbation and study the robustness of timed specifications with respect to the operators of the theory. To this end we synthesize robust......

  3. Robust adaptive synchronization of general dynamical networks ...

    Indian Academy of Sciences (India)

    Robust adaptive synchronization; dynamical network; multiple delays; multiple uncertainties. ... Networks such as neural networks, communication transmission networks, social relationship networks, etc.

  4. Technical Challenges Hindering Development of Robust Wireless ...

    African Journals Online (AJOL)

    PROF. OLIVER OSUAGWA

    2015-12-01

    Dec 1, 2015 ... challenges remain to be resolved in designing robust wireless networks that can deliver the performance ... demonstrated the first radio transmission from the Isle of ... distances with better quality, less power, and smaller ...

  5. Multifidelity Robust Aeroelastic Design, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Nielsen Engineering & Research (NEAR) proposes a new method to generate mathematical models of wind-tunnel models and flight vehicles for robust aeroelastic...

  6. The structural robustness of multiprocessor computing system

    Directory of Open Access Journals (Sweden)

    N. Andronaty

    1996-03-01

    Full Text Available The model of a multiprocessor computing system based on transputers, which permits resolution of the question of evaluating structural robustness (viability, survivability), is described.

  7. Design principles for robust oscillatory behavior.

    Science.gov (United States)

    Castillo-Hair, Sebastian M; Villota, Elizabeth R; Coronado, Alberto M

    2015-09-01

    Oscillatory responses are ubiquitous in regulatory networks of living organisms, a fact that has led to extensive efforts to study and replicate the circuits involved. However, to date, design principles that underlie the robustness of natural oscillators are not completely known. Here we study a three-component enzymatic network model in order to determine the topological requirements for robust oscillation. First, by simulating every possible topological arrangement and varying their parameter values, we demonstrate that robust oscillators can be obtained by augmenting the number of both negative feedback loops and positive autoregulations while maintaining an appropriate balance of positive and negative interactions. We then identify network motifs, whose presence in more complex topologies is a necessary condition for obtaining oscillatory responses. Finally, we pinpoint a series of simple architectural patterns that progressively render more robust oscillators. Together, these findings can help in the design of more reliable synthetic biomolecular networks and may also have implications in the understanding of other oscillatory systems.
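
    The kind of cyclic negative-feedback topology the paper identifies can be simulated in a few lines (a repressilator-style illustration, not the paper's enzymatic network model; the parameter values are arbitrary):

```python
import numpy as np
from scipy.integrate import odeint

def feedback_loop(x, t, alpha=50.0, n=3.0, delta=1.0):
    """Three-component cyclic negative-feedback loop: each component
    represses production of the next (repressilator-style illustration,
    not the paper's enzymatic network model)."""
    a, b, c = x
    return [alpha / (1.0 + c**n) - delta * a,
            alpha / (1.0 + a**n) - delta * b,
            alpha / (1.0 + b**n) - delta * c]

t = np.linspace(0.0, 50.0, 2000)
traj = odeint(feedback_loop, [1.0, 1.5, 2.0], t)

# Crude check for sustained oscillation: count local maxima of component a
a = traj[:, 0]
peaks = int(((a[1:-1] > a[:-2]) & (a[1:-1] > a[2:])).sum())
print('local maxima of a:', peaks)
```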

  8. Robust and Efficient Parametric Face Alignment

    NARCIS (Netherlands)

    Tzimiropoulos, Georgios; Zafeiriou, Stefanos; Pantic, Maja

    2011-01-01

    We propose a correlation-based approach to parametric object alignment particularly suitable for face analysis applications which require efficiency and robustness against occlusions and illumination changes. Our algorithm registers two images by iteratively maximizing their correlation coefficient.

  9. Framework for Robustness Assessment of Timber Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2011-01-01

    This paper presents a theoretical framework for the design and analysis of the robustness of timber structures. This is actualized by a more frequent use of advanced types of timber structures with limited redundancy and serious consequences in the case of failure. Combined with increased requirements...... to efficiency in design and execution, followed by an increased risk of human errors, this has made requirements on the robustness of new structures essential. Further, the collapse of the Ballerup Super Arena, the Bad Reichenhall Ice-Arena and a number of other structural systems during the last 10 years has...... increased the interest in robustness. Typically, modern structural design codes require that 'the consequence of damages to structures should not be disproportional to the causes of the damages'. However, although the importance of robustness for structural design is widely recognized, the code requirements...

  10. Robust Analysis and Design of Multivariable Systems

    National Research Council Canada - National Science Library

    Tannenbaum, Allen

    1998-01-01

    In this Final Report, we will describe the work we have performed in robust control theory and nonlinear control, and the utilization of techniques in image processing and computer vision for problems in visual tracking...

  11. Robust Tracking Control for a Piezoelectric Actuator

    National Research Council Canada - National Science Library

    Salah, M; McIntyre, M; Dawson, D; Wagner, J

    2006-01-01

    In this paper, a hysteresis model-based nonlinear robust controller is developed for a piezoelectric actuator, utilizing a Lyapunov-based stability analysis, which ensures that a desired displacement...

  12. Robustness studies on coal gasification process variables

    African Journals Online (AJOL)

    coal before feeding to the gasification process [1]. .... to-control variables will make up the terms in the response surface model for the ... Montgomery (1999) explained that all the Taguchi engineering objectives for a robust ..... software [3].

  13. [Definition, etiology, classification and presentation forms].

    Science.gov (United States)

    Mas Garriga, Xavier

    2014-01-01

    Osteoarthritis is defined as a degenerative process affecting the joints as a result of mechanical and biological disorders that destabilize the balance between the synthesis and degradation of joint cartilage, stimulating the growth of subchondral bone; chronic synovitis is also present. Currently, the joint is considered as a functional unit that includes distinct tissues, mainly cartilage, the synovial membrane, and subchondral bone, all of which are involved in the pathogenesis of the disease. Distinct risk factors for the development of osteoarthritis have been described: general, unmodifiable risk factors (age, sex, and genetic makeup), general, modifiable risk factors (obesity and hormonal factors) and local risk factors (prior joint anomalies and joint overload). Notable among the main factors related to disease progression are joint alignment defects and generalized osteoarthritis. Several classifications of osteoarthritis have been proposed but none is particularly important for the primary care management of the disease. These classifications include etiological (primary or idiopathic forms and secondary forms) and topographical (typical and atypical localizations) classifications, the Kellgren and Lawrence classification (radiological repercussions) and that of the American College of Rheumatology for osteoarthritis of the hand, hip and knee. The prevalence of knee osteoarthritis is 10.2% in Spain and shows a marked discrepancy between clinical and radiological findings. Hand osteoarthritis, with a prevalence of symptomatic involvement of around 6.2%, has several forms of presentation (nodal osteoarthritis, generalized osteoarthritis, rhizarthrosis, and erosive osteoarthritis). Symptomatic osteoarthritis of the hip affects between 3.5% and 5.6% of persons older than 50 years and has different radiological patterns depending on femoral head migration. Copyright © 2014 Elsevier España, S.L. All rights reserved.

  14. Active Learning for Text Classification

    OpenAIRE

    Hu, Rong

    2011-01-01

    Text classification approaches are used extensively to solve real-world challenges. The success or failure of text classification systems hangs on the datasets used to train them; without a good dataset it is impossible to build a quality system. This thesis examines the applicability of active learning in text classification for the rapid and economical creation of labelled training data. Four main contributions are made in this thesis. First, we present two novel selection strategies to cho...
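
    Pool-based uncertainty sampling, the most common active learning baseline, gives a feel for the approach (an illustrative sketch; the thesis' own selection strategies are not reproduced here):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Pool-based active learning with uncertainty sampling: repeatedly "label"
# the pool example about which the current model is least certain.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
labeled = list(range(10))                  # small initial seed set
pool = list(range(10, len(y)))

for _ in range(40):                        # simulated labelling budget
    clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])[:, 1]
    query = pool[int(np.abs(proba - 0.5).argmin())]   # most uncertain example
    labeled.append(query)                  # oracle provides y[query]
    pool.remove(query)

print('labels used:', len(labeled), ' accuracy:', clf.score(X, y))
```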

  15. Unsupervised Classification Using Immune Algorithm

    OpenAIRE

    Al-Muallim, M. T.; El-Kouatly, R.

    2012-01-01

    An unsupervised classification algorithm based on the clonal selection principle, named Unsupervised Clonal Selection Classification (UCSC), is proposed in this paper. The new proposed algorithm is data-driven and self-adaptive: it adjusts its parameters to the data to make the classification operation as fast as possible. The performance of UCSC is evaluated by comparing it with the well-known K-means algorithm using several artificial and real-life data sets. The experiments show that the proposed U...

  16. Antecedents and Dimensions of Supply Chain Robustness

    OpenAIRE

    Durach, Christian F.; Wieland, Andreas; Machuca, Jose A.D.

    2015-01-01

    Purpose – The purpose of this paper is to provide groundwork for an emerging theory of supply chain robustness – which has been conceptualized as a dimension of supply chain resilience – through reviewing and synthesizing related yet disconnected studies. The paper develops a formal definition of supply chain robustness to build a framework that captures the dimensions, antecedents and moderators of the construct as discussed in the literature. Design/methodology/approach – The...

  17. Contributions for classification of platelet rich plasma - proposal of a new classification: MARSPILL.

    Science.gov (United States)

    Lana, Jose Fabio Santos Duarte; Purita, Joseph; Paulus, Christian; Huber, Stephany Cares; Rodrigues, Bruno Lima; Rodrigues, Ana Amélia; Santana, Maria Helena; Madureira, João Lopo; Malheiros Luzo, Ângela Cristina; Belangero, William Dias; Annichino-Bizzacchi, Joyce Maria

    2017-07-01

    Platelet-rich plasma (PRP) has emerged as a significant therapy used in medical conditions with heterogeneous results. There are some important classifications that try to standardize the PRP procedure. The aim of this report is to describe PRP contents by studying cellular and molecular components, and also to propose a new classification for PRP. The main focus is on mononuclear cells, which comprise progenitor cells and monocytes. In addition, there are important variables related to PRP application incorporated in this study, which are the harvest method, activation, red blood cells, number of spins, image guidance, leukocyte number and light activation. The other focus is the discussion about the presence of progenitor cells in peripheral blood, which are interesting due to neovasculogenesis and proliferation. The function of monocytes (as tissue macrophages) is discussed here, as well as their plasticity, a potential property for regenerative medicine treatments.

  18. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology

    Science.gov (United States)

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so phenotype stability of biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental
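
    Restated symbolically (the symbol names are illustrative, chosen here for compactness), the phenotype robustness criterion from the abstract reads:

```latex
% r_i, r_g, r_e, r_n denote intrinsic, genetic, environmental,
% and network robustness respectively (names assumed for this sketch).
\[
  r_i + r_g + r_e \le r_n
  \;\Longrightarrow\; \text{phenotype robustness is maintained}
\]
```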

  19. Cancer classification in the genomic era: five contemporary problems.

    Science.gov (United States)

    Song, Qingxuan; Merajver, Sofia D; Li, Jun Z

    2015-10-19

    Classification is an everyday instinct as well as a full-fledged scientific discipline. Throughout the history of medicine, disease classification is central to how we develop knowledge, make diagnosis, and assign treatment. Here, we discuss the classification of cancer and the process of categorizing cancer subtypes based on their observed clinical and biological features. Traditionally, cancer nomenclature is primarily based on organ location, e.g., "lung cancer" designates a tumor originating in lung structures. Within each organ-specific major type, finer subgroups can be defined based on patient age, cell type, histological grades, and sometimes molecular markers, e.g., hormonal receptor status in breast cancer or microsatellite instability in colorectal cancer. In the past 15+ years, high-throughput technologies have generated rich new data regarding somatic variations in DNA, RNA, protein, or epigenomic features for many cancers. These data, collected for increasingly large tumor cohorts, have provided not only new insights into the biological diversity of human cancers but also exciting opportunities to discover previously unrecognized cancer subtypes. Meanwhile, the unprecedented volume and complexity of these data pose significant challenges for biostatisticians, cancer biologists, and clinicians alike. Here, we review five related issues that represent contemporary problems in cancer taxonomy and interpretation. (1) How many cancer subtypes are there? (2) How can we evaluate the robustness of a new classification system? (3) How are classification systems affected by intratumor heterogeneity and tumor evolution? (4) How should we interpret cancer subtypes? (5) Can multiple classification systems co-exist? While related issues have existed for a long time, we will focus on those aspects that have been magnified by the recent influx of complex multi-omics data. Exploration of these problems is essential for data-driven refinement of cancer classification

  20. Precision automation of cell type classification and sub-cellular fluorescence quantification from laser scanning confocal images

    Directory of Open Access Journals (Sweden)

    Hardy Craig Hall

    2016-02-01

    Full Text Available While novel whole-plant phenotyping technologies have been successfully implemented into functional genomics and breeding programs, the potential of automated phenotyping with cellular resolution is largely unexploited. Laser scanning confocal microscopy has the potential to close this gap by providing spatially highly resolved images containing anatomic as well as chemical information on a subcellular basis. However, in the absence of automated methods, the assessment of the spatial patterns and abundance of fluorescent markers with subcellular resolution is still largely qualitative and time-consuming. Recent advances in image acquisition and analysis, coupled with improvements in microprocessor performance, have brought such automated methods within reach, so that information from thousands of cells per image for hundreds of images may be derived in an experimentally convenient time-frame. Here, we present a MATLAB-based analytical pipeline to (1) segment radial plant organs into individual cells, (2) classify cells into cell type categories based upon random forest classification, (3) divide each cell into sub-regions, and (4) quantify fluorescence intensity to a subcellular degree of precision for a separate fluorescence channel. In this research advance, we demonstrate the precision of this analytical process for the relatively complex tissues of Arabidopsis hypocotyls at various stages of development. High speed and robustness make our approach suitable for phenotyping of large collections of stem-like material and other tissue types.
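
    A Python analogue of the pipeline's main steps might look as follows (a sketch only: the original is MATLAB-based, and the thresholding segmentation and three-feature classifier here are stand-ins for the paper's methods; sub-region division is omitted):

```python
import numpy as np
from scipy import ndimage
from skimage import measure
from sklearn.ensemble import RandomForestClassifier

def analyse(anatomy, fluorescence, clf):
    """(1) segment the anatomy channel into cells, (2) classify each cell
    with a random forest, and (4) quantify its mean fluorescence from a
    second channel; sub-region division (3) is omitted for brevity."""
    cells, _ = ndimage.label(anatomy > anatomy.mean())   # crude stand-in segmentation
    results = []
    for region in measure.regionprops(cells, intensity_image=fluorescence):
        feats = [region.area, region.eccentricity, region.solidity]
        cell_type = int(clf.predict([feats])[0])         # cell type category
        results.append((region.label, cell_type, region.mean_intensity))
    return results

# Toy training data for the cell-type classifier and toy images
rng = np.random.default_rng(0)
clf = RandomForestClassifier(random_state=0).fit(rng.random((30, 3)),
                                                 rng.integers(0, 3, 30))
anatomy, fluo = rng.random((64, 64)), rng.random((64, 64))
print(analyse(anatomy, fluo, clf)[:3])
```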