WorldWideScience

Sample records for large margin classifiers

  1. Maximum margin classifier working in a set of strings.

    Science.gov (United States)

    Koyano, Hitoshi; Hayashida, Morihiro; Akutsu, Tatsuya

    2016-03-01

Numbers and numerical vectors account for a large portion of data. However, recently, the amount of string data generated has increased dramatically. Consequently, classifying string data is a common problem in many fields. The most widely used approach to this problem is to convert strings into numerical vectors using string kernels and subsequently apply a support vector machine that works in a numerical vector space. However, this non-one-to-one conversion involves a loss of information and makes it impossible to evaluate, using probability theory, the generalization error of a learning machine, considering that the given data to train and test the machine are strings generated according to probability laws. In this study, we approach this classification problem by constructing a classifier that works in a set of strings. To evaluate the generalization error of such a classifier theoretically, probability theory for strings is required. Therefore, we first extend a limit theorem for a consensus sequence of strings demonstrated by one of the authors and co-workers in a previous study. Using the obtained result, we then demonstrate that our learning machine classifies strings in an asymptotically optimal manner. Furthermore, we demonstrate the usefulness of our machine in practical data analysis by applying it to predicting protein-protein interactions using amino acid sequences and classifying RNAs by secondary structure using nucleotide sequences.
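As a point of contrast with the string-space classifier described above, the vectorization step the abstract criticizes can be sketched with a minimal k-mer spectrum kernel (a hypothetical, simplified choice of string kernel; the function names are illustrative):

```python
from collections import Counter

def spectrum_features(s, k=2):
    """Map a string to its k-mer count vector (the classic spectrum-kernel
    feature map; a minimal stand-in for the string kernels mentioned above)."""
    return Counter(s[i:i + k] for i in range(len(s) - k + 1))

def spectrum_kernel(a, b, k=2):
    """Inner product of two k-mer count vectors."""
    fa, fb = spectrum_features(a, k), spectrum_features(b, k)
    return sum(fa[m] * fb[m] for m in fa.keys() & fb.keys())
```

Because the map keeps only k-mer counts, distinct strings can collapse to the same vector (e.g. any two anagrams under k = 1), which illustrates the non-one-to-one information loss the authors avoid by working directly in the set of strings.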

  2. Large margin image set representation and classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-07-06

In this paper, we propose a novel image set representation and classification method by maximizing the margin of image sets. The margin of an image set is defined as the difference of the distance to its nearest image set from different classes and the distance to its nearest image set of the same class. By modeling the image sets using both their image samples and their affine hull models, and maximizing the margins of the image sets, the image set representation parameter learning problem is formulated as a minimization problem, which is further optimized by an expectation-maximization (EM) strategy with accelerated proximal gradient (APG) optimization in an iterative algorithm. To classify a given test image set, we assign it to the class which could provide the largest margin. Experiments on two applications of video-sequence-based face recognition demonstrate that the proposed method significantly outperforms state-of-the-art image set classification methods in terms of both effectiveness and efficiency.
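The margin definition in the abstract can be sketched in a few lines, assuming sets are represented only by their image samples (the paper additionally uses affine hull models, which are omitted here; the function names are illustrative):

```python
import math

def set_distance(A, B):
    """Sample-based distance between two image sets: the minimum pairwise
    Euclidean distance between their samples (a simplification of the
    paper's sample-plus-affine-hull model)."""
    return min(math.dist(a, b) for a in A for b in B)

def set_margin(query, same_class_sets, other_class_sets):
    """Margin of an image set as defined in the abstract: distance to its
    nearest set from a different class minus distance to its nearest set
    of the same class. A larger margin means a safer classification."""
    d_other = min(set_distance(query, S) for S in other_class_sets)
    d_same = min(set_distance(query, S) for S in same_class_sets)
    return d_other - d_same
```

To classify a test set under this sketch, one would compute the margin against each candidate class and pick the class yielding the largest value, mirroring the decision rule in the abstract.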

  3. Large margin image set representation and classification

    KAUST Repository

    Wang, Jim Jing-Yan; Alzahrani, Majed A.; Gao, Xin

    2014-01-01

In this paper, we propose a novel image set representation and classification method by maximizing the margin of image sets. The margin of an image set is defined as the difference of the distance to its nearest image set from different classes and the distance to its nearest image set of the same class. By modeling the image sets using both their image samples and their affine hull models, and maximizing the margins of the image sets, the image set representation parameter learning problem is formulated as a minimization problem, which is further optimized by an expectation-maximization (EM) strategy with accelerated proximal gradient (APG) optimization in an iterative algorithm. To classify a given test image set, we assign it to the class which could provide the largest margin. Experiments on two applications of video-sequence-based face recognition demonstrate that the proposed method significantly outperforms state-of-the-art image set classification methods in terms of both effectiveness and efficiency.

  4. A Directed Acyclic Graph-Large Margin Distribution Machine Model for Music Symbol Classification.

    Directory of Open Access Journals (Sweden)

    Cuihong Wen

Full Text Available Optical Music Recognition (OMR) has received increasing attention in recent years. In this paper, we propose a classifier based on a new method named Directed Acyclic Graph-Large margin Distribution Machine (DAG-LDM). The DAG-LDM is an improvement of the Large margin Distribution Machine (LDM), which is a binary classifier that optimizes the margin distribution by maximizing the margin mean and minimizing the margin variance simultaneously. We modify the LDM to the DAG-LDM to solve the multi-class music symbol classification problem. Tests are conducted on more than 10000 music symbol images, obtained from handwritten and printed images of music scores. The proposed method provides superior classification capability and achieves much higher classification accuracy than state-of-the-art algorithms such as Support Vector Machines (SVMs) and Neural Networks (NNs).

  5. A Directed Acyclic Graph-Large Margin Distribution Machine Model for Music Symbol Classification.

    Science.gov (United States)

    Wen, Cuihong; Zhang, Jing; Rebelo, Ana; Cheng, Fanyong

    2016-01-01

    Optical Music Recognition (OMR) has received increasing attention in recent years. In this paper, we propose a classifier based on a new method named Directed Acyclic Graph-Large margin Distribution Machine (DAG-LDM). The DAG-LDM is an improvement of the Large margin Distribution Machine (LDM), which is a binary classifier that optimizes the margin distribution by maximizing the margin mean and minimizing the margin variance simultaneously. We modify the LDM to the DAG-LDM to solve the multi-class music symbol classification problem. Tests are conducted on more than 10000 music symbol images, obtained from handwritten and printed images of music scores. The proposed method provides superior classification capability and achieves much higher classification accuracy than the state-of-the-art algorithms such as Support Vector Machines (SVMs) and Neural Networks (NNs).
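The two margin-distribution statistics that the LDM optimizes can be illustrated with a small sketch, assuming a linear classifier and labels in {-1, +1} (this illustrates the optimized quantities, not the LDM solver itself):

```python
def signed_margins(w, X, y):
    """Signed margins y_i * <w, x_i> of a linear classifier, for labels
    in {-1, +1}; positive margin = correctly classified point."""
    return [yi * sum(wj * xj for wj, xj in zip(w, xi)) for xi, yi in zip(X, y)]

def margin_mean_and_variance(w, X, y):
    """The two statistics the LDM optimizes: it maximizes the margin mean
    and minimizes the margin variance, instead of only the single minimum
    margin that a standard SVM maximizes."""
    m = signed_margins(w, X, y)
    mean = sum(m) / len(m)
    var = sum((mi - mean) ** 2 for mi in m) / len(m)
    return mean, var
```

Intuitively, a high mean with low variance pushes the whole margin distribution away from the decision boundary, rather than just its worst-case point.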

  6. Large margin classification with indefinite similarities

    KAUST Repository

    Alabdulmohsin, Ibrahim

    2016-01-07

Classification with indefinite similarities has attracted attention in the machine learning community. This is partly due to the fact that many similarity functions that arise in practice are not symmetric positive semidefinite, i.e. the Mercer condition is not satisfied, or the Mercer condition is difficult to verify. Examples of such indefinite similarities in machine learning applications are ample, including, for instance, the BLAST similarity score between protein sequences, human-judged similarities between concepts and words, and the tangent distance or the shape matching distance in computer vision. Nevertheless, previous works on classification with indefinite similarities are not fully satisfactory. They have either introduced sources of inconsistency in handling past and future examples using kernel approximation, settled for local-minimum solutions using non-convex optimization, or produced non-sparse solutions by learning in Krein spaces. Despite the large volume of research devoted to this subject lately, we demonstrate in this paper how an old idea, namely the 1-norm support vector machine (SVM) proposed more than 15 years ago, has several advantages over more recent work. In particular, the 1-norm SVM method is conceptually simpler, which makes it easier to implement and maintain. It is competitive with, if not superior to, all other methods in terms of predictive accuracy. Moreover, it produces solutions that are often sparser than those of more recent methods by several orders of magnitude. In addition, we provide various theoretical justifications by relating 1-norm SVM to well-established learning algorithms such as neural networks, SVM, and nearest neighbor classifiers. Finally, we conduct a thorough experimental evaluation, which reveals that the evidence in favor of 1-norm SVM is statistically significant.
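As a rough illustration of why an L1 penalty yields the sparsity highlighted above, here is a subgradient-descent sketch of a 1-norm SVM trained directly on a (possibly indefinite) similarity matrix. The actual 1-norm SVM is solved exactly as a linear program; this loop, its hyperparameters, and the function names are illustrative assumptions:

```python
def one_norm_svm(S, y, lam=0.01, lr=0.01, epochs=500):
    """Subgradient sketch of a 1-norm SVM: hinge loss on the decision values
    f_i = sum_j alpha_j * S[i][j] + b, with L1 shrinkage on alpha. S may be
    indefinite since no Mercer condition is assumed. The soft-threshold step
    drives many coefficients to exactly zero, hence sparse solutions."""
    n = len(y)
    alpha, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for i in range(n):
            f = sum(alpha[j] * S[i][j] for j in range(n)) + b
            if y[i] * f < 1:                     # hinge-loss subgradient step
                for j in range(n):
                    alpha[j] += lr * y[i] * S[i][j]
                b += lr * y[i]
        for j in range(n):                       # soft-threshold (L1 penalty)
            sign = 1.0 if alpha[j] > 0 else -1.0
            alpha[j] = sign * max(0.0, abs(alpha[j]) - lr * lam)
    return alpha, b

def svm_predict(alpha, b, s_row):
    """Classify a point given its similarities s_row to the training points."""
    f = sum(a * s for a, s in zip(alpha, s_row)) + b
    return 1 if f >= 0 else -1
```

Because the model is an expansion over training similarities, sparsity in alpha directly translates into fewer similarity evaluations at prediction time, which is the practical advantage the abstract emphasizes.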

  7. Deep Feature Learning and Cascaded Classifier for Large Scale Data

    DEFF Research Database (Denmark)

    Prasoon, Adhish

    from data rather than having a predefined feature set. We explore deep learning approach of convolutional neural network (CNN) for segmenting three dimensional medical images. We propose a novel system integrating three 2D CNNs, which have a one-to-one association with the xy, yz and zx planes of 3D......This thesis focuses on voxel/pixel classification based approaches for image segmentation. The main application is segmentation of articular cartilage in knee MRIs. The first major contribution of the thesis deals with large scale machine learning problems. Many medical imaging problems need huge...... amount of training data to cover sufficient biological variability. Learning methods scaling badly with number of training data points cannot be used in such scenarios. This may restrict the usage of many powerful classifiers having excellent generalization ability. We propose a cascaded classifier which...

  8. A Large Dimensional Analysis of Regularized Discriminant Analysis Classifiers

    KAUST Repository

    Elkhalil, Khalil

    2017-11-01

This article carries out a large dimensional analysis of standard regularized discriminant analysis classifiers designed on the assumption that data arise from a Gaussian mixture model with different means and covariances. The analysis relies on fundamental results from random matrix theory (RMT) when both the number of features and the cardinality of the training data within each class grow large at the same pace. Under mild assumptions, we show that the asymptotic classification error approaches a deterministic quantity that depends only on the means and covariances associated with each class as well as the problem dimensions. Such a result permits a better understanding of the performance of regularized discriminant analysis, in practical large but finite dimensions, and can be used to determine and pre-estimate the optimal regularization parameter that minimizes the misclassification error probability. Despite being theoretically valid only for Gaussian data, our findings are shown to yield high accuracy in predicting the performance achieved with real data sets drawn from the popular USPS database, thereby making an interesting connection between theory and practice.
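A toy version of the regularized discriminant rule being analyzed can be sketched in 2-D, with a ridge term gamma added to each class covariance (equal class priors are assumed, and this illustrates the classifier only, not the random matrix analysis; names are illustrative):

```python
import math

def inv2(M):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def rda_score(x, mean, cov, gamma):
    """Per-class discriminant score under the ridge-regularized covariance
    cov + gamma * I: negative Mahalanobis distance plus log-determinant term."""
    R = [[cov[0][0] + gamma, cov[0][1]], [cov[1][0], cov[1][1] + gamma]]
    P = inv2(R)
    d = [x[0] - mean[0], x[1] - mean[1]]
    maha = (d[0] * (P[0][0] * d[0] + P[0][1] * d[1])
            + d[1] * (P[1][0] * d[0] + P[1][1] * d[1]))
    logdet = math.log(R[0][0] * R[1][1] - R[0][1] * R[1][0])
    return -0.5 * (maha + logdet)

def rda_classify(x, class_params, gamma=0.1):
    """Assign x to the class with the highest regularized discriminant score;
    class_params maps a label to its (mean, covariance) pair."""
    return max(class_params, key=lambda c: rda_score(x, *class_params[c], gamma))
```

The regularization parameter gamma is exactly the quantity whose optimal value the paper's asymptotic error expression allows one to pre-estimate.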

  9. A Large Dimensional Analysis of Regularized Discriminant Analysis Classifiers

    KAUST Repository

    Elkhalil, Khalil; Kammoun, Abla; Couillet, Romain; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2017-01-01

    on fundamental results from random matrix theory (RMT) when both the number of features and the cardinality of the training data within each class grow large at the same pace. Under mild assumptions, we show that the asymptotic classification error approaches a

  10. Hierarchical Learning of Tree Classifiers for Large-Scale Plant Species Identification.

    Science.gov (United States)

    Fan, Jianping; Zhou, Ning; Peng, Jinye; Gao, Ling

    2015-11-01

In this paper, a hierarchical multi-task structural learning algorithm is developed to support large-scale plant species identification, where a visual tree is constructed for organizing large numbers of plant species in a coarse-to-fine fashion and determining the inter-related learning tasks automatically. For a given parent node on the visual tree, which contains a set of sibling coarse-grained categories of plant species or sibling fine-grained plant species, a multi-task structural learning algorithm is developed to train their inter-related classifiers jointly for enhancing their discrimination power. The inter-level relationship constraint, e.g., that a plant image must first be assigned correctly to a parent node (high-level non-leaf node) before it can be assigned to the most relevant child node (low-level non-leaf node or leaf node) on the visual tree, is formally defined and leveraged to learn more discriminative tree classifiers over the visual tree. Our experimental results have demonstrated the effectiveness of our hierarchical multi-task structural learning algorithm in training more discriminative tree classifiers for large-scale plant species identification.
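The inter-level constraint described above amounts to routing a sample from the root of the visual tree to a leaf, committing to one child per level. A minimal sketch (the tree layout and the node-affinity function `score` are illustrative assumptions, not the paper's learned classifiers):

```python
def classify_down_tree(x, root, score):
    """Route a sample from the root to a leaf, committing to the
    best-scoring child at each level. This enforces the inter-level
    constraint: a sample reaches a fine-grained leaf only through the
    coarse-grained parent chosen first, so an error at a high level
    cannot be recovered lower down."""
    node = root
    while node.get("children"):
        node = max(node["children"], key=lambda c: score(x, c))
    return node["label"]
```

In the paper, the per-node classifiers are trained jointly across sibling tasks precisely because a misrouting at a coarse level is unrecoverable in this top-down scheme.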

  11. Control and large deformations of marginal disordered structures

    Science.gov (United States)

    Murugan, Arvind; Pinson, Matthew; Chen, Elizabeth

Designed deformations, such as origami patterns, provide a way to make easily controlled mechanical metamaterials with tailored responses to external forces. We focus on an often overlooked regime of origami - non-linear deformations of large disordered origami patterns with no symmetries. We find that practical questions of control in origami have counterintuitive answers, because of intimate connections to spin glasses and neural networks. For example, one-degree-of-freedom origami structures are actually difficult to control about the flat state with a single actuator; the actuator is thrown off by an exponential number of `red herring' zero modes for small deformations, all but one of which disappear at larger deformations. Conversely, structures with multiple programmed motions are much easier to control than expected - in fact, they are as easy to control as a dedicated single-motion structure if the number of programmed motions is below a threshold (`memory capacity').

  12. Effective diagnosis of Alzheimer’s disease by means of large margin-based methodology

    Directory of Open Access Journals (Sweden)

    Chaves Rosa

    2012-07-01

Full Text Available Abstract Background Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in Alzheimer's Disease (AD) diagnosis. However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. Methods A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to be located within a predefined brain activation mask. In order to address the small sample-size problem, the dimension of the feature space was further reduced by: Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA), or Partial Least Squares (PLS) (the two latter also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and Energy-based metrics were compared. Results Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis or Energy-based methodology). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. Conclusions All the proposed methods turned out to be a valid solution for the presented problem. One of the advances is the robustness of the LMNN algorithm that not only provides higher separation rate between

  13. The fusion of large scale classified side-scan sonar image mosaics.

    Science.gov (United States)

    Reed, Scott; Tena, Ruiz Ioseba; Capus, Chris; Petillot, Yvan

    2006-07-01

    This paper presents a unified framework for the creation of classified maps of the seafloor from sonar imagery. Significant challenges in photometric correction, classification, navigation and registration, and image fusion are addressed. The techniques described are directly applicable to a range of remote sensing problems. Recent advances in side-scan data correction are incorporated to compensate for the sonar beam pattern and motion of the acquisition platform. The corrected images are segmented using pixel-based textural features and standard classifiers. In parallel, the navigation of the sonar device is processed using Kalman filtering techniques. A simultaneous localization and mapping framework is adopted to improve the navigation accuracy and produce georeferenced mosaics of the segmented side-scan data. These are fused within a Markovian framework and two fusion models are presented. The first uses a voting scheme regularized by an isotropic Markov random field and is applicable when the reliability of each information source is unknown. The Markov model is also used to inpaint regions where no final classification decision can be reached using pixel level fusion. The second model formally introduces the reliability of each information source into a probabilistic model. Evaluation of the two models using both synthetic images and real data from a large scale survey shows significant quantitative and qualitative improvement using the fusion approach.

  14. How large a training set is needed to develop a classifier for microarray data?

    Science.gov (United States)

    Dobbin, Kevin K; Zhao, Yingdong; Simon, Richard M

    2008-01-01

    A common goal of gene expression microarray studies is the development of a classifier that can be used to divide patients into groups with different prognoses, or with different expected responses to a therapy. These types of classifiers are developed on a training set, which is the set of samples used to train a classifier. The question of how many samples are needed in the training set to produce a good classifier from high-dimensional microarray data is challenging. We present a model-based approach to determining the sample size required to adequately train a classifier. It is shown that sample size can be determined from three quantities: standardized fold change, class prevalence, and number of genes or features on the arrays. Numerous examples and important experimental design issues are discussed. The method is adapted to address ex post facto determination of whether the size of a training set used to develop a classifier was adequate. An interactive web site for performing the sample size calculations is provided. We showed that sample size calculations for classifier development from high-dimensional microarray data are feasible, discussed numerous important considerations, and presented examples.

  15. An Efficient Approach for Fast and Accurate Voltage Stability Margin Computation in Large Power Grids

    Directory of Open Access Journals (Sweden)

    Heng-Yi Su

    2016-11-01

Full Text Available This paper proposes an efficient approach for the computation of voltage stability margin (VSM) in a large-scale power grid. The objective is to accurately and rapidly determine the load power margin which corresponds to voltage collapse phenomena. The proposed approach is based on the impedance match-based technique and the model-based technique. It combines the Thevenin equivalent (TE) network method with the cubic spline extrapolation technique and the continuation technique to achieve fast and accurate VSM computation for a bulk power grid. Moreover, the generator Q limits are taken into account for practical applications. Extensive case studies carried out on Institute of Electrical and Electronics Engineers (IEEE) benchmark systems and the Taiwan Power Company (Taipower, Taipei, Taiwan) system are used to demonstrate the effectiveness of the proposed approach.
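The impedance-match idea underlying this approach can be sketched with a two-measurement Thevenin estimate. This is a real-valued, static simplification; the actual method works with phasors, spline extrapolation, and continuation, and the function names here are illustrative assumptions:

```python
def thevenin_from_measurements(v1, i1, v2, i2):
    """Estimate the Thevenin equivalent (E_th, Z_th) seen from a load bus
    from two steady-state (voltage, current) measurements taken at
    different load levels, assuming the equivalent is unchanged between
    them: V = E_th - Z_th * I at both operating points."""
    z_th = (v1 - v2) / (i2 - i1)
    e_th = v1 + z_th * i1
    return e_th, z_th

def impedance_match_ratio(z_load, z_th):
    """Voltage-stability indicator: maximum power transfer (voltage
    collapse) occurs when the load impedance equals the Thevenin
    impedance, so a ratio approaching 1 signals a vanishing margin."""
    return z_load / z_th
```

In the paper this impedance-match criterion is combined with cubic spline extrapolation and continuation so the collapse point can be located without stepping the load all the way to it.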

  16. A Core Set Based Large Vector-Angular Region and Margin Approach for Novelty Detection

    Directory of Open Access Journals (Sweden)

    Jiusheng Chen

    2016-01-01

Full Text Available A large vector-angular region and margin (LARM) approach is presented for novelty detection based on imbalanced data. The key idea is to construct the largest vector-angular region in the feature space to separate normal training patterns; meanwhile, maximize the vector-angular margin between the surface of this optimal vector-angular region and abnormal training patterns. In order to improve the generalization performance of LARM, the vector-angular distribution is optimized by maximizing the vector-angular mean and minimizing the vector-angular variance, which separates the normal and abnormal examples well. However, the inherent computation of the quadratic programming (QP) solver takes O(n³) training time and at least O(n²) space, which might be computationally prohibitive for large scale problems. By a (1+ε)- and (1-ε)-approximation algorithm, the core set based LARM algorithm is proposed for fast training of the LARM problem. Experimental results based on imbalanced datasets have validated the favorable efficiency of the proposed approach in novelty detection.

  17. Named-Entity Tagging a Very Large Unbalanced Corpus. Training and Evaluating NE classifiers

    DEFF Research Database (Denmark)

    Bingel, Joachim; Haider, Thomas

    2014-01-01

    We describe a systematic and application-oriented approach to training and evaluating named entity recognition and classification (NERC) systems, the purpose of which is to identify an optimal system and to train an optimal model for named entity tagging DeReKo, a very large general-purpose corpus...... when evaluated on more uniform and less diverse data. We create and manually annotate such a representative sample as evaluation data for three different NERC systems, for each of which various models are learnt on multiple training data. The proposed sampling method can be viewed as a generally...

  18. Classifying injury narratives of large administrative databases for surveillance-A practical approach combining machine learning ensembles and human review.

    Science.gov (United States)

    Marucci-Wellman, Helen R; Corns, Helen L; Lehto, Mark R

    2017-01-01

Injury narratives are now available in real time and include useful information for injury surveillance and prevention. However, manual classification of the cause or events leading to injury found in large batches of narratives, such as workers compensation claims databases, can be prohibitive. In this study we compare the utility of four machine learning algorithms (Naïve Bayes single-word and bi-gram models, Support Vector Machine, and Logistic Regression) for classifying narratives into Bureau of Labor Statistics Occupational Injury and Illness event-leading-to-injury classifications for a large workers compensation database. These algorithms are known to do well classifying narrative text and are fairly easy to implement with off-the-shelf software packages such as Python. We propose human-machine learning ensemble approaches which maximize the power and accuracy of the algorithms for machine-assigned codes and allow for strategic filtering of rare, emerging or ambiguous narratives for manual review. We compare human-machine approaches based on filtering on the prediction strength of the classifier vs. agreement between algorithms. Regularized Logistic Regression (LR) was the best performing algorithm alone. Using this algorithm and filtering out the bottom 30% of predictions for manual review resulted in high accuracy (overall sensitivity/positive predictive value of 0.89) of the final machine-human coded dataset. The best pairings of algorithms included Naïve Bayes with Support Vector Machine, whereby the triple ensemble NB-SW = NB-BI-GRAM = SVM had very high performance (0.93 overall sensitivity/positive predictive value) and high accuracy (i.e., high sensitivity and positive predictive values) across both large and small categories, leaving 41% of the narratives for manual review. Integrating LR into this ensemble mix improved performance only slightly. For large administrative datasets we propose incorporation of methods based on human-machine pairings such as
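The prediction-strength filtering strategy described above can be sketched as a simple triage step (the data layout and function names are illustrative assumptions):

```python
def triage(coded_narratives, strength_threshold):
    """Human-machine ensemble filtering: keep machine-assigned codes whose
    prediction strength clears the threshold, and route the weak (rare,
    emerging, or ambiguous) narratives to manual review. Each item is a
    (narrative, machine_code, strength) triple."""
    auto, manual = [], []
    for narrative, code, strength in coded_narratives:
        (auto if strength >= strength_threshold else manual).append((narrative, code))
    return auto, manual

def threshold_for_review_share(strengths, review_share=0.30):
    """Pick the cutoff so that roughly the weakest `review_share` fraction
    of predictions goes to manual review (the study's best single-model
    setting filtered out the bottom 30%)."""
    ranked = sorted(strengths)
    return ranked[int(len(ranked) * review_share)]
```

The same triage skeleton covers the agreement-based variant in the study: replace the strength test with a check that the ensemble members assigned the same code.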

  19. A Novel Sensing Circuit with Large Sensing Margin for Embedded Spin-Transfer Torque MRAMs

    DEFF Research Database (Denmark)

    Bagheriye, Leila; Toofan, Siroos; Saeidi, Roghayeh

-disturbance and high yield. In this paper, to deal with the read reliability challenge, a sensing circuit with high sensing margin and strong positive feedback is proposed. It improves the sensing margin (SM) by 10.42X/3.3X, with 1.24X/1.59X lower read energy at iso-sensing time (2 ns), in comparison...

  20. CLAss-Specific Subspace Kernel Representations and Adaptive Margin Slack Minimization for Large Scale Classification.

    Science.gov (United States)

    Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan

    2018-02-01

In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework using kernel models for sequential data processing. The framework is based on two components that both aim at enhancing the classification capability with a subset selection scheme. The first part is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. In the second part, we propose a novel structural risk minimization algorithm called adaptive margin slack minimization to iteratively improve the classification accuracy by adaptive data selection. We motivate each part separately, and then integrate them into learning frameworks for large scale data. We propose two such frameworks: memory-efficient sequential processing for sequential data processing, and parallelized sequential processing for distributed computing with sequential data acquisition. We test our methods on several benchmark data sets and compare them with state-of-the-art techniques to verify the validity of the proposed techniques.

  1. Experimental study and large eddy simulation of effect of terrain slope on marginal burning in shrub fuel beds

    Science.gov (United States)

    Xiangyang Zhou; Shankar Mahalingam; David Weise

    2007-01-01

    This paper presents a combined study of laboratory scale fire spread experiments and a three-dimensional large eddy simulation (LES) to analyze the effect of terrain slope on marginal burning behavior in live chaparral shrub fuel beds. Line fire was initiated in single species fuel beds of four common chaparral plants under various fuel bed configurations and ambient...

  2. Marginal adaptation of large adhesive class IV composite restorations before and after artificial aging

    NARCIS (Netherlands)

    Ardu, S.; Stavridakis, M.; Feilzer, A.J.; Krejci, I.; Lefever, D.; Dietschi, D.

    2011-01-01

    Purpose: To test the marginal adaptation of Class IV restorations made of different composite materials designed for anterior use. Materials and Methods: Forty-two extracted caries-free human maxillary central incisors were randomly divided into 7 experimental groups - one per composite tested - for

  3. A new quantum flux parametron logic gate with large input margin

    International Nuclear Information System (INIS)

    Hioe, W.; Hosoya, M.; Goto, E.

    1991-01-01

This paper reports on the Quantum Flux Parametron (QFP), a flux-transfer, flux-activated Josephson logic device which realizes much lower power dissipation than other Josephson logic devices. Being a two-terminal device, its correct operation may be affected by coupling to other QFPs. The problems include backcoupling from active QFPs through inactive QFPs (relay noise), coupling between QFPs activated at different times because of clock skew (homophase noise), and interaction between active QFPs (reaction hazard). Previous QFP circuits worked by wired majority, which, being a linear-input logic, has low input margin. A new logic gate (D-gate) using a QFP to perform logic operations has been analyzed and tested by computer simulation. Relay noise, homophase noise and reaction hazard are substantially reduced. Moreover, the inputs have little interaction; hence, the input margin is greatly improved.

  4. Marginal and joint distributions of S100, HMB-45, and Melan-A across a large series of cutaneous melanomas.

    Science.gov (United States)

    Viray, Hollis; Bradley, William R; Schalper, Kurt A; Rimm, David L; Gould Rothberg, Bonnie E

    2013-08-01

The distribution of the standard melanoma antibodies S100, HMB-45, and Melan-A has been extensively studied. Yet, the overlap in their expression is less well characterized. To determine the joint distributions of the classic melanoma markers and to determine if classification according to joint antigen expression has prognostic relevance. S100, HMB-45, and Melan-A were assayed by immunofluorescence-based immunohistochemistry on a large tissue microarray of 212 cutaneous melanoma primary tumors and 341 metastases. Positive expression for each antigen required display of immunoreactivity for at least 25% of melanoma cells. Marginal and joint distributions were determined across all markers. Bivariate associations with established clinicopathologic covariates and melanoma-specific survival analyses were conducted. Of 322 assayable melanomas, 295 (91.6%), 203 (63.0%), and 236 (73.3%) stained with S100, HMB-45, and Melan-A, respectively. Twenty-seven melanomas, representing a diverse set of histopathologic profiles, were S100 negative. Coexpression of all 3 antibodies was observed in 160 melanomas (49.7%). Intensity of endogenous melanin pigment did not confound immunolabeling. Among primary tumors, associations with clinicopathologic parameters revealed a significant relationship only between HMB-45 and microsatellitosis (P = .02). No significant differences among clinicopathologic criteria were observed across the HMB-45/Melan-A joint distribution categories. Neither marginal HMB-45 (P = .56) nor Melan-A (P = .81), nor their joint distributions (P = .88), was associated with melanoma-specific survival. Comprehensive characterization of the marginal and joint distributions for S100, HMB-45, and Melan-A across a large series of cutaneous melanomas revealed diversity of expression across this group of antigens. However, these immunohistochemically defined subclasses of melanomas do not significantly differ according to clinicopathologic correlates or outcome.
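The 25%-immunoreactivity rule and the joint distribution tabulated in the study can be sketched as follows (field names and data layout are illustrative assumptions):

```python
from itertools import product

MARKERS = ("S100", "HMB-45", "Melan-A")

def marker_calls(pct_reactive, cutoff=25.0):
    """Positive/negative call per antibody: a tumor is positive for a marker
    when at least `cutoff` percent of its melanoma cells are immunoreactive
    (the 25% rule from the abstract)."""
    return {m: pct_reactive[m] >= cutoff for m in MARKERS}

def joint_distribution(cases):
    """Tabulate the joint distribution: case counts for every combination of
    S100/HMB-45/Melan-A positivity (2^3 = 8 cells). Marginal counts for one
    marker are sums over the cells where that marker is True."""
    counts = {combo: 0 for combo in product((True, False), repeat=len(MARKERS))}
    for case in cases:
        calls = marker_calls(case)
        counts[tuple(calls[m] for m in MARKERS)] += 1
    return counts
```

Summing the appropriate cells of this table recovers the marginal staining rates reported in the abstract (e.g. 91.6% S100-positive), while the full table gives the coexpression categories examined for prognostic relevance.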

  5. Large-scale evolution of the central-east Greenland margin: New insights to the North Atlantic glaciation history

    Science.gov (United States)

    Pérez, Lara F.; Nielsen, Tove; Knutz, Paul C.; Kuijpers, Antoon; Damm, Volkmar

    2018-04-01

The continental shelf of central-east Greenland is shaped by several glacially carved transverse troughs that form the oceanward extension of the major fjord systems. The evolution of these troughs through time, and their relation with the large-scale glaciation of the Northern Hemisphere, is poorly understood. In this study seismostratigraphic analyses have been carried out to determine the morphological and structural development of this important sector of the East Greenland glaciated margin. The age of major stratigraphic discontinuities has been constrained by a direct tie to ODP site 987 drilled in the Greenland Sea basin plain off Scoresby Sund fan system. The areal distribution and internal facies of the identified seismic units reveal the large-scale depositional pattern formed by ice-streams draining a major part of the central-east Greenland ice sheet. Initial sedimentation along the margin was, however, mainly controlled by tectonic processes related to the margin construction, continental uplift, and fluvial processes. From late Miocene to present, progradational and erosional patterns point to repeated glacial advances across the shelf. The evolution of depocentres suggests that ice sheet advances over the continental shelf have occurred since late Miocene, about 2 Myr earlier than previously assumed. This cross-shelf glaciation is more pronounced during late Miocene and early Pliocene along Blosseville Kyst and around the Pliocene/Pleistocene boundary off Scoresby Sund, indicating a northward migration of the glacial advance. The two main periods of glaciation were separated by a major retreat of the ice sheet to an inland position during middle Pliocene. Mounded-wavy deposits interpreted as current-related deposits suggest the presence of changing along-slope current dynamics in concert with the development of the modern North Atlantic oceanographic pattern.

  6. Classifying Microorganisms

    DEFF Research Database (Denmark)

    Sommerlund, Julie

    2006-01-01

    This paper describes the coexistence of two systems for classifying organisms and species: a dominant genetic system and an older naturalist system. The former classifies species and traces their evolution on the basis of genetic characteristics, while the latter employs physiological characteristics. The coexistence of the classification systems does not lead to a conflict between them. Rather, the systems seem to co-exist in different configurations, through which they are complementary, contradictory and inclusive in different situations, sometimes simultaneously. The systems come...

  7. Evidence of Past Large Storms or Tsunamis from an Uplifted Section of the Southern Hikurangi Margin, Wairarapa Coast, New Zealand

    Science.gov (United States)

    Mitchell, S. P.; Pilarczyk, J.; Clark, K.; Kosciuch, T. J.; Reinhardt, E. G.

    2017-12-01

    The Hikurangi margin, located along the east coast of New Zealand, has generated multiple tsunamigenic earthquakes in historic times that have impacted the coastlines of the North Island. Knowledge of the possible magnitudes and recurrence intervals associated with Hikurangi earthquakes and tsunamis is necessary to understand and mitigate the hazards facing New Zealand's coasts. Events such as the 1931 Napier earthquake, which caused severe ground shaking, and the 1947 Gisborne tsunami, which reached 10 meters high, demonstrate the earthquake and tsunami hazards associated with the Hikurangi margin. To better understand these hazards, longer-term geologic records are needed. Along the Wairarapa coast of the North Island of New Zealand, marine terraces provide evidence for multiple Hikurangi earthquakes over the past 7,000 years. Evidence for possible tsunami inundation in this area has also been discovered, but the record is patchy in spatial and temporal extent. We found three anomalous sand layers preserved within an uplifted beach exposure along the Wairarapa coast. The sand layers, consisting of very fine to coarse sand (3.5-0.8 Φ), sharply overlie paleosols containing fine to medium silt (6.1-7.1 Φ) in a sequence that extends for approximately 400 meters alongshore. We assign a marine origin to the sand layers because they contain relatively high elemental concentrations of calcium and barium; by contrast, the paleosols contain relatively high elemental concentrations of iron. The marine sands contain evidence in support of tsunami inundation: rip-up clasts, coarse pulses, fining-upward sequences, and erosive contacts were
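The grain sizes above are quoted on the Krumbein phi (Φ) scale, where φ = −log₂(d / 1 mm), so diameter is recovered as d = 2^(−φ) mm. A quick sketch of the conversion; the layer values come from the abstract, while the textural names in the comments are the standard Wentworth classes:

```python
# Krumbein phi scale: phi = -log2(d / 1 mm), hence d = 2**(-phi) mm.
def phi_to_mm(phi):
    return 2.0 ** (-phi)

# Sand layers reported as 3.5-0.8 phi:
coarse_end = phi_to_mm(0.8)   # ~0.57 mm (coarse sand)
fine_end = phi_to_mm(3.5)     # ~0.088 mm (very fine sand)

# Paleosol silts reported as 6.1-7.1 phi:
silt_coarse = phi_to_mm(6.1)  # ~0.015 mm
silt_fine = phi_to_mm(7.1)    # ~0.0073 mm
```

Note that larger phi values mean finer grains, which is why the sand range is written 3.5-0.8 Φ from fine to coarse.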

  8. THEORETICAL AND EXPERIMENTAL INVESTIGATION OF A MAGNETOELECTRIC SYSTEM THAT RECOGNIZES THE COARSENESS OF THE SANDS OF A SINGLE-SPIRAL CLASSIFIER

    Directory of Open Access Journals (Sweden)

    A. N. Matsui

    2018-02-01

    Purpose. The aim of the work is to create a magnetoelectric system with permanent magnets that senses the coarseness of the sands of a single-spiral classifier, by establishing the relation of the output signal to the measured quantity, eliminating the effect of disturbances on the result, and justifying the system's parameters. Methodology. The studies were carried out using methods from the theory of electrical engineering, magnetic systems with permanent magnets, galvanomagnetic transducers, probability, random processes, statistics, regression analysis, sensitivity analysis, differential calculus, rock magnetism, determination of the physical properties of a matrix material when impurities with distinctly different properties are added to it, and the classification of enrichment products. Findings. The rate of change of the volume of solids in a controlled volume of space, through which the sand material moves, is described mathematically. The limits of the controlled volume at which the sensitivity is still sufficient are determined. Theoretical dependences of the rate of change of the solid volume in the controlled volume on the coarseness of the sands at different speeds are obtained. It is established that the state of the controlled volume is best estimated by the magnetic method. A magnetoelectric system with permanent magnets has been developed, which has optimal parameter values and an induction winding containing up to 25,000 turns; in one of its pole pieces a Hall transducer is installed in a continuous slot. Near the air gap, the magnetic system creates in the material a magnetic field of 5 × 20 × 60 mm in size with almost uniform intensity. The EMF of the magnetoelectric system increases almost linearly with the coarseness of the material. It also depends on the content of magnetic iron in the solids, which is compensated by using the signal from the Hall transducer. Correlation

  9. Classification of Large-Scale Remote Sensing Images for Automatic Identification of Health Hazards: Smoke Detection Using an Autologistic Regression Classifier.

    Science.gov (United States)

    Wolters, Mark A; Dean, C B

    2017-01-01

    Remote sensing images from Earth-orbiting satellites are a potentially rich data source for monitoring and cataloguing atmospheric health hazards that cover large geographic regions. A method is proposed for classifying such images into hazard and nonhazard regions using the autologistic regression model, which may be viewed as a spatial extension of logistic regression. The method includes a novel and simple approach to parameter estimation that makes it well suited to handling the large and high-dimensional datasets arising from satellite-borne instruments. The methodology is demonstrated on both simulated images and a real application to the identification of forest fire smoke.
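The abstract describes the autologistic model as logistic regression augmented with a spatial autocovariate. A minimal NumPy sketch of that idea on synthetic data, using generic pseudolikelihood fitting and ICM-style prediction; this is not the authors' specific estimator, and all data and settings here are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 32x32 "image": a smoke blob (class 1) on background (class 0),
# observed through a noisy pixel intensity, the single covariate.
H = W = 32
yy, xx = np.mgrid[0:H, 0:W]
truth = ((yy - 16) ** 2 + (xx - 16) ** 2 < 80).astype(float)
intensity = truth + rng.normal(0.0, 0.7, size=(H, W))

def neighbor_sum(lab):
    """Sum of the four nearest-neighbour labels (zero padding at borders)."""
    s = np.zeros_like(lab)
    s[1:, :] += lab[:-1, :]
    s[:-1, :] += lab[1:, :]
    s[:, 1:] += lab[:, :-1]
    s[:, :-1] += lab[:, 1:]
    return s

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Pseudolikelihood fit: ordinary logistic regression on
# [1, intensity, autocovariate], with the autocovariate computed
# from the observed training labels.
X = np.column_stack([np.ones(H * W), intensity.ravel(), neighbor_sum(truth).ravel()])
y = truth.ravel()
beta = np.zeros(3)
for _ in range(2000):  # plain gradient ascent on the log-likelihood
    beta += 0.5 * X.T @ (y - sigmoid(X @ beta)) / len(y)

# Prediction with labels treated as unknown: iterate (ICM-style),
# recomputing the autocovariate from the current label estimate.
labels = (intensity > 0.5).astype(float)  # crude initialisation
for _ in range(10):
    Z = np.column_stack([np.ones(H * W), intensity.ravel(), neighbor_sum(labels).ravel()])
    labels = (sigmoid(Z @ beta) > 0.5).astype(float).reshape(H, W)

accuracy = (labels == truth).mean()  # spatial term cleans up isolated noise
```

The neighbour term both regularises the fit and lets the classifier fill in pixels whose intensity alone is ambiguous, which is the appeal of the autologistic model for spatially coherent hazards such as smoke plumes.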

  10. Large-scale Reconstructions and Independent, Unbiased Clustering Based on Morphological Metrics to Classify Neurons in Selective Populations.

    Science.gov (United States)

    Bragg, Elise M; Briggs, Farran

    2017-02-15

    This protocol outlines large-scale reconstructions of neurons combined with the use of independent and unbiased clustering analyses to create a comprehensive survey of the morphological characteristics observed among a selective neuronal population. Combination of these techniques constitutes a novel approach for the collection and analysis of neuroanatomical data. Together, these techniques enable large-scale, and therefore more comprehensive, sampling of selective neuronal populations and establish unbiased quantitative methods for describing morphologically unique neuronal classes within a population. The protocol outlines the use of modified rabies virus to selectively label neurons. G-deleted rabies virus acts like a retrograde tracer following stereotaxic injection into a target brain structure of interest and serves as a vehicle for the delivery and expression of EGFP in neurons. Large numbers of neurons are infected using this technique and express GFP throughout their dendrites, producing "Golgi-like" complete fills of individual neurons. Accordingly, the virus-mediated retrograde tracing method improves upon traditional dye-based retrograde tracing techniques by producing complete intracellular fills. Individual well-isolated neurons spanning all regions of the brain area under study are selected for reconstruction in order to obtain a representative sample of neurons. The protocol outlines procedures to reconstruct cell bodies and complete dendritic arborization patterns of labeled neurons spanning multiple tissue sections. Morphological data, including positions of each neuron within the brain structure, are extracted for further analysis. Standard programming functions were utilized to perform independent cluster analyses and cluster evaluations based on morphological metrics. To verify the utility of these analyses, statistical evaluation of a cluster analysis performed on 160 neurons reconstructed in the thalamic reticular nucleus of the thalamus
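As a toy illustration of "independent, unbiased clustering based on morphological metrics", the sketch below runs a plain k-means (Lloyd's algorithm) on synthetic per-neuron metrics; the metric names, class means and cluster count are invented for the example, not taken from the protocol:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic morphological metrics for two hypothetical cell classes:
# columns = soma area (um^2), total dendritic length (um), branch points.
class_a = rng.normal([150.0, 2000.0, 25.0], [15.0, 200.0, 3.0], size=(30, 3))
class_b = rng.normal([300.0, 5000.0, 60.0], [20.0, 300.0, 5.0], size=(30, 3))
X = np.vstack([class_a, class_b])

# Metrics live on very different scales, so z-score each column first;
# otherwise dendritic length dominates the Euclidean distance.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: assign to nearest centroid, recompute means."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        assign = d.argmin(axis=1)
        for j in range(k):
            if (assign == j).any():
                centers[j] = X[assign == j].mean(axis=0)
    return assign, centers

assign, centers = kmeans(Xz, k=2)
# Agreement with the (known) generating classes, up to label swapping.
truth = np.array([0] * 30 + [1] * 30)
agreement = max((assign == truth).mean(), (assign != truth).mean())
```

In a real analysis the number of clusters would itself be chosen by an evaluation criterion (e.g. silhouette or gap statistics) rather than fixed in advance, which is the "unbiased" part of the protocol.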

  11. Evolution of Fine-Grained Channel Margin Deposits behind Large Woody Debris in an Experimental Gravel-Bed Flume

    Science.gov (United States)

    O'Neill, B.; Marks, S.; Skalak, K.; Puleo, J. A.; Wilcock, P. R.; Pizzuto, J. E.

    2014-12-01

    Fine grained channel margin (FGCM) deposits of the South River, Virginia sequester a substantial volume of fine-grained sediment behind large woody debris (LWD). FGCM deposits were created in a laboratory setting meant to simulate the South River environment using a recirculating flume (15m long by 0.6m wide) with a fixed gravel bed and adjustable slope (set to 0.0067) to determine how fine sediment is transported and deposited behind LWD. Two model LWD structures were placed 3.7 m apart on opposite sides of the flume. A wire mesh screen with attached wooden dowels simulated LWD with an upstream facing rootwad. Six experiments with three different discharge rates, each with low and high sediment concentrations, were run. Suspended sediment was very fine grained (median grain size of 3 phi) and well sorted (0.45 phi) sand. Upstream of the wood, water depths averaged about 0.08m, velocities averaged about 0.3 m/s, and Froude numbers averaged around 0.3. Downstream of the first LWD structure, velocities were reduced tenfold. Small amounts of sediment passed through the rootwad and fell out of suspension in the area of reduced flow behind LWD, but most of the sediment was carried around the LWD by the main flow and then behind the LWD by a recirculating eddy current. Upstream migrating dunes formed behind LWD due to recirculating flow, similar to reattachment bars documented in bedrock canyon rivers partially obstructed by debouching debris fans. These upstream migrating dunes began at the reattachment point and merged with deposits formed from sediment transported through the rootwad. Downstream migrating dunes formed along the channel margin behind the LWD, downstream of the reattachment point. FGCM deposits were about 3 m long, with average widths of about 0.8 m. Greater sediment concentration created thicker FGCM deposits, and higher flows eroded the sides of the deposits, reducing their widths.
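As a consistency check, the reported upstream Froude numbers (around 0.3) follow directly from the quoted average depth and velocity via Fr = U / √(gh):

```python
import math

g = 9.81          # m/s^2
depth = 0.08      # m, average upstream water depth from the experiments
velocity = 0.3    # m/s, average upstream velocity

froude = velocity / math.sqrt(g * depth)  # ~0.34, i.e. subcritical flow
```

A Froude number below 1 confirms subcritical flow upstream of the wood, consistent with sediment settling in the sheltered zone behind the LWD.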

  12. Large-scale landslide triggering mechanisms in Debre Sina area, Central Ethiopian Highlands at the western Afar rift margin

    Science.gov (United States)

    Kiros, T.; Wohnlich, S.; Hussien, B.

    2017-12-01

    The Central Highlands of Ethiopia have repeatedly experienced large-scale landslide events. The Debre Sina area is one of the most landslide-prone areas along the western Afar rift margin of Ethiopia, and is frequently affected by large-scale and deep-seated landslides. Despite this, urban and rural development is currently taking place in almost all constricted valleys as well as on the imposing cliffs. Therefore, understanding the major triggering factors and failure mechanisms in the Debre Sina area and its surroundings is of critical importance. In the present study, we investigate the landslides in the area using geological and topographic analysis, structural settings, geophysical investigation (seismic refraction), rainfall data and seismicity. Furthermore, petrographic and X-ray diffraction (XRD) analyses are conducted to determine the mineral composition of the parent rock and its weathering products. The topographic analysis revealed that slopes of 10°-40°, at elevations of 1,800-2,500 m, with east and southeast aspects are highly prone to landslides. The seismic refraction survey identified four main layers of geomaterials containing a subsurface landslide anomaly: clay, loosely cemented colluvial sediments and highly weathered agglomerates (1000-1500 m/s; 7-15 m thick); highly to moderately fractured porphyritic basalt, ignimbrite, rhyolite/trachyte and volcanic ash (1500-2500 m/s; 10-30 m); moderately to slightly fractured ignimbrite, rhyolite/trachyte and basalt (2500-3500 m/s; 30-50 m); and very strong, massive, fresh bedrock (>3500 m/s) from about 45 m depth. The large-scale and deep-seated landslide problem in the study area appears to be caused by heavy rainfall, complex geology and rugged topography, and by the presence of geological structures oriented parallel to the N-S (NNE-SSW trending) rift-margin faults of the central Ethiopian highlands, which coincide with the head scarps of the slides and

  13. Carbon classified?

    DEFF Research Database (Denmark)

    Lippert, Ingmar

    2012-01-01

    Using an actor-network theory (ANT) framework, the aim is to investigate the actors who bring together the elements needed to classify their carbon emission sources and unpack the heterogeneous relations drawn on. Based on an ethnographic study of corporate agents of ecological modernisation over a period of 13 months, this paper provides an exploration of three cases of enacting classification. Drawing on ANT, we problematise the silencing of a range of possible modalities of consumption facts and point to the ontological ethics involved in such performances. In a context of global warming...

  14. Compressive fracture resistance of the marginal ridge in large Class II tunnels restored with cermet and composite resin.

    Science.gov (United States)

    Ehrnford, L E; Fransson, H

    1994-01-01

    Compressive fracture resistance of the marginal ridge was studied in large tunnel preparations, before and after restoration with cermet (Ketac Silver, ESPE), a universal hybrid composite (Superlux, DMG) and an experimental composite. Each group was represented by six tunnels in extracted upper premolars. The tunnels were prepared with round burs up to size #6. Remaining ridge width was 1.5 mm and ridge height 1.7 mm in the contact area. The ridge was loaded to fracture by a rod placed perpendicular to the ridge. Generally, this resulted in a shear fracture of the restoration. There was no significant reinforcement of the ridge by the cermet, whereas both composites reinforced the ridge by the same magnitude, averaging 62%. It was concluded that the ridge could be considered a "megafiller" where contact needs to be preserved and the contour protected against proximal and occlusal wear of the restoration. Clinically, there would therefore be good reason to save even ridge areas with very low inherent strength. Based on the present study, composite resin may therefore be the filling material of choice for such tunnel preparations.

  15. A Study on the compensation margin on butt welding joint of Large Steel plates during Shipbuilding construction

    International Nuclear Information System (INIS)

    Kim, J; Jeong, K; Chung, H; Jeong, H; Ji, M; Yun, C; Lee, J

    2015-01-01

    This paper examines the characteristics of butt welding joint shrinkage for the main plates of shipbuilding and marine structures. The shrinkage strain of a butt welding joint, caused by the process of heat input and cooling, results in a difference between the dimensions of the actual parent metal and the design dimensions. This, in turn, leads to poor quality in the production of ship blocks, and the rework required during the correction period impedes productivity. Through experiments on the shrinkage strain of butt welding joints on the main plates of large structures, the welding residual deformation for I-, Y- and V-groove joints was obtained. In addition, the results indicate that shrinkage is limited to a range of 1 ∼ 2 mm for plate thicknesses of 11t ∼ 21.5t, and that the heat-transfer effect of the weld appears to be limited to within 1000 mm on either side of the seam line, so the weight of the parent metal had a limited impact on the shrinkage. Finally, it was found that the shrinkage margin needs to be applied differently according to the groove geometry in the design phase in order to minimize shrinkage. (paper)
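A sketch of how such a thickness-dependent compensation margin might be applied at the design stage. The 1-2 mm range over 11-21.5 mm plate comes from the abstract; the linear interpolation and the clamping are illustrative assumptions (the study notes the margin should really also vary with groove shape):

```python
def shrinkage_margin(thickness_mm):
    """Illustrative linear interpolation of the per-joint compensation
    margin across the 1-2 mm shrinkage range reported for 11-21.5 mm
    plate; real margins also depend on groove shape (I, Y or V) and
    heat input, which this toy function ignores."""
    t0, t1 = 11.0, 21.5
    m0, m1 = 1.0, 2.0
    t = min(max(thickness_mm, t0), t1)  # clamp to the studied range
    return m0 + (m1 - m0) * (t - t0) / (t1 - t0)

# Total length allowance for a plate crossing several butt seams:
n_joints = 3
allowance = n_joints * shrinkage_margin(16.0)  # mm
```

Because shrinkage per joint is roughly independent of parent-metal weight (per the abstract), the total allowance scales simply with the number of seams a plate crosses.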

  16. Risk-informed analysis of the large break loss of coolant accident and PCT margin evaluation with the RISMC methodology

    International Nuclear Information System (INIS)

    Liang, T.H.; Liang, K.S.; Cheng, C.K.; Pei, B.S.; Patelli, E.

    2016-01-01

    Highlights: • With the RISMC methodology, both aleatory and epistemic uncertainties have been considered. • 14 probabilistically significant sequences have been identified and quantified. • A load spectrum for LBLOCA has been constructed with the CPCT and SP of each dominant sequence. • Compared to deterministic methodologies, the risk-informed PCT margin can be greater by 44–62 K. • The SP of the referred sequence needed to cover 99% of the load spectrum is only 5.07 × 10^−3. • The occurrence probability of the deterministic licensing sequence is 5.46 × 10^−5. - Abstract: For general design basis accidents, such as SBLOCA and LBLOCA, traditional deterministic safety analysis methodologies are always applied to analyze events based on a so-called surrogate or licensing sequence, without considering how low this sequence's occurrence probability is. In the to-be-issued 10 CFR 50.46a, the LBLOCA will be categorized as an accident beyond the design basis and the PCT margin shall be evaluated in a risk-informed manner. According to the risk-informed safety margin characterization (RISMC) methodology, a process has been suggested to evaluate the risk-informed PCT margin. Following the RISMC methodology, a load spectrum of PCT for LBLOCA has been generated for Taiwan's Maanshan Nuclear Power Plant and 14 probabilistically significant sequences have been identified. It was observed in the load spectrum that the conditional PCT generally ascends with descending sequence occurrence probability. With the load spectrum covering both aleatory and epistemic uncertainties, the risk-informed PCT margin can be evaluated by either the expected-value estimation method or the sequence-probability coverage method. It was found that, compared with the traditional deterministic methodology, the PCT margin evaluated by the RISMC methodology can be greater by 44–62 K. Besides, to have a cumulated occurrence probability over 99% in the load spectrum, the occurrence probability of the
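The expected-value estimation mentioned above reduces to a probability-weighted average of the conditional PCTs in the load spectrum. A sketch with invented sequence data; only the 1477 K limit (the 2200 °F acceptance criterion of 10 CFR 50.46) is a real number:

```python
# Hypothetical load spectrum: (occurrence probability, conditional PCT in K)
# for a handful of dominant LBLOCA sequences; the numbers are invented for
# illustration, not taken from the paper.
spectrum = [
    (5.0e-4, 1150.0),
    (1.0e-4, 1230.0),
    (5.0e-5, 1310.0),
    (1.0e-5, 1390.0),
]

pct_limit = 1477.0  # K, the 10 CFR 50.46 acceptance limit (2200 F)

# Expected-value estimate: probability-weighted mean of the conditional PCTs.
total_p = sum(p for p, _ in spectrum)
expected_pct = sum(p * pct for p, pct in spectrum) / total_p
risk_informed_margin = pct_limit - expected_pct
```

Because likely sequences dominate the weighting, the expected PCT sits well below the worst conditional PCT, which is why the risk-informed margin exceeds the margin of a single deterministic licensing sequence.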

  17. Risk-informed analysis of the large break loss of coolant accident and PCT margin evaluation with the RISMC methodology

    Energy Technology Data Exchange (ETDEWEB)

    Liang, T.H. [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101 Sec. 2, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Liang, K.S., E-mail: ksliang@alum.mit.edu [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101 Sec. 2, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Cheng, C.K.; Pei, B.S. [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101 Sec. 2, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Patelli, E. [Institute of Risk and Uncertainty, University of Liverpool, Room 610, Brodie Tower, L69 3GQ (United Kingdom)

    2016-11-15

    Highlights: • With the RISMC methodology, both aleatory and epistemic uncertainties have been considered. • 14 probabilistically significant sequences have been identified and quantified. • A load spectrum for LBLOCA has been constructed with the CPCT and SP of each dominant sequence. • Compared to deterministic methodologies, the risk-informed PCT margin can be greater by 44–62 K. • The SP of the referred sequence needed to cover 99% of the load spectrum is only 5.07 × 10^−3. • The occurrence probability of the deterministic licensing sequence is 5.46 × 10^−5. - Abstract: For general design basis accidents, such as SBLOCA and LBLOCA, traditional deterministic safety analysis methodologies are always applied to analyze events based on a so-called surrogate or licensing sequence, without considering how low this sequence's occurrence probability is. In the to-be-issued 10 CFR 50.46a, the LBLOCA will be categorized as an accident beyond the design basis and the PCT margin shall be evaluated in a risk-informed manner. According to the risk-informed safety margin characterization (RISMC) methodology, a process has been suggested to evaluate the risk-informed PCT margin. Following the RISMC methodology, a load spectrum of PCT for LBLOCA has been generated for Taiwan's Maanshan Nuclear Power Plant and 14 probabilistically significant sequences have been identified. It was observed in the load spectrum that the conditional PCT generally ascends with descending sequence occurrence probability. With the load spectrum covering both aleatory and epistemic uncertainties, the risk-informed PCT margin can be evaluated by either the expected-value estimation method or the sequence-probability coverage method. It was found that, compared with the traditional deterministic methodology, the PCT margin evaluated by the RISMC methodology can be greater by 44–62 K. Besides, to have a cumulated occurrence probability over 99% in the load spectrum, the occurrence probability

  18. Performance of a Machine Learning Classifier of Knee MRI Reports in Two Large Academic Radiology Practices: A Tool to Estimate Diagnostic Yield.

    Science.gov (United States)

    Hassanpour, Saeed; Langlotz, Curtis P; Amrhein, Timothy J; Befera, Nicholas T; Lungren, Matthew P

    2017-04-01

    The purpose of this study is to evaluate the performance of a natural language processing (NLP) system in classifying a database of free-text knee MRI reports at two separate academic radiology practices. An NLP system that uses terms and patterns in manually classified narrative knee MRI reports was constructed. The NLP system was trained and tested on expert-classified knee MRI reports from two major health care organizations. Radiology reports in the training set were modeled as vectors, and a support vector machine framework was used to train the classifier. A separate test set from each organization was used to evaluate the performance of the system. We evaluated the performance of the system both within and across organizations. Standard evaluation metrics, such as accuracy, precision, recall, and F1 score (the harmonic mean of precision and recall), and their respective 95% CIs were used to measure the efficacy of our classification system. The accuracy for radiology reports belonging to the model's clinically significant concept classes, after training on data from the same institution, was good, yielding an F1 score greater than 90% (95% CI, 84.6-97.3%). Performance of the classifier on cross-institutional application without institution-specific training data yielded F1 scores of 77.6% (95% CI, 69.5-85.7%) and 90.2% (95% CI, 84.5-95.9%) at the two organizations studied. The results show excellent accuracy by the NLP machine learning classifier in classifying free-text knee MRI reports, supporting the institution-independent reproducibility of knee MRI report classification. Furthermore, the machine learning classifier performed well on free-text knee MRI reports from another institution. These data support the feasibility of multi-institutional classification of radiologic imaging text reports with a single machine learning classifier without requiring institution-specific training data.
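The pipeline described, bag-of-words report vectors fed to a support vector machine, can be sketched end-to-end. The snippet below uses a Pegasos-style stochastic sub-gradient SVM and a handful of invented report fragments; real systems train on thousands of expert-labelled reports, and this is not the authors' actual feature set or solver:

```python
import re
import random

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

# Tiny stand-in corpus: label +1 = clinically significant finding, -1 = normal.
train = [
    ("complex tear of the medial meniscus with acl disruption", 1),
    ("full thickness tear of the quadriceps tendon", 1),
    ("oblique tear of the posterior horn", 1),
    ("ligaments and menisci are intact no acute abnormality", -1),
    ("unremarkable exam cartilage intact", -1),
    ("intact cruciate ligaments mild joint effusion", -1),
]

vocab = sorted({w for text, _ in train for w in tokenize(text)})
index = {w: i for i, w in enumerate(vocab)}

def featurize(text):
    """Binary bag-of-words vector over the training vocabulary."""
    x = [0.0] * len(vocab)
    for w in tokenize(text):
        if w in index:
            x[index[w]] = 1.0
    return x

# Pegasos: stochastic sub-gradient descent on the SVM hinge loss.
lam = 0.01
w = [0.0] * len(vocab)
rng = random.Random(0)
for t in range(1, 2001):
    text, y = rng.choice(train)
    x = featurize(text)
    eta = 1.0 / (lam * t)
    score = sum(wi * xi for wi, xi in zip(w, x))
    w = [(1 - eta * lam) * wi for wi in w]   # weight decay step
    if y * score < 1.0:                       # hinge-loss margin violated
        w = [wi + eta * y * xi for wi, xi in zip(w, x)]

def classify(text):
    s = sum(w[index[tok]] for tok in tokenize(text) if tok in index)
    return 1 if s > 0 else -1
```

Production systems would add TF-IDF weighting, n-grams and negation handling, but the margin-maximising linear core is the same.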

  19. The Research of Tectonic Framework and the Fault Activity in Large Detachment Basin System on Northern Margin of South China Sea

    Science.gov (United States)

    Pan, L., Sr.; Ren, J.

    2017-12-01

    The South China Sea (SCS) is one of the largest marginal seas on the southeast Asian continental margin. It developed a Paleogene extension-rifting continental margin system that is rare in the world and preserves many deformational characteristics of this kind of system. With continued investigation of the SCS, guided by developments in tectonics and geophysics, and in particular by high-quality seismic data, it has gradually been accepted that the northern margin of the SCS has some characteristics of a detachment basin. After studying the northern margin of the SCS, we constructed lithosphere profiles across the shelf, slope and deep-sea basin in the northeast of the SCS to constrain the tectonic style of the ocean-continent transition and the properties of the detachment fault, and we outline the large detachment basins of the northern SCS. Based on a large number of high-quality 2D and 3D deep seismic profiles (TWT, 10 s), drilling and logging data, combined with domestic and international research, and using basin dynamics and tectono-stratigraphic theory together with qualitative and quantitative geological and geophysical techniques, we describe the formation of the detachment basin and calculate the fault activity rate, stretching factor and settlement. Based on this research, we propose that there is a giant and complete detachment basin system in the northern SCS and draw three conclusions. First, the detachment basin system can be divided into three domains: a proximal domain covering the Yangjiang Sag, Shenhu Uplift and part of the Shunde Sag; a necking zone covering part of the Shunde Sag and the Heshan Sag; and a distal domain covering most of the Heshan Sag. Second, the stretching factor differs among the three domains of the detachment basin system: it is smallest in the proximal domain and largest in the distal domain. This
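Two of the quantities calculated above have simple back-of-the-envelope forms that are worth making explicit: the McKenzie-style stretching factor β (present length over restored, pre-extension length, with extension estimated from summed fault heaves) and an average fault activity rate (throw over duration). The input numbers below are invented for illustration, not measured values from the study:

```python
def stretching_factor(present_length_km, summed_heave_km):
    """McKenzie-style beta: present section length divided by its restored
    (pre-extension) length, estimating extension from summed fault heaves."""
    restored = present_length_km - summed_heave_km
    return present_length_km / restored

def fault_activity_rate(throw_m, duration_myr):
    """Average fault activity rate: stratigraphic throw accumulated across
    a faulted interval divided by the interval's duration (m/Myr)."""
    return throw_m / duration_myr

beta_proximal = stretching_factor(100.0, 12.0)  # ~1.14
beta_distal = stretching_factor(100.0, 35.0)    # ~1.54
rate = fault_activity_rate(800.0, 10.0)         # 80 m/Myr
```

Under this convention, the larger summed heave in the distal domain directly yields the larger β reported for it.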

  20. Robust wavebuoys for the marginal ice zone: Experiences from a large persistent array in the Beaufort Sea

    Directory of Open Access Journals (Sweden)

    Martin J. Doble

    2017-08-01

    An array of novel directional wavebuoys was designed and deployed into the Beaufort Sea ice cover in March 2014, as part of the Office of Naval Research 'Marginal Ice Zone' experiment. The buoys were designed to drift with the ice throughout the year and monitor the expected breakup and retreat of the ice cover, forced by waves travelling into the ice from open water. Buoys were deployed from fast-and-light air-supported ice camps, based out of Sachs Harbour on Canada's Banks Island, and drifted westwards with the sea ice over the course of spring, summer and autumn as the ice melted, broke up and finally re-froze. The buoys transmitted heave, roll and pitch time series at a 1 Hz sample frequency over the course of up to eight months, surviving both convergent ice dynamics and significant waves-in-ice events. Twelve of the 19 buoys survived until their batteries were finally exhausted during freeze-up in late October/November. Ice impacts were found to have contaminated a significant proportion of the Kalman-filter-derived heave records, and these bad records were removed with reference to the raw x/y/z accelerations. The quality of the magnetometer-derived buoy headings at the very high magnetic field inclinations close to the magnetic pole was found to be generally acceptable, except in the case of four buoys which had probably suffered rough handling during transport to the ice. In general, these new buoys performed as expected, though vigilance as to the veracity of the output is required.

  1. A Topic Model Approach to Representing and Classifying Football Plays

    KAUST Repository

    Varadarajan, Jagannadan

    2013-09-09

    We address the problem of modeling and classifying American Football offense teams' plays in video, a challenging example of group activity analysis. Automatic play classification will allow coaches to infer patterns and tendencies of opponents more efficiently, resulting in better strategy planning in a game. We define a football play as a unique combination of player trajectories. To this end, we develop a framework that uses player trajectories as inputs to MedLDA, a supervised topic model. The joint maximization of both the likelihood and the inter-class margins of MedLDA in learning the topics allows us to learn semantically meaningful play type templates, as well as to classify different play types with 70% average accuracy. Furthermore, this method is extended to analyze individual player roles in classifying each play type. We validate our method on a large dataset comprising 271 play clips from real-world football games, which will be made publicly available for future comparisons.

  2. Final Validation of the ProMisE Molecular Classifier for Endometrial Carcinoma in a Large Population-based Case Series.

    Science.gov (United States)

    Kommoss, S; McConechy, M K; Kommoss, F; Leung, S; Bunz, A; Magrill, J; Britton, H; Kommoss, F; Grevenkamp, F; Karnezis, A; Yang, W; Lum, A; Krämer, B; Taran, F; Staebler, A; Lax, S; Brucker, S Y; Huntsman, D G; Gilks, C B; McAlpine, J N; Talhouk, A

    2018-02-07

    Based on The Cancer Genome Atlas, we previously developed and confirmed a pragmatic molecular classifier for endometrial cancers: ProMisE (Proactive Molecular Risk Classifier for Endometrial Cancer). ProMisE identifies four prognostically distinct molecular subtypes and can be applied to diagnostic specimens (biopsy/curettings), enabling earlier informed decision-making. We have strictly adhered to the Institute of Medicine (IOM) guidelines for the development of genomic biomarkers, and herein present the final validation step of a locked-down classifier prior to clinical application. We assessed a retrospective cohort of women treated for endometrial carcinoma at the Tübingen University Women's Hospital between 2003 and 2013. Primary outcomes of overall, disease-specific and progression-free survival were evaluated for clinical, pathological and molecular features. Complete clinical and molecular data were evaluable for 452 women. Patient age ranged from 29 to 93 (median 65) years, and 87.8% of cases were of endometrioid histotype. Grade distribution included 282 (62.4%) G1, 75 (16.6%) G2, and 95 (21.0%) G3 tumors. 276 (61.1%) patients had stage IA disease, with the remainder stage IB (89; 19.7%), stage II (26; 5.8%), and stage III/IV (61; 13.5%). ProMisE molecular classification yielded 127 (28.1%) MMR-D, 42 (9.3%) POLE, 55 (12.2%) p53abn, and 228 (50.4%) p53wt. ProMisE was a prognostic marker for progression-free (P=0.001) and disease-specific (P=0.03) survival even after adjusting for known risk factors. Concordance between diagnostic and surgical specimens was highly favorable (accuracy 0.91, kappa 0.88). We have developed, confirmed and now validated a pragmatic molecular classification tool (ProMisE) that provides consistent categorization of tumors and identifies four distinct prognostic molecular subtypes. ProMisE can be applied to diagnostic samples and thus could be used to inform surgical procedure(s) and/or the need for adjuvant therapy. Based on the IOM
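ProMisE's sequential assignment can be sketched as a short decision cascade. The testing order below (MMR IHC first, then POLE sequencing, then p53 IHC) follows our reading of the published ProMisE algorithm; the boolean inputs are a simplification of the underlying assay results:

```python
def promise_subtype(mmr_deficient, pole_edm_mutated, p53_abnormal):
    """Sketch of the sequential ProMisE assignment: markers are tested in a
    fixed order and the first positive test decides the subtype. Boolean
    inputs stand in for MMR IHC, POLE exonuclease-domain sequencing and
    p53 IHC results."""
    if mmr_deficient:
        return "MMR-D"
    if pole_edm_mutated:
        return "POLE"
    if p53_abnormal:
        return "p53abn"
    return "p53wt"
```

Testing in a fixed order resolves the rare tumors carrying more than one abnormal marker, so every case receives exactly one of the four subtypes reported in the cohort.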

  3. Subducted slab-plume interaction traced by magnesium isotopes in the northern margin of the Tarim Large Igneous Province

    Science.gov (United States)

    Cheng, Zhiguo; Zhang, Zhaochong; Xie, Qiuhong; Hou, Tong; Ke, Shan

    2018-05-01

    Incorporation of subducted slabs may account for the geochemical and isotopic variations of large igneous provinces (LIPs). However, the mechanism and process by which subducted slabs are involved in such magmas is still highly debated. Here, we report a set of high-resolution Mg isotopes for a suite of alkaline and Fe-rich rocks (including basalts, mafic-ultramafic layered intrusions, diabase dykes and mantle xenoliths in the kimberlitic rocks) from the Tarim Large Igneous Province (TLIP). We observed that δ26Mg values range from -0.29 to -0.45‰ for basalts, -0.31 to -0.42‰ for mafic-ultramafic layered intrusions, -0.28 to -0.31‰ for diabase dykes and -0.29 to -0.44‰ for pyroxenite xenoliths from the kimberlitic rocks, typically lighter than the normal mantle source (-0.25 ± 0.04‰, 2 SD). After carefully precluding other possibilities, we propose that the light Mg isotopic compositions and high FeO contents should be ascribed to the involvement of recycled sedimentary carbonate rocks and pyroxenite/eclogite. Moreover, from basalts, through layered intrusions, to diabase dykes, (87Sr/86Sr)i values and δ18O (V-SMOW) declined, whereas εNd(t) and δ26Mg values increased with progressive partial melting of the mantle, indicating that the components of carbonate rock and pyroxenite/eclogite in the mantle sources were waning over time. In combination with previously reported Mg isotopes for carbonatite, nephelinite and kimberlitic rocks in the TLIP, two distinct mantle domains are recognized for this province: 1) a lithospheric mantle source for basalts and mafic-ultramafic layered intrusions, which was modified by calcite/dolomite and eclogite-derived high-Si melts, as evidenced by enriched Sr-Nd-O and light Mg isotopic compositions; 2) a plume source for carbonatite, nephelinite and kimberlitic rocks, which was related to magnesite or periclase/perovskite involvement, as reflected by depleted Sr-Nd-O and extremely light Mg isotopes. Ultimately, our study suggests

  4. Stack filter classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory

    2009-01-01

    Just as linear models generalize the sample mean and weighted average, weighted order statistic models generalize the sample median and weighted median. This analogy can be continued informally to generalized additive models in the case of the mean, and Stack Filters in the case of the median. Both of these model classes have been extensively studied for signal and image processing, but it is surprising to find that for pattern classification their treatment has been significantly one-sided. Generalized additive models are now a major tool in pattern classification, and many different learning algorithms have been developed to fit model parameters to finite data. However, Stack Filters remain largely confined to signal and image processing, and learning algorithms for classification are yet to be seen. This paper is a step towards Stack Filter Classifiers, and it shows that the approach is interesting from both a theoretical and a practical perspective.
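As a minimal illustration of the analogy this abstract draws (not code from the paper), the sketch below contrasts a weighted average with a weighted median on a window containing an outlier; the window data and weights are made up:

```python
def weighted_median(values, weights):
    """Return the smallest value whose cumulative weight reaches half the total weight."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    cumulative = 0.0
    for value, weight in pairs:
        cumulative += weight
        if cumulative >= total / 2:
            return value

def weighted_average(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

window = [3, 100, 4, 5, 6]   # one gross outlier (100), as in impulsive noise
weights = [1, 1, 2, 1, 1]

print(weighted_average(window, weights))  # pulled far toward the outlier
print(weighted_median(window, weights))   # robust to the outlier: 4
```

Weighted order statistic filters apply this weighted median over a sliding window, which is why they are preferred over linear filters for impulsive-noise removal.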

  5. Marginal Matter

    Science.gov (United States)

    van Hecke, Martin

    2013-03-01

    All around us, things are falling apart. The foam on our cappuccinos appears solid, but gentle stirring irreversibly changes its shape. Skin, a biological fiber network, is firm when you pinch it, but soft under light touch. Sand mimics a solid when we walk on the beach but a liquid when we pour it out of our shoes. Crucially, a marginal point separates the rigid or jammed state from the mechanical vacuum (freely flowing) state - at their marginal points, soft materials are neither solid nor liquid. Here I will show how the marginal point gives birth to a third sector of soft matter physics: intrinsically nonlinear mechanics. I will illustrate this with shock waves in weakly compressed granular media, the nonlinear rheology of foams, and the nonlinear mechanics of weakly connected elastic networks.

  6. Quantum ensembles of quantum classifiers.

    Science.gov (United States)

    Schuld, Maria; Petruccione, Francesco

    2018-02-09

    Quantum machine learning witnesses an increasing amount of quantum algorithms for data-driven decision making, a problem with potential applications ranging from automated image recognition to medical diagnosis. Many of those algorithms are implementations of quantum classifiers, or models for the classification of data inputs with a quantum computer. Following the success of collective decision making with ensembles in classical machine learning, this paper introduces the concept of quantum ensembles of quantum classifiers. Creating the ensemble corresponds to a state preparation routine, after which the quantum classifiers are evaluated in parallel and their combined decision is accessed by a single-qubit measurement. This framework naturally allows for exponentially large ensembles in which - similar to Bayesian learning - the individual classifiers do not have to be trained. As an example, we analyse an exponentially large quantum ensemble in which each classifier is weighed according to its performance in classifying the training data, leading to new results for quantum as well as classical machine learning.
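A purely classical sketch of the weighting scheme described in this abstract (an accuracy-weighted ensemble vote, here over a small ensemble rather than an exponentially large quantum one; all data and classifiers are illustrative):

```python
def accuracy(clf, X, y):
    """Fraction of training points the classifier gets right."""
    return sum(clf(x) == label for x, label in zip(X, y)) / len(y)

def ensemble_predict(classifiers, X_train, y_train, x):
    """Weigh each classifier's vote by its training accuracy."""
    votes = {}
    for clf in classifiers:
        w = accuracy(clf, X_train, y_train)  # performance-based weight
        votes[clf(x)] = votes.get(clf(x), 0.0) + w
    return max(votes, key=votes.get)

# Toy 1-D data: the true label is 1 when x >= 0.
X_train = [-2.0, -1.0, 0.5, 1.5]
y_train = [0, 0, 1, 1]

classifiers = [
    lambda x: int(x >= 0),   # perfect on this data (weight 1.0)
    lambda x: int(x >= 1),   # misclassifies 0.5 (weight 0.75)
    lambda x: 1,             # always predicts 1 (weight 0.5)
]

print(ensemble_predict(classifiers, X_train, y_train, -0.5))  # → 0
```

In the quantum version the weights are encoded in the amplitudes of a superposition, so the weighted vote is evaluated in parallel rather than by this explicit loop.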

  7. Automated gastric cancer diagnosis on H&E-stained sections: training a classifier on a large scale with multiple instance machine learning

    Science.gov (United States)

    Cosatto, Eric; Laquerre, Pierre-Francois; Malon, Christopher; Graf, Hans-Peter; Saito, Akira; Kiyuna, Tomoharu; Marugame, Atsushi; Kamijo, Ken'ichi

    2013-03-01

    We present a system that detects cancer on slides of gastric tissue sections stained with hematoxylin and eosin (H&E). At its heart is a classifier trained using the semi-supervised multi-instance learning framework (MIL), where each tissue is represented by a set of regions-of-interest (ROI) and a single label. Such labels are readily obtained because pathologists diagnose each tissue independently as part of the normal clinical workflow. From a large dataset of over 26K gastric tissue sections from over 12K patients obtained from a clinical load spanning several months, we train a MIL classifier on a patient-level partition of the dataset (2/3 of the patients) and obtain a very high performance of 96% (AUC), tested on the remaining 1/3 never-before-seen patients (over 8K tissues). We show this level of performance to match the more costly supervised approach where individual ROIs need to be labeled manually. The large amount of data used to train this system gives us confidence in its robustness and that it can be safely used in a clinical setting. We demonstrate how it can improve the clinical workflow when used for pre-screening or quality control. For pre-screening, the system can diagnose 47% of the tissues with a very low likelihood of containing cancers, thus halving the clinicians' caseload. For quality control, compared to random rechecking of 33% of the cases, the system achieves a three-fold increase in the likelihood of catching cancers missed by pathologists. The system is currently in regular use at independent pathology labs in Japan where it is used to double-check clinicians' diagnoses. At the end of 2012 it will have analyzed over 80,000 slides of gastric and colorectal samples (200,000 tissues).

  8. Hierarchical mixtures of naive Bayes classifiers

    NARCIS (Netherlands)

    Wiering, M.A.

    2002-01-01

    Naive Bayes classifiers tend to perform very well on a large number of problem domains, although their representation power is quite limited compared to more sophisticated machine learning algorithms. In this paper we study combining multiple naive Bayes classifiers by using the hierarchical
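The abstract above is truncated, but its building block, a naive Bayes classifier over discrete features, can be sketched as follows (toy data and Laplace smoothing constant are illustrative; the denominator assumes binary feature values):

```python
from collections import Counter, defaultdict
import math

def train_naive_bayes(X, y, alpha=1.0):
    """Count class frequencies and per-class feature-value frequencies."""
    class_counts = Counter(y)
    feat_counts = defaultdict(Counter)   # (class, feature index) -> value counts
    for features, label in zip(X, y):
        for i, v in enumerate(features):
            feat_counts[(label, i)][v] += 1
    return class_counts, feat_counts, alpha

def predict(model, features):
    """Pick the class maximizing log prior + sum of smoothed log likelihoods."""
    class_counts, feat_counts, alpha = model
    total = sum(class_counts.values())
    best, best_score = None, -math.inf
    for c, n in class_counts.items():
        score = math.log(n / total)
        for i, v in enumerate(features):
            counts = feat_counts[(c, i)]
            # Laplace smoothing; "2" assumes each feature takes two values
            score += math.log((counts[v] + alpha) / (n + alpha * 2))
        if score > best_score:
            best, best_score = c, score
    return best

X = [(1, 0), (1, 1), (0, 0), (0, 1)]
y = ["spam", "spam", "ham", "ham"]
model = train_naive_bayes(X, y)
print(predict(model, (1, 0)))  # → "spam"
```

A hierarchical mixture, as the title suggests, would combine several such classifiers; the combination scheme itself is not reproduced here.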

  9. Classifying Returns as Extreme

    DEFF Research Database (Denmark)

    Christiansen, Charlotte

    2014-01-01

    I consider extreme returns for the stock and bond markets of 14 EU countries using two classification schemes: One, the univariate classification scheme from the previous literature that classifies extreme returns for each market separately, and two, a novel multivariate classification scheme tha...

  10. LCC: Light Curves Classifier

    Science.gov (United States)

    Vo, Martin

    2017-08-01

    Light Curves Classifier uses data mining and machine learning to obtain and classify desired objects. This task can be accomplished by attributes of light curves or any time series, including shapes, histograms, or variograms, or by other available information about the inspected objects, such as color indices, temperatures, and abundances. After specifying features which describe the objects to be searched, the software trains on a given training sample, and can then be used for unsupervised clustering for visualizing the natural separation of the sample. The package can also be used for automatic tuning of the parameters of the methods used (for example, the number of hidden neurons or the binning ratio). Trained classifiers can be used for filtering outputs from astronomical databases or data stored locally. The Light Curves Classifier can also be used for simple downloading of light curves and all available information about queried stars. It can natively connect to OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO, and new connectors or descriptors can be implemented. In addition to direct usage of the package and the command line UI, the program can be used through a web interface. Users can create jobs for "training" methods on given objects, querying databases and filtering outputs by trained filters. Preimplemented descriptors, classifiers and connectors can be picked by simple clicks and their parameters can be tuned by giving ranges of these values. All combinations are then calculated and the best one is used for creating the filter. Natural separation of the data can be visualized by unsupervised clustering.

  11. Error minimizing algorithms for nearest neighbor classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory; Zimmer, G. Beate [Texas A&M

    2011-01-03

    Stack Filters define a large class of discrete nonlinear filters first introduced in image and signal processing for noise removal. In recent years we have suggested their application to classification problems, and investigated their relationship to other types of discrete classifiers such as Decision Trees. In this paper we focus on a continuous-domain version of Stack Filter Classifiers which we call Ordered Hypothesis Machines (OHM), and investigate their relationship to Nearest Neighbor classifiers. We show that OHM classifiers provide a novel framework in which to train Nearest Neighbor type classifiers by minimizing empirical-error-based loss functions. We use the framework to investigate a new cost-sensitive loss function that allows us to train a Nearest Neighbor type classifier for low false alarm rate applications. We report results on both synthetic data and real-world image data.
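The cost-sensitive idea mentioned in this abstract can be illustrated (this is not the OHM algorithm itself) by a k-NN rule that only predicts the positive class when the positive vote mass outweighs the negative mass scaled by a false-alarm cost; all names and data below are hypothetical:

```python
def knn_cost_sensitive(train, x, k=3, false_alarm_cost=2.0):
    """Predict 1 only if positive neighbours outweigh cost-scaled negative ones."""
    neighbors = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    pos = sum(1 for _, label in neighbors if label == 1)
    neg = k - pos
    return int(pos > false_alarm_cost * neg)

# 1-D toy data: (feature, label)
train = [(0.0, 0), (1.0, 0), (2.0, 1), (3.0, 1), (4.0, 1)]

# A plain majority vote among the 3 nearest neighbours of 1.8 predicts 1;
# with a false-alarm cost of 2 the rule instead predicts 0.
print(knn_cost_sensitive(train, 1.8, k=3, false_alarm_cost=2.0))  # → 0
print(knn_cost_sensitive(train, 1.8, k=3, false_alarm_cost=0.5))  # → 1
```

Raising `false_alarm_cost` trades detection rate for a lower false alarm rate, the regime the paper targets.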

  12. Intelligent Garbage Classifier

    Directory of Open Access Journals (Sweden)

    Ignacio Rodríguez Novelle

    2008-12-01

    Full Text Available IGC (Intelligent Garbage Classifier is a system for visual classification and separation of solid waste products. Currently, an important part of the separation effort is based on manual work, from household separation to industrial waste management. Taking advantage of the technologies currently available, a system has been built that can analyze images from a camera and control a robot arm and conveyor belt to automatically separate different kinds of waste.

  13. Classifying Linear Canonical Relations

    OpenAIRE

    Lorand, Jonathan

    2015-01-01

    In this Master's thesis, we consider the problem of classifying, up to conjugation by linear symplectomorphisms, linear canonical relations (lagrangian correspondences) from a finite-dimensional symplectic vector space to itself. We give an elementary introduction to the theory of linear canonical relations and present partial results toward the classification problem. This exposition should be accessible to undergraduate students with a basic familiarity with linear algebra.

  14. Defining and Classifying Interest Groups

    DEFF Research Database (Denmark)

    Baroni, Laura; Carroll, Brendan; Chalmers, Adam

    2014-01-01

    The interest group concept is defined in many different ways in the existing literature and a range of different classification schemes are employed. This complicates comparisons between different studies and their findings. One of the important tasks faced by interest group scholars engaged...... in large-N studies is therefore to define the concept of an interest group and to determine which classification scheme to use for different group types. After reviewing the existing literature, this article sets out to compare different approaches to defining and classifying interest groups with a sample...... in the organizational attributes of specific interest group types. As expected, our comparison of coding schemes reveals a closer link between group attributes and group type in narrower classification schemes based on group organizational characteristics than those based on a behavioral definition of lobbying....

  15. Arthroscopic repair of large U-shaped rotator cuff tears without margin convergence versus repair of crescent- or L-shaped tears.

    Science.gov (United States)

    Park, Jin-Young; Jung, Seok Won; Jeon, Seung-Hyub; Cho, Hyoung-Weon; Choi, Jin-Ho; Oh, Kyung-Soo

    2014-01-01

    For large-sized tears of the rotator cuff, data according to the tear shape have not yet been reported for repair methodology, configuration, and subsequent integrity. The retear rate after the repair of large mobile tears, such as crescent- or L-shaped tears, is believed to be lower compared with retear rates after the repair of large U-shaped tears that are accompanied by anterior or posterior leaves of the rotator cuff. Cohort study; Level of evidence, 3. Data were collected and analyzed from 95 consecutive patients with a large-sized rotator cuff tear who underwent arthroscopic suture-bridge repair. Patients were divided into 2 groups: those having crescent- or L-shaped tears (mobile tear group, 53 patients) and those having U-shaped tears (U-shaped tear group, 42 patients). The integrity of the repaired constructs was determined by ultrasonography at 4.5, 12, and 24 months. Moreover, clinical evaluations were performed by using the Constant score, the American Shoulder and Elbow Surgeons (ASES) score, and muscle strength at intervals of 3, 6, 12, and 24 months postoperatively. On ultrasonography at 4.5, 12, and 24 months, a retear was detected in 6, 2, and 1 patients in the mobile tear group and in 5, 2, and 1 patients in the U-shaped tear group, respectively. Significant differences in retear rates were not detected between the groups overall or at each time point. Moreover, clinical scores were similar between groups, except for the presence of a temporarily higher Constant score at 12 months in the mobile tear group. With regard to shoulder strength, between-group comparisons indicated no statistically significant difference, either in abduction or external rotation, except for the presence of temporarily higher external rotation strength at 3 months in the mobile tear group. Arthroscopic repair of large-sized rotator cuff tears yielded substantial improvements in shoulder function, regardless of tear retraction, during midterm follow-up. Moreover, the

  16. Convexity and Marginal Vectors

    NARCIS (Netherlands)

    van Velzen, S.; Hamers, H.J.M.; Norde, H.W.

    2002-01-01

    In this paper we construct sets of marginal vectors of a TU game with the property that if the marginal vectors from these sets are core elements, then the game is convex. This approach leads to new upper bounds on the number of marginal vectors needed to characterize convexity. Another result is that

  17. "We call ourselves marginalized"

    DEFF Research Database (Denmark)

    Jørgensen, Nanna Jordt

    2014-01-01

    of the people we refer to as marginalized. In this paper, I discuss how young secondary school graduates from a pastoralist community in Kenya use and negotiate indigeneity, marginal identity, and experiences of marginalization in social navigations aimed at broadening their current and future opportunities. I...

  18. Fast Most Similar Neighbor (MSN) classifiers for Mixed Data

    OpenAIRE

    Hernández Rodríguez, Selene

    2010-01-01

    The k nearest neighbor (k-NN) classifier has been extensively used in Pattern Recognition because of its simplicity and its good performance. However, in large datasets applications, the exhaustive k-NN classifier becomes impractical. Therefore, many fast k-NN classifiers have been developed; most of them rely on metric properties (usually the triangle inequality) to reduce the number of prototype comparisons. Hence, the existing fast k-NN classifiers are applicable only when the comparison f...

  19. Recognize and classify pneumoconiosis

    International Nuclear Information System (INIS)

    Hering, K.G.; Hofmann-Preiss, K.

    2014-01-01

    In the year 2012, out of the 10 most frequently recognized occupational diseases 6 were forms of pneumoconiosis. With respect to healthcare and economic aspects, silicosis and asbestos-associated diseases are of foremost importance. The latter are to be found everywhere and are not restricted to large industrial areas. Radiology has a central role in the diagnosis and evaluation of occupational lung disorders. In cases of known exposure mainly to asbestos and quartz, the diagnosis of pneumoconiosis, with few exceptions will be established primarily by the radiological findings. As these disorders are asymptomatic for a long time they are quite often detected as incidental findings in examinations for other reasons. Therefore, radiologists have to be familiar with the pattern of findings of the most frequent forms of pneumoconiosis and the differential diagnoses. For reasons of equal treatment of the insured a quality-based, standardized performance, documentation and evaluation of radiological examinations is required in preventive procedures and evaluations. Above all, a standardized low-dose protocol has to be used in computed tomography (CT) examinations, although individualized concerning the dose, in order to keep radiation exposure as low as possible for the patient. The International Labour Office (ILO) classification for the coding of chest X-rays and the international classification of occupational and environmental respiratory diseases (ICOERD) classification used since 2004 for CT examinations meet the requirements of the insured and the occupational insurance associations as a means of reproducible and comparable data for decision-making. (orig.) [de

  20. Molecular markers in the surgical margin of oral carcinomas

    DEFF Research Database (Denmark)

    Bilde, A.; Buchwald, C. von; Dabelsteen, E.

    2009-01-01

    epithelium in the surgical resection margin may explain the local recurrence rate. The purpose of this study is to investigate the presence of senescence markers, which may represent early malignant changes in the margin that in routine pathological evaluations are classified as histologically normal...

  1. On the evaluation of marginal expected shortfall

    DEFF Research Database (Denmark)

    Caporin, Massimiliano; Santucci de Magistris, Paolo

    2012-01-01

    In the analysis of systemic risk, Marginal Expected Shortfall may be considered to evaluate the marginal impact of a single stock on the market Expected Shortfall. These quantities are generally computed using log-returns, in particular when there is also a focus on returns conditional distribution....... In this case, the market log-return is only approximately equal to the weighed sum of equities log-returns. We show that the approximation error is large during turbulent market phases, with a subsequent impact on Marginal Expected Shortfall. We then suggest how to improve the evaluation of Marginal Expected...
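A numerical sketch of the approximation error discussed above: the market log-return equals the log of the weighted sum of gross simple returns, and is only approximately the weighted sum of the equities' log-returns. The portfolio weights and return scenarios below are illustrative:

```python
import math

weights = [0.5, 0.5]  # illustrative portfolio weights

def market_log_return(simple_returns):
    # exact market log-return: log of the weighted sum of gross simple returns
    return math.log(sum(w * (1 + r) for w, r in zip(weights, simple_returns)))

def approx_log_return(simple_returns):
    # common approximation: weighted sum of the equities' log-returns
    return sum(w * math.log(1 + r) for w, r in zip(weights, simple_returns))

calm = [0.01, -0.01]        # quiet market phase, small moves
turbulent = [0.20, -0.25]   # turbulent phase, large moves

calm_error = abs(market_log_return(calm) - approx_log_return(calm))
turbulent_error = abs(market_log_return(turbulent) - approx_log_return(turbulent))
print(calm_error, turbulent_error)  # the error is far larger in the turbulent phase
```

This is the mechanism behind the paper's point that the approximation, and hence Marginal Expected Shortfall estimates built on it, deteriorates precisely during turbulent phases.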

  2. Fingerprint prediction using classifier ensembles

    CSIR Research Space (South Africa)

    Molale, P

    2011-11-01

    Full Text Available Logistic discrimination (LgD), k-nearest neighbour (k-NN), artificial neural network (ANN), association rules (AR), decision tree (DT), naive Bayes classifier (NBC) and the support vector machine (SVM). The performance of several multiple classifier systems...

  3. Large margin classification with indefinite similarities

    KAUST Repository

    Alabdulmohsin, Ibrahim; Cisse, Moustapha; Gao, Xin; Zhang, Xiangliang

    2016-01-01

    Classification with indefinite similarities has attracted attention in the machine learning community. This is partly due to the fact that many similarity functions that arise in practice are not symmetric positive semidefinite, i.e. the Mercer

  4. Refining margins and prospects

    International Nuclear Information System (INIS)

    Baudouin, C.; Favennec, J.P.

    1997-01-01

    Refining margins throughout the world have remained low in 1996. In Europe, in spite of an improvement, particularly during the last few weeks, they are still not high enough to finance new investments. Although the demand for petroleum products is increasing, experts are still sceptical about any rapid recovery due to prevailing overcapacity and to continuing capacity growth. After a historical review of margins and an analysis of margins by regions, we analyse refining over-capacities in Europe and the unbalances between production and demand. Then we discuss the current situation concerning barriers to the rationalization, agreements between oil companies, and the consequences on the future of refining capacities and margins. (author)

  5. Marginalization of the Youth

    DEFF Research Database (Denmark)

    Jensen, Niels Rosendal

    2009-01-01

    The article is based on a keynote speech in Bielefeld on the subject "welfare state and marginalized youth", focusing upon the high ambition of expanding schooling in Denmark from 9 to 12 years. The unintended effect may be a new kind of marginalization.

  6. Classified

    CERN Multimedia

    Computer Security Team

    2011-01-01

    In the last issue of the Bulletin, we discussed recent implications for privacy on the Internet. But privacy of personal data is just one facet of data protection. Confidentiality is another one. However, confidentiality and data protection are often perceived as not relevant in the academic environment of CERN.   But think twice! At CERN, your personal data, e-mails, medical records, financial and contractual documents, MARS forms, group meeting minutes (and of course your password!) are all considered to be sensitive, restricted or even confidential. And this is not all. Physics results, in particular when preliminary and pending scrutiny, are sensitive, too. Just recently, an ATLAS collaborator copy/pasted the abstract of an ATLAS note onto an external public blog, despite the fact that this document was clearly marked as an "Internal Note". Such an act was not only embarrassing to the ATLAS collaboration, but also had a negative impact on CERN's reputation --- i

  7. Classifying Sluice Occurrences in Dialogue

    DEFF Research Database (Denmark)

    Baird, Austin; Hamza, Anissa; Hardt, Daniel

    2018-01-01

    perform manual annotation with acceptable inter-coder agreement. We build classifier models with Decision Trees and Naive Bayes, with accuracy of 67%. We deploy a classifier to automatically classify sluice occurrences in OpenSubtitles, resulting in a corpus with 1.7 million occurrences. This will support.... Despite this, the corpus can be of great use in research on sluicing and development of systems, and we are making the corpus freely available on request. Furthermore, we are in the process of improving the accuracy of sluice identification and annotation for the purpose of creating a subsequent version...

  8. The marginal costs of greenhouse gas emissions

    International Nuclear Information System (INIS)

    Tol, R.S.J.

    1999-01-01

    Estimates of the marginal costs of greenhouse gas emissions are an important input to the decision of how much society would want to spend on greenhouse gas emission reduction. Marginal cost estimates in the literature range between $5 and $25 per ton of carbon. Using similar assumptions, the FUND model finds marginal costs of $9-23/tC, depending on the discount rate. If the aggregation of impacts over countries accounts for inequalities in income distribution or for risk aversion, marginal costs would rise by about a factor of 3. Marginal costs per region are an order of magnitude smaller than global marginal costs. The ratios between the marginal costs of CO2 and those of CH4 and N2O are roughly equal to the global warming potentials of these gases. The uncertainty about the marginal costs is large and right-skewed. The expected value of the marginal costs lies about 35% above the best guess, the 95-percentile about 250%

  9. On marginal regeneration

    NARCIS (Netherlands)

    Stein, H.N.

    1991-01-01

    On applying the marginal regeneration concept to the drainage of free liquid films, problems are encountered: the films do not show a "neck" of minimum thickness at the film/border transition; and the causes of the direction dependence of the marginal regeneration are unclear. Both problems can be

  10. Indian Ocean margins

    Digital Repository Service at National Institute of Oceanography (India)

    Naqvi, S.W.A

    in the latter two areas. Some of these fluxes are expected to be substantial in the case of Indonesian continental margins and probably also across the eastern coasts of Africa not covered in this chapter. However, a dearth of information makes these margins...

  11. Matthew and marginality

    Directory of Open Access Journals (Sweden)

    Denis C. Duling

    1995-12-01

    Full Text Available This article explores marginality theory as it was first proposed in the social sciences, where it related to persons caught between two competing cultures (Park; Stonequist), and then as it was developed in sociology in relation to the poor (Germani) and in anthropology in relation to involuntary marginality and voluntary marginality (Victor Turner). It then examines a 'normative scheme' in antiquity that creates involuntary marginality at the macrosocial level, namely, Lenski's social stratification model in an agrarian society, and indicates how Matthean language might fit with a sample inventory of socioreligious roles. Next, it examines some 'normative schemes' in antiquity for voluntary marginality at the microsocial level, namely, groups, and examines how the Matthean gospel would fit based on indications of factions and leaders. The article shows that the author of the Gospel of Matthew has an ideology of 'voluntary marginality', but his gospel includes some hope for 'involuntary marginals' in the real world, though it is somewhat tempered. It also suggests that the writer of the Gospel is a 'marginal man', especially in the sense defined by the early theorists (Park; Stonequist).

  12. Fixing soft margins

    NARCIS (Netherlands)

    P. Kofman (Paul); A. Vaal, de (Albert); C.G. de Vries (Casper)

    1993-01-01

    textabstractNon-parametric tolerance limits are employed to calculate soft margins such as advocated in Williamson's target zone proposal. In particular, the tradeoff between softness and zone width is quantified. This may be helpful in choosing appropriate margins. Furthermore, it offers

  13. IAEA safeguards and classified materials

    International Nuclear Information System (INIS)

    Pilat, J.F.; Eccleston, G.W.; Fearey, B.L.; Nicholas, N.J.; Tape, J.W.; Kratzer, M.

    1997-01-01

    The international community in the post-Cold War period has suggested that the International Atomic Energy Agency (IAEA) utilize its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials, some of which are classified, under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring classified materials. A traditional safeguards approach, based on nuclear material accountancy, would seem unavoidably to reveal classified information. However, further analysis of the IAEA's safeguards approaches is warranted in order to understand fully the scope and nature of any problems. The issues are complex and difficult, and it is expected that common technical understandings will be essential for their resolution. Accordingly, this paper examines and compares traditional safeguards item accounting of fuel at a nuclear power station (especially spent fuel) with the challenges presented by inspections of classified materials. This analysis is intended to delineate more clearly the problems as well as reveal possible approaches, techniques, and technologies that could allow the adaptation of safeguards to the unprecedented task of inspecting classified materials. It is also hoped that a discussion of these issues can advance ongoing political-technical debates on international inspections of excess classified materials

  14. Hybrid classifiers methods of data, knowledge, and classifier combination

    CERN Document Server

    Wozniak, Michal

    2014-01-01

    This book delivers definite and compact knowledge on how hybridization can help improve the quality of computer classification systems. To help readers clearly grasp hybridization, the book primarily focuses on introducing the different levels of hybridization and illuminating the problems we will face when dealing with such projects. The data and knowledge incorporated in hybridization are treated first, followed by a still-growing area of classifier systems known as combined classifiers. This book comprises the aforementioned state-of-the-art topics and the latest research results of the author and his team from the Department of Systems and Computer Networks, Wroclaw University of Technology, including a classifier based on feature space splitting, one-class classification, imbalanced data, and data stream classification.

  15. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  16. Refining margins: recent trends

    International Nuclear Information System (INIS)

    Baudoin, C.; Favennec, J.P.

    1999-01-01

    Despite a business environment that was globally mediocre due primarily to the Asian crisis and to a mild winter in the northern hemisphere, the signs of improvement noted in the refining activity in 1996 were borne out in 1997. But the situation is not yet satisfactory in this sector: the low return on invested capital and the financing of environmental protection expenditure are giving cause for concern. In 1998, the drop in crude oil prices and the concomitant fall in petroleum product prices was ultimately rather favorable to margins. Two elements tended to put a damper on this relative optimism. First of all, margins continue to be extremely volatile and, secondly, the worsening of the economic and financial crisis observed during the summer made for a sharp decline in margins in all geographic regions, especially Asia. Since the beginning of 1999, refining margins are weak and utilization rates of refining capacities have decreased. (authors)

  17. SOCIAL MARGINALIZATION AND HEALTH

    Directory of Open Access Journals (Sweden)

    Marjana Bogdanović

    2007-04-01

    Full Text Available The 20th century was characterized by special improvement in health. The aim of the WHO policy EQUITY IN HEALTH is to enable equal accessibility and equally high quality of health care for all citizens. More or less, some social groups have stayed outside many social systems, even the health care system, in a condition of social marginalization. The phenomenon of social marginalization is characterized by dynamics. Marginalized persons lack control over their life and available resources. Social marginalization is a blow to health and makes health status worse. Low socio-economic level dramatically influences people's health status; therefore, poverty and illness work together. Characteristic marginalized groups are: Roma people, people with AIDS, prisoners, persons with development disorders, persons with mental health disorders, refugees, homosexual people, delinquents, prostitutes, drug consumers, the homeless… There is a mutual responsibility of the community and marginalized individuals in trying to resolve the problem. Health and other problems can be solved only by a multisector approach with well-designed programs.

  18. Pickering seismic safety margin

    International Nuclear Information System (INIS)

    Ghobarah, A.; Heidebrecht, A.C.; Tso, W.K.

    1992-06-01

    A study was conducted to recommend a methodology for the seismic safety margin review of existing Canadian CANDU nuclear generating stations such as Pickering A. The purpose of the seismic safety margin review is to determine whether the nuclear plant has sufficient seismic safety margin over its design basis to assure plant safety. In this review process, it is possible to identify the weak links which might limit the seismic performance of critical structures, systems and components. The proposed methodology is a modification of the EPRI (Electric Power Research Institute) approach. The methodology includes: the characterization of the site margin earthquake, the definition of the performance criteria for the elements of a success path, and the determination of the seismic withstand capacity. It is proposed that the margin earthquake be established on the basis of historical records and regional seismotectonic and site-specific evaluations. The ability of the components and systems to withstand the margin earthquake is determined by database comparisons, inspection, analysis or testing. An implementation plan for the application of the methodology to the Pickering A NGS is prepared.

  19. Knowledge Uncertainty and Composed Classifier

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    2007-01-01

    Roč. 1, č. 2 (2007), s. 101-105 ISSN 1998-0140 Institutional research plan: CEZ:AV0Z10750506 Keywords : Boosting architecture * contextual modelling * composed classifier * knowledge management * knowledge * uncertainty Subject RIV: IN - Informatics, Computer Science

  20. Correlation Dimension-Based Classifier

    Czech Academy of Sciences Publication Activity Database

    Jiřina, Marcel; Jiřina jr., M.

    2014-01-01

    Roč. 44, č. 12 (2014), s. 2253-2263 ISSN 2168-2267 R&D Projects: GA MŠk(CZ) LG12020 Institutional support: RVO:67985807 Keywords : classifier * multidimensional data * correlation dimension * scaling exponent * polynomial expansion Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.469, year: 2014

  1. Clinicopathological features of histological transformation from extranodal marginal zone B-cell lymphoma of mucosa-associated lymphoid tissue to diffuse large B-cell lymphoma: an analysis of 467 patients.

    Science.gov (United States)

    Maeshima, Akiko Miyagi; Taniguchi, Hirokazu; Toyoda, Kosuke; Yamauchi, Nobuhiko; Makita, Shinichi; Fukuhara, Suguru; Munakata, Wataru; Maruyama, Dai; Kobayashi, Yukio; Tobinai, Kensei

    2016-09-01

    This study analysed incidence, patient outcome, immunophenotype and prognostic factors of histological transformation (HT) from extranodal marginal zone B-cell lymphoma of mucosa-associated lymphoid tissue (MALT lymphoma) to diffuse large B-cell lymphoma (DLBCL) in 467 patients (median age, 61 years). The primary sites of MALT lymphoma were the stomach (43%), ocular adnexa (25%), lung (8%), systemic (8%) and other tissues (16%). HT occurred in 8% of MALT lymphomas. Risk of HT by 15 years was 5%: 4% in limited-stage diseases (n = 385) and 16% in advanced-stage diseases (n = 56) (P = 0·02). The median time to HT was 48 months (range, 4-139). Five-year progression-free survival (PFS) and overall survival (OS) rates after HT were 80% and 94%, respectively. Immunohistochemical results of DLBCL were as follows: germinal centre B-cell (GCB)/non-GCB, 37%/63%; CD10, 9%; BCL6, 59%; MUM1, 38%; MYC, 42%; BCL2, 35%; Ki67 ≥ 90%, 23%; and CD5, 3%. The majority (75%, 9/12) of GCB-type DLBCLs exhibited CD10(-) , BCL6(+) and MUM1(-) immunophenotypes; the remainder had CD10(+) immunophenotypes. Multivariate analysis revealed that only advanced stage at HT was a significant adverse factor for PFS (P = 0·037). Thus, overall risk of HT was low and prognosis after HT was favourable; however, in advanced-stage cases, risk of HT was relatively high and prognosis was unfavourable. © 2016 John Wiley & Sons Ltd.

  2. The value of breast lumpectomy margin assessment as a predictor of residual tumor burden

    International Nuclear Information System (INIS)

    Wazer, David E.; Schmidt-Ullrich, Rupert K.; Schmid, Christopher H.; Ruthazer, Robin; Kramer, Bradley; Safaii, Homa; Graham, Roger

    1997-01-01

    Purpose: Margin assessment is commonly used as a guide to the relative aggressiveness of therapy for breast conserving treatment (BCT), though its value as a predictor of the presence, type, or extent of residual tumor has not been conclusively studied. Controversy continues to exist as to what constitutes a margin that is 'positive', 'close', or 'negative'. We attempt to address these issues through an analysis of re-excision specimens. Patients and Methods: As part of an institutional prospective practice approach for BCT, 265 cases with AJCC Stage I/II carcinoma with an initial excision margin that was ≤2 mm or indeterminate were subjected to re-excision. The probability of residual tumor (+RE) was evaluated with respect to tumor size, histopathologic subtype, relative closeness of the measured margin, the extent of margin positivity graded as focal, minimal, moderate, or extensive, and the extent of specimen processing as reflected in the number of cut sections per specimen volume (S:V ratio). The amount of residual tumor was graded as microscopic, small, medium, or large. The histopathologic subtype of tumor in the re-excision specimen was classified as having an invasive component (ICa) or pure DCIS (DCIS). Results: The primary excision margin was positive, >0≤1 mm, 1.1-2 mm, and indeterminate in 60%, 18%, 5%, and 17%, respectively. The predominant histopathologies in the initial excision specimens were invasive ductal (IDC) (50%) and tumors with an extensive intraductal component (EIC) (43%). The histopathology of the initial excision specimen was highly predictive of the histopathology of tumor found on re-excision, as residual DCIS was found in 60% of +RE specimens with initial histopathology of EIC compared to 26% for IDC (p 0.001). Neither the extent of margin positivity nor the extent of tumor in the re-excision were significantly related to the initial histopathologic subtype; however, a +RE was seen in 59% of EIC, 43% of IDC, and 32% of invasive

  3. Classified facilities for environmental protection

    International Nuclear Information System (INIS)

    Anon.

    1993-02-01

    The legislation on classified facilities governs most dangerous or polluting industries and fixed activities. It rests on the law of 9 July 1976 concerning facilities classified for environmental protection and its application decree of 21 September 1977. This legislation, whose general texts appear in this volume 1, aims to prevent all risks and harmful effects coming from an installation (air, water or soil pollution, wastes, even aesthetic damage). The polluting or dangerous activities are defined in a list called the nomenclature, which subjects facilities to either a declaration or an authorization procedure. The authorization is delivered by the prefect at the end of an open and adversarial procedure that includes a public inquiry. In addition, the facilities can be subjected to technical regulations fixed by the Environment Minister (volume 2) or by the prefect for facilities subject to declaration (volume 3). (A.B.)

  4. Energy-Efficient Neuromorphic Classifiers.

    Science.gov (United States)

    Martí, Daniel; Rigotti, Mattia; Seok, Mingoo; Fusi, Stefano

    2016-10-01

    Neuromorphic engineering combines the architectural and computational principles of systems neuroscience with semiconductor electronics, with the aim of building efficient and compact devices that mimic the synaptic and neural machinery of the brain. The energy consumption promised by neuromorphic engineering is extremely low, comparable to that of the nervous system. Until now, however, the neuromorphic approach has been restricted to relatively simple circuits and specialized functions, thereby obscuring a direct comparison of their energy consumption to that used by conventional von Neumann digital machines solving real-world tasks. Here we show that a recent technology developed by IBM can be leveraged to realize neuromorphic circuits that operate as classifiers of complex real-world stimuli. Specifically, we provide a set of general prescriptions to enable the practical implementation of neural architectures that compete with state-of-the-art classifiers. We also show that the energy consumption of these architectures, realized on the IBM chip, is typically two or more orders of magnitude lower than that of conventional digital machines implementing classifiers with comparable performance. Moreover, the spike-based dynamics display a trade-off between integration time and accuracy, which naturally translates into algorithms that can be flexibly deployed for either fast and approximate classifications, or more accurate classifications at the expense of longer running times and higher energy costs. This work finally proves that the neuromorphic approach can be efficiently used in real-world applications and has significant advantages over conventional digital devices when energy consumption is considered.

  5. Univariate decision tree induction using maximum margin classification

    OpenAIRE

    Yıldız, Olcay Taner

    2012-01-01

    In many pattern recognition applications, decision trees are used first due to their simplicity and easily interpretable nature. In this paper, we propose a new decision tree learning algorithm called univariate margin tree where, for each continuous attribute, the best split is found using convex optimization. Our simulation results on 47 data sets show that the novel margin tree classifier performs at least as well as C4.5 and linear discriminant tree (LDT) with a similar time complexity. F...
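The margin-maximizing univariate split described in this abstract can be illustrated with a toy sketch. This is my own simplification for separable data along one attribute: the paper finds the split via convex optimization, which is not reproduced here.

```python
def max_margin_split(values, labels):
    """Toy univariate maximum-margin split.

    Among thresholds lying between differently labelled neighbouring
    samples, pick the midpoint of the widest gap. Assumes the data are
    reasonably separable on this attribute; returns the threshold and
    its margin (half the gap width).
    """
    pairs = sorted(zip(values, labels))
    best_threshold, best_margin = None, -1.0
    for (x1, y1), (x2, y2) in zip(pairs, pairs[1:]):
        # Only boundaries between classes are split candidates.
        if y1 != y2 and (x2 - x1) / 2.0 > best_margin:
            best_margin = (x2 - x1) / 2.0
            best_threshold = (x1 + x2) / 2.0
    return best_threshold, best_margin

t, m = max_margin_split([1.0, 2.0, 6.0, 7.0], ["a", "a", "b", "b"])
print(t, m)  # threshold 4.0 with margin 2.0
```

A margin tree would apply such a search to every continuous attribute at every node and recurse on the resulting partitions.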

  6. Marginal kidney donor

    Directory of Open Access Journals (Sweden)

    Ganesh Gopalakrishnan

    2007-01-01

    Full Text Available Renal transplantation is the treatment of choice for a medically eligible patient with end-stage renal disease. The number of renal transplants has increased rapidly over the last two decades. However, the demand for organs has increased even more. This disparity between the availability of organs and the number of waitlisted patients has forced many transplant centers across the world to use marginal kidneys and donors. We performed a Medline search to establish the current status of marginal kidney donors in the world. Transplant programs using marginal deceased renal grafts are well established. The focus is now on efforts to improve their results. Utilization of non-heart-beating donors is still in a plateau phase and comprises a minor percentage of deceased donations. The main concerns are primary non-function of the renal graft and legal and ethical issues. Transplants with living donors outnumbered cadaveric transplants at many centers in the last decade. There has been an increased use of marginal living kidney donors with some acceptable medical risks. Our primary concern is the safety of the living donor. There is not enough scientific data available to quantify the risks involved in such donation. The definition of a marginal living donor is still not clear and there are no uniform recommendations. The decision must be tailored to each donor, who in turn should be actively involved at all levels of the decision-making process. In the current circumstances, our responsibility is very crucial in making decisions for either accepting or rejecting a marginal living donor.

  7. 76 FR 34761 - Classified National Security Information

    Science.gov (United States)

    2011-06-14

    ... MARINE MAMMAL COMMISSION Classified National Security Information [Directive 11-01] AGENCY: Marine... Commission's (MMC) policy on classified information, as directed by Information Security Oversight Office... of Executive Order 13526, ``Classified National Security Information,'' and 32 CFR part 2001...

  8. From Borders to Margins

    DEFF Research Database (Denmark)

    Parker, Noel

    2009-01-01

This article draws upon Deleuze's philosophy to set out an ontology in which the continual reformulation of entities in play in ‘post-international' society can be grasped. This entails a strategic shift from speaking about the ‘borders' between sovereign states to referring instead to the ‘margins' between a plethora of entities that are ever open to identity shifts. The concept of the margin possesses a much wider reach than borders, and focuses continual attention on the meetings and interactions between a range of indeterminate entities whose interactions may determine both themselves and the types of entity…

  9. Methylation patterns in marginal zone lymphoma.

    Science.gov (United States)

    Arribas, Alberto J; Bertoni, Francesco

    Promoter DNA methylation is a major regulator of gene expression and transcription. The identification of methylation changes is important for understanding disease pathogenesis and identifying prognostic markers, and can drive novel therapeutic approaches. In this review we summarize the current knowledge regarding DNA methylation in MALT lymphoma, splenic marginal zone lymphoma, and nodal marginal zone lymphoma. Despite important differences in study design across publications and the existence of only a single large, genome-wide methylation study for splenic marginal zone lymphoma, it is clear that DNA methylation plays an important role in marginal zone lymphomas, in which it contributes to the inactivation of tumor suppressors but also to the expression of genes sustaining tumor cell survival and proliferation. Existing preclinical data provide the rationale to target the methylation machinery in these disorders. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Limitations of ''margin'' in qualification tests

    International Nuclear Information System (INIS)

    Clough, R.L.; Gillen, K.T.

    1984-01-01

    We have carried out investigations of polymer radiation degradation behaviors which have brought to light a number of reasons why this concept of margin can break down. First of all, we have found that dose-rate effects vary greatly in magnitude. Thus, based on high dose-rate testing, poor materials with large dose-rate effects may be selected over better materials with small effects. Also, in certain cases, material properties have been found to level out (as with PVC) or reverse trend (as with buna-n) at high doses, so that ''margin'' may be ineffective, misleading, or counterproductive. For Viton, the material properties were found to change in opposite directions at high and low dose rates, making ''margin'' inappropriate. The underlying problem with the concept of ''margin'' is that differences in aging conditions can lead to fundamental differences in degradation mechanisms.

  11. Splenic marginal zone lymphoma.

    Science.gov (United States)

    Piris, Miguel A; Onaindía, Arantza; Mollejo, Manuela

    Splenic marginal zone lymphoma (SMZL) is an indolent small B-cell lymphoma involving the spleen and bone marrow, characterized by a micronodular tumoral infiltration that replaces the preexisting lymphoid follicles and shows marginal zone differentiation as a distinctive finding. SMZL cases are characterized by prominent splenomegaly and bone marrow and peripheral blood infiltration. Cells in peripheral blood show a villous cytology. The characteristic bone marrow and peripheral blood features usually allow a diagnosis of SMZL to be made. The mutational spectrum of SMZL includes specific findings, such as 7q loss and NOTCH2 and KLF2 mutations, both genes being related to marginal zone differentiation. There is a striking clinical variability in SMZL cases, dependent on tumoral load and performance status. Specific molecular markers such as 7q loss, p53 loss/mutation, and NOTCH2 and KLF2 mutations have been found to be associated with this clinical variability. Distinction from monoclonal B-cell lymphocytosis with a marginal zone phenotype is still an open issue that requires identification of precise and specific thresholds with clinical meaning. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Komorbiditet ved marginal parodontitis

    DEFF Research Database (Denmark)

    Holmstrup, Palle; Damgaard, Christian; Olsen, Ingar

    2017-01-01

    This article presents an overview of the most important available knowledge on the association between marginal periodontitis and a number of medical diseases, including cardiovascular disease, diabetes mellitus, rheumatoid arthritis, osteoporosis, Parkinson's disease, Alzheimer's disease, psoriasis and…

  13. Marginally Deformed Starobinsky Gravity

    DEFF Research Database (Denmark)

    Codello, A.; Joergensen, J.; Sannino, Francesco

    2015-01-01

    We show that quantum-induced marginal deformations of the Starobinsky gravitational action of the form $R^{2(1-\alpha)}$, with $R$ the Ricci scalar and $\alpha$ a positive parameter smaller than one half, can account for the recent experimental observations by BICEP2 of primordial tensor modes…

  14. Deep continental margin reflectors

    Science.gov (United States)

    Ewing, J.; Heirtzler, J.; Purdy, M.; Klitgord, Kim D.

    1985-01-01

    In contrast to the rarity of such observations a decade ago, seismic reflecting and refracting horizons are now being observed to Moho depths under continental shelves in a number of places. These observations provide knowledge of the entire crustal thickness from the shoreline to the oceanic crust on passive margins and supplement Consortium for Continental Reflection Profiling (COCORP)-type measurements on land.

  15. Marginalization and School Nursing

    Science.gov (United States)

    Smith, Julia Ann

    2004-01-01

    The concept of marginalization was first analyzed by nursing researchers Hall, Stevens, and Meleis. Although nursing literature frequently refers to this concept when addressing "at risk" groups such as the homeless, gays and lesbians, and those infected with HIV/AIDS, the concept can also be applied to nursing. Analysis of current school nursing…

  16. A Supervised Multiclass Classifier for an Autocoding System

    Directory of Open Access Journals (Sweden)

    Yukako Toko

    2017-11-01

    Full Text Available Classification is often required in various contexts, including in the field of official statistics. In a previous study, we developed a multiclass classifier that can classify short text descriptions with high accuracy. The algorithm borrows the concept of the naïve Bayes classifier and is so simple that its structure is easily understandable. The proposed classifier has the following two advantages. First, the processing times for both learning and classifying are extremely practical. Second, the proposed classifier yields high-accuracy results for a large portion of a dataset. We previously developed an autocoding system for the Family Income and Expenditure Survey in Japan around this better-performing classifier. While the original system was developed in Perl in order to improve the efficiency of the coding process for short Japanese texts, the proposed system is implemented in the R programming language in order to explore versatility, and is modified to make the system easily applicable to English text descriptions, in consideration of the increasing number of R users in the field of official statistics. We are planning to publish the proposed classifier as an R package. The proposed classifier would be generally applicable to other classification tasks, including coding activities in the field of official statistics, and would contribute greatly to improving their efficiency.
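The abstract says the autocoder borrows the concept of the naïve Bayes classifier for short text descriptions. A minimal multinomial naïve Bayes coder conveys that concept; this sketch is illustrative only (the category labels, tokenization, and smoothing choices are mine, not the published system's):

```python
import math
from collections import Counter, defaultdict

class NaiveBayesCoder:
    """Minimal multinomial naive Bayes for short text descriptions."""

    def __init__(self, alpha=1.0):
        self.alpha = alpha                       # Laplace smoothing
        self.word_counts = defaultdict(Counter)  # per-class word counts
        self.class_counts = Counter()            # documents per class
        self.vocab = set()

    def fit(self, texts, labels):
        for text, label in zip(texts, labels):
            self.class_counts[label] += 1
            for word in text.lower().split():
                self.word_counts[label][word] += 1
                self.vocab.add(word)
        return self

    def predict(self, text):
        words = text.lower().split()
        total_docs = sum(self.class_counts.values())
        best_label, best_score = None, float("-inf")
        for label, doc_count in self.class_counts.items():
            score = math.log(doc_count / total_docs)  # log prior
            n_words = sum(self.word_counts[label].values())
            denom = n_words + self.alpha * len(self.vocab)
            for word in words:
                num = self.word_counts[label][word] + self.alpha
                score += math.log(num / denom)        # log likelihood
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Hypothetical expenditure descriptions and codes.
coder = NaiveBayesCoder().fit(
    ["fresh apples", "frozen fish", "bus fare", "train ticket"],
    ["food", "food", "transport", "transport"],
)
print(coder.predict("apples and fish"))  # -> food
```

The appeal noted in the abstract follows directly: training and classification are single passes over the data, and the learned structure (word counts per class) is easy to inspect.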

  17. Waste classifying and separation device

    International Nuclear Information System (INIS)

    Kakiuchi, Hiroki.

    1997-01-01

    A flexible plastic bag containing solid wastes of indefinite shape is broken and the wastes are classified. The bag-cutting portion of the device has an ultrasonic-type or a heater-type cutting means, and the cutting means moves in parallel with the transfer direction of the plastic bags. A classification portion separates the plastic bag from its contents and conducts classification while rotating a classification table. Accordingly, a plastic bag containing solids of indefinite shape can be broken and classification can be conducted efficiently and reliably. The device of the present invention has a simple structure which requires little installation space and enables easy maintenance. (T.M.)

  18. A quantitative analysis of transtensional margin width

    Science.gov (United States)

    Jeanniot, Ludovic; Buiter, Susanne J. H.

    2018-06-01

    Continental rifted margins show variations between a few hundred and almost a thousand kilometres in their conjugate widths from the relatively undisturbed continent to the oceanic crust. Analogue and numerical modelling results suggest that the conjugate width of rifted margins may be related to their obliquity of divergence, with narrower margins occurring for higher obliquity. We here test this prediction by analysing the obliquity and rift width for 26 segments of transtensional conjugate rifted margins in the Atlantic and Indian Oceans. We use the plate reconstruction software GPlates (http://www.gplates.org) with different plate rotation models to estimate the direction and magnitude of rifting from the initial phases of continental rifting until breakup. Our rift width corresponds to the distance between the onshore maximum topography and the last identified continental crust. We find a weak positive correlation between the obliquity of rifting and rift width. Highly oblique margins are narrower than orthogonal margins, as expected from analogue and numerical models. We find no relationship between rift obliquity and either rift duration or the presence or absence of Large Igneous Provinces (LIPs).

  19. Middlemen Margins and Globalization

    OpenAIRE

    Pranab Bardhan; Dilip Mookherjee; Masatoshi Tsumagari

    2013-01-01

    We develop a theory of trading middlemen or entrepreneurs who perform financing, quality supervision and marketing roles for goods produced by suppliers or workers. Brand-name reputations are necessary to overcome product quality moral hazard problems; middlemen margins represent reputational incentive rents. We develop a two sector North-South model of competitive equilibrium, with endogenous sorting of agents with heterogenous entrepreneurial abilities into sectors and occupations. The Sout...

  20. Containment safety margins

    International Nuclear Information System (INIS)

    Von Riesemann, W.A.

    1980-01-01

    The objective of the Containment Safety Margins program is the development and verification of methodologies capable of reliably predicting the ultimate load-carrying capability of light water reactor containment structures under accident and severe environments. The program was initiated in June 1980 at Sandia, and this paper addresses the first phase of the program, which is essentially a planning effort. Brief comments are made about the second phase, which will involve testing of containment models.

  1. Marginalized Youth. An Introduction.

    OpenAIRE

    Kessl, Fabian; Otto, Hans-Uwe

    2009-01-01

    The life conduct of marginalized groups has become subject to increasing levels of risk in advanced capitalist societies. In particular, children and young people are confronted with the harsh consequences of a “new poverty” in the contemporary era. The demographic complexion of today’s poverty is youthful, as a number of government reports have once again documented in recent years in Australia, Germany, France, Great Britain, the US or Scandinavian countries. Key youth studies have shown a ...

  2. Using Neural Networks to Classify Digitized Images of Galaxies

    Science.gov (United States)

    Goderya, S. N.; McGuire, P. C.

    2000-12-01

    Automated classification of galaxies into Hubble types is of paramount importance for studying the large-scale structure of the Universe, particularly as survey projects like the Sloan Digital Sky Survey complete their data acquisition of one million galaxies. At present it is not possible to find robust and efficient artificial-intelligence-based galaxy classifiers. In this study we summarize progress made in the development of automated galaxy classifiers using neural networks as machine learning tools. We explore the Bayesian linear algorithm, the higher order probabilistic network, the multilayer perceptron neural network and the Support Vector Machine classifier. The performance of any machine classifier is dependent on the quality of the parameters that characterize the different groups of galaxies. Our effort is to develop geometric and invariant moment based parameters as input to the machine classifiers instead of the raw pixel data. Such an approach reduces the dimensionality of the classifier considerably, removes the effects of scaling and rotation, and makes it easier to solve for the unknown parameters in the galaxy classifier. To judge the quality of training and classification we develop the concept of Mathews coefficients for the galaxy classification community. Mathews coefficients are single numbers that quantify classifier performance even with unequal prior probabilities of the classes.
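The "Mathews coefficients" proposed above, in the binary case, correspond to the familiar Matthews correlation coefficient: a single number that remains meaningful under unequal class priors. A small sketch of that standard quantity (my own paraphrase, not the authors' code):

```python
import math

def matthews_coefficient(y_true, y_pred):
    """Matthews correlation coefficient for a binary classifier.

    Returns +1 for perfect prediction, 0 for chance-level performance
    and -1 for total disagreement; unlike accuracy, it is robust to
    unequal prior probabilities of the classes.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

print(matthews_coefficient([1, 1, 0, 0], [1, 1, 0, 0]))  # 1.0
print(matthews_coefficient([1, 1, 0, 0], [0, 0, 1, 1]))  # -1.0
```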

  3. Mercury⊕: An evidential reasoning image classifier

    Science.gov (United States)

    Peddle, Derek R.

    1995-12-01

    MERCURY⊕ is a multisource evidential reasoning classification software system based on the Dempster-Shafer theory of evidence. The design and implementation of this software package are described for improving the classification and analysis of the multisource digital image data necessary for addressing advanced environmental and geoscience applications. In the remote-sensing context, the approach provides a more appropriate framework for classifying modern, multisource and ancillary data sets which may contain a large number of disparate variables with different statistical properties, scales of measurement, and levels of error which cannot be handled using conventional Bayesian approaches. The software uses a nonparametric, supervised approach to classification, and provides a more objective and flexible interface to the evidential reasoning framework using a frequency-based method for computing support values from training data. The MERCURY⊕ software package has been implemented efficiently in the C programming language, with extensive use made of dynamic memory allocation procedures and compound linked list and hash-table data structures to optimize the storage and retrieval of evidence in a Knowledge Look-up Table. The software is complete with a full user interface and runs under the Unix, Ultrix, VAX/VMS, MS-DOS, and Apple Macintosh operating systems. An example of classifying alpine land cover and permafrost active layer depth in northern Canada is presented to illustrate the use and application of these ideas.
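MERCURY⊕ itself is not specified here, but the Dempster-Shafer combination rule it rests on is standard. A minimal sketch of combining two mass functions over a frame of discernment (the land-cover labels and mass values are invented for illustration; this is not the MERCURY⊕ implementation):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two Dempster-Shafer mass functions.

    m1, m2: dicts mapping frozensets of hypotheses to masses that sum
    to 1. Returns the combined mass function, renormalized to discard
    the mass assigned to conflicting (empty-intersection) evidence.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # evidence supporting disjoint sets
    if conflict >= 1.0:
        raise ValueError("sources are completely conflicting")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two image-data sources giving evidence on hypothetical cover classes.
forest, tundra = frozenset({"forest"}), frozenset({"tundra"})
either = forest | tundra  # ignorance: mass on the whole frame
m1 = {forest: 0.6, either: 0.4}
m2 = {forest: 0.5, tundra: 0.3, either: 0.2}
result = dempster_combine(m1, m2)
```

Unlike a Bayesian prior, the mass on `either` expresses ignorance directly, which is why the framework suits disparate sources with different levels of error.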

  4. Marginal Models for Categorial Data

    NARCIS (Netherlands)

    Bergsma, W.P.; Rudas, T.

    2002-01-01

    Statistical models defined by imposing restrictions on marginal distributions of contingency tables have received considerable attention recently. This paper introduces a general definition of marginal log-linear parameters and describes conditions for a marginal log-linear parameter to be a smooth

  5. Masculinity at the margins

    DEFF Research Database (Denmark)

    Jensen, Sune Qvotrup

    2010-01-01

    and other types of material. Taking the concepts of othering, intersectionality and marginality as points of departure, the article analyses how these young men experience othering and how they react to it. One type of reaction, described as stylization, relies on accentuating the latently positive symbolic… of critique, although in a masculinist way. These reactions to othering represent a challenge to researchers interested in intersectionality and gender, because gender is reproduced as a hierarchical form of social differentiation at the same time as racism is both reproduced and resisted.

  6. Classifying prion and prion-like phenomena.

    Science.gov (United States)

    Harbi, Djamel; Harrison, Paul M

    2014-01-01

    The universe of prion and prion-like phenomena has expanded significantly in the past several years. Here, we overview the challenges in classifying this data informatically, given that terms such as "prion-like", "prion-related" or "prion-forming" do not have a stable meaning in the scientific literature. We examine the spectrum of proteins that have been described in the literature as forming prions, and discuss how "prion" can have a range of meaning, with a strict definition being for demonstration of infection with in vitro-derived recombinant prions. We suggest that although prion/prion-like phenomena can largely be apportioned into a small number of broad groups dependent on the type of transmissibility evidence for them, as new phenomena are discovered in the coming years, a detailed ontological approach might be necessary that allows for subtle definition of different "flavors" of prion / prion-like phenomena.

  7. Composite Classifiers for Automatic Target Recognition

    National Research Council Canada - National Science Library

    Wang, Lin-Cheng

    1998-01-01

    ...) using forward-looking infrared (FLIR) imagery. Two existing classifiers, one based on learning vector quantization and the other on modular neural networks, are used as the building blocks for our composite classifiers...

  8. Aggregation Operator Based Fuzzy Pattern Classifier Design

    DEFF Research Database (Denmark)

    Mönks, Uwe; Larsen, Henrik Legind; Lohweg, Volker

    2009-01-01

    This paper presents a novel modular fuzzy pattern classifier design framework for intelligent automation systems, developed on the basis of the established Modified Fuzzy Pattern Classifier (MFPC), which allows designing novel classifier models that are hardware-efficiently implementable. … The performances of novel classifiers using substitutes for the MFPC's geometric mean aggregator are benchmarked in the scope of an image processing application against the MFPC to reveal classification improvement potentials for obtaining higher classification rates.

  9. 15 CFR 4.8 - Classified Information.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Classified Information. 4.8 Section 4... INFORMATION Freedom of Information Act § 4.8 Classified Information. In processing a request for information..., the information shall be reviewed to determine whether it should remain classified. Ordinarily the...

  10. The role of deep-water sedimentary processes in shaping a continental margin: The Northwest Atlantic

    Science.gov (United States)

    Mosher, David C.; Campbell, D.C.; Gardner, J.V.; Piper, D.J.W.; Chaytor, Jason; Rebesco, M.

    2017-01-01

    The tectonic history of a margin dictates its general shape; however, its geomorphology is generally transformed by deep-sea sedimentary processes. The objective of this study is to show the influences of turbidity currents, contour currents and sediment mass failures on the geomorphology of the deep-water northwestern Atlantic margin (NWAM) between Blake Ridge and Hudson Trough, spanning about 32° of latitude and extending from the shelf edge to the abyssal plain. This assessment is based on new multibeam echosounder data, global bathymetric models and sub-surface geophysical information. The deep-water NWAM is divided into four broad geomorphologic classifications based on their bathymetric shape: graded, above-grade, stepped and out-of-grade. These shapes were created as a function of the balance between sediment accumulation and removal, which in turn was related to sedimentary processes and slope accommodation. This descriptive method of classifying continental margins, while non-interpretative, is more informative than the conventional continental shelf, slope and rise classification, and better facilitates interpretation concerning dominant sedimentary processes. Areas of the margin dominated by turbidity currents and slope bypass developed graded slopes. If sediments did not bypass the slope, due to accommodation, then an above-grade or stepped slope resulted. Geostrophic currents created sedimentary bodies of a variety of forms and positions along the NWAM. Detached drifts form linear, above-grade slopes along their crests from the shelf edge to the deep basin. Plastered drifts formed stepped slope profiles. Sediment mass failure has had a variety of consequences for margin morphology: large mass failures created out-of-grade profiles, whereas smaller mass failures tended to remain on the slope and formed above-grade profiles at trough-mouth fans, or nearly graded profiles, such as offshore Cape Fear.

  11. Robust Combining of Disparate Classifiers Through Order Statistics

    Science.gov (United States)

    Tumer, Kagan; Ghosh, Joydeep

    2001-01-01

    Integrating the outputs of multiple classifiers via combiners or meta-learners has led to substantial improvements in several difficult pattern recognition problems. In this article we investigate a family of combiners based on order statistics, for robust handling of situations where there are large discrepancies in performance of individual classifiers. Based on a mathematical modeling of how the decision boundaries are affected by order statistic combiners, we derive expressions for the reductions in error expected when simple output combination methods based on the median, the maximum and, in general, the ith order statistic, are used. Furthermore, we analyze the trim and spread combiners, both based on linear combinations of the ordered classifier outputs, and show that in the presence of uneven classifier performance, they often provide substantial gains over both linear and simple order statistics combiners. Experimental results on both real-world data and standard public-domain data sets corroborate these findings.
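    The order-statistic combiners described above (median, maximum, ith order statistic, and the trimmed variant) are easy to state concretely. The sketch below is our illustration, not the authors' code, and the per-class scores are invented:

```python
import numpy as np

def order_statistic_combine(outputs, stat="median", trim=0):
    """Combine classifier outputs via order statistics.

    outputs: array (n_classifiers, n_classes) of per-class scores.
    stat:    "median", "max", or "trim" (trimmed mean dropping the
             `trim` lowest and highest ordered outputs per class).
    Returns the predicted class index.
    """
    ordered = np.sort(outputs, axis=0)  # order statistics, per class
    if stat == "median":
        combined = np.median(outputs, axis=0)
    elif stat == "max":
        combined = ordered[-1]
    elif stat == "trim":
        kept = ordered[trim:len(outputs) - trim]
        combined = kept.mean(axis=0)
    else:
        raise ValueError(stat)
    return int(np.argmax(combined))

# Three classifiers scoring two classes; the third is badly off.
scores = np.array([[0.9, 0.1],
                   [0.8, 0.2],
                   [0.1, 0.9]])
print(order_statistic_combine(scores, "median"))  # -> 0, robust to the outlier
```

    The median and trimmed combiners discard the extreme outputs, which is exactly the robustness to uneven classifier performance that the article analyzes.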

  12. Preoperative imaging and surgical margins in maxillectomy patients

    NARCIS (Netherlands)

    Kreeft, Anne Marijn; Smeele, Ludwig E.; Rasch, Coen R. N.; Hauptmann, Michael; Rietveld, Derk H. F.; Leemans, C. René; Balm, Alfons J. M.

    2012-01-01

    Background High rates of positive surgical margins are reported after a maxillectomy. A large part of tumors that are preoperatively considered operable can thus not be resected with tumor-free margins. Methods This was a retrospective study on medical files of 69 patients who underwent

  13. Analysis of safety margins for PuO2 containers

    International Nuclear Information System (INIS)

    Hubert, P.; Tomachevsky, E.

    1987-11-01

    Containers used for PuO2 transport are normally of type B(U) and satisfy the IAEA tests. However, the robustness of this container design, together with the analysis of other radioactive material containers, suggests that large safety margins exist. In this paper, the magnitude and nature of these margins are studied [fr]

  14. A Politics of Marginability

    DEFF Research Database (Denmark)

    Pallesen, Cecil Marie

    2015-01-01

    In the end of the 19th century, Indians began settling in East Africa. Most of them left Gujarat because of drought and famine, and they were in search for business opportunities and a more comfortable life. Within the following decades, many of them went from being small-scale entrepreneurs to big... always been contested and to some extent vulnerable. However, the Indian communities are strong socially and economically, and the vast majority of its people have great international networks and several potential plans or strategies for the future, should the political climate in Tanzania become hostile towards them. I argue that this migrant group is unique being marginalized and strong at the same time, and I explain this uniqueness by several features in the Indian migrants’ cultural and religious background, in colonial and post-colonial Tanzania, and in the Indians’ role as middlemen between...

  15. The Need for Consensus and Transparency in Assessing Population-Based Rates of Positive Circumferential Radial Margins in Rectal Cancer: Data from Consecutive Cases in a Large Region of Ontario, Canada.

    Science.gov (United States)

    Keng, Christine; Coates, Angela; Grubac, Vanja; Lovrics, Peter; DeNardi, Franco; Thabane, Lehana; Simunovic, Marko

    2016-02-01

    A positive circumferential radial margin (CRM) after rectal cancer surgery is an important predictor of local recurrence. The definition of a positive CRM differs internationally, and reported rates vary greatly in the literature. This study used time-series population-based data to assess positive CRM rates in a region over time and to inform future methods of CRM analysis in a defined geographic area. Chart reviews provided relevant data from consecutive patients undergoing rectal cancer surgery between 2006 and 2012 in all hospitals of the authors' region. Outcomes included rates for pathologic examination of CRM, CRM distance reporting, and positive CRM. The rate of positive CRM was calculated using various definitions. The variations included positive margin cutoffs of CRM at 1 mm or less versus 2 mm or less and inclusion or exclusion of cases without CRM assessment. In this study, 1222 consecutive rectal cancer cases were analyzed. The rate for pathology reporting of CRM distance increased from 54.7 to 93.2 % during the study. Depending on how the rate of positive CRM was defined, its value varied from 8.5 to 19.4 % in 2006 and from 6.0 to 12.5 % in 2012. Using a pre-specified definition, the rate of positive CRM decreased over time from 14.0 to 6.3 %. A marked increase in CRM distance reporting was observed, whereas the rates of positive CRM dropped, suggesting improved pathologist and surgeon performance over time. Changing definitions greatly influenced the rates of positive CRM, indicating the need for more transparency when such population-based rates are reported in the literature.

  16. Tectonic signatures on active margins

    Science.gov (United States)

    Hogarth, Leah Jolynn

    High-resolution Compressed High-Intensity Radar Pulse (CHIRP) surveys offshore of La Jolla in southern California and the Eel River in northern California provide the opportunity to investigate the role of tectonics in the formation of stratigraphic architecture and margin morphology. Both study sites are characterized by shore-parallel tectonic deformation, which is largely observed in the structure of the prominent angular unconformity interpreted as the transgressive surface. Based on stratal geometry and acoustic character, we identify three sedimentary sequences offshore of La Jolla: an acoustically laminated estuarine unit deposited during early transgression, an infilling or "healing-phase" unit formed during the transgression, and an upper transparent unit. The estuarine unit is confined to the canyon edges in what may have been embayments during the last sea-level rise. The healing-phase unit appears to infill rough areas on the transgressive surface that may be related to relict fault structures. The upper transparent unit is largely controlled by long-wavelength tectonic deformation due to the Rose Canyon Fault. This unit is also characterized by a mid-shelf (~40 m water depth) thickness high, which is likely a result of hydrodynamic forces and sediment grain size. On the Eel margin, we observe three distinct facies: a seaward-thinning unit truncated by the transgressive surface, a healing-phase unit confined to the edges of a broad structural high, and a highly laminated upper unit. The seaward-thinning wedge of sediment below the transgressive surface is marked by a number of channels that we interpret as distributary channels based on their morphology. Regional divergence of the sequence boundary and transgressive surface with up to ~8 m of sediment preserved across the interfluves suggests the formation of subaerial accommodation during the lowstand. The healing-phase, much like that in southern California, appears to infill rough areas in the

  17. Jatropha potential on marginal land in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    The increasing trend of land acquisition for biofuels has led to the widespread debate about food versus biofuel because of the perceived competition for land and water. To avoid the food versus fuel debate, the use of “marginal” land for biofuel feedstock production (jatropha) has emerged as a dominant narrative. But both the availability and suitability of “marginal” land for commercial level jatropha production is not well understood/examined, especially in Africa. Using a case study of large-scale jatropha plantation in Ethiopia, this paper examines the process of land identification for jatropha... investments, and the agronomic performance of large-scale jatropha plantation on so-called marginal land. Although it has been argued that jatropha can be grown well on marginal land without irrigation, and thus does not compete for land and water or displace food production from agricultural land, this study...

  18. Oceanography of marginal seas

    Digital Repository Service at National Institute of Oceanography (India)

    DileepKumar, M.

    in the first two shallow seas are driven by surface densification following evaporation that in the latter is largely influenced by freshwater discharge from Irrawaddy and inflows across the Andaman Ridge from east Bay of Bengal. Biological productivity...

  19. Exploring diversity in ensemble classification: Applications in large area land cover mapping

    Science.gov (United States)

    Mellor, Andrew; Boukir, Samia

    2017-07-01

    Ensemble classifiers, such as random forests, are now commonly applied in the field of remote sensing, and have been shown to perform better than single classifier systems, resulting in reduced generalisation error. Diversity across the members of ensemble classifiers is known to have a strong influence on classification performance - whereby classifier errors are uncorrelated and more uniformly distributed across ensemble members. The relationship between ensemble diversity and classification performance has not yet been fully explored in the fields of information science and machine learning and has never been examined in the field of remote sensing. This study is a novel exploration of ensemble diversity and its link to classification performance, applied to a multi-class canopy cover classification problem using random forests and multisource remote sensing and ancillary GIS data, across seven million hectares of diverse dry-sclerophyll-dominated public forests in Victoria, Australia. A particular emphasis is placed on analysing the relationship between ensemble diversity and ensemble margin - two key concepts in ensemble learning. The main novelty of our work is on boosting diversity by emphasizing the contribution of lower margin instances used in the learning process. Exploring the influence of tree pruning on diversity is also a new empirical analysis that contributes to a better understanding of ensemble performance. Results reveal insights into the trade-off between ensemble classification accuracy and diversity, and through the ensemble margin, demonstrate how inducing diversity by targeting lower margin training samples is a means of achieving better classifier performance for more difficult or rarer classes and reducing information redundancy in classification problems. Our findings inform strategies for collecting training data and designing and parameterising ensemble classifiers, such as random forests. This is particularly important in large area
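    The ensemble margin referred to above has a standard definition in the ensemble-learning literature: the fraction of ensemble members voting for the true class minus the largest fraction voting for any other class. A minimal sketch (our illustration with made-up votes, not the authors' pipeline):

```python
import numpy as np

def ensemble_margin(votes, true_label, n_classes):
    """Ensemble margin of one training instance, in [-1, 1].
    Low or negative margins flag the 'difficult' instances that
    the study targets in order to boost diversity."""
    frac = np.bincount(votes, minlength=n_classes) / len(votes)
    v_true = frac[true_label]
    frac[true_label] = -1.0            # exclude the true class
    return v_true - frac.max()

# ten trees vote on a 3-class problem; the true class is 0
votes = np.array([0, 0, 0, 0, 0, 0, 1, 1, 2, 2])
margin = ensemble_margin(votes, 0, 3)   # 0.6 - 0.2 = 0.4
```

    Ranking training samples by this margin is one way to identify the lower-margin instances whose contribution the paper emphasizes.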

  20. Workers' marginal costs of commuting

    DEFF Research Database (Denmark)

    van Ommeren, Jos; Fosgerau, Mogens

    2009-01-01

    This paper applies a dynamic search model to estimate workers' marginal costs of commuting, including monetary and time costs. Using data on workers' job search activity as well as moving behaviour, for the Netherlands, we provide evidence that, on average, workers' marginal costs of one hour...

  1. Margin improvement initiatives: realistic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Chan, P.K.; Paquette, S. [Royal Military College of Canada, Chemistry and Chemical Engineering Dept., Kingston, ON (Canada); Cunning, T.A. [Department of National Defence, Ottawa, ON (Canada); French, C.; Bonin, H.W. [Royal Military College of Canada, Chemistry and Chemical Engineering Dept., Kingston, ON (Canada); Pandey, M. [Univ. of Waterloo, Waterloo, ON (Canada); Murchie, M. [Cameco Fuel Manufacturing, Port Hope, ON (Canada)

    2014-07-01

    With reactor core aging, safety margins are particularly tight. Two realistic and practical approaches are proposed here to recover margins. The first project is related to the use of a small amount of neutron absorbers in CANDU Natural Uranium (NU) fuel bundles. Preliminary results indicate that the fuelling transient and subsequent reactivity peak can be lowered to improve the reactor's operating margins, with minimal impact on burnup when less than 1000 mg of absorbers is added to a fuel bundle. The second project involves the statistical analysis of fuel manufacturing data to demonstrate safety margins. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to generate input for ELESTRES and ELOCA. It is found that the fuel response distributions are far below industrial failure limits, implying that margin exists in the current fuel design. (author)

  2. Comparing classifiers for pronunciation error detection

    NARCIS (Netherlands)

    Strik, H.; Truong, K.; Wet, F. de; Cucchiarini, C.

    2007-01-01

    Providing feedback on pronunciation errors in computer assisted language learning systems requires that pronunciation errors be detected automatically. In the present study we compare four types of classifiers that can be used for this purpose: two acoustic-phonetic classifiers (one of which employs

  3. Feature extraction for dynamic integration of classifiers

    NARCIS (Netherlands)

    Pechenizkiy, M.; Tsymbal, A.; Puuronen, S.; Patterson, D.W.

    2007-01-01

    Recent research has shown the integration of multiple classifiers to be one of the most important directions in machine learning and data mining. In this paper, we present an algorithm for the dynamic integration of classifiers in the space of extracted features (FEDIC). It is based on the technique

  4. Deconvolution When Classifying Noisy Data Involving Transformations

    KAUST Repository

    Carroll, Raymond

    2012-09-01

    In the present study, we consider the problem of classifying spatial data distorted by a linear transformation or convolution and contaminated by additive random noise. In this setting, we show that classifier performance can be improved if we carefully invert the data before the classifier is applied. However, the inverse transformation is not constructed so as to recover the original signal, and in fact, we show that taking the latter approach is generally inadvisable. We introduce a fully data-driven procedure based on cross-validation, and use several classifiers to illustrate numerical properties of our approach. Theoretical arguments are given in support of our claims. Our procedure is applied to data generated by light detection and ranging (Lidar) technology, where we improve on earlier approaches to classifying aerosols. This article has supplementary materials online.
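    As one concrete illustration of "carefully inverting the data before the classifier is applied", a Tikhonov-regularised Fourier-domain deconvolution is a standard choice; the kernel, noise level and regularisation weight below are invented for the sketch (the paper chooses its inversion fully data-driven, by cross-validation):

```python
import numpy as np

def regularised_deconvolve(y, h, rho):
    """Invert a circular convolution y = h * x + noise in the Fourier
    domain with Tikhonov damping; rho trades noise amplification
    against residual blur (chosen by cross-validation in practice)."""
    H = np.fft.fft(h, len(y))
    X = np.conj(H) * np.fft.fft(y) / (np.abs(H) ** 2 + rho)
    return np.real(np.fft.ifft(X))

rng = np.random.default_rng(0)
x = np.zeros(64)
x[20:28] = 1.0                               # sharp 'signal' to recover
h = np.ones(8) / 8.0                         # blurring kernel
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, 64)))
y += rng.normal(0.0, 0.01, 64)               # additive noise
x_hat = regularised_deconvolve(y, h, rho=0.01)
# x_hat lies closer to x than the blurred observation y does
```

    A classifier would then be applied to x_hat rather than y; note the paper's point that the best inversion for classification need not be the one that best recovers the original signal.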

  5. Deconvolution When Classifying Noisy Data Involving Transformations.

    Science.gov (United States)

    Carroll, Raymond; Delaigle, Aurore; Hall, Peter

    2012-09-01

    In the present study, we consider the problem of classifying spatial data distorted by a linear transformation or convolution and contaminated by additive random noise. In this setting, we show that classifier performance can be improved if we carefully invert the data before the classifier is applied. However, the inverse transformation is not constructed so as to recover the original signal, and in fact, we show that taking the latter approach is generally inadvisable. We introduce a fully data-driven procedure based on cross-validation, and use several classifiers to illustrate numerical properties of our approach. Theoretical arguments are given in support of our claims. Our procedure is applied to data generated by light detection and ranging (Lidar) technology, where we improve on earlier approaches to classifying aerosols. This article has supplementary materials online.

  6. Deconvolution When Classifying Noisy Data Involving Transformations

    KAUST Repository

    Carroll, Raymond; Delaigle, Aurore; Hall, Peter

    2012-01-01

    In the present study, we consider the problem of classifying spatial data distorted by a linear transformation or convolution and contaminated by additive random noise. In this setting, we show that classifier performance can be improved if we carefully invert the data before the classifier is applied. However, the inverse transformation is not constructed so as to recover the original signal, and in fact, we show that taking the latter approach is generally inadvisable. We introduce a fully data-driven procedure based on cross-validation, and use several classifiers to illustrate numerical properties of our approach. Theoretical arguments are given in support of our claims. Our procedure is applied to data generated by light detection and ranging (Lidar) technology, where we improve on earlier approaches to classifying aerosols. This article has supplementary materials online.

  7. The Seismicity of Two Hyperextended Margins

    Science.gov (United States)

    Redfield, Tim; Terje Osmundsen, Per

    2013-04-01

    A seismic belt marks the outermost edge of Scandinavia's proximal margin, inboard of and roughly parallel to the Taper Break. A similar near- to onshore seismic belt runs along its inner edge, roughly parallel to and outboard of the asymmetric, seaward-facing escarpment. The belts converge at both the northern and southern ends of Scandinavia, where crustal taper is sharp and the proximal margin is narrow. Very few seismic events have been recorded on the intervening, gently-tapering Trøndelag Platform. Norway's distribution of seismicity is systematically ordered with respect to 1) the structural templates of high-beta extension that shaped the thinning gradient during Late Jurassic or Early Cretaceous time, and 2) the topographically resurgent Cretaceous-Cenozoic "accommodation phase" family of escarpments that approximate the innermost limit of crustal thinning [See Redfield and Osmundsen (2012) for diagrams, definitions, discussion, and supporting citations.] Landwards from the belt of earthquake epicenters that mark the Taper Break the crust consistently thickens, and large fault arrays tend to sole out at mid crustal levels. Towards the sea the crystalline continental crust is hyperextended, pervasively faulted, and generally very thin. Also, faulting and serpentinization may have affected the uppermost parts of the distal margin's lithospheric mantle. Such contrasting structural conditions may generate a contrasting stiffness: for a given stress, more strain can be accommodated in the distal margin than in the less faulted proximal margin. By way of comparison, inboard of the Taper Break on the gently-tapered Trøndelag Platform, faulting was not penetrative. There, similar structural conditions prevail and proximal margin seismicity is negligible. Because stress concentration can occur where material properties undergo significant contrast, the necking zone may constitute a natural localization point for post-thinning phase earthquakes. In Scandinavia

  8. Consideration of margins for hypofractionated radiotherapy

    International Nuclear Information System (INIS)

    Herschtal, A.; Foroudi, F.; Kron, T.

    2010-01-01

    Full text: Geographical misses of the tumour are of concern in radiotherapy and are typically accommodated by introducing margins around the target. However, there is a trade-off between ensuring the target receives sufficient dose and minimising the dose to surrounding normal structures. Several methods of determining margin width have been developed, the most commonly used being that of M. van Herk (VanHerk UROBP 52: 1407, 2002). Van Herk's model sets margins so that 95% dose coverage of the target is achieved in 90% of patients. However, this model was derived assuming an infinite number of fractions. The aim of the present work is to estimate the modifications necessary to the model if a finite number of fractions is given. Software simulations were used to determine the true probability of a patient achieving 95% target coverage for different fraction numbers and a given margin width. Model parameters were informed by a large data set recently acquired at our institution using daily image guidance for prostate cancer patients with implanted fiducial markers. Assuming a 3 mm penumbral width, it was found that using the van Herk model only 74% or 54% of patients receive 95% of the prescription dose if 20 or 6 fractions are given, respectively. The steep dose gradients afforded by IMRT are likely to make consideration of the effects of hypofractionation more important. It is necessary to increase the margins around the target to ensure adequate tumour coverage if hypofractionated radiotherapy is to be used for cancer treatment. (author)
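    The dependence of coverage on fraction number can be illustrated with a toy Monte Carlo model (far simpler than van Herk's recipe, and not the authors' simulation): the systematic error is drawn once per patient, while the mean of the random error shrinks with the number of fractions, so a fixed margin covers fewer patients under hypofractionation. All parameter values are arbitrary.

```python
import numpy as np

def coverage_prob(margin_mm, n_fractions, sigma_sys=2.0, sigma_rand=3.0,
                  n_patients=20000):
    """Fraction of simulated patients whose mean target displacement
    stays inside the margin.  With fewer fractions the per-fraction
    random error averages out less, so coverage drops."""
    rng = np.random.default_rng(0)
    sys_err = rng.normal(0.0, sigma_sys, n_patients)
    mean_rand = rng.normal(0.0, sigma_rand / np.sqrt(n_fractions), n_patients)
    return float(np.mean(np.abs(sys_err + mean_rand) <= margin_mm))

# the same 5 mm margin covers fewer patients at 6 fractions than at 20
p20 = coverage_prob(5.0, 20)
p6 = coverage_prob(5.0, 6)
```

    This caricature omits dose blurring and penumbra, which the study's software simulations account for, but it reproduces the qualitative effect reported above.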

  9. Generalization in the XCSF classifier system: analysis, improvement, and extension.

    Science.gov (United States)

    Lanzi, Pier Luca; Loiacono, Daniele; Wilson, Stewart W; Goldberg, David E

    2007-01-01

    We analyze generalization in XCSF and introduce three improvements. We begin by showing that the types of generalizations evolved by XCSF can be influenced by the input range. To explain these results we present a theoretical analysis of the convergence of classifier weights in XCSF which highlights a broader issue. In XCSF, because of the mathematical properties of the Widrow-Hoff update, the convergence of classifier weights in a given subspace can be slow when the spread of the eigenvalues of the autocorrelation matrix associated with each classifier is large. As a major consequence, the system's accuracy pressure may act before classifier weights are adequately updated, so that XCSF may evolve piecewise constant approximations, instead of the intended, and more efficient, piecewise linear ones. We propose three different ways to update classifier weights in XCSF so as to increase the generalization capabilities of XCSF: one based on a condition-based normalization of the inputs, one based on linear least squares, and one based on the recursive version of linear least squares. Through a series of experiments we show that while all three approaches significantly improve XCSF, least squares approaches appear to be best performing and most robust. Finally we show how XCSF can be extended to include polynomial approximations.

  10. Logarithmic learning for generalized classifier neural network.

    Science.gov (United States)

    Ozyildirim, Buse Melis; Avci, Mutlu

    2014-12-01

    Generalized classifier neural network is introduced as an efficient classifier among the others. Unless the initial smoothing parameter value is close to the optimal one, the generalized classifier neural network suffers from a convergence problem and requires quite a long time to converge. In this work, to overcome this problem, a logarithmic learning approach is proposed. The proposed method uses a logarithmic cost function instead of squared error. Minimization of this cost function reduces the number of iterations needed to reach the minimum. The proposed method is tested on 15 different data sets and the performance of the logarithmic learning generalized classifier neural network is compared with that of the standard one. Thanks to the operating range of the radial basis function included in the generalized classifier neural network, the proposed logarithmic cost and its derivative take continuous values. This makes it possible to exploit the fast convergence of the logarithmic cost in the proposed learning method. Due to this fast convergence, training time is reduced by up to 99.2%. In addition to the decrease in training time, classification performance may also be improved by up to 60%. According to the test results, while the proposed method provides a solution for the time requirement problem of the generalized classifier neural network, it may also improve the classification accuracy. The proposed method can be considered an efficient way of reducing the time requirement of the generalized classifier neural network. Copyright © 2014 Elsevier Ltd. All rights reserved.
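    The abstract does not give the exact cost function, so the sketch below assumes one plausible logarithmic cost, -log(1 - e^2), purely for illustration: its gradient grows as the error approaches 1, which is the mechanism behind the faster convergence claimed above.

```python
def iterations_to_converge(cost, lr=0.1, target=0.9, tol=1e-3,
                           max_iter=10_000):
    """Gradient descent on a one-parameter fit w -> target, comparing a
    squared-error cost e**2/2 with an assumed logarithmic cost
    -log(1 - e**2) (the error e must stay in (-1, 1))."""
    w = 0.0
    for it in range(max_iter):
        e = target - w
        if abs(e) < tol:
            return it
        if cost == "squared":
            grad = -e                      # d/dw of e**2 / 2
        else:
            grad = -2 * e / (1 - e * e)    # d/dw of -log(1 - e**2)
        w -= lr * grad
    return max_iter

n_sq = iterations_to_converge("squared")
n_log = iterations_to_converge("log")      # converges in fewer steps
```

    The large initial gradient of the logarithmic cost takes big steps while the error is large, then behaves like the squared cost near the minimum.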

  11. A CLASSIFIER SYSTEM USING SMOOTH GRAPH COLORING

    Directory of Open Access Journals (Sweden)

    JORGE FLORES CRUZ

    2017-01-01

    Full Text Available Unsupervised classifiers allow clustering methods with little or no human intervention. Therefore it is desirable to group the set of items with minimal data processing. This paper proposes an unsupervised classifier system using the model of soft graph coloring. This method was tested with some classic instances in the literature and the results obtained were compared with classifications made with human intervention, yielding results as good as or better than supervised classifiers, sometimes providing alternative classifications that consider additional information that humans did not consider.

  12. High dimensional classifiers in the imbalanced case

    DEFF Research Database (Denmark)

    Bak, Britta Anker; Jensen, Jens Ledet

    We consider the binary classification problem in the imbalanced case where the numbers of samples from the two groups differ. The classification problem is considered in the high dimensional case where the number of variables is much larger than the number of samples, and where the imbalance leads to a bias in the classification. A theoretical analysis of the independence classifier reveals the origin of the bias and based on this we suggest two new classifiers that can handle any imbalance ratio. The analytical results are supplemented by a simulation study, where the suggested classifiers in some...

  13. Maximum Margin Clustering of Hyperspectral Data

    Science.gov (United States)

    Niazmardi, S.; Safari, A.; Homayouni, S.

    2013-09-01

    In recent decades, large margin methods such as Support Vector Machines (SVMs) have been regarded as the state of the art among supervised learning methods for classification of hyperspectral data. However, the results of these algorithms depend mainly on the quality and quantity of available training data. To tackle the problems associated with the training data, researchers have put effort into extending the capability of large margin algorithms to unsupervised learning. One of the recently proposed algorithms is Maximum Margin Clustering (MMC). MMC is an unsupervised SVM algorithm that simultaneously estimates both the labels and the hyperplane parameters. Nevertheless, the optimization of the MMC algorithm is a non-convex problem. Most existing MMC methods rely on reformulating and relaxing the non-convex optimization problem as a semi-definite program (SDP), which is computationally very expensive and can only handle small data sets. Moreover, most of these algorithms handle only two-class classification, and so cannot be used directly for classification of remotely sensed data. In this paper, a new MMC algorithm is used that solves the original non-convex problem using an alternating optimization method. This algorithm is also extended to multi-class classification and its performance is evaluated. The results show that the algorithm gives acceptable results for hyperspectral data clustering.
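    The alternating-optimization idea can be caricatured in a few lines: fix the labels and fit a regularised linear decision function (a least-squares stand-in for the SVM step), then fix the function and re-label points by the sign of their decision values, keeping the split balanced to avoid the trivial solution. This is our toy sketch, not the algorithm evaluated in the paper:

```python
import numpy as np

def max_margin_cluster(X, n_iter=10, lam=1e-2):
    """Two-cluster toy of maximum margin clustering by alternating
    optimisation.  The median split enforces balanced clusters, which
    real MMC formulations achieve via a class-balance constraint."""
    Xc = X - X.mean(axis=0)
    # initialise labels from the leading principal direction
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    d = Xc @ Vt[0]
    y = np.where(d > np.median(d), 1.0, -1.0)
    for _ in range(n_iter):
        # ridge-regression stand-in for the large-margin fit
        w = np.linalg.solve(Xc.T @ Xc + lam * np.eye(X.shape[1]), Xc.T @ y)
        d = Xc @ w
        y = np.where(d > np.median(d), 1.0, -1.0)
    return y

# two well-separated blobs are recovered without any labels
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (30, 2)),
               rng.normal(3.0, 0.3, (30, 2))])
labels = max_margin_cluster(X)
```

    Each alternation is convex on its own; it is the joint problem over labels and hyperplane that is non-convex, which is why initialisation and balance constraints matter.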

  14. An ensemble self-training protein interaction article classifier.

    Science.gov (United States)

    Chen, Yifei; Hou, Ping; Manderick, Bernard

    2014-01-01

    Protein-protein interaction (PPI) is essential to understand the fundamental processes governing cell biology. The mining and curation of PPI knowledge are critical for analyzing proteomics data. Hence it is desirable to classify articles as PPI-related or not automatically. In order to build interaction article classification systems, an annotated corpus is needed. However, usually only a small number of labeled articles can be obtained manually, while a large number of unlabeled articles are available. By combining ensemble learning and semi-supervised self-training, an ensemble self-training interaction classifier called EST_IACer is designed to classify PPI-related articles based on a small number of labeled articles and a large number of unlabeled articles. A biological-background-based feature weighting strategy is extended using the category information from both labeled and unlabeled data. Moreover, a heuristic constraint is put forward to select optimal instances from the unlabeled data to improve the performance further. Experimental results show that EST_IACer can classify PPI-related articles effectively and efficiently.

  15. Arabic Handwriting Recognition Using Neural Network Classifier

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... an OCR using Neural Network classifier preceded by a set of preprocessing .... Artificial Neural Networks (ANNs), which we adopt in this research, consist of ... advantage and disadvantages of each technique. In [9],. Khemiri ...

  16. Classifiers based on optimal decision rules

    KAUST Repository

    Amin, Talha

    2013-11-25

    Based on a dynamic programming approach, we design algorithms for sequential optimization of exact and approximate decision rules relative to length and coverage [3, 4]. In this paper, we use optimal rules to construct classifiers, and study two questions: (i) which rules are better from the point of view of classification: exact or approximate; and (ii) which order of optimization gives better classification results: length, length+coverage, coverage, or coverage+length. Experimental results show that, on average, classifiers based on exact rules are better than classifiers based on approximate rules, and sequential optimization (length+coverage or coverage+length) is better than single-criterion optimization (length or coverage).

  17. Classifiers based on optimal decision rules

    KAUST Repository

    Amin, Talha M.; Chikalov, Igor; Moshkov, Mikhail; Zielosko, Beata

    2013-01-01

    Based on a dynamic programming approach, we design algorithms for sequential optimization of exact and approximate decision rules relative to length and coverage [3, 4]. In this paper, we use optimal rules to construct classifiers, and study two questions: (i) which rules are better from the point of view of classification: exact or approximate; and (ii) which order of optimization gives better classification results: length, length+coverage, coverage, or coverage+length. Experimental results show that, on average, classifiers based on exact rules are better than classifiers based on approximate rules, and sequential optimization (length+coverage or coverage+length) is better than single-criterion optimization (length or coverage).

  18. Combining multiple classifiers for age classification

    CSIR Research Space (South Africa)

    Van Heerden, C

    2009-11-01

    Full Text Available The authors compare several different classifier combination methods on a single task, namely speaker age classification. This task is well suited to combination strategies, since significantly different feature classes are employed. Support vector...

  19. Neural Network Classifiers for Local Wind Prediction.

    Science.gov (United States)

    Kretzschmar, Ralf; Eckert, Pierre; Cattani, Daniel; Eggimann, Fritz

    2004-05-01

    This paper evaluates the quality of neural network classifiers for wind speed and wind gust prediction with prediction lead times between +1 and +24 h. The predictions were realized based on local time series and model data. The selection of appropriate input features was initiated by time series analysis and completed by empirical comparison of neural network classifiers trained on several choices of input features. The selected input features involved day time, yearday, features from a single wind observation device at the site of interest, and features derived from model data. The quality of the resulting classifiers was benchmarked against persistence for two different sites in Switzerland. The neural network classifiers exhibited superior quality when compared with persistence judged on a specific performance measure, hit and false-alarm rates.

  20. Learning Convex Inference of Marginals

    OpenAIRE

    Domke, Justin

    2012-01-01

    Graphical models trained using maximum likelihood are a common tool for probabilistic inference of marginal distributions. However, this approach suffers difficulties when either the inference process or the model is approximate. In this paper, the inference process is first defined to be the minimization of a convex function, inspired by free energy approximations. Learning is then done directly in terms of the performance of the inference process at univariate marginal prediction. The main ...

  1. Steel Industry Marginal Opportunity Analysis

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2005-09-01

    The Steel Industry Marginal Opportunity Analysis (PDF 347 KB) identifies opportunities for developing advanced technologies and estimates both the necessary funding and the potential payoff. This analysis determines what portion of the energy bandwidth can be captured through the adoption of state-of-the-art technology and practices. R&D opportunities for addressing the remainder of the bandwidth are characterized and plotted on a marginal opportunity curve.

  2. Consistency Analysis of Nearest Subspace Classifier

    OpenAIRE

    Wang, Yi

    2015-01-01

    The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...

  3. Binary classifiers and latent sequence models for emotion detection in suicide notes.

    Science.gov (United States)

    Cherry, Colin; Mohammad, Saif M; de Bruijn, Berry

    2012-01-01

    This paper describes the National Research Council of Canada's submission to the 2011 i2b2 NLP challenge on the detection of emotions in suicide notes. In this task, each sentence of a suicide note is annotated with zero or more emotions, making it a multi-label sentence classification task. We employ two distinct large-margin models capable of handling multiple labels. The first uses one classifier per emotion, and is built to simplify label balance issues and to allow extremely fast development. This approach is very effective, scoring an F-measure of 55.22 and placing fourth in the competition, making it the best system that does not use web-derived statistics or re-annotated training data. Second, we present a latent sequence model, which learns to segment the sentence into a number of emotion regions. This model is intended to gracefully handle sentences that convey multiple thoughts and emotions. Preliminary work with the latent sequence model shows promise, resulting in comparable performance using fewer features.
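The "one classifier per emotion" design described in this record is the binary-relevance approach to multi-label classification, and can be sketched in a few lines. The tiny perceptron, toy sentences, and label set below are illustrative assumptions, not the NRC system or its features:

```python
# Binary-relevance multi-label sketch: one independent binary classifier
# (here a tiny averaged-free perceptron) per emotion label.
# Toy data and features are illustrative only.

def featurize(sentence):
    return set(sentence.lower().split())   # bag-of-words features

class Perceptron:
    def __init__(self):
        self.w = {}
    def score(self, feats):
        return sum(self.w.get(f, 0.0) for f in feats)
    def train(self, data, epochs=10):
        for _ in range(epochs):
            for feats, y in data:            # y in {+1, -1}
                if y * self.score(feats) <= 0:
                    for f in feats:
                        self.w[f] = self.w.get(f, 0.0) + y

LABELS = ["guilt", "hopelessness", "love"]

train = [
    ("i am so sorry for everything", {"guilt"}),
    ("there is nothing left for me", {"hopelessness"}),
    ("i love you all very much", {"love"}),
    ("forgive me i failed you", {"guilt"}),
]

# One classifier per label: positive iff the sentence carries that label.
models = {}
for label in LABELS:
    data = [(featurize(s), +1 if label in gold else -1) for s, gold in train]
    m = Perceptron()
    m.train(data)
    models[label] = m

def predict(sentence):
    feats = featurize(sentence)
    return {lab for lab, m in models.items() if m.score(feats) > 0}

pred = predict("i love you all")   # zero or more emotions per sentence
```

Because each label gets its own classifier, label-balance issues can be handled per emotion, which is the simplification the record credits for fast development.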

  4. A systems biology-based classifier for hepatocellular carcinoma diagnosis.

    Directory of Open Access Journals (Sweden)

    Yanqiong Zhang

    Full Text Available AIM: The diagnosis of hepatocellular carcinoma (HCC) in the early stage is crucial to the application of curative treatments, which are the only hope for increasing the life expectancy of patients. Recently, several large-scale studies have shed light on this problem through analysis of gene expression profiles to identify markers correlated with HCC progression. However, those marker sets shared few genes in common and were poorly validated using independent data. Therefore, we developed a systems biology-based classifier by combining the differential gene expression with topological features of human protein interaction networks to enhance the ability of HCC diagnosis. METHODS AND RESULTS: In the Oncomine platform, genes differentially expressed in HCC tissues relative to their corresponding normal tissues were filtered by a corrected Q value cut-off and Concept filters. The identified genes that are common to different microarray datasets were chosen as the candidate markers. Then, their networks were analyzed by GeneGO Meta-Core software and the hub genes were chosen. After that, an HCC diagnostic classifier was constructed by Partial Least Squares modeling based on the microarray gene expression data of the hub genes. Validations of diagnostic performance showed that this classifier had high predictive accuracy (85.88-92.71%) and area under ROC curve (approximating 1.0), and that the network topological features integrated into this classifier contribute greatly to improving the predictive performance. Furthermore, it has been demonstrated that this modeling strategy is not only applicable to HCC, but also to other cancers. CONCLUSION: Our analysis suggests that a systems biology-based classifier that combines differential gene expression and topological features of the human protein interaction network may enhance the diagnostic performance of an HCC classifier.
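As a rough illustration of the modeling step, a single-component PLS1 fit can be written in plain Python. The "hub gene" expression values below are fabricated toy data, and the real classifier uses full Partial Least Squares on microarray profiles of the selected hub genes:

```python
# Single-component PLS1 sketch for a two-class gene-expression classifier.
# Toy data: rows are samples, columns are three hypothetical hub genes.

def mean(v):
    return sum(v) / len(v)

def pls1_fit(X, y):
    # center predictors and response
    xbar = [mean(col) for col in zip(*X)]
    ybar = mean(y)
    Xc = [[x - m for x, m in zip(row, xbar)] for row in X]
    yc = [v - ybar for v in y]
    # weight vector w proportional to X^T y, normalized
    w = [sum(Xc[i][j] * yc[i] for i in range(len(Xc))) for j in range(len(xbar))]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # latent scores t = Xc w and regression coefficient b = (t.y)/(t.t)
    t = [sum(r * v for r, v in zip(row, w)) for row in Xc]
    b = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    return xbar, ybar, w, b

def pls1_predict(model, x):
    xbar, ybar, w, b = model
    t = sum((xi - m) * wi for xi, m, wi in zip(x, xbar, w))
    return ybar + b * t           # > 0 -> class +1, < 0 -> class -1

X = [[5.1, 2.0, 3.3],   # HCC sample
     [4.8, 1.9, 3.5],   # HCC sample
     [1.2, 2.1, 1.0],   # normal sample
     [1.0, 2.2, 0.8]]   # normal sample
y = [1, 1, -1, -1]      # 1 = HCC, -1 = normal

model = pls1_fit(X, y)
label = 1 if pls1_predict(model, [5.0, 2.0, 3.4]) > 0 else -1
```

Thresholding the continuous PLS prediction at zero turns the regression into a two-class decision rule; the record's reported accuracies come from the full multi-component model, not this sketch.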

  5. Submarine slope failures along the convergent continental margin of the Middle America Trench

    Science.gov (United States)

    Harders, Rieka; Ranero, César R.; Weinrebe, Wilhelm; Behrmann, Jan H.

    2011-06-01

    We present the first comprehensive study of mass wasting processes in the continental slope of a convergent margin of a subduction zone where tectonic processes are dominated by subduction erosion. We have used multibeam bathymetry along ~1300 km of the Middle America Trench of the Central America Subduction Zone and deep-towed side-scan sonar data. We found abundant evidence of large-scale slope failures that were mostly previously unmapped. The features are classified into a variety of slope failure types, creating an inventory of 147 slope failure structures. Their type distribution and abundance define a segmentation of the continental slope in six sectors. The segmentation in slope stability processes does not appear to be related to slope preconditioning due to changes in physical properties of sediment, presence/absence of gas hydrates, or apparent changes in the hydrogeological system. The segmentation appears to be better explained by changes in slope preconditioning due to variations in tectonic processes. The region is an optimal setting to study how tectonic processes related to variations in intensity of subduction erosion and changes in relief of the underthrusting plate affect mass wasting processes of the continental slope. The largest slope failures occur offshore Costa Rica. There, subducting ridges and seamounts produce failures with up to hundreds of meters high headwalls, with detachment planes that penetrate deep into the continental margin, in some cases reaching the plate boundary. Offshore northern Costa Rica a smooth oceanic seafloor underthrusts the least disturbed continental slope. Offshore Nicaragua, the ocean plate is ornamented with smaller seamounts and horst and graben topography of variable intensity. Here mass wasting structures are numerous and comparatively smaller, but when combined, they affect a large part of the margin segment. 
Farther north, offshore El Salvador and Guatemala the downgoing plate has no large seamounts but

  6. Realistic respiratory motion margins for external beam partial breast irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Conroy, Leigh; Quirk, Sarah [Department of Medical Physics, Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada); Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 1N4 (Canada); Smith, Wendy L., E-mail: wendy.smith@albertahealthservices.ca [Department of Medical Physics, Tom Baker Cancer Centre, Calgary, Alberta T2N 4N2 (Canada); Department of Physics and Astronomy, University of Calgary, Calgary, Alberta T2N 1N4 (Canada); Department of Oncology, University of Calgary, Calgary, Alberta T2N 1N4 (Canada)

    2015-09-15

    Purpose: Respiratory margins for partial breast irradiation (PBI) have been largely based on geometric observations, which may overestimate the margin required for dosimetric coverage. In this study, dosimetric population-based respiratory margins and margin formulas for external beam partial breast irradiation are determined. Methods: Volunteer respiratory data and anterior–posterior (AP) dose profiles from clinical treatment plans of 28 3D conformal radiotherapy (3DCRT) PBI patient plans were used to determine population-based respiratory margins. The peak-to-peak amplitudes (A) of realistic respiratory motion data from healthy volunteers were scaled from A = 1 to 10 mm to create respiratory motion probability density functions. Dose profiles were convolved with the respiratory probability density functions to produce blurred dose profiles accounting for respiratory motion. The required margins were found by measuring the distance between the simulated treatment and original dose profiles at the 95% isodose level. Results: The symmetric dosimetric respiratory margins to cover 90%, 95%, and 100% of the simulated treatment population were 1.5, 2, and 4 mm, respectively. With patient set up at end exhale, the required margins were larger in the anterior direction than the posterior. For respiratory amplitudes less than 5 mm, the population-based margins can be expressed as a fraction of the extent of respiratory motion. The derived formulas in the anterior/posterior directions for 90%, 95%, and 100% simulated population coverage were 0.45A/0.25A, 0.50A/0.30A, and 0.70A/0.40A. The differences in formulas for different population coverage criteria demonstrate that respiratory trace shape and baseline drift characteristics affect individual respiratory margins even for the same average peak-to-peak amplitude. 
Conclusions: A methodology for determining population-based respiratory margins using real respiratory motion patterns and dose profiles in the AP direction was

  7. Realistic respiratory motion margins for external beam partial breast irradiation

    International Nuclear Information System (INIS)

    Conroy, Leigh; Quirk, Sarah; Smith, Wendy L.

    2015-01-01

    Purpose: Respiratory margins for partial breast irradiation (PBI) have been largely based on geometric observations, which may overestimate the margin required for dosimetric coverage. In this study, dosimetric population-based respiratory margins and margin formulas for external beam partial breast irradiation are determined. Methods: Volunteer respiratory data and anterior–posterior (AP) dose profiles from clinical treatment plans of 28 3D conformal radiotherapy (3DCRT) PBI patient plans were used to determine population-based respiratory margins. The peak-to-peak amplitudes (A) of realistic respiratory motion data from healthy volunteers were scaled from A = 1 to 10 mm to create respiratory motion probability density functions. Dose profiles were convolved with the respiratory probability density functions to produce blurred dose profiles accounting for respiratory motion. The required margins were found by measuring the distance between the simulated treatment and original dose profiles at the 95% isodose level. Results: The symmetric dosimetric respiratory margins to cover 90%, 95%, and 100% of the simulated treatment population were 1.5, 2, and 4 mm, respectively. With patient set up at end exhale, the required margins were larger in the anterior direction than the posterior. For respiratory amplitudes less than 5 mm, the population-based margins can be expressed as a fraction of the extent of respiratory motion. The derived formulas in the anterior/posterior directions for 90%, 95%, and 100% simulated population coverage were 0.45A/0.25A, 0.50A/0.30A, and 0.70A/0.40A. The differences in formulas for different population coverage criteria demonstrate that respiratory trace shape and baseline drift characteristics affect individual respiratory margins even for the same average peak-to-peak amplitude. 
Conclusions: A methodology for determining population-based respiratory margins using real respiratory motion patterns and dose profiles in the AP direction was
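The fractional margin formulas reported in this record translate directly into code. The function below is a restatement of the published coefficients, not an independent derivation, and applies only in the stated small-amplitude regime:

```python
# Population-based respiratory margins from the record: anterior/posterior
# margins as fractions of peak-to-peak amplitude A (mm), for three
# simulated-population coverage levels, valid for amplitudes < 5 mm.
COVERAGE_FRACTIONS = {
    90:  (0.45, 0.25),   # (anterior, posterior)
    95:  (0.50, 0.30),
    100: (0.70, 0.40),
}

def respiratory_margin(amplitude_mm, coverage):
    """Return (anterior, posterior) margin in mm for a given peak-to-peak
    respiratory amplitude and population coverage level (90, 95, or 100)."""
    fa, fp = COVERAGE_FRACTIONS[coverage]
    return fa * amplitude_mm, fp * amplitude_mm

# e.g. a 4 mm amplitude at 90% coverage -> 1.8 mm anterior, 1.0 mm posterior
anterior, posterior = respiratory_margin(4.0, 90)
```

The anterior/posterior asymmetry encodes the end-exhale setup noted in the record: motion extends mostly anteriorly from the setup position.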

  8. Patterns of failure for glioblastoma multiforme following limited-margin radiation and concurrent temozolomide

    International Nuclear Information System (INIS)

    Gebhardt, Brian J; Dobelbower, Michael C; Ennis, William H; Bag, Asim K; Markert, James M; Fiveash, John B

    2014-01-01

    To analyze patterns of failure in patients with glioblastoma multiforme (GBM) treated with limited-margin radiation therapy and concurrent temozolomide. We hypothesize that patients treated with margins in accordance with Adult Brain Tumor Consortium (ABTC) guidelines will demonstrate patterns of failure consistent with previous series of patients treated with 2–3 cm margins. A retrospective review was performed of patients treated at the University of Alabama at Birmingham for GBM between 2000 and 2011. Ninety-five patients with biopsy-proven disease and documented disease progression after treatment were analyzed. The initial planning target volume includes the T1-enhancing tumor and surrounding edema plus a 1 cm margin. The boost planning target volume includes the T1-enhancing tumor plus a 1 cm margin. The tumors were classified as in-field, marginal, or distant if greater than 80%, 20-80%, or less than 20% of the recurrent volume fell within the 95% isodose line, respectively. The median progression-free survival from the time of diagnosis to documented failure was 8 months (range 3–46). Of the 95 documented recurrences, 77 patients (81%) had an in-field component of treatment failure, 6 (6%) had a marginal component, and 27 (28%) had a distant component. Sixty-three patients (66%) demonstrated in-field only recurrence. The low rate of marginal recurrence suggests that wider margins would have little impact on the pattern of failure, validating the use of limited margins in accordance with ABTC guidelines
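The in-field/marginal/distant rule used in this record is a simple threshold on the fraction of recurrent volume inside the 95% isodose line, and can be stated as:

```python
# Classification rule from the record: a recurrence is "in-field" if more
# than 80% of the recurrent volume lies within the 95% isodose line,
# "distant" if less than 20%, and "marginal" in between (20-80%).
def classify_recurrence(fraction_in_95pct_isodose):
    f = fraction_in_95pct_isodose
    if f > 0.80:
        return "in-field"
    if f < 0.20:
        return "distant"
    return "marginal"
```

Note that a single recurrence can contribute components to more than one category in the record's counts (81% + 6% + 28% exceeds 100%), so the rule above applies per recurrent-volume component rather than exclusively per patient.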

  9. Pathologic margin involvement and the risk of recurrence in patients treated with breast-conserving therapy

    International Nuclear Information System (INIS)

    Gage, Irene; Nixon, Asa J.; Schnitt, Stuart J.; Recht, Abram; Gelman, Rebecca; Silver, Barbara; Connolly, James L.; Harris, Jay R.

    1995-01-01

    PURPOSE: To assess the relationship between microscopic margin status and recurrence after breast-conserving therapy for tumors with or without an extensive intraductal component (EIC). MATERIALS AND METHODS: During the years 1968 to 1986, 1865 women with unilateral clinical stage I or II breast cancer were treated with radiation therapy for breast conservation. Of these, 340 received ≥60 Gy to the tumor bed and had margins that were evaluable on review of their pathologic slides; these constitute the study population. The median follow-up was 109 months. All available slides were reviewed by one of the study pathologists (SS, JC). Final radial margins of excision were classified as negative >1 mm (no invasive or ductal carcinoma in-situ within 1 mm of the inked margin), negative ≤1 mm (any carcinoma ≤1 mm of the inked margin but not at ink) or positive (any carcinoma at the inked margin). A focally positive margin was defined as any invasive or in-situ carcinoma at the margin in ≤3 LPF. The extent of positivity was not evaluable in 2 patients and the distance of the tumor from the margin was not evaluable in 48 patients with a negative margin. Thirty-nine percent of EIC-negative and 46% of EIC-positive patients underwent a re-excision and, for these, the final margin analyzed was from the re-excised specimen. The median dose to the tumor bed was 63 Gy for patients with positive margins and 62 Gy for patients with negative margins. Recurrent disease was classified as ipsilateral breast recurrence (IBR) or distant metastasis/regional nodal failure (DM/RNF). RESULTS: Five year crude rates for the first site of recurrence were calculated for 340 patients evaluable at 5 years. Results were tabulated separately for all patients, EIC-negative and EIC-positive. All p-values tested for differences in the distribution of sites of first failure. CONCLUSIONS: The risk of ipsilateral breast recurrence is equally low for patients with close (≤1 mm) or negative (>1 mm

  10. Systems considerations in seismic margin evaluations

    International Nuclear Information System (INIS)

    Buttermer, D.R.

    1987-01-01

    Increasing knowledge in the geoscience field has led to the understanding that, although highly unlikely, it is possible for a nuclear power plant to be subjected to earthquake ground motion greater than that for which the plant was designed. While it is recognized that there are conservatisms inherent in current design practices, interest has developed in evaluating the seismic risk of operating plants. Several plant-specific seismic probabilistic risk assessments (SPRA) have been completed to address questions related to the seismic risk of a plant. The results from such SPRAs are quite informative, but such studies may entail a considerable amount of expensive analysis of large portions of the plant. As an alternative to an SPRA, it may be more practical to select an earthquake level above the design basis for which plant survivability is to be demonstrated. The principal question to be addressed in a seismic margin evaluation is: At what ground motion levels does one have a high confidence that the probability of seismically induced core damage is sufficiently low? In a seismic margin evaluation, an earthquake level is selected (based on site-specific geoscience considerations) for which a stable, long-term safe shutdown condition is to be demonstrated. This prespecified earthquake level is commonly referred to as the seismic margin earthquake (SME). The Electric Power Research Institute is currently supporting a research project to develop procedures for use by the utilities to allow them to perform nuclear plant seismic margin evaluations. This paper describes the systems-related aspects of these procedures

  11. Reinforcement Learning Based Artificial Immune Classifier

    Directory of Open Access Journals (Sweden)

    Mehmet Karakose

    2013-01-01

    Full Text Available One of the widely used methods for classification, which is a decision-making process, is artificial immune systems. Artificial immune systems, based on the natural immune system, can be successfully applied to classification, optimization, recognition, and learning in real-world problems. In this study, a reinforcement learning based artificial immune classifier is proposed as a new approach. This approach uses reinforcement learning to find better antibodies with immune operators. Compared with other methods in the literature, the proposed approach offers several advantages, such as effectiveness, fewer memory cells, high accuracy, speed, and data adaptability. The performance of the proposed approach is demonstrated by simulation and experimental results using real data in Matlab and FPGA. Some benchmark data and remote image data are used for experimental results. The comparative results with supervised/unsupervised based artificial immune system, negative selection classifier, and resource limited artificial immune classifier are given to demonstrate the effectiveness of the proposed new method.

  12. Classifier Fusion With Contextual Reliability Evaluation.

    Science.gov (United States)

    Liu, Zhunga; Pan, Quan; Dezert, Jean; Han, Jun-Wei; He, You

    2018-05-01

    Classifier fusion is an efficient strategy to improve classification performance on complex pattern recognition problems. In practice, the multiple classifiers to be combined can have different reliabilities, and proper reliability evaluation plays an important role in the fusion process for obtaining the best classification performance. We propose a new method for classifier fusion with contextual reliability evaluation (CF-CRE) based on inner reliability and relative reliability concepts. The inner reliability, represented by a matrix, characterizes the probability of the object belonging to one class when it is classified to another class. The elements of this matrix are estimated from the k-nearest neighbors of the object. A cautious discounting rule is developed under the belief functions framework to revise the classification result according to the inner reliability. The relative reliability is evaluated based on a new incompatibility measure which makes it possible to reduce the level of conflict between the classifiers by applying the classical evidence discounting rule to each classifier before their combination. The inner reliability and relative reliability capture different aspects of the classification reliability. The discounted classification results are combined with Dempster-Shafer's rule for the final class decision making support. The performance of CF-CRE has been evaluated and compared with that of the main classical fusion methods using real data sets. The experimental results show that CF-CRE can produce substantially higher accuracy than other fusion methods in general. Moreover, CF-CRE is robust to changes in the number of nearest neighbors chosen for estimating the reliability matrix, which is appealing for applications.
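Two standard ingredients named in this record, classical reliability discounting and Dempster's rule of combination, can be sketched as follows. The frame, mass values, and reliability factors are toy assumptions; the CF-CRE-specific estimation of inner and relative reliability is not reproduced:

```python
# Belief-function sketch: discount each classifier's mass function by a
# reliability factor, then fuse with Dempster's rule of combination.
from itertools import product

FRAME = frozenset({"a", "b"})   # toy frame of discernment: two classes

def discount(m, alpha):
    """Classical (Shafer) discounting: scale masses by reliability alpha
    and transfer the remainder to the whole frame (total ignorance)."""
    out = {A: alpha * v for A, v in m.items() if A != FRAME}
    out[FRAME] = 1 - alpha * (1 - m.get(FRAME, 0.0))
    return out

def dempster(m1, m2):
    """Dempster's rule: conjunctive combination with conflict renormalization."""
    combined, conflict = {}, 0.0
    for (A, v1), (B, v2) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    return {A: v / (1 - conflict) for A, v in combined.items()}

# two classifiers' outputs expressed as mass functions over {a, b}
m1 = {frozenset({"a"}): 0.7, frozenset({"b"}): 0.2, FRAME: 0.1}
m2 = {frozenset({"a"}): 0.6, frozenset({"b"}): 0.3, FRAME: 0.1}

# assumed reliabilities 0.9 and 0.8; fuse after discounting
fused = dempster(discount(m1, 0.9), discount(m2, 0.8))
best = max(fused, key=fused.get)   # decision: class with largest mass
```

Discounting a less reliable classifier pushes its mass toward ignorance, so it pulls the fused decision less strongly, which is the mechanism CF-CRE exploits contextually.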

  13. Classifying sows' activity types from acceleration patterns

    DEFF Research Database (Denmark)

    Cornou, Cecile; Lundbye-Christensen, Søren

    2008-01-01

    An automated method of classifying sow activity using acceleration measurements would allow the individual sow's behavior to be monitored throughout the reproductive cycle; applications for detecting behaviors characteristic of estrus and farrowing or to monitor illness and welfare can be foreseen. ... This article suggests a method of classifying five types of activity exhibited by group-housed sows. The method involves the measurement of acceleration in three dimensions. The five activities are: feeding, walking, rooting, lying laterally and lying sternally. Four time series of acceleration (the three

  14. Data characteristics that determine classifier performance

    CSIR Research Space (South Africa)

    Van der Walt, Christiaan M

    2006-11-01

    Full Text Available available at [11]. The kNN uses a LinearNN nearest neighbour search algorithm with an Euclidean distance metric [8]. The optimal k value is determined by performing 10-fold cross-validation. An optimal k value between 1 and 10 is used for Experiments 1... classifiers. 10-fold cross-validation is used to evaluate and compare the performance of the classifiers on the different data sets. 3.1. Artificial data generation Multivariate Gaussian distributions are used to generate artificial data sets. We use d...
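The experimental setup this fragment describes (a kNN classifier with Euclidean distance, the optimal k between 1 and 10 chosen by 10-fold cross-validation, artificial data from multivariate Gaussians) can be sketched in pure Python as a stand-in for the cited toolchain:

```python
# kNN with Euclidean distance; k in 1..10 selected by 10-fold
# cross-validation on artificial two-class Gaussian data.
import random

random.seed(0)

def gauss_class(mean, n):
    """n points from an axis-aligned Gaussian with unit variance."""
    return [[random.gauss(m, 1.0) for m in mean] for _ in range(n)]

# two Gaussian classes in 2-D (means and sizes are toy choices)
data = [(x, 0) for x in gauss_class([0.0, 0.0], 50)] + \
       [(x, 1) for x in gauss_class([3.0, 3.0], 50)]
random.shuffle(data)

def knn_predict(train, x, k):
    nearest = sorted(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    votes = [y for _, y in nearest[:k]]
    return max(set(votes), key=votes.count)   # majority vote

def cv_accuracy(data, k, folds=10):
    correct = 0
    for i in range(folds):
        test = data[i::folds]
        train = [p for j, p in enumerate(data) if j % folds != i]
        correct += sum(knn_predict(train, x, k) == y for x, y in test)
    return correct / len(data)

# pick the optimal k in 1..10 by 10-fold cross-validation
best_k = max(range(1, 11), key=lambda k: cv_accuracy(data, k))
```

With well-separated class means like these, cross-validated accuracy is high for most k; the interesting cases in the cited study arise when the generating distributions overlap.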

  15. A Customizable Text Classifier for Text Mining

    Directory of Open Access Journals (Sweden)

    Yun-liang Zhang

    2007-12-01

    Full Text Available Text mining deals with complex and unstructured texts. Usually a particular collection of texts that is specified to one or more domains is necessary. We have developed a customizable text classifier for users to mine the collection automatically. It derives from the sentence category of the HNC theory and corresponding techniques. It can start with a few texts, and it can adjust automatically or be adjusted by user. The user can also control the number of domains chosen and decide the standard with which to choose the texts based on demand and abundance of materials. The performance of the classifier varies with the user's choice.

  16. A survey of decision tree classifier methodology

    Science.gov (United States)

    Safavian, S. R.; Landgrebe, David

    1991-01-01

    Decision tree classifiers (DTCs) are used successfully in many diverse areas such as radar signal classification, character recognition, remote sensing, medical diagnosis, expert systems, and speech recognition. Perhaps the most important feature of DTCs is their capability to break down a complex decision-making process into a collection of simpler decisions, thus providing a solution which is often easier to interpret. A survey of current methods is presented for DTC designs and the various existing issues. After considering potential advantages of DTCs over single-stage classifiers, subjects of tree structure design, feature selection at each internal node, and decision and search strategies are discussed.

  17. Radiotherapy margin design with particular consideration of high curvature CTVs

    International Nuclear Information System (INIS)

    Herschtal, Alan; Kron, Tomas; Fox, Chris

    2009-01-01

    In applying 3D conformal radiation therapy to a tumor clinical target volume (CTV), a margin is added around the CTV to account for any sources of error in the application of treatment which may result in misalignment between the CTV and the dose distribution actually delivered. The volume enclosed within the CTV plus the margin is known as the PTV, or planning target volume. The larger the errors are anticipated to be, the wider the margin will need to be to accommodate those errors. Based on the approach of van Herk et al. ["The probability of correct target dosage: Dose-population histograms for deriving treatment margins in radiotherapy," Int. J. Radiat. Oncol. Biol., Phys. 47(4), 1121-1135 (2000)] this paper develops the mathematical theory behind the calculation of the margin width required to ensure that the entire CTV receives sufficiently high dose with sufficiently high probability. The margin recipe developed not only considers the magnitude of the errors but also includes a term to adjust for curved CTV surfaces. In doing so, the accuracy of the margin recipe is enhanced yet remains mathematically concise enough to be readily implemented in the clinical setting. The results are particularly relevant for clinical situations in which the uncertainties in treatment are large relative to the size of the CTV.
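For context, the widely quoted margin recipe from the cited van Herk et al. paper, which this work extends with a curvature term, is M = 2.5Σ + 0.7σ, where Σ is the standard deviation of systematic (preparation) errors and σ that of random (execution) errors, chosen so that 90% of patients receive at least 95% of the prescribed dose to the CTV. The curvature-adjusted recipe developed in this paper itself is not reproduced here:

```python
# The baseline van Herk CTV-to-PTV margin recipe (no curvature correction):
# M = 2.5 * Sigma + 0.7 * sigma, all quantities in mm.
def van_herk_margin(Sigma_mm, sigma_mm):
    """Sigma_mm: SD of systematic errors; sigma_mm: SD of random errors."""
    return 2.5 * Sigma_mm + 0.7 * sigma_mm

# e.g. 2 mm systematic and 3 mm random error -> 7.1 mm margin
margin = van_herk_margin(2.0, 3.0)
```

The heavier weight on Σ reflects that a systematic error shifts the whole dose distribution for every fraction, whereas random errors blur it and are partly averaged out.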

  18. Influence of Crack Morphology on Leak Before Break Margins

    International Nuclear Information System (INIS)

    Weilin Zang

    2007-11-01

    The purpose of the project is to evaluate the deterministic LBB margins for different pipe systems in a Swedish PWR plant using different crack morphology parameters. Results: - The influence of crack morphology on Leak Before Break (LBB) margins is studied. The subject of the report is a number of LBB submittals to SKI in which deterministic LBB margins are reported. These submittals typically use a surface roughness of 0.0762 mm (300 microinch), a number of turns equal to zero, and an in-house code for the leak rate evaluations. The present report has shown that these conditions give the largest LBB margins, both in terms of the quotient between the critical crack length and the leakage crack size and in terms of the leak rate margin. - Crack morphology parameters have a strong influence on the leak rate evaluations. Using the SQUIRT code and more recent recommendations for crack morphology parameters, it is shown that in many cases the evaluated margins, using 1 gpm as the reference leak rate detection limit, are below the safety factor of 2 on crack size and 10 on leak rate, which is generally required for LBB approval. - The effect of including weld residual stresses on the LBB margins is also investigated. It is shown that for the two examples studied, weld residual stresses were important for the small-diameter thin-wall pipe, whereas they were negligible for the large-diameter thick-wall pipe, which had a self-balanced weld residual stress distribution

  19. The Marginal Source of Finance

    OpenAIRE

    Lindhe, Tobias

    2002-01-01

    This paper addresses the ongoing debate on which view of equity, traditional or new, best describes firm behavior. According to the traditional view, the marginal source of finance is new equity, whereas under the new view, marginal financing comes from retained earnings. In the theoretical part, we set up a model where the firm faces a cost of adjusting the dividend level because of an aggravated free cash flow problem. The existence of such a cost - which has been used in arguing the...

  20. Characterizing Convexity of Games using Marginal Vectors

    NARCIS (Netherlands)

    van Velzen, S.; Hamers, H.J.M.; Norde, H.W.

    2003-01-01

    In this paper we study the relation between convexity of TU games and marginal vectors. We show that if specific marginal vectors are core elements, then the game is convex. We characterize sets of marginal vectors satisfying this property, and we derive the formula for the minimum number of marginal

  1. Volcanic passive margins: another way to break up continents.

    Science.gov (United States)

    Geoffroy, L; Burov, E B; Werner, P

    2015-10-07

    Two major types of passive margins are recognized, i.e. volcanic and non-volcanic, without distinctive mechanisms being proposed for their formation. Volcanic passive margins are associated with the extrusion and intrusion of large volumes of magma, predominantly mafic, and represent distinctive features of Large Igneous Provinces, in which regional fissural volcanism predates localized syn-magmatic break-up of the lithosphere. In contrast with non-volcanic margins, continentward-dipping detachment faults accommodate crustal necking at both conjugate volcanic margins. These faults root on a two-layer deformed ductile crust that appears to be partly of igneous nature. This lower crust is exhumed up to the bottom of the syn-extension extrusives at the outer parts of the margin. Our numerical modelling suggests that strengthening of deep continental crust during early magmatic stages provokes a divergent flow of the ductile lithosphere away from a central continental block, which becomes thinner with time due to the flow-induced mechanical erosion acting at its base. Crustal-scale faults dipping continentward are rooted over this flowing material, thus isolating micro-continents within the future oceanic domain. Pure-shear type deformation affects the bulk lithosphere at volcanic passive margins until continental breakup, and the geometry of the margin is closely related to the dynamics of an active and melting mantle.

  2. 75 FR 37253 - Classified National Security Information

    Science.gov (United States)

    2010-06-28

    ... ``Secret.'' (3) Each interior page of a classified document shall be marked at the top and bottom either... ``(TS)'' for Top Secret, ``(S)'' for Secret, and ``(C)'' for Confidential will be used. (2) Portions... from the informational text. (1) Conspicuously place the overall classification at the top and bottom...

  3. 75 FR 707 - Classified National Security Information

    Science.gov (United States)

    2010-01-05

    ... classified at one of the following three levels: (1) ``Top Secret'' shall be applied to information, the... exercise this authority. (2) ``Top Secret'' original classification authority may be delegated only by the... official has been delegated ``Top Secret'' original classification authority by the agency head. (4) Each...

  4. Neural Network Classifier Based on Growing Hyperspheres

    Czech Academy of Sciences Publication Activity Database

    Jiřina Jr., Marcel; Jiřina, Marcel

    2000-01-01

    Roč. 10, č. 3 (2000), s. 417-428 ISSN 1210-0552. [Neural Network World 2000. Prague, 09.07.2000-12.07.2000] Grant - others:MŠMT ČR(CZ) VS96047; MPO(CZ) RP-4210 Institutional research plan: AV0Z1030915 Keywords: neural network * classifier * hyperspheres * big-dimensional data Subject RIV: BA - General Mathematics

  5. Histogram deconvolution - An aid to automated classifiers

    Science.gov (United States)

    Lorre, J. J.

    1983-01-01

    It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.

  6. Classifying web pages with visual features

    NARCIS (Netherlands)

    de Boer, V.; van Someren, M.; Lupascu, T.; Filipe, J.; Cordeiro, J.

    2010-01-01

    To automatically classify and process web pages, current systems use the textual content of those pages, including both the displayed content and the underlying (HTML) code. However, a very important feature of a web page is its visual appearance. In this paper, we show that using generic visual

  7. The marginal social cost of headway for a scheduled service

    DEFF Research Database (Denmark)

    Fosgerau, Mogens

    2009-01-01

    This brief paper derives the marginal social cost of headway for a scheduled service, i.e. the cost for users of marginal increases to the time interval between departures. In brief we may call it the value of headway, in analogy with the value of travel time and the value of reliability. Users have waiting time costs as well as schedule delay costs measured relative to their desired time of arrival at the destination. They may either arrive at the station to choose just the next departure, or they may plan for a specific departure, in which case they also incur a planning cost. Planning for a specific departure is costly but becomes more attractive at longer headways. Simple expressions for the user cost result. In particular, the marginal cost of headway is large at short headways and smaller at long headways. The difference in marginal costs is the value of time multiplied by half the headway.
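The trade-off described above can be illustrated with a toy user-cost model. The value of time, the constant planning cost, and the assumption that a planned traveller faces no headway-proportional wait are all invented for illustration; the paper's model is richer:

```python
# Illustrative parameters (not from the paper).
VOT = 20.0           # value of waiting time, e.g. EUR per hour
PLANNING_COST = 1.0  # fixed cost of planning for a specific departure, EUR

def cost_random_arrival(h):
    # Arriving at random: expected wait is half the headway h (hours).
    return VOT * h / 2.0

def cost_planned(h):
    # Planning for a specific departure: no headway-proportional wait,
    # but a fixed planning cost.
    return PLANNING_COST

def user_cost(h):
    # Users adopt whichever strategy is cheaper at the given headway.
    return min(cost_random_arrival(h), cost_planned(h))
```

In this toy version the marginal cost of headway is VOT/2 per unit of headway at short headways (random arrivals dominate) and zero at long headways (planning dominates), reproducing the qualitative pattern of large marginal costs at short headways and smaller ones at long headways.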

  8. Marginality and Variability in Esperanto.

    Science.gov (United States)

    Brent, Edmund

    This paper discusses Esperanto as a planned language and refutes three myths connected to it, namely, that Esperanto is achronical, atopical, and apragmatic. The focus here is on a synchronic analysis. Synchronic variability is studied with reference to the structuralist determination of "marginality" and the dynamic linguistic…

  9. Texas curve margin of safety.

    Science.gov (United States)

    2013-01-01

    This software can be used to assist with the assessment of margin of safety for a horizontal curve. It is intended for use by engineers and technicians responsible for safety analysis or management of rural highway pavement or traffic control devices...

  10. Ethnographies of marginality [Review article

    NARCIS (Netherlands)

    Beuving, J.J.

    2016-01-01

    Africanist discourse today displays a strong, widespread and growing sense of optimism about Africa's economic future. After decades of decline and stagnation in which Africa found itself reduced to the margins of the global economic stage, upbeat Afro-optimism seems fully justified. One only needs

  11. Profit margins in Japanese retailing

    NARCIS (Netherlands)

    J.C.A. Potjes; A.R. Thurik (Roy)

    1993-01-01

    textabstractUsing a rich data source, we explain differences and developments in profit margins of medium-sized stores in Japan. We conclude that the protected environment enables the retailer to pass on all operating costs to the customers and to obtain a relatively high basic income. High service

  12. Pushing the Margins of Responsibility

    DEFF Research Database (Denmark)

    Santoni de Sio, Filippo; Di Nucci, Ezio

    2018-01-01

    David Shoemaker has claimed that a binary approach to moral responsibility leaves out something important, namely instances of marginal agency, cases where agents seem to be eligible for some responsibility responses but not others. In this paper we endorse and extend Shoemaker’s approach by pres...

  13. Scoring and Classifying Examinees Using Measurement Decision Theory

    Directory of Open Access Journals (Sweden)

    Lawrence M. Rudner

    2009-04-01

    Full Text Available This paper describes and evaluates the use of measurement decision theory (MDT) to classify examinees based on their item response patterns. The model has a simple framework that starts with the conditional probabilities of examinees in each category or mastery state responding correctly to each item. The presented evaluation investigates: (1) the classification accuracy of tests scored using decision theory; (2) the effectiveness of different sequential testing procedures; and (3) the number of items needed to make a classification. A large percentage of examinees can be classified accurately with very few items using decision theory. A Java Applet for self-instruction and software for generating, calibrating and scoring MDT data are provided.
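A minimal sketch of this decision-theoretic scoring, with invented item probabilities and two mastery states (none of the numbers come from the paper): the posterior over states is updated after each item, and testing stops early once one state is sufficiently probable.

```python
import numpy as np

# p_correct[s, i] = P(correct on item i | mastery state s); illustrative values.
p_correct = np.array([
    [0.9, 0.8, 0.85, 0.9],   # state 0: master
    [0.3, 0.4, 0.35, 0.2],   # state 1: non-master
])
prior = np.array([0.5, 0.5])

def classify(responses, threshold=0.95):
    """Sequentially update P(state | responses); stop early when confident.

    Returns (most probable state, number of items administered).
    """
    post = prior.copy()
    for i, r in enumerate(responses):
        like = p_correct[:, i] if r else 1.0 - p_correct[:, i]
        post = post * like
        post = post / post.sum()          # Bayes update after each item
        if post.max() >= threshold:
            return int(post.argmax()), i + 1
    return int(post.argmax()), len(responses)
```

With these numbers, an examinee answering everything incorrectly is classified as a non-master after only two items, illustrating how few items a sequential procedure may need.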

  14. Security Enrichment in Intrusion Detection System Using Classifier Ensemble

    Directory of Open Access Journals (Sweden)

    Uma R. Salunkhe

    2017-01-01

    Full Text Available In the era of the Internet, and with an increasing number of people as its end users, a large number of attack categories are introduced daily. Hence, effective detection of various attacks with the help of Intrusion Detection Systems is an emerging trend in research these days. Existing studies show the effectiveness of machine learning approaches in handling Intrusion Detection Systems. In this work, we aim to enhance the detection rate of an Intrusion Detection System by using machine learning techniques. We propose a novel classifier-ensemble-based IDS that is constructed using a hybrid approach combining data-level and feature-level methods. Classifier ensembles combine the opinions of different experts and improve the intrusion detection rate. Experimental results show the improved detection rates of our system compared to the reference technique.
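The core idea of a classifier ensemble, combining the opinions of different experts, can be sketched with a simple majority vote. The three rule-based "classifiers" and their thresholds below are invented stand-ins for the learned base classifiers in the paper:

```python
from collections import Counter

# Invented stand-in base classifiers operating on a connection record.
def rule_based(conn):
    return "attack" if conn["failed_logins"] > 3 else "normal"

def threshold_based(conn):
    return "attack" if conn["bytes"] > 10_000 else "normal"

def heuristic(conn):
    return "attack" if conn["duration"] < 1 and conn["bytes"] > 5_000 else "normal"

CLASSIFIERS = [rule_based, threshold_based, heuristic]

def ensemble_predict(conn):
    # Majority vote over the base classifiers' opinions.
    votes = Counter(clf(conn) for clf in CLASSIFIERS)
    return votes.most_common(1)[0][0]
```

With an odd number of binary base classifiers the vote never ties, and a single noisy expert cannot flip the ensemble's decision.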

  15. Classifying features in CT imagery: accuracy for some single- and multiple-species classifiers

    Science.gov (United States)

    Daniel L. Schmoldt; Jing He; A. Lynn Abbott

    1998-01-01

    Our current approach to automatically label features in CT images of hardwood logs classifies each pixel of an image individually. These feature classifiers use a back-propagation artificial neural network (ANN) and feature vectors that include a small, local neighborhood of pixels and the distance of the target pixel to the center of the log. Initially, this type of...

  16. Disassembly and Sanitization of Classified Matter

    International Nuclear Information System (INIS)

    Stockham, Dwight J.; Saad, Max P.

    2008-01-01

    The Disassembly Sanitization Operation (DSO) process was implemented to support weapon disassembly and disposition by using recycling and waste minimization measures. This process was initiated by treaty agreements and reconfigurations within both the DOD and DOE complexes. The DOE is faced with disassembling and disposing of a huge inventory of retired weapons, components, training equipment, spare parts, weapon maintenance equipment, and associated material. In addition, regulations have caused a dramatic increase in the need for information required to support the handling and disposition of these parts and materials. In the past, huge inventories of classified weapon components required long-term storage at Sandia and at many other locations throughout the DOE complex. These materials are placed in onsite storage units due to classification issues, and they may also contain radiological and/or hazardous components. Since no disposal options exist for this material, the only choice was long-term storage. Long-term storage is costly and somewhat problematic, requiring a secured storage area, monitoring, and auditing, and presenting the potential for loss or theft of the material. Overall recycling rates for materials sent through the DSO process have enabled 70 to 80% of these components to be recycled. These components are made of high-quality materials, and once sanitized, the demand for the component metals for recycling efforts is very high. The DSO process for NGPF classified components established the credibility of this technique for addressing the long-term storage requirements of the classified weapons component inventory. The success of this application has generated interest from other Sandia organizations and other locations throughout the complex. Other organizations are requesting the help of the DSO team, and the DSO is responding to these requests by expanding its scope to include Work-for-Other projects.

  17. The Large Margin Mechanism for Differentially Private Maximization

    OpenAIRE

    Chaudhuri, Kamalika; Hsu, Daniel; Song, Shuang

    2014-01-01

    A basic problem in the design of privacy-preserving algorithms is the private maximization problem: the goal is to pick an item from a universe that (approximately) maximizes a data-dependent function, all under the constraint of differential privacy. This problem has been used as a sub-routine in many privacy-preserving algorithms for statistics and machine-learning. Previous algorithms for this problem are either range-dependent---i.e., their utility diminishes with the size of the universe...

  18. Contributions to knowledge of the continental margin of Uruguay. Uruguayan continental margin: Physiographic and seismic analysis

    International Nuclear Information System (INIS)

    Preciozzi, F

    2014-01-01

    This work deals with the types of continental margins: a) Atlantic-type passive margins, which can be hard or soft; and b) active or Pacific-type margins, which, because of very frequent earthquakes, develop a morphology dominated by tectonic processes. The Uruguayan continental margin belongs to the soft Atlantic type.

  19. Comparing cosmic web classifiers using information theory

    International Nuclear Information System (INIS)

    Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin; Jasche, Jens

    2016-01-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  20. Design of Robust Neural Network Classifiers

    DEFF Research Database (Denmark)

    Larsen, Jan; Andersen, Lars Nonboe; Hintz-Madsen, Mads

    1998-01-01

    This paper addresses a new framework for designing robust neural network classifiers. The network is optimized using the maximum a posteriori technique, i.e., the cost function is the sum of the log-likelihood and a regularization term (prior). In order to perform robust classification, we present...... a modified likelihood function which incorporates the potential risk of outliers in the data. This leads to the introduction of a new parameter, the outlier probability. Designing the neural classifier involves optimization of network weights as well as outlier probability and regularization parameters. We...... suggest to adapt the outlier probability and regularisation parameters by minimizing the error on a validation set, and a simple gradient descent scheme is derived. In addition, the framework allows for constructing a simple outlier detector. Experiments with artificial data demonstrate the potential...

  1. Comparing cosmic web classifiers using information theory

    Energy Technology Data Exchange (ETDEWEB)

    Leclercq, Florent [Institute of Cosmology and Gravitation (ICG), University of Portsmouth, Dennis Sciama Building, Burnaby Road, Portsmouth PO1 3FX (United Kingdom); Lavaux, Guilhem; Wandelt, Benjamin [Institut d' Astrophysique de Paris (IAP), UMR 7095, CNRS – UPMC Université Paris 6, Sorbonne Universités, 98bis boulevard Arago, F-75014 Paris (France); Jasche, Jens, E-mail: florent.leclercq@polytechnique.org, E-mail: lavaux@iap.fr, E-mail: j.jasche@tum.de, E-mail: wandelt@iap.fr [Excellence Cluster Universe, Technische Universität München, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2016-08-01

    We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.

  2. Detection of Fundus Lesions Using Classifier Selection

    Science.gov (United States)

    Nagayoshi, Hiroto; Hiramatsu, Yoshitaka; Sako, Hiroshi; Himaga, Mitsutoshi; Kato, Satoshi

    A system for detecting fundus lesions caused by diabetic retinopathy from fundus images is being developed. The system can screen the images in advance in order to reduce the inspection workload on doctors. One of the difficulties that must be addressed in completing this system is how to remove false positives (which tend to arise near blood vessels) without decreasing the detection rate of lesions in other areas. To overcome this difficulty, we developed classifier selection according to the position of a candidate lesion, and we introduced new features that can distinguish true lesions from false positives. A system incorporating classifier selection and these new features was tested in experiments using 55 fundus images with some lesions and 223 images without lesions. The results of the experiments confirm the effectiveness of the proposed system, namely, degrees of sensitivity and specificity of 98% and 81%, respectively.

  3. Classifying objects in LWIR imagery via CNNs

    Science.gov (United States)

    Rodger, Iain; Connor, Barry; Robertson, Neil M.

    2016-10-01

    The aim of the presented work is to demonstrate enhanced target recognition and improved false alarm rates for a mid to long range detection system, utilising a Long Wave Infrared (LWIR) sensor. By exploiting high quality thermal image data and recent techniques in machine learning, the system can provide automatic target recognition capabilities. A Convolutional Neural Network (CNN) is trained and the classifier achieves an overall accuracy of > 95% for 6 object classes related to land defence. While the highly accurate CNN struggles to recognise long range target classes, due to low signal quality, robust target discrimination is achieved for challenging candidates. The overall performance of the methodology presented is assessed using human ground truth information, generating classifier evaluation metrics for thermal image sequences.

  4. Learning for VMM + WTA Embedded Classifiers

    Science.gov (United States)

    2016-03-31

    Learning for VMM + WTA Embedded Classifiers Jennifer Hasler and Sahil Shah Electrical and Computer Engineering Georgia Institute of Technology...enabling correct classification of each novel acoustic signal (generator, idle car, and idle truck). The classification structure requires, after...measured on our SoC FPAA IC. The test input is composed of signals from an urban environment for 3 objects (generator, idle car, and idle truck

  5. Bayes classifiers for imbalanced traffic accidents datasets.

    Science.gov (United States)

    Mujalli, Randa Oqab; López, Griselda; Garach, Laura

    2016-03-01

    Traffic accidents data sets are usually imbalanced, where the number of instances classified under the killed or severe injuries class (minority) is much lower than those classified under the slight injuries class (majority). This, however, poses a challenging problem for classification algorithms and may yield a model that covers the slight injuries instances well, whereas the killed or severe injuries instances are frequently misclassified. Based on traffic accidents data collected on urban and suburban roads in Jordan for three years (2009-2011), three different data balancing techniques were used: under-sampling, which removes some instances of the majority class; oversampling, which creates new instances of the minority class; and a mixed technique that combines both. In addition, different Bayes classifiers were compared for the different imbalanced and balanced data sets: Averaged One-Dependence Estimators, Weightily Average One-Dependence Estimators, and Bayesian networks, in order to identify factors that affect the severity of an accident. The results indicated that using the balanced data sets, especially those created using oversampling techniques, with Bayesian networks improved the classification of a traffic accident according to its severity and reduced the misclassification of killed and severe injuries instances. On the other hand, the following variables were found to contribute to the occurrence of a killed casualty or a severe injury in a traffic accident: number of vehicles involved, accident pattern, number of directions, accident type, lighting, surface condition, and speed limit. This work, to the knowledge of the authors, is the first that aims at analyzing historical data records for traffic accidents occurring in Jordan and the first to apply balancing techniques to analyze injury severity of traffic accidents. Copyright © 2015 Elsevier Ltd. All rights reserved.
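Of the three balancing techniques compared above, random oversampling is the simplest to sketch: minority-class records are duplicated at random until the classes are balanced. The record layout and label values below are invented for illustration:

```python
import random

def oversample(records, label_key="severity", minority="severe", seed=42):
    """Randomly duplicate minority-class records until classes are balanced."""
    rng = random.Random(seed)
    minority_rows = [r for r in records if r[label_key] == minority]
    majority_rows = [r for r in records if r[label_key] != minority]
    extra = [rng.choice(minority_rows)
             for _ in range(len(majority_rows) - len(minority_rows))]
    return records + extra

# 8 slight-injury records vs 2 severe-injury records: a 4:1 imbalance.
data = [{"severity": "slight"}] * 8 + [{"severity": "severe"}] * 2
balanced = oversample(data)
```

Note that, unlike techniques such as SMOTE, this creates exact duplicates rather than synthetic minority instances.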

  6. A Bayesian classifier for symbol recognition

    OpenAIRE

    Barrat , Sabine; Tabbone , Salvatore; Nourrissier , Patrick

    2007-01-01

    URL : http://www.buyans.com/POL/UploadedFile/134_9977.pdf; International audience; We present in this paper an original adaptation of Bayesian networks to symbol recognition problem. More precisely, a descriptor combination method, which enables to improve significantly the recognition rate compared to the recognition rates obtained by each descriptor, is presented. In this perspective, we use a simple Bayesian classifier, called naive Bayes. In fact, probabilistic graphical models, more spec...

  7. Human factors quantification via boundary identification of flight performance margin

    Directory of Open Access Journals (Sweden)

    Yang Changpeng

    2014-08-01

    Full Text Available A systematic methodology including a computational pilot model and a pattern recognition method is presented to identify the boundary of the flight performance margin for quantifying the human factors. The pilot model is proposed to correlate a set of quantitative human factors which represent the attributes and characteristics of a group of pilots. Three information processing components which are influenced by human factors are modeled: information perception, decision making, and action execution. By treating the human factors as stochastic variables that follow appropriate probability density functions, the effects of human factors on flight performance can be investigated through Monte Carlo (MC simulation. Kernel density estimation algorithm is selected to find and rank the influential human factors. Subsequently, human factors are quantified through identifying the boundary of the flight performance margin by the k-nearest neighbor (k-NN classifier. Simulation-based analysis shows that flight performance can be dramatically improved with the quantitative human factors.
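The Monte Carlo plus k-NN pipeline described above can be sketched with a deliberately toy performance model (the stochastic "human factors", the pass/fail rule, and all numbers below are invented; the paper uses a full pilot model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo sampling of two illustrative stochastic human factors.
reaction = rng.normal(0.8, 0.3, 500)   # reaction time, s
skill = rng.uniform(0.0, 1.0, 500)     # normalized skill level

# Toy rule standing in for the flight simulation: performance is adequate
# when reaction is fast enough given skill.
ok = (reaction < 0.5 + 0.8 * skill).astype(int)
X = np.column_stack([reaction, skill])

def knn_predict(x, k=7):
    """k-NN vote: is point x inside the flight performance margin?"""
    d = np.linalg.norm(X - x, axis=1)
    return int(np.round(ok[np.argsort(d)[:k]].mean()))

inside = knn_predict(np.array([0.4, 0.9]))   # fast reaction, high skill
outside = knn_predict(np.array([1.6, 0.1]))  # slow reaction, low skill
```

The k-NN classifier thus recovers the boundary of the performance margin purely from the labeled Monte Carlo samples, without access to the underlying rule.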

  8. Optimization of short amino acid sequences classifier

    Science.gov (United States)

    Barcz, Aleksy; Szymański, Zbigniew

    This article describes processing methods used for short amino acid sequences classification. The data processed are 9-symbols string representations of amino acid sequences, divided into 49 data sets - each one containing samples labeled as reacting or not with given enzyme. The goal of the classification is to determine for a single enzyme, whether an amino acid sequence would react with it or not. Each data set is processed separately. Feature selection is performed to reduce the number of dimensions for each data set. The method used for feature selection consists of two phases. During the first phase, significant positions are selected using Classification and Regression Trees. Afterwards, symbols appearing at the selected positions are substituted with numeric values of amino acid properties taken from the AAindex database. In the second phase the new set of features is reduced using a correlation-based ranking formula and Gram-Schmidt orthogonalization. Finally, the preprocessed data is used for training LS-SVM classifiers. SPDE, an evolutionary algorithm, is used to obtain optimal hyperparameters for the LS-SVM classifier, such as error penalty parameter C and kernel-specific hyperparameters. A simple score penalty is used to adapt the SPDE algorithm to the task of selecting classifiers with best performance measures values.
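The second-phase reduction described above, a correlation-based ranking combined with Gram-Schmidt orthogonalization, can be sketched as a greedy loop: pick the feature most correlated with the target, project that direction out of the remaining features, and repeat. The exact ranking formula here is a generic cosine-similarity stand-in, not necessarily the paper's:

```python
import numpy as np

def gram_schmidt_select(X, y, n_select=2):
    """Greedy correlation ranking with Gram-Schmidt deflation of X's columns."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    remaining = list(range(X.shape[1]))
    selected = []
    for _ in range(n_select):
        scores = [abs(X[:, j] @ y) /
                  (np.linalg.norm(X[:, j]) * np.linalg.norm(y) + 1e-12)
                  for j in remaining]
        best = remaining.pop(int(np.argmax(scores)))
        selected.append(best)
        v = X[:, best] / (np.linalg.norm(X[:, best]) + 1e-12)
        for j in remaining:
            # Orthogonalize: remove the chosen direction from each leftover feature.
            X[:, j] = X[:, j] - (X[:, j] @ v) * v
    return selected

rng = np.random.default_rng(0)
f0 = rng.normal(size=200)
f1 = rng.normal(size=200)
y = 2 * f0 + 0.5 * f1
# Column 1 is a near-duplicate of column 0; deflation should discard it.
X = np.column_stack([f0, f0 + 0.01 * rng.normal(size=200), f1])
picked = gram_schmidt_select(X, y, n_select=2)
```

Because the near-duplicate column is almost entirely removed by the projection, the second pick is the genuinely complementary feature, which is the point of orthogonalizing between ranking steps.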

  9. SVM classifier on chip for melanoma detection.

    Science.gov (United States)

    Afifi, Shereen; GholamHosseini, Hamid; Sinha, Roopak

    2017-07-01

    Support Vector Machine (SVM) is a common classifier used for efficient classification with high accuracy. SVM shows high accuracy for classifying melanoma (skin cancer) clinical images within computer-aided diagnosis systems used by skin cancer specialists to detect melanoma early and save lives. We aim to develop a low-cost medical handheld device that runs a real-time embedded SVM-based diagnosis system for use in primary care for early detection of melanoma. In this paper, an optimized SVM classifier is implemented on a recent FPGA platform using the latest design methodology, to be embedded into the proposed device for realizing efficient online melanoma detection on a single system-on-chip/device. The hardware implementation results demonstrate a high classification accuracy of 97.9% and a significant acceleration factor of 26 over an equivalent software implementation on an embedded processor, with 34% resource utilization and 2 W power consumption. Consequently, the implemented system meets the crucial embedded-systems constraints of high performance and low cost, resource utilization and power consumption, while achieving high classification accuracy.

  10. Margins related to equipment design

    International Nuclear Information System (INIS)

    Devos, J.

    1994-01-01

    Safety margins related to the design of reactor equipment are defined according to safety regulations. Advanced best-estimate methods are proposed, including some examples which were computed and compared to experimental results. Best-estimate methods require greater computational effort and more material data, but give better accuracy and need careful experimental validation. Simplified methods, compared to the previous ones, are less sensitive to material data and sometimes more accurate, but take very long to elaborate.

  11. Classifier-Guided Sampling for Complex Energy System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Backlund, Peter B. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    This report documents the results of a Laboratory Directed Research and Development (LDRD) effort entitled "Classifier-Guided Sampling for Complex Energy System Optimization" that was conducted during FY 2014 and FY 2015. The goal of this project was to develop, implement, and test major improvements to the classifier-guided sampling (CGS) algorithm. CGS is a type of evolutionary algorithm for performing search and optimization over a set of discrete design variables in the face of one or more objective functions. Existing evolutionary algorithms, such as genetic algorithms, may require a large number of objective function evaluations to identify optimal or near-optimal solutions. Reducing the number of evaluations can result in significant time savings, especially if the objective function is computationally expensive. CGS reduces the evaluation count by using a Bayesian network classifier to filter out non-promising candidate designs, prior to evaluation, based on their posterior probabilities. In this project, both the single-objective and multi-objective versions of CGS are developed and tested on a set of benchmark problems. As a domain-specific case study, CGS is used to design a microgrid for use in islanded mode during an extended bulk power grid outage.

  12. Indigenous women's voices: marginalization and health.

    Science.gov (United States)

    Dodgson, Joan E; Struthers, Roxanne

    2005-10-01

    Marginalization may affect health care delivery. Ways in which indigenous women experienced marginalization were examined. Data from 57 indigenous women (18 to 65 years) were analyzed for themes. Three themes emerged: historical trauma as lived marginalization, biculturalism experienced as marginalization, and interacting within a complex health care system. The experiences of marginalization reflected the participants' unique perspectives and were congruent with previous research. It is necessary for health care providers to assess the detrimental impact of marginalization on the health status of individuals and/or communities.

  13. Local curvature analysis for classifying breast tumors: Preliminary analysis in dedicated breast CT

    International Nuclear Information System (INIS)

    Lee, Juhun; Nishikawa, Robert M.; Reiser, Ingrid; Boone, John M.; Lindfors, Karen K.

    2015-01-01

    Purpose: The purpose of this study is to measure the effectiveness of local curvature measures as novel image features for classifying breast tumors. Methods: A total of 119 breast lesions from 104 noncontrast dedicated breast computed tomography images of women were used in this study. Volumetric segmentation was done using a seed-based segmentation algorithm and then a triangulated surface was extracted from the resulting segmentation. Total, mean, and Gaussian curvatures were then computed. Normalized curvatures were used as classification features. In addition, traditional image features were also extracted and a forward feature selection scheme was used to select the optimal feature set. Logistic regression was used as a classifier and leave-one-out cross-validation was utilized to evaluate the classification performances of the features. The area under the receiver operating characteristic curve (AUC) was used as a figure of merit. Results: Among curvature measures, the normalized total curvature (C_T) showed the best classification performance (AUC of 0.74), while the others showed no classification power individually. Five traditional image features (two shape, two margin, and one texture descriptors) were selected via the feature selection scheme and the resulting classifier achieved an AUC of 0.83. Among those five features, the radial gradient index (RGI), which is a margin descriptor, showed the best classification performance (AUC of 0.73). A classifier combining RGI and C_T yielded an AUC of 0.81, which showed similar performance (i.e., no statistically significant difference) to the classifier with the above five traditional image features. Additional comparisons in AUC values between classifiers using different combinations of traditional image features and C_T were conducted. The results showed that C_T was able to replace the other four image features for the classification task. Conclusions: The normalized curvature measure
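One common way to obtain Gaussian curvature from a triangulated surface like the one described above is the discrete angle deficit (Gauss-Bonnet): 2π minus the sum of the triangle angles meeting at a vertex. The paper's exact estimators may differ; the toy pyramid below is purely illustrative:

```python
import numpy as np

def angle_deficit(vertices, triangles, vidx):
    """2*pi minus the sum of triangle angles incident at vertex vidx."""
    total = 0.0
    for tri in triangles:
        if vidx not in tri:
            continue
        i = tri.index(vidx)
        p = vertices[tri[i]]
        a = vertices[tri[(i + 1) % 3]] - p
        b = vertices[tri[(i + 2) % 3]] - p
        cosang = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
        total += np.arccos(np.clip(cosang, -1.0, 1.0))
    return 2 * np.pi - total

# Apex of a square pyramid: a convex point with positive angle deficit.
verts = np.array([[0, 0, 1.0], [1, 0, 0], [0, 1, 0], [-1, 0, 0], [0, -1, 0]])
tris = [(0, 1, 2), (0, 2, 3), (0, 3, 4), (0, 4, 1)]
deficit = angle_deficit(verts, tris, 0)
```

Each of the four apex angles is π/3, so the deficit is 2π − 4π/3 = 2π/3; a flat vertex would give zero, and summing deficits over a closed surface recovers 2π times its Euler characteristic.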

  14. Robust Framework to Combine Diverse Classifiers Assigning Distributed Confidence to Individual Classifiers at Class Level

    Directory of Open Access Journals (Sweden)

    Shehzad Khalid

    2014-01-01

    Full Text Available We have presented a classification framework that combines multiple heterogeneous classifiers in the presence of class label noise. An extension of m-Mediods based modeling is presented that generates models of the various classes whilst identifying and filtering noisy training data. This noise-free data is further used to learn models for other classifiers such as GMM and SVM. A weight learning method is then introduced to learn weights on each class for different classifiers to construct an ensemble. For this purpose, we applied a genetic algorithm to search for an optimal weight vector on which the classifier ensemble is expected to give the best accuracy. The proposed approach is evaluated on a variety of real-life datasets. It is also compared with existing standard ensemble techniques such as AdaBoost, Bagging, and Random Subspace Methods. Experimental results show the superiority of the proposed ensemble method as compared to its competitors, especially in the presence of class label noise and imbalanced classes.
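The distinctive feature above is that each classifier gets a confidence weight per class, not one global weight. A minimal sketch of such a per-class weighted vote follows; in the paper the weight matrix is found with a genetic algorithm, whereas here it is simply given, and all numbers are invented:

```python
import numpy as np

N_CLASSES = 3

# weights[c, k]: confidence assigned to classifier c when it predicts class k
# (in the paper, searched for by a genetic algorithm; fixed here for illustration).
weights = np.array([
    [0.9, 0.2, 0.4],
    [0.3, 0.8, 0.5],
    [0.4, 0.3, 0.9],
])

def ensemble_predict(predictions):
    """predictions[c] = class index predicted by classifier c."""
    scores = np.zeros(N_CLASSES)
    for c, k in enumerate(predictions):
        scores[k] += weights[c, k]   # each vote counts with its class-level weight
    return int(scores.argmax())
```

A classifier that is unreliable for one class (a low entry in its row) is effectively muted for that class while still contributing fully to the classes it handles well.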

  15. The Protection of Classified Information: The Legal Framework

    National Research Council Canada - National Science Library

    Elsea, Jennifer K

    2006-01-01

    Recent incidents involving leaks of classified information have heightened interest in the legal framework that governs security classification, access to classified information, and penalties for improper disclosure...

  16. Classifying Radio Galaxies with the Convolutional Neural Network

    International Nuclear Information System (INIS)

    Aniyan, A. K.; Thorat, K.

    2017-01-01

    We present the application of a deep machine learning technique to classify radio images of extended sources on a morphological basis using convolutional neural networks (CNN). In this study, we have taken the case of the Fanaroff–Riley (FR) class of radio galaxies as well as radio galaxies with bent-tailed morphology. We have used archival data from the Very Large Array (VLA)—Faint Images of the Radio Sky at Twenty Centimeters survey and existing visually classified samples available in the literature to train a neural network for morphological classification of these categories of radio sources. Our training sample size for each of these categories is ∼200 sources, which has been augmented by rotated versions of the same. Our study shows that CNNs can classify images of the FRI and FRII and bent-tailed radio galaxies with high accuracy (maximum precision at 95%) using well-defined samples and a “fusion classifier,” which combines the results of binary classifications, while allowing for a mechanism to find sources with unusual morphologies. The individual precision is highest for bent-tailed radio galaxies at 95% and is 91% and 75% for the FRI and FRII classes, respectively, whereas the recall is highest for FRI and FRIIs at 91% each, while the bent-tailed class has a recall of 79%. These results show that our results are comparable to that of manual classification, while being much faster. Finally, we discuss the computational and data-related challenges associated with the morphological classification of radio galaxies with CNNs.

  17. Classifying Radio Galaxies with the Convolutional Neural Network

    Energy Technology Data Exchange (ETDEWEB)

    Aniyan, A. K.; Thorat, K. [Department of Physics and Electronics, Rhodes University, Grahamstown (South Africa)

    2017-06-01

    We present the application of a deep machine learning technique to classify radio images of extended sources on a morphological basis using convolutional neural networks (CNNs). In this study, we have taken the case of the Fanaroff–Riley (FR) class of radio galaxies as well as radio galaxies with bent-tailed morphology. We have used archival data from the Very Large Array (VLA)—Faint Images of the Radio Sky at Twenty Centimeters survey and existing visually classified samples available in the literature to train a neural network for morphological classification of these categories of radio sources. Our training sample size for each of these categories is ∼200 sources, which has been augmented by rotated versions of the same. Our study shows that CNNs can classify images of FRI, FRII, and bent-tailed radio galaxies with high accuracy (maximum precision at 95%) using well-defined samples and a “fusion classifier,” which combines the results of binary classifications, while allowing for a mechanism to find sources with unusual morphologies. The individual precision is highest for bent-tailed radio galaxies at 95% and is 91% and 75% for the FRI and FRII classes, respectively, whereas the recall is highest for FRI and FRIIs at 91% each, while the bent-tailed class has a recall of 79%. These results are comparable to those of manual classification, while being obtained much faster. Finally, we discuss the computational and data-related challenges associated with the morphological classification of radio galaxies with CNNs.

  18. Classifying Radio Galaxies with the Convolutional Neural Network

    Science.gov (United States)

    Aniyan, A. K.; Thorat, K.

    2017-06-01

    We present the application of a deep machine learning technique to classify radio images of extended sources on a morphological basis using convolutional neural networks (CNNs). In this study, we have taken the case of the Fanaroff-Riley (FR) class of radio galaxies as well as radio galaxies with bent-tailed morphology. We have used archival data from the Very Large Array (VLA)—Faint Images of the Radio Sky at Twenty Centimeters survey and existing visually classified samples available in the literature to train a neural network for morphological classification of these categories of radio sources. Our training sample size for each of these categories is ˜200 sources, which has been augmented by rotated versions of the same. Our study shows that CNNs can classify images of FRI, FRII, and bent-tailed radio galaxies with high accuracy (maximum precision at 95%) using well-defined samples and a “fusion classifier,” which combines the results of binary classifications, while allowing for a mechanism to find sources with unusual morphologies. The individual precision is highest for bent-tailed radio galaxies at 95% and is 91% and 75% for the FRI and FRII classes, respectively, whereas the recall is highest for FRI and FRIIs at 91% each, while the bent-tailed class has a recall of 79%. These results are comparable to those of manual classification, while being obtained much faster. Finally, we discuss the computational and data-related challenges associated with the morphological classification of radio galaxies with CNNs.

  19. PORTRAIT GRAFFITI IN MARGINS OF ANTIQUE LITHUANIAN BOOKS

    Directory of Open Access Journals (Sweden)

    Burba, Domininkas

    2006-12-01

    discharge. The marginal portraits in personal books are more artistic and their composition is more relaxed. Overall, the GDL marginal portraits reveal quite a few similarities to the graffiti (in Italian, scarabocchi) left in the documents by the workers of the Naples bank archive. They were properly examined and classified by the artist and archivist Giuseppe Zevola. According to him, this documentary graffiti was born out of opposition to the grey everyday routine and the experience of the “pleasure of anxiety”.

  20. Marginalism, quasi-marginalism and critical phenomena in micellar solutions

    International Nuclear Information System (INIS)

    Reatto, L.

    1986-01-01

    The observed nonuniversal critical behaviour of some micellar solutions is interpreted in terms of quasi-marginalism, i.e. the presence of a coupling which scales with an exponent very close to the spatial dimensionality. This can give rise to a preasymptotic region with varying effective critical exponents and a final crossover to the Ising ones. The reduced crossover temperature is estimated to be below 10⁻⁶. The exponents β and γ measured in C12E5 are in good agreement with the scaling law expected to hold for the effective exponents. The model considered by Shnidman is found unable to explain the nonuniversal critical behaviour.

  1. Classifying smoking urges via machine learning.

    Science.gov (United States)

    Dumortier, Antoine; Beckjord, Ellen; Shiffman, Saul; Sejdić, Ervin

    2016-12-01

    Smoking is the largest preventable cause of death and disease in the developed world, and advances in modern electronics and machine learning can help us deliver real-time intervention to smokers in novel ways. In this paper, we examine different machine learning approaches that use situational features associated with having or not having urges to smoke during a quit attempt in order to accurately classify high-urge states. To test our machine learning approaches, specifically naive Bayes, discriminant analysis and decision tree learning methods, we used a dataset collected from over 300 participants who had initiated a quit attempt. The three classification approaches are evaluated by observing sensitivity, specificity, accuracy and precision. The outcome of the analysis showed that algorithms based on feature selection make it possible to obtain high classification rates with only a few features selected from the entire dataset. The classification tree method outperformed the naive Bayes and discriminant analysis methods, with an accuracy of the classifications up to 86%. These numbers suggest that machine learning may be a suitable approach to deal with smoking cessation matters, and to predict smoking urges, outlining a potential use for mobile health applications. In conclusion, machine learning classifiers can help identify smoking situations, and the search for the best features and classifier parameters significantly improves the algorithms' performance. In addition, this study also supports the usefulness of new technologies in improving the effect of smoking cessation interventions, the management of time and patients by therapists, and thus the optimization of available health care resources. Future studies should focus on providing more adaptive and personalized support to people who really need it, in a minimum amount of time, by developing novel expert systems capable of delivering real-time interventions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
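The abstract's point that feature selection plus tree learning can reach high classification rates with few features can be illustrated with a hand-rolled decision stump on synthetic data (the features, the toy urge rule, and the `fit_stump` helper are hypothetical; the study used full tree, Bayes and discriminant-analysis methods on real participant data):

```python
import numpy as np

# Hypothetical decision stump: pick the (feature, threshold) pair with the
# best training accuracy -- a minimal stand-in for feature selection plus
# tree learning on situational data.
rng = np.random.default_rng(0)
X = rng.random((300, 5))                # 5 synthetic situational features
y = (X[:, 2] > 0.6).astype(int)         # toy ground truth: feature 2 drives urges

def fit_stump(X, y):
    best = (0, 0.5, 0.0)                # (feature index, threshold, accuracy)
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], np.linspace(0.1, 0.9, 17)):
            acc = np.mean((X[:, j] > t).astype(int) == y)
            if acc > best[2]:
                best = (j, t, acc)
    return best

feature, threshold, acc = fit_stump(X, y)
print(feature, round(acc, 2))           # should recover feature 2
```

A single well-chosen feature suffices here, mirroring the finding that a few selected features already give high classification rates.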

  2. Classifying spaces of degenerating polarized Hodge structures

    CERN Document Server

    Kato, Kazuya

    2009-01-01

    In 1970, Phillip Griffiths envisioned that points at infinity could be added to the classifying space D of polarized Hodge structures. In this book, Kazuya Kato and Sampei Usui realize this dream by creating a logarithmic Hodge theory. They use the logarithmic structures begun by Fontaine-Illusie to revive nilpotent orbits as a logarithmic Hodge structure. The book focuses on two principal topics. First, Kato and Usui construct the fine moduli space of polarized logarithmic Hodge structures with additional structures. Even for a Hermitian symmetric domain D, the present theory is a refinement…

  3. Gearbox Condition Monitoring Using Advanced Classifiers

    Directory of Open Access Journals (Sweden)

    P. Večeř

    2010-01-01

    New efficient and reliable methods for gearbox diagnostics are needed in the automotive industry because of the growing demand for production quality. This paper presents the application of two different classifiers for gearbox diagnostics – Kohonen Neural Networks and the Adaptive-Network-based Fuzzy Inference System (ANFIS). Two different practical applications are presented. In the first application, the tested gearboxes are separated into two classes according to their condition indicators. In the second example, ANFIS is applied to label the tested gearboxes with a Quality Index according to the condition indicators. In both applications, the condition indicators were computed from the vibration of the gearbox housing.

  4. Cubical sets as a classifying topos

    DEFF Research Database (Denmark)

    Spitters, Bas

    Coquand’s cubical set model for homotopy type theory provides the basis for a computational interpretation of the univalence axiom and some higher inductive types, as implemented in the cubical proof assistant. We show that the underlying cube category is the opposite of the Lawvere theory of De Morgan algebras. The topos of cubical sets itself classifies the theory of ‘free De Morgan algebras’. This provides us with a topos with an internal ‘interval’. Using this interval we construct a model of type theory following van den Berg and Garner. We are currently investigating the precise relation...

  5. Double Ramp Loss Based Reject Option Classifier

    Science.gov (United States)

    2015-05-22

    of convex (DC) functions. To minimize it, we use a DC programming approach [1]. The proposed method has the following advantages: (1) the proposed loss LDR does not put any restriction on ρ for it to be an upper bound of L0−d−1. Risk formulation using LDR: let S = {(xn, …)} be the training set. A classifier learnt using the LDR-based approach (C = 100, μ = 1, d = 0.2) is illustrated, with filled circles and triangles representing the support vectors.
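The surviving fragments describe LDR as a difference of convex (DC) functions. As a hedged numpy sketch, a ramp can be written as the difference of two convex hinge functions, and a reject-option loss as a weighted sum of two shifted ramps (the parameterization below is an illustrative assumption, not necessarily the report's exact LDR):

```python
import numpy as np

def hinge(z):
    return np.maximum(0.0, z)

def ramp(z, mu=0.5):
    # DC form: a ramp is the difference of two convex hinge functions;
    # it falls from 1 (z <= 0) to 0 (z >= mu).
    return (hinge(mu - z) - hinge(-z)) / mu

def double_ramp(margin, rho=0.5, d=0.2, mu=0.5):
    # Illustrative reject-option loss: roughly 0 when y*f(x) > rho,
    # d (the rejection cost) when |y*f(x)| < rho, and 1 when y*f(x) < -rho.
    return (1 - d) * ramp(margin + rho, mu) + d * ramp(margin - rho, mu)

print(double_ramp(2.0), double_ramp(0.0), double_ramp(-2.0))
```

Each term stays a difference of convex pieces, which is what makes DC programming applicable to the minimization.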

  6. Deep Structures of The Angola Margin

    Science.gov (United States)

    Moulin, M.; Contrucci, I.; Olivet, J.-L.; Aslanian, D.; Géli, L.; Sibuet, J.-C.

    1 Ifremer Centre de Brest, DRO/Géosciences Marines, B.P. 70, 29280 Plouzané cedex (France) mmoulin@ifremer.fr/Fax: 33 2 98 22 45 49 2 Université de Bretagne Occidentale, Institut Universitaire Europeen de la Mer, Place Nicolas Copernic, 29280 Plouzane (France) 3 Total Fina Elf, DGEP/GSR/PN-GEOLOGIE, 2, place de la Coupole-La Defense 6, 92078 Paris la Defense Cedex. Deep reflection and refraction seismic data were collected in April 2000 on the West African margin, offshore Angola, within the framework of the Zaiango Joint Project, conducted by Ifremer and Total Fina Elf Production. Vertical multichannel reflection seismic data generated by a “single-bubble” air gun array (Avedik et al., 1993) were recorded on a 4.5 km long digital streamer, while refraction and wide-angle reflection seismic data were acquired on OBSs (Ocean Bottom Seismometers). Despite the complexity of the margin (5 s TWT of sediment, salt tectonics), the combination of seismic reflection and refraction methods results in an image and a velocity model of the structures below the Aptian salt layer. Three large seismic units appear in the reflection seismic section from the deep part of the margin under the base of salt. The upper seismic unit is layered with reflectors parallel to the base of the salt; it represents unstructured sediments filling a basin. The middle unit is seismically transparent. The lower unit is characterized by highly energetic reflectors. According to the OBS refraction data, these two units correspond to the continental crust, and the base of the highly energetic unit corresponds to the Moho. The margin appears to be divided into three domains, from east to west: i) a domain with an unthinned, 30 km thick continental crust; ii) a domain located between the hinge line and the foot of the continental slope, where the crust thins sharply from 30 km to less than 7 km; this domain is underlain by an anomalous layer with velocities between 7.2 and 7

  7. Farm Household Survival Strategies and Diversification on Marginal Farms

    Science.gov (United States)

    Meert, H.; Van Huylenbroeck, G.; Vernimmen, T.; Bourgeois, M.; van Hecke, E.

    2005-01-01

    On marginal farms, and in agriculture in general, sustainability is largely guaranteed by a broad range of survival strategies, closely interlinked and embedded in the household structure of typical family farms. This paper reports results of a socio-economic study carried out among Belgian farmers, focusing specifically on the opportunities…

  8. A Bayesian method for comparing and combining binary classifiers in the absence of a gold standard

    Directory of Open Access Journals (Sweden)

    Keith Jonathan M

    2012-07-01

    Background Many problems in bioinformatics involve classification based on features such as sequence, structure or morphology. Given multiple classifiers, two crucial questions arise: how does their performance compare, and how can they best be combined to produce a better classifier? A classifier can be evaluated in terms of sensitivity and specificity using benchmark, or gold standard, data, that is, data for which the true classification is known. However, a gold standard is not always available. Here we demonstrate that a Bayesian model for comparing medical diagnostics without a gold standard can be successfully applied in the bioinformatics domain, to genomic scale data sets. We present a new implementation, which unlike previous implementations is applicable to any number of classifiers. We apply this model, for the first time, to the problem of finding the globally optimal logical combination of classifiers. Results We compared three classifiers of protein subcellular localisation, and evaluated our estimates of sensitivity and specificity against estimates obtained using a gold standard. The method overestimated sensitivity and specificity with only a small discrepancy, and correctly ranked the classifiers. Diagnostic tests for swine flu were then compared on a small data set. Lastly, classifiers for a genome-wide association study of macular degeneration with 541094 SNPs were analysed. In all cases, run times were feasible, and results precise. The optimal logical combination of classifiers was also determined for all three data sets. Code and data are available from http://bioinformatics.monash.edu.au/downloads/. Conclusions The examples demonstrate the methods are suitable for both small and large data sets, applicable to the wide range of bioinformatics classification problems, and robust to dependence between classifiers.
In all three test cases, the globally optimal logical combination of the classifiers was found to be
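The search over logical combinations of binary classifiers can be sketched as follows; note this toy scores rules with labelled data, whereas the paper's Bayesian model estimates sensitivity and specificity without a gold standard (the error rates and the candidate rule set are illustrative assumptions):

```python
import numpy as np

# Toy search over logical combinations of three binary classifiers A, B, C
# with assumed error rates 0.1, 0.2, 0.3; the best rule is the one
# maximizing sensitivity + specificity on a simulated labelled set.
rng = np.random.default_rng(1)
truth = rng.integers(0, 2, 500).astype(bool)
flip = lambda p: truth ^ (rng.random(500) < p)   # classifier wrong w.p. p
A, B, C = flip(0.1), flip(0.2), flip(0.3)

combos = {
    "A alone": A,
    "A and B": A & B,
    "A or B": A | B,
    "majority(A,B,C)": (A.astype(int) + B + C) >= 2,
}

def sens_plus_spec(pred):
    return pred[truth].mean() + (~pred[~truth]).mean()

best = max(combos, key=lambda name: sens_plus_spec(combos[name]))
print(best, round(float(sens_plus_spec(combos[best])), 3))
```

With only a handful of classifiers, exhaustive enumeration of Boolean rules is cheap, which is what makes a globally optimal combination tractable.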

  9. Reliability and operating margins of LWR fuels

    International Nuclear Information System (INIS)

    Strasser, A.A.; Lindquist, K.O.

    1977-01-01

    The margins to fuel thermal operating limits under normal and accident conditions are key to plant operating flexibility and affect availability and capacity factor. Fuel performance problems that do not result in clad breach can reduce these margins. However, most have been or can be solved with design changes. Regulatory changes have been major factors in eroding these margins. Various methods for regaining the margins are discussed.

  10. Classifying Coding DNA with Nucleotide Statistics

    Directory of Open Access Journals (Sweden)

    Nicolas Carels

    2009-10-01

    In this report, we compared the success rate of classification of coding sequences (CDS) vs. introns by Codon Structure Factor (CSF) and by a method that we called the Universal Feature Method (UFM). UFM is based on the scoring of purine bias (RRR) and stop codon frequency. We show that the success rate of CDS/intron classification by UFM is higher than by CSF. UFM classifies ORFs as coding or non-coding through a score based on (i) the stop codon distribution, (ii) the product of purine probabilities in the three positions of nucleotide triplets, (iii) the product of Cytosine (C), Guanine (G), and Adenine (A) probabilities in the 1st, 2nd, and 3rd positions of triplets, respectively, (iv) the probabilities of G in the 1st and 2nd positions of triplets and (v) the distance of their GC3 vs. GC2 levels to the regression line of the universal correlation. More than 80% of CDSs (true positives) of Homo sapiens (>250 bp), Drosophila melanogaster (>250 bp) and Arabidopsis thaliana (>200 bp) are successfully classified with a false positive rate lower than or equal to 5%. The method releases coding sequences in their coding strand and coding frame, which allows their automatic translation into protein sequences with 95% confidence. The method is a natural consequence of the compositional bias of nucleotides in coding sequences.
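Two of the UFM-style features, the in-frame stop-codon frequency and the product of per-position purine frequencies, can be computed directly from a candidate ORF (the `features` helper is an illustrative reading of the abstract, not the authors' full five-criterion score):

```python
# Two UFM-style features for a candidate ORF: in-frame stop-codon frequency
# and the product of purine frequencies at the three codon positions.
# (Illustrative helper; the published score combines five criteria.)
STOPS = {"TAA", "TAG", "TGA"}
PURINES = set("AG")

def features(orf):
    codons = [orf[i:i + 3] for i in range(0, len(orf) - len(orf) % 3, 3)]
    stop_freq = sum(c in STOPS for c in codons) / len(codons)
    purine_prod = 1.0
    for pos in range(3):
        purine_prod *= sum(c[pos] in PURINES for c in codons) / len(codons)
    return stop_freq, purine_prod

print(features("ATGGCAGGAAAATAG"))   # -> (0.2, 0.48)
```

A genuine coding frame tends to show few in-frame stops and a marked purine bias, which is what these two numbers capture.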

  11. A systematic comparison of supervised classifiers.

    Directory of Open Access Journals (Sweden)

    Diego Raphael Amancio

    Pattern recognition has been employed in a myriad of industrial, commercial and academic applications. Many techniques have been devised to tackle such a diversity of applications. Despite the long tradition of pattern recognition research, there is no technique that yields the best classification in all scenarios. Therefore, as many techniques as possible should be considered in high accuracy applications. Typical related works either focus on the performance of a given algorithm or compare various classification methods. On many occasions, however, researchers who are not experts in the field of machine learning have to deal with practical classification tasks without an in-depth knowledge about the underlying parameters. Actually, the adequate choice of classifiers and parameters in such practical circumstances constitutes a long-standing problem and is one of the subjects of the current paper. We carried out a performance study of nine well-known classifiers implemented in the Weka framework and compared the influence of the parameter configurations on the accuracy. The default configuration of parameters in Weka was found to provide near optimal performance for most cases, not including methods such as the support vector machine (SVM). In addition, the k-nearest neighbor method frequently achieved the best accuracy. In certain conditions, it was possible to improve the quality of SVM by more than 20% with respect to its default parameter configuration.
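The comparison protocol, train several classifiers under a fixed split and compare accuracy, can be mimicked on synthetic data with a hand-rolled k-NN and a nearest-centroid baseline (the study itself compared nine Weka classifiers; everything below is an illustrative stand-in):

```python
import numpy as np

# Synthetic head-to-head: hand-rolled 5-NN vs. a nearest-centroid baseline
# on a separable two-class problem (a stand-in for the paper's Weka study).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y = np.repeat([0, 1], 100)

def knn_predict(Xtr, ytr, Xte, k=5):
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    nearest = np.argsort(d, axis=1)[:, :k]
    return (ytr[nearest].mean(axis=1) > 0.5).astype(int)

def centroid_predict(Xtr, ytr, Xte):
    c = np.array([Xtr[ytr == g].mean(axis=0) for g in (0, 1)])
    return ((Xte[:, None, :] - c) ** 2).sum(-1).argmin(axis=1)

tr, te = np.arange(0, 200, 2), np.arange(1, 200, 2)   # even/odd split
acc_knn = np.mean(knn_predict(X[tr], y[tr], X[te]) == y[te])
acc_cen = np.mean(centroid_predict(X[tr], y[tr], X[te]) == y[te])
print(f"5-NN: {acc_knn:.2f}  nearest-centroid: {acc_cen:.2f}")
```

On a well-separated problem both defaults do well, echoing the paper's finding that default configurations are often near optimal.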

  12. STATISTICAL TOOLS FOR CLASSIFYING GALAXY GROUP DYNAMICS

    International Nuclear Information System (INIS)

    Hou, Annie; Parker, Laura C.; Harris, William E.; Wilman, David J.

    2009-01-01

    The dynamical state of galaxy groups at intermediate redshifts can provide information about the growth of structure in the universe. We examine three goodness-of-fit tests, the Anderson-Darling (A-D), Kolmogorov, and χ² tests, in order to determine which statistical tool is best able to distinguish between groups that are relaxed and those that are dynamically complex. We perform Monte Carlo simulations of these three tests and show that the χ² test is profoundly unreliable for groups with fewer than 30 members. Power studies of the Kolmogorov and A-D tests are conducted to test their robustness for various sample sizes. We then apply these tests to a sample of the second Canadian Network for Observational Cosmology Redshift Survey (CNOC2) galaxy groups and find that the A-D test is far more reliable and powerful at detecting real departures from an underlying Gaussian distribution than the more commonly used χ² and Kolmogorov tests. We use this statistic to classify a sample of the CNOC2 groups and find that 34 of 106 groups are inconsistent with an underlying Gaussian velocity distribution, and thus do not appear relaxed. In addition, we compute velocity dispersion profiles (VDPs) for all groups with more than 20 members and compare the overall features of the Gaussian and non-Gaussian groups, finding that the VDPs of the non-Gaussian groups are distinct from those classified as Gaussian.
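The Anderson-Darling statistic used above can be computed directly from its definition; a sketch against a normal null, standardizing with the sample mean and variance as is common when the parameters are estimated:

```python
import numpy as np
from math import erf, sqrt

# Anderson-Darling A^2 against a normal null, from the definition
# A^2 = -n - (1/n) * sum_i (2i-1) [ln F(x_(i)) + ln(1 - F(x_(n+1-i)))].
def anderson_darling(x):
    x = np.sort((x - x.mean()) / x.std(ddof=1))
    n = len(x)
    cdf = 0.5 * (1.0 + np.array([erf(v / sqrt(2.0)) for v in x]))
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(cdf) + np.log(1.0 - cdf[::-1])))

rng = np.random.default_rng(0)
gauss = rng.normal(size=500)
bimodal = np.concatenate([rng.normal(-3, 1, 250), rng.normal(3, 1, 250)])
print(anderson_darling(gauss), anderson_darling(bimodal))   # small vs. large
```

Because the weighting emphasizes the distribution tails, a dynamically complex (here bimodal) velocity sample produces a far larger statistic than a relaxed Gaussian one.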

  13. Silenced, Silence, Silent: Motherhood in the Margins

    Science.gov (United States)

    Carpenter, Lorelei; Austin, Helena

    2007-01-01

    This project explores the experiences of women who mother children with ADHD. The authors use the metaphor of the text and the margin. The text is the "motherhood myth" that describes a particular sort of "good" mothering. The margin is the space beyond that text. This marginal space is inhabited by some or all of the mothers they spoke with, some…

  14. 12 CFR 220.4 - Margin account.

    Science.gov (United States)

    2010-01-01

    ... Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM CREDIT BY... securities. The required margin on a net long or net short commitment in a when-issued security is the margin...) Interest charged on credit maintained in the margin account; (ii) Premiums on securities borrowed in...

  15. A Summary of Results From the IPOD Transects Across the Japan, Mariana, and Middle-America Convergent Margins

    OpenAIRE

    Von Huene, R; Uyeda, S

    1981-01-01

    Investigations of convergent margins along the IPOD transects support the concept of ocean floor spreading in back-arc basins and the concept of tectonically accreted sediment at the front of convergent margins. However, not all convergent margins have large accreted complexes, and other less frequently used concepts are required in the interpretations of these convergent margins. If the present rates of plate convergence are accepted, then much sediment that entered the trenches studied is p...

  16. 36 CFR 1256.46 - National security-classified information.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false National security-classified... Restrictions § 1256.46 National security-classified information. In accordance with 5 U.S.C. 552(b)(1), NARA... properly classified under the provisions of the pertinent Executive Order on Classified National Security...

  17. Safety margins associated with containment structures under dynamic loading

    International Nuclear Information System (INIS)

    Lu, S.C.

    1978-01-01

    A technical basis for assessing the true safety margins of containment structures involved with MARK I boiling water reactor reevaluation activities is presented. It is based on the results of a plane-strain, large displacement, elasto-plastic, finite-element analysis of a thin cylindrical shell subjected to external and internal pressure pulses. An analytical procedure is presented for estimating the ultimate load capacity of the thin shell structure, and subsequently, for quantifying the design margins of safety for the type of loads under consideration. For defining failure of structures, a finite strain failure criterion is derived that accounts for multiaxiality effects

  18. Marginal Shape Deep Learning: Applications to Pediatric Lung Field Segmentation.

    Science.gov (United States)

    Mansoor, Awais; Cerrolaza, Juan J; Perez, Geovanny; Biggs, Elijah; Nino, Gustavo; Linguraru, Marius George

    2017-02-11

    Representation learning through deep learning (DL) architectures has shown tremendous potential for identification, localization, and texture classification in various medical imaging modalities. However, DL applications to segmentation of objects, especially deformable objects, are rather limited and mostly restricted to pixel classification. In this work, we propose marginal shape deep learning (MaShDL), a framework that extends the application of DL to deformable shape segmentation by using deep classifiers to estimate the shape parameters. MaShDL combines the strength of statistical shape models with the automated feature learning architecture of DL. Unlike the iterative shape parameter estimation approach of classical shape models, which often leads to a local minimum, the proposed framework is robust to local minima in optimization and to illumination changes. Furthermore, since the direct application of a DL framework to a multi-parameter estimation problem results in very high complexity, our framework provides an excellent run-time performance solution by independently learning shape parameter classifiers in marginal eigenspaces in decreasing order of variation. We evaluated MaShDL for segmenting the lung field from 314 normal and abnormal pediatric chest radiographs and obtained a mean Dice similarity coefficient of 0.927 using only the four highest modes of variation (compared to 0.888 with the classical ASM (p-value = 0.01) using the same configuration). To the best of our knowledge, this is the first demonstration of using a DL framework for parametrized shape learning for the delineation of deformable objects.

  19. Two channel EEG thought pattern classifier.

    Science.gov (United States)

    Craig, D A; Nguyen, H T; Burchey, H A

    2006-01-01

    This paper presents a real-time electro-encephalogram (EEG) identification system with the goal of achieving hands-free control. With two EEG electrodes placed on the scalp of the user, EEG signals are amplified and digitised directly using a ProComp+ encoder and transferred to the host computer through the RS232 interface. Using a real-time multilayer neural network, the actual classification for the control of a powered wheelchair has a very fast response. It can detect changes in the user's thought pattern in 1 second. Using only two EEG electrodes, at positions O1 and C4, the system can classify three mental commands (forward, left and right) with an accuracy of more than 79%.

  20. Classifying Drivers' Cognitive Load Using EEG Signals.

    Science.gov (United States)

    Barua, Shaibal; Ahmed, Mobyen Uddin; Begum, Shahina

    2017-01-01

    A growing traffic safety issue is the effect of cognitively loading activities on driving performance. To monitor drivers' mental state, understanding cognitive load is important since, while driving, performing cognitively loading secondary tasks, for example talking on the phone, can affect performance in the primary task, i.e. driving. Electroencephalography (EEG) is one of the reliable measures of cognitive load that can detect changes in instantaneous load and the effect of a cognitively loading secondary task. In this driving simulator study, a 1-back task is carried out while the driver performs three different simulated driving scenarios. This paper presents an EEG-based approach to classify a driver's level of cognitive load using Case-Based Reasoning (CBR). The results show that for each individual scenario, as well as using data combined from the different scenarios, the CBR-based system achieved over 70% classification accuracy.

  1. Hybrid Neuro-Fuzzy Classifier Based On Nefclass Model

    Directory of Open Access Journals (Sweden)

    Bogdan Gliwa

    2011-01-01

    The paper presents a hybrid neuro-fuzzy classifier, based on the NEFCLASS model, which was modified. The presented classifier was compared to popular classifiers – neural networks and k-nearest neighbours. Efficiency of the modifications in the classifier was compared with the learning methods used in the original NEFCLASS model. Accuracy of the classifier was tested using 3 datasets from the UCI Machine Learning Repository: iris, wine and breast cancer wisconsin. Moreover, the influence of ensemble classification methods on classification accuracy was presented.

  2. Classifying Transition Behaviour in Postural Activity Monitoring

    Directory of Open Access Journals (Sweden)

    James BRUSEY

    2009-10-01

    A few accelerometers positioned on different parts of the body can be used to accurately classify steady state behaviour, such as walking, running, or sitting. Such systems are usually built using supervised learning approaches. Transitions between postures are, however, difficult to deal with using the posture classification systems proposed to date, since there is no label set for intermediary postures and the exact point at which the transition occurs can sometimes be hard to pinpoint. The usual bypass when using supervised learning to train such systems is to discard a section of the dataset around each transition. This leads to poorer classification performance when the systems are deployed out of the laboratory and used on-line, particularly if the regimes monitored involve fast-paced activity changes. Time-based filtering that takes advantage of sequential patterns is a potential mechanism to improve posture classification accuracy in such real-life applications. Also, such filtering should reduce the number of event messages needed to be sent across a wireless network to track posture remotely, hence extending the system's life. To support time-based filtering, understanding transitions, which are the major event generators in a classification system, is key. This work examines three approaches to post-process the output of a posture classifier using time-based filtering: a naïve voting scheme, an exponentially weighted voting scheme, and a Bayes filter. The best performance is obtained from the exponentially weighted voting scheme, although it is suspected that a more sophisticated treatment of the Bayes filter might yield better results.
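The exponentially weighted voting scheme can be sketched as a per-class score that decays by a constant factor at every step, so isolated misclassifications around a transition are outvoted by recent history (the decay value 0.7 and the toy label stream are assumptions, not the paper's settings):

```python
import numpy as np

# Exponentially weighted voting filter: each new classifier output updates
# a per-class score that decays by alpha, smoothing spurious flips around
# posture transitions.
def ew_filter(labels, n_classes, alpha=0.7):
    scores = np.zeros(n_classes)
    out = []
    for lab in labels:
        scores *= alpha                 # decay old evidence
        scores[lab] += 1.0              # add the current vote
        out.append(int(scores.argmax()))
    return out

raw = [0, 0, 0, 1, 0, 1, 1, 1, 1]      # noisy output around a sit->stand transition
print(ew_filter(raw, 2))               # -> [0, 0, 0, 0, 0, 1, 1, 1, 1]
```

The single spurious flip is absorbed, while the genuine transition is tracked with only a short lag, which is the trade-off alpha controls.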

  3. Safety margins in deterministic safety analysis

    International Nuclear Information System (INIS)

    Viktorov, A.

    2011-01-01

    The concept of safety margins has acquired certain prominence in the attempts to demonstrate quantitatively the level of the nuclear power plant safety by means of deterministic analysis, especially when considering impacts from plant ageing and discovery issues. A number of international or industry publications exist that discuss various applications and interpretations of safety margins. The objective of this presentation is to bring together and examine in some detail, from the regulatory point of view, the safety margins that relate to deterministic safety analysis. In this paper, definitions of various safety margins are presented and discussed along with the regulatory expectations for them. Interrelationships of analysis input and output parameters with corresponding limits are explored. It is shown that the overall safety margin is composed of several components each having different origins and potential uses; in particular, margins associated with analysis output parameters are contrasted with margins linked to the analysis input. While these are separate, it is possible to influence output margins through the analysis input, and analysis method. Preserving safety margins is tantamount to maintaining safety. At the same time, efficiency of operation requires optimization of safety margins taking into account various technical and regulatory considerations. For this, basic definitions and rules for safety margins must be first established. (author)

  4. Just-in-time adaptive classifiers-part II: designing the classifier.

    Science.gov (United States)

    Alippi, Cesare; Roveri, Manuel

    2008-12-01

    Aging effects, environmental changes, thermal drifts, and soft and hard faults affect physical systems by changing their nature and behavior over time. To cope with a process evolution, adaptive solutions must be envisaged to track its dynamics; in this direction, adaptive classifiers are generally designed by assuming the stationary hypothesis for the process generating the data, with very few results addressing nonstationary environments. This paper proposes a methodology based on k-nearest neighbor (k-NN) classifiers for designing adaptive classification systems able to react to changing conditions just-in-time (JIT), i.e., exactly when it is needed. k-NN classifiers have been selected for their computation-free training phase, the possibility of easily estimating the model complexity k, and the ability to keep the computational complexity of the classifier under control through suitable data reduction mechanisms. A JIT classifier requires a temporal detection of a (possible) process deviation (an aspect tackled in a companion paper) followed by an adaptive management of the knowledge base (KB) of the classifier to cope with the process change. The novelty of the proposed approach resides in the general framework supporting the real-time update of the KB of the classification system in response to novel information coming from the process, both in stationary conditions (accuracy improvement) and in nonstationary ones (process tracking), and in providing a suitable estimate of k. It is shown that the classification system grants consistency once the change targets the process generating the data in a new stationary state, as is the case in many real applications.
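The JIT idea, a k-NN whose knowledge base grows in stationary conditions and is pruned when a change is detected, can be sketched as follows (the buffer-truncation policy is a simplifying assumption, not the paper's exact KB management):

```python
import numpy as np

# JIT k-NN sketch: update() grows the knowledge base (KB) while the process
# is stationary; react_to_change() discards obsolete samples after a
# detected change so the classifier tracks the new stationary state.
class JITKNN:
    def __init__(self, k=3):
        self.k, self.X, self.y = k, [], []

    def update(self, x, label):
        self.X.append(np.asarray(x, dtype=float))
        self.y.append(label)

    def react_to_change(self, keep_last):
        self.X, self.y = self.X[-keep_last:], self.y[-keep_last:]

    def predict(self, x):
        d = [float(np.sum((np.asarray(x, dtype=float) - xi) ** 2)) for xi in self.X]
        nearest = np.argsort(d)[:self.k]
        votes = [self.y[i] for i in nearest]
        return max(set(votes), key=votes.count)

clf = JITKNN(k=3)
for i in range(5):
    clf.update([float(i)], 0)          # stationary regime: class 0
print(clf.predict([0.0]))              # -> 0
for i in range(5):
    clf.update([float(i)], 1)          # the process has drifted to class 1
clf.react_to_change(keep_last=5)       # prune the KB after change detection
print(clf.predict([0.0]))              # -> 1
```

Without the pruning step, stale pre-change samples would keep outvoting the new regime; the KB management is what lets the same lazy learner serve both the stationary and nonstationary cases.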

  5. Assessment of bioenergy potential on marginal land in China

    Energy Technology Data Exchange (ETDEWEB)

    Zhuang, Dafang; Jiang, Dong; Liu, Lei; Huang, Yaohuan [Data Center for Resources and Environmental Sciences, Institute of Geographical Sciences and Natural Resources Research, Chinese Academy of Sciences, 11A Datun Road, Chaoyang District, Beijing 100101 (China)

    2011-02-15

    Bioenergy developed from energy plants will play an increasingly important role in future energy supply, and much attention has been paid to energy plants in recent years. As China has fairly limited cultivated land resources, bioenergy development may rely mainly on the exploitation of marginal land. This study focused on the assessment of marginal land resources and bio-fuel potential in China using newly acquired data and Geographic Information System (GIS) techniques. A multi-factor analysis method was adopted to identify marginal lands suitable for bioenergy development in China, employing data on the eco-environmental requirements and natural habitats of several main types of energy plants. A combined planting zonation strategy was proposed, targeted at five species of energy plants: Helianthus tuberosus L., Pistacia chinensis, Jatropha curcas L., cassava and Vernicia fordii. The results indicated that the total area of marginal land exploitable for large-scale development of energy plants was about 43.75 million ha. If 10% of this marginal land were fully utilized for growing energy plants, bio-fuel production would be 13.39 million tons. (author)

  6. Analysis and minimization of overtraining effect in rule-based classifiers for computer-aided diagnosis

    International Nuclear Information System (INIS)

    Li Qiang; Doi Kunio

    2006-01-01

    Computer-aided diagnostic (CAD) schemes have been developed to assist radiologists in detecting various lesions in medical images. In CAD schemes, classifiers play a key role in achieving a high lesion detection rate and a low false-positive rate. Although many popular classifiers such as linear discriminant analysis and artificial neural networks have been employed in CAD schemes for the reduction of false positives, the rule-based classifier has probably been the simplest and most frequently used since the early days of CAD development. However, existing rule-based classifiers have major disadvantages that significantly reduce their practicality and credibility: manual design, poor reproducibility, poor evaluation methods such as resubstitution, and a large overtraining effect. An automated rule-based classifier with a minimized overtraining effect can overcome or significantly reduce these disadvantages. In this study, we developed an 'optimal' method for the selection of cutoff thresholds and a fully automated rule-based classifier. Experimental results obtained with Monte Carlo simulation and a real lung nodule CT data set demonstrated that the automated threshold selection method can completely eliminate the overtraining effect in the cutoff threshold selection procedure, and thus can minimize the overall overtraining effect in the constructed rule-based classifier. We believe that this threshold selection method is very useful in the construction of automated rule-based classifiers with a minimized overtraining effect.
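
    The rule-based classification scheme described above amounts to a conjunction of cutoff thresholds on candidate features. A minimal sketch, with hypothetical feature names and hand-picked bounds (the paper's contribution is choosing such cutoffs automatically rather than by hand):

```python
def rule_based_classify(features, thresholds):
    """Rule-based classifier as used in CAD false-positive reduction:
    a candidate is retained (classified as a lesion) only if every
    feature satisfies its cutoff interval.  `thresholds` maps a feature
    name to (lo, hi) bounds; names and bounds here are illustrative."""
    return all(lo <= features[name] <= hi
               for name, (lo, hi) in thresholds.items())

# Hypothetical cutoffs chosen by hand, not by the paper's automated
# 'optimal' threshold selection method.
rules = {"size_mm": (3.0, 30.0), "circularity": (0.6, 1.0)}
```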

  7. A distributed approach for optimizing cascaded classifier topologies in real-time stream mining systems.

    Science.gov (United States)

    Foo, Brian; van der Schaar, Mihaela

    2010-11-01

    In this paper, we discuss distributed optimization techniques for configuring classifiers in a real-time, informationally distributed stream mining system. Due to the large volume of streaming data, stream mining systems must often cope with overload, which can lead to poor performance and intolerable processing delay for real-time applications. Furthermore, optimizing over an entire system of classifiers is a difficult task, since changing the filtering process at one classifier can impact the feature values of data arriving at classifiers further downstream, and thus both the classification performance achieved by the ensemble of classifiers and the end-to-end processing delay. To address this problem, this paper makes three main contributions: 1) based on classification and queuing theoretic models, we propose a utility metric that captures both the performance and the delay of a binary filtering classifier system; 2) we introduce a low-complexity framework for estimating the system utility by observing, estimating, and/or exchanging parameters between the inter-related classifiers deployed across the system; 3) we provide distributed algorithms to reconfigure the system, and analyze the algorithms in terms of their convergence properties, optimality, information exchange overhead, and rate of adaptation to non-stationary data sources. We provide results using different video classifier systems.
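
    Contribution 1 couples classification performance with processing delay in a single utility. The exact queuing-theoretic form is in the paper; the sketch below is only an assumed illustrative weighting of the two terms:

```python
def classifier_utility(p_detect, p_false, delay, delay_budget, w=1.0):
    """Illustrative utility for a binary filtering classifier system:
    reward correct detections, penalize false alarms, and subtract a
    penalty when end-to-end processing delay exceeds the budget.
    The paper derives its metric from classification and queuing
    models; this additive weighting is an assumption for illustration."""
    accuracy_term = p_detect - p_false
    delay_penalty = w * max(0.0, delay - delay_budget)
    return accuracy_term - delay_penalty
```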

  8. Novel maximum-margin training algorithms for supervised neural networks.

    Science.gov (United States)

    Ludwig, Oswaldo; Nunes, Urbano

    2010-06-01

    This paper proposes three novel training methods for multilayer perceptron (MLP) binary classifiers, two of them based on the backpropagation approach and a third based on information theory. Both backpropagation methods build on the maximum-margin (MM) principle. The first one, based on the gradient descent with adaptive learning rate algorithm (GDX) and named maximum-margin GDX (MMGDX), directly increases the margin of the MLP output-layer hyperplane. The proposed method jointly optimizes both MLP layers in a single process, backpropagating the gradient of an MM-based objective function through the output and hidden layers, in order to create a hidden-layer space that enables a higher margin for the output-layer hyperplane, avoiding the testing of many arbitrary kernels, as occurs in support vector machine (SVM) training. The proposed MM-based objective function aims to stretch the margin to its limit. An objective function based on the Lp-norm is also proposed in order to take into account the idea of support vectors, while avoiding the complexity of solving the constrained optimization problem usually involved in SVM training. In fact, all the training methods proposed in this paper have time and space complexity O(N), while usual SVM training methods have time complexity O(N^3) and space complexity O(N^2), where N is the training-data-set size. The second approach, named minimization of interclass interference (MICI), has an objective function inspired by Fisher discriminant analysis. This algorithm aims to create an MLP hidden output in which the patterns have a desirable statistical distribution. In both training methods, the maximum area under the ROC curve (AUC) is applied as the stopping criterion. The third approach offers a robust training framework able to take the best of each proposed training method.
The main idea is to compose a neural model by using neurons extracted from three other neural networks, each one previously trained by
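
    The quantity MMGDX drives upward, the margin of the output-layer hyperplane over the hidden-layer representations, can be written down directly. The sketch below shows only that objective, not the paper's joint backpropagation through both layers:

```python
import numpy as np

def output_margin(H, y, w, b):
    """Geometric margins of the output-layer hyperplane (w, b) over
    hidden-layer representations H; labels y are +/-1.  MMGDX-style
    training backpropagates the gradient of a margin-based objective;
    here we only exhibit the quantity being maximized."""
    return y * (H @ w + b) / np.linalg.norm(w)

def mm_loss(H, y, w, b):
    """Illustrative maximum-margin objective: the negative of the
    smallest margin, to be driven down by gradient descent."""
    return -output_margin(H, y, w, b).min()
```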

  9. PRE-RIFT COMPRESSIONAL STRUCTURES AS A CONTROL ON PASSIVE MARGIN FORMATION

    DEFF Research Database (Denmark)

    Schiffer, Christian; Petersen, Kenni Dinesen

    Passive margins are commonly separated into volcanic and non-volcanic modes, each with a distinct formation mechanism and structure. Both form the transition from continental to oceanic crust. Large amounts of geophysical data at passive margins show that the tapering continental crust is often u...

  10. Asymmetric rifting, breakup and magmatism across conjugate margin pairs: insights from Newfoundland to Ireland

    Science.gov (United States)

    Peace, Alexander L.; Welford, J. Kim; Foulger, Gillian R.; McCaffrey, Ken J. W.

    2017-04-01

    Continental extension, subsequent rifting and eventual breakup result in the development of passive margins with transitional crust between extended continental crust and newly created oceanic crust. Globally, passive margins are typically classified as either magma-rich or magma-poor. Despite this simple classification, magma-poor margins like the West Orphan Basin, offshore Newfoundland, do exhibit some evidence of localized magmatism, as magmatism to some extent invariably accompanies all continental breakup. For example, on the Newfoundland margin, a small volcanic province has been interpreted near the termination of the Charlie Gibbs Fracture Zone, whereas on the conjugate Irish margin within the Rockall Basin, magmatism appears to be more widespread and has been documented both in the north and in the south. The broader region over which volcanism has been identified on the Irish margin is suggestive of magmatic asymmetry across this conjugate margin pair and this may have direct implications for the mechanisms governing the nature of rifting and breakup. Possible causes of the magmatic asymmetry include asymmetric rifting (simple shear), post-breakup thermal anomalies in the mantle, or pre-existing compositional zones in the crust that predispose one of the margins to more melting than its conjugate. A greater understanding of the mechanisms leading to conjugate margin asymmetry will enhance our fundamental understanding of rifting processes and will also reduce hydrocarbon exploration risk by better characterizing the structural and thermal evolution of hydrocarbon bearing basins on magma-poor margins where evidence of localized magmatism exists. Here, the latest results of a conjugate margin study of the Newfoundland-Ireland pair utilizing seismic interpretation integrated with other geological and geophysical datasets are presented. 
Our analysis has begun to reveal the nature and timing of rift-related magmatism and the degree to which magmatic asymmetry

  11. Controlling marginally detached divertor plasmas

    Science.gov (United States)

    Eldon, D.; Kolemen, E.; Barton, J. L.; Briesemeister, A. R.; Humphreys, D. A.; Leonard, A. W.; Maingi, R.; Makowski, M. A.; McLean, A. G.; Moser, A. L.; Stangeby, P. C.

    2017-06-01

    A new control system at DIII-D has stabilized the inter-ELM detached divertor plasma state for H-mode in close proximity to the threshold for reattachment, thus demonstrating the ability to maintain detachment with minimal gas puffing. When the same control system was instead ordered to hold the plasma at the threshold (here defined as Te = 5 eV near the divertor target plate), the resulting Te profiles separated into two groups, one consistent with marginal detachment and the other with marginal attachment. The plasma dithers between the attached and detached states when the control system attempts to hold at the threshold. The control system is upgraded from the one described in Kolemen et al (2015 J. Nucl. Mater. 463 1186) and handles ELMing plasmas by using real-time Dα measurements to remove during-ELM slices from real-time Te measurements derived from divertor Thomson scattering. The difference between measured and requested inter-ELM Te is passed to a PID (proportional-integral-derivative) controller to determine gas puff commands. While some degree of detachment is essential for the health of ITER's divertor, more deeply detached plasmas have greater radiative losses and, at the extreme, confinement degradation, making it desirable to limit detachment to the minimum level needed to protect the target plate (Kolemen et al 2015 J. Nucl. Mater. 463 1186). However, the observed bifurcation in plasma conditions at the outer strike point with the ion B ×

  12. Classifying magnetic resonance image modalities with convolutional neural networks

    Science.gov (United States)

    Remedios, Samuel; Pham, Dzung L.; Butman, John A.; Roy, Snehashis

    2018-02-01

    Magnetic Resonance (MR) imaging allows the acquisition of images with different contrast properties depending on the acquisition protocol and the magnetic properties of tissues. Many MR brain image processing techniques, such as tissue segmentation, require multiple MR contrasts as inputs, and each contrast is treated differently. It is therefore advantageous to automate the identification of image contrasts for various purposes, such as facilitating image processing pipelines and managing and maintaining large databases via content-based image retrieval (CBIR). Most automated CBIR techniques follow a two-step process: extracting features from the data and classifying the image based on those features. We present a novel 3D deep convolutional neural network (CNN)-based method for MR image contrast classification. The proposed CNN automatically identifies the MR contrast of an input brain image volume. Specifically, we explored three classification problems: (1) identifying T1-weighted (T1-w), T2-weighted (T2-w), and fluid-attenuated inversion recovery (FLAIR) contrasts; (2) identifying pre- vs. post-contrast T1; (3) identifying pre- vs. post-contrast FLAIR. A total of 3418 image volumes acquired from multiple sites and multiple scanners were used. To evaluate each task, the proposed model was trained on 2137 images and tested on the remaining 1281 images. Results showed that image volumes were correctly classified with 97.57% accuracy.

  13. Classifying Adverse Events in the Dental Office.

    Science.gov (United States)

    Kalenderian, Elsbeth; Obadan-Udoh, Enihomo; Maramaldi, Peter; Etolue, Jini; Yansane, Alfa; Stewart, Denice; White, Joel; Vaderhobli, Ram; Kent, Karla; Hebballi, Nutan B; Delattre, Veronique; Kahn, Maria; Tokede, Oluwabunmi; Ramoni, Rachel B; Walji, Muhammad F

    2017-06-30

    Dentists strive to provide safe and effective oral healthcare. However, some patients may encounter an adverse event (AE), defined as "unnecessary harm due to dental treatment". In this research, we propose and evaluate two systems for categorizing the type and severity of AEs encountered at the dental office. Several existing medical AE type and severity classification systems were reviewed and adapted for dentistry. Using data collected in previous work, two initial dental AE type and severity classification systems were developed. Eight independent reviewers performed focused chart reviews, and the AEs identified were used to evaluate and modify these newly developed classifications. A total of 958 charts were independently reviewed. Among the reviewed charts, 118 prospective AEs were found, and 101 (85.6%) were verified as AEs through a consensus process. At the end of the study, a final AE type classification comprising 12 categories and an AE severity classification comprising 7 categories emerged. Pain and infection were the most common AE types, representing 73% of the cases reviewed (56% and 17%, respectively), and 88% were found to cause temporary, moderate to severe harm to the patient. Adverse events found during the chart review process were successfully classified using the novel dental AE type and severity classifications. Understanding the types of AEs and their severity is an important step if we are to learn from and prevent patient harm in the dental office.

  14. Is it important to classify ischaemic stroke?

    LENUS (Irish Health Repository)

    Iqbal, M

    2012-02-01

    Thirty-five percent of all ischemic events remain classified as cryptogenic. This study was conducted to ascertain the accuracy of the diagnosis of ischaemic stroke based on the information given in the medical notes, tested by applying the clinical information to the TOAST criteria. One hundred and five patients presented with acute stroke between January and June 2007, and data were collected on 90 of them. The male to female ratio was 39:51, with an age range of 47-93 years. Sixty (67%) patients had total/partial anterior circulation stroke; 5 (5.6%) had a lacunar stroke; and in 25 (28%) the mechanism of stroke could not be identified. Four (4.4%) patients with small vessel disease were anticoagulated; 5 (5.6%) with atrial fibrillation received antiplatelet therapy; and 2 (2.2%) patients with atrial fibrillation underwent CEA. This study revealed deficiencies in the clinical assessment of patients, and treatment was not tailored to the mechanism of stroke in some patients.

  15. Stress fracture development classified by bone scintigraphy

    International Nuclear Information System (INIS)

    Zwas, S.T.; Elkanovich, R.; Frank, G.; Aharonson, Z.

    1985-01-01

    There is no consensus on classifying stress fractures (SF) appearing on bone scans. The authors present a system of classification based on grading the severity and development of bone lesions by visual inspection, according to three main scintigraphic criteria: focality and size, intensity of uptake compared to adjacent bone, and local medullary extension. Four grades of development (I-IV) were ranked, ranging from ill-defined, slightly increased cortical uptake to well-defined regions with markedly increased uptake extending transversely and bicortically. 310 male subjects aged 19-2, suffering for several weeks from leg pains occurring during intensive physical training, underwent bone scans of the pelvis and lower extremities using Tc-99m-MDP. 76% of the scans were positive, with 354 lesions, of which 88% were in the mild (I-II) grades and 12% in the moderate (III) and severe (IV) grades. Post-treatment scans were obtained in 65 cases having 78 lesions, at 1- to 6-month intervals. Complete resolution was found after 1-2 months in 36% of the mild lesions but in only 12% of the moderate and severe ones, and after 3-6 months in 55% of the mild lesions and 15% of the severe ones. 75% of the moderate and severe lesions showed residual uptake in various stages throughout the follow-up period. Early recognition and treatment of mild SF lesions in this study prevented protracted disability and progression of the lesions, and facilitated complete healing.

  16. Extensive management of field margins enhances their potential for off-site soil erosion mitigation.

    Science.gov (United States)

    Ali, Hamada E; Reineking, Björn

    2016-03-15

    Soil erosion is a widespread problem in agricultural landscapes, particularly in regions with strong rainfall events. Vegetated field margins can mitigate negative impacts of soil erosion off-site by trapping eroded material. Here we analyse how local management affects the trapping capacity of field margins in a monsoon region of South Korea, contrasting intensively and extensively managed field margins on both steep and shallow slopes. Prior to the beginning of monsoon season, we equipped a total of 12 sites representing three replicates for each of four different types of field margins ("intensive managed flat", "intensive managed steep", "extensive managed flat" and "extensive managed steep") with Astroturf mats. The mats (n = 15/site) were placed before, within and after the field margin. Sediment was collected after each rain event until the end of the monsoon season. The effect of management and slope on sediment trapping was analysed using linear mixed effects models, using as response variable either the sediment collected within the field margin or the difference in sediment collected after and before the field margin. There was no difference in the amount of sediment reaching the different field margin types. In contrast, extensively managed field margins showed a large reduction in collected sediment before and after the field margins. This effect was pronounced in steep field margins, and increased with the size of rainfall events. We conclude that a field margin management promoting a dense vegetation cover is a key to mitigating negative off-site effects of soil erosion in monsoon regions, particularly in field margins with steep slopes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Marginal cost application in the power industry

    International Nuclear Information System (INIS)

    Twardy, L.; Rusak, H.

    1994-01-01

    Two kinds of marginal costs, short-run and long-run, are defined. The former apply in conditions where an increase in load is accompanied neither by an increase in transmission capacity nor in installed capacity, while the latter assume new investments to expand the power system. The long-run marginal costs can be used to forecast optimized development of the system. They contain two main components: the marginal cost of capacity and the marginal cost of energy. When the long-run marginal costs are calculated, each component is considered for particular voltage levels, seasons of the year, and hours of the day - selected depending on the system reliability factor as well as on its load level. In market economy countries, the long-run marginal costs can be used for setting up electric energy tariffs. (author). 7 refs, 11 figs
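
    The two-component structure described above (marginal cost of capacity plus marginal cost of energy, differentiated by voltage level and time period) can be illustrated with a toy tariff table. All figures, units and category names here are invented for illustration:

```python
def long_run_marginal_cost(capacity_cost, energy_cost):
    """Long-run marginal cost as the sum of its two components:
    the marginal cost of capacity and the marginal cost of energy,
    each already allocated to a given voltage level, season and hour.
    Figures are illustrative, in arbitrary currency units per kWh."""
    return capacity_cost + energy_cost

# Hypothetical tariff entries per (voltage level, period):
tariff = {
    ("LV", "winter-peak"): long_run_marginal_cost(0.08, 0.05),
    ("HV", "summer-offpeak"): long_run_marginal_cost(0.02, 0.03),
}
```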

  18. A robust dataset-agnostic heart disease classifier from Phonocardiogram.

    Science.gov (United States)

    Banerjee, Rohan; Dutta Choudhury, Anirban; Deshpande, Parijat; Bhattacharya, Sakyajit; Pal, Arpan; Mandana, K M

    2017-07-01

    Automatic classification of normal and abnormal heart sounds is a popular area of research. However, building a robust algorithm unaffected by signal quality and patient demographics is a challenge. In this paper we have analysed a wide range of Phonocardiogram (PCG) features in the time and frequency domains, along with morphological and statistical features, to construct a robust and discriminative feature set for dataset-agnostic classification of normal and cardiac patients. The large and open-access database made available in the PhysioNet 2016 challenge was used for feature selection, internal validation and creation of training models. A second dataset of 41 PCG segments, collected using our in-house smartphone-based digital stethoscope at an Indian hospital, was used for performance evaluation. Our proposed methodology yielded sensitivity and specificity scores of 0.76 and 0.75 respectively on the test dataset in classifying cardiovascular diseases. The methodology also outperformed three popular prior-art approaches when applied to the same dataset.

  19. City and sea margins. Porto’s Marginal as scale and measure of new spaces

    Directory of Open Access Journals (Sweden)

    Giuseppe Parità

    2014-06-01

    The city has always confronted its own end at the beginning of the water system. Among the different kinds of margin areas, those that border cities on their watersides are particularly interesting. These liminal territories are rich in variety and difference, and are made up of several elements with different morphologies that should be carefully read and interpreted: the need to re-think the morphological elements that mark an urban edge leads to the identification of several shapes and forms of water borderlands. Borders, limits, boundaries, edges, margin areas - usually considered obstacles to the construction of the city - become possible new "design materials" for building the ambiguous distance between city and sea. The article focuses on the case study of Porto's Marginal, which illustrates the many ways a city can inhabit its water edges. On a large scale, it is configured as a strip of 15 kilometers of public space. Within this continuity, the varying distance between city and water prompts reflection on the different types of relationships (and therefore projects) between the end of one side and the beginning of another. For Porto, these are not only urban parts but also different geographical parts (sea, rivers, topography) that distance puts in relation through the design sometimes of a line, sometimes of a border, sometimes of a surface. The analysis of these heterogeneous but continuous projects therefore focuses on the several techniques of urban composition used to build contemporary public spaces. On the one hand they give form to a continuous "public figure"; on the other, each project can be considered part of an "atlas" of liminal places, giving form to public spaces.

  20. 41 CFR 105-62.102 - Authority to originally classify.

    Science.gov (United States)

    2010-07-01

    ... originally classify. (a) Top secret, secret, and confidential. The authority to originally classify information as Top Secret, Secret, or Confidential may be exercised only by the Administrator and is delegable...

  1. Naive Bayesian classifiers for multinomial features: a theoretical analysis

    CSIR Research Space (South Africa)

    Van Dyk, E

    2007-11-01

    The authors investigate the use of naive Bayesian classifiers for multinomial feature spaces and derive error estimates for these classifiers. The error analysis is done by developing a mathematical model to estimate the probability density...
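
    A minimal multinomial naive Bayes classifier of the kind analysed in this record: per-class token counts with Laplace smoothing, under the class-conditional independence assumption that the error analysis targets. A sketch with toy data, not the authors' model:

```python
import math
from collections import Counter

def train_multinomial_nb(docs, labels, vocab, alpha=1.0):
    """Train multinomial naive Bayes: class priors from label frequency,
    per-class token log-likelihoods with Laplace (add-alpha) smoothing.
    Each document is a list of tokens drawn from `vocab`."""
    classes = set(labels)
    prior = {c: labels.count(c) / len(labels) for c in classes}
    counts = {c: Counter() for c in classes}
    for doc, c in zip(docs, labels):
        counts[c].update(t for t in doc if t in vocab)
    loglik = {}
    for c in classes:
        total = sum(counts[c].values())
        loglik[c] = {t: math.log((counts[c][t] + alpha) /
                                 (total + alpha * len(vocab)))
                     for t in vocab}
    return prior, loglik

def predict_nb(doc, prior, loglik):
    """Pick the class maximizing log prior + sum of token log-likelihoods
    (the naive independence assumption makes the sum valid)."""
    def score(c):
        return math.log(prior[c]) + sum(loglik[c][t] for t in doc if t in loglik[c])
    return max(prior, key=score)
```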

  2. Ensemble of classifiers based network intrusion detection system performance bound

    CSIR Research Space (South Africa)

    Mkuzangwe, Nenekazi NP

    2017-11-01

    Full Text Available This paper provides a performance bound of a network intrusion detection system (NIDS) that uses an ensemble of classifiers. Currently researchers rely on implementing the ensemble of classifiers based NIDS before they can determine the performance...

  3. Aspects of marginal expenditures in energy sector

    International Nuclear Information System (INIS)

    Stojchev, D.; Kynev, K.

    1994-01-01

    Technical and economic problems of the marginal analysis methodology, its application procedure in the energy sector, and the determination of marginal expenditures are outlined. A comparative characterization of its application is given for different periods of time. The differences in the calculation of marginal expenditures and prices are discussed. Operational costs, investments and inflation are analyzed. The mechanism for applying this approach over different planning horizons is outlined. The roles of the change of costs over time, the time unit, the volume, the scope of application, etc. are determined. The areas of transition from one form of marginal expenditures to another are shown. 4 refs. (orig.)

  4. Assessment of seismic margin calculation methods

    International Nuclear Information System (INIS)

    Kennedy, R.P.; Murray, R.C.; Ravindra, M.K.; Reed, J.W.; Stevenson, J.D.

    1989-03-01

    Seismic margin review of nuclear power plants requires that the High Confidence of Low Probability of Failure (HCLPF) capacity be calculated for certain components. The candidate methods for calculating the HCLPF capacity as recommended by the Expert Panel on Quantification of Seismic Margins are the Conservative Deterministic Failure Margin (CDFM) method and the Fragility Analysis (FA) method. The present study evaluated these two methods using some representative components in order to provide further guidance in conducting seismic margin reviews. It is concluded that either of the two methods could be used for calculating HCLPF capacities. 21 refs., 9 figs., 6 tabs

  5. Regional Marginal Abatement Cost Curves for NOx

    Data.gov (United States)

    U.S. Environmental Protection Agency — Data underlying the figures included in the manuscript "Marginal abatement cost curve for NOx incorporating controls, renewable electricity, energy efficiency and...

  6. [Resection margins in conservative breast cancer surgery].

    Science.gov (United States)

    Medina Fernández, Francisco Javier; Ayllón Terán, María Dolores; Lombardo Galera, María Sagrario; Rioja Torres, Pilar; Bascuñana Estudillo, Guillermo; Rufián Peña, Sebastián

    2013-01-01

    Conservative breast cancer surgery is facing a new problem: the potential tumour involvement of resection margins. This eventuality has been closely and negatively associated with disease-free survival. Various factors may influence the likelihood of margins being affected, mostly related to the characteristics of the tumour, patient or surgical technique. In the last decade, many studies have attempted to find predictive factors for margin involvement. However, it is currently the new techniques used in the study of margins and tumour localisation that are significantly reducing reoperations in conservative breast cancer surgery. Copyright © 2012 AEC. Published by Elsevier Espana. All rights reserved.

  7. Classifying and evaluating architecture design methods

    NARCIS (Netherlands)

    Aksit, Mehmet; Tekinerdogan, B.

    1999-01-01

    The concept of software architecture has gained a wide popularity and is generally considered to play a fundamental role in coping with the inherent difficulties of the development of large-scale and complex software systems. This document first gives a definition of architectures. Second, a

  8. Classifying and Evaluating Architecture Design Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Aksit, Mehmet; Aksit, Mehmet

    2002-01-01

    The concept of software architecture has gained a wide popularity and is generally considered to play a fundamental role in coping with the inherent difficulties of the development of large-scale and complex software systems. This chapter first gives a definition of architecture. Second, a

  9. A deep learning method for classifying mammographic breast density categories.

    Science.gov (United States)

    Mohamed, Aly A; Berg, Wendie A; Peng, Hong; Luo, Yahong; Jankowitz, Rachel C; Wu, Shandong

    2018-01-01

    Mammographic breast density is an established risk marker for breast cancer and is visually assessed by radiologists in routine mammogram image reading, using four qualitative Breast Imaging Reporting and Data System (BI-RADS) breast density categories. It is particularly difficult for radiologists to consistently distinguish the two most common and most variably assigned BI-RADS categories, i.e., "scattered density" and "heterogeneously dense". The aim of this work was to investigate a deep learning-based breast density classifier to consistently distinguish these two categories, aiming at providing a potential computerized tool to assist radiologists in assigning a BI-RADS category in the current clinical workflow. In this study, we constructed a convolutional neural network (CNN)-based model coupled with a large (i.e., 22,000 images) digital mammogram imaging dataset to evaluate the classification performance between the two aforementioned breast density categories. All images were collected from a cohort of 1,427 women who underwent standard digital mammography screening from 2005 to 2016 at our institution. The ground truth of the density categories was based on standard clinical assessment made by board-certified breast imaging radiologists. The effects of direct training from scratch solely using digital mammogram images, and of transfer learning from a model pretrained on a large nonmedical imaging dataset, were evaluated for the specific task of breast density classification. In order to measure the classification performance, the CNN classifier was also tested on a refined version of the mammogram image dataset obtained by removing some potentially inaccurately labeled images. Receiver operating characteristic (ROC) curves and the area under the curve (AUC) were used to measure the accuracy of the classifier.
The AUC was 0.9421 when the CNN-model was trained from scratch on our own mammogram images, and the accuracy increased gradually along with an increased size of training samples
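
    AUC figures like the one reported above can be computed without plotting the ROC curve, via the equivalent rank statistic: the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one. A minimal sketch:

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the rank statistic: the fraction of
    positive/negative pairs in which the positive scores higher than
    the negative (ties count 1/2).  O(P*N) pairwise form, adequate for
    small illustrative inputs."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```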

  10. Three data partitioning strategies for building local classifiers (Chapter 14)

    NARCIS (Netherlands)

    Zliobaite, I.; Okun, O.; Valentini, G.; Re, M.

    2011-01-01

    The divide-and-conquer approach has been recognized in multiple classifier systems as a way to utilize the local expertise of individual classifiers. In this study we experimentally investigate three strategies for building local classifiers that are based on different routines of sampling data for training.

  11. Recognition of pornographic web pages by classifying texts and images.

    Science.gov (United States)

    Hu, Weiming; Wu, Ou; Chen, Zhouyao; Fu, Zhouyu; Maybank, Steve

    2007-06-01

    With the rapid development of the World Wide Web, people benefit more and more from the sharing of information. However, Web pages with obscene, harmful, or illegal content can be easily accessed. It is important to recognize such unsuitable, offensive, or pornographic Web pages. In this paper, a novel framework for recognizing pornographic Web pages is described. A C4.5 decision tree is used to divide Web pages, according to content representations, into continuous text pages, discrete text pages, and image pages. These three categories of Web pages are handled, respectively, by a continuous text classifier, a discrete text classifier, and an algorithm that fuses the results from the image classifier and the discrete text classifier. In the continuous text classifier, statistical and semantic features are used to recognize pornographic texts. In the discrete text classifier, the naive Bayes rule is used to calculate the probability that a discrete text is pornographic. In the image classifier, the object's contour-based features are extracted to recognize pornographic images. In the text and image fusion algorithm, Bayes' theorem is used to combine the recognition results from images and texts. Experimental results demonstrate that the continuous text classifier outperforms the traditional keyword-statistics-based classifier, the contour-based image classifier outperforms the traditional skin-region-based image classifier, the results obtained by our fusion algorithm outperform those of either individual classifier, and our framework can be adapted to different categories of Web pages.
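
The discrete text classifier described above applies the naive Bayes rule. A minimal sketch of that step with Laplace smoothing; the token data and class labels below are hypothetical, not from the paper:

```python
from collections import Counter
from math import log

def train_nb(docs, labels):
    """Count word frequencies per class; docs are lists of tokens."""
    counts = {c: Counter() for c in set(labels)}
    priors = Counter(labels)
    for words, c in zip(docs, labels):
        counts[c].update(words)
    vocab = {w for d in docs for w in d}
    return counts, priors, len(vocab)

def log_posterior(words, counts, priors, vsize, c):
    """Unnormalised log P(c) + sum_w log P(w|c), Laplace-smoothed."""
    total = sum(counts[c].values())
    lp = log(priors[c] / sum(priors.values()))
    for w in words:
        lp += log((counts[c][w] + 1) / (total + vsize))
    return lp

# Hypothetical training corpus and query
docs = [["cash", "win", "now"], ["meeting", "notes"],
        ["cash", "prize"], ["project", "meeting"]]
labels = ["flagged", "clean", "flagged", "clean"]
counts, priors, vsize = train_nb(docs, labels)
pred = max(set(labels),
           key=lambda c: log_posterior(["cash", "now"], counts, priors, vsize, c))
```

The prediction simply picks the class with the larger unnormalised log posterior, which is all the fusion step needs.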

  12. 32 CFR 2400.28 - Dissemination of classified information.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Dissemination of classified information. 2400.28... SECURITY PROGRAM Safeguarding § 2400.28 Dissemination of classified information. Heads of OSTP offices... originating official may prescribe specific restrictions on dissemination of classified information when...

  13. Marginal cost pricing of electricity

    International Nuclear Information System (INIS)

    Edsbaecker, G.

    1980-01-01

    The discipline is economics and the phenomenon is the power system. The purpose of this system is to produce, transmit and consume electricity in such a way that the sum of consumers' and suppliers' surplus is maximized. This is accomplished by means of marginal cost pricing. The concepts of the power system and the relations prevailing between and among them are picked out, defined and analyzed within the frames of economic theory and operations research. Methods are developed aiming at efficient prices, so that the short run function of the power system is managed in such a way that the sum of consumers' and suppliers' surplus is maximized within the framework of this system, i.e. the value of service of the power system is maximized. The task of developing such methods is accomplished subject to mixed production resources, transmission losses, periodic demand, and also when there is a lack of information concerning future demand and cost conditions. The main results are methods which take into account the conditions stated above: methods allowing not only for traditional cost minimizing but also for maximization of the value of service, including a process of reaching the optimum by gradual adaptation when demand and cost curves are not known in advance. (author)
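
As a toy numeric illustration of the pricing principle (the linear demand curve and constant marginal cost below are assumptions for illustration, not figures from the thesis), the sum of consumer and producer surplus is maximized exactly when price equals marginal cost:

```python
# Assumed inverse demand p(q) = 100 - q and constant marginal cost mc = 20.
def total_surplus(price, mc=20.0, intercept=100.0):
    q = max(intercept - price, 0.0)           # quantity demanded at this price
    consumer = 0.5 * q * (intercept - price)  # triangle under demand, above price
    producer = q * (price - mc)               # rectangle between price and cost
    return consumer + producer

candidates = [20, 30, 40, 60]
best = max(candidates, key=total_surplus)
print(best)  # → 20, i.e. the marginal-cost price
```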

  14. Robust Template Decomposition without Weight Restriction for Cellular Neural Networks Implementing Arbitrary Boolean Functions Using Support Vector Classifiers

    Directory of Open Access Journals (Sweden)

    Yih-Lon Lin

    2013-01-01

    If the given Boolean function is linearly separable, a robust uncoupled cellular neural network can be designed as a maximal margin classifier. On the other hand, if the given Boolean function is linearly separable but has a small geometric margin, or if it is not linearly separable, a popular approach is to find a sequence of robust uncoupled cellular neural networks implementing the given Boolean function. In past research using this approach, the control template parameters and thresholds were restricted to a given finite set of integers, which is unnecessary for the template design. In this study, we try to remove this restriction. Minterm- and maxterm-based decomposition algorithms utilizing soft margin and maximal margin support vector classifiers are proposed to design a sequence of robust templates implementing an arbitrary Boolean function. Several illustrative examples are simulated to demonstrate the efficiency of the proposed method by comparing our results with those produced by other decomposition methods with restricted weights.

  15. Peat classified as slowly renewable biomass fuel

    International Nuclear Information System (INIS)

    2001-01-01

    thousands of years. The report also states that peat should be classified as a biomass fuel rather than with biofuels, such as wood, or fossil fuels, such as coal. According to the report, peat is a renewable biomass fuel like the biofuels, but due to its slow accumulation it should be considered a slowly renewable fuel. The report estimates that the bonding of carbon in both virgin and forest-drained peatlands is so high that it can compensate for the emissions formed in the combustion of energy peat.

  16. Categorical marginal models: quite extensive package for the estimation of marginal models for categorical data

    OpenAIRE

    Wicher Bergsma; Andries van der Ark

    2015-01-01

    A package accompanying the book Marginal Models for Dependent, Clustered, and Longitudinal Categorical Data by Bergsma, Croon, and Hagenaars (2009). Its purpose is the fitting and testing of marginal models.

  17. Margins for geometric uncertainty around organs at risk in radiotherapy

    International Nuclear Information System (INIS)

    McKenzie, Alan; Herk, Marcel van; Mijnheer, Ben

    2002-01-01

    Background and purpose: ICRU Report 62 suggests drawing margins around organs at risk (ORs) to produce planning organ at risk volumes (PRVs) to account for geometric uncertainty in the radiotherapy treatment process. This paper proposes an algorithm for drawing such margins, compares the recommended margin widths with examples from clinical practice, and discusses the limitations of the approach. Method: The rationale for the PRV defined in this way is that, despite the geometric uncertainties, the dose calculated within the PRV by the treatment planning system can be used to represent the dose in the OR with a certain confidence level. A suitable level is one where, in the majority of cases (90%), the dose-volume histogram of the PRV will not under-represent the high-dose components in the OR. In order to provide guidelines on how to do this in clinical practice, this paper distinguishes types of OR in terms of their tolerance doses relative to the prescription dose and suggests appropriate margins for serial-structure and parallel-structure ORs. Results: In some instances of large and parallel ORs, the clinician may judge that the complication risk in omitting a margin is acceptable. Otherwise, for all types of OR, systematic treatment preparation uncertainties may be accommodated by an OR→PRV margin width of 1.3Σ, where Σ is the standard deviation of the combined systematic (treatment preparation) uncertainties. In the case of serial ORs or small parallel ORs, the effects of blurring caused by daily treatment execution errors (set-up and organ motion) should also be taken into account. Near a region of high dose, blurring tends to shift the isodoses away from the unblurred edge as shown on the treatment planning system by an amount that may be represented by 0.5σ. This margin may be used either to increase or to decrease the margin already calculated for systematic uncertainties, depending upon the size of the tolerance dose relative to the detailed planned dose.
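
The margin recipe quoted above (1.3Σ for the quadrature-combined systematic uncertainties, adjusted by 0.5σ for execution blurring) can be sketched numerically; the millimetre values below are hypothetical inputs, not figures from the paper:

```python
from math import sqrt

def prv_margin(systematic_sds, sigma_exec, shrink=False):
    """OR->PRV margin per the rule quoted above: 1.3 * Sigma, where Sigma
    combines the systematic SDs in quadrature, then plus or minus
    0.5 * sigma for daily execution blurring."""
    Sigma = sqrt(sum(s * s for s in systematic_sds))
    margin = 1.3 * Sigma
    return margin - 0.5 * sigma_exec if shrink else margin + 0.5 * sigma_exec

# e.g. 2.0 mm setup and 1.5 mm delineation systematic SDs, 3.0 mm execution SD
print(round(prv_margin([2.0, 1.5], 3.0), 2))  # → 4.75 (mm)
```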

  18. Technical specification improvement through safety margin considerations

    International Nuclear Information System (INIS)

    Howard, R.C.; Jansen, R.L.

    1986-01-01

    Westinghouse has developed an approach for utilizing safety analysis margin considerations to improve plant operability through technical specification revision. This approach relies on the identification and use of parameter interrelations and sensitivities to identify acceptable operating envelopes. This paper summarizes technical specification activities to date and presents the use of safety margin considerations as another viable method to obtain technical specification improvement

  19. The homogeneous marginal utility of income assumption

    NARCIS (Netherlands)

    Demuynck, T.

    2015-01-01

    We develop a test to verify if every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and

  20. Values and marginal preferences in international business

    NARCIS (Netherlands)

    Maseland, Robbert; van Hoorn, Andre

    2010-01-01

    In a recent paper in this journal, Maseland and van Hoorn argued that values surveys tend to conflate values and marginal preferences. This assertion has been challenged by Brewer and Venaik, who claim that the wording of most survey items does not suggest that these elicit marginal preferences.

  1. Exactly marginal deformations from exceptional generalised geometry

    Energy Technology Data Exchange (ETDEWEB)

    Ashmore, Anthony [Merton College, University of Oxford,Merton Street, Oxford, OX1 4JD (United Kingdom); Mathematical Institute, University of Oxford,Andrew Wiles Building, Woodstock Road, Oxford, OX2 6GG (United Kingdom); Gabella, Maxime [Institute for Advanced Study,Einstein Drive, Princeton, NJ 08540 (United States); Graña, Mariana [Institut de Physique Théorique, CEA/Saclay,91191 Gif-sur-Yvette (France); Petrini, Michela [Sorbonne Université, UPMC Paris 05, UMR 7589, LPTHE,75005 Paris (France); Waldram, Daniel [Department of Physics, Imperial College London,Prince Consort Road, London, SW7 2AZ (United Kingdom)

    2017-01-27

    We apply exceptional generalised geometry to the study of exactly marginal deformations of N=1 SCFTs that are dual to generic AdS{sub 5} flux backgrounds in type IIB or eleven-dimensional supergravity. In the gauge theory, marginal deformations are parametrised by the space of chiral primary operators of conformal dimension three, while exactly marginal deformations correspond to quotienting this space by the complexified global symmetry group. We show how the supergravity analysis gives a geometric interpretation of the gauge theory results. The marginal deformations arise from deformations of generalised structures that solve moment maps for the generalised diffeomorphism group and have the correct charge under the generalised Reeb vector, generating the R-symmetry. If this is the only symmetry of the background, all marginal deformations are exactly marginal. If the background possesses extra isometries, there are obstructions that come from fixed points of the moment maps. The exactly marginal deformations are then given by a further quotient by these extra isometries. Our analysis holds for any N=2 AdS{sub 5} flux background. Focussing on the particular case of type IIB Sasaki-Einstein backgrounds we recover the result that marginal deformations correspond to perturbing the solution by three-form flux at first order. In various explicit examples, we show that our expression for the three-form flux matches those in the literature and the obstruction conditions match the one-loop beta functions of the dual SCFT.

  2. Steep microbial boundstone-dominated platform margins

    NARCIS (Netherlands)

    Kenter, J.A.M.; Harris, P.M.; Della Porta, G.P.

    2005-01-01

    Seaward progradation of several kilometers has been documented mostly for leeward margin low-angle carbonate slope systems with a dominant platform top sediment source. However, steep and high-relief margins fronting deep basins can also prograde and as such are somewhat perplexing. Characteristics

  3. Grounding grammatical categories: attention bias in hand space influences grammatical congruency judgment of Chinese nominal classifiers.

    Science.gov (United States)

    Lobben, Marit; D'Ascenzo, Stefania

    2015-01-01

    Embodied cognitive theories predict that linguistic conceptual representations are grounded and continually represented in real world, sensorimotor experiences. However, there is an on-going debate on whether this also holds for abstract concepts. Grammar is the archetype of abstract knowledge, and therefore constitutes a test case against embodied theories of language representation. Former studies have largely focussed on lexical-level embodied representations. In the present study we take the grounding-by-modality idea a step further by using reaction time (RT) data from the linguistic processing of nominal classifiers in Chinese. We take advantage of an independent body of research, which shows that attention in hand space is biased. Specifically, objects near the hand consistently yield shorter RTs as a function of readiness for action on graspable objects within reaching space, and the same biased attention inhibits attentional disengagement. We predicted that this attention bias would equally apply to the graspable object classifier but not to the big object classifier. Chinese speakers (N = 22) judged grammatical congruency of classifier-noun combinations in two conditions: graspable object classifier and big object classifier. We found that RTs for the graspable object classifier were significantly faster in congruent combinations, and significantly slower in incongruent combinations, than the big object classifier. There was no main effect on grammatical violations, but rather an interaction effect of classifier type. Thus, we demonstrate here grammatical category-specific effects pertaining to the semantic content and by extension the visual and tactile modality of acquisition underlying the acquisition of these categories. We conclude that abstract grammatical categories are subjected to the same mechanisms as general cognitive and neurophysiological processes and may therefore be grounded.

  4. Action Recognition Using 3D Histograms of Texture and A Multi-Class Boosting Classifier.

    Science.gov (United States)

    Zhang, Baochang; Yang, Yun; Chen, Chen; Yang, Linlin; Han, Jungong; Shao, Ling

    2017-10-01

    Human action recognition is an important yet challenging task. This paper presents a low-cost descriptor called 3D histograms of texture (3DHoTs) to extract discriminant features from a sequence of depth maps. 3DHoTs are derived from projecting depth frames onto three orthogonal Cartesian planes, i.e., the frontal, side, and top planes, and thus compactly characterize the salient information of a specific action, on which texture features are calculated to represent the action. Besides this fast feature descriptor, a new multi-class boosting classifier (MBC) is also proposed to efficiently exploit different kinds of features in a unified framework for action classification. Compared with existing boosting frameworks, we add a new multi-class constraint to the objective function, which helps to maintain a better margin distribution by maximizing the mean of the margins while still minimizing their variance. Experiments on the MSRAction3D, MSRGesture3D, MSRActivity3D, and UTD-MHAD data sets demonstrate that the proposed system combining 3DHoTs and MBC is superior to the state of the art.
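
The margin-distribution idea behind the MBC objective (raise the mean margin while lowering its variance) can be made concrete for a linear scorer; the weights and data below are synthetic illustrations, not the paper's boosting setup:

```python
import numpy as np

def margin_stats(w, X, y):
    """Sample margins y_i * <w, x_i> for a linear scorer; the objective
    described above rewards a high mean and a low variance of these."""
    margins = y * (X @ w)
    return margins.mean(), margins.var()

# Synthetic, perfectly separated toy data (labels generated from the scorer)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w = np.array([1.0, 0.5, -0.25])
y = np.sign(X @ w)
mean_m, var_m = margin_stats(w, X, y)
```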

  5. A quantitative evaluation of seismic margin of typical sodium piping

    International Nuclear Information System (INIS)

    Morishita, Masaki

    1999-05-01

    It is widely recognized that current seismic design methods for piping involve a large safety margin. From this viewpoint, a series of seismic analyses and evaluations with various design codes were made on typical LMFBR main sodium piping systems. The actual capability against seismic loads was also estimated for the piping systems. The margins contained in the current codes were quantified based on these results, and the potential benefits and impacts on piping seismic design were assessed for a possible mitigation of the current code allowables. From the study, the following points were clarified: 1) A combination of inelastic time history analysis and true (without margin) strength capability allows seismic loads several to twenty times larger than the allowable load under the current methods. 2) The new rule of the ASME is relatively compatible with the results of the inelastic analysis evaluation. Hence, this new rule might be a goal for the mitigation of the seismic design rule. 3) With this mitigation, seismic design accommodations such as equipping systems with a large number of seismic supports may become unnecessary. (author)

  6. Testing for marginal linear effects in quantile regression

    KAUST Repository

    Wang, Huixia Judy

    2017-10-23

    The paper develops a new marginal testing procedure to detect significant predictors that are associated with the conditional quantiles of a scalar response. The idea is to fit the marginal quantile regression on each predictor one at a time, and then to base the test on the t-statistics that are associated with the most predictive predictors. A resampling method is devised to calibrate this test statistic, which has non-regular limiting behaviour due to the selection of the most predictive variables. Asymptotic validity of the procedure is established in a general quantile regression setting in which the marginal quantile regression models can be misspecified. Even though a fixed dimension is assumed to derive the asymptotic results, the test proposed is applicable and computationally feasible for large dimensional predictors. The method is more flexible than existing marginal screening test methods based on mean regression and has the added advantage of being robust against outliers in the response. The approach is illustrated by using an application to a human immunodeficiency virus drug resistance data set.

  7. Testing for marginal linear effects in quantile regression

    KAUST Repository

    Wang, Huixia Judy; McKeague, Ian W.; Qian, Min

    2017-01-01

    The paper develops a new marginal testing procedure to detect significant predictors that are associated with the conditional quantiles of a scalar response. The idea is to fit the marginal quantile regression on each predictor one at a time, and then to base the test on the t-statistics that are associated with the most predictive predictors. A resampling method is devised to calibrate this test statistic, which has non-regular limiting behaviour due to the selection of the most predictive variables. Asymptotic validity of the procedure is established in a general quantile regression setting in which the marginal quantile regression models can be misspecified. Even though a fixed dimension is assumed to derive the asymptotic results, the test proposed is applicable and computationally feasible for large dimensional predictors. The method is more flexible than existing marginal screening test methods based on mean regression and has the added advantage of being robust against outliers in the response. The approach is illustrated by using an application to a human immunodeficiency virus drug resistance data set.

  8. Establishing seismic design criteria to achieve an acceptable seismic margin

    International Nuclear Information System (INIS)

    Kennedy, R.P.

    1997-01-01

    In order to develop a risk based seismic design criteria the following four issues must be addressed: (1) What target annual probability of seismic induced unacceptable performance is acceptable? (2). What minimum seismic margin is acceptable? (3) Given the decisions made under Issues 1 and 2, at what annual frequency of exceedance should the Safe Shutdown Earthquake ground motion be defined? (4) What seismic design criteria should be established to reasonably achieve the seismic margin defined under Issue 2? The first issue is purely a policy decision and is not addressed in this paper. Each of the other three issues are addressed. Issues 2 and 3 are integrally tied together so that a very large number of possible combinations of responses to these two issues can be used to achieve the target goal defined under Issue 1. Section 2 lays out a combined approach to these two issues and presents three potentially attractive combined resolutions of these two issues which reasonably achieves the target goal. The remainder of the paper discusses an approach which can be used to develop seismic design criteria aimed at achieving the desired seismic margin defined in resolution of Issue 2. Suggestions for revising existing seismic design criteria to more consistently achieve the desired seismic margin are presented

  9. Margin Requirements and Equity Option Returns

    DEFF Research Database (Denmark)

    Hitzemann, Steffen; Hofmann, Michael; Uhrig-Homburg, Marliese

    In equity option markets, traders face margin requirements both for the options themselves and for hedging-related positions in the underlying stock market. We show that these requirements carry a significant margin premium in the cross-section of equity option returns. The sign of the margin premium depends on demand pressure: If end-users are on the long side of the market, option returns decrease with margins, while they increase otherwise. Our results are statistically and economically significant and robust to different margin specifications and various control variables. We explain our findings by a model of funding-constrained derivatives dealers that require compensation for satisfying end-users’ option demand.

  10. Margin Requirements and Equity Option Returns

    DEFF Research Database (Denmark)

    Hitzemann, Steffen; Hofmann, Michael; Uhrig-Homburg, Marliese

    In equity option markets, traders face margin requirements both for the options themselves and for hedging-related positions in the underlying stock market. We show that these requirements carry a significant "margin premium" in the cross-section of equity option returns. The sign of the margin premium depends on demand pressure: If end-users are on the long side of the market, option returns decrease with margins, while they increase otherwise. Our results are statistically and economically significant and robust to different margin specifications and various control variables. We explain our findings by a model of funding-constrained derivatives dealers that require compensation for satisfying end-users’ option demand.

  11. MARGINS: Toward a novel science plan

    Science.gov (United States)

    Mutter, John C.

    A science plan to study continental margins has been in the works for the past 3 years, with almost 200 Earth scientists from a wide variety of disciplines gathering at meetings and workshops. Most geological hazards and resources are found at continental margins, yet our understanding of the processes that shape the margins is meager. In formulating this MARGINS research initiative, fundamental issues concerning our understanding of basic Earth-forming processes have arisen. It is clear that a business-as-usual approach will not solve the class of problems defined by the MARGINS program; the solutions demand approaches different from those used in the past. In many cases, a different class of experiment will be required, one that is well beyond the capability of individual principal investigators to undertake on their own. In most cases, broadly based interdisciplinary studies will be needed.

  12. Buccal mucosa carcinoma: surgical margin less than 3 mm, not 5 mm, predicts locoregional recurrence

    Directory of Open Access Journals (Sweden)

    Chiou Wen-Yen

    2010-09-01

    Background: Most treatment failures of buccal mucosal cancer after surgery are locoregional recurrences. We tried to determine how close a surgical margin can be before it is unsafe and requires further adjuvant treatment. Methods: Between August 2000 and June 2008, a total of 110 patients with buccal mucosa carcinoma (25 with stage I, 31 with stage II, 11 with stage III, and 43 with stage IV, classified according to the American Joint Committee on Cancer 6th edition) were treated with surgery alone (n = 32), surgery plus postoperative radiotherapy (n = 38), or surgery plus adjuvant concurrent chemoradiotherapy (n = 40). Main outcome measures: The primary endpoint was locoregional disease control. Results: The median follow-up time at analysis was 25 months (range, 4-104 months). The 3-year locoregional control rates were significantly different when a 3-mm surgical margin (≤3 versus >3 mm, 71% versus 95%, p = 0.04) but not a 5-mm margin (75% versus 92%, p = 0.22) was used as the cut-off level. We also found a quantitative correlation between surgical margin and locoregional failure (hazard ratio, 2.16; 95% confidence interval, 1.14-4.11; p = 0.019). Multivariate analysis identified pN classification and surgical margin as independent factors affecting disease-free survival and locoregional control. Conclusions: A narrow surgical margin (≤3 mm), but not 5 mm, is associated with a high risk of locoregional recurrence of buccal mucosa carcinoma. More aggressive treatment after surgery is suggested.

  13. Buccal mucosa carcinoma: surgical margin less than 3 mm, not 5 mm, predicts locoregional recurrence

    International Nuclear Information System (INIS)

    Chiou, Wen-Yen; Hung, Shih-Kai; Lin, Hon-Yi; Hsu, Feng-Chun; Lee, Moon-Sing; Ho, Hsu-Chueh; Su, Yu-Chieh; Lee, Ching-Chih; Hsieh, Chen-Hsi; Wang, Yao-Ching

    2010-01-01

    Most treatment failures of buccal mucosal cancer after surgery are locoregional recurrences. We tried to determine how close a surgical margin can be before it is unsafe and requires further adjuvant treatment. Between August 2000 and June 2008, a total of 110 patients with buccal mucosa carcinoma (25 with stage I, 31 with stage II, 11 with stage III, and 43 with stage IV, classified according to the American Joint Committee on Cancer 6th edition) were treated with surgery alone (n = 32), surgery plus postoperative radiotherapy (n = 38) or surgery plus adjuvant concurrent chemoradiotherapy (n = 40). Main outcome measures: The primary endpoint was locoregional disease control. The median follow-up time at analysis was 25 months (range, 4-104 months). The 3-year locoregional control rates were significantly different when a 3-mm surgical margin (≤3 versus >3 mm, 71% versus 95%, p = 0.04) but not a 5-mm margin (75% versus 92%, p = 0.22) was used as the cut-off level. We also found a quantitative correlation between surgical margin and locoregional failure (hazard ratio, 2.16; 95% confidence interval, 1.14-4.11; p = 0.019). Multivariate analysis identified pN classification and surgical margin as independent factors affecting disease-free survival and locoregional control. A narrow surgical margin (≤3 mm), but not 5 mm, is associated with a high risk of locoregional recurrence of buccal mucosa carcinoma. More aggressive treatment after surgery is suggested.

  14. Marginal Maximum Likelihood Estimation of Item Response Models in R

    Directory of Open Access Journals (Sweden)

    Matthew S. Johnson

    2007-02-01

    Item response theory (IRT) models are a class of statistical models used by researchers to describe the response behaviors of individuals to a set of categorically scored items. The most common IRT models can be classified as generalized linear fixed- and/or mixed-effect models. Although IRT models appear most often in the psychological testing literature, researchers in other fields have successfully utilized IRT-like models in a wide variety of applications. This paper discusses the three major methods of estimation in IRT and develops R functions utilizing the built-in capabilities of the R environment to find the marginal maximum likelihood estimates of the generalized partial credit model. The currently available R package ltm is also discussed.
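
The marginal likelihood at the heart of MML estimation integrates the latent ability out numerically. A sketch for a single binary (2PL) item using Gauss-Hermite quadrature; this is a Python illustration of the idea only, not the paper's R code, and the generalized partial credit model extends the same scheme to polytomous items:

```python
import numpy as np

def marginal_loglik(responses, a, b, n_quad=41):
    """Marginal log-likelihood of binary responses to a single 2PL item,
    with ability theta ~ N(0, 1) integrated out by probabilists'
    Gauss-Hermite quadrature (weights renormalised to sum to 1)."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_quad)
    weights = weights / weights.sum()            # N(0,1) quadrature weights
    p = 1.0 / (1.0 + np.exp(-a * (nodes - b)))   # P(correct | theta) per node
    ll = 0.0
    for x in responses:
        ll += np.log(np.sum(weights * (p if x == 1 else 1.0 - p)))
    return ll
```

For a = 1 and b = 0 the marginal probability of a correct response is exactly 0.5 by symmetry, so a single correct response contributes log(0.5) to the marginal log-likelihood.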

  15. Supervised learning with decision margins in pools of spiking neurons.

    Science.gov (United States)

    Le Mouel, Charlotte; Harris, Kenneth D; Yger, Pierre

    2014-10-01

    Learning to categorise sensory inputs by generalising from a few examples whose category is precisely known is a crucial step for the brain to produce appropriate behavioural responses. At the neuronal level, this may be performed by adaptation of synaptic weights under the influence of a training signal, in order to group spiking patterns impinging on the neuron. Here we describe a framework that allows spiking neurons to perform such "supervised learning", using principles similar to the Support Vector Machine, a well-established and robust classifier. Using a hinge-loss error function, we show that requesting a margin similar to that of the SVM improves performance on linearly non-separable problems. Moreover, we show that using pools of neurons to discriminate categories can also increase the performance by sharing the load among neurons.
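
The hinge-loss training signal described above can be sketched as a plain stochastic-gradient rule for a linear unit; this is a deliberate simplification of the spiking-neuron setting, with synthetic data, where only samples violating the requested margin trigger a weight update:

```python
import numpy as np

def train_hinge(X, y, margin=1.0, lr=0.01, epochs=200):
    """SGD on the hinge loss max(0, margin - y * <w, x>): a sample updates
    the weights only while it violates the requested margin."""
    rng = np.random.default_rng(1)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            if y[i] * (X[i] @ w) < margin:
                w += lr * y[i] * X[i]
    return w

# Synthetic linearly separable data: the sign of the first feature decides.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] > 0, 1, -1)
w = train_hinge(X, y)
accuracy = np.mean(np.sign(X @ w) == y)
```

Raising `margin` forces updates even on correctly classified points near the boundary, which is the SVM-like behaviour the paper argues improves generalisation.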

  16. Asymptotic performance of regularized quadratic discriminant analysis based classifiers

    KAUST Repository

    Elkhalil, Khalil

    2017-12-13

    This paper carries out a large dimensional analysis of the standard regularized quadratic discriminant analysis (QDA) classifier designed on the assumption that data arise from a Gaussian mixture model. The analysis relies on fundamental results from random matrix theory (RMT) when both the number of features and the cardinality of the training data within each class grow large at the same pace. Under some mild assumptions, we show that the asymptotic classification error converges to a deterministic quantity that depends only on the covariances and means associated with each class as well as the problem dimensions. Such a result permits a better understanding of the performance of regularized QDA and can be used to determine the optimal regularization parameter that minimizes the misclassification error probability. Despite being valid only for Gaussian data, our theoretical findings are shown to yield a high accuracy in predicting the performances achieved with real data sets drawn from popular real data bases, thereby making an interesting connection between theory and practice.
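
A minimal numpy sketch of the regularized QDA classifier analysed above, under the paper's Gaussian mixture assumption; the identity-shrinkage form and the toy data are illustrative assumptions, not the paper's exact estimator:

```python
import numpy as np

def fit_rqda(X, y, gamma=0.1):
    """Regularised QDA: shrink each class covariance toward the identity,
    Sigma_c(gamma) = (1 - gamma) * Sigma_c + gamma * I."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        S = (1 - gamma) * np.cov(Xc, rowvar=False) + gamma * np.eye(X.shape[1])
        params[c] = (mu, np.linalg.inv(S), np.linalg.slogdet(S)[1],
                     np.log(len(Xc) / len(X)))
    return params

def predict_rqda(params, x):
    """Pick the class with the largest Gaussian discriminant score."""
    def score(p):
        mu, Sinv, logdet, logprior = p
        d = x - mu
        return -0.5 * (d @ Sinv @ d) - 0.5 * logdet + logprior
    return max(params, key=lambda c: score(params[c]))

# Synthetic two-class Gaussian mixture, echoing the analysed data model
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)), rng.normal(3.0, 1.0, (100, 2))])
y = np.repeat([0, 1], 100)
params = fit_rqda(X, y, gamma=0.1)
```

The regularization parameter `gamma` is exactly the knob whose misclassification-optimal value the paper's asymptotic analysis characterizes.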

  17. Maximum margin semi-supervised learning with irrelevant data.

    Science.gov (United States)

    Yang, Haiqin; Huang, Kaizhu; King, Irwin; Lyu, Michael R

    2015-10-01

    Semi-supervised learning (SSL) is a typical learning paradigm that trains a model from both labeled and unlabeled data. Traditional SSL models usually assume that the unlabeled data are relevant to the labeled data, i.e., that they follow the same distribution as the targeted labeled data. In this paper, we address a different, yet formidable, scenario in semi-supervised classification, where the unlabeled data may contain data irrelevant to the labeled data. To tackle this problem, we develop a maximum margin model, named the tri-class support vector machine (3C-SVM), to utilize the available training data while seeking a hyperplane that separates the targeted data well. Our 3C-SVM exhibits several characteristics and advantages. First, it does not need any prior knowledge or explicit assumption about data relatedness. On the contrary, it can relieve the effect of irrelevant unlabeled data based on the logistic principle and the maximum entropy principle. That is, 3C-SVM approaches an ideal classifier: it relies heavily on labeled data and is confident on relevant data lying far away from the decision hyperplane, while maximally ignoring the irrelevant data, which are hardly distinguished. Second, theoretical analysis is provided to prove under what conditions the irrelevant data can help to seek the hyperplane. Third, 3C-SVM is a generalized model that unifies several popular maximum margin models, including standard SVMs, semi-supervised SVMs (S(3)VMs), and SVMs learned from the universum (U-SVMs), as its special cases. More importantly, we deploy a concave-convex procedure to solve the proposed 3C-SVM, transforming the original mixed integer program into a semi-definite programming relaxation, and finally into a sequence of quadratic programming subproblems, which yields the same worst-case time complexity as that of S(3)VMs. Finally, we demonstrate the effectiveness and efficiency of our proposed 3C-SVM through systematic experimental comparisons.

  18. Effect of Margin Designs on the Marginal Adaptation of Zirconia Copings.

    Science.gov (United States)

    Habib, Syed Rashid; Al Ajmi, Mohammed Ginan; Al Dhafyan, Mohammed; Jomah, Abdulrehman; Abualsaud, Haytham; Almashali, Mazen

    2017-09-01

    The aim of this in vitro study was to investigate the effect of Shoulder versus Chamfer margin design on the marginal adaptation of zirconia (Zr) copings. Forty extracted molar teeth were mounted in resin and prepared for zirconia crowns with two margin preparation designs (20 Shoulder and 20 Chamfer). The copings were manufactured by Cercon® (DeguDent GmbH, Germany) using the CAD/CAM system for each tooth. They were tried on each tooth, cemented, thermocycled, re-embedded in resin and subsequently cross-sectioned centrally into two equal mesial and distal halves. They were examined under an electron microscope at 200× magnification and the measurements were recorded at 5 predetermined points in micrometers (µm). The overall mean marginal gap for the two groups was 206.98±42.78 µm, with the Shoulder margin design (marginal gap = 199.50±40.72 µm) showing better adaptation than the Chamfer design (marginal gap = 214.46±44.85 µm). However, the independent-samples t-test showed a statistically non-significant difference (p = .113) between the mean marginal gaps of the two designs. The Chamfer margin design thus appeared to offer the same adaptation as the Shoulder margin design.

  19. Marginal and happy? The need for uniqueness predicts the adjustment of marginal immigrants.

    Science.gov (United States)

    Debrosse, Régine; de la Sablonnière, Roxane; Rossignac-Milon, Maya

    2015-12-01

    Marginalization is often presented as the strategy associated with the worst adjustment for immigrants. This study identifies a critical variable that buffers marginal immigrants from the negative effects of marginalization on adjustment: The need for uniqueness. In three studies, we surveyed immigrants recruited on university campuses (n = 119, n = 116) and in the field (n = 61). Among marginal immigrants, a higher need for uniqueness predicted higher self-esteem (Study 1), affect (Study 2), and life satisfaction (Study 3), and marginally higher happiness (Study 2) and self-esteem (Study 3). No relationship between the need for uniqueness and adjustment was found among non-marginal immigrants. The adaptive value of the need for uniqueness for marginal immigrants is discussed. © 2015 The British Psychological Society.

  20. Read margin analysis of crossbar arrays using the cell-variability-aware simulation method

    Science.gov (United States)

    Sun, Wookyung; Choi, Sujin; Shin, Hyungsoon

    2018-02-01

    This paper proposes a new concept of read margin analysis of crossbar arrays using cell-variability-aware simulation. The size of the crossbar array should be considered to predict the read margin characteristic of the crossbar array because the read margin depends on the number of word lines and bit lines. However, excessively long CPU times are required to simulate large arrays using a commercial circuit simulator. A variability-aware MATLAB simulator that considers independent variability sources is developed to analyze the characteristics of the read margin according to the array size. The developed MATLAB simulator provides an effective method for reducing the simulation time while maintaining the accuracy of the read margin estimation in the crossbar array. The simulation is also highly efficient in analyzing the characteristics of the crossbar memory array considering the statistical variations in the cell characteristics.
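    As a rough illustration of why the read margin shrinks with array size, the classic worst-case analytic model (not the paper's variability-aware MATLAB simulator) lumps the unselected cells into three sneak-path resistor banks in series. All component values below are hypothetical.

```python
def sneak_resistance(m, n, r_cell):
    """Worst-case sneak path in an m x n crossbar: three resistor banks
    in series -- (n-1) unselected cells on the selected word line,
    (m-1)*(n-1) cells in the middle, and (m-1) on the selected bit line,
    all pessimistically assumed to sit in the low-resistance state."""
    return r_cell / (n - 1) + r_cell / ((m - 1) * (n - 1)) + r_cell / (m - 1)

def read_voltage(r_sel, r_sneak, r_load, v_read=1.0):
    # Selected cell in parallel with the sneak network, sensed across r_load.
    r_eq = 1.0 / (1.0 / r_sel + 1.0 / r_sneak)
    return v_read * r_load / (r_load + r_eq)

def read_margin(m, n, r_lrs=1e4, r_hrs=1e6, r_load=1e4):
    """Sensed-voltage difference between reading a low-resistance (LRS)
    and a high-resistance (HRS) cell."""
    r_sneak = sneak_resistance(m, n, r_lrs)
    return (read_voltage(r_lrs, r_sneak, r_load)
            - read_voltage(r_hrs, r_sneak, r_load))
```

    Evaluating `read_margin` for growing arrays reproduces the qualitative size dependence the paper studies: the sneak resistance falls as word and bit line counts grow, collapsing the margin.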

  1. Energizing marginal soils: A perennial cropping system for Sida hermaphrodita

    Science.gov (United States)

    Nabel, Moritz; Poorter, Hendrik; Temperton, Vicky; Schrey, Silvia D.; Koller, Robert; Schurr, Ulrich; Jablonowski, Nicolai D.

    2017-04-01

    As a way to avoid land use conflicts, the use of marginal soils for the production of plant biomass can be a sustainable alternative to conventional biomass production (e.g. maize). However, new cropping strategies have to be found that meet the challenge of crop production under marginal soil conditions. We aim for increased soil fertility by the use of the perennial crop Sida hermaphrodita in combination with organic fertilization and legume intercropping to produce substantial biomass yield. We present results of a three-year outdoor mesocosm experiment testing the perennial energy crop Sida hermaphrodita grown on a marginal model substrate (sand) with four kinds of fertilization (Digestate broadcast, Digestate Depot, mineral NPK and unfertilized control) in combination with legume intercropping. After three years, organic fertilization (via biogas digestate) compared to mineral fertilization (NPK), reduced the nitrate concentration in leachate and increased the soil carbon content. Biomass yields of Sida were 25% higher when fertilized organically, compared to mineral fertilizer. In general, digestate broadcast application reduced root growth and the wettability of the sandy substrate. However, when digestate was applied locally as depot to the rhizosphere, root growth increased and the wettability of the sandy substrate was preserved. Depot fertilization increased biomass yield by 10% compared to digestate broadcast fertilization. We intercropped Sida with various legumes (Trifolium repens, Trifolium pratense, Melilotus spp. and Medicago sativa) to enable biological nitrogen fixation and make the cropping system independent from synthetically produced fertilizers. We could show that Medicago sativa grown on marginal substrate fixed large amounts of N, especially when fertilized organically, whereas mineral fertilization suppressed biological nitrogen fixation. 
We conclude that the perennial energy crop Sida in combination with organic fertilization has great

  2. A comparative study of machine learning classifiers for modeling travel mode choice

    NARCIS (Netherlands)

    Hagenauer, J; Helbich, M

    2017-01-01

    The analysis of travel mode choice is an important task in transportation planning and policy making in order to understand and predict travel demands. While advances in machine learning have led to numerous powerful classifiers, their usefulness for modeling travel mode choice remains largely

  3. From marginality to further marginalization: Experiences from the victims of the July 2000 Payatas trashslide in the Philippines

    Directory of Open Access Journals (Sweden)

    JC Gaillard

    2009-04-01

    Full Text Available Victims of disasters are disproportionately drawn from the marginalized segments of society. Disaster victims are marginalized geographically because they live in hazardous places, socially because they are members of minority groups, economically because they are poor, and marginalized politically because their voice is disregarded by those with political power. The victims of the July 2000 Payatas trash slide in the Philippines show all these characteristics. Most of the victims of the disaster were urban migrants who came all the way from their poor provinces to settle on the lower slopes of the largest dumpsite of the country. They scavenged recyclable materials to sell as a way to make a living, but their limited incomes did not allow them to afford safer locations for their homes, farther removed from the slopes of the dumpsite. On the morning of 10 July 2000, 300 of them lost their lives when a large section of the dumpsite collapsed in a massive debris flow which buried their houses. In the aftermath of the disaster, the survivors who used to live on the dumpsite, and who were the poorest victims, were also those who were relocated by the Philippine government. In the present case, the most vulnerable families in the face of the trash slide were eventually those who had to suffer again from life-disrupting relocation while being the least able to recover quickly from the disaster. Daily incomes of relocated families are today much lower than those of families who remained in the vicinity of the dumpsite. For the victims of the July 2000 Payatas tragedy, poverty thus acted as a vicious, worsening circle which ranged from vulnerability to poor recovery, or from marginality to further marginalization.

  4. Complex force network in marginally and deeply jammed solids

    International Nuclear Information System (INIS)

    Hu Mao-Bin; Jiang Rui; Wu Qing-Song

    2013-01-01

    This paper studies the force network properties of marginally and deeply jammed packings of frictionless soft particles from the perspective of complex network theory. We generate zero-temperature granular packings at different pressures by minimizing the inter-particle potential energy. The force networks are constructed with nodes representing particles and links representing normal forces between the particles. Deeply jammed solids show remarkably different behavior from marginally jammed solids in their degree distribution, strength distribution, degree correlation, and clustering coefficient. Bimodal and multi-modal distributions emerge when the system enters the deep jamming region. The results also show that small and large particles can show different correlation behavior in this simple system.

  5. The Role of Field Margins in Agro-biodiversity Management at the Farm Level

    Directory of Open Access Journals (Sweden)

    Giulio Lazzerini

    2007-06-01

    Full Text Available The agroecosystem can be considered as a mosaic large enough to involve fields with annual and perennial crops, pastures, spots of wildwood, semi-natural habitats, and vegetation at the edges of fields. In the agroecosystem these ecological infrastructures have a positive effect on the crops because of the exchange of organisms, materials and energy among communities. The aim of this research is to evaluate the effects of field margins on some biodiversity components (plant species and carabid beetles) of four farms located in Val d’Orcia (Tuscany). We compared three types of field margins: 1. cultivated margin strips; 2. sown grass margin strips; 3. wild margin strips with hedgerow. In a very simplified farming system like the one studied (Val d’Orcia), the presence of field margins (hedges, margin strips and semi-natural habitats associated with the boundary) is very important for its ecological effects: it improves the planned biodiversity and provides habitat, refuge, food and movement corridors for the different species of organisms in the area. Applying multivariate analysis to the experimental data, we noticed a positive effect of the presence of field margins on the trend of both components of biodiversity. This positive effect, which supports the mechanisms of self-regulation of agroecosystems, is very important especially for organic and biodynamic agriculture, where the use of pesticides is not allowed.

  6. Spectrum estimation method based on marginal spectrum

    International Nuclear Information System (INIS)

    Cai Jianhua; Hu Weiwen; Wang Xianchun

    2011-01-01

    The FFT method cannot meet the basic requirements of power spectrum estimation for non-stationary and short signals. A new spectrum estimation method based on the marginal spectrum from the Hilbert-Huang transform (HHT) was proposed. The procedure for obtaining the marginal spectrum in the HHT method was given and the linear property of the marginal spectrum was demonstrated. Compared with the FFT method, the physical meaning and the frequency resolution of the marginal spectrum were further analyzed. Then the Hilbert spectrum estimation algorithm was discussed in detail, and simulation results were given. The theory and simulations show that for short and non-stationary signals, the frequency resolution and estimation precision of the HHT method are better than those of the FFT method. (authors)

  7. Pathology of nodal marginal zone lymphomas.

    Science.gov (United States)

    Pileri, Stefano; Ponzoni, Maurilio

    Nodal marginal zone B-cell lymphomas (NMZLs) are a rare group of lymphoid disorders that are part of the spectrum of marginal zone B-cell lymphomas, which encompasses splenic marginal zone B-cell lymphoma (SMZL) and extranodal marginal zone B-cell lymphoma (EMZL), often of MALT type. Two clinicopathological forms of NMZL are recognized: adult-type and pediatric-type. NMZLs show features overlapping with other types of MZL, but distinctive features as well. In this review, we focus on the salient distinguishing features of NMZL, mostly from morphological, immunophenotypical and molecular perspectives, in view of recent findings and the forthcoming updated 2016 WHO classification of lymphoid malignancies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Policy Implementation, Role Conflict and Marginalization

    African Journals Online (AJOL)

    Prince Acheampong

    governance, their role has been politically, administratively, and financially ... of marginalization of the Traditional Systems in terms of legal, financial and ..... the President as the Chief Executive Officer of the district is another controlling factor.

  9. Mental Depreciation and Marginal Decision Making

    Science.gov (United States)

    Heath; Fennema

    1996-11-01

    We propose that individuals practice "mental depreciation," that is, they implicitly spread the fixed costs of their expenses over time or use. Two studies explore how people spread fixed costs on durable goods. A third study shows that depreciation can lead to two distinct errors in marginal decisions: First, people sometimes invest too much effort to get their money's worth from an expense (e.g., they may use a product a lot to spread the fixed expense across more uses). Second, people sometimes invest too little effort to get their money's worth: When people add a portion of the fixed cost to the current costs, their perceived marginal (i.e., incremental) costs exceed their true marginal costs. In response, they may stop investing because their perceived costs surpass the marginal benefits they are receiving. The latter effect is supported by two field studies that explore real board plan decisions by university students.

  10. Marketing margins and agricultural technology in Mozambique

    DEFF Research Database (Denmark)

    Arndt, Channing; Jensen, Henning Tarp; Robinson, Sherman

    2000-01-01

    Improvements in agricultural productivity and reductions in marketing costs in Mozambique are analysed using a computable general equilibrium (CGE) model. The model incorporates detailed marketing margins and separates household demand for marketed and home-produced goods. Individual simulations of improved agricultural technology and lower marketing margins yield welfare gains across the economy. In addition, a combined scenario reveals significant synergy effects, as gains exceed the sum of gains from the individual scenarios. Relative welfare improvements are higher for poor rural households.

  11. Time Safety Margin: Theory and Practice

    Science.gov (United States)

    2016-09-01

    Air Education and Training Command Handbook 99-107, T-38 Road to Wings, Randolph Air Force Base, Texas, July 2013. 412TW-TIH-16-01, TIME SAFETY MARGIN: THEORY AND PRACTICE, WILLIAM R. GRAY, III, Chief Test Pilot, USAF Test Pilot School, SEPTEMBER 2016. This report (Time Safety Margin: Theory and Practice) was submitted by the Commander, 412th Test Wing, Edwards AFB, California 93524-6843.

  12. In silico particle margination in blood flow

    OpenAIRE

    Müller, Kathrin

    2015-01-01

    A profound knowledge of margination, the migration of blood components to the vessel wall in blood flow, is required in order to understand the genesis of various diseases, as e.g., cardiovascular diseases or bleeding disorders. Margination of particles is a pre-condition for potential adhesion. Adhesion to the vessel wall is required for platelets, the protein von Willebrand factor (VWF), but also for drug and imaging agent carriers in order to perform their particular tasks. In the haemosta...

  13. The role of the margins in ice stream dynamics

    Science.gov (United States)

    Echelmeyer, Keith; Harrison, William

    1993-07-01

    At first glance, it would appear that the bed of an active ice stream plays a much more important role in the overall force balance than do the margins, especially because the ratio of half-width to depth for a typical ice stream is large (15:1 to 50:1). On the other hand, recent observations indicate that at least part of the ice stream is underlain by a layer of very weak till (shear strength about 2 kPa), and this weak basal layer would imply that some or all of the resistive drag is transferred to the margins. In order to address this question, a detailed velocity profile was measured near Upstream B Camp, extending from the center of the ice stream, across the chaotic shear margin, and onto the Unicorn, which is part of the slow-moving ice sheet. Comparison of this observed velocity profile with finite-element models of flow shows several interesting features. First, the shear stress at the margin is on the order of 130 kPa, while the mean value along the bed is about 15 kPa. Integration of these stresses along the boundaries indicates that the margins provide 40 to 50 percent, and the bed 50 to 60 percent, of the total resistive drag needed to balance the gravitational driving stress in this region. (The range of values represents calculations for different values of surface slope.) Second, the mean basal stress predicted by the models shows that the entire bed cannot be blanketed by the weak till observed beneath Upstream B; instead there must be a distribution of weak till and 'sticky spots' (e.g., 85 percent till and 15 percent sticky spots of resistive stress equal to 100 kPa). If more of the bed were composed of weak till, the modeled velocity would not match that observed. Third, the ice must exhibit an increasing enhancement factor as the margins are approached (E equals 10 in the chaotic zone), in keeping with laboratory measurements on ice under prolonged shear strain.
Also, there is either a narrow zone of somewhat stiffer ice (E

  14. Benefit–cost analysis of non-marginal climate and energy projects

    International Nuclear Information System (INIS)

    Dietz, Simon; Hepburn, Cameron

    2013-01-01

    Conventional benefit–cost analysis incorporates the normally reasonable assumption that the policy or project under examination is marginal. Among the assumptions this entails is that the policy or project is small, so the underlying growth rate of the economy does not change. However, this assumption may be inappropriate in some important circumstances, including in climate-change and energy policy. One example is global targets for carbon emissions, while another is a large renewable energy project in a small economy, such as a hydropower dam. This paper develops some theory on the evaluation of non-marginal projects, with empirical applications to climate change and energy. We examine the conditions under which evaluation of a non-marginal project using marginal methods may be wrong, and in our empirical examples we show that both qualitative and large quantitative errors are plausible. - Highlights: • This paper develops the theory of the evaluation of non-marginal projects. • It also includes empirical applications to climate change and energy. • We show when evaluation of a non-marginal project using marginal methods is wrong
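    A minimal numeric sketch of why non-marginality matters: under the Ramsey rule r = ρ + ηg, a project large enough to change the growth rate g also changes the discount rate used to evaluate it. All parameter values below are hypothetical illustrations, not figures from the paper.

```python
def ramsey_rate(rho, eta, g):
    # Social discount rate from the Ramsey rule: r = rho + eta * g.
    return rho + eta * g

def npv(benefits, r):
    # Net present value of a stream of benefits at discount rate r.
    return sum(b / (1 + r) ** t for t, b in enumerate(benefits))

# Hypothetical numbers: a project paying 10 units per year for 30 years.
# A marginal appraisal keeps the baseline growth rate g = 2%; a project
# large enough to raise growth to 2.5% changes the discount rate itself,
# and with it its own present value.
flows = [10.0] * 30
r_marginal = ramsey_rate(rho=0.01, eta=2.0, g=0.02)       # 0.05
r_nonmarginal = ramsey_rate(rho=0.01, eta=2.0, g=0.025)   # 0.06
```

    Comparing `npv(flows, r_marginal)` with `npv(flows, r_nonmarginal)` shows the appraisal error: treating the project as marginal overstates its present value whenever the project itself raises the discount rate.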

  15. Professional Commitment and Professional Marginalism in Teachers

    Directory of Open Access Journals (Sweden)

    Kalashnikov A.I.

    2017-11-01

    Full Text Available The article reviews teachers' attitudes towards the teaching profession, which can be expressed both in professional commitment and in professional marginalism. Since the dominance of professional marginalism can destructively affect students as well as the teacher's personality, the issues of the personal position of a marginal professional and of the prevalence of marginalism among teachers deserve attention. It was suggested that marginalism could be revealed in the study of professional commitment. The study involved 81 teachers of Sverdlovsk secondary schools aged 21—60 years with work experience ranging from 1 month to 39 years. The Professional Commitment Questionnaire was used as the study technique. The results showed that negative emotional attitude towards the profession and reluctance to leave the profession were grouped as a separate factor, accounting for 12.5% of the variance, with factor loadings ranging from 0.42 to 0.84. The study showed that professional marginalism in teachers includes dissatisfaction with work, feelings of resentment against the profession and an unwillingness to leave the profession.

  16. NRC Seismic Design Margins Program Plan

    International Nuclear Information System (INIS)

    Cummings, G.E.; Johnson, J.J.; Budnitz, R.J.

    1985-08-01

    Recent studies estimate that seismically induced core melt comes mainly from earthquakes in the peak ground acceleration range from 2 to 4 times the safe shutdown earthquake (SSE) acceleration used in plant design. However, from the licensing perspective of the US Nuclear Regulatory Commission, there is a continuing need for consideration of the inherent quantitative seismic margins because of, among other things, changing perceptions of the seismic hazard. This paper discusses a Seismic Design Margins Program Plan, developed under the auspices of the US NRC, that provides the technical basis for assessing the significance of design margins in terms of overall plant safety. The Plan will also identify potential weaknesses that might have to be addressed, and will recommend technical methods for assessing margins at existing plants. For the purposes of this program, a general definition of seismic design margin is expressed in terms of how much larger than the design-basis earthquake an earthquake must be to compromise plant safety. In this context, margin needs to be determined at the plant, system/function, structure, and component levels. 14 refs., 1 fig.

  17. An automatic classifier of emotions built from entropy of noise.

    Science.gov (United States)

    Ferreira, Jacqueline; Brás, Susana; Silva, Carlos F; Soares, Sandra C

    2017-04-01

    The electrocardiogram (ECG) signal has been widely used to study the physiological substrates of emotion. However, searching for better filtering techniques in order to obtain a signal with better quality and with the maximum relevant information remains an important issue for researchers in this field. Signal processing is largely performed for ECG analysis and interpretation, but this process can be susceptible to error in the delineation phase. In addition, it can lead to the loss of important information that is usually considered as noise and, consequently, discarded from the analysis. The goal of this study was to evaluate if the ECG noise allows for the classification of emotions, while using its entropy as an input in a decision tree classifier. We collected the ECG signal from 25 healthy participants while they were presented with videos eliciting negative (fear and disgust) and neutral emotions. The results indicated that the neutral condition showed a perfect identification (100%), whereas the classification of negative emotions indicated good identification performances (60% of sensitivity and 80% of specificity). These results suggest that the entropy of noise contains relevant information that can be useful to improve the analysis of the physiological correlates of emotion. © 2016 Society for Psychophysiological Research.
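    A minimal sketch of the pipeline's core quantities: a "noise" residual and its Shannon entropy feeding a one-node decision stump. The residual definition (signal minus a moving average) and the threshold are assumptions of this sketch; the study used proper ECG processing and a full decision-tree classifier.

```python
import math

def entropy(samples, nbins=16):
    """Shannon entropy (in bits) of a histogram of the samples."""
    lo, hi = min(samples), max(samples)
    if hi == lo:
        return 0.0
    counts = [0] * nbins
    for s in samples:
        counts[min(int((s - lo) / (hi - lo) * nbins), nbins - 1)] += 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def noise_residual(x, win=5):
    # "Noise" = signal minus a moving-average smooth of itself
    # (an assumption of this sketch, not the study's ECG filtering).
    half = win // 2
    out = []
    for i in range(len(x)):
        seg = x[max(0, i - half):i + half + 1]
        out.append(x[i] - sum(seg) / len(seg))
    return out

def stump_classify(x, threshold):
    """One-node 'decision tree': flag as negative emotion when the
    residual entropy exceeds a (hypothetical) learned threshold."""
    return int(entropy(noise_residual(x)) > threshold)
```

    A constant signal has zero residual entropy, while a widely spread residual approaches the histogram's maximum of log2(nbins) bits; the stump simply thresholds between those regimes.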

  18. Addressing the Challenge of Defining Valid Proteomic Biomarkers and Classifiers

    LENUS (Irish Health Repository)

    Dakna, Mohammed

    2010-12-10

    Abstract Background The purpose of this manuscript is to provide, based on an extensive analysis of a proteomic data set, suggestions for proper statistical analysis for the discovery of sets of clinically relevant biomarkers. As a tractable example, we define the measurable proteomic differences between apparently healthy adult males and females. We chose urine as the body fluid of interest and CE-MS, a thoroughly validated platform technology allowing for routine analysis of a large number of samples. The second urine of the morning was collected from apparently healthy male and female volunteers (aged 21-40) in the course of the routine medical check-up before recruitment at the Hannover Medical School. Results We found that the Wilcoxon test is best suited for the definition of potential biomarkers. Adjustment for multiple testing is necessary. Sample size estimation can be performed based on a small number of observations via resampling from pilot data. Machine learning algorithms appear ideally suited to generate classifiers. Assessment of any results in an independent test set is essential. Conclusions Valid proteomic biomarkers for diagnosis and prognosis can only be defined by applying proper statistical data mining procedures. In particular, a justification of the sample size should be part of the study design.
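    The multiple-testing adjustment the authors flag as necessary can be illustrated with the Benjamini-Hochberg step-up procedure; this is one standard choice for controlling the false discovery rate across many candidate biomarkers, not necessarily the exact method used in the paper.

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Indices of hypotheses rejected under Benjamini-Hochberg FDR
    control: find the largest rank k with p_(k) <= (k/m)*alpha and
    reject the k smallest p-values."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank
    return sorted(order[:k])
```

    For example, with p-values [0.001, 0.01, 0.02, 0.03, 0.2, 0.5] at alpha = 0.05, the first four hypotheses are rejected even though a plain Bonferroni cut-off (0.05/6) would reject only the first.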

  19. Online Feature Selection for Classifying Emphysema in HRCT Images

    Directory of Open Access Journals (Sweden)

    M. Prasad

    2008-06-01

    Full Text Available Feature subset selection, applied as a pre-processing step to machine learning, is valuable in dimensionality reduction, eliminating irrelevant data and improving classifier performance. In the classic formulation of the feature selection problem, it is assumed that all the features are available at the beginning. However, in many real-world problems, there are scenarios where not all features are present initially and must be integrated as they become available. In such scenarios, online feature selection provides an efficient way to sort through a large space of features. It is in this context that we introduce online feature selection for the classification of emphysema, a smoking-related disease that appears as low-attenuation regions in High-Resolution Computed Tomography (HRCT) images. The technique was successfully evaluated on 61 HRCT scans and compared with different online feature selection approaches, including hill climbing, best-first search, grafting, and correlation-based feature selection. The results were also compared against 'density mask', a standard approach used for emphysema detection in medical image analysis.
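    A toy sketch of the online setting: features arrive one at a time and a feature is kept only if it improves the current subset's accuracy under a simple wrapper classifier. The nearest-centroid classifier and the greedy keep/discard rule are assumptions of this sketch, not the classifiers or criteria evaluated in the paper.

```python
def centroid_accuracy(X, y, feats):
    """Training accuracy of a nearest-centroid classifier restricted
    to the feature subset `feats`."""
    labels = sorted(set(y))
    cent = {c: [sum(X[i][f] for i in range(len(X)) if y[i] == c)
                / sum(1 for t in y if t == c) for f in feats]
            for c in labels}
    correct = 0
    for xi, yi in zip(X, y):
        pred = min(labels, key=lambda c: sum((xi[f] - m) ** 2
                   for f, m in zip(feats, cent[c])))
        correct += (pred == yi)
    return correct / len(X)

def online_select(X, y, n_features):
    """Features arrive one at a time; keep a new feature only if it
    improves accuracy over the current subset (a greedy sketch of
    online selection, not the paper's exact procedure)."""
    kept, best = [], 0.0
    for f in range(n_features):
        acc = centroid_accuracy(X, y, kept + [f])
        if acc > best:
            kept.append(f)
            best = acc
    return kept
```

    On data where only the first feature separates the classes, the second (noise) feature is evaluated when it "arrives" and discarded because it brings no accuracy gain.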

  20. Salient Region Detection via Feature Combination and Discriminative Classifier

    Directory of Open Access Journals (Sweden)

    Deming Kong

    2015-01-01

    Full Text Available We introduce a novel approach to detect salient regions of an image via feature combination and a discriminative classifier. Our method, which is based on hierarchical image abstraction, uses the logistic regression approach to map the regional feature vector to a saliency score. Four saliency cues are used in our approach, including color contrast in a global context, center-boundary priors, spatially compact color distribution, and objectness, which serves as an atomic feature of each segmented region in the image. By mapping the four-dimensional regional feature to a fifteen-dimensional feature vector, we can linearly separate the salient regions from the cluttered background by finding an optimal linear combination of feature coefficients in the fifteen-dimensional feature space, and we finally fuse the saliency maps across multiple levels. Furthermore, we introduce the weighted salient image center into our saliency analysis task. Extensive experiments on two large benchmark datasets show that the proposed approach achieves the best performance over several state-of-the-art approaches.
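    The regional-feature-to-saliency mapping can be sketched with plain logistic regression. The four-dimensional toy features below stand in for the paper's cues (global contrast, center-boundary prior, compactness, objectness); the fifteen-dimensional expansion and multi-level fusion are omitted, so this is only the core scoring step.

```python
import math

def train_logistic(X, y, lr=0.5, epochs=500):
    """Fit w, b so that sigmoid(w.x + b) approximates the saliency
    label of each region (batch gradient descent on log loss)."""
    d = len(X[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * d
        gb = 0.0
        for xi, yi in zip(X, y):
            z = sum(wi * v for wi, v in zip(w, xi)) + b
            err = 1.0 / (1.0 + math.exp(-z)) - yi
            for i in range(d):
                gw[i] += err * xi[i]
            gb += err
        for i in range(d):
            w[i] -= lr * gw[i] / len(X)
        b -= lr * gb / len(X)
    return w, b

def saliency(x, w, b):
    """Map a regional feature vector to a saliency score in [0, 1]."""
    return 1.0 / (1.0 + math.exp(-(sum(wi * v for wi, v in zip(w, x)) + b)))
```

    Regions whose cue values resemble the salient training examples score above 0.5; background-like regions score below it.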

  1. Marginalization among the marginalized: gay men's anti-effeminacy attitudes.

    Science.gov (United States)

    Taywaditep, K J

    2001-01-01

    Contemporary research has shown that a significant portion of gay men have traits, interests, occupations, and behaviors that are consistent with the stereotype of gay men as effeminate, androgynous, or unmasculine. A great number of gay men exhibit gender nonconformity during childhood; most, however, "defeminize" during adolescence, possibly in response to stigmatization and society's gender-role prescription. Only a relatively small percentage of gay men continue to be gender-nonconforming in their adulthood, often at a price, as they also tend to have lower psychological well-being. Although gay culture historically appreciated camp and drag, which subvert the gender-based power hierarchy and celebrate gender nonconformity, anti-effeminacy prejudice is widespread among gay men. Ironically, gender-nonconforming gay men may suffer from discrimination not only from society at large, but from other gay men, who are most likely to have experienced stigmatization and may have been effeminate earlier in their lives. Drawing from anecdotes and findings from various sources, this article suggests that beyond many gay men's erotic preference for masculinity lies contempt and hostility toward effeminacy and effeminate men on sociopolitical and personal levels. Two correlates of gay men's anti-effeminacy attitudes are proposed: (a) hegemonic masculinity ideology, or the degree to which one subscribes to the value system in which masculinity is an asset, and men and masculinity are considered superior to women and femininity; and (b) masculinity consciousness, or the saliency of masculinity in one's self-monitoring, public self-consciousness, and self-concept. These two variables are hypothesized to interact with gay men's self-perceived masculinity-femininity and their history of defeminization in predicting attitudes toward effeminacy. Research is underway to measure levels of anti-effeminacy attitudes and explore hypothesized correlates.

  2. 18 CFR 3a.12 - Authority to classify official information.

    Science.gov (United States)

    2010-04-01

    ... efficient administration. (b) The authority to classify information or material originally as Top Secret is... classify information or material originally as Secret is exercised only by: (1) Officials who have Top... information or material originally as Confidential is exercised by officials who have Top Secret or Secret...

  3. Fisher classifier and its probability of error estimation

    Science.gov (United States)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
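    For the two-class, two-dimensional case, the Fisher direction, the midpoint threshold on the projection, and a brute-force leave-one-out error estimate look as follows. The paper's contribution is precisely the computationally efficient closed-form expressions that avoid the explicit refitting loop sketched here.

```python
def fisher_direction(X1, X2):
    """Fisher discriminant for 2-D data: w = Sw^{-1} (m1 - m2), with
    the 2x2 within-class scatter inverted in closed form; the threshold
    is the midpoint of the two projected class means."""
    def mean(X):
        return [sum(col) / len(X) for col in zip(*X)]
    m1, m2 = mean(X1), mean(X2)
    S = [[0.0, 0.0], [0.0, 0.0]]
    for X, m in ((X1, m1), (X2, m2)):
        for x in X:
            d0, d1 = x[0] - m[0], x[1] - m[1]
            S[0][0] += d0 * d0
            S[0][1] += d0 * d1
            S[1][0] += d1 * d0
            S[1][1] += d1 * d1
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    diff = [m1[0] - m2[0], m1[1] - m2[1]]
    w = [(S[1][1] * diff[0] - S[0][1] * diff[1]) / det,
         (-S[1][0] * diff[0] + S[0][0] * diff[1]) / det]
    thr = (sum(wi * mi for wi, mi in zip(w, m1))
           + sum(wi * mi for wi, mi in zip(w, m2))) / 2
    return w, thr

def loo_error(X1, X2):
    """Leave-one-out error estimate by explicit refitting -- the
    brute-force baseline the paper's closed-form expressions replace."""
    errors = 0
    n = len(X1) + len(X2)
    for i in range(len(X1)):
        w, t = fisher_direction(X1[:i] + X1[i + 1:], X2)
        errors += (sum(wi * v for wi, v in zip(w, X1[i])) <= t)
    for i in range(len(X2)):
        w, t = fisher_direction(X1, X2[:i] + X2[i + 1:])
        errors += (sum(wi * v for wi, v in zip(w, X2[i])) > t)
    return errors / n
```

    Since w points from the class-2 mean toward the class-1 mean (Sw^{-1} is positive definite), class 1 always projects above the midpoint threshold on the training means.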

  4. Performance of classification confidence measures in dynamic classifier systems

    Czech Academy of Sciences Publication Activity Database

    Štefka, D.; Holeňa, Martin

    2013-01-01

    Roč. 23, č. 4 (2013), s. 299-319 ISSN 1210-0552 R&D Projects: GA ČR GA13-17187S Institutional support: RVO:67985807 Keywords : classifier combining * dynamic classifier systems * classification confidence Subject RIV: IN - Informatics, Computer Science Impact factor: 0.412, year: 2013

  5. 32 CFR 2400.30 - Reproduction of classified information.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Reproduction of classified information. 2400.30... SECURITY PROGRAM Safeguarding § 2400.30 Reproduction of classified information. Documents or portions of... the originator or higher authority. Any stated prohibition against reproduction shall be strictly...

  6. Classifying spaces with virtually cyclic stabilizers for linear groups

    DEFF Research Database (Denmark)

    Degrijse, Dieter Dries; Köhl, Ralf; Petrosyan, Nansen

    2015-01-01

    We show that every discrete subgroup of GL(n, ℝ) admits a finite-dimensional classifying space with virtually cyclic stabilizers. Applying our methods to SL(3, ℤ), we obtain a four-dimensional classifying space with virtually cyclic stabilizers and a decomposition of the algebraic K-theory of its...

  7. Dynamic integration of classifiers in the space of principal components

    NARCIS (Netherlands)

    Tsymbal, A.; Pechenizkiy, M.; Puuronen, S.; Patterson, D.W.; Kalinichenko, L.A.; Manthey, R.; Thalheim, B.; Wloka, U.

    2003-01-01

    Recent research has shown the integration of multiple classifiers to be one of the most important directions in machine learning and data mining. It was shown that, for an ensemble to be successful, it should consist of accurate and diverse base classifiers. However, it is also important that the

  8. Feature Import Vector Machine: A General Classifier with Flexible Feature Selection.

    Science.gov (United States)

    Ghosh, Samiran; Wang, Yazhen

    2015-02-01

    The support vector machine (SVM) and other reproducing kernel Hilbert space (RKHS) based classifier systems have drawn much attention recently due to their robustness and generalization capability. The general theme is to construct classifiers based on the training data in a high-dimensional space by using all available dimensions. The SVM achieves huge data compression by selecting only the few observations that lie close to the boundary of the classifier function. However, when the number of observations is not very large (small n) but the number of dimensions/features is large (large p), it is not necessarily the case that all available features are of equal importance in the classification context. Selecting a useful fraction of the available features may result in huge data compression. In this paper we propose an algorithmic approach by means of which such an optimal set of features can be selected. In short, we reverse the traditional sequential observation selection strategy of the SVM into a sequential feature selection strategy. To achieve this, we modify the solution proposed by Zhu and Hastie (2005) in the context of the import vector machine (IVM) to select an optimal sub-dimensional model and build the final classifier with sufficient accuracy.
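The reversal described above, from sequential observation selection to sequential feature selection, can be illustrated with a greedy forward search. This sketch scores candidate features by the leave-one-out accuracy of a nearest-centroid classifier rather than the paper's IVM-based criterion, and all names are illustrative:

```python
import numpy as np

def centroid_accuracy(X, y):
    """Leave-one-out accuracy of a nearest-class-centroid classifier
    (labels assumed to be 0/1)."""
    n = len(y)
    correct = 0
    for i in range(n):
        mask = np.arange(n) != i
        c0 = X[mask & (y == 0)].mean(axis=0)
        c1 = X[mask & (y == 1)].mean(axis=0)
        pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
        correct += pred == y[i]
    return correct / n

def forward_select(X, y, max_features=None):
    """Greedily add the feature that most improves accuracy; stop when
    no candidate feature improves on the current subset."""
    selected, remaining = [], list(range(X.shape[1]))
    best_acc = 0.0
    while remaining and (max_features is None or len(selected) < max_features):
        scores = [(centroid_accuracy(X[:, selected + [j]], y), j) for j in remaining]
        acc, j = max(scores)
        if acc <= best_acc:
            break
        best_acc = acc
        selected.append(j)
        remaining.remove(j)
    return selected, best_acc
```

On data where only one feature carries class information, the search picks that feature first and stops once noise features stop helping.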

  9. An ensemble of dissimilarity based classifiers for Mackerel gender determination

    International Nuclear Information System (INIS)

    Blanco, A; Rodriguez, R; Martinez-Maranon, I

    2014-01-01

    Mackerel is an undervalued fish captured by European fishing vessels. One way to add value to this species is to classify specimens according to sex. Colour measurements were performed on gonads extracted from Mackerel females and males (fresh and defrosted) to capture the differences between the sexes. Several linear and non-linear classifiers, such as Support Vector Machines (SVM), k-Nearest Neighbors (k-NN) or Diagonal Linear Discriminant Analysis (DLDA), can be applied to this problem. However, they are usually based on Euclidean distances that fail to reflect the sample proximities accurately. Classifiers based on non-Euclidean dissimilarities misclassify different sets of patterns. We combine different kinds of dissimilarity-based classifiers. Diversity is induced by considering a set of complementary dissimilarities for each model. The experimental results suggest that our algorithm helps to improve classifiers based on a single dissimilarity.
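The combination scheme can be sketched as base 1-NN classifiers built on complementary dissimilarities and merged by majority vote. The particular dissimilarities chosen here (Euclidean, Manhattan, cosine) are illustrative assumptions, not the paper's set:

```python
import numpy as np

def dissims(x, B, kind):
    """Dissimilarity from a query point x to every row of B."""
    d = B - x
    if kind == "euclidean":
        return np.sqrt((d ** 2).sum(axis=1))
    if kind == "manhattan":
        return np.abs(d).sum(axis=1)
    if kind == "cosine":
        return 1 - (B @ x) / (np.linalg.norm(B, axis=1) * np.linalg.norm(x) + 1e-12)
    raise ValueError(kind)

def ensemble_predict(Xtr, ytr, x, kinds=("euclidean", "manhattan", "cosine")):
    """Each base classifier is a 1-NN rule under a different dissimilarity;
    the ensemble answer is the majority vote over the base classifiers."""
    votes = [ytr[np.argmin(dissims(x, Xtr, k))] for k in kinds]
    return max(set(votes), key=votes.count)
```

Because the base rules disagree on different patterns, the vote can correct a base classifier that is misled by its own notion of proximity.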

  11. Just-in-time classifiers for recurrent concepts.

    Science.gov (United States)

    Alippi, Cesare; Boracchi, Giacomo; Roveri, Manuel

    2013-04-01

    Just-in-time (JIT) classifiers operate in evolving environments by classifying instances and reacting to concept drift. In stationary conditions, a JIT classifier improves its accuracy over time by exploiting additional supervised information coming from the field. In nonstationary conditions, however, the classifier reacts as soon as concept drift is detected; the current classification setup is discarded and a suitable one is activated to keep the accuracy high. We present a novel generation of JIT classifiers able to deal with recurrent concept drift by means of a practical formalization of the concept representation and the definition of a set of operators working on such representations. The concept-drift detection activity, which is crucial in promptly reacting to changes exactly when needed, is advanced by considering change-detection tests that monitor both the input and the class distributions.
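The change-detection step can be illustrated with a deliberately simple monitor that flags drift when the recent error rate exceeds the long-run rate by a fixed margin. This is an illustrative stand-in, not the paper's change-detection test:

```python
from collections import deque

class DriftMonitor:
    """Flag concept drift when the error rate over a recent window exceeds
    the long-run error rate by more than a fixed threshold."""

    def __init__(self, window=50, threshold=0.2):
        self.recent = deque(maxlen=window)
        self.total_errors = 0
        self.total = 0
        self.threshold = threshold

    def update(self, error):
        """error: 0/1 outcome for one classified instance.
        Returns True when drift is detected."""
        self.recent.append(error)
        self.total_errors += error
        self.total += 1
        if len(self.recent) < self.recent.maxlen:
            return False          # not enough evidence yet
        long_run = self.total_errors / self.total
        recent = sum(self.recent) / len(self.recent)
        return recent - long_run > self.threshold
```

A detected drift would trigger the JIT reaction described above: discard the current classification setup and activate a suitable one.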

  12. Margins for treatment planning of proton therapy

    International Nuclear Information System (INIS)

    Thomas, Simon J

    2006-01-01

    For protons and other charged particles, the effect of set-up errors on the position of isodoses is considerably less in the direction of the incident beam than it is laterally. Therefore, the margins required between the clinical target volume (CTV) and planning target volume (PTV) can be less in the direction of the incident beam than laterally. Margins have been calculated for a typical head plan and a typical prostate plan, for a single field, a parallel opposed and a four-field arrangement of protons, and compared with margins calculated for photons, assuming identical geometrical uncertainties for each modality. In the head plan, where internal motion was assumed negligible, the CTV-PTV margin reduced from approximately 10 mm to 3 mm in the axial direction for the single field and parallel opposed plans. For a prostate plan, where internal motion cannot be ignored, the corresponding reduction in margin was from 11 mm to 7 mm. The planning organ at risk (PRV) margin in the axial direction reduced from 6 mm to 2 mm for the head plan, and from 7 mm to 4 mm for the prostate plan. No reduction was seen on the other axes, or for any axis of the four-field plans. Owing to the shape of proton dose distributions, there are many clinical cases in which good dose distributions can be obtained with one or two fields. When this is done, it is possible to use smaller PTV and PRV margins. This has the potential to convert untreatable cases, in which the PTV and PRV overlap, into cases with a gap between PTV and PRV of adequate size for treatment planning

  13. Geomorphology and Neogene tectonic evolution of the Palomares continental margin (Western Mediterranean)

    Science.gov (United States)

    Gómez de la Peña, Laura; Gràcia, Eulàlia; Muñoz, Araceli; Acosta, Juan; Gómez-Ballesteros, María; R. Ranero, César; Uchupi, Elazar

    2016-10-01

    The Palomares continental margin is located in the southeastern part of Spain. The margin's main structure was formed during Miocene times, and it is currently part of the wide deformation zone characterizing the region between the Iberian and African plates, where no well-defined plate boundary occurs. The convergence between these two plates is here accommodated by several structures, including the left-lateral strike-slip Palomares Fault. The region is characterized by sparse, low- to moderate-magnitude (Mw) shallow instrumental earthquakes, although large historical events have also occurred. To understand the recent tectonic history of the margin we analyze new high-resolution multibeam bathymetry data and three re-processed multichannel seismic reflection profiles crossing the main structures. The analysis of seafloor morphology and the associated subsurface structure provides new insights into the active tectonic features of the area. In contrast to other segments of the southeastern Iberian margin, the Palomares margin contains numerous large and comparatively closely spaced canyons with heads that reach near the coast. The margin relief is also characterized by the presence of three prominent igneous submarine ridges that include the Aguilas, Abubacer and Maimonides highs. Erosive processes, evidenced by a number of scars, slope failures, gullies and canyon incisions, shape the present-day relief of the Palomares margin. Seismic images reveal the deep structure, distinguishing between Miocene structures related to the formation of the margin and currently active features, some of which may reactivate inherited structures. The structure of the margin started with an extensional phase accompanied by volcanic accretion during the Serravallian, followed by a compressional pulse that started during the latest Tortonian. Nowadays, tectonic activity offshore is subdued and limited to a few minor faults, in comparison with the activity recorded onshore. The deep Algero

  14. Extraction of data from margin calculations in prostate radiotherapy from a commercial record and verify system

    International Nuclear Information System (INIS)

    Fox, C.; Kron, T.; Fisher, R.; Tai, K.H.; Thompson, A.; Owen, R.

    2008-01-01

    Full text: Radiation therapy is a widely prescribed and effective modality for the treatment of prostate cancer.[1-3] Radiation therapy relies on precise targeting of the treatment site to deliver the required dose to the tumour while sparing nearby critical organs. To achieve this, it is necessary to allow for the effects of organ and patient motion, both during and between treatment fractions. In the treatment planning process, a margin is added to the clinical target volume (CTV) to create the planning target volume (PTV) to allow for targeting uncertainties, which are dominated by these movements.[4,5] Deciding the appropriate margin size is important, since an excessively large margin will result in increased damage to adjacent normal tissues while an undersized margin will leave parts of the target underdosed. With the marked improvement in the technology available in new treatment machines, remote online setup correction using high-quality kilovoltage images has become straightforward and widely available. Used together with implanted radio-opaque markers, remote online setup correction allows direct targeting of the prostate organ and a significant reduction in the effects of interfraction motion.[6-11] The introduction of this technology into a therapy department makes a reduction of the CTV-to-PTV margin size possible. There are many published works dealing with margin size calculation for prostate treatment planning. The best known and most widely cited is that of van Herk, which modelled the prostate using simple geometry to calculate a minimum dose coverage probability.[13] The outcome of this modelling was a simple and easily understood formula in which just the patient group's random and systematic setup errors are used to calculate margin size. To apply such margin recipes, the patient group's random and systematic error performance must be well known, which requires the collection of a substantial quantity of data. The aim of the project described here was to collect
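The van Herk recipe referred to above is widely quoted as M = 2.5Σ + 0.7σ, where Σ and σ are the standard deviations of the patient group's systematic and random setup errors. A one-line sketch, treating the recipe as given and using illustrative numbers:

```python
def van_herk_margin(sigma_systematic_mm, sigma_random_mm):
    """Widely quoted CTV-to-PTV margin recipe M = 2.5*Sigma + 0.7*sigma:
    Sigma = SD of systematic (preparation) errors across the patient group,
    sigma = SD of random (execution) errors, both in mm."""
    return 2.5 * sigma_systematic_mm + 0.7 * sigma_random_mm

# Illustrative example: Sigma = 3 mm systematic, sigma = 2 mm random
# gives a margin of 2.5*3 + 0.7*2 = 8.9 mm.
```

Note the recipe's asymmetric weighting: systematic errors, which shift every fraction the same way, cost far more margin per millimetre than random errors, which partially average out over a fractionated course.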

  15. Accountable Accounting: Carbon-Based Management on Marginal Lands

    Directory of Open Access Journals (Sweden)

    Tara L. DiRocco

    2014-04-01

    Full Text Available Substantial discussion exists concerning the best land use options for mitigating greenhouse gas (GHG emissions on marginal land. Emissions-mitigating land use options include displacement of fossil fuels via biofuel production and afforestation. Comparing C recovery dynamics under these different options is crucial to assessing the efficacy of offset programs. In this paper, we focus on forest recovery on marginal land, and show that there is substantial inaccuracy and discrepancy in the literature concerning carbon accumulation. We find that uncertainty in carbon accumulation occurs in estimations of carbon stocks and models of carbon dynamics over time. We suggest that analyses to date have been largely unsuccessful at determining reliable trends in site recovery due to broad land use categories, a failure to consider the effect of current and post-restoration management, and problems with meta-analysis. Understanding of C recovery could be greatly improved with increased data collection on pre-restoration site quality, prior land use history, and management practices as well as increased methodological standardization. Finally, given the current and likely future uncertainty in C dynamics, we recommend carbon mitigation potential should not be the only environmental service driving land use decisions on marginal lands.

  16. On the dynamics of turbulent transport near marginal stability

    International Nuclear Information System (INIS)

    Diamond, P.H.; Hahm, T.S.

    1995-03-01

    A general methodology for describing the dynamics of transport near marginal stability is formulated. Marginal stability is a special case of the more general phenomenon of self-organized criticality. Simple, one-field models of the dynamics of tokamak plasma self-organized criticality have been constructed, and include relevant features such as sheared mean flow and transport bifurcations. In such models, slow-mode (i.e., large-scale, low-frequency transport event) correlation times determine the behavior of transport dynamics near marginal stability. To illustrate this, impulse response scaling exponents (z) and turbulent diffusivities (D) have been calculated for the minimal (Burgers) and sheared-flow models. For the minimal model, z = 1 (indicating ballistic propagation) and D ~ (S0^2)^(1/3), where S0^2 is the noise strength. With an identically structured noise spectrum and a flow with shearing rate exceeding the ambient decorrelation rate for the largest-scale transport events, diffusion is recovered with z = 2 and D ~ (S0^2)^(3/5). This indicates a qualitative change in the dynamics, as well as a reduction in losses. These results are consistent with recent findings from ρ scaling scans. Several tokamak transport experiments are suggested

  17. Evaluation of thermal margin during BWR neutron flux oscillation

    International Nuclear Information System (INIS)

    Takeuchi, Yutaka; Takigawa, Yukio; Chuman, Kazuto; Ebata, Shigeo

    1992-01-01

    Fuel integrity is very important from the viewpoint of nuclear power plant safety. Recently, neutron flux oscillations were observed at several BWR plants. The present paper describes evaluations of the thermal margin during BWR neutron flux oscillations, using a three-dimensional transient code. The thermal margin is evaluated as the MCPR (minimum critical power ratio). The LaSalle-2 event, a core-wide oscillation in which a large neutron flux oscillation amplitude was observed, was simulated and the MCPR during the event was evaluated. The results indicate that the MCPR had a sufficient margin with regard to the design limit. A regional oscillation mode, which differs from a core-wide oscillation, was also simulated and the MCPR response was compared with that for the LaSalle-2 event. The MCPR decrement is greater in the regional oscillation than in the core-wide oscillation, because of the difference in sensitivity of the flow-to-power gain. A study was carried out on regional oscillation detectability from the MCPR response viewpoint. Even in a hypothetically severe case, the regional oscillation is detectable by LPRM signals. (author)

  18. Reconstructing Rodinia by Fitting Neoproterozoic Continental Margins

    Science.gov (United States)

    Stewart, John H.

    2009-01-01

    Reconstructions of Phanerozoic tectonic plates can be closely constrained by lithologic correlations across conjugate margins, by paleontologic information, by correlation of orogenic belts, by paleomagnetic location of continents, and by ocean-floor magnetic stripes. In contrast, Proterozoic reconstructions are hindered by the lack of some of these tools or by the lack of their precision. To overcome some of these difficulties, this report focuses on a different method of reconstruction, namely the use of the shape of continents to assemble the supercontinent of Rodinia, much like a jigsaw puzzle. Compared to the vast amount of information available for Phanerozoic systems, such a limited approach for Proterozoic rocks may seem suspect. However, using the assembly of the southern continents (South America, Africa, India, Arabia, Antarctica, and Australia) as an example, a very tight fit of the continents is apparent and illustrates the power of the jigsaw-puzzle method. This report focuses on Neoproterozoic rocks, which are shown on two new detailed geologic maps that constitute the backbone of the study; younger and older rocks are not discussed, or not discussed in detail. The Neoproterozoic continents and continental margins are identified based on the distribution of continental-margin sedimentary and magmatic rocks that define the break-up margins of Rodinia. These Neoproterozoic continental exposures, as well as critical Neo- and Mesoproterozoic tectonic features shown on the two new map compilations, are used to reconstruct the Mesoproterozoic supercontinent of Rodinia. This approach differs from the common approach of using fold belts to define structural features deemed important in the Rodinian reconstruction. Fold belts are difficult to date, and many are significantly younger than the time frame considered here (1,200 to 850 Ma). Identifying Neoproterozoic continental margins, which are primarily

  19. Comparison of histologic margin status in low-grade cutaneous and subcutaneous canine mast cell tumours examined by radial and tangential sections.

    Science.gov (United States)

    Dores, C B; Milovancev, M; Russell, D S

    2018-03-01

    Radial sections are widely used to estimate adequacy of excision in canine cutaneous mast cell tumours (MCTs); however, this sectioning technique estimates only a small fraction of total margin circumference. This study aimed to compare histologic margin status in grade II/low grade MCTs sectioned using both radial and tangential sectioning techniques. A total of 43 circumferential margins were evaluated from 21 different tumours. Margins were first sectioned radially, followed by tangential sections. Tissues were examined by routine histopathology. Tangential margin status differed in 10 of 43 (23.3%) margins compared with their initial status on radial section. Of 39 margins, 9 (23.1%) categorized as histologic tumour-free margin (HTFM) >0 mm were positive on tangential sectioning. Tangential sections detected a significantly higher proportion of positive margins relative to radial sections (exact 2-tailed P-value = .0215). The HTFM was significantly longer in negative tangential margins than positive tangential margins (mean 10.1 vs 3.2 mm; P = .0008). A receiver operating characteristic curve comparing HTFM and tangentially negative margins found an area under the curve of 0.83 (95% confidence interval: 0.71-0.96). Although correct classification peaked at the sixth cut-point of HTFM ≥1 mm, radial sections still incorrectly classified 50% of margins as lacking tumour cells. Radial sections had 100% specificity for predicting negative tangential margins at a cut-point of 10.9 mm. These data indicate that for low grade MCTs, HTFMs >0 mm should not be considered completely excised, particularly when HTFM is <10.9 mm. This will inform future studies that use HTFM and overall excisional status as dependent variables in multivariable prognostic models. © 2017 John Wiley & Sons Ltd.

  20. Medical Dataset Classification: A Machine Learning Paradigm Integrating Particle Swarm Optimization with Extreme Learning Machine Classifier

    Directory of Open Access Journals (Sweden)

    C. V. Subbulakshmi

    2015-01-01

    Full Text Available Medical data classification is a prime data mining problem that has been discussed for a decade and has attracted several researchers around the world. Most classifiers are designed to learn from the data itself using a training process, because the complete expert knowledge needed to determine classifier parameters is impracticable to obtain. This paper proposes a hybrid methodology based on a machine learning paradigm that integrates the successful self-regulated-learning exploration mechanism of the particle swarm optimization (PSO) algorithm with the extreme learning machine (ELM) classifier. A recent off-line learning method, the ELM is a single-hidden-layer feedforward neural network (FFNN) proved to be an excellent classifier with a large number of hidden-layer neurons. In this research, PSO is used to determine the optimum set of parameters for the ELM, thus reducing the number of hidden-layer neurons and further improving the network's generalization performance. The proposed method is tested on five benchmark datasets from the UCI Machine Learning Repository for medical dataset classification. Simulation results show that the proposed approach achieves good generalization performance compared with the results of other classifiers.
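The ELM half of the hybrid can be sketched compactly: the hidden-layer weights are drawn at random and fixed, and only the output weights are learned, in closed form by least squares. The paper's PSO step would wrap a search over these hyperparameters (e.g., the number of hidden neurons), scoring each candidate by validation accuracy; the sketch below shows only the ELM, with illustrative names:

```python
import numpy as np

def train_elm(X, y, n_hidden, rng):
    """Single-hidden-layer ELM: random input weights and biases, sigmoid
    activations, output weights solved by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))            # hidden activations
    beta, *_ = np.linalg.lstsq(H, y.astype(float), rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Threshold the real-valued network output at 0.5 for 0/1 labels."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return (H @ beta > 0.5).astype(int)
```

The closed-form solve is what makes each PSO fitness evaluation cheap: no iterative backpropagation is needed per candidate parameter set.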

  1. Obscenity detection using haar-like features and Gentle Adaboost classifier.

    Science.gov (United States)

    Mustafa, Rashed; Min, Yang; Zhu, Dingju

    2014-01-01

    A large exposed skin area in an image is commonly taken to indicate obscenity. This fact alone, however, may flag many false images that merely contain skin-like objects, and may miss images that have little exposed skin but show exposed erotogenic human body parts. This paper presents a novel method for detecting nipples in pornographic image contents; the nipple is treated as an erotogenic organ for identifying pornographic content in images. In this research, a Gentle Adaboost (GAB) haar-cascade classifier and haar-like features were used to ensure detection accuracy. A skin filter applied prior to detection made the system more robust. The experiments showed that, considering accuracy, the haar-cascade classifier performs well, but where detection time matters, the train-cascade classifier is more suitable. To validate the results, we used 1198 positive samples containing nipple objects and 1995 negative images. The detection rates for the haar-cascade and train-cascade classifiers are 0.9875 and 0.8429, respectively. The detection time is 0.162 seconds for the haar-cascade and 0.127 seconds for the train-cascade classifier.

  3. Ferritin associates with marginal band microtubules

    International Nuclear Information System (INIS)

    Infante, Anthony A.; Infante, Dzintra; Chan, M.-C.; How, P.-C.; Kutschera, Waltraud; Linhartova, Irena; Muellner, Ernst W.; Wiche, Gerhard; Propst, Friedrich

    2007-01-01

    We characterized chicken erythrocyte and human platelet ferritin by biochemical studies and immunofluorescence. Erythrocyte ferritin was found to be a homopolymer of H-ferritin subunits, resistant to proteinase K digestion, heat stable, and contained iron. In mature chicken erythrocytes and human platelets, ferritin was localized at the marginal band, a ring-shaped peripheral microtubule bundle, and displayed properties of bona fide microtubule-associated proteins such as tau. Red blood cell ferritin association with the marginal band was confirmed by temperature-induced disassembly-reassembly of microtubules. During erythrocyte differentiation, ferritin co-localized with coalescing microtubules during marginal band formation. In addition, ferritin was found in the nuclei of mature erythrocytes, but was not detectable in those of bone marrow erythrocyte precursors. These results suggest that ferritin has a function in marginal band formation and possibly in protection of the marginal band from damaging effects of reactive oxygen species by sequestering iron in the mature erythrocyte. Moreover, our data suggest that ferritin and syncolin, a previously identified erythrocyte microtubule-associated protein, are identical. Nuclear ferritin might contribute to transcriptional silencing or, alternatively, constitute a ferritin reservoir

  4. Risk insights from seismic margin reviews

    International Nuclear Information System (INIS)

    Budnitz, R.J.

    1990-01-01

    This paper discusses the information that has been derived from the three seismic-margin reviews conducted so far, and the information that is potentially available from using the seismic-margin method more generally. There are two different methodologies for conducting seismic margin reviews of nuclear power plants, one developed under NRC sponsorship and one developed under sponsorship of the Electric Power Research Institute. Both methodologies will be covered in this paper. The paper begins with a summary of the steps necessary to complete a margin review, and will then outline the key technical difficulties that need to be addressed. After this introduction, the paper covers the safety and operational insights derived from the three seismic-margin reviews already completed: the NRC-sponsored review at Maine Yankee; the EPRI-sponsored review at Catawba; and the joint EPRI/NRC/utility effort at Hatch. The emphasis is on engineering insights, with attention to the aspects of the reviews that are easiest to perform and that provide the most readily available insights

  5. A decision support system using combined-classifier for high-speed data stream in smart grid

    Science.gov (United States)

    Yang, Hang; Li, Peng; He, Zhian; Guo, Xiaobin; Fong, Simon; Chen, Huajun

    2016-11-01

    Large volumes of high-speed streaming data are generated continuously by big power grids. In order to detect and avoid power grid failures, decision support systems (DSSs) are commonly adopted in power grid enterprises. Among decision-making algorithms, the incremental decision tree is the most widely used. In this paper, we propose a combined classifier that is a composite of a cache-based classifier (CBC) and a main tree classifier (MTC). We integrate this classifier into a stream processing engine on top of the DSS such that high-speed streaming data can be transformed into operational intelligence efficiently. Experimental results show that our proposed classifier returns more accurate answers than existing ones.
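A cache-plus-main composite can be illustrated with a deliberately simplified stand-in for the paper's CBC/MTC pair: a small cache of recent labelled instances answers queries close to something it has seen, and an incrementally updated nearest-centroid model answers the rest. All structure here is an assumption for illustration, not the paper's design:

```python
import numpy as np
from collections import deque

class CombinedStreamClassifier:
    """Cache of recent labelled instances (fast path) in front of an
    incrementally updated nearest-centroid model (fallback path)."""

    def __init__(self, cache_size=100, radius=0.5):
        self.cache = deque(maxlen=cache_size)   # recent (x, y) pairs
        self.sums, self.counts = {}, {}         # running per-class sums
        self.radius = radius

    def learn(self, x, y):
        """Incremental update from one labelled stream instance."""
        self.cache.append((x, y))
        self.sums[y] = self.sums.get(y, np.zeros_like(x, dtype=float)) + x
        self.counts[y] = self.counts.get(y, 0) + 1

    def predict(self, x):
        # Fast path: answer from the cache if a close neighbour exists.
        if self.cache:
            d, y = min((np.linalg.norm(x - cx), cy) for cx, cy in self.cache)
            if d <= self.radius:
                return y
        # Fallback: main classifier, nearest class centroid.
        return min(self.sums,
                   key=lambda c: np.linalg.norm(x - self.sums[c] / self.counts[c]))
```

The design point mirrored here is that the cache absorbs repetitive, recently seen traffic cheaply, while the main model carries the long-run decision boundary.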

  6. Class-specific Error Bounds for Ensemble Classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Prenger, R; Lemmond, T; Varshney, K; Chen, B; Hanley, W

    2009-10-06

    The generalization error, or probability of misclassification, of ensemble classifiers has been shown to be bounded above by a function of the mean correlation between the constituent (i.e., base) classifiers and their average strength. This bound suggests that increasing the strength and/or decreasing the correlation of an ensemble's base classifiers may yield improved performance under the assumption of equal error costs. However, this and other existing bounds do not directly address application spaces in which error costs are inherently unequal. For applications involving binary classification, Receiver Operating Characteristic (ROC) curves, performance curves that explicitly trade off false alarms and missed detections, are often utilized to support decision making. To address performance optimization in this context, we have developed a lower bound for the entire ROC curve that can be expressed in terms of the class-specific strength and correlation of the base classifiers. We present empirical analyses demonstrating the efficacy of these bounds in predicting relative classifier performance. In addition, we specify performance regions of the ROC curve that are naturally delineated by the class-specific strengths of the base classifiers and show that each of these regions can be associated with a unique set of guidelines for performance optimization of binary classifiers within unequal error cost regimes.
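The ROC curves used above to trade off false alarms and missed detections can be computed directly from classifier scores by sweeping the decision threshold; a minimal sketch with illustrative function names:

```python
import numpy as np

def roc_points(scores, labels):
    """Sweep the decision threshold over the sorted scores (higher score =
    more positive) and return false-alarm rates and detection rates."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    lab = np.asarray(labels)[order]
    tpr = np.cumsum(lab == 1) / max((lab == 1).sum(), 1)  # detection rate
    fpr = np.cumsum(lab == 0) / max((lab == 0).sum(), 1)  # false-alarm rate
    return fpr, tpr

def auc(fpr, tpr):
    """Trapezoidal area under the ROC curve, prepending the (0, 0) point."""
    f = np.concatenate([[0.0], fpr])
    t = np.concatenate([[0.0], tpr])
    return float((0.5 * (t[1:] + t[:-1]) * np.diff(f)).sum())
```

Class-specific bounds of the kind described in the abstract would then constrain where this empirical curve can lie, region by region.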

  7. Frog sound identification using extended k-nearest neighbor classifier

    Science.gov (United States)

    Mukahar, Nordiana; Affendi Rosdi, Bakhtiar; Athiar Ramli, Dzati; Jaafar, Haryati

    2017-09-01

    Frog sound identification based on vocalizations is important for biological research and environmental monitoring. As a result, different types of feature extraction and classifiers have been employed to evaluate the accuracy of frog sound identification. This paper presents frog sound identification with an Extended k-Nearest Neighbor (EKNN) classifier. The EKNN classifier integrates the nearest-neighbor and mutual-neighborhood concepts with the aim of improving classification performance: it makes a prediction based both on which training samples are the nearest neighbors of the testing sample and on which training samples count the testing sample among their own nearest neighbors. To evaluate classification performance in frog sound identification, the EKNN classifier is compared with competing classifiers, k-Nearest Neighbor (KNN), Fuzzy k-Nearest Neighbor (FKNN), k-General Nearest Neighbor (KGNN) and Mutual k-Nearest Neighbor (MKNN), on recorded sounds of 15 frog species obtained in Malaysian forests. The recorded sounds were segmented using Short Time Energy and Short Time Average Zero Crossing Rate (STE+STAZCR), sinusoidal modeling (SM), manual segmentation, and the combination of Energy (E) and Zero Crossing Rate (ZCR) (E+ZCR), while the features were extracted as Mel Frequency Cepstrum Coefficients (MFCC). The experimental results show that the EKNN classifier achieves the best accuracy compared to the competing classifiers, KNN, FKNN, KGNN and MKNN, in all cases.
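The "neighbors plus reverse neighbors" voting idea can be sketched as follows. This is an illustrative reading of the EKNN rule, not the authors' exact formulation:

```python
import numpy as np

def eknn_predict(Xtr, ytr, x, k=3):
    """Vote over two pools: x's k nearest training points, plus every
    training point that would count x among its own k nearest neighbours
    (labels assumed to be non-negative integers)."""
    d = np.linalg.norm(Xtr - x, axis=1)
    voters = set(np.argsort(d)[:k].tolist())      # x's k nearest neighbours
    for i in range(len(Xtr)):
        # distance from training point i to its k-th nearest training point
        di = np.linalg.norm(Xtr - Xtr[i], axis=1)
        di[i] = np.inf
        kth = np.sort(di)[k - 1]
        if d[i] <= kth:                           # x enters i's neighbourhood
            voters.add(i)
    votes = ytr[list(voters)]
    return int(np.argmax(np.bincount(votes)))
```

The reverse-neighbour pool is what distinguishes this from plain KNN: a query deep inside a class's territory collects extra votes from points that "claim" it, which tends to stabilise the decision near class boundaries. (The per-query loop over all training points is O(n^2) and is kept only for clarity.)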

  8. Magmatic development of the outer Vøring Margin

    Science.gov (United States)

    Breivik, Asbjorn; Faleide, Jan Inge; Mjelde, Rolf; Flueh, Ernst; Murai, Yoshio

    2013-04-01

    The Vøring Plateau off mid-Norway is a volcanic passive margin, located north of the East Jan Mayen Fracture Zone (EJMFZ). Large volumes of magmatic rocks were emplaced during Early Eocene margin formation. In 2003, an ocean bottom seismometer survey was acquired on the Vøring and Lofoten margins. One profile crosses from the Vøring Plateau to the Vøring Spur, an oceanic plateau north of the EJMFZ. The P-wave data were modeled by ray-tracing in a 2D velocity model of the crust. The process behind the excess magmatism can be estimated by comparing seismic velocity (VP) with igneous thickness (H). This profile and two other profiles farther north show a positive H-VP correlation, consistent with a hot mantle reservoir of finite extent under the margin at breakup. However, during the first two million years, magma production appears to have been augmented by a secondary process. By 51-51.5 Ma, melting may be explained by elevated mantle temperature alone. Seismic stratigraphy around the Vøring Spur shows at least two inversion events, with the main episode tentatively in the Upper Miocene, apparently through igneous growth that created the up to 15 km crustal thickness. The H-VP correlation of the spur is low, indicating constant, moderate-degree mantle melting not tied to the breakup magmatism. The admittance function between bathymetry and free-air gravity shows that the high is near local isostatic equilibrium, ruling out compressional flexure at the EJMFZ as the origin of the high. We also find no evidence for the proposed Early Eocene triple junction in the area.

  9. Removal of 230Th and 231Pa at ocean margins

    International Nuclear Information System (INIS)

    Anderson, R.F.; Bacon, M.P.; Brewer, P.G.

    1983-01-01

    Uranium, thorium and protactinium isotopes were measured in particulate matter collected by sediment traps deployed in the Panama Basin and by in-situ filtration of large volumes of seawater in the Panama and Guatemala Basins. Concentrations of dissolved Th and Pa isotopes were determined by extraction onto MnO2 adsorbers placed in line behind the filters in the in-situ pumping systems. Concentrations of dissolved 230Th and 231Pa in the Panama and Guatemala Basins are lower than in the open ocean, whereas dissolved 230Th/231Pa ratios are equal to, or slightly greater than, ratios in the open ocean. Particulate 230Th/231Pa ratios in the sediment trap samples ranged from 4 to 8, in contrast to ratios of 30 or more at the open ocean sites previously studied. Particles collected by filtration in the Panama Basin and nearest to the continental margin in the Guatemala Basin contained 230Th/231Pa ratios similar to the ratios in the sediment trap samples. The ratios increased with distance away from the continent. Suspended particles near the margin show no preference for adsorption of Th or Pa and therefore must be chemically different from particles in the open ocean, which show a strong preference for adsorption of Th. Ocean margins, as typified by the Panama and Guatemala Basins, are preferential sinks for 231Pa relative to 230Th. Furthermore, the margins are sinks for 230Th and, to a greater extent, 231Pa transported by horizontal mixing from the open ocean. (orig.)

  10. Ship localization in Santa Barbara Channel using machine learning classifiers.

    Science.gov (United States)

    Niu, Haiqiang; Ozanich, Emma; Gerstoft, Peter

    2017-11-01

    Machine learning classifiers are shown to outperform conventional matched field processing for a deep water (600 m depth) ocean acoustic-based ship range estimation problem in the Santa Barbara Channel Experiment when limited environmental information is known. Recordings of three different ships of opportunity on a vertical array were used as training and test data for the feed-forward neural network and support vector machine classifiers, demonstrating the feasibility of machine learning methods to locate unseen sources. The classifiers perform well up to 10 km range whereas the conventional matched field processing fails at about 4 km range without accurate environmental information.
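Casting range estimation as classification over discrete range bins can be illustrated with a deterministic nearest-centroid stand-in. The record itself trains feed-forward neural networks and SVMs on vertical-array recordings; the bins and feature values below are synthetic placeholders:

```python
import math

def train_centroids(X, y):
    """Mean feature vector per range bin -- a simple deterministic stand-in
    for the FNN/SVM classifiers trained on ship spectra in this record."""
    sums, counts = {}, {}
    for x, label in zip(X, y):
        s = sums.setdefault(label, [0.0] * len(x))
        for d, v in enumerate(x):
            s[d] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}

def predict_range_bin(centroids, x):
    """Assign a feature vector to the nearest bin centroid."""
    return min(centroids, key=lambda lab: math.dist(centroids[lab], x))
```
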

  11. Carfilzomib With or Without Rituximab in the Treatment of Waldenstrom Macroglobulinemia or Marginal Zone Lymphoma

    Science.gov (United States)

    2018-02-05

    Marginal Zone Lymphoma; Recurrent Marginal Zone Lymphoma; Recurrent Waldenstrom Macroglobulinemia; Refractory Marginal Zone Lymphoma; Refractory Waldenstrom Macroglobulinemia; Waldenstrom Macroglobulinemia

  12. Fission product margin in burnup credit analyses

    International Nuclear Information System (INIS)

    Finck, P.J.; Stenberg, C.G.

    1998-01-01

    The US Department of Energy (DOE) is currently working toward the licensing of a methodology for using actinide-only burnup credit for the transportation of spent nuclear fuel (SNF). Important margins are built into this methodology. By using comparisons with a representative experimental database to determine bias factors, the methodology ensures that actinide concentrations and worths are estimated conservatively; furthermore, the negative net reactivity of certain actinides and all fission products (FPs) is not taken into account, thus providing additional margin. A future step of DOE's effort might aim at establishing an actinide and FP burnup credit methodology. The objective of this work is to establish the uncertainty to be applied to the total FP worth in SNF. This will serve two ends. First, it will support the current actinide-only methodology by demonstrating the margin available from FPs. Second, it will identify the major contributions to the uncertainty and help set priorities for future work

  13. Refining prices and margins in 1998

    International Nuclear Information System (INIS)

    Favennec, J.P.; Baudoin, C.

    1999-01-01

    Despite a business environment that was globally mediocre due primarily to the Asian crisis and to a mild winter in the northern hemisphere, the signs of improvement noted in the refining activity in 1996 were borne out in 1997. But the situation is not yet satisfactory in this sector: the low return on invested capital and the financing of environmental protection expenditure are giving cause for concern. In 1998, the drop in crude oil prices and the concomitant fall in petroleum product prices was ultimately rather favorable to margins. Two elements tended to put a damper on this relative optimism. First of all, margins continue to be extremely volatile and, secondly, the worsening of the economic and financial crisis observed during the summer made for a sharp decline in margins in all geographic regions, especially Asia

  14. Gas-processing profit margin series begins in OGJ

    International Nuclear Information System (INIS)

    Kovacs, K.J.

    1991-01-01

    This paper reports on the bases and methods employed by the WK (Wright, Killen and Co, Houston) profit-margin indicator for U.S. gas-processing plants. Additionally, this article reviews the historical profitability of the gas-processing industry and key factors affecting these trends. Texas was selected as the most representative for the industry, reflecting the wide spectrum of gas-processing plants. The profit performance of Texas' gas plants is of special significance because of the large number of plants and high volume of NGL production in the region

  15. Digital Margins : How spatially and socially marginalized communities deal with digital exclusion

    NARCIS (Netherlands)

    Salemink, Koen

    2016-01-01

    The increasing importance of the Internet as a means of communication has transformed economies and societies. For spatially and socially marginalized communities, this transformation has resulted in digital exclusion and further marginalization. This book presents a study of two kinds of

  16. Impact of organ shape variations on margin concepts for cervix cancer ART.

    Science.gov (United States)

    Seppenwoolde, Yvette; Stock, Markus; Buschmann, Martin; Georg, Dietmar; Bauer-Novotny, Kwei-Yuang; Pötter, Richard; Georg, Petra

    2016-09-01

    Target and organ movement motivate adaptive radiotherapy (ART) for cervix cancer patients. We investigated the dosimetric impact of margin concepts with different levels of complexity on both organ at risk (OAR) sparing and PTV coverage. Weekly CT and daily CBCT scans were delineated for 10 patients. The dosimetric impact of organ shape variations was evaluated for four (isotropic) margin concepts: two static PTVs (PTV-6mm and PTV-15mm), a PTV based on an ITV of the planning CT and the CBCTs of the first treatment week (PTV-ART-ITV), and an adaptive PTV based on a library approach (PTV-ART-Library). Using static concepts, OAR doses increased with large margins, while smaller margins compromised target coverage. ART PTVs resulted in comparable target coverage and better sparing of bladder (V40Gy: 15% and 7% less), rectum (V40Gy: 18 and 6 cc less) and bowel (V40Gy: 106 and 15 cc less) compared to PTV-15mm. Target coverage evaluation showed that a static 5 mm margin sufficed for the elective fields. PTV-ART-Library achieved the best dosimetric results; however, when weighing clinical benefit against workload, ITV margins based on repetitive movement evaluation during the first week also provide improvements over static margin concepts. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. Estimation of Model's Marginal likelihood Using Adaptive Sparse Grid Surrogates in Bayesian Model Averaging

    Science.gov (United States)

    Zeng, X.

    2015-12-01

    A large number of model executions are required to obtain alternative conceptual models' predictions and their posterior probabilities in Bayesian model averaging (BMA). The posterior model probability is estimated from a model's marginal likelihood and prior probability, and this heavy computational burden hinders the implementation of BMA prediction, especially for elaborate marginal likelihood estimators. To overcome the computational burden of BMA, an adaptive sparse grid (SG) stochastic collocation method is used to build surrogates for alternative conceptual models in a numerical experiment with a synthetic groundwater model. Because BMA predictions depend on the posterior model weights (i.e., marginal likelihoods), this study also evaluated four marginal likelihood estimators: the arithmetic mean estimator (AME), harmonic mean estimator (HME), stabilized harmonic mean estimator (SHME), and thermodynamic integration estimator (TIE). The results demonstrate that TIE is accurate in estimating conceptual models' marginal likelihoods, and BMA-TIE has better predictive performance than the other BMA predictions. TIE is also highly stable: repeated TIE estimates of a conceptual model's marginal likelihood show significantly less variability than those of the other estimators. In addition, the SG surrogates efficiently facilitate BMA predictions, especially for BMA-TIE: the number of model executions needed to build the surrogates is 4.13%, 6.89%, 3.44%, and 0.43% of the model executions required by BMA-AME, BMA-HME, BMA-SHME, and BMA-TIE, respectively.
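Two of the evaluated estimators, and TIE itself, have compact forms once likelihood samples are available. The sketch below assumes precomputed (hypothetical) likelihood values and a temperature ladder; it is a schematic of the estimators, not the study's implementation:

```python
def arithmetic_mean_estimator(prior_likelihoods):
    """AME: average the likelihood over parameter draws from the prior."""
    return sum(prior_likelihoods) / len(prior_likelihoods)

def harmonic_mean_estimator(posterior_likelihoods):
    """HME: harmonic mean of likelihoods over posterior draws (notoriously
    unstable, motivating the stabilized variant and TIE in this record)."""
    n = len(posterior_likelihoods)
    return n / sum(1.0 / L for L in posterior_likelihoods)

def thermodynamic_integration(betas, mean_log_like):
    """TIE: log marginal likelihood = integral over beta in [0, 1] of
    E_beta[log L], here via the trapezoidal rule on a temperature ladder."""
    return sum(0.5 * (e0 + e1) * (b1 - b0)
               for b0, b1, e0, e1 in zip(betas, betas[1:],
                                         mean_log_like, mean_log_like[1:]))
```
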

  18. Deregulated model and locational marginal pricing

    International Nuclear Information System (INIS)

    Sood, Yog Raj; Padhy, N.P.; Gupta, H.O.

    2007-01-01

    This paper presents a generalized optimal dispatch model that clears the pool in combination with privately negotiated bilateral and multilateral contracts while maximizing social benefit. The model determines locational marginal prices (LMP) based on marginal cost theory, as well as the size of non-firm transactions and the pool demand and generation. Both firm and non-firm transactions are considered in the model. The proposed model has been applied to the IEEE 30-bus test system, to which different types of transactions are added to analyze the model. (author)

  19. Seismic safety margins research program overview

    International Nuclear Information System (INIS)

    Tokarz, F.J.; Smith, P.D.

    1978-01-01

    A multiyear seismic research program has been initiated at the Lawrence Livermore Laboratory. This program, the Seismic Safety Margins Research Program (SSMRP) is funded by the U.S. Nuclear Regulatory Commission, Office of Nuclear Regulatory Research. The program is designed to develop a probabilistic systems methodology for determining the seismic safety margins of nuclear power plants. Phase I, extending some 22 months, began in July 1978 at a funding level of approximately $4.3 million. Here we present an overview of the SSMRP. Included are discussions on the program objective, the approach to meet the program goal and objectives, end products, the probabilistic systems methodology, and planned activities for Phase I

  20. Slope failure of chalk channel margins

    DEFF Research Database (Denmark)

    Gale, A.; Anderskouv, Kresten; Surlyk, Finn

    2015-01-01

    provide evidence for recurring margin collapse of a long-lived Campanian channel. Compressionally deformed and thrust chalk hardgrounds are correlated to thicker, non-cemented chalk beds that form a broad, gentle anticline. These chalks represent a slump complex with a roll-over anticline of expanded, non-cemented chalk in the head region and a culmination of condensed hardgrounds in the toe region. Observations strongly suggest that the slumping represents collapse of a channel margin. Farther northwards, the contemporaneous succession shows evidence of small-scale penecontemporaneous normal faulting towards...

  1. Evaluation of thermal margin for HANARO core

    Energy Technology Data Exchange (ETDEWEB)

    Park, Cheol; Chae, Hee Taek; Kim Heon Il; Lim, I. C.; Lee, C. S.; Kim, H

    1999-08-01

    During the commissioning and start-up of HANARO, various design parameters were confirmed and measured. For safer operation of HANARO and resolution of the CHF penalty issue, one of the unresolved licensing problems, thermal margins for normal and transient conditions were re-evaluated, reflecting the commissioning and start-up test results and the design modifications made during operation. The re-evaluation shows that HANARO meets the design criteria for ONB margin and fuel centerline temperature under normal conditions. For upset conditions, it also satisfies the safety limits for CHFR and fuel centerline temperature. (Author). 11 refs., 13 tabs., 4 figs.

  2. On probabilistically defined margins in radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Papiez, Lech; Langer, Mark [Department of Radiation Oncology, Indiana University, Indianapolis, IN (United States)

    2006-08-21

    Margins about a target volume subject to external beam radiation therapy are designed to assure that the target volume of tissue to be sterilized by treatment is adequately covered by a lethal dose. Thus, margins are meant to guarantee that all potential variation in tumour position relative to beams allows the tumour to stay within the margin. Variation in tumour position can be broken into two types of dislocations, reducible and irreducible. Reducible variations in tumour position are those that can be accommodated with the use of modern image-guided techniques that derive parameters for compensating motions of patient bodies and/or motions of beams relative to patient bodies. Irreducible variations in tumour position are those random dislocations of a target that are related to errors intrinsic in the design and performance limitations of the software and hardware, as well as limitations of human perception and decision making. Thus, margins in the era of image-guided treatments will need to accommodate only random errors residual in patient setup accuracy (after image-guided setup corrections) and in the accuracy of systems designed to track moving and deforming tissues of the targeted regions of the patient's body. Therefore, construction of these margins will have to be based on purely statistical data. The characteristics of these data have to be determined through the central limit theorem and Gaussian properties of limiting error distributions. In this paper, we show how statistically determined margins are to be designed in the general case of correlated distributions of position errors in three-dimensional space. In particular, we show how the minimal margins for a given level of statistical confidence are found. Then, we show how they are used to determine the geometrically minimal PTV that provides coverage of the GTV at the assumed level of statistical confidence. Our results generalize earlier recommendations for statistical, central limit theorem
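For the isotropic special case, a statistically determined margin reduces to scaling the per-axis error standard deviation by a chi-square quantile (3 degrees of freedom for 3-D position errors). A self-contained sketch, with the closed-form chi-square(3) CDF inverted by bisection; this illustrates only the simplest case, not the authors' general correlated-error construction:

```python
import math

def chi2_3_cdf(x):
    """CDF of the chi-square distribution with 3 degrees of freedom."""
    return math.erf(math.sqrt(x / 2)) - math.sqrt(2 * x / math.pi) * math.exp(-x / 2)

def chi2_3_quantile(p, lo=0.0, hi=50.0):
    """Invert the monotone CDF by bisection."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if chi2_3_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def isotropic_margin(sigma, confidence):
    """Margin radius covering the target with the given confidence, for
    i.i.d. Gaussian position errors with std sigma on each axis. For
    correlated errors the sphere becomes an ellipsoid whose semi-axes
    scale with the square roots of the covariance eigenvalues."""
    return sigma * math.sqrt(chi2_3_quantile(confidence))
```
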

  3. On probabilistically defined margins in radiation therapy

    International Nuclear Information System (INIS)

    Papiez, Lech; Langer, Mark

    2006-01-01

    Margins about a target volume subject to external beam radiation therapy are designed to assure that the target volume of tissue to be sterilized by treatment is adequately covered by a lethal dose. Thus, margins are meant to guarantee that all potential variation in tumour position relative to beams allows the tumour to stay within the margin. Variation in tumour position can be broken into two types of dislocations, reducible and irreducible. Reducible variations in tumour position are those that can be accommodated with the use of modern image-guided techniques that derive parameters for compensating motions of patient bodies and/or motions of beams relative to patient bodies. Irreducible variations in tumour position are those random dislocations of a target that are related to errors intrinsic in the design and performance limitations of the software and hardware, as well as limitations of human perception and decision making. Thus, margins in the era of image-guided treatments will need to accommodate only random errors residual in patient setup accuracy (after image-guided setup corrections) and in the accuracy of systems designed to track moving and deforming tissues of the targeted regions of the patient's body. Therefore, construction of these margins will have to be based on purely statistical data. The characteristics of these data have to be determined through the central limit theorem and Gaussian properties of limiting error distributions. In this paper, we show how statistically determined margins are to be designed in the general case of correlated distributions of position errors in three-dimensional space. In particular, we show how the minimal margins for a given level of statistical confidence are found. Then, we show how they are used to determine the geometrically minimal PTV that provides coverage of the GTV at the assumed level of statistical confidence. Our results generalize earlier recommendations for statistical, central limit theorem

  4. Classifying hot water chemistry: Application of MULTIVARIATE STATISTICS

    OpenAIRE

    Sumintadireja, Prihadi; Irawan, Dasapta Erwin; Rezky, Yuanno; Gio, Prana Ugiana; Agustin, Anggita

    2016-01-01

    This file is the dataset for the following paper "Classifying hot water chemistry: Application of MULTIVARIATE STATISTICS". Authors: Prihadi Sumintadireja, Dasapta Erwin Irawan, Yuanno Rezky, Prana Ugiana Gio, Anggita Agustin

  5. Using Statistical Process Control Methods to Classify Pilot Mental Workloads

    National Research Council Canada - National Science Library

    Kudo, Terence

    2001-01-01

    .... These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology on different psychophysiological features in an attempt to classify pilot mental workload...

  6. An ensemble classifier to predict track geometry degradation

    International Nuclear Information System (INIS)

    Cárdenas-Gallo, Iván; Sarmiento, Carlos A.; Morales, Gilberto A.; Bolivar, Manuel A.; Akhavan-Tabatabaei, Raha

    2017-01-01

    Railway operations are inherently complex and source of several problems. In particular, track geometry defects are one of the leading causes of train accidents in the United States. This paper presents a solution approach which entails the construction of an ensemble classifier to forecast the degradation of track geometry. Our classifier is constructed by solving the problem from three different perspectives: deterioration, regression and classification. We considered a different model from each perspective and our results show that using an ensemble method improves the predictive performance. - Highlights: • We present an ensemble classifier to forecast the degradation of track geometry. • Our classifier considers three perspectives: deterioration, regression and classification. • We construct and test three models and our results show that using an ensemble method improves the predictive performance.
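Combining the three perspectives can be as simple as a (possibly weighted) vote over the individual models' binary forecasts. A sketch, with hypothetical weights standing in for whatever combiner the authors actually used:

```python
def majority_vote(preds):
    """Majority vote over the perspective models' binary outputs
    (1 = track geometry degradation expected, 0 = not)."""
    return int(sum(preds) > len(preds) / 2)

def weighted_vote(preds, weights):
    """Confidence-weighted variant: weights could come from each model's
    validation accuracy (hypothetical; the paper's combiner may differ)."""
    score = sum(w * p for p, w in zip(preds, weights)) / sum(weights)
    return int(score >= 0.5)
```
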

  7. A novel statistical method for classifying habitat generalists and specialists

    DEFF Research Database (Denmark)

    Chazdon, Robin L; Chao, Anne; Colwell, Robert K

    2011-01-01

    The model classifies each species into one of four groups: (1) generalist; (2) habitat A specialist; (3) habitat B specialist; and (4) too rare to classify with confidence. We illustrate our multinomial classification method using two contrasting data sets: (1) bird abundance in woodland and heath habitats in southeastern Australia and (2) tree abundance in second-growth (SG) and old-growth (OG) rain forests in the Caribbean lowlands of northeastern Costa Rica. We evaluate the multinomial model in detail for the tree data set. Our results for birds were highly concordant with a previous nonstatistical classification, but our method classified a higher fraction (57.7%) of bird species with statistical confidence. Based on a conservative specialization threshold and adjustment for multiple comparisons, 64.4% of tree species in the full sample were too rare to classify with confidence. Among the species classified, OG specialists constituted the largest...
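The four-way decision can be imitated with a much cruder rule than the multinomial model: a normal-approximation confidence interval on the habitat-A proportion plus a minimum-sample cutoff. This is a sketch of the classification logic only; the thresholds are illustrative and the published method is model-based:

```python
import math

def classify_species(n_a, n_b, threshold=2 / 3, z=1.96, min_total=10):
    """Toy four-way classification: 'generalist', 'A specialist',
    'B specialist', or 'too rare', from abundance counts in two habitats."""
    n = n_a + n_b
    if n < min_total:
        return "too rare"
    p = n_a / n
    half = z * math.sqrt(p * (1 - p) / n)   # normal-approximation CI half-width
    if p - half > threshold:
        return "A specialist"
    if p + half < 1 - threshold:
        return "B specialist"
    return "generalist"
```
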

  8. 6 CFR 7.23 - Emergency release of classified information.

    Science.gov (United States)

    2010-01-01

    ... Classified Information Non-disclosure Form. In emergency situations requiring immediate verbal release of... information through approved communication channels by the most secure and expeditious method possible, or by...

  9. DECISION TREE CLASSIFIERS FOR STAR/GALAXY SEPARATION

    International Nuclear Information System (INIS)

    Vasconcellos, E. C.; Ruiz, R. S. R.; De Carvalho, R. R.; Capelato, H. V.; Gal, R. R.; LaBarbera, F. L.; Frago Campos Velho, H.; Trevisan, M.

    2011-01-01

    We study the star/galaxy classification efficiency of 13 different decision tree algorithms applied to photometric objects in the Sloan Digital Sky Survey Data Release Seven (SDSS-DR7). Each algorithm is defined by a set of parameters which, when varied, produce different final classification trees. We extensively explore the parameter space of each algorithm, using the set of 884,126 SDSS objects with spectroscopic data as the training set. The efficiency of star-galaxy separation is measured using the completeness function. We find that the Functional Tree algorithm (FT) yields the best results as measured by the mean completeness in two magnitude intervals: 14 ≤ r ≤ 21 (85.2%) and r ≥ 19 (82.1%). We compare the performance of the tree generated with the optimal FT configuration to the classifications provided by the SDSS parametric classifier, 2DPHOT, and Ball et al. We find that our FT classifier is comparable to or better in completeness over the full magnitude range 15 ≤ r ≤ 21, with much lower contamination than all but the Ball et al. classifier. At the faintest magnitudes (r > 19), our classifier is the only one that maintains high completeness (>80%) while simultaneously achieving low contamination (∼2.5%). We also examine the SDSS parametric classifier (psfMag - modelMag) to see if the dividing line between stars and galaxies can be adjusted to improve the classifier. We find that currently stars in close pairs are often misclassified as galaxies, and suggest a new cut to improve the classifier. Finally, we apply our FT classifier to separate stars from galaxies in the full set of 69,545,326 SDSS photometric objects in the magnitude range 14 ≤ r ≤ 21.
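The two metrics and the parametric cut mentioned above are easy to make concrete. The helper below computes completeness and contamination for one class from predicted labels, and sketches the SDSS-style psfMag - modelMag cut (the 0.145 threshold is the commonly cited SDSS default; treat it as an assumption here):

```python
def completeness_contamination(true_labels, pred_labels, cls):
    """Completeness = fraction of true `cls` objects recovered; contamination
    = fraction of objects classified as `cls` that are not truly `cls`."""
    tp = sum(1 for t, p in zip(true_labels, pred_labels) if t == cls and p == cls)
    n_true = sum(1 for t in true_labels if t == cls)
    n_pred = sum(1 for p in pred_labels if p == cls)
    completeness = tp / n_true if n_true else 0.0
    contamination = (n_pred - tp) / n_pred if n_pred else 0.0
    return completeness, contamination

def sdss_cut_classifier(psf_mag, model_mag, cut=0.145):
    """SDSS-style concentration cut: point sources (stars) have
    psfMag close to modelMag, extended sources (galaxies) do not."""
    return "star" if (psf_mag - model_mag) < cut else "galaxy"
```
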

  10. A support vector machine classifier reduces interscanner variation in the HRCT classification of regional disease pattern in diffuse lung disease: Comparison to a Bayesian classifier

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Yongjun; Lim, Jonghyuck; Kim, Namkug; Seo, Joon Beom [Department of Radiology, University of Ulsan College of Medicine, 388-1 Pungnap2-dong, Songpa-gu, Seoul 138-736 (Korea, Republic of); Lynch, David A. [Department of Radiology, National Jewish Medical and Research Center, Denver, Colorado 80206 (United States)

    2013-05-15

    integrated ROI data obtained from both scanners, the classification accuracies with the SVM and Bayesian classifiers were 92% and 77%, respectively. The selected features resulting from the classification process differed by scanner, with more features included for the classification of the integrated HRCT data than for the classification of the HRCT data from each scanner. For the integrated data, consisting of HRCT images of both scanners, the classification accuracy based on the SVM was statistically similar to the accuracy of the data obtained from each scanner. However, the classification accuracy of the integrated data using the Bayesian classifier was significantly lower than the classification accuracy of the ROI data of each scanner. Conclusions: The use of an integrated dataset along with a SVM classifier rather than a Bayesian classifier has benefits in terms of the classification accuracy of HRCT images acquired with more than one scanner. This finding is of relevance in studies involving large number of images, as is the case in a multicenter trial with different scanners.

  11. A support vector machine classifier reduces interscanner variation in the HRCT classification of regional disease pattern in diffuse lung disease: Comparison to a Bayesian classifier

    International Nuclear Information System (INIS)

    Chang, Yongjun; Lim, Jonghyuck; Kim, Namkug; Seo, Joon Beom; Lynch, David A.

    2013-01-01

    data obtained from both scanners, the classification accuracies with the SVM and Bayesian classifiers were 92% and 77%, respectively. The selected features resulting from the classification process differed by scanner, with more features included for the classification of the integrated HRCT data than for the classification of the HRCT data from each scanner. For the integrated data, consisting of HRCT images of both scanners, the classification accuracy based on the SVM was statistically similar to the accuracy of the data obtained from each scanner. However, the classification accuracy of the integrated data using the Bayesian classifier was significantly lower than the classification accuracy of the ROI data of each scanner. Conclusions: The use of an integrated dataset along with a SVM classifier rather than a Bayesian classifier has benefits in terms of the classification accuracy of HRCT images acquired with more than one scanner. This finding is of relevance in studies involving large number of images, as is the case in a multicenter trial with different scanners.

  12. Re-appraisal of the Magma-rich versus Magma-poor Paradigm at Rifted Margins: consequences for breakup processes

    Science.gov (United States)

    Tugend, J.; Gillard, M.; Manatschal, G.; Nirrengarten, M.; Harkin, C. J.; Epin, M. E.; Sauter, D.; Autin, J.; Kusznir, N. J.; McDermott, K.

    2017-12-01

    Rifted margins are often classified based on their magmatic budget only. Magma-rich margins are commonly considered to have excess decompression melting at lithospheric breakup compared with steady state seafloor spreading while magma-poor margins have suppressed melting. New observations derived from high quality geophysical data sets and drill-hole data have revealed the diversity of rifted margin architecture and variable distribution of magmatism. Recent studies suggest, however, that rifted margins have more complex and polyphase tectono-magmatic evolutions than previously assumed and cannot be characterized based on the observed volume of magma alone. We compare the magmatic budget related to lithospheric breakup along two high-resolution long-offset deep reflection seismic profiles across the SE-Indian (magma-poor) and Uruguayan (magma-rich) rifted margins. Resolving the volume of magmatic additions is difficult. Interpretations are non-unique and several of them appear plausible for each case involving variable magmatic volumes and mechanisms to achieve lithospheric breakup. A supposedly 'magma-poor' rifted margin (SE-India) may show a 'magma-rich' lithospheric breakup whereas a 'magma-rich' rifted margin (Uruguay) does not necessarily show excess magmatism at lithospheric breakup compared with steady-state seafloor spreading. This questions the paradigm that rifted margins can be subdivided in either magma-poor or magma-rich margins. The Uruguayan and other magma-rich rifted margins appear characterized by an early onset of decompression melting relative to crustal breakup. For the converse, where the onset of decompression melting is late compared with the timing of crustal breakup, mantle exhumation can occur (e.g. SE-India). Our work highlights the difficulty in determining a magmatic budget at rifted margins based on seismic reflection data alone, showing the limitations of margin classification based solely on magmatic volumes. The timing of

  13. Local-global classifier fusion for screening chest radiographs

    Science.gov (United States)

    Ding, Meng; Antani, Sameer; Jaeger, Stefan; Xue, Zhiyun; Candemir, Sema; Kohli, Marc; Thoma, George

    2017-03-01

    Tuberculosis (TB) is a severe comorbidity of HIV and chest x-ray (CXR) analysis is a necessary step in screening for the infective disease. Automatic analysis of digital CXR images for detecting pulmonary abnormalities is critical for population screening, especially in medical resource constrained developing regions. In this article, we describe steps that improve previously reported performance of NLM's CXR screening algorithms and help advance the state of the art in the field. We propose a local-global classifier fusion method where two complementary classification systems are combined. The local classifier focuses on subtle and partial presentation of the disease leveraging information in radiology reports that roughly indicates locations of the abnormalities. In addition, the global classifier models the dominant spatial structure in the gestalt image using GIST descriptor for the semantic differentiation. Finally, the two complementary classifiers are combined using linear fusion, where the weight of each decision is calculated by the confidence probabilities from the two classifiers. We evaluated our method on three datasets in terms of the area under the Receiver Operating Characteristic (ROC) curve, sensitivity, specificity and accuracy. The evaluation demonstrates the superiority of our proposed local-global fusion method over any single classifier.
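The linear fusion step can be sketched directly: each classifier's abnormality probability is weighted by its confidence and the weighted scores are combined (names illustrative; in the record the weights come from the two classifiers' own confidence probabilities):

```python
def fuse(p_local, p_global, w_local, w_global):
    """Confidence-weighted linear fusion of the local and global
    classifiers' abnormality probabilities."""
    total = w_local + w_global
    return (w_local * p_local + w_global * p_global) / total
```

With equal weights the fusion is a plain average; a more confident local classifier pulls the fused score toward its own decision.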

  14. Verification of classified fissile material using unclassified attributes

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Fearey, B.L.; Puckett, J.M.; Tape, J.W.

    1998-01-01

    This paper reports on the most recent efforts of US technical experts to explore verification by IAEA of unclassified attributes of classified excess fissile material. Two propositions are discussed: (1) that multiple unclassified attributes could be declared by the host nation and then verified (and reverified) by the IAEA in order to provide confidence in that declaration of a classified (or unclassified) inventory while protecting classified or sensitive information; and (2) that attributes could be measured, remeasured, or monitored to provide continuity of knowledge in a nonintrusive and unclassified manner. They believe attributes should relate to characteristics of excess weapons materials and should be verifiable and authenticatable with methods usable by IAEA inspectors. Further, attributes (along with the methods to measure them) must not reveal any classified information. The approach that the authors have taken is as follows: (1) assume certain attributes of classified excess material, (2) identify passive signatures, (3) determine range of applicable measurement physics, (4) develop a set of criteria to assess and select measurement technologies, (5) select existing instrumentation for proof-of-principle measurements and demonstration, and (6) develop and design information barriers to protect classified information. While the attribute verification concepts and measurements discussed in this paper appear promising, neither the attribute verification approach nor the measurement technologies have been fully developed, tested, and evaluated

  15. A cardiorespiratory classifier of voluntary and involuntary electrodermal activity

    Directory of Open Access Journals (Sweden)

    Sejdic Ervin

    2010-02-01

    Full Text Available Abstract. Background: Electrodermal reactions (EDRs) can be attributed to many origins, including spontaneous fluctuations of electrodermal activity (EDA) and stimuli such as deep inspirations, voluntary mental activity and startling events. In fields that use EDA as a measure of psychophysiological state, the fact that EDRs may be elicited by many different stimuli is often ignored. This study attempts to classify observed EDRs as voluntary (i.e., generated from intentional respiratory or mental activity) or involuntary (i.e., generated from startling events or spontaneous electrodermal fluctuations). Methods: Eight able-bodied participants were subjected to conditions that would cause a change in EDA: music imagery, startling noises, and deep inspirations. A user-centered cardiorespiratory classifier consisting of (1) an EDR detector, (2) a respiratory filter and (3) a cardiorespiratory filter was developed to automatically detect a participant's EDRs and to classify the origin of their stimulation as voluntary or involuntary. Results: Detected EDRs were classified with a positive predictive value of 78%, a negative predictive value of 81% and an overall accuracy of 78%. Without the classifier, EDRs could only be correctly attributed as voluntary or involuntary with an accuracy of 50%. Conclusions: The proposed classifier may enable investigators to form more accurate interpretations of electrodermal activity as a measure of an individual's psychophysiological state.
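
The reported metrics can be reproduced from confusion-matrix counts; the counts below are hypothetical, chosen only to illustrate the definitions of positive predictive value, negative predictive value and accuracy:

```python
# Definitions of the three reported metrics, from hypothetical
# confusion-matrix counts (not the study's data).
def predictive_values(tp, fp, tn, fn):
    ppv = tp / (tp + fp)   # of EDRs classified voluntary, fraction truly voluntary
    npv = tn / (tn + fn)   # of EDRs classified involuntary, fraction truly involuntary
    acc = (tp + tn) / (tp + fp + tn + fn)
    return ppv, npv, acc

ppv, npv, acc = predictive_values(tp=39, fp=11, tn=39, fn=9)
```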

  16. Geochemical evidences of methane hydrate dissociation in Alaskan Beaufort Margin during Holocene

    Science.gov (United States)

    Uchida, M.; Rella, S.; Kubota, Y.; Kumata, H.; Mantoku, K.; Nishino, S.; Itoh, M.

    2017-12-01

    The Alaskan Beaufort margin bears large abundances of sub-sea and permafrost methane hydrate [Ruppel, 2016]. Previously reported direct and indirect evidence accumulated from geochemical data on marginal-sea sediments suggests that, during the Last Glacial, methane was episodically released from hydrate trapped in seafloor sediments [Kennett et al., 2000; Uchida et al., 2006, 2008; Cook et al., 2011]. Here we analyzed stable isotopes of foraminifera and a molecular marker derived from the activity of methanotrophic bacteria in piston cores collected during the 2010 R/V Mirai cruise along the Alaskan Beaufort Margin. Our data show highly depleted 13C compositions of benthic foraminifera, an indirect record of enhanced incorporation of 13C-depleted CO2 formed by methanotrophic processes that use 12C-enriched methane as their main carbon source. This is the first evidence of methane hydrate dissociation along the Alaskan margin. We discuss the timing of the methane dissociation signals in relation to the variability of sea ice and of intermediate Atlantic water temperature. The dissociation of methane hydrate along the Alaskan margin may be modulated by warming of the warm Atlantic intermediate water. Our results suggest that Arctic marginal regions bearing large amounts of methane hydrate may have a profound effect on future climate change.

  17. Regulatory taxation of large energy users reconsidered

    International Nuclear Information System (INIS)

    Mannaerts, H.

    2002-01-01

    Energy policy in the Netherlands with respect to the basic industries has been restrained. National energy taxation is considered unsuitable for large energy users because of its international reallocation effects. However, alternative measures such as energy restrictions and marginal taxation induce low average and high marginal energy costs and consequently generate small displacement effects together with large energy savings. A system of tradable permits not only has the advantage of low average and high marginal costs, but also prevents the situation in which one firm invests in relatively expensive energy-saving options while other firms refrain from exploiting their relatively cheap saving options.

  18. The stability margin on EAST tokamak

    International Nuclear Information System (INIS)

    Jin-Ping, Qian; Bao-Nian, Wan; Biao, Shen; Bing-Jia, Xiao; Walker, M.L.; Humphreys, D.A.

    2009-01-01

    The experimental advanced superconducting tokamak (EAST) is the first fully superconducting tokamak with a D-shaped cross-sectional plasma presently in operation. Its poloidal coils are relatively far from the plasma due to the necessary thermal isolation from the superconducting magnets, which leads to relatively weak coupling between the plasma and the poloidal field. This may cause more difficulty in controlling the vertical instability using the poloidal coils. The measured growth rates of the vertical instability are compared with theoretical calculations based on a rigid plasma model. Poloidal beta and internal inductance are varied to investigate their effects on the stability margin by changing the values of the parameters α_n and γ_n (Howl et al 1992 Phys. Fluids B 4 1724), with the plasma shape fixed to a configuration with κ = 1.9 and δ = 0.5. A number of ways of studying the stability margin are investigated. Among them, changing the values of the parameters κ and l_i is shown to be the most effective way to increase the stability margin. Finally, a guideline for the stability margin M_s(κ, l_i, A), indicating whether the plasma in a new discharge scenario can be stabilized, is also presented in this paper.

  19. Fedme og risiko for marginal parodontitis

    DEFF Research Database (Denmark)

    Kongstad, Johanne; Keller, Amélie Cléo; Rohde, Jeanett Friis

    2017-01-01

    This narrative review describes the association between overweight/obesity and marginal periodontitis. The article is based on a selection of recent English-language literature and is motivated by the increasing prevalence of overweight and obesity in the population. It is, moreover, essential that dentists take a critical view of the possible consequences of systemic conditions for the development, progression, and treatment of marginal periodontitis. Various measures of obesity are mentioned, of which body mass index (BMI) and waist circumference are the most widely used. The problem of earlier studies applying different criteria for marginal periodontitis is touched upon. The literature review takes as its starting point the biological mechanisms that are triggered in adipose tissue by overweight/obesity and that lead to a chronic inflammatory state with release of, among other substances, adipokines. Epidemiological cross-sectional and longitudinal studies of...

  20. Second Language Learners' Use of Marginal Glosses

    Science.gov (United States)

    O'Donnell, Mary E.

    2012-01-01

    The use of marginal reading glosses by 18 second language (L2) learners is examined through a quantitative and qualitative analysis of audiotaped think-aloud protocols. How these readers interact with the glosses is identified and divided into five categories or gloss interactions. Examples from each are presented. The primary research question…

  1. RISK-INFORMED SAFETY MARGIN CHARACTERIZATION

    International Nuclear Information System (INIS)

    Dinh, Nam; Szilard, Ronaldo

    2009-01-01

    The concept of safety margins has served as a fundamental principle in the design and operation of commercial nuclear power plants (NPPs). With a safety margin defined as the minimum distance between a system's 'loading' and its 'capacity', plant design and operation are predicated on ensuring that an adequate safety margin for safety-significant parameters (e.g., fuel cladding temperature, containment pressure, etc.) is maintained over the spectrum of anticipated plant operating, transient, and accident conditions. To meet the anticipated challenges associated with extending the operational lifetimes of the current fleet of operating NPPs, the United States Department of Energy (USDOE), the Idaho National Laboratory (INL), and the Electric Power Research Institute (EPRI) have established a collaboration to conduct coordinated research to identify and address the technological challenges and opportunities that would likely affect the safe and economic operation of the existing NPP fleet over the postulated long-term time horizons. In this paper we describe a framework for developing and implementing a Risk-Informed Safety Margin Characterization (RISMC) approach to evaluate and manage changes in plant safety margins over long time horizons.

  2. Early math intervention for marginalized students

    DEFF Research Database (Denmark)

    Overgaard, Steffen; Tonnesen, Pia Beck

    2016-01-01

    This study is one of several substudies in the project Early Math Intervention for Marginalized Students (TMTM2014). The paper presents the initial process of this substudy, which was to be carried out in fall 2015. In the TMTM2014 project, 80 teachers, who completed a one-week course in the idea of TMTM...

  3. Mundhulens mikroflora hos patienter med marginal parodontitis

    DEFF Research Database (Denmark)

    Larsen, Tove; Fiehn, Nils-Erik

    2011-01-01

    Knowledge of the microbiology of marginal periodontitis began to advance in earnest about 40 years ago. The early knowledge was based on microscopic and culture studies of subgingival plaque. The application of newer molecular-biological methods has meant that our knowledge of the etiological factors ...

  4. 17 CFR 31.18 - Margin calls.

    Science.gov (United States)

    2010-04-01

    ... Commodity and Securities Exchanges, COMMODITY FUTURES TRADING COMMISSION, LEVERAGE TRANSACTIONS, § 31.18 Margin calls. (a) No leverage transaction merchant shall liquidate a leverage contract because of ... ; where a leverage transaction merchant is unable to effect personal contact with a leverage customer, a telegram sent to the ...

  5. Thinking on the Margin: A Classroom Experiment

    Science.gov (United States)

    Bangs, Joann

    2009-01-01

    One of the most important concepts being taught in principles classes is the idea of "thinking on the margin." It can also be one of the most difficult to get across. One of the most telling examples, according to this author, comes in trying to get students to learn the profit maximizing condition for perfectly competitive firms. She…

  6. CRBRP structural and thermal margin beyond the design base

    International Nuclear Information System (INIS)

    Strawbridge, L.E.

    1979-01-01

    Prudent margins beyond the design base have been included in the design of Clinch River Breeder Reactor Plant to further reduce the risk to the public from highly improbable occurrences. These margins include Structural Margin Beyond the Design Base to address the energetics aspects and Thermal Margin Beyond the Design Base to address the longer term thermal and radiological consequences. The assessments that led to the specification of these margins are described, along with the experimental support for those assessments. 8 refs

  7. Balanced sensitivity functions for tuning multi-dimensional Bayesian network classifiers

    NARCIS (Netherlands)

    Bolt, J.H.; van der Gaag, L.C.

    Multi-dimensional Bayesian network classifiers are Bayesian networks of restricted topological structure, which are tailored to classifying data instances into multiple dimensions. Like more traditional classifiers, multi-dimensional classifiers are typically learned from data and may include

  8. Nonparametric, Coupled Bayesian Dictionary and Classifier Learning for Hyperspectral Classification.

    Science.gov (United States)

    Akhtar, Naveed; Mian, Ajmal

    2017-10-03

    We present a principled approach to learning a discriminative dictionary along with a linear classifier for hyperspectral classification. Our approach places Gaussian Process priors over the dictionary to account for the relative smoothness of the natural spectra, whereas the classifier parameters are sampled from multivariate Gaussians. We employ two Beta-Bernoulli processes to jointly infer the dictionary and the classifier. These processes are coupled under the same sets of Bernoulli distributions. In our approach, these distributions signify the frequency of dictionary atom usage in representing class-specific training spectra, which also makes the dictionary discriminative. Due to the coupling between the dictionary and the classifier, the popularity of the atoms for representing different classes gets encoded into the classifier. This helps in predicting the class labels of test spectra, which are first represented over the dictionary by solving a simultaneous sparse optimization problem; the labels of the spectra are then predicted by feeding the resulting representations to the classifier. Our approach exploits the nonparametric Bayesian framework to automatically infer the dictionary size, the key parameter in discriminative dictionary learning. Moreover, it also has the desirable property of adaptively learning the association between the dictionary atoms and the class labels by itself. We use Gibbs sampling to infer the posterior probability distributions over the dictionary and the classifier under the proposed model, for which we derive analytical expressions. To establish the effectiveness of our approach, we test it on benchmark hyperspectral images. The classification performance is compared with state-of-the-art dictionary-learning-based classification methods.

  9. Classifying a smoker scale in adult daily and nondaily smokers.

    Science.gov (United States)

    Pulvers, Kim; Scheuermann, Taneisha S; Romero, Devan R; Basora, Brittany; Luo, Xianghua; Ahluwalia, Jasjit S

    2014-05-01

    Smoker identity, or the strength of beliefs about oneself as a smoker, is a robust marker of smoking behavior. However, many nondaily smokers do not identify as smokers, underestimating their risk for tobacco-related disease and resulting in missed intervention opportunities. Assessing underlying beliefs about the characteristics used to classify smokers may help explain the discrepancy between smoking behavior and smoker identity. This study examines the factor structure, reliability, and validity of the Classifying a Smoker scale among a racially diverse sample of adult smokers. A cross-sectional survey was administered through an online panel survey service to 2,376 current smokers who were at least 25 years of age. The sample was stratified to obtain equal numbers of 3 racial/ethnic groups (African American, Latino, and White) across smoking level (nondaily and daily smoking). The Classifying a Smoker scale displayed a single-factor structure and excellent internal consistency (α = .91). Classifying a Smoker scores significantly increased at each level of smoking, F(3,2375) = 23.68, p < .001. Higher scores were associated with a stronger smoker identity, stronger dependence on cigarettes, greater health risk perceptions, and more smoking friends, and respondents with higher scores were more likely to carry cigarettes. Classifying a Smoker scores explained unique variance in smoking variables above and beyond that explained by smoker identity. The present study supports the use of the Classifying a Smoker scale among diverse, experienced smokers. Stronger endorsement of the characteristics used to classify a smoker (i.e., stricter criteria) was positively associated with heavier smoking and related characteristics. Prospective studies are needed to inform prevention and treatment efforts.
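
The internal-consistency figure (α = .91) refers to Cronbach's alpha, which can be computed from an item-response matrix as below; the simulated responses are illustrative, not the study's data:

```python
# Cronbach's alpha from a hypothetical item-response matrix
# (rows = respondents, columns = scale items).
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))                 # shared trait
items = latent + 0.5 * rng.normal(size=(100, 4))   # 4 correlated items
alpha = cronbach_alpha(items)
```

Strongly correlated items drive alpha toward 1; uncorrelated items drive it toward 0.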

  10. Representative Vector Machines: A Unified Framework for Classical Classifiers.

    Science.gov (United States)

    Gui, Jie; Liu, Tongliang; Tao, Dacheng; Sun, Zhenan; Tan, Tieniu

    2016-08-01

    Classifier design is a fundamental problem in pattern recognition. A variety of pattern classification methods, such as the nearest neighbor (NN) classifier, the support vector machine (SVM), and sparse representation-based classification (SRC), have been proposed in the literature. These typical and widely used classifiers were originally developed from different theoretical or application motivations, and they are conventionally treated as independent and specific solutions for pattern classification. This paper proposes a novel pattern classification framework, namely, representative vector machines (RVMs for short). The basic idea of RVMs is to assign the class label of a test example according to its nearest representative vector. The contributions of RVMs are twofold. On the one hand, the proposed RVMs establish a unified framework of classical classifiers, because NN, SVM, and SRC can be interpreted as special cases of RVMs with different definitions of representative vectors. Thus, the underlying relationship among a number of classical classifiers is revealed for a better understanding of pattern classification. On the other hand, novel and advanced classifiers can be inspired by the framework of RVMs. For example, a robust pattern classification method called the discriminant vector machine (DVM) is motivated from RVMs. Given a test example, DVM first finds its k-NNs and then performs classification based on the robust M-estimator and manifold regularization. Extensive experimental evaluations on a variety of visual recognition tasks, such as face recognition (Yale and Face Recognition Grand Challenge databases), object categorization (Caltech-101 dataset), and action recognition (Action Similarity LAbeliNg), demonstrate the advantages of DVM over other classifiers.
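
The RVM decision rule, assigning a test example to the class of its nearest representative vector, can be sketched as follows; using the class mean as the representative yields the nearest-centroid special case (the toy data are illustrative, and the paper's NN, SVM and SRC instances use other representative choices):

```python
# Nearest-representative-vector classification: one special case of the
# RVM idea, with the class mean as the representative vector.
import numpy as np

def fit_representatives(X, y):
    """One representative vector (here, the mean) per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(reps, x):
    """Label of the class whose representative is nearest to x."""
    return min(reps, key=lambda c: np.linalg.norm(x - reps[c]))

X = np.array([[0.0, 0.0], [1.0, 0.0], [10.0, 10.0], [11.0, 10.0]])
y = np.array([0, 0, 1, 1])
reps = fit_representatives(X, y)
label = predict(reps, np.array([9.0, 9.0]))
```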

  11. Learning multivariate distributions by competitive assembly of marginals.

    Science.gov (United States)

    Sánchez-Vega, Francisco; Younes, Laurent; Geman, Donald

    2013-02-01

    We present a new framework for learning high-dimensional multivariate probability distributions from estimated marginals. The approach is motivated by compositional models and Bayesian networks, and designed to adapt to small sample sizes. We start with a large, overlapping set of elementary statistical building blocks, or "primitives," which are low-dimensional marginal distributions learned from data. Each variable may appear in many primitives. Subsets of primitives are combined in a Lego-like fashion to construct a probabilistic graphical model; only a small fraction of the primitives will participate in any valid construction. Since primitives can be precomputed, parameter estimation and structure search are separated. Model complexity is controlled by strong biases; we adapt the primitives to the amount of training data and impose rules which restrict the merging of them into allowable compositions. The likelihood of the data decomposes into a sum of local gains, one for each primitive in the final structure. We focus on a specific subclass of networks which are binary forests. Structure optimization corresponds to an integer linear program and the maximizing composition can be computed for reasonably large numbers of variables. Performance is evaluated using both synthetic data and real datasets from natural language processing and computational biology.

  12. Assessment of ablative margin after radiofrequency ablation for hepatocellular carcinoma; comparison between magnetic resonance imaging with ferucarbotran and enhanced CT with iodized oil deposition

    International Nuclear Information System (INIS)

    Koda, Masahiko; Tokunaga, Shiho; Fujise, Yuki; Kato, Jun; Matono, Tomomitsu; Sugihara, Takaaki; Nagahara, Takakazu; Ueki, Masaru; Murawaki, Yoshikazu; Kakite, Suguru; Yamashita, Eijiro

    2012-01-01

    Background and purpose: Our aim was to investigate whether magnetic resonance imaging (MRI) with ferucarbotran administered prior to radiofrequency ablation could accurately assess the ablative margin when compared with enhanced computed tomography (CT) with iodized oil marking. Materials and methods: We enrolled 27 patients with 32 hepatocellular carcinomas in which iodized oil deposits were visible throughout the nodule after transcatheter arterial chemoembolization. For these nodules, radiofrequency ablation was performed after ferucarbotran administration. We then performed T2-weighted MRI after 1 week and enhanced CT after 1 month. T2-weighted MRI demonstrated the ablative margin as a low-intensity rim. We classified the margin into three grades: margin (+), a high-intensity area with a continuous low-intensity rim; margin zero, a high-intensity area with a discontinuous low-intensity rim; and margin (−), a high-intensity area extending beyond the low-intensity rim. Results: In 28 (86%) of 32 nodules, there was agreement between MRI and CT. The overall agreement between the two modalities in the assessment of the ablative margin was good (κ = 0.759, 95% confidence interval: 0.480–1.000, p < 0.001). In four nodules, ablative margins on MRI were underestimated by one grade compared with CT. Conclusion: MRI using ferucarbotran is less invasive and allows earlier assessment than CT. The MRI technique performed similarly to enhanced CT with iodized oil marking in evaluating the ablative margin after radiofrequency ablation.
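
Cohen's kappa, the agreement statistic quoted above, is computed from a contingency table of gradings; the 3x3 table below is hypothetical, not the paper's data:

```python
# Cohen's kappa from a hypothetical 3x3 contingency table of margin
# grades (rows = MRI grading, columns = CT grading; counts illustrative).
import numpy as np

def cohens_kappa(table):
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_o = np.trace(table) / n                         # observed agreement
    p_e = (table.sum(0) * table.sum(1)).sum() / n**2  # chance agreement
    return (p_o - p_e) / (1 - p_e)

kappa = cohens_kappa([[10, 1, 0],
                      [1, 9, 1],
                      [0, 1, 9]])
```

Kappa corrects the raw agreement (here 28/32) for the agreement expected by chance from the marginal totals.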

  13. Crustal-Scale Fault Interaction at Rifted Margins and the Formation of Domain-Bounding Breakaway Complexes: Insights From Offshore Norway

    Science.gov (United States)

    Osmundsen, P. T.; Péron-Pinvidic, G.

    2018-03-01

    The large-magnitude faults that control crustal thinning and excision at rifted margins combine into laterally persistent structural boundaries that separate margin domains of contrasting morphology and structure. We term them breakaway complexes. At the Mid-Norwegian margin, we identify five principal breakaway complexes that separate the proximal, necking, distal, and outer margin domains. Downdip and lateral interactions between the faults that constitute breakaway complexes became fundamental to the evolution of the 3-D margin architecture. Different types of fault interaction are observed along and between these faults, but simple models for fault growth will not fully describe their evolution. These structures operate on the crustal scale, cut large thicknesses of heterogeneously layered lithosphere, and facilitate fundamental margin processes such as deformation coupling and exhumation. Variations in large-magnitude fault geometry, erosional footwall incision, and subsequent differential subsidence along the main breakaway complexes likely record the variable efficiency of these processes.

  14. The origin of Karaj dam basement sill marginal reversal by Soret fractionation

    Science.gov (United States)

    Maghdour-Mashhour, Reza; Esmaeily, Dariush

    2010-05-01

    The Karaj dam basement sill (KDBS), located northwest of Tehran in northern Iran, is one of several E-W-trending plutons in the Albourz Mountains. The KDBS consists of a layered series between upper and lower chilled margins. The rocks of the chilled margins are gabbroic in composition and porphyritic, with euhedral to subhedral plagioclase and clinopyroxene megacrysts up to 5 mm long. The rocks become coarse-grained toward the center of the sill and show a gradual transition from porphyritic to equigranular texture. Field and petrographic observations reveal a reverse trend of crystallization in the marginal units, from the eutectic point toward the main magma composition; i.e., the olivine-bearing gabbro (porphyritic chilled margin), which has a eutectic composition, crystallized prior to the marginal gabbros, which have a cotectic or near-cotectic composition, since plagioclase laths in the gabbroic unit are embedded in large crystals of clinopyroxene, a texture believed to result from the cotectic crystallization of plagioclase and clinopyroxene. Four major mechanisms are proposed and discussed in order to identify the one responsible for the formation of the marginal reversal, as follows. (1) Crystal settling is a gravity-dependent mechanism: phenocrysts must have settled to form a layer at the bottom of the sill with a sharp upper boundary, which is not observed in the KDBS. Moreover, the reverse fractionation of the inwardly dipping sequence of the sill occurs in layers with primary dips of up to 55°; the ability of marginal reversals to develop along such steeply inclined chamber margins by this mechanism is therefore implausible. (2) Multiple injections of successive magma pulses fail to explain the origin of the marginal reversal, since the transition along the entire length of the marginal reversal is gradual and there is no compositional break or chilled contact between the two units of the KDBS margin (olivine gabbro and marginal gabbro). (3) The idea of

  15. Current Directional Protection of Series Compensated Line Using Intelligent Classifier

    Directory of Open Access Journals (Sweden)

    M. Mollanezhad Heydarabadi

    2016-12-01

    Full Text Available The current-inversion condition leads to incorrect operation of current-based directional relays in power systems with series compensation devices. The application of an intelligent system for fault-direction classification is suggested in this paper. A new current-based directional protection scheme built on an intelligent classifier is proposed for the series compensated line. The proposed classifier uses only a half cycle of pre-fault and post-fault current samples at the relay location as its input. Numerous forward- and backward-fault simulations under different system conditions, on a transmission line with a fixed series capacitor, are carried out using PSCAD/EMTDC software. The applicability of the decision tree (DT), the probabilistic neural network (PNN) and the support vector machine (SVM) is investigated using data simulated under different system conditions. The performance comparison of the classifiers indicates that the SVM is the most suitable classifier for fault-direction discrimination. Backward faults can be accurately distinguished from forward faults even under current inversion, without requiring detection of the current-inversion condition.
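
The classifier comparison can be sketched as below; synthetic random features stand in for the half-cycle current samples (in the paper these come from PSCAD/EMTDC simulations), and only DT and SVM are shown since scikit-learn provides no PNN:

```python
# Hedged sketch of the DT-vs-SVM comparison on synthetic stand-in data;
# feature vectors and the labeling rule are invented for illustration.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# 200 "faults": 20 features stand in for pre-/post-fault current samples.
X = rng.normal(size=(200, 20))
y = (X[:, :5].sum(axis=1) > 0).astype(int)   # 1 = forward, 0 = backward
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

scores = {
    "DT": DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr).score(X_te, y_te),
    "SVM": SVC().fit(X_tr, y_tr).score(X_te, y_te),
}
```

On real simulated waveforms the same comparison would be run over many fault locations, resistances and inception angles.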

  16. Neural network classifier of attacks in IP telephony

    Science.gov (United States)

    Safarik, Jakub; Voznak, Miroslav; Mehic, Miralem; Partila, Pavol; Mikulec, Martin

    2014-05-01

    Various types of monitoring mechanisms allow us to detect and monitor the behavior of attackers in VoIP networks. Analysis of detected malicious traffic is crucial for further investigation and for hardening the network. This analysis is typically based on statistical methods, and this article presents a solution based on a neural network. The proposed algorithm is used as a classifier of attacks in a distributed monitoring network of independent honeypot probes. Information about attacks on these honeypots is collected on a centralized server and then classified. This classification is based on different mechanisms, one of which is the multilayer perceptron neural network. The article describes the inner structure of the neural network used, as well as information about its implementation. The learning set for this neural network is based on real attack data collected from an IP telephony honeypot called Dionaea. We prepare the learning set from real attack data after collecting, cleaning and aggregating this information. After proper training, the neural network is capable of classifying the 6 most commonly used types of VoIP attack. Using a neural network classifier brings more accurate attack classification in a distributed system of honeypots. With this approach it is possible to detect malicious behavior in different parts of networks that are logically or geographically divided, and to use the information from one network to harden security in other networks. The centralized server for the distributed set of nodes serves not only as a collector and classifier of attack data, but also as a mechanism for generating precautionary steps against attacks.
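
The multilayer-perceptron classification step can be sketched as follows, with synthetic, well-separated stand-in features replacing the real Dionaea attack data and six labels standing in for the attack types:

```python
# Hedged sketch of 6-class MLP classification; the feature construction
# is invented for illustration (real inputs would be features extracted
# from honeypot attack records).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_classes = 6
y = rng.integers(0, n_classes, 300)                   # attack-type labels
X = rng.normal(scale=0.3, size=(300, 10)) + 3.0 * y[:, None]  # separable features

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                    random_state=0).fit(X, y)
train_acc = clf.score(X, y)
```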

  17. Use of information barriers to protect classified information

    International Nuclear Information System (INIS)

    MacArthur, D.; Johnson, M.W.; Nicholas, N.J.; Whiteson, R.

    1998-01-01

    This paper discusses the detailed requirements for an information barrier (IB) for use with verification systems that employ intrusive measurement technologies. The IB would protect classified information in a bilateral or multilateral inspection of classified fissile material. Such a barrier must strike a balance between providing the inspecting party the confidence necessary to accept the measurement and protecting the inspected party's classified information. The authors discuss the structure required of an IB as well as the implications of the IB for detector system maintenance. A defense-in-depth approach is proposed that would provide assurance to the inspected party that all sensitive information is protected and to the inspecting party that the measurements are being performed as expected. The barrier could include elements of physical protection (such as locks, surveillance systems, and tamper indicators), hardening of key hardware components, assurance of the capabilities and limitations of hardware and software systems, administrative controls, validation and verification of the systems, and error detection and resolution. Finally, an unclassified interface could be used to display and, possibly, record measurement results. The introduction of an IB into an analysis system may result in many otherwise innocuous components (detectors, analyzers, etc.) becoming classified and unavailable for routine maintenance by uncleared personnel. System maintenance and updating will be significantly simplified if the classification status of as many components as possible can be made reversible (i.e., a component can become unclassified following the removal of classified objects).

  18. Detection of microaneurysms in retinal images using an ensemble classifier

    Directory of Open Access Journals (Sweden)

    M.M. Habib

    2017-01-01

    Full Text Available This paper introduces, and reports on the performance of, a novel combination of algorithms for automated microaneurysm (MA) detection in retinal images. The presence of MAs in retinal images is a pathognomonic sign of Diabetic Retinopathy (DR), which is one of the leading causes of blindness amongst the working-age population. An extensive survey of the literature is presented and current techniques in the field are summarised. The proposed technique first detects an initial set of candidates using a Gaussian matched filter and then classifies this set to reduce the number of false positives. A Tree Ensemble classifier is used with a set of 70 features (the most common features in the literature). A new set of 32 MA ground-truth images (with a total of 256 labelled MAs), based on images from the MESSIDOR dataset, is introduced as a public dataset for benchmarking MA detection algorithms. We evaluate our algorithm on this dataset as well as on another public dataset (DIARETDB1 v2.1) and compare it against the best available alternative. Results show that the proposed classifier is superior in terms of eliminating false positive MA detections from the initial set of candidates. The proposed method achieves an ROC score of 0.415, compared to 0.2636 achieved by the best available technique. Furthermore, results show that the classifier model maintains consistent performance across datasets, illustrating the generalisability of the classifier and that overfitting does not occur.
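
The first stage, Gaussian matched filtering to flag small dark blob-like candidates, can be illustrated on a synthetic image; the kernel width and threshold below are assumed parameters, not those of the paper:

```python
# Illustrative candidate detection via Gaussian matched filtering on a
# synthetic stand-in for a fundus image (MAs appear as small dark blobs).
import numpy as np
from scipy.ndimage import gaussian_filter

img = np.full((64, 64), 0.8)      # bright, uniform background
img[30:33, 40:43] = 0.2           # a small dark blob, MA-like

# Correlating the inverted image with a Gaussian kernel (matched to the
# blob scale) is equivalent to Gaussian smoothing of 1 - img.
response = gaussian_filter(1.0 - img, sigma=1.5)
threshold = response.mean() + 3 * response.std()   # assumed rule
candidates = np.argwhere(response > threshold)
```

Each candidate pixel would then be described by the 70 features and passed to the ensemble classifier to suppress false positives.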

  19. Dynamic cluster generation for a fuzzy classifier with ellipsoidal regions.

    Science.gov (United States)

    Abe, S

    1998-01-01

    In this paper, we discuss a fuzzy classifier with ellipsoidal regions that dynamically generates clusters. First, for the data belonging to a class we define a fuzzy rule with an ellipsoidal region. Namely, using the training data for each class, we calculate the center and the covariance matrix of the ellipsoidal region for that class. Then we tune the fuzzy rules, i.e., the slopes of the membership functions, successively until there is no further improvement in the recognition rate on the training data. If the number of data belonging to a class that are misclassified into another class then exceeds a prescribed number, we define a new cluster to which those data belong, along with the associated fuzzy rule. We tune the newly defined fuzzy rules in a similar way as stated above, keeping the already obtained fuzzy rules fixed. We iterate the generation of clusters and the tuning of the newly generated fuzzy rules until the number of data belonging to a class that are misclassified into another class no longer exceeds the prescribed number. We evaluate our method using thyroid data, Japanese Hiragana data from vehicle license plates, and blood cell data. With dynamic cluster generation, the generalization ability of the classifier is improved, and when there are no discrete input variables the recognition rate of the fuzzy classifier on the test data is the best among the neural network classifiers and the other fuzzy classifiers compared.
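
The first step described above, one ellipsoidal region per class defined by the class mean and covariance matrix, amounts to classification by smallest Mahalanobis distance before any membership-slope tuning or cluster generation; a minimal sketch on toy data:

```python
# Minimal ellipsoidal-region classifier: per-class mean and covariance,
# prediction by smallest squared Mahalanobis distance. Tuning and
# dynamic cluster generation from the paper are omitted.
import numpy as np

def fit(X, y):
    rules = {}
    for c in np.unique(y):
        Xc = X[y == c]
        mean = Xc.mean(axis=0)
        precision = np.linalg.inv(np.cov(Xc, rowvar=False))
        rules[c] = (mean, precision)
    return rules

def predict(rules, x):
    def d2(c):                       # squared Mahalanobis distance to class c
        mu, prec = rules[c]
        diff = x - mu
        return diff @ prec @ diff
    return min(rules, key=d2)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 0.5, (30, 2)),
               rng.normal([3, 3], 0.5, (30, 2))])
y = np.repeat([0, 1], 30)
rules = fit(X, y)
label = predict(rules, np.array([2.8, 3.1]))
```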

  20. Mesozoic carbonate-siliciclastic platform to basin systems of a South Tethyan margin (Egypt, East Mediterranean)

    Science.gov (United States)

    Tassy, Aurélie; Crouzy, Emmanuel; Gorini, Christian; Rubino, Jean-Loup

    2015-04-01

    series (up to 3500 m) as a mixed combination of debris flows, internal preserved blocks, and/or compressively deformed distal allochthonous masses. The transported material proceeded from the dismantling of the Mesozoic mixed carbonate-siliciclastic platform and can spread downslope over areas as large as 70,000 km2. According to stratigraphic correlations with global sea-level positions, platform instability would have been triggered by the gravitational collapse of the carbonate-siliciclastic platform under its own weight after successive subaerial exposures, which were able to generate karstification processes. Seismic interpretation is constrained by a detailed assessment of the Egyptian margin paleogeography supported by wells. This margin segment is briefly compared to the outcropping Apulian margin in Italy.

  1. Influence of Different Implant Geometry in Clinical Longevity and Maintenance of Marginal Bone: A Systematic Review.

    Science.gov (United States)

    Lovatto, Sabrina Telles; Bassani, Rafaela; Sarkis-Onofre, Rafael; Dos Santos, Mateus Bertolini Fernandes

    2018-03-26

    To assess, through a systematic review, the influence of different implant geometries on clinical longevity and maintenance of marginal bone tissue. An electronic search was conducted in the MEDLINE, Scopus, and Web of Science databases, limited to studies written in English from 1996 to 2017, using specific search strategies. Only randomized controlled trials (RCTs) that compared dental implants and their geometries were included. Two reviewers independently selected studies, extracted data, and assessed the risk of bias of the included studies. From the 4006 references identified by the search, 24 were considered eligible for full-text analysis, after which 10 studies were included in this review. Similar marginal bone loss was observed for tapered and cylindrical geometries; however, implants with micro-threads in the neck presented slightly less marginal bone loss than implants with a straight or smooth neck. Success and survival rates were high, with cylindrical implants presenting higher success and survival rates than tapered ones. Implant geometry seems to have little influence on marginal bone loss (MBL) and on survival and success rates after 1 year of implant placement; however, the evidence in this systematic review was classified as very low due to limitations such as study design, sample size, and publication bias. Thus, more well-designed RCTs should be conducted to provide evidence regarding the influence of implant geometry on MBL and on survival and success rates after 1 year of implant placement. © 2018 by the American College of Prosthodontists.

  2. Seaward dipping reflectors along the SW continental margin of India: Evidence for volcanic passive margin

    Digital Repository Service at National Institute of Oceanography (India)

    Ajay, K.K.; Chaubey, A.K.; Krishna, K.S.; Rao, D.G.; Sar, D.

    Multi-channel seismic reflection profiles across the southwest continental margin of India (SWCMI) show presence of westerly dipping seismic reflectors beneath sedimentary strata along the western flank of the Laccadive Ridge-northernmost part...

  3. Conference Report: The New Discovery of Margins: Theory-Based Excursions in Marginal Social Fields

    Directory of Open Access Journals (Sweden)

    Babette Kirchner

    2014-05-01

    Full Text Available At this year's spring conference of the Sociology of Knowledge Section of the German Sociological Association, a diverse range of theoretical concepts and multiple empirical insights into different marginal social fields were presented. As in everyday life, drawing a line between center and margin can be seen as an important challenge that must equally be faced in sociology. The socially constructed borderline appears to be highly variable. Therefore it has to be delineated or fixed somehow. The construction of margins is necessary for society in general and smaller social groupings alike to confirm one's own "normal" identity, or one's own membership on the fringes. The different contributions exemplify what was established at the beginning of the conference: Namely that society and its margins are defined differently according to the empirical as well as conceptual focus. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1402148

  4. SpectraClassifier 1.0: a user friendly, automated MRS-based classifier-development system

    Directory of Open Access Journals (Sweden)

    Julià-Sapé Margarida

    2010-02-01

    Full Text Available Abstract Background SpectraClassifier (SC) is a Java solution for designing and implementing Magnetic Resonance Spectroscopy (MRS)-based classifiers. The main goal of SC is to allow users with minimal background knowledge of multivariate statistics to perform a fully automated pattern recognition analysis. SC incorporates feature selection (greedy stepwise approach, either forward or backward) and feature extraction (PCA). Fisher Linear Discriminant Analysis is the method of choice for classification. Classifier evaluation is performed through various methods: display of the confusion matrix of the training and testing datasets; K-fold cross-validation, leave-one-out and bootstrapping, as well as Receiver Operating Characteristic (ROC) curves. Results SC is composed of the following modules: Classifier design, Data exploration, Data visualisation, Classifier evaluation, Reports, and Classifier history. It is able to read low-resolution in-vivo MRS (single-voxel and multi-voxel) and high-resolution tissue MRS (HRMAS), processed with existing tools (jMRUI, INTERPRET, 3DiCSI or TopSpin). In addition, to facilitate exchanging data between applications, a standard format capable of storing all the information needed for a dataset was developed. Each functionality of SC has been specifically validated with real data for the purpose of bug-testing and methods validation. Data from the INTERPRET project was used. Conclusions SC is a user-friendly software package designed to fulfil the needs of potential users in the MRS community. It accepts all kinds of pre-processed MRS data types and classifies them semi-automatically, allowing spectroscopists to concentrate on interpretation of results with the use of its visualisation tools.

  5. Classifying galaxy spectra at 0.5 < z < 1 with self-organizing maps

    Science.gov (United States)

    Rahmani, S.; Teimoorinia, H.; Barmby, P.

    2018-05-01

    The spectrum of a galaxy contains information about its physical properties. Classifying spectra using templates helps elucidate the nature of a galaxy's energy sources. In this paper, we investigate the use of self-organizing maps in classifying galaxy spectra against templates. We trained semi-supervised self-organizing map networks using a set of templates covering the wavelength range from far ultraviolet to near infrared. The trained networks were used to classify the spectra of a sample of 142 galaxies with 0.5 < z < 1, and the classifications were compared with those from K-means clustering, a supervised neural network, and chi-squared minimization. Spectra corresponding to quiescent galaxies were more likely to be classified similarly by all methods, while starburst spectra showed more variability. Compared to classification using chi-squared minimization or the supervised neural network, the galaxies classed together by the self-organizing map had more similar spectra. The class ordering provided by the one-dimensional self-organizing maps corresponds to an ordering in physical properties, a potentially important feature for the exploration of large datasets.
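As an illustration of the general approach (not the authors' semi-supervised network), a minimal one-dimensional SOM can be trained on noisy template "spectra" and used to assign a spectrum to its best-matching node; the map size, learning schedule, and toy templates below are all assumptions.

```python
import numpy as np

def train_som(data, n_nodes=10, epochs=200, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 1-D self-organizing map with a Gaussian neighborhood."""
    rng = np.random.default_rng(seed)
    nodes = data[rng.choice(len(data), n_nodes)].copy()   # init nodes from samples
    coords = np.arange(n_nodes, dtype=float)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                       # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5           # shrinking neighborhood
        for x in data[rng.permutation(len(data))]:
            bmu = int(np.argmin(((nodes - x) ** 2).sum(axis=1)))
            h = np.exp(-((coords - bmu) ** 2) / (2 * sigma ** 2))
            nodes += lr * h[:, None] * (x - nodes)
    return nodes

# toy "spectra": three Gaussian emission-line templates plus noise
rng = np.random.default_rng(1)
grid = np.linspace(0, 1, 50)
templates = np.stack([np.exp(-((grid - c) ** 2) / 0.01) for c in (0.2, 0.5, 0.8)])
data = np.repeat(templates, 30, axis=0) + rng.normal(0, 0.05, (90, 50))

som = train_som(data)
# classify a spectrum by its best-matching node; the node index orders the
# classes along the one-dimensional map
bmu = int(np.argmin(((som - data[0]) ** 2).sum(axis=1)))
print(som.shape, bmu)
```

Because the nodes are ordered on a line, neighbouring node indices end up representing similar spectra, which is the "class ordering" property the abstract highlights.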

  6. High speed intelligent classifier of tomatoes by colour, size and weight

    Energy Technology Data Exchange (ETDEWEB)

    Cement, J.; Novas, N.; Gazquez, J. A.; Manzano-Agugliaro, F.

    2012-11-01

    At present most horticultural products are classified and marketed according to quality standards, which provide a common language for growers, packers, buyers and consumers. The standardisation of both product and packaging enables greater speed and efficiency in management and marketing. Of all the vegetables grown in greenhouses, tomatoes are predominant in both surface area and tonnage produced. This paper presents the development and evaluation of a low-investment tomato classification system with two objectives: to put it at the service of producing farms and to classify according to trading standards. An intelligent classifier of tomatoes by weight, diameter and colour has been developed. The system optimises the data-processing algorithms for tomatoes, so that productivity is greatly increased while using less expensive, lower-performance electronics. The prototype achieves a very high classification speed, 12.5 classifications per second, using accessible, low-cost commercial equipment. It reduces manual sorting time fourfold and is not sensitive to the variety of tomato classified. This system facilitates the processes of standardisation and quality control, increases the competitiveness of tomato farms and impacts positively on profitability. The automatic classification system described in this work represents a contribution from the economic point of view, as it is profitable for a farm in the short term (less than six months), while existing systems can only be used in large trading centres. (Author) 36 refs.

  7. Qualification of class 1e equipment: regulation, technological margins and test experience

    International Nuclear Information System (INIS)

    Pasco, Y.; Le Meur, M.; Henry, J.Y.; Droger, J.P.; Morange, E.; Roubault, J.

    1986-10-01

    French regulation requires licensees to qualify electrical equipment important to safety for service in nuclear power plants, to ensure that the equipment can perform its safety function under the full set of plausible operating conditions. The French regulatory texts entitled Fundamental Safety Rules classify safety-related electrical equipment into three main categories, K1, K2 and K3, according to location and operating conditions. The definition of a design basis accident test profile must account for margins applied to thermal-hydraulic code outputs. Specific safety margins were added to cover uncertainties in the representativity of qualification tests. Up to now, accident sequence studies have confirmed the validity of such a qualification test profile. On the other hand, the results from post-accident simulation tests have shown that it is useful not only to validate post-accident operating life but also to reveal failures initiated during previous tests. (Original in French)

  8. A History of Classified Activities at Oak Ridge National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Quist, A.S.

    2001-01-30

    The facilities that became Oak Ridge National Laboratory (ORNL) were created in 1943 during the United States' super-secret World War II project to construct an atomic bomb (the Manhattan Project). During World War II and for several years thereafter, essentially all ORNL activities were classified. Now, in 2000, essentially all ORNL activities are unclassified. The major purpose of this report is to provide a brief history of ORNL's major classified activities from 1943 until the present (September 2000). This report is expected to be useful to the ORNL Classification Officer and to ORNL's Authorized Derivative Classifiers and Authorized Derivative Declassifiers in their classification review of ORNL documents, especially those documents that date from the 1940s and 1950s.

  9. COMPARISON OF SVM AND FUZZY CLASSIFIER FOR AN INDIAN SCRIPT

    Directory of Open Access Journals (Sweden)

    M. J. Baheti

    2012-01-01

    Full Text Available With the advent of the technological era, conversion of scanned documents (handwritten or printed) into machine-editable format has attracted many researchers. This paper deals with the problem of recognition of Gujarati handwritten numerals. Gujarati numeral recognition requires performing some specific steps as a part of preprocessing. For preprocessing, digitization, segmentation, normalization and thinning are done, assuming that the image has almost no noise. An affine invariant moments based model is then used for feature extraction, and finally Support Vector Machine (SVM) and Fuzzy classifiers are used for numeral classification. The comparison of the SVM and Fuzzy classifiers shows that SVM produced better results than the Fuzzy classifier.

  10. Optimal threshold estimation for binary classifiers using game theory.

    Science.gov (United States)

    Sanchez, Ignacio Enrique

    2016-01-01

    Many bioinformatics algorithms can be understood as binary classifiers. They are usually compared using the area under the receiver operating characteristic (ROC) curve. On the other hand, choosing the best threshold for practical use is a complex task, due to uncertain and context-dependent skews in the abundance of positives in nature and in the yields/costs for correct/incorrect classification. We argue that considering a classifier as a player in a zero-sum game allows us to use the minimax principle from game theory to determine the optimal operating point. The proposed classifier threshold corresponds to the intersection between the ROC curve and the descending diagonal in ROC space and yields a minimax accuracy of 1-FPR. Our proposal can be readily implemented in practice, and reveals that the empirical condition for threshold estimation of "specificity equals sensitivity" maximizes robustness against uncertainties in the abundance of positives in nature and in classification costs.
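The proposed operating point, where the ROC curve crosses the descending diagonal (sensitivity equals specificity), can be estimated directly from classifier scores. A minimal sketch with synthetic Gaussian scores follows; the data and the brute-force search over empirical thresholds are assumptions, not the paper's code.

```python
import numpy as np

def minimax_threshold(pos_scores, neg_scores):
    """Empirical threshold where sensitivity ~= specificity, i.e. where the
    ROC curve crosses the descending diagonal (TPR = 1 - FPR)."""
    best_t, best_gap = None, np.inf
    for t in np.unique(np.concatenate([pos_scores, neg_scores])):
        sens = np.mean(pos_scores >= t)     # true positive rate
        spec = np.mean(neg_scores < t)      # 1 - false positive rate
        if abs(sens - spec) < best_gap:
            best_gap, best_t = abs(sens - spec), t
    return best_t

# synthetic scores: positives shifted above negatives
rng = np.random.default_rng(0)
pos = rng.normal(2.0, 1.0, 1000)
neg = rng.normal(0.0, 1.0, 1000)
t = minimax_threshold(pos, neg)
print(round(float(t), 2))  # near 1.0, where the two score distributions cross
```

Note that the chosen threshold depends only on the score distributions, not on an assumed class prevalence, which is exactly the robustness property the abstract argues for.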

  11. Statistical text classifier to detect specific type of medical incidents.

    Science.gov (United States)

    Wong, Zoie Shui-Yee; Akiyama, Masanori

    2013-01-01

    WHO Patient Safety has put focus on increasing the coherence and expressiveness of patient safety classification with the foundation of the International Classification for Patient Safety (ICPS). Text classification and statistical approaches have been shown to be successful in identifying safety problems in the aviation industry using incident text information. It has been challenging to comprehend the taxonomy of medical incidents in a structured manner. Independent reporting mechanisms for patient safety incidents have been established in the UK, Canada, Australia, Japan, Hong Kong, etc. This research demonstrates the potential to construct statistical text classifiers to detect specific types of medical incidents using incident text data. An illustrative example of classifying look-alike sound-alike (LASA) medication incidents using structured text from 227 advisories related to medication errors from Global Patient Safety Alerts (GPSA) is shown in this poster presentation. The classifier was built using a logistic regression model. The ROC curve and the AUC value indicated that this is a satisfactorily good model.
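A bag-of-words logistic regression classifier of the kind described can be sketched in a few lines. The toy incident snippets, vocabulary, and training loop below are illustrative assumptions, not GPSA advisories or the authors' model.

```python
import numpy as np

def bow(texts, vocab):
    """Bag-of-words count vectors over a fixed vocabulary."""
    return np.array([[t.lower().split().count(w) for w in vocab] for t in texts], float)

# tiny hypothetical incident snippets (illustrative only)
train = ["confused hydroxyzine with hydralazine", "similar sounding drug names mixed",
         "patient fall in ward", "wrong bed rail height caused fall"]
y = np.array([1.0, 1.0, 0.0, 0.0])          # 1 = look-alike/sound-alike (LASA) incident
vocab = sorted(set(" ".join(train).lower().split()))
X = bow(train, vocab)

# logistic regression fitted by plain gradient descent
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = prob - y
    w -= 0.1 * X.T @ grad / len(y)
    b -= 0.1 * grad.mean()

p_test = 1.0 / (1.0 + np.exp(-(bow(["drug names sounding similar"], vocab) @ w + b)))
print(round(float(p_test[0]), 2))           # probability the snippet is a LASA incident
```

Sweeping the decision threshold on such predicted probabilities is what produces the ROC curve and AUC the abstract reports.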

  12. Defending Malicious Script Attacks Using Machine Learning Classifiers

    Directory of Open Access Journals (Sweden)

    Nayeem Khan

    2017-01-01

    Full Text Available Web applications have become a primary target for cyber criminals, who inject malware, especially JavaScript, to perform malicious activities for impersonation. It is therefore imperative to detect such malicious code in real time, before any malicious activity is performed. This study proposes an efficient method of detecting previously unknown malicious JavaScript using an interceptor at the client side by classifying the key features of the malicious code. The feature subset was obtained using the wrapper method for dimensionality reduction. Supervised machine learning classifiers were used on the dataset to achieve high accuracy. Experimental results show that our method can efficiently classify malicious code from benign code with promising results.

  13. Decoding the Margins: What Can the Fractal Geometry of Basaltic Flow Margins Tell Us?

    Science.gov (United States)

    Schaefer, E. I.; Hamilton, C.; Neish, C.; Beard, S. P.; Bramson, A. M.; Sori, M.; Rader, E. L.

    2016-12-01

    Studying lava flows on other planetary bodies is essential to characterizing eruption styles and constraining the bodies' thermal evolution. Although planetary basaltic flows are common, many key features are not resolvable in orbital imagery. We are thus developing a technique to characterize basaltic flow type, sub-meter roughness, and sediment mantling from these data. We will present the results from upcoming fieldwork at Craters of the Moon National Monument and Preserve with FINESSE (August) and at Hawai'i Volcanoes National Park (September). We build on earlier work that showed that basaltic flow margins are approximately fractal [Bruno et al., 1992; Gaonac'h et al., 1992] and that their fractal dimensions (D) have distinct 'a'ā and pāhoehoe ranges under simple conditions [Bruno et al., 1994]. Using a differential GPS rover, we have recently shown that the margin of Iceland's 2014 Holuhraun flow exhibits near-perfect (R2=0.9998) fractality for ≥24 km across dm to km scales [Schaefer et al., 2016]. This finding suggests that a fractal-based technique has significant potential to characterize flows at sub-resolution scales. We are simultaneously seeking to understand how margin fractality can be modified. A preliminary result for an 'a'ā flow in Hawaii's Ka'ū Desert suggests that although aeolian mantling obscures the original flow margin, the apparent margin (i.e., sediment-lava interface) remains fractal [Schaefer et al., 2015]. Further, the apparent margin's D is likely significantly modified from that of the original margin. Other factors that we are exploring include erosion, transitional flow types, and topographic confinement. We will also rigorously test the intriguing possibility that margin D correlates with the sub-meter Hurst exponent H of the flow surface, a common metric of roughness scaling [e.g., Shepard et al., 2001]. This hypothesis is based on geometric arguments [Turcotte, 1997] and is qualitatively consistent with all results so far.
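The fractal dimension D discussed above is commonly estimated by box counting, using the scaling N(s) ∝ s^(-D) for the number of boxes of size s needed to cover the margin. The sketch below, with a synthetic straight-line "margin" as a sanity check, is an assumed generic implementation rather than the authors' method.

```python
import numpy as np

def box_count_dimension(points, scales):
    """Estimate fractal dimension D from N(s) ~ s^(-D) by box counting."""
    counts = []
    for s in scales:
        # count the distinct grid cells of size s occupied by the curve
        boxes = set(map(tuple, np.floor(points / s).astype(int)))
        counts.append(len(boxes))
    # D is the slope of log N versus log(1/s)
    D, _ = np.polyfit(np.log(1.0 / np.asarray(scales)), np.log(counts), 1)
    return D

# sanity check: a straight "margin" should give D close to 1
t = np.linspace(0.0, 1.0, 20000)
line = np.column_stack([t, t])
D = box_count_dimension(line, scales=[0.1, 0.05, 0.02, 0.01, 0.005])
print(round(D, 2))  # close to 1; rougher, more crenulated margins give larger D
```

Applied to digitized flow-margin coordinates (e.g., from a differential GPS traverse), the same fit would distinguish smoother pāhoehoe margins from rougher 'a'ā margins by their D ranges.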

  14. Conference Report: The New Discovery of Margins: Theory-Based Excursions in Marginal Social Fields

    OpenAIRE

    Kirchner, Babette; Lorenzen, Jule-Marie; Striffler, Christine

    2014-01-01

    At this year's spring conference of the Sociology of Knowledge Section of the German Sociological Association, a diverse range of theoretical concepts and multiple empirical insights into different marginal social fields were presented. As in everyday life, drawing a line between center and margin can be seen as an important challenge that must equally be faced in sociology. The socially constructed borderline appears to be highly variable. Therefore it has to be delineated or fixed somehow. ...

  15. The marginal band system in nymphalid butterfly wings.

    Science.gov (United States)

    Taira, Wataru; Kinjo, Seira; Otaki, Joji M

    2015-01-01

    Butterfly wing color patterns are highly complex and diverse, but they are believed to be derived from the nymphalid groundplan, which is composed of several color pattern systems. Among these pattern systems, the marginal band system, including marginal and submarginal bands, has rarely been studied. Here, we examined the color pattern diversity of the marginal band system among nymphalid butterflies. Marginal and submarginal bands are usually expressed as a pair of linear bands aligned with the wing margin. However, a submarginal band can be expressed as a broken band, an elongated oval, or a single dot. The marginal focus, usually a white dot at the middle of a wing compartment along the wing edge, corresponds to the pupal edge spot, one of the pupal cuticle spots that signify the locations of color pattern organizing centers. A marginal band can be expressed as a semicircle, an elongated oval, or a pair of eyespot-like structures, which suggest the organizing activity of the marginal focus. Physical damage at the pupal edge spot leads to distal dislocation of the submarginal band in Junonia almana and in Vanessa indica, suggesting that the marginal focus functions as an organizing center for the marginal band system. Taken together, we conclude that the marginal band system is developmentally equivalent to other symmetry systems. Additionally, the marginal band is likely a core element and the submarginal band a paracore element of the marginal band system, and both bands are primarily specified by the marginal focus organizing center.

  16. Detection of brain tumor margins using optical coherence tomography

    Science.gov (United States)

    Juarez-Chambi, Ronald M.; Kut, Carmen; Rico-Jimenez, Jesus; Campos-Delgado, Daniel U.; Quinones-Hinojosa, Alfredo; Li, Xingde; Jo, Javier

    2018-02-01

    In brain cancer surgery, it is critical to achieve extensive resection without compromising adjacent healthy, non-cancerous regions. Various technological advances have made major contributions in imaging, including intraoperative magnetic resonance imaging (MRI) and computed tomography (CT). However, these technologies have pros and cons in providing quantitative, real-time and three-dimensional (3D) continuous guidance in brain cancer detection. Optical Coherence Tomography (OCT) is a non-invasive, label-free, cost-effective technique capable of imaging tissue in three dimensions and in real time. The purpose of this study is to reliably and efficiently discriminate between non-cancer and cancer-infiltrated brain regions using OCT images. To this end, a mathematical model for quantitative evaluation known as the Blind End-Member and Abundances Extraction (BEAE) method is used. BEAE is a constrained optimization technique which extracts spatial information from volumetric OCT images. Using this method, we are able to discriminate between cancerous and non-cancerous tissues, with logistic regression as a classifier for automatic brain tumor margin detection. With this technique, we achieve excellent performance in an extensive cross-validation of the training dataset (sensitivity 92.91% and specificity 98.15%) and again on an independent, blinded validation dataset (sensitivity 92.91% and specificity 86.36%). In summary, BEAE is well-suited to differentiating brain tissue, which could support guiding the surgical process of tissue resection.

  17. Implications of physical symmetries in adaptive image classifiers

    DEFF Research Database (Denmark)

    Sams, Thomas; Hansen, Jonas Lundbek

    2000-01-01

    It is demonstrated that rotational invariance and reflection symmetry of image classifiers lead to a reduction in the number of free parameters in the classifier. When used in adaptive detectors, e.g. neural networks, this may be used to decrease the number of training samples necessary to learn a given classification task, or to improve generalization of the neural network. Notably, the symmetrization of the detector does not compromise the ability to distinguish objects that break the symmetry. (C) 2000 Elsevier Science Ltd. All rights reserved.

  18. Silicon nanowire arrays as learning chemical vapour classifiers

    International Nuclear Information System (INIS)

    Niskanen, A O; Colli, A; White, R; Li, H W; Spigone, E; Kivioja, J M

    2011-01-01

    Nanowire field-effect transistors are a promising class of devices for various sensing applications. Apart from detecting individual chemical or biological analytes, it is especially interesting to use multiple selective sensors to look at their collective response in order to perform classification into predetermined categories. We show that non-functionalised silicon nanowire arrays can be used to robustly classify different chemical vapours using simple statistical machine learning methods. We were able to distinguish between acetone, ethanol and water with 100% accuracy while methanol, ethanol and 2-propanol were classified with 96% accuracy in ambient conditions.

  19. Experimental validation of the van Herk margin formula for lung radiation therapy

    International Nuclear Information System (INIS)

    Ecclestone, Gillian; Heath, Emily; Bissonnette, Jean-Pierre

    2013-01-01

    defined by the ICRU; thus, suitable PTV margins were estimated. The penumbra widths calculated in lung tissue for each plan were found to be very similar to the 6.4 mm value assumed by the margin formula model. The plan conformity correction yielded inconsistent results, which were largely affected by image and dose grid resolution, while the trajectory-modified PTV plans yielded a dosimetric benefit over the standard internal target volume approach, with up to a 5% decrease in the V20 value. Conclusions: The margin formula proved robust against variations in tumor size and motion, treatment technique, plan conformity, and low tissue density. This was validated by maintaining coverage of all of the derived PTVs by the 95% dose level, as required by the formal definition of the PTV. However, the assumption of perfect plan conformity in the margin formula derivation yields conservative margin estimates. Future modifications to the margin formula will require a correction for plan conformity. Plan conformity can also be improved by using the proposed trajectory-modified PTV planning approach. This proves especially beneficial for tumors with a large anterior-posterior component of respiratory motion.
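The van Herk margin recipe being validated combines systematic errors (Sigma), random errors (sigma) and a Gaussian penumbra width sigma_p as M = 2.5*Sigma + 1.64*(sqrt(sigma^2 + sigma_p^2) - sigma_p), often quoted in the simplified form 2.5*Sigma + 0.7*sigma. A worked example follows; treating the study's 6.4 mm penumbra value as sigma_p is an assumption of this sketch.

```python
import math

def van_herk_margin(Sigma, sigma, sigma_p=6.4):
    """CTV-to-PTV margin (mm) for delivering 95% dose to the CTV in 90% of patients:
    M = 2.5*Sigma + 1.64*(sqrt(sigma**2 + sigma_p**2) - sigma_p).
    Sigma: quadratic sum of systematic errors (mm); sigma: random errors (mm);
    sigma_p: Gaussian penumbra width (the study's 6.4 mm value is assumed here)."""
    return 2.5 * Sigma + 1.64 * (math.sqrt(sigma**2 + sigma_p**2) - sigma_p)

# example: 2 mm systematic and 3 mm random error
print(round(van_herk_margin(2.0, 3.0), 1))  # → 6.1 mm
```

The broad penumbra assumed for lung makes the random-error term nearly vanish, so the margin is dominated by the 2.5*Sigma systematic term, consistent with the formula's reported robustness to motion and technique.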

  20. The stability margin of elongated plasmas

    International Nuclear Information System (INIS)

    Portone, Alfredo

    2005-01-01

    Passive stabilization is a key feature in tokamak design, since it indicates the efficiency of the metallic structures in 'opposing' plasma displacements. As far as plasma vertical displacement modes are concerned, their passive stabilization is usually characterized in terms of two main indices, namely the instability growth time and the stability margin. In this study, after recalling the governing equations, we extend the definition of the stability margin given in the literature (Lazarus E. et al 1990 Nucl. Fusion 30 111; Albanese R. et al 1990 IEEE Trans. Magn. 26; Kameari A. et al 1985 Nucl. Eng. Des./Fusion 365-73) for the rigid-body displacement model to the non-rigid plasma model. Numerical examples are also given for the reduced task objectives/reduced cost ITER design.

  1. Passive target tracking using marginalized particle filter

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A marginalized particle filtering (MPF) approach is proposed for target tracking under the background of passive measurement. Essentially, the MPF is a combination of the particle filtering technique and the Kalman filter. By making full use of marginalization, the distributions of the tractable linear part of the total state variables are updated analytically using the Kalman filter, and only the lower-dimensional nonlinear state variable needs to be dealt with using the particle filter. Simulation studies are performed on an illustrative example, and the results show that the MPF method leads to a significant reduction of the tracking errors when compared with the direct particle implementation. Real data test results also validate the effectiveness of the presented method.
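The structure described above, a particle filter over the nonlinear state with a per-particle Kalman filter for the linear substate, can be sketched on a toy model: a position p observed only through arctan(p), with a conditionally linear-Gaussian velocity v. The model, noise levels and particle count are assumptions for illustration, not the paper's passive-measurement scenario.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 50, 500
Qp, Qv, R = 0.01, 1e-4, 0.05     # process noise (position, velocity), measurement noise

# simulate truth: p evolves with a slowly drifting velocity v; only arctan(p) is measured
p_true, v_true, ys = [0.0], [0.0], []
for _ in range(T):
    v_true.append(v_true[-1] + rng.normal(0, np.sqrt(Qv)))
    p_true.append(p_true[-1] + v_true[-2] + rng.normal(0, np.sqrt(Qp)))
    ys.append(np.arctan(p_true[-1]) + rng.normal(0, np.sqrt(R)))

# MPF: particles carry the nonlinear state p; each carries a Kalman filter (m, P) for v
p = np.zeros(N)
m, P = np.zeros(N), np.full(N, 0.1)
for y in ys:
    p_new = p + m + rng.normal(0, np.sqrt(P + Qp))   # propagate p using the KF prior on v
    z = p_new - p                                    # particle increment = measurement of v
    K = P / (P + Qp)
    m += K * (z - m)                                 # KF measurement update of v
    P = (1 - K) * P + Qv                             # ... followed by the time update
    loglik = -0.5 * (y - np.arctan(p_new)) ** 2 / R  # weight particles by the observation
    w = np.exp(loglik - loglik.max())
    w /= w.sum()
    idx = rng.choice(N, N, p=w)                      # resample
    p, m, P = p_new[idx], m[idx], P[idx]

err = abs(p.mean() - p_true[-1])
print(round(err, 3))
```

Only the one-dimensional p is sampled; the velocity distribution is carried analytically, which is what reduces the variance relative to a direct particle filter over the full state.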

  2. Distributions with given marginals and statistical modelling

    CERN Document Server

    Fortiana, Josep; Rodriguez-Lallena, José

    2002-01-01

    This book contains a selection of the papers presented at the meeting `Distributions with given marginals and statistical modelling', held in Barcelona (Spain), July 17-20, 2000. In 24 chapters, this book covers topics such as the theory of copulas and quasi-copulas, the theory and compatibility of distributions, models for survival distributions and other well-known distributions, time series, categorical models, definition and estimation of measures of dependence, monotonicity and stochastic ordering, shape and separability of distributions, hidden truncation models, diagonal families, orthogonal expansions, tests of independence, and goodness of fit assessment. These topics share the use and properties of distributions with given marginals, this being the fourth specialised text on this theme. The innovative aspect of the book is the inclusion of statistical aspects such as modelling, Bayesian statistics, estimation, and tests.

  3. Marginal Loss Calculations for the DCOPF

    Energy Technology Data Exchange (ETDEWEB)

    Eldridge, Brent [Federal Energy Regulatory Commission, Washington, DC (United States); Johns Hopkins Univ., Baltimore, MD (United States); O'Neill, Richard P. [Federal Energy Regulatory Commission, Washington, DC (United States); Castillo, Andrea R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-12-05

    The purpose of this paper is to explain some aspects of including a marginal line loss approximation in the DCOPF. The DCOPF optimizes electric generator dispatch using simplified power flow physics. Since the standard assumptions in the DCOPF include a lossless network, a number of modifications have to be added to the model. Calculating marginal losses allows the DCOPF to optimize the location of power generation, so that generators that are closer to demand centers are relatively cheaper than remote generation. The problem formulations discussed in this paper will simplify many aspects of practical electric dispatch implementations in use today, but will include sufficient detail to demonstrate a few points with regard to the handling of losses.

  4. The rifted margin of Saudi Arabia

    Science.gov (United States)

    McClain, J. S.; Orcutt, J. A.

    The structure of rifted continental margins has always been of great scientific interest, and now, with dwindling economic oil deposits, these complex geological features assume practical importance as well. The ocean-continent transition is, by definition, laterally heterogeneous and likely to be extremely complicated. The southernmost shotpoints (4, 5, and 6) in the U.S. Geological Survey seismic refraction profile in the Kingdom of Saudi Arabia lie within a transition region and thus provide a testing ground for methods that treat wave propagation in laterally heterogeneous media. This portion of the profile runs from the Farasan Islands in the Red Sea across the coast line and the Hijaz-Asir escarpment into the Hijaz-Asir tectonic province. Because the southernmost shotpoint is within the margin of the Saudi sub-continent, the full transition region is not sampled. Furthermore, such an experiment is precluded by the narrowness of the purely oceanic portion of the Red Sea.

  5. Marginal longitudinal semiparametric regression via penalized splines

    KAUST Repository

    Al Kadiri, M.

    2010-08-01

    We study the marginal longitudinal nonparametric regression problem and some of its semiparametric extensions. We point out that, while several elaborate proposals for efficient estimation have been proposed, a relatively simple and straightforward one, based on penalized splines, has not. After describing our approach, we then explain how Gibbs sampling and the BUGS software can be used to achieve quick and effective implementation. Illustrations are provided for nonparametric regression and additive models.
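A minimal penalized-spline fit of the kind the authors build on (a truncated-line basis with a ridge penalty on the knot coefficients) can be sketched as follows; it is solved here by direct penalized least squares rather than the Gibbs sampling/BUGS implementation the paper describes, and the basis, knot count and penalty value are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 200)

# design matrix: intercept, linear term, and truncated-line basis (x - kappa)_+
knots = np.linspace(0, 1, 22)[1:-1]                    # 20 interior knots
Z = np.maximum(x[:, None] - knots[None, :], 0.0)
X = np.column_stack([np.ones_like(x), x, Z])

# ridge penalty on the knot coefficients only; the polynomial part is unpenalized
lam = 1e-2
D = np.diag([0.0, 0.0] + [1.0] * len(knots))
beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)

fit = X @ beta
rmse = np.sqrt(np.mean((fit - np.sin(2 * np.pi * x)) ** 2))
print(round(float(rmse), 2))   # error against the true curve, well below the noise level
```

In the longitudinal setting, the same penalized fit is embedded in a mixed-model formulation, which is what makes the Bayesian/Gibbs implementation natural.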

  6. Origins of saline fluids at convergent margins

    Science.gov (United States)

    Martin, Jonathan B.; Kastner, Miriam; Egeberg, Per Kr.

    The compositions of pore and venting fluids at convergent margins differ from seawater values, reflecting mixing and diagenesis. Most significantly, the concentration of Cl-, assumed to be a conservative ion, differs from its seawater value. Chloride concentrations could be elevated by four processes, although two, the formation of gas hydrate and ion filtration by clay membranes, are insignificant in forming saline fluids at convergent margins. During the formation of gas hydrate, the resulting Cl--rich fluids, estimated to contain an average excess of ~140 mM Cl- over the seawater value, probably would be flushed from the sediment when the pore fluids vent to seawater. Ion filtration by clay membranes requires compaction pressures typical of >2 km burial depths. Even at these depths, the efficiency of ion filtration will be negligible because (1) fluids will flow through fractures, thereby bypassing clay membranes, (2) concentrations of clay minerals are diluted by other phases, and (3) during burial, smectite converts to illite, which has little capacity for ion filtration. A third process, mixing with subaerially evaporated seawater, elevates Cl- concentrations to 1043 mM in forearc basins along the Peru margin. Evaporation of seawater, however, will be important only in limited geographic regions that are characterized by enclosed basins, arid climates, and permeable sediments. At the New Hebrides and Izu-Bonin margins, Cl- concentrations are elevated to a maximum of 1241 mM. The process responsible for this increase is the alteration of volcanic ash to hydrous clay and zeolite minerals. Mass balance calculations, based on the decrease in δ18O values to -9.5‰ (SMOW), suggest that the Cl- concentrations could increase solely from the formation of smectite in a closed system. The diagenesis of volcanic ash also alters the concentrations of most dissolved species in addition to Cl-. Depending on the volume of this altered fluid, it could influence seawater

  7. Marginal longitudinal semiparametric regression via penalized splines

    KAUST Repository

    Al Kadiri, M.; Carroll, R.J.; Wand, M.P.

    2010-01-01

    We study the marginal longitudinal nonparametric regression problem and some of its semiparametric extensions. We point out that, while several elaborate proposals for efficient estimation have been proposed, a relative simple and straightforward one, based on penalized splines, has not. After describing our approach, we then explain how Gibbs sampling and the BUGS software can be used to achieve quick and effective implementation. Illustrations are provided for nonparametric regression and additive models.
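The penalized-spline approach the abstract describes can be illustrated with a minimal sketch: a truncated-line basis with a ridge penalty on the knot coefficients, fitted by solving the penalized normal equations. The authors' implementation uses Gibbs sampling via BUGS; the basis choice, knot placement, and penalty weight below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def penalized_spline_fit(x, y, knots, lam):
    """Fit y ~ f(x) with a truncated-line spline basis and a ridge
    penalty on the knot coefficients (a minimal frequentist sketch,
    not the authors' Gibbs/BUGS implementation)."""
    # Design matrix: intercept, linear term, and truncated lines (x - k)_+
    X = np.column_stack([np.ones_like(x), x] +
                        [np.maximum(x - k, 0.0) for k in knots])
    # Penalize only the knot coefficients, not the polynomial part.
    D = np.diag([0.0, 0.0] + [1.0] * len(knots))
    beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
    return beta, X

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)
knots = np.linspace(0.05, 0.95, 15)
beta, X = penalized_spline_fit(x, y, knots, lam=0.1)
fitted = X @ beta
```

The penalty weight `lam` plays the role of the smoothing parameter; in the Bayesian formulation it corresponds to a ratio of variance components and is estimated rather than fixed.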

  8. PENDIDIKAN ALTERNATIF UNTUK PEREMPUAN MARGINAL DI PEDESAAN

    Directory of Open Access Journals (Sweden)

    Ratnawati Tahir

    2011-11-01

Full Text Available Abstract: Alternative Education for Marginalized Women in Rural Areas. The study aims to identify forms of alternative education for marginalized women, the process of forming study groups, and a gender-based learning process that serves as a center for the development of education, leadership, and economic empowerment. The study uses qualitative methods, following one group of women who had attended an alternative education program; informants included community leaders such as the village head, neighborhood heads, and housewives. The results showed that the form of alternative education is the adult education, or andragogy, method. Study groups consisted of basic literacy and functional literacy. The learning process begins with shared learning, reflection on life experience, and the role-play method. As a result, 65% of participants improved their ability to read, write, and count, as well as their understanding of women's issues, and gained confidence in household and community decision making.

  9. Atlantic continental margin of the United States

    Science.gov (United States)

    Grow, John A.; Sheridan, Robert E.; Palmer, A.R.

    1982-01-01

The objective of this Decade of North American Geology (D-NAG) volume will be to focus on the Mesozoic and Cenozoic evolution of the U.S. Atlantic continental margin, including the onshore coastal plain, related onshore Triassic-Jurassic rift grabens, and the offshore basins and platforms. Following multiple compressional tectonic episodes between Africa and North America during the Paleozoic Era that formed the Appalachian Mountains, the Mesozoic and Cenozoic Eras were dominated by tensional tectonic processes that separated Africa and North America. Extensional rifting during Triassic and Early Jurassic times resulted in numerous tensional grabens both onshore and offshore, which filled with nonmarine continental red beds, lacustrine deposits, and volcanic flows and debris. The final stage of this breakup between Africa and North America occurred beneath the present outer continental shelf and continental slope during Early or Middle Jurassic time when sea-floor spreading began to form new oceanic crust and lithosphere between the two continents as they drifted apart. Postrift subsidence of the marginal basins continued in response to cooling of the lithosphere and sedimentary loading. Geophysical surveys and oil-exploration drilling along the U.S. Atlantic continental margin during the past 5 years are beginning to answer many questions concerning its deep structure and stratigraphy and how it evolved during the rifting and early sea-floor-spreading stages of the separation of this region from Africa. Earlier geophysical studies of the U.S. continental margin used marine refraction and submarine gravity measurements. Single-channel seismic-reflection, marine magnetic, aeromagnetic, and continuous gravity measurements became available during the 1960s.

  10. Marginal microfiltration in amalgam restorations. Review

    OpenAIRE

    Lahoud Salem, Víctor

    2014-01-01

This article reviews the literature on the phenomenon of microleakage (microfiltration) in amalgam restorations and its consequences: color change at the tooth-restoration interface, marginal deterioration, postoperative dentinal sensitivity, secondary caries, and pulp inflammation. It also describes the mechanisms used to reduce microleakage and their effects, namely the use of dentinal sealers such as cavity varnishes and adhesive systems. The conclusions indicate that amalgam is the ma...

  11. Work culture and migrant women's welfare marginalization

    OpenAIRE

    Psimmenos, Iordanis

    2007-01-01

Central to this paper is the relationship between work and the welfare marginalization of migrant women domestic workers. Based upon the findings of a recent (2005-2007) research study on Albanian and Ukrainian domestic workers' access to social insurance, medical care, and children's care (i.e. nurseries, kindergartens), the paper claims that welfare barriers are constituted around lack of resources, discrimination, as well as conditions and values at work. At the highest level of generality, paid domestic ...

  12. PREDICTIVE METHODS FOR STABILITY MARGIN IN BWR

    OpenAIRE

    MELARA SAN ROMÁN, JOSÉ

    2016-01-01

    [EN] Power and flow oscillations in a BWR are very undesirable. One of the major concerns is to ensure, during power oscillations, compliance with GDC 10 and 12. GDC 10 requires that the reactor core be designed with appropriate margin to assure that specified acceptable fuel design limits will not be exceeded during any condition of normal operation, including the effects of anticipated operational occurrences. GDC 12 requires assurance that power oscillations which can result in conditions ...

  13. The influence of tectonic and volcanic processes on the morphology of the Iberian continental margins; Influencia de los procesos tectonicos y volcanicos en la morfologia de los margenes continentales ibericos

    Energy Technology Data Exchange (ETDEWEB)

    Maestro, A.; Bohoyo, F.; Lopez-Martinez, J.; Acosta, J.; Gomez-Ballesteros, M.; Llaave, E.; Munoz, A.; Terrinha, P. G.; Dominguez, M.; Fernandez-Saez, F.

    2015-07-01

The Iberian continental margins are mainly passive margins. Nevertheless, the northern sector of the margin was active during some stages of its geological evolution. The southern sector is considered a transform margin, which defines the boundary between the Iberian and African plates. This margin was also an active margin in the past. The different types, origins and intensities of the endogenic processes that have affected the Iberian continental margins have led to the development of various tectonic and volcanic morphologies. The North Atlantic rifting allowed the development of large marginal platforms in the Cantabrian and Galician margins during the North Atlantic Ocean spreading. The reactivation of Variscan faults during the Mesozoic and Cenozoic controlled the strike of some of the largest canyons in the Iberian margins. The Gulf of Cadiz margin is characterized by the development of morphologies related to salt tectonics, fluid seepage, thrust fronts and strike-slip fault lineaments hundreds of kilometres long. The Alboran basin and the Betic margin show morphologies connected with the Miocene rift phase, which generated volcanic edifices and various structural reliefs, and with the subsequent compressive phase, when folds, strike-slip and reverse faults, diapirs and mud volcanoes developed. Finally, the Catalan-Valencian margin and the Balearic promontory are characterized by the presence of horst and graben structures related to the development of the Valencia trough during the Paleogene. The morphological features of endogenic origin have largely controlled the location and extent of the sedimentary processes and morphological products along the Iberian margins. (Author)

  14. On marginally resolved objects in optical interferometry

    Science.gov (United States)

    Lachaume, R.

    2003-03-01

With the present and forthcoming breakthroughs of optical interferometry, countless objects shall be within reach of interferometers; yet, most of them are expected to remain only marginally resolved with hectometric baselines. In this paper, we tackle the problem of deriving the properties of a marginally resolved object from its optical visibilities. We show that they depend on the moments of the flux distribution of the object: centre, mean angular size, asymmetry, and kurtosis. We also point out that the visibility amplitude is a second-order phenomenon, whereas the phase is a combination of a first-order term, giving the location of the photocentre, and a third-order term, more difficult to detect than the visibility amplitude, giving an asymmetry coefficient of the object. We then demonstrate that optical visibilities are not a good model constraint while the object stays marginally resolved, unless observations are carried out at different wavelengths. Finally, we show an application of this formalism to circumstellar discs.
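The dependence on the moments of the flux distribution can be sketched with the standard low-frequency expansion of the complex visibility (the notation below is assumed, not taken from the paper): for a one-dimensional flux distribution f(x) with total flux F and raw moments μ_k = (1/F)∫f(x)x^k dx,

```latex
V(u) \;=\; \frac{1}{F}\int f(x)\, e^{-2\pi i u x}\,\mathrm{d}x
\;\approx\; 1 \;-\; 2\pi i\,\mu_1 u \;-\; 2\pi^2 \mu_2\, u^2 \;+\; \tfrac{4}{3}\pi^3 i\,\mu_3\, u^3 .
```

To leading orders this gives |V(u)| ≈ 1 − 2π²σ²u² with σ² = μ₂ − μ₁² the central second moment (the mean angular size squared), a second-order effect, while the phase combines a first-order photocentre term −2πμ₁u with a third-order asymmetry term, matching the orders quoted in the abstract.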

  15. Ocean Margins Programs, Phase I research summaries

    Energy Technology Data Exchange (ETDEWEB)

Verity, P. [ed.]

    1994-08-01

    During FY 1992, the DOE restructured its regional coastal-ocean programs into a new Ocean Margins Program (OMP), to: Quantify the ecological and biogeochemical processes and mechanisms that affect the cycling, flux, and storage of carbon and other biogenic elements at the land/ocean interface; Define ocean-margin sources and sinks in global biogeochemical cycles, and; Determine whether continental shelves are quantitatively significant in removing carbon dioxide from the atmosphere and isolating it via burial in sediments or export to the interior ocean. Currently, the DOE Ocean Margins Program supports more than 70 principal and co-principal investigators, spanning more than 30 academic institutions. Research funded by the OMP amounted to about $6.9M in FY 1994. This document is a collection of abstracts summarizing the component projects of Phase I of the OMP. This phase included both research and technology development, and comprised projects of both two and three years duration. The attached abstracts describe the goals, methods, measurement scales, strengths and limitations, and status of each project, and level of support. Keywords are provided to index the various projects. The names, addresses, affiliations, and major areas of expertise of the investigators are provided in appendices.

  16. Pricing district heating by marginal cost

    International Nuclear Information System (INIS)

    Difs, Kristina; Trygg, Louise

    2009-01-01

A vital measure for industries when redirecting the energy systems towards sustainability is conversion from electricity to district heating (DH). This conversion can be achieved, for example, by replacing electrical heating with DH and compression cooling with heat-driven absorption cooling. Conversion to DH must, however, always be an economically attractive choice for an industry. In this paper the effects for industries and the local DH supplier are analysed when pricing DH by marginal cost in combination with industrial energy efficiency measures. Energy audits have shown that the analysed industries can reduce their annual electricity use by 30% and increase the use of DH by 56%. When marginal costs are applied as DH tariffs and the industrial energy efficiency measures are implemented, the industrial energy costs can be reduced by 17%. When implementing the industrial energy efficiency measures and also considering a utility investment in the local energy system, the local DH supplier has a potential to reduce the total energy system cost by 1.6 million EUR. Global carbon dioxide emissions can be reduced by 25,000 tonnes if the industrial energy efficiency measures are implemented and when coal-condensing power is assumed to be the marginal electricity source.

  17. Marginalized Student Access to Technology Education

    Science.gov (United States)

    Kurtcu, Wanda M.

The purpose of this paper is to investigate how a teacher can disrupt an established curriculum that continues the cycle of inequity of access to science, technology, engineering, and math (STEM) curriculum by students in alternative education. For this paper, I will focus on the technology components of the STEM curriculum. Technology in the United States, if not the world economy, is developing at a rapid pace. Many areas of day-to-day living, from applying for a job to checking one's bank account online, involve a component of science and technology. The 'gap' in technology education between the 'haves and have-nots' is delineated along socio-economic lines. Marginalized students in alternative education programs use this equipment for little else than remedial programs and credit recovery. This level of inequity further widens in alternative education programs and affects the achievement of marginalized students, who are placed in credit recovery or alternative education classes instead of participating in technology classes. For the purposes of this paper I focus on how I can decrease the inequity of student access to 21st-century technology education in an alternative education program by addressing the established curriculum of the program and modifying structural barriers to marginalized student access to a technology-focused curriculum.

  18. Evans Syndrome Presented with Marginal Zone Lymphoma and Duodenal Neuroendocrine Tumor in an Elderly Woman

    Directory of Open Access Journals (Sweden)

    Daniele D'Ambrosio

    2016-12-01

Full Text Available Evans syndrome (ES) is an autoimmune disorder characterized by simultaneous or sequential development of autoimmune hemolytic anemia, immune thrombocytopenia, and/or neutropenia. ES can be classified as a primary (idiopathic) or secondary (associated with an underlying disease) syndrome. We report a case of ES in an elderly patient in the presence of multiple trigger factors such as a recent influenza vaccine, marginal zone lymphoma, and neuroendocrine tumor G1. Whether this association is casual or causal remains a matter of speculation. It is however necessary to have a thorough work-up in a newly diagnosed ES and a more accurate search for miscellaneous factors, especially in elderly patients.

  19. Impact of Millimeter-Level Margins on Peripheral Normal Brain Sparing for Gamma Knife Radiosurgery

    International Nuclear Information System (INIS)

    Ma, Lijun; Sahgal, Arjun; Larson, David A.; Pinnaduwage, Dilini; Fogh, Shannon; Barani, Igor; Nakamura, Jean; McDermott, Michael; Sneed, Penny

    2014-01-01

Purpose: To investigate how millimeter-level margins beyond the gross tumor volume (GTV) impact peripheral normal brain tissue sparing for Gamma Knife radiosurgery. Methods and Materials: A mathematical formula was derived to predict the peripheral isodose volume, such as the 12-Gy isodose volume, with increasing margins by millimeters. The empirical parameters of the formula were derived from a cohort of brain tumor and surgical tumor resection cavity cases (n=15) treated with the Gamma Knife Perfexion. This was done by first adding margins from 0.5 to 3.0 mm to each individual target and then creating for each expanded target a series of treatment plans of nearly identical quality as the original plan. Finally, the formula was integrated with a published logistic regression model to estimate the treatment-induced complication rate for stereotactic radiosurgery when millimeter-level margins are added. Results: Confirmatory correlation between the nominal target radius (i.e., R_T) and commonly used maximum target size was found for the studied cases, except for a few outliers. The peripheral isodose volume such as the 12-Gy volume was found to increase exponentially with increasing Δ/R_T, where Δ is the margin size. Such a curve fitted the data (logarithmic regression, R² > 0.99), and the 12-Gy isodose volume was shown to increase steeply with a 0.5- to 3.0-mm margin applied to a target. For example, a 2-mm margin on average resulted in an increase of 55% ± 16% in the 12-Gy volume; this corresponded to an increase in the symptomatic necrosis rate of 6% to 25%, depending on the Δ/R_T values for the target. Conclusions: Millimeter-level margins beyond the GTV significantly impact peripheral normal brain sparing and should be applied with caution. Our model provides a rapid estimate of such an effect, particularly for large and/or irregularly shaped targets.
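The reported exponential growth of the peripheral isodose volume with Δ/R_T can be sketched numerically. The functional form follows the abstract, but the rate constant below is an illustrative assumption tuned to reproduce the reported ~55% average increase for a 2-mm margin on a nominally 10-mm-radius target; it is not a parameter from the paper.

```python
import math

def isodose_volume_ratio(margin_mm, r_t_mm, k=2.2):
    """Ratio of the 12-Gy isodose volume with an added margin to the
    no-margin volume, using the exponential dependence on margin/R_T
    described in the abstract. The rate constant k is illustrative,
    not taken from the paper."""
    return math.exp(k * margin_mm / r_t_mm)

# A 2-mm margin on a hypothetical target of nominal radius 10 mm:
ratio = isodose_volume_ratio(2.0, 10.0)
increase_pct = (ratio - 1.0) * 100  # roughly the reported ~55% increase
```

Because the growth is exponential in Δ/R_T, the same millimeter margin inflates the 12-Gy volume proportionally more for small targets (small R_T) than for large ones, which is why the abstract urges caution.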

  20. An Investigation of Rotorcraft Stability-Phase Margin Requirements in Hover

    Science.gov (United States)

    Blanken, Chris L.; Lusardi, Jeff A.; Ivler, Christina M.; Tischler, Mark B.; Hoefinger, Marc T.; Decker, William A.; Malpica, Carlos A.; Berger, Tom; Tucker, George E.

    2009-01-01

A cooperative study was performed to investigate the handling quality effects from reduced flight control system stability margins, and the trade-offs with higher disturbance rejection bandwidth (DRB). The piloted simulation study, performed on the NASA-Ames Vertical Motion Simulator, included three classes of rotorcraft in four configurations: a utility-class helicopter; a medium-lift helicopter evaluated with and without an external slung load; and a large (heavy-lift) civil tiltrotor aircraft. This large aircraft also allowed an initial assessment of ADS-33 handling quality requirements for an aircraft of this size. Ten experimental test pilots representing the U.S. Army, Marine Corps, NASA, rotorcraft industry, and the German Aerospace Center (DLR) evaluated the four aircraft configurations, for a range of flight control stability margins and turbulence levels, while primarily performing the ADS-33 Hover and Lateral Reposition MTEs. Pilot comments and aircraft-task performance data were analyzed. The preliminary stability margin results suggest that higher-DRB, lower-phase-margin cases are preferred as the aircraft increases in size. Extra care will need to be taken to assess the influence of variability when nominal flight control gains start with reduced margins. Phase margins as low as 20-23 degrees resulted in low disturbance-response damping ratios, objectionable oscillations, PIO tendencies, and a perception of an incipient handling qualities cliff. Pilot comments on the disturbance response of the aircraft correlated well with the DRB guidelines provided in the ADS-33 Test Guide. The ADS-33 mid-term response-to-control damping ratio metrics can be measured and applied to the disturbance-response damping ratio. An initial assessment of LCTR yaw bandwidth shows the current Level 1 boundary needs to be relaxed to help account for a large pilot offset from the c.g. Future efforts should continue to investigate the applicability/refinement of the current ADS-33

  1. Perforated marginal ulcers after laparoscopic gastric bypass.

    Science.gov (United States)

    Felix, Edward L; Kettelle, John; Mobley, Elijah; Swartz, Daniel

    2008-10-01

    Perforated marginal ulcer (PMU) after laparoscopic Roux-en-Y gastric bypass (LRYGB) is a serious complication, but its incidence and etiology have rarely been investigated. Therefore, a retrospective review of all patients undergoing LRYGB at the authors' center was conducted to determine the incidence of PMU and whether any causative factors were present. A prospectively kept database of all patients at the authors' bariatric center was retrospectively reviewed. The complete records of patients with a PMU were examined individually for accuracy and analyzed for treatment, outcome, and possible underlying causes of the marginal perforation. Between April 1999 and August 2007, 1% of the patients (35/3,430) undergoing laparoscopic gastric bypass experienced one or more perforated marginal ulcers 3 to 70 months (median, 18 months) after LRYGB. The patients with and without perforation were not significantly different in terms of mean age (37 vs 41 years), weight (286 vs 287 lb), body mass index (BMI) (46 vs 47), or female gender (89% vs 83%). Of the patients with perforations, 2 (6%) were taking steroids, 10 (29%) were receiving nonsteroidal antiinflammatory drugs (NSAIDs) at the time of the perforation, 18 (51%) were actively smoking, and 6 of the smokers also were taking NSAIDs. Eleven of the patients (31%) who perforated did not have at least one of these possible risk factors, but 4 (36%) of the 11 patients in this group had been treated after bypass for a marginal ulcer. Only 7 (20%) of the 35 patients who had laparoscopic bypass, or 7 (0.2%) in the entire group of 3,430 patients, perforated without any warning. There were no deaths, but three patients reperforated. The incidence of a marginal ulcer perforating after LRYGB was significant (>1%) and appeared to be related to smoking or the use of NSAIDs or steroids. Because only 0.2% of all patients acutely perforated without some risk factor or warning, long-term ulcer prophylaxis or treatment may be necessary

  2. 18 CFR 367.18 - Criteria for classifying leases.

    Science.gov (United States)

    2010-04-01

    ... the lessee) must not give rise to a new classification of a lease for accounting purposes. ... classifying the lease. (4) The present value at the beginning of the lease term of the minimum lease payments... taxes to be paid by the lessor, including any related profit, equals or exceeds 90 percent of the excess...

  3. Discrimination-Aware Classifiers for Student Performance Prediction

    Science.gov (United States)

    Luo, Ling; Koprinska, Irena; Liu, Wei

    2015-01-01

    In this paper we consider discrimination-aware classification of educational data. Mining and using rules that distinguish groups of students based on sensitive attributes such as gender and nationality may lead to discrimination. It is desirable to keep the sensitive attributes during the training of a classifier to avoid information loss but…

  4. 29 CFR 1910.307 - Hazardous (classified) locations.

    Science.gov (United States)

    2010-07-01

    ... equipment at the location. (c) Electrical installations. Equipment, wiring methods, and installations of... covers the requirements for electric equipment and wiring in locations that are classified depending on... provisions of this section. (4) Division and zone classification. In Class I locations, an installation must...

  5. 29 CFR 1926.407 - Hazardous (classified) locations.

    Science.gov (United States)

    2010-07-01

    ...) locations, unless modified by provisions of this section. (b) Electrical installations. Equipment, wiring..., DEPARTMENT OF LABOR (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR CONSTRUCTION Electrical Installation Safety... electric equipment and wiring in locations which are classified depending on the properties of the...

  6. 18 CFR 3a.71 - Accountability for classified material.

    Science.gov (United States)

    2010-04-01

    ... numbers assigned to top secret material will be separate from the sequence for other classified material... central control registry in calendar year 1969. TS 1006—Sixth Top Secret document controlled by the... control registry when the document is transferred. (e) For Top Secret documents only, an access register...

  7. Classifier fusion for VoIP attacks classification

    Science.gov (United States)

    Safarik, Jakub; Rezac, Filip

    2017-05-01

SIP is one of the most successful protocols in the field of IP telephony communication. It establishes and manages VoIP calls. As the number of SIP implementations rises, we can expect a higher number of attacks on the communication system in the near future. This work aims at malicious SIP traffic classification. A number of various machine learning algorithms have been developed for attack classification. The paper presents a comparison of current research and the use of the classifier fusion method, leading to a potential decrease in classification error rate. Use of classifier combination makes for a more robust solution without the difficulties that may affect single algorithms. Different voting schemes, combination rules, and classifiers are discussed to improve the overall performance. All classifiers have been trained on real malicious traffic. The concept of traffic monitoring depends on a network of honeypot nodes. These honeypots run in several networks spread across different locations. Separation of the honeypots allows us to gain independent and trustworthy attack information.
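One of the combination rules discussed, plurality (majority) voting, can be sketched in a few lines; the attack labels below are hypothetical and stand in for whatever classes the base classifiers emit.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier labels for one sample by plurality vote.
    `predictions` holds one label from each base classifier; ties are
    broken by the order in which labels first appear."""
    return Counter(predictions).most_common(1)[0][0]

def fuse(all_predictions):
    """Fuse predictions for many samples; each row holds the labels
    assigned to one sample by every base classifier."""
    return [majority_vote(row) for row in all_predictions]

# Hypothetical labels from three base classifiers over three samples:
votes = [
    ["flood", "flood", "scan"],   # 2 of 3 classifiers say "flood"
    ["scan", "scan", "scan"],
    ["spit", "flood", "spit"],
]
fused = fuse(votes)
```

Weighted voting or rule-based combination (sum, product, max of class posteriors) follows the same shape, replacing the `Counter` tally with a weighted score per class.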

  8. Bayesian Classifier for Medical Data from Doppler Unit

    Directory of Open Access Journals (Sweden)

    J. Málek

    2006-01-01

Full Text Available Nowadays, hand-held ultrasonic Doppler units (probes) are often used for noninvasive screening of atherosclerosis in the arteries of the lower limbs. The mean velocity of blood flow in time and blood pressures are measured at several positions on each lower limb. By listening to the acoustic signal generated by the device or by reading the signal displayed on screen, a specialist can detect peripheral arterial disease (PAD). This project aims to design software that will be able to analyze data from such a device and classify it into several diagnostic classes. At the Department of Functional Diagnostics at the Regional Hospital in Liberec, a database of several hundred signals was collected. In cooperation with the specialist, the signals were manually classified into four classes. For each class, selected signal features were extracted and then used for training a Bayesian classifier. Another set of signals was used for evaluating and optimizing the parameters of the classifier. A success rate of slightly above 84% correctly recognized diagnostic states was recently achieved on the test data.
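The classification step can be sketched with a minimal Gaussian naive Bayes classifier over continuous signal features. This is a generic sketch of the technique: the feature names and toy values below are hypothetical, and the project's actual features, classes, and classifier parameters are not reproduced here.

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian naive Bayes for continuous features."""

    def fit(self, X, y):
        groups = defaultdict(list)
        for features, label in zip(X, y):
            groups[label].append(features)
        self.stats, self.priors = {}, {}
        for label, rows in groups.items():
            cols = list(zip(*rows))
            self.stats[label] = []
            for c in cols:
                mu = sum(c) / len(c)
                var = max(sum((v - mu) ** 2 for v in c) / len(c), 1e-9)
                self.stats[label].append((mu, var))
            self.priors[label] = len(rows) / len(X)
        return self

    def predict(self, x):
        def log_post(label):
            lp = math.log(self.priors[label])
            for v, (mu, var) in zip(x, self.stats[label]):
                lp += -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
            return lp
        return max(self.priors, key=log_post)

# Hypothetical features, e.g. (mean velocity, pulsatility index) per signal:
X = [(1.0, 2.0), (1.1, 2.1), (0.9, 1.9), (3.0, 0.5), (3.2, 0.4), (2.9, 0.6)]
y = ["healthy", "healthy", "healthy", "PAD", "PAD", "PAD"]
clf = GaussianNB().fit(X, y)
```

Given a new feature vector, `clf.predict` returns the class with the highest posterior, combining the class priors with per-feature Gaussian likelihoods.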

  9. An Investigation to Improve Classifier Accuracy for Myo Collected Data

    Science.gov (United States)

    2017-02-01

The report evaluates classifier accuracy for Myo-collected gesture data, including the effect of bad samples on classification accuracy, for Naïve Bayes (NB), Logistic Model Tree (LMT), and K-Nearest Neighbor classifiers; appendix figures (pitch feature of the Come gesture for users 06 and 14) show samples exhibiting reversed movement.

  10. Diagnosis of Broiler Livers by Classifying Image Patches

    DEFF Research Database (Denmark)

    Jørgensen, Anders; Fagertun, Jens; Moeslund, Thomas B.

    2017-01-01

Manual health inspection is becoming the bottleneck at poultry processing plants. We present a computer vision method for automatic diagnosis of broiler livers. The non-rigid livers, of varying shapes and sizes, are classified in patches by a convolutional neural network, outputting maps...

  11. Support vector machines classifiers of physical activities in preschoolers

    Science.gov (United States)

The goal of this study is to develop, test, and compare multinomial logistic regression (MLR) and support vector machines (SVM) in classifying preschool-aged children's physical activity data acquired from an accelerometer. In this study, 69 children aged 3-5 years old were asked to participate in a s...

  12. A Linguistic Image of Nature: The Burmese Numerative Classifier System

    Science.gov (United States)

    Becker, Alton L.

    1975-01-01

    The Burmese classifier system is coherent because it is based upon a single elementary semantic dimension: deixis. On that dimension, four distances are distinguished, distances which metaphorically substitute for other conceptual relations between people and other living beings, people and things, and people and concepts. (Author/RM)

  13. Data Stream Classification Based on the Gamma Classifier

    Directory of Open Access Journals (Sweden)

    Abril Valeria Uriarte-Arcia

    2015-01-01

Full Text Available The ever-increasing data generation confronts us with the problem of handling online massive amounts of information. One of the biggest challenges is how to extract valuable information from these massive continuous data streams during single scanning. In a data stream context, data arrive continuously at high speed; therefore the algorithms developed to address this context must be efficient regarding memory and time management and capable of detecting changes over time in the underlying distribution that generated the data. This work describes a novel method for the task of pattern classification over a continuous data stream based on an associative model. The proposed method is based on the Gamma classifier, which is inspired by the Alpha-Beta associative memories, both of which are supervised pattern recognition models. The proposed method is capable of handling the space and time constraints inherent in data stream scenarios. The Data Streaming Gamma classifier (DS-Gamma classifier) implements a sliding window approach to provide concept drift detection and a forgetting mechanism. In order to test the classifier, several experiments were performed using different data stream scenarios with real and synthetic data streams. The experimental results show that the method exhibits competitive performance when compared to other state-of-the-art algorithms.
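The sliding-window and forgetting mechanism can be sketched generically: only the most recent labelled samples are retained, so predictions track drift in the generating distribution. For brevity, the base learner below is Euclidean 1-nearest-neighbour rather than the Gamma associative operator the paper actually uses; the window size is illustrative.

```python
from collections import deque

class SlidingWindowStream:
    """Sliding-window stream classifier with a forgetting mechanism:
    a bounded deque keeps only the most recent `window` labelled
    samples, so older concepts are forgotten automatically."""

    def __init__(self, window=100):
        self.buffer = deque(maxlen=window)  # old samples fall off the left

    def learn(self, x, label):
        self.buffer.append((x, label))

    def predict(self, x):
        # 1-NN over the current window (stand-in for the Gamma operator).
        nearest = min(self.buffer,
                      key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], x)))
        return nearest[1]

stream = SlidingWindowStream(window=3)
for x, label in [((0.0,), "a"), ((1.0,), "b"), ((0.1,), "a"), ((0.9,), "b")]:
    stream.learn(x, label)
# The first sample has been forgotten; predictions use the last 3 only.
```

Concept drift detection can be layered on top by monitoring the error rate over the window and shrinking or resetting the buffer when the rate jumps.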

  14. Building an automated SOAP classifier for emergency department reports.

    Science.gov (United States)

    Mowery, Danielle; Wiebe, Janyce; Visweswaran, Shyam; Harkema, Henk; Chapman, Wendy W

    2012-02-01

Information extraction applications that extract structured event and entity information from unstructured text can leverage knowledge of clinical report structure to improve performance. The Subjective, Objective, Assessment, Plan (SOAP) framework, used to structure progress notes to facilitate problem-specific clinical decision making by physicians, is one example of a well-known, canonical structure in the medical domain. Although its applicability to structuring data is understood, its contribution to information extraction tasks has not yet been determined. The first step in evaluating the SOAP framework's usefulness for clinical information extraction is to apply the model to clinical narratives and develop an automated SOAP classifier that classifies sentences from clinical reports. In this quantitative study, we applied the SOAP framework to sentences from emergency department reports, and trained and evaluated SOAP classifiers built with various linguistic features. We found the SOAP framework can be applied manually to emergency department reports with high agreement (Cohen's kappa coefficients over 0.70). Using a variety of features, we found classifiers for each SOAP class can be created with moderate to outstanding performance, with F1 scores of 93.9 (subjective), 94.5 (objective), 75.7 (assessment), and 77.0 (plan). We look forward to expanding the framework and applying the SOAP classification to clinical information extraction tasks. Copyright © 2011. Published by Elsevier Inc.
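The inter-annotator agreement figure (Cohen's kappa coefficients over 0.70) is computed from two annotators' sentence labels as observed agreement corrected for the agreement expected by chance; a minimal sketch with hypothetical S/O/A/P labels:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance given each annotator's label frequencies."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical SOAP labels from two annotators over eight sentences:
a = ["S", "S", "O", "A", "P", "O", "S", "A"]
b = ["S", "O", "O", "A", "P", "O", "S", "P"]
kappa = cohens_kappa(a, b)
```

A kappa of 1.0 means perfect agreement, 0 means chance-level agreement; values above 0.70, as reported, are conventionally read as substantial agreement.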

  15. Learning to classify wakes from local sensory information

    Science.gov (United States)

    Alsalman, Mohamad; Colvert, Brendan; Kanso, Eva; Kanso Team

    2017-11-01

    Aquatic organisms exhibit remarkable abilities to sense local flow signals contained in their fluid environment and to surmise the origins of these flows. For example, fish can discern the information contained in various flow structures and utilize this information for obstacle avoidance and prey tracking. Flow structures created by flapping and swimming bodies are well characterized in the fluid dynamics literature; however, such characterization relies on classical methods that use an external observer to reconstruct global flow fields. The reconstructed flows, or wakes, are then classified according to the unsteady vortex patterns. Here, we propose a new approach for wake identification: we classify the wakes resulting from a flapping airfoil by applying machine learning algorithms to local flow information. In particular, we simulate the wakes of an oscillating airfoil in an incoming flow, extract the downstream vorticity information, and train a classifier to learn the different flow structures and classify new ones. This data-driven approach provides a promising framework for underwater navigation and detection in application to autonomous bio-inspired vehicles.

  16. The Closing of the Classified Catalog at Boston University

    Science.gov (United States)

    Hazen, Margaret Hindle

    1974-01-01

    Although the classified catalog at Boston University libraries has been a useful research tool, it has proven too expensive to keep current. The library has converted to a traditional alphabetic subject catalog and will receive catalog cards from the Ohio College Library Center through the New England Library Network. (Author/LS)

  17. Recognition of Arabic Sign Language Alphabet Using Polynomial Classifiers

    Directory of Open Access Journals (Sweden)

    M. Al-Rousan

    2005-08-01

    Building an accurate automatic sign language recognition system is of great importance in facilitating efficient communication with deaf people. In this paper, we propose the use of polynomial classifiers as a classification engine for the recognition of the Arabic sign language (ArSL) alphabet. Polynomial classifiers have several advantages over other classifiers: they do not require iterative training, and they are highly computationally scalable with the number of classes. Based on polynomial classifiers, we have built an ArSL system and measured its performance using real ArSL data collected from deaf people. We show that the proposed system provides superior recognition results when compared with previously published results using ANFIS-based classification on the same dataset and feature extraction methodology. The comparison is shown in terms of the number of misclassified test patterns. The reduction in the rate of misclassified patterns was very significant. In particular, we achieved a 36% reduction of misclassifications on the training data and 57% on the test data.
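
    The non-iterative training property mentioned above can be illustrated with a least-squares polynomial classifier: expand the features to second order, then fit all class discriminants in one closed-form solve. This is a generic sketch of the idea, not the exact discriminative training used in the paper; the toy data are invented:

```python
import numpy as np

def quadratic_expand(X):
    """Second-order polynomial feature expansion: [1, x_i, x_i*x_j]."""
    cols = [np.ones(len(X))]
    d = X.shape[1]
    cols += [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

def train_poly_classifier(X, y, n_classes):
    """Non-iterative training: a single least-squares solve against
    one-hot targets -- no gradient-descent loop is needed."""
    P = quadratic_expand(X)
    T = np.eye(n_classes)[y]                 # one-hot target matrix
    W, *_ = np.linalg.lstsq(P, T, rcond=None)
    return W

def predict(W, X):
    return np.argmax(quadratic_expand(X) @ W, axis=1)

# toy two-class problem that a linear classifier cannot solve (XOR-like)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)
W = train_poly_classifier(X, y, n_classes=2)
acc = (predict(W, X) == y).mean()
```

    Because the cross-term x_0*x_1 appears in the expansion, the closed-form fit separates the XOR-like classes that defeat a purely linear discriminant.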

  18. Reconfigurable support vector machine classifier with approximate computing

    NARCIS (Netherlands)

    van Leussen, M.J.; Huisken, J.; Wang, L.; Jiao, H.; De Gyvez, J.P.

    2017-01-01

    Support Vector Machine (SVM) is one of the most popular machine learning algorithms. An energy-efficient SVM classifier is proposed in this paper, where approximate computing is utilized to reduce energy consumption and silicon area. A hardware architecture with reconfigurable kernels and

  19. Classifying regularized sensor covariance matrices: An alternative to CSP

    NARCIS (Netherlands)

    Roijendijk, L.M.M.; Gielen, C.C.A.M.; Farquhar, J.D.R.

    2016-01-01

    Common spatial patterns (CSP) is a commonly used technique for classifying imagined movement type brain-computer interface (BCI) datasets. It has been very successful with many extensions and improvements on the basic technique. However, a drawback of CSP is that the signal processing pipeline

  1. Two-categorical bundles and their classifying spaces

    DEFF Research Database (Denmark)

    Baas, Nils A.; Bökstedt, M.; Kro, T.A.

    2012-01-01

    -category is a classifying space for the associated principal 2-bundles. In the process of proving this we develop a lot of powerful machinery which may be useful in further studies of 2-categorical topology. As a corollary we get a new proof of the classification of principal bundles. A calculation based...

  2. 3 CFR - Classified Information and Controlled Unclassified Information

    Science.gov (United States)

    2010-01-01

    ... on Transparency and Open Government and on the Freedom of Information Act, my Administration is... memoranda of January 21, 2009, on Transparency and Open Government and on the Freedom of Information Act; (B... 3 The President 1 2010-01-01 2010-01-01 false Classified Information and Controlled Unclassified...

  3. Comparison of Classifier Architectures for Online Neural Spike Sorting.

    Science.gov (United States)

    Saeed, Maryam; Khan, Amir Ali; Kamboh, Awais Mehmood

    2017-04-01

    High-density intracranial recordings from micro-electrode arrays need to undergo spike sorting in order to associate the recorded neuronal spikes with particular neurons. This involves spike detection, feature extraction, and classification. To reduce data transmission and power requirements, on-chip real-time processing is becoming very popular. However, classifiers in on-chip spike sorters require high computational resources, making scalability a great challenge. In this review paper, we analyze several popular classifiers to propose five new hardware architectures using the off-chip-training, on-chip-classification approach. These include support vector classification, fuzzy C-means classification, self-organizing maps classification, moving-centroid K-means classification, and cosine distance classification. The performance of these architectures is analyzed in terms of accuracy and resource requirements. We establish that the neural-network-based self-organizing maps classifier offers the most viable solution. A spike sorter based on the self-organizing maps classifier requires only 7.83% of the computational resources of the best-reported spike sorter, hierarchical adaptive means, while offering 3% better accuracy at 7 dB SNR.

  4. A Gene Expression Classifier of Node-Positive Colorectal Cancer

    Directory of Open Access Journals (Sweden)

    Paul F. Meeh

    2009-10-01

    We used digital long serial analysis of gene expression to discover gene expression differences between node-negative and node-positive colorectal tumors and developed a multigene classifier able to discriminate between these two tumor types. We prepared and sequenced long serial analysis of gene expression libraries from one node-negative and one node-positive colorectal tumor, sequenced to a depth of 26,060 unique tags, and identified 262 tags significantly differentially expressed between these two tumors (P < 2 × 10⁻⁶). We confirmed the tag-to-gene assignments and differential expression of 31 genes by quantitative real-time polymerase chain reaction, 12 of which were elevated in the node-positive tumor. We analyzed the expression levels of these 12 upregulated genes in a validation panel of 23 additional tumors and developed an optimized seven-gene logistic regression classifier. The classifier discriminated between node-negative and node-positive tumors with 86% sensitivity and 80% specificity. Receiver operating characteristic analysis of the classifier revealed an area under the curve of 0.86. Experimental manipulation of the function of one classification gene, Fibronectin, caused profound effects on invasion and migration of colorectal cancer cells in vitro. These results suggest that the development of node-positive colorectal cancer occurs in part through elevated epithelial FN1 expression and suggest novel strategies for the diagnosis and treatment of advanced disease.

  5. Cascaded lexicalised classifiers for second-person reference resolution

    NARCIS (Netherlands)

    Purver, M.; Fernández, R.; Frampton, M.; Peters, S.; Healey, P.; Pieraccini, R.; Byron, D.; Young, S.; Purver, M.

    2009-01-01

    This paper examines the resolution of the second person English pronoun you in multi-party dialogue. Following previous work, we attempt to classify instances as generic or referential, and in the latter case identify the singular or plural addressee. We show that accuracy and robustness can be

  6. Human Activity Recognition by Combining a Small Number of Classifiers.

    Science.gov (United States)

    Nazabal, Alfredo; Garcia-Moreno, Pablo; Artes-Rodriguez, Antonio; Ghahramani, Zoubin

    2016-09-01

    We consider the problem of daily human activity recognition (HAR) using multiple wireless inertial sensors, and specifically, HAR systems with a very low number of sensors, each one providing an estimation of the performed activities. We propose new Bayesian models to combine the outputs of the sensors. The models are based on a soft-output combination of individual classifiers to deal with the small number of sensors. We also incorporate the dynamic nature of human activities as a first-order homogeneous Markov chain. We develop both inductive and transductive inference methods for each model, to be employed in supervised and semisupervised situations, respectively. Using different real HAR databases, we compare our classifier combination models against a single classifier that employs all the signals from the sensors. Our models consistently exhibit a reduction of the error rate and an increase of robustness against sensor failures. Our models also outperform other classifier combination models that do not consider soft outputs and a Markovian structure of the human activities.
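
    The soft-output fusion with a first-order Markov chain can be sketched as a forward pass: multiply the per-sensor class posteriors (an independent-sensor assumption), then propagate through a sticky transition matrix. The numbers below are illustrative, not the paper's model parameters:

```python
import numpy as np

def fuse_and_smooth(posteriors, T, prior):
    """Soft fusion of per-sensor class posteriors (elementwise product)
    followed by a first-order Markov forward pass over time.
    posteriors: array of shape (time, sensors, classes)."""
    n_t, _, n_c = posteriors.shape
    fused = posteriors.prod(axis=1)                # combine sensor soft outputs
    fused /= fused.sum(axis=1, keepdims=True)
    alpha = np.zeros((n_t, n_c))
    alpha[0] = prior * fused[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, n_t):
        alpha[t] = (alpha[t - 1] @ T) * fused[t]   # predict, then correct
        alpha[t] /= alpha[t].sum()
    return alpha.argmax(axis=1)

# two sensors, two activities; both sensors are ambiguous at t = 2,
# but the sticky Markov prior keeps the estimate on activity 0
post = np.array([
    [[0.9, 0.1], [0.8, 0.2]],
    [[0.8, 0.2], [0.7, 0.3]],
    [[0.6, 0.4], [0.4, 0.6]],
])
T = np.array([[0.95, 0.05], [0.05, 0.95]])
labels = fuse_and_smooth(post, T, prior=np.array([0.5, 0.5]))
```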

  7. Evaluation of three classifiers in mapping forest stand types using ...

    African Journals Online (AJOL)

    EJIRO

    applied for classification of the image. Supervised classification technique using maximum likelihood algorithm is the most commonly and widely used method for land cover classification (Jia and Richards, 2006). In Australia, the maximum likelihood classifier was effectively used to map different forest stand types with high.

  8. Classifying patients' complaints for regulatory purposes : A Pilot Study

    NARCIS (Netherlands)

    Bouwman, R.J.R.; Bomhoff, Manja; Robben, Paul; Friele, R.D.

    2018-01-01

    Objectives: It is assumed that classifying and aggregated reporting of patients' complaints by regulators helps to identify problem areas, to respond better to patients and increase public accountability. This pilot study addresses what a classification of complaints in a regulatory setting

  9. Localizing genes to cerebellar layers by classifying ISH images.

    Directory of Open Access Journals (Sweden)

    Lior Kirsch

    Gene expression controls how the brain develops and functions. Understanding control processes in the brain is particularly hard since they involve numerous types of neurons and glia, and very little is known about which genes are expressed in which cells and brain layers. Here we describe an approach to detect genes whose expression is primarily localized to a specific brain layer and apply it to the mouse cerebellum. We learn typical spatial patterns of expression from a few markers that are known to be localized to specific layers, and use these patterns to predict localization for new genes. We analyze images of in-situ hybridization (ISH) experiments, which we represent using histograms of local binary patterns (LBP), and train image classifiers and gene classifiers for four layers of the cerebellum: the Purkinje, granular, molecular, and white matter layers. On held-out data, the layer classifiers achieve accuracy above 94% (AUC) by representing each image at multiple scales and by combining multiple image scores into a single gene-level decision. When applied to the full mouse genome, the classifiers predict specific layer localization for hundreds of new genes in the Purkinje and granular layers. Many genes localized to the Purkinje layer are likely to be expressed in astrocytes, and many others are involved in lipid metabolism, possibly due to the unusual size of Purkinje cells.
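
    A histogram of local binary patterns, the image representation mentioned above, can be computed directly. A minimal 8-neighbour sketch on a toy image (the paper's multi-scale variant builds on the same basic operator):

```python
import numpy as np

def lbp_histogram(img):
    """Normalised histogram of 8-neighbour local binary patterns.
    Each interior pixel gets an 8-bit code: one bit per neighbour,
    set when that neighbour is >= the centre pixel."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    hist = np.zeros(256, dtype=int)
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            code = 0
            for bit, (dr, dc) in enumerate(offsets):
                if img[r + dr, c + dc] >= img[r, c]:
                    code |= 1 << bit
            hist[code] += 1
    return hist / hist.sum()        # histogram feature vector

img = np.array([[5, 5, 5, 5],
                [5, 9, 9, 5],
                [5, 9, 9, 5],
                [5, 5, 5, 5]])
feat = lbp_histogram(img)
```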

  10. Classifying Your Food as Acid, Low-Acid, or Acidified

    OpenAIRE

    Bacon, Karleigh

    2012-01-01

    As a food entrepreneur, you should be aware of how ingredients in your product make the food look, feel, and taste, as well as how the ingredients create environments for microorganisms like bacteria, yeast, and molds to survive and grow. This guide will help you classify your food as acid, low-acid, or acidified.

  11. Gene-expression Classifier in Papillary Thyroid Carcinoma

    DEFF Research Database (Denmark)

    Londero, Stefano Christian; Jespersen, Marie Louise; Krogdahl, Annelise

    2016-01-01

    BACKGROUND: No reliable biomarker for metastatic potential in the risk stratification of papillary thyroid carcinoma exists. We aimed to develop a gene-expression classifier for metastatic potential. MATERIALS AND METHODS: Genome-wide expression analyses were used. Development cohort: freshly...

  12. Abbreviations: Their Effects on Comprehension of Classified Advertisements.

    Science.gov (United States)

    Sokol, Kirstin R.

    Two experimental designs were used to test the hypothesis that abbreviations in classified advertisements decrease the reader's comprehension of such ads. In the first experimental design, 73 high school students read four ads (for employment, used cars, apartments for rent, and articles for sale) either with abbreviations or with all…

  13. Predict or classify: The deceptive role of time-locking in brain signal classification

    Science.gov (United States)

    Rusconi, Marco; Valleriani, Angelo

    2016-06-01

    Several experimental studies claim to be able to predict the outcome of simple decisions from brain signals measured before subjects are aware of their decision. Often, these studies use multivariate pattern recognition methods with the underlying assumption that the ability to classify the brain signal is equivalent to predicting the decision itself. Here we show instead that it is possible to correctly classify a signal even if it does not contain any predictive information about the decision. We first define a simple stochastic model that mimics the random decision process between two equivalent alternatives, and generate a large number of independent trials that contain no choice-predictive information. The trials are first time-locked to the time point of the final event and then classified using standard machine-learning techniques. The resulting classification accuracy is above chance level long before the time point of time-locking. We then analyze the same trials using information theory. We demonstrate that the high classification accuracy is a consequence of time-locking and that its time behavior is simply related to the large relaxation time of the process. We conclude that when time-locking is a crucial step in the analysis of neural activity patterns, both the emergence and the timing of the classification accuracy are affected by structural properties of the network that generates the signal.
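
    The effect described above can be illustrated with an even simpler surrogate: a symmetric random walk absorbed at two boundaries, with the "decision" being which boundary is hit. Time-locking trials to the absorption event makes a trivial sign classifier score well above chance long before the event. This is an illustrative sketch in the spirit of the paper's stochastic model, not the authors' exact model; the parameters are invented:

```python
import random
random.seed(1)

def trial(theta=10, max_steps=20000):
    """Symmetric +/-1 random walk with absorbing boundaries at +/-theta.
    Which boundary is hit plays the role of the random 'decision'."""
    x, path = 0, [0]
    while abs(x) < theta and len(path) < max_steps:
        x += random.choice((-1, 1))
        path.append(x)
    return path, 1 if x >= theta else 0

lag = 20
data = []
for _ in range(400):
    path, outcome = trial()
    if len(path) > lag:
        # signal `lag` steps before the time-locking (absorption) point
        data.append((path[-1 - lag], outcome))

# trivial classifier: sign of the time-locked signal
correct = sum((x > 0) == (y == 1) for x, y in data if x != 0)
total = sum(1 for x, _ in data if x != 0)
accuracy = correct / total
```

    The accuracy is well above the 50% chance level even 20 steps before the event, purely because time-locking selects trajectories that are already drifting toward the boundary they eventually hit.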

  14. The EB factory project. I. A fast, neural-net-based, general purpose light curve classifier optimized for eclipsing binaries

    International Nuclear Information System (INIS)

    Paegert, Martin; Stassun, Keivan G.; Burger, Dan M.

    2014-01-01

    We describe a new neural-net-based light curve classifier and provide it with documentation as a ready-to-use tool for the community. While optimized for identification and classification of eclipsing binary stars, the classifier is general purpose, and has been developed for speed in the context of upcoming massive surveys such as the Large Synoptic Survey Telescope. A challenge for classifiers in the context of neural-net training and massive data sets is to minimize the number of parameters required to describe each light curve. We show that a simple and fast geometric representation that encodes the overall light curve shape, together with a chi-square parameter to capture higher-order morphology information, results in efficient yet robust light curve classification, especially for eclipsing binaries. Testing the classifier on the ASAS light curve database, we achieve a retrieval rate of 98% and a false-positive rate of 2% for eclipsing binaries. We achieve similarly high retrieval rates for most other periodic variable-star classes, including RR Lyrae, Mira, and delta Scuti. However, the classifier currently has difficulty discriminating between different sub-classes of eclipsing binaries, and suffers a relatively low (∼60%) retrieval rate for multi-mode delta Cepheid stars. We find that it is imperative to train the classifier's neural network with exemplars that include the full range of light curve quality on which the classifier will be expected to perform; the classifier performs well on noisy light curves only when trained with noisy exemplars. The classifier source code, ancillary programs, a trained neural net, and a guide for use are provided.

  15. Zooniverse: Combining Human and Machine Classifiers for the Big Survey Era

    Science.gov (United States)

    Fortson, Lucy; Wright, Darryl; Beck, Melanie; Lintott, Chris; Scarlata, Claudia; Dickinson, Hugh; Trouille, Laura; Willi, Marco; Laraia, Michael; Boyer, Amy; Veldhuis, Marten; Zooniverse

    2018-01-01

    Many analyses of astronomical data sets, ranging from morphological classification of galaxies to identification of supernova candidates, have relied on humans to classify data into distinct categories. Crowdsourced galaxy classifications via the Galaxy Zoo project provided a solution that scaled visual classification for extant surveys by harnessing the combined power of thousands of volunteers. However, the much larger data sets anticipated from upcoming surveys will require a different approach. Automated classifiers using supervised machine learning have improved considerably over the past decade but their increasing sophistication comes at the expense of needing ever more training data. Crowdsourced classification by human volunteers is a critical technique for obtaining these training data. But several improvements can be made on this zeroth order solution. Efficiency gains can be achieved by implementing a “cascade filtering” approach whereby the task structure is reduced to a set of binary questions that are more suited to simpler machines while demanding lower cognitive loads for humans. Intelligent subject retirement based on quantitative metrics of volunteer skill and subject label reliability also leads to dramatic improvements in efficiency. We note that human and machine classifiers may retire subjects differently leading to trade-offs in performance space. Drawing on work with several Zooniverse projects including Galaxy Zoo and Supernova Hunter, we will present recent findings from experiments that combine cohorts of human and machine classifiers. We show that the most efficient system results when appropriate subsets of the data are intelligently assigned to each group according to their particular capabilities. With sufficient online training, simple machines can quickly classify “easy” subjects, leaving more difficult (and discovery-oriented) tasks for volunteers. We also find humans achieve higher classification purity while samples

  16. Convolution method and CTV-to-PTV margins for finite fractions and small systematic errors

    International Nuclear Information System (INIS)

    Gordon, J J; Siebers, J V

    2007-01-01

    The van Herk margin formula (VHMF) relies on the accuracy of the convolution method (CM) to determine clinical target volume (CTV) to planning target volume (PTV) margins. This work (1) evaluates the accuracy of the CM and VHMF as a function of the number of fractions N and other parameters, and (2) proposes an alternative margin algorithm which ensures target coverage for a wider range of parameter values. Dose coverage was evaluated for a spherical target with uniform margin, using the same simplified dose model and CTV coverage criterion as were used in development of the VHMF. Systematic and random setup errors were assumed to be normally distributed with standard deviations Σ and σ. For clinically relevant combinations of σ, Σ and N, margins were determined by requiring that 90% of treatment course simulations have a CTV minimum dose greater than or equal to the static PTV minimum dose. Simulation results were compared with the VHMF and the alternative margin algorithm. The CM and VHMF were found to be accurate for parameter values satisfying the approximate criterion σ[1 − γN/25] ≲ 0.2; outside this range they failed to account for the non-negligible dose variability associated with random setup errors. These criteria are applicable when σ ≳ σ_P, where σ_P = 0.32 cm is the standard deviation of the normal dose penumbra. (Qualitative behaviour of the CM and VHMF will remain the same, though the criteria might vary if σ_P takes values other than 0.32 cm.) When σ ≲ σ_P, dose variability due to random setup errors becomes negligible, and the CM and VHMF are valid regardless of the values of Σ and N. When σ ≳ σ_P, consistent with the above criteria, it was found that the VHMF can underestimate margins for large σ, small Σ and small N. A potential consequence of this underestimate is that the CTV minimum dose can fall below its planned value in more than the prescribed 10% of treatments. The proposed alternative margin algorithm provides better margin
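
    The van Herk margin formula referenced above has the well-known closed form M = 2.5Σ + 1.64(√(σ² + σ_P²) − σ_P), covering the CTV with at least 95% dose for 90% of patients. A small sketch, with illustrative input values:

```python
import math

def van_herk_margin(Sigma, sigma, sigma_p=0.32):
    """CTV-to-PTV margin (cm) from the van Herk margin formula:
    M = 2.5*Sigma + 1.64*(sqrt(sigma**2 + sigma_p**2) - sigma_p).
    Sigma: systematic setup-error SD; sigma: random setup-error SD;
    sigma_p: SD of the Gaussian dose penumbra (0.32 cm, as in the abstract)."""
    return 2.5 * Sigma + 1.64 * (math.sqrt(sigma ** 2 + sigma_p ** 2) - sigma_p)

margin = van_herk_margin(Sigma=0.3, sigma=0.3)   # cm, illustrative inputs
```

    Note the formula itself has no dependence on the number of fractions N, which is precisely the limitation the abstract examines for small N.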

  17. Marginal costs and co-benefits of energy efficiency investments

    International Nuclear Information System (INIS)

    Jakob, Martin

    2006-01-01

    Key elements of present investment decision-making regarding the energy efficiency of new buildings and the refurbishment of existing buildings are the marginal costs of energy efficiency measures and the incomplete knowledge of investors and architects about pricing, co-benefits and new technologies. This paper reports on a recently completed empirical study for the Swiss residential sector. It empirically quantifies the marginal costs of energy efficiency investments (i.e. additional insulation, improved window systems, ventilation and heating systems, and architectural concepts). For the private sector, first results are given on the economic valuation of co-benefits such as improved comfort of living, improved indoor air quality and better protection against external noise, which may amount to the same order of magnitude as the energy-related benefits. The cost-benefit analysis includes newly developed technologies that show large variations in prices due to pioneer market pricing, the add-on of learning costs and risk components of the installers. Based on new empirical data on the present cost situation and past techno-economic progress, the potential for future cost reduction was estimated by applying the experience curve concept. The paper shows, for the first time, co-benefits and cost dynamics of energy efficiency investments, of which decision makers in the real estate sector, politics and administrations are scarcely aware.

  18. Neogene sedimentation on the outer continental margin, southern Bering Sea

    Science.gov (United States)

    Vallier, T.L.; Underwood, M.B.; Gardner, J.V.; Barron, J.A.

    1980-01-01

    Neogene sedimentary rocks and sediments from sites on the outer continental margin in the southern Bering Sea and on the Alaska Peninsula are dominated by volcanic components that probably were eroded from an emergent Aleutian Ridge. A mainland continental source is subordinate. Most sediment in the marine environment was transported to the depositional sites by longshore currents, debris flows, and turbidity currents during times when sea level was near the outermost continental shelf. Fluctuations of sea level are ascribed both to worldwide glacio-eustatic effects and to regional vertical tectonics. Large drainage systems, such as the Yukon and Kuskokwim Rivers, had little direct influence on sedimentation along the continental slope and Umnak Plateau in the southern Bering Sea. Sediments from those drainage systems probably were transported to the floor of the Aleutian Basin, to the numerous shelf basins that underlie the outer continental shelf, and to the Arctic Ocean after passing through the Bering Strait. Environments of deposition at the sites along the outer continental margin have not changed significantly since the middle Miocene. The site on the Alaska Peninsula, however, is now emergent following shallow-marine and transitional sedimentation during the Neogene.

  19. Work, organisational practices, and margin of manoeuver during work reintegration.

    Science.gov (United States)

    O'Hagan, Fergal

    2017-09-29

    Many individuals of working age experience cardiovascular disease and are disabled from work as a result. The majority of research in cardiac work disability has focused on individual biological and psychological factors influencing work disability, despite evidence of the importance of social context in work disability. In this article, the focus is on work and organisational features influencing the leeway (margin of manoeuvre) workers are afforded during work reintegration. A qualitative method was used. A large auto manufacturing plant was selected owing to its work, organisational, and worker characteristics. Workplace context was assessed through site visits and meetings with stakeholders, including occupational health, human resources and union personnel, and a review of collective agreement provisions relating to seniority, benefits and accommodation. Worker experience was assessed using a series of in-depth interviews with workers (n = 12) returning to work at the plant following disabling cardiac illness. Data were analysed using qualitative content analysis. Workers demonstrated variable levels of adjustment to the workplace that could be related to production expectations and work design. Policies and practices around electronic rate monitoring, seniority and accommodation, and disability management practices affected the buffer available to workers to adjust to the workplace. Work qualities and organisational resources establish a margin of manoeuvre for work reintegration efforts. Practitioners need to inform themselves of the constraints on work accommodation imposed by work organisation and collective agreements. Organisations and labour need to reconsider policies and practices that create unequal accommodation conditions for disabled workers. Implications for rehabilitation: Margin of manoeuvre offers a framework for evaluating and structuring work reintegration programmes. Assessing initial conditions for productivity expectations, context and ways

  20. Chronobiology of deep-water decapod crustaceans on continental margins.

    Science.gov (United States)

    Aguzzi, Jacopo; Company, Joan B

    2010-01-01

    Species have evolved biological rhythms in behaviour and physiology with a 24-h periodicity in order to increase their fitness, anticipating the onset of unfavourable habitat conditions. In marine organisms inhabiting deep-water continental margins (i.e. the submerged outer edges of continents), day-night activity rhythms are often referred to in three ways: vertical water column migrations (i.e. pelagic), horizontal displacements within benthic boundary layer of the continental margin, along bathymetric gradients (i.e. nektobenthic), and endobenthic movements (i.e. rhythmic emergence from the substrate). Many studies have been conducted on crustacean decapods that migrate vertically in the water column, but much less information is available for other endobenthic and nektobenthic species. Also, the types of displacement and major life habits of most marine species are still largely unknown, especially in deep-water continental margins, where steep clines in habitat factors (i.e. light intensity and its spectral quality, sediment characteristics, and hydrography) take place. This is the result of technical difficulties in performing temporally scheduled sampling and laboratory testing on living specimens. According to this scenario, there are several major issues that still need extensive research in deep-water crustacean decapods. First, the regulation of their behaviour and physiology by a biological clock is almost unknown compared to data for coastal species that are easily accessible to direct observation and sampling. Second, biological rhythms may change at different life stages (i.e. size-related variations) or at different moments of the reproductive cycle (e.g. at egg-bearing) based on different intra- and interspecific interactions. Third, there is still a major lack of knowledge on the links that exist among the observed bathymetric distributions of species and selected autoecological traits that are controlled by their biological clock, such as the

  1. A linear-RBF multikernel SVM to classify big text corpora.

    Science.gov (United States)

    Romero, R; Iglesias, E L; Borrajo, L

    2015-01-01

    Support vector machine (SVM) is a powerful technique for classification. However, SVM is not suitable for classification of large datasets or text corpora, because the training complexity of SVMs is highly dependent on the input size. Recent developments in the literature on the SVM and other kernel methods emphasize the need to consider multiple kernels or parameterizations of kernels because they provide greater flexibility. This paper presents a multikernel SVM for managing high-dimensional data, providing automatic parameterization with low computational cost and improving results relative to SVMs parameterized by brute-force search. The model consists of spreading the dataset into cohesive term slices (clusters) to construct a defined structure (multikernel). The new approach is tested on different text corpora. Experimental results show that the new classifier has good accuracy compared with the classic SVM, while the training is significantly faster than several other SVM classifiers.
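
    The core multikernel idea, a weighted combination of base kernels, can be sketched with a linear-RBF pair: a convex combination of positive semidefinite kernels is itself a valid kernel. Here the paper's cluster-based construction and SVM solver are replaced by the simplest stand-ins (a kernel ridge fit); the data, weights, and bandwidth are invented:

```python
import numpy as np

def linear_kernel(A, B):
    return A @ B.T

def rbf_kernel(A, B, gamma=2.0):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def combined_kernel(A, B, w=0.5, gamma=2.0):
    """Convex combination of a linear and an RBF kernel -- the simplest
    multikernel scheme (a weighted sum of PSD kernels is PSD)."""
    return w * linear_kernel(A, B) + (1 - w) * rbf_kernel(A, B, gamma)

# kernel ridge "classifier": sign of the regression output on +/-1 labels
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 2))
y = np.where(X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0, 1.0, -1.0)  # nonlinear boundary
K = combined_kernel(X, X)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(X)), y)       # ridge-regularised fit
pred = np.sign(K @ alpha)
train_acc = (pred == y).mean()
```

    The RBF component captures the circular boundary that the linear component alone cannot, which is the usual motivation for mixing kernels.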

  2. Classification of EEG signals using a genetic-based machine learning classifier.

    Science.gov (United States)

    Skinner, B T; Nguyen, H T; Liu, D K

    2007-01-01

    This paper investigates the efficacy of the genetic-based learning classifier system XCS for the classification of noisy, artefact-inclusive human electroencephalogram (EEG) signals represented using large condition strings (108 bits). EEG signals from three participants were recorded while they performed four mental tasks designed to elicit hemispheric responses. Autoregressive (AR) models and Fast Fourier Transform (FFT) methods were used to form feature vectors with which mental tasks can be discriminated. XCS achieved a maximum classification accuracy of 99.3% and a best average of 88.9%. The relative classification performance of XCS was then compared against four non-evolutionary classifier systems originating from different learning techniques. The experimental results will be used as part of our larger research effort investigating the feasibility of using EEG signals as an interface to allow paralysed persons to control a powered wheelchair or other devices.
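
    The AR and FFT feature extraction steps mentioned above can be sketched as follows; the sampling rate, model order, frequency bands, and test signal are illustrative choices, not those of the paper:

```python
import numpy as np

def ar_features(x, order=4):
    """Least-squares AR(p) coefficients: predict x[t] from the p previous
    samples. The fitted coefficients form a compact feature vector."""
    A = np.array([x[i:i + order] for i in range(len(x) - order)])
    b = x[order:]
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coef

def band_power_features(x, fs, bands=((8, 13), (13, 30))):
    """FFT band powers, e.g. alpha (8-13 Hz) and beta (13-30 Hz)."""
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    return np.array([psd[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])

fs = 128
t = np.arange(0, 4, 1 / fs)
alpha_wave = np.sin(2 * np.pi * 10 * t)      # 10 Hz tone: inside the alpha band
feat_ar = ar_features(alpha_wave)
feat_bp = band_power_features(alpha_wave, fs)
```

    For the 10 Hz test tone, essentially all spectral power lands in the alpha band, so the band-power feature cleanly separates it from beta-dominated activity.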

  3. A sub-nJ CMOS ECG classifier for wireless smart sensor.

    Science.gov (United States)

    Chollet, Paul; Pallas, Remi; Lahuec, Cyril; Arzel, Matthieu; Seguin, Fabrice

    2017-07-01

    Body area sensor networks hold the promise of more efficient and cheaper medical care services through the constant monitoring of physiological markers such as heartbeats. Continuously transmitting the electrocardiogram (ECG) signal consumes most of a wireless ECG sensor's energy budget. This paper presents the analog implementation of a classifier for ECG signals that can be embedded onto a sensor. The classifier is a sparse neural associative memory. It is implemented in ST 65 nm CMOS technology and requires only 234 pJ per classification while achieving 93.6% classification accuracy. The energy requirement is six orders of magnitude lower than that of a digital accelerator performing a similar task. The lifespan of the resulting sensor is 191 times that of a sensor transmitting all the data.

  4. Dissecting the gray zone between follicular lymphoma and marginal zone lymphoma using morphological and genetic features

    NARCIS (Netherlands)

    Krijgsman, Oscar; Gonzalez, Patricia; Ponz, Olga Balague; Roemer, Margaretha G. M.; Slot, Stefanie; Broeks, Annegien; Braaf, Linde; Kerkhoven, Ron M.; Bot, Freek; van Groningen, Krijn; Beijert, Max; Ylstra, Bauke; de Jongi, Daphne

    2013-01-01

    Nodal marginal zone lymphoma is a poorly defined entity in the World Health Organization classification, based largely on criteria of exclusion, and the diagnosis often remains subjective. Follicular lymphoma lacking t(14;18) has similar characteristics, which results in a major potential diagnostic

  5. Margination of Stiffened Red Blood Cells Regulated By Vessel Geometry.

    Science.gov (United States)

    Chen, Yuanyuan; Li, Donghai; Li, Yongjian; Wan, Jiandi; Li, Jiang; Chen, Haosheng

    2017-11-10

    Margination of stiffened red blood cells (RBCs) has been implicated in many vascular diseases. Here, we report the margination of stiffened RBCs in vivo and, through calculations that treat blood as a viscoelastic fluid, reveal the crucial role of vessel geometry in margination. The vessel-geometry-regulated margination is then confirmed by in vitro experiments in microfluidic devices, providing new insights for cell-sorting technology and artificial blood vessel fabrication.

  6. Marginal and Interaction Effects in Ordered Response Models

    OpenAIRE

    Debdulal Mallick

    2009-01-01

    In discrete choice models the marginal effect of a variable of interest that is interacted with another variable differs from the marginal effect of a variable that is not interacted with any variable. The magnitude of the interaction effect is also not equal to the marginal effect of the interaction term. I present consistent estimators of both marginal and interaction effects in ordered response models. This procedure is general and can easily be extended to other discrete choice models. I ...
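The distinction drawn above can be made concrete with a small ordered-probit illustration (coefficients, cutpoint, and evaluation point are invented): the true interaction effect on an outcome probability is the cross-partial derivative, which differs from the naive "marginal effect of the interaction term".

```python
import math

def Phi(z):  # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def phi(z):  # standard normal density
    return math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)

# Latent index y* = b1*x1 + b2*x2 + b3*x1*x2; top category: y* > c2.
b1, b2, b3, c2 = 0.8, -0.4, 0.5, 0.3

def p_top(x1, x2):
    return 1.0 - Phi(c2 - (b1 * x1 + b2 * x2 + b3 * x1 * x2))

x1, x2, h = 0.2, -0.1, 1e-4

# Interaction effect = cross-partial of P(top) w.r.t. x1 and x2 (finite diff).
inter_fd = (p_top(x1 + h, x2 + h) - p_top(x1 + h, x2 - h)
            - p_top(x1 - h, x2 + h) + p_top(x1 - h, x2 - h)) / (4 * h * h)

# Analytic cross-partial, versus the naive phi(z)*b3 "marginal effect of x1*x2".
z = c2 - (b1 * x1 + b2 * x2 + b3 * x1 * x2)
inter_an = z * phi(z) * (b2 + b3 * x1) * (b1 + b3 * x2) + phi(z) * b3
naive = phi(z) * b3
```

The finite-difference and analytic values agree, and both differ from the naive quantity, which is the abstract's central point.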

  7. Predicting membrane protein types using various decision tree classifiers based on various modes of general PseAAC for imbalanced datasets.

    Science.gov (United States)

    Sankari, E Siva; Manimegalai, D

    2017-12-21

    Predicting membrane protein types is an important and challenging research area in bioinformatics and proteomics. Traditional biophysical methods used to classify membrane protein types are, given the large number of uncharacterized protein sequences in databases, very time consuming, expensive, and susceptible to errors. Hence, it is highly desirable to develop a robust, reliable, and efficient method to predict membrane protein types. Decision tree classifiers often handle imbalanced and large datasets well. Because the datasets used here are imbalanced, the performance of various decision tree classifiers (Decision Tree (DT), Classification And Regression Tree (CART), C4.5, Random tree, and REP (Reduced Error Pruning) tree) and ensemble methods (Adaboost, RUS (Random Under Sampling) boost, Rotation forest, and Random forest) is analysed. Among these, Random forest performs well in less time, with a good accuracy of 96.35%. A further observation is that the RUS boost decision tree classifier can classify one or two samples in classes with very few samples, while classifiers such as DT, Adaboost, Rotation forest, and Random forest are not sensitive to classes with fewer samples. The performance of the decision tree classifiers is also compared with SVM (Support Vector Machine) and Naive Bayes classifiers. Copyright © 2017 Elsevier Ltd. All rights reserved.
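The "RUS" step that RUSBoost combines with boosting can be sketched as plain random undersampling of the majority class until the classes balance; the toy data below are invented, not the membrane-protein datasets.

```python
import numpy as np

rng = np.random.default_rng(2)

# Imbalanced toy set: 95 majority-class vs 5 minority-class samples.
y = np.array([0] * 95 + [1] * 5)
X = np.vstack([rng.normal(0.0, 1.0, (95, 3)), rng.normal(3.0, 1.0, (5, 3))])

def random_undersample(X, y, rng):
    """Keep a random subset of each class of size min(class counts)."""
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    keep = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes])
    return X[keep], y[keep]

Xb, yb = random_undersample(X, y, rng)  # balanced: 5 samples per class
```

A boosted ensemble would repeat this resampling on each round, which is what gives RUSBoost its sensitivity to the rare classes mentioned above.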

  8. Principals' Perceived Supervisory Behaviors Regarding Marginal Teachers in Two States

    Science.gov (United States)

    Range, Bret; Hewitt, Paul; Young, Suzie

    2014-01-01

    This descriptive study used an online survey to determine how principals in two states viewed the supervision of marginal teachers. Principals ranked their own evaluation of the teacher as the most important factor when identifying marginal teachers and relied on informal methods to diagnose marginal teaching. Female principals rated a majority of…

  9. Quantifying motion for pancreatic radiotherapy margin calculation

    International Nuclear Information System (INIS)

    Whitfield, Gillian; Jain, Pooja; Green, Melanie; Watkins, Gillian; Henry, Ann; Stratford, Julie; Amer, Ali; Marchant, Thomas; Moore, Christopher; Price, Patricia

    2012-01-01

    Background and purpose: Pancreatic radiotherapy (RT) is limited by uncertain target motion. We quantified 3D patient/organ motion during pancreatic RT and calculated required treatment margins. Materials and methods: Cone-beam computed tomography (CBCT) and orthogonal fluoroscopy images were acquired post-RT delivery from 13 patients with locally advanced pancreatic cancer. Bony setup errors were calculated from CBCT. Inter- and intra-fraction fiducial (clip/seed/stent) motion was determined from CBCT projections and orthogonal fluoroscopy. Results: Using an off-line CBCT correction protocol, systematic (random) setup errors were 2.4 (3.2), 2.0 (1.7) and 3.2 (3.6) mm laterally (left–right), vertically (anterior–posterior) and longitudinally (cranio-caudal), respectively. Fiducial motion varied substantially. Random inter-fractional changes in mean fiducial position were 2.0, 1.6 and 2.6 mm; 95% of intra-fractional peak-to-peak fiducial motion was up to 6.7, 10.1 and 20.6 mm, respectively. Calculated clinical to planning target volume (CTV–PTV) margins were 1.4 cm laterally, 1.4 cm vertically and 3.0 cm longitudinally for 3D conformal RT, reduced to 0.9, 1.0 and 1.8 cm, respectively, if using 4D planning and online setup correction. Conclusions: Commonly used CTV–PTV margins may inadequately account for target motion during pancreatic RT. Our results indicate better immobilisation, individualised allowance for respiratory motion, online setup error correction and 4D planning would improve targeting.
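A standard recipe for turning systematic (Sigma) and random (sigma) errors into a CTV-PTV margin is van Herk's 2.5*Sigma + 0.7*sigma formula. The abstract does not state which recipe the authors used, so the sketch below applies that formula only to the reported bony setup errors as an illustration; the study's quoted margins are larger because they also fold in inter- and intra-fractional fiducial motion.

```python
# Reported bony setup errors (mm): systematic (Sigma) and random (sigma).
systematic = {"lateral": 2.4, "vertical": 2.0, "longitudinal": 3.2}
random_ = {"lateral": 3.2, "vertical": 1.7, "longitudinal": 3.6}

# Van Herk margin recipe: 2.5*Sigma + 0.7*sigma per axis, in mm.
margins = {axis: 2.5 * systematic[axis] + 0.7 * random_[axis]
           for axis in systematic}
# Setup error alone gives ~8.2 / 6.2 / 10.5 mm, well short of the
# 14 / 14 / 30 mm totals once target motion is included.
```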

  10. Impact of Close and Positive Margins in Transoral Laser Microsurgery for Tis-T2 Glottic Cancer.

    Science.gov (United States)

    Fiz, Ivana; Mazzola, Francesco; Fiz, Francesco; Marchi, Filippo; Filauro, Marta; Paderno, Alberto; Parrinello, Giampiero; Piazza, Cesare; Peretti, Giorgio

    2017-01-01

    Transoral laser microsurgery (TLM) represents one of the most effective treatment strategies for Tis-T2 glottic squamous cell carcinomas (SCC). The prognostic influence of close/positive margins is still debated, and the role of narrow band imaging (NBI) in their intraoperative definition has still to be validated on a large cohort of patients. This study analyzed the influence of margin status on recurrence-free survival (RFS) and disease-specific survival (DSS). We retrospectively studied 507 cases of pTis-T1b (Group A) and 127 cases of pT2 (Group B) glottic SCC. We identified the following margin status: negative (n = 232), close superficial (n = 79), close deep (CD) (n = 35), positive single superficial (n = 146), positive multiple superficial (n = 94), and positive deep (n = 48), and analyzed their impact on RFS and DSS. Close margins were defined by a tumor-margin distance <1 mm. Pre-TLM margins were defined by white light in 323 patients, whereas NBI was employed in 311 patients. In Group A, DSS and RFS were reduced in positive multiple superficial and positive deep margins (DSS = 96.1 and 97%, both p < 0.05; RFS = 72%, p < 0.001 and 75.8%, p < 0.01). In Group B, DSS was reduced in positive multiple superficial margins (82.4%, p < 0.05). RFS was reduced in positive single superficial, positive multiple superficial, and positive deep margins (62.5, 41.2, and 53.3%, p < 0.01). In the entire population, RFS was reduced in CD margins (77.1%, p < 0.05). Use of NBI led to improvement in RFS and DSS. The study indicates that close and positive single superficial margins do not affect DSS. By contrast, all types of margin positivity predict the occurrence of relapses, albeit with different likelihood, depending on stage/margin type. CD margins should be considered as a single risk factor. Use of NBI granted better intraoperative margin definition.

  11. Impact of Close and Positive Margins in Transoral Laser Microsurgery for Tis–T2 Glottic Cancer

    Directory of Open Access Journals (Sweden)

    Ivana Fiz

    2017-10-01

    Introduction: Transoral laser microsurgery (TLM) represents one of the most effective treatment strategies for Tis-T2 glottic squamous cell carcinomas (SCC). The prognostic influence of close/positive margins is still debated, and the role of narrow band imaging (NBI) in their intraoperative definition has still to be validated on a large cohort of patients. This study analyzed the influence of margin status on recurrence-free survival (RFS) and disease-specific survival (DSS). Methods: We retrospectively studied 507 cases of pTis-T1b (Group A) and 127 cases of pT2 (Group B) glottic SCC. We identified the following margin status: negative (n = 232), close superficial (n = 79), close deep (CD) (n = 35), positive single superficial (n = 146), positive multiple superficial (n = 94), and positive deep (n = 48), and analyzed their impact on RFS and DSS. Close margins were defined by a tumor-margin distance <1 mm. Pre-TLM margins were defined by white light in 323 patients, whereas NBI was employed in 311 patients. Results: In Group A, DSS and RFS were reduced in positive multiple superficial and positive deep margins (DSS = 96.1 and 97%, both p < 0.05; RFS = 72%, p < 0.001 and 75.8%, p < 0.01). In Group B, DSS was reduced in positive multiple superficial margins (82.4%, p < 0.05). RFS was reduced in positive single superficial, positive multiple superficial, and positive deep margins (62.5, 41.2, and 53.3%, p < 0.01). In the entire population, RFS was reduced in CD margins (77.1%, p < 0.05). Use of NBI led to improvement in RFS and DSS. Conclusion: The study indicates that close and positive single superficial margins do not affect DSS. By contrast, all types of margin positivity predict the occurrence of relapses, albeit with different likelihood, depending on stage/margin type. CD margins should be considered as a single risk factor. Use of NBI granted better intraoperative margin definition.

  12. The marginal cost of public funds

    DEFF Research Database (Denmark)

    Kleven, Henrik Jacobsen; Kreiner, Claus Thustrup

    2006-01-01

    This paper extends the theory and measurement of the marginal cost of public funds (MCF) to account for labor force participation responses. Our work is motivated by the emerging consensus in the empirical literature that extensive (participation) responses are more important than intensive (hours of work) responses. In the modelling of extensive responses, we argue that it is crucial to account for the presence of non-convexities created by fixed work costs. In a non-convex framework, tax and transfer reforms give rise to discrete participation responses generating first-order effects...

  13. Intacs for early pellucid marginal degeneration.

    Science.gov (United States)

    Kymionis, George D; Aslanides, Ioannis M; Siganos, Charalambos S; Pallikaris, Ioannis G

    2004-01-01

    A 42-year-old man had Intacs (Addition Technology Inc.) implantation for early pellucid marginal degeneration (PMD). Two Intacs segments (0.45 mm thickness) were inserted uneventfully in the fashion typically used for low myopia correction (nasal-temporal). Eleven months after the procedure, the uncorrected visual acuity was 20/200, compared with counting fingers preoperatively, while the best spectacle-corrected visual acuity improved to 20/25 from 20/50. Corneal topographic pattern also improved. Although the results are encouraging, concern still exists regarding the long-term effect of this approach for the management of patients with PMD.

  14. TENNESSEE WILLIAMS E O TEATRO MARGINAL GAY

    Directory of Open Access Journals (Sweden)

    Adriana Falqueto Lemos

    2014-06-01

    The work developed in this text aims to read dramatist Tennessee Williams through the two-scene play "E Contar Tristes Histórias das Mortes das Bonecas," published in Brazil in the book "Mister Paradise e outras peças em um ato" (2011). The intention is to reflect upon one of his recurring themes, marginalization. The analysis is theoretically grounded in "Literatura e Sociedade" by Antonio Candido (2006), concerning the participation of society and authorship in a work of literature.

  15. TENNESSEE WILLIAMS E O TEATRO MARGINAL GAY

    Directory of Open Access Journals (Sweden)

    Adriana Falqueto Lemos

    2014-09-01

    The work developed in this text aims to read dramatist Tennessee Williams through the two-scene play "E Contar Tristes Histórias das Mortes das Bonecas," published in Brazil in the book "Mister Paradise e outras peças em um ato" (2011). The intention is to reflect upon one of his recurring themes, marginalization. The analysis is theoretically grounded in "Literatura e Sociedade" by Antonio Candido (2006), concerning the participation of society and authorship in a work of literature.

  16. Absenteeism, efficiency wages, and marginal taxes

    OpenAIRE

    Dale-Olsen, Harald

    2013-01-01

    In this paper, I test the argument that increased taxes on earnings correspond to increased incentives to shirk, thus causing an increase in the rate of worker absenteeism. After fixed job effects are taken into account, panel register data on prime-age Norwegian males who work full-time show that a higher marginal net-of-earnings-tax rate reduces the rate of absenteeism. When the net-of-tax rate is increased by 1.0 percent, absenteeism decreases by 0.3−0.5 percent. Injury-related absences ar...

  17. MAMMOGRAMS ANALYSIS USING SVM CLASSIFIER IN COMBINED TRANSFORMS DOMAIN

    Directory of Open Access Journals (Sweden)

    B.N. Prathibha

    2011-02-01

    Breast cancer is a primary cause of mortality and morbidity in women. Reports reveal that the earlier abnormalities are detected, the better the chances of survival. Digital mammograms are one of the most effective means for detecting possible breast anomalies at early stages. Digital mammograms supported by Computer Aided Diagnostic (CAD) systems help radiologists make reliable decisions. The proposed CAD system extracts wavelet features and spectral features for better classification of mammograms. A Support Vector Machine classifier is used to analyze 206 mammogram images from the MIAS database according to the severity of abnormality, i.e., benign and malignant. The proposed system gives 93.14% accuracy for discrimination between normal and malignant samples, 87.25% for normal and benign samples, and 89.22% for benign and malignant samples. The study reveals that features extracted in a hybrid transform domain with an SVM classifier prove to be a promising tool for the analysis of mammograms.

  18. Evaluation of LDA Ensembles Classifiers for Brain Computer Interface

    International Nuclear Information System (INIS)

    Arjona, Cristian; Pentácolo, José; Gareis, Iván; Atum, Yanina; Gentiletti, Gerardo; Acevedo, Rubén; Rufiner, Leonardo

    2011-01-01

    The Brain Computer Interface (BCI) translates brain activity into computer commands. To increase BCI performance in decoding user intentions, it is necessary to improve the feature extraction and classification techniques. In this article the performance of an ensemble of three linear discriminant analysis (LDA) classifiers is studied. A system based on an ensemble can theoretically achieve better classification results than its individual members, depending on the algorithm used to generate the individual classifiers and the procedure for combining their outputs. Classic ensemble algorithms such as bagging and boosting are discussed here. For the BCI application, it was concluded that the results generated using ER and AUC as performance indices do not give enough information to establish which configuration is better.
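The bagging scheme discussed above can be sketched as follows: each LDA member is trained on a bootstrap resample, and the members' outputs are majority-voted. The data and ensemble size below are invented stand-ins for the BCI features.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy two-class feature vectors standing in for extracted EEG features.
X = np.vstack([rng.normal(0.0, 1.0, (40, 4)), rng.normal(1.5, 1.0, (40, 4))])
y = np.array([0] * 40 + [1] * 40)

def fit_lda(X, y):
    """Two-class Fisher LDA: direction Sw^-1 (m1 - m0), midpoint threshold."""
    m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)  # pooled within-class scatter
    w = np.linalg.solve(Sw, m1 - m0)
    b = w @ (m0 + m1) / 2.0
    return w, b

def predict(model, X):
    w, b = model
    return (X @ w > b).astype(int)

# Bagging: each LDA sees a bootstrap resample; outputs are majority-voted.
models = []
for _ in range(11):
    idx = rng.integers(0, len(y), len(y))  # bootstrap sample of indices
    models.append(fit_lda(X[idx], y[idx]))

votes = np.mean([predict(m, X) for m in models], axis=0)
pred = (votes > 0.5).astype(int)
acc = float((pred == y).mean())
```

Boosting would instead reweight training samples between rounds; both schemes combine the same base LDA learner.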

  19. The three-dimensional origin of the classifying algebra

    International Nuclear Information System (INIS)

    Fuchs, Juergen; Schweigert, Christoph; Stigner, Carl

    2010-01-01

    It is known that reflection coefficients for bulk fields of a rational conformal field theory in the presence of an elementary boundary condition can be obtained as representation matrices of irreducible representations of the classifying algebra, a semisimple commutative associative complex algebra. We show how this algebra arises naturally from the three-dimensional geometry of factorization of correlators of bulk fields on the disk. This allows us to derive explicit expressions for the structure constants of the classifying algebra as invariants of ribbon graphs in the three-manifold S^2 × S^1. Our result unravels a precise relation between intertwiners of the action of the mapping class group on spaces of conformal blocks and boundary conditions in rational conformal field theories.

  20. Machine learning classifiers and fMRI: a tutorial overview.

    Science.gov (United States)

    Pereira, Francisco; Mitchell, Tom; Botvinick, Matthew

    2009-03-01

    Interpreting brain image experiments requires analysis of complex, multivariate data. In recent years, one analysis approach that has grown in popularity is the use of machine learning algorithms to train classifiers to decode stimuli, mental states, behaviours and other variables of interest from fMRI data and thereby show the data contain information about them. In this tutorial overview we review some of the key choices faced in using this approach as well as how to derive statistically significant results, illustrating each point from a case study. Furthermore, we show how, in addition to answering the question of 'is there information about a variable of interest' (pattern discrimination), classifiers can be used to tackle other classes of question, namely 'where is the information' (pattern localization) and 'how is that information encoded' (pattern characterization).

  1. Lung Nodule Detection in CT Images using Neuro Fuzzy Classifier

    Directory of Open Access Journals (Sweden)

    M. Usman Akram

    2013-07-01

    Automated lung cancer detection using computer aided diagnosis (CAD) is an important area in clinical applications. Because manual nodule detection is very time consuming and costly, computerized systems can be helpful for this purpose. In this paper, we propose a computerized system for lung nodule detection in CT scan images. The automated system consists of two stages: (i) lung segmentation and enhancement, and (ii) feature extraction and classification. The segmentation process separates lung tissue from the rest of the image, and only the lung tissues under examination are considered as candidate regions for detecting malignant nodules. A feature vector for possible abnormal regions is calculated, and regions are classified using a neuro-fuzzy classifier. It is a fully automatic system that does not require any manual intervention, and experimental results show its validity.

  2. A Bayesian Classifier for X-Ray Pulsars Recognition

    Directory of Open Access Journals (Sweden)

    Hao Liang

    2016-01-01

    Recognition of X-ray pulsars is important for spacecraft attitude determination by X-ray Pulsar Navigation (XPNAV). Using the nonhomogeneous Poisson model of the received photons and the minimum recognition error criterion, a classifier based on Bayes' theorem is proposed. For recognition of X-ray pulsars with unknown Doppler frequency and initial phase, the features of each X-ray pulsar are extracted and the unknown parameters are estimated using the Maximum Likelihood (ML) method. In addition, a method to recognize unknown X-ray pulsars or X-ray disturbances is proposed. Simulation results confirm the validity of the proposed Bayesian classifier.
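The core of such a classifier can be sketched as Poisson maximum likelihood over binned photon counts: each candidate pulsar contributes a template of expected counts per phase bin, and the class with the highest log-likelihood wins. The templates below are invented, and the Doppler/phase estimation step of the paper is omitted.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical pulse-profile templates: expected photon counts per phase bin.
templates = {
    "PSR_A": np.array([2.0, 2.0, 30.0, 2.0, 2.0, 2.0, 2.0, 2.0]),
    "PSR_B": np.array([2.0, 2.0, 2.0, 2.0, 20.0, 20.0, 2.0, 2.0]),
}

def classify(counts, templates):
    """Maximum-likelihood class under an inhomogeneous Poisson model:
    log L = sum_b counts_b * log(lam_b) - lam_b  (count-factorial terms drop
    out of the comparison, since they are identical across classes)."""
    scores = {name: float(np.sum(counts * np.log(lam) - lam))
              for name, lam in templates.items()}
    return max(scores, key=scores.get)

obs = rng.poisson(templates["PSR_A"])  # photons drawn from pulsar A's profile
label = classify(obs, templates)
```

Rejecting unknown sources, as the paper proposes, would amount to thresholding the winning log-likelihood rather than always accepting the argmax.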

  3. Evaluation of safety margin of packaging for radioactive materials transport during a severe fire

    International Nuclear Information System (INIS)

    Gilles, P.; Ringot, C.; Warniez, P.; Grall, L.; Perrot, J.

    1986-06-01

    International regulations on radioactive material transport ensure a high level of safety, achieved through packaging design adapted to the potential risk. Fire is an important accident to consider because the probability of a fire occurring at temperatures or durations exceeding the conditions applied to type B packaging (800 deg C, 1/2 hr) is not negligible, particularly for air or maritime transport. Safety margins are studied by computation and experimental tests. This report presents results obtained for different types of packagings. Results show a large safety margin.

  4. Wavelet classifier used for diagnosing shock absorbers in cars

    Directory of Open Access Journals (Sweden)

    Janusz GARDULSKI

    2007-01-01

    The paper discusses some commonly used methods of hydraulic absorber testing and describes their disadvantages. A vibro-acoustic method is presented and recommended for practical use on existing test rigs. The method is based on continuous wavelet analysis combined with a neural classifier, a 25-neuron, one-way, three-layer back-propagation network. The analysis satisfies the intended aim.

  5. Classified installations for environmental protection subject to declaration. Tome 2

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Legislation concerning classified installations governs most industries and dangerous or polluting activities. This legislation aims to prevent risks and harmful effects originating from an installation: air pollution, water pollution, noise, wastes produced by installations, and even adverse aesthetic effects. Polluting or dangerous activities are defined in a list called the nomenclature, which subjects installations to a regime of declaration or authorization. Technical regulations issued by the secretary of state for the environment are listed in tome 2.

  6. Classified study and clinical value of the phase imaging features

    International Nuclear Information System (INIS)

    Dang Yaping; Ma Aiqun; Zheng Xiaopu; Yang Aimin; Xiao Jiang; Gao Xinyao

    2000-01-01

    445 patients with various heart diseases were examined by gated cardiac blood pool imaging, and the phase images were classified. The relationship between the seven types and left ventricular function indices, clinical heart function, different heart diseases, and the electrocardiograph was studied. The results showed that the phase image classification corresponds to clinical heart function: it can visually, directly, and accurately indicate clinical heart function and can be used to aid the diagnosis of heart disease.

  7. Evaluating Classifiers in Detecting 419 Scams in Bilingual Cybercriminal Communities

    OpenAIRE

    Mbaziira, Alex V.; Abozinadah, Ehab; Jones Jr, James H.

    2015-01-01

    Incidents of organized cybercrime are rising because criminals are reaping high financial rewards while incurring low costs to commit crime. As the digital landscape broadens to accommodate more internet-enabled devices and technologies like social media, more cybercriminals who are not native English speakers are invading cyberspace to cash in on quick exploits. In this paper we evaluate the performance of three machine learning classifiers in detecting 419 scams in a bilingual Nigerian c...

  8. Efficient Multi-Concept Visual Classifier Adaptation in Changing Environments

    Science.gov (United States)

    2016-09-01

    sets of images, hand annotated by humans with region boundary outlines followed by label assignment. This annotation is time consuming, and ... performed as a necessary but time-consuming step to train supervised classifiers. Unsupervised or self-supervised approaches have been used to ... time-consuming labeling process. However, the lack of human supervision has limited most of this work to binary classification (e.g., traversability

  9. Classifying apples by the means of fluorescence imaging

    OpenAIRE

    Codrea, Marius C.; Nevalainen, Olli S.; Tyystjärvi, Esa; VAN DE VEN, Martin; VALCKE, Roland

    2004-01-01

    Classification of harvested apples when predicting their storage potential is an important task. This paper describes how chlorophyll a fluorescence images, taken in blue light through a red filter, can be used to classify apples. In such an image, fluorescence appears as a relatively homogeneous area broken by a number of small non-fluorescing spots, corresponding to normal corky tissue patches, lenticels, and to damaged areas that lower the quality of the apple. The damaged regions appear mor...

  10. Building Road-Sign Classifiers Using a Trainable Similarity Measure

    Czech Academy of Sciences Publication Activity Database

    Paclík, P.; Novovičová, Jana; Duin, R.P.W.

    2006-01-01

    Roč. 7, č. 3 (2006), s. 309-321 ISSN 1524-9050 R&D Projects: GA AV ČR IAA2075302 EU Projects: European Commission(XE) 507752 - MUSCLE Institutional research plan: CEZ:AV0Z10750506 Keywords : classifier system design * road-sign classification * similarity data representation Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.434, year: 2006 http://www.ewh.ieee.org/tc/its/trans.html

  11. Classifying Floating Potential Measurement Unit Data Products as Science Data

    Science.gov (United States)

    Coffey, Victoria; Minow, Joseph

    2015-01-01

    We are Co-Investigators for the Floating Potential Measurement Unit (FPMU) on the International Space Station (ISS) and members of the FPMU operations and data analysis team. We are providing this memo for the purpose of classifying raw and processed FPMU data products and ancillary data as NASA science data with unrestricted, public availability in order to best support science uses of the data.

  12. Snoring classified: The Munich-Passau Snore Sound Corpus.

    Science.gov (United States)

    Janott, Christoph; Schmitt, Maximilian; Zhang, Yue; Qian, Kun; Pandit, Vedhas; Zhang, Zixing; Heiser, Clemens; Hohenhorst, Winfried; Herzog, Michael; Hemmert, Werner; Schuller, Björn

    2018-03-01

    Snoring can be excited in different locations within the upper airways during sleep. It was hypothesised that the excitation locations are correlated with distinct acoustic characteristics of the snoring noise. To verify this hypothesis, a database of snore sounds is developed, labelled with the location of sound excitation. Video and audio recordings taken during drug induced sleep endoscopy (DISE) examinations from three medical centres have been semi-automatically screened for snore events, which subsequently have been classified by ENT experts into four classes based on the VOTE classification. The resulting dataset containing 828 snore events from 219 subjects has been split into Train, Development, and Test sets. An SVM classifier has been trained using low level descriptors (LLDs) related to energy, spectral features, mel frequency cepstral coefficients (MFCC), formants, voicing, harmonic-to-noise ratio (HNR), spectral harmonicity, pitch, and microprosodic features. An unweighted average recall (UAR) of 55.8% could be achieved using the full set of LLDs including formants. Best performing subset is the MFCC-related set of LLDs. A strong difference in performance could be observed between the permutations of train, development, and test partition, which may be caused by the relatively low number of subjects included in the smaller classes of the strongly unbalanced data set. A database of snoring sounds is presented which are classified according to their sound excitation location based on objective criteria and verifiable video material. With the database, it could be demonstrated that machine classifiers can distinguish different excitation location of snoring sounds in the upper airway based on acoustic parameters. Copyright © 2018 Elsevier Ltd. All rights reserved.
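The unweighted average recall (UAR) reported above is simply the mean of per-class recalls, so every class counts equally no matter how unbalanced the corpus is. A minimal sketch (labels invented, loosely VOTE-style):

```python
def unweighted_average_recall(y_true, y_pred):
    """Mean of per-class recalls; each class weighs equally regardless of
    how many snore events it contains."""
    classes = sorted(set(y_true))
    recalls = []
    for c in classes:
        idx = [i for i, t in enumerate(y_true) if t == c]
        recalls.append(sum(y_pred[i] == c for i in idx) / len(idx))
    return sum(recalls) / len(recalls)

# A majority-class predictor looks good on plain accuracy but UAR
# exposes its total failure on the rare class.
y_true = ["V"] * 8 + ["O"] * 2
y_pred = ["V"] * 8 + ["V"] * 2

acc = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)  # 0.8
uar = unweighted_average_recall(y_true, y_pred)                  # 0.5
```

This is why UAR, rather than accuracy, is the headline figure for a strongly unbalanced four-class dataset like this one.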

  13. Young module multiplicities and classifying the indecomposable Young permutation modules

    OpenAIRE

    Gill, Christopher C.

    2012-01-01

    We study the multiplicities of Young modules as direct summands of permutation modules on cosets of Young subgroups. Such multiplicities have become known as the p-Kostka numbers. We classify the indecomposable Young permutation modules, and, applying the Brauer construction for p-permutation modules, we give some new reductions for p-Kostka numbers. In particular we prove that p-Kostka numbers are preserved under multiplying partitions by p, and strengthen a known reduction given by Henke, c...

  14. BIOPHARMACEUTICS CLASSIFICATION SYSTEM: A STRATEGIC TOOL FOR CLASSIFYING DRUG SUBSTANCES

    OpenAIRE

    Rohilla Seema; Rohilla Ankur; Marwaha RK; Nanda Arun

    2011-01-01

    The biopharmaceutical classification system (BCS) is a scientific approach for classifying drug substances based on their dose/solubility ratio and intestinal permeability. The BCS has been developed to allow prediction of in vivo pharmacokinetic performance of drug products from measurements of permeability and solubility. Moreover, the drugs can be categorized into four classes of BCS on the basis of permeability and solubility namely; high permeability high solubility, high permeability lo...

  15. Self-organizing map classifier for stressed speech recognition

    Science.gov (United States)

    Partila, Pavol; Tovarek, Jaromir; Voznak, Miroslav

    2016-05-01

    This paper presents a method for detecting speech under stress using Self-Organizing Maps. Most people exposed to stressful situations cannot respond adequately to stimuli. The army, police, and fire departments operate in environments with an increased number of stressful situations. Personnel in action are directed by a control center, and control commands should be adapted to the psychological state of the person in action. It is known that psychological changes in the human body are also reflected physiologically, which means that stress affects speech. Therefore, a system for recognizing stress in speech is required by the security forces. One possible classifier, popular for its flexibility, is the self-organizing map (SOM), a type of artificial neural network. Flexibility here means the classifier's independence from the character of the input data, a feature suitable for speech processing. Human stress can be seen as a kind of emotional state. Mel-frequency cepstral coefficients, LPC coefficients, and prosody features were selected as input data for their sensitivity to emotional changes. The parameters were calculated on speech recordings divided into two classes, stressed-state recordings and normal-state recordings. The contribution of the experiment is a method using a SOM classifier for stressed speech detection. Results demonstrated the advantage of this method, namely its input-data flexibility.
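A minimal self-organizing map can be sketched in a few lines: each input pulls its best-matching unit, and its map neighbours, toward itself, with learning rate and neighbourhood shrinking over time. The two toy clusters below stand in for "normal" and "stressed" feature frames; they are not the paper's MFCC/LPC/prosody features.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two feature clusters standing in for normal vs. stressed speech frames.
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(2.0, 0.3, (50, 2))])

W = rng.random((4, 2))  # 1-D map of 4 units, random initial weights

def bmu(W, x):
    """Index of the best-matching unit (closest weight vector)."""
    return int(np.argmin(((W - x) ** 2).sum(axis=1)))

epochs = 30
for epoch in range(epochs):
    lr = 0.5 * (1 - epoch / epochs)              # decaying learning rate
    radius = max(1.0, 2.0 * (1 - epoch / epochs))  # shrinking neighbourhood
    for x in X[rng.permutation(len(X))]:
        b = bmu(W, x)
        for j in range(len(W)):
            h = np.exp(-((j - b) ** 2) / (2 * radius ** 2))
            W[j] += lr * h * (x - W[j])          # pull unit j toward x

# After training, the two clusters map to different units.
u_normal = bmu(W, np.array([0.0, 0.0]))
u_stress = bmu(W, np.array([2.0, 2.0]))
```

Classification then reduces to labelling map units by the class of the training frames they attract.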

  16. Deconstructing Cross-Entropy for Probabilistic Binary Classifiers

    Directory of Open Access Journals (Sweden)

    Daniel Ramos

    2018-03-01

    In this work, we analyze the cross-entropy function, widely used in classifiers both as a performance measure and as an optimization objective. We contextualize cross-entropy in the light of Bayesian decision theory, the formal probabilistic framework for making decisions, and we thoroughly analyze its motivation, meaning, and interpretation from an information-theoretical point of view. This article presents several contributions. First, we explicitly analyze the contributions to cross-entropy of (i) prior knowledge and (ii) the value of the features in the form of a likelihood ratio. Second, we introduce a decomposition of cross-entropy into two components, discrimination and calibration. This decomposition enables the measurement of different performance aspects of a classifier in a more precise way, and justifies previously reported strategies for obtaining reliable probabilities by calibrating the output of a discriminating classifier. Third, we give different information-theoretical interpretations of cross-entropy, which can be useful in different application scenarios and which are related to the concept of reference probabilities. Fourth, we present an analysis tool, the Empirical Cross-Entropy (ECE) plot, a compact representation of cross-entropy and its aforementioned decomposition. We show the power of ECE plots, compared with other classical performance representations, in two diverse experimental examples: a speaker verification system, and a forensic case involving glass findings.
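    The separation of prior knowledge from the likelihood ratio rests on Bayes' rule in log-odds form: posterior log-odds = prior log-odds + log-likelihood ratio. A minimal sketch of that combination and of the cross-entropy measure itself (function names are illustrative, not the authors' ECE tooling):

    ```python
    import math

    def posterior_from_llr(llr, prior):
        """Bayes' rule in log-odds form: combine a prior probability with a
        log-likelihood ratio to obtain a posterior probability."""
        log_odds = llr + math.log(prior / (1.0 - prior))
        return 1.0 / (1.0 + math.exp(-log_odds))

    def cross_entropy(labels, posteriors):
        """Average binary cross-entropy in bits; labels are 0/1."""
        return -sum(math.log2(p if y == 1 else 1.0 - p)
                    for y, p in zip(labels, posteriors)) / len(labels)
    ```

    A calibrated but uninformative classifier (always outputting the prior) attains exactly the prior entropy; cross-entropy below that baseline indicates the features carry useful, well-calibrated information.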

  17. General and Local: Averaged k-Dependence Bayesian Classifiers

    Directory of Open Access Journals (Sweden)

    Limin Wang

    2015-06-01

    The inference of a general Bayesian network has been shown to be an NP-hard problem, even for approximate solutions. Although the k-dependence Bayesian (KDB) classifier can be constructed at arbitrary points (values of k) along the attribute-dependence spectrum, it cannot identify changes in interdependencies when attributes take different values. Local KDB, which learns in the framework of KDB, is proposed in this study to describe the local dependencies implied by each test instance. Based on an analysis of functional dependencies, substitution-elimination resolution, a new type of semi-naive Bayesian operation, is proposed to substitute or eliminate generalizations, achieving accurate estimation of the conditional probability distribution while reducing computational complexity. The final classifier, the averaged k-dependence Bayesian (AKDB) classifier, averages the outputs of KDB and local KDB. Experimental results on the University of California Irvine (UCI) repository of machine learning databases showed that AKDB has significant advantages in zero-one loss and bias relative to naive Bayes (NB), tree-augmented naive Bayes (TAN), averaged one-dependence estimators (AODE), and KDB. Moreover, KDB and local KDB show mutually complementary characteristics with respect to variance.
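    KDB with k = 0 reduces to naive Bayes, the simplest point on the attribute-dependence spectrum. A minimal categorical naive Bayes with Laplace smoothing illustrates that baseline (an illustrative sketch, not the paper's AKDB implementation):

    ```python
    import math
    from collections import Counter, defaultdict

    def fit_naive_bayes(X, y, alpha=1.0):
        """Fit a categorical naive Bayes model (KDB with k = 0) and return a
        predict function. alpha is the Laplace smoothing constant."""
        classes = Counter(y)
        n_features = len(X[0])
        # per-class, per-feature value counts
        counts = {c: [defaultdict(int) for _ in range(n_features)] for c in classes}
        values = [set() for _ in range(n_features)]  # observed feature alphabets
        for xi, yi in zip(X, y):
            for j, v in enumerate(xi):
                counts[yi][j][v] += 1
                values[j].add(v)

        def predict(x):
            best, best_lp = None, -math.inf
            for c, nc in classes.items():
                lp = math.log(nc / len(y))  # log prior
                for j, v in enumerate(x):   # independent (k = 0) attributes
                    lp += math.log((counts[c][j][v] + alpha)
                                   / (nc + alpha * len(values[j])))
                if lp > best_lp:
                    best, best_lp = c, lp
            return best

        return predict
    ```

    KDB generalizes this by letting each attribute condition on up to k other attributes in addition to the class, trading bias for variance as k grows.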

  18. Evaluation of Polarimetric SAR Decomposition for Classifying Wetland Vegetation Types

    Directory of Open Access Journals (Sweden)

    Sang-Hoon Hong

    2015-07-01

    The Florida Everglades is the largest subtropical wetland system in the United States and, like subtropical and tropical wetlands elsewhere, has been threatened by severe environmental stresses. Monitoring such wetlands is very important for informing management about the status of these fragile ecosystems. This study examines the applicability of TerraSAR-X quadruple polarimetric (quad-pol) synthetic aperture radar (PolSAR) data for classifying wetland vegetation in the Everglades. We processed the quad-pol data using the Hong & Wdowinski four-component decomposition, which accounts for double-bounce scattering in the cross-polarization signal. The calculated decomposition images consist of four scattering mechanisms (single, co-pol double, cross-pol double, and volume scattering). We applied an object-oriented image analysis approach to classify vegetation types from the decomposition results, and used a high-resolution multispectral optical RapidEye image to compare statistics and classification results with the SAR observations. The classification accuracy was higher than 85%, suggesting that the TerraSAR-X quad-pol SAR signal has high potential for distinguishing different vegetation types. Scattering components from the SAR acquisition were particularly advantageous for classifying mangroves along tidal channels. We conclude that the typical scattering behaviors obtained from model-based decomposition are useful for discriminating among wetland vegetation types.
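    The Hong & Wdowinski four-component decomposition itself is beyond the scope of this abstract, but the classic Pauli decomposition, a simpler standard stand-in, shows how per-pixel scattering powers are derived from the quad-pol channels: surface and double-bounce targets separate by the relative phase of the HH and VV returns.

    ```python
    import numpy as np

    def pauli_decomposition(hh, hv, vv):
        """Classic Pauli decomposition of a quad-pol scattering matrix into
        surface (odd-bounce), double-bounce, and volume power components.
        Inputs are complex channel values (scalars or arrays)."""
        surface = np.abs(hh + vv) ** 2 / 2  # odd-bounce: HH and VV in phase
        double = np.abs(hh - vv) ** 2 / 2   # even-bounce: HH and VV out of phase
        volume = 2 * np.abs(hv) ** 2        # depolarized (e.g. canopy) scattering
        return surface, double, volume
    ```

    Model-based decompositions like the four-component method used in the paper refine this idea with physical scattering models, adding a term for double-bounce energy appearing in the cross-polarized channel.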

  19. A Novel Cascade Classifier for Automatic Microcalcification Detection.

    Directory of Open Access Journals (Sweden)

    Seung Yeon Shin

    In this paper, we present a novel cascaded classification framework for automatic detection of individual microcalcifications (μCs) and clusters of μCs. Our framework comprises three classification stages: (i) a random forest (RF) classifier operating on simple features that capture the second-order local structure of individual μCs, which efficiently eliminates non-μC pixels in the target mammogram; (ii) a more complex discriminative restricted Boltzmann machine (DRBM) classifier for the μC candidates determined in the RF stage, which automatically learns the detailed morphology of μC appearances for improved discriminative power; and (iii) a detector that identifies clusters of μCs from the individual μC detection results, using two different criteria. From the two-stage RF-DRBM classifier, we are able to distinguish μCs using explicitly computed features, as well as learn implicit features that further discriminate between confusing cases. Experimental evaluation is conducted on the original Mammographic Image Analysis Society (MIAS) and mini-MIAS databases, as well as our own Seoul National University Bundang Hospital digital mammographic database. The proposed method is shown to outperform comparable methods in terms of receiver operating characteristic (ROC) and precision-recall curves for detection of individual μCs, and the free-response receiver operating characteristic (FROC) curve for detection of clustered μCs.
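    The cascade pattern behind stages (i)-(iii), cheap filters first and expensive classifiers only on the survivors, can be sketched generically (an illustrative sketch with toy stage functions, not the paper's RF-DRBM code):

    ```python
    def cascade_classify(candidates, stages):
        """Run candidates through a cascade of (score_fn, threshold) stages.
        Each stage rejects candidates scoring below its threshold; only
        survivors of every stage are reported as detections. Cheap stages
        go first, so later, costlier stages see far fewer candidates."""
        survivors = list(candidates)
        for score_fn, threshold in stages:
            survivors = [c for c in survivors if score_fn(c) >= threshold]
        return survivors
    ```

    In the paper's setting, the first stage function would be the fast RF score over simple local-structure features and the second the DRBM score over learned morphology, with the cluster detector applied to whatever survives both.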

  20. Patients on weaning trials classified with support vector machines

    International Nuclear Information System (INIS)

    Garde, Ainara; Caminal, Pere; Giraldo, Beatriz F; Schroeder, Rico; Voss, Andreas; Benito, Salvador

    2010-01-01

    The process of discontinuing mechanical ventilation is called weaning and is one of the most challenging problems in intensive care. Both an unnecessary delay in discontinuation and a premature weaning trial are undesirable. This study aims to characterize the respiratory pattern through features that permit identification of patients' conditions in weaning trials. Three groups of patients were considered: 94 patients with successful weaning trials, who could maintain spontaneous breathing after 48 h (GSucc); 39 patients who failed the weaning trial (GFail); and 21 patients who had successful weaning trials but required reintubation within 48 h (GRein). Patients are characterized by their cardiorespiratory interactions, described by joint symbolic dynamics (JSD) applied to the cardiac interbeat and breath durations. The features most discriminating among the groups (GSucc, GFail and GRein) are identified by support vector machines (SVMs). The SVM-based feature selection algorithm has an accuracy of 81% in classifying GSucc versus the rest of the patients, 83% in classifying GRein versus GSucc patients, and 81% in classifying GRein versus the rest of the patients. Moreover, a good balance between sensitivity and specificity is achieved in all classifications.
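    A linear SVM of the kind underlying such feature selection can be trained with the Pegasos stochastic subgradient method. This NumPy sketch is illustrative only and is run here on synthetic data, not the clinical features from the study:

    ```python
    import numpy as np

    def train_linear_svm(X, y, lam=0.01, epochs=100, seed=0):
        """Pegasos-style stochastic subgradient training of a linear SVM.
        X is (n, d); y holds labels in {-1, +1}. Returns (weights, bias)."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w, b, t = np.zeros(d), 0.0, 0
        for _ in range(epochs):
            for i in rng.permutation(n):
                t += 1
                eta = 1.0 / (lam * t)                    # decaying step size
                if y[i] * (X[i] @ w + b) < 1:            # margin violated
                    w = (1 - eta * lam) * w + eta * y[i] * X[i]
                    b += eta * y[i]
                else:                                    # only regularize
                    w = (1 - eta * lam) * w
        return w, b
    ```

    Predictions are `sign(X @ w + b)`; in a feature selection loop, candidate feature subsets are scored by the cross-validated accuracy of such a classifier.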