WorldWideScience

Sample records for relevant classification codes

  1. Blind Signal Classification via Sparse Coding

    Science.gov (United States)

    2016-04-10

    Blind Signal Classification via Sparse Coding. Youngjune Gwon, MIT Lincoln Laboratory (gyj@ll.mit.edu); Siamak Dastangoo, MIT Lincoln Laboratory (sia...) ... achieve blind signal classification with no prior knowledge about signals (e.g., MCS, pulse shaping) in an arbitrary RF channel. Since modulated RF ... classification method. Our results indicate that we can separate different classes of digitally modulated signals from blind sampling with 70.3% recall and 24.6

  2. Classification of first branchial cleft anomalies: is it clinically relevant ...

    African Journals Online (AJOL)

    Background: There are three classification systems for first branchial cleft anomalies currently in use. The Arnot, Work and Olsen classifications describe these lesions on the basis of morphology, tissue of origin and clinical appearance. However, the clinical relevance of these classifications is debated, as they may not be ...

  3. Classification of perimenstrual headache: clinical relevance.

    Science.gov (United States)

    MacGregor, E Anne

    2012-10-01

    Although more than 50% of women with migraine report an association between migraine and menstruation, menstruation has generally been considered to be no more than one of a variety of different migraine triggers. In 2004, the second edition of the International Classification of Headache Disorders introduced specific diagnostic criteria for menstrual migraine. Results from research undertaken subsequently lend support to the clinical impression that menstrual migraine should be seen as a distinct clinical entity. This paper reviews the recent research and provides specific recommendations for consideration in future editions of the classification.

  4. EXTRACELLULAR VESICLES: CLASSIFICATION, FUNCTIONS AND CLINICAL RELEVANCE

    Directory of Open Access Journals (Sweden)

    A. V. Oberemko

    2014-12-01

    This review presents a generalized definition of vesicles as bilayer extracellular organelles of all cellular forms of life, prokaryotic as well as eukaryotic. The structure and composition of extracellular vesicles, the history of their research, their nomenclature, and their impact on life processes in health and disease are discussed. Vesicles may also be useful as clinical instruments, as sources of biomarkers, and they are promising as biotechnological drugs. However, many questions in this area remain unresolved and need to be addressed in the future. From the point of view of practical health care, the most interesting direction is the study of the effects of exosomes and microvesicles on the development and progression of particular diseases, and the possibility of modulating the pathological process by means of extracellular vesicles of a particular type acting as an active ingredient. Further elucidation of the role and importance of exosomes for surrounding cells, tissues, and organs at the molecular level, and of the prospects for using extracellular vesicles as biomarkers of disease, also remains relevant.

  5. A systematic literature review of automated clinical coding and classification systems.

    Science.gov (United States)

    Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.

  6. On the relevance of spectral features for instrument classification

    DEFF Research Database (Denmark)

    Nielsen, Andreas Brinch; Sigurdsson, Sigurdur; Hansen, Lars Kai

    2007-01-01

    Automatic knowledge extraction from music signals is a key component of most music organization and music information retrieval systems. In this paper, we consider the problem of instrument modelling and instrument classification from raw audio data. Existing systems for automatic instrument classification normally operate on a relatively large number of features, among which those related to the spectrum of the audio signal are particularly relevant. In this paper, we confront two different models of the spectral characterization of musical instruments. The first assumes a constant envelope...

  7. T-ray relevant frequencies for osteosarcoma classification

    Science.gov (United States)

    Withayachumnankul, W.; Ferguson, B.; Rainsford, T.; Findlay, D.; Mickan, S. P.; Abbott, D.

    2006-01-01

    We investigate the classification of the T-ray response of normal human bone cells and human osteosarcoma cells, grown in culture. Given the magnitude and phase responses within a reliable spectral range as features for input vectors, a trained support vector machine can correctly classify the two cell types to some extent. Performance of the support vector machine is degraded by the curse of dimensionality, resulting from the comparatively large number of features in the input vectors. Feature subset selection methods are used to select only an optimal number of relevant features for input. As a result, an improvement in generalization performance is attainable, and the selected frequencies can be used to further describe the different mechanisms by which the cells respond to T-rays. We demonstrate a consistent classification accuracy of 89.6% while only one fifth of the original features is retained in the data set.
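
As a rough illustration of the pipeline this abstract describes (feature subset selection followed by a support vector machine), here is a minimal scikit-learn sketch; the synthetic data, the mutual-information selector, and all parameter values are illustrative assumptions, not the authors' setup.

```python
# Minimal sketch: select a subset of informative spectral features, then
# train an SVM. Data and parameter choices are illustrative placeholders.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 200))    # 120 cell samples x 200 spectral features
y = rng.integers(0, 2, size=120)   # 0 = normal bone cell, 1 = osteosarcoma

# Keep roughly one fifth of the features, mirroring the reduction above.
clf = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=40),
    SVC(kernel="rbf", C=1.0),
)
print(cross_val_score(clf, X, y, cv=5).mean())
```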

  8. Breathing (and Coding?) a Bit Easier: Changes to International Classification of Disease Coding for Pulmonary Hypertension.

    Science.gov (United States)

    Mathai, Stephen C; Mathew, Sherin

    2018-04-20

    The International Classification of Disease (ICD) coding system is broadly utilized by healthcare providers, hospitals, healthcare payers, and governments to track health trends and statistics at the global, national, and local levels and to provide a reimbursement framework for medical care based upon diagnosis and severity of illness. The current iteration of the ICD system, ICD-10, was implemented in 2015. While many changes to the prior ICD-9 system were included in the ICD-10 system, the newer revision failed to adequately reflect advances in the clinical classification of certain diseases such as pulmonary hypertension (PH). Recently, a proposal to modify the ICD-10 codes for PH was considered and ultimately adopted for inclusion as updates to the ICD-10 coding system. While these revisions better reflect the current clinical classification of PH, further changes should be considered in the future to improve the accuracy and ease of coding for all forms of PH.

  9. Quantitative software-reliability analysis of computer codes relevant to nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.

    1981-12-01

    This report presents the results of the first year of an ongoing research program to determine the probability-of-failure characteristics of computer codes relevant to nuclear safety. An introduction to both qualitative and quantitative aspects of nuclear software is given. A mathematical framework is presented which will enable the a priori prediction of the probability-of-failure characteristics of a code given the proper specification of its properties. The framework consists of four parts: (1) a classification system for software errors and code failures; (2) probabilistic modeling for selected reliability characteristics; (3) multivariate regression analyses to establish predictive relationships among reliability characteristics and generic code property and development parameters; and (4) the associated information base. Preliminary data of the type needed to support the modeling and the predictions of this program are described. Illustrations of the use of the modeling are given, but the results so obtained, as well as all results of code failure probabilities presented herein, are based on data which at this point are preliminary, incomplete, and possibly non-representative of codes relevant to nuclear safety.

  10. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu; Ghanem, Bernard; Liu, Si; Xu, Changsheng; Ahuja, Narendra

    2013-01-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.
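
The coding step described here can be summarized in a single objective. Below is a plausible form of it, assuming a Frobenius-norm reconstruction term, an l1 penalty for sparsity, and a nuclear-norm penalty as the convex surrogate for low rank; the paper's exact formulation may differ.

```latex
% X: SIFT descriptors from one spatial neighborhood (as columns),
% D: codebook, C: their joint code matrix;
% \lambda and \gamma weight sparsity and low-rankness, respectively.
\min_{C}\ \lVert X - DC \rVert_F^2
        + \lambda \lVert C \rVert_1
        + \gamma \lVert C \rVert_*
```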

  11. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu

    2013-12-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.

  12. Improving the coding and classification of ambulance data through the application of International Classification of Disease 10th revision.

    Science.gov (United States)

    Cantwell, Kate; Morgans, Amee; Smith, Karen; Livingston, Michael; Dietze, Paul

    2014-02-01

    This paper aims to examine whether an adaptation of the International Classification of Disease (ICD) coding system can be applied retrospectively to final paramedic assessment data in an ambulance dataset, with a view to developing more fine-grained, clinically relevant case definitions than are available through point-of-call data. Over 1.2 million case records were extracted from the Ambulance Victoria data warehouse. Data fields included dispatch code, cause (CN) and final primary assessment (FPA). Each FPA was converted to an ICD-10-AM code using word matching or best fit. ICD-10-AM codes were then converted into Major Diagnostic Categories (MDC). CN was aligned with the ICD-10-AM codes for external cause of morbidity and mortality. The most accurate results were obtained when ICD-10-AM codes were assigned using information from both FPA and CN. Comparison of cases coded as unconscious at point-of-call with the associated paramedic assessment highlighted the extra clinical detail obtained when paramedic assessment data are used. Ambulance paramedic assessment data can be aligned with ICD-10-AM and MDC with relative ease, allowing retrospective coding of large datasets. Coding of ambulance data using ICD-10-AM allows comparison not only among ambulance service users but also with other population groups. WHAT IS KNOWN ABOUT THE TOPIC? There is no reliable and standard system for coding and categorising the paramedic assessment data contained in ambulance service databases. WHAT DOES THIS PAPER ADD? This study demonstrates that ambulance paramedic assessment data can be aligned with ICD-10-AM and MDC with relative ease, allowing retrospective coding of large datasets. Representation of ambulance case types using ICD-10-AM-coded information obtained after paramedic assessment is more fine-grained and clinically relevant than point-of-call data, which use caller information obtained before ambulance attendance. WHAT ARE THE IMPLICATIONS FOR PRACTITIONERS? This paper describes
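
A toy sketch of the word-matching step described above; the two-entry lookup table and the helper function are hypothetical, and real ICD-10-AM assignment works over the full code set with best-fit rules.

```python
# Toy illustration of converting a final primary assessment (FPA) string to
# an ICD-10-AM code by word matching. The lookup table is hypothetical.
ICD10_AM_LOOKUP = {
    frozenset({"chest", "pain"}): "R07.4",  # chest pain, unspecified
    frozenset({"unconscious"}): "R40.2",    # coma / unconsciousness
}

def fpa_to_icd10am(fpa: str) -> str | None:
    words = set(fpa.lower().split())
    # Return the first entry whose keywords all appear in the assessment.
    for keywords, code in ICD10_AM_LOOKUP.items():
        if keywords <= words:
            return code
    return None  # unmatched: would fall back to manual best-fit coding

print(fpa_to_icd10am("Pain chest non-cardiac"))  # -> R07.4
```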

  13. Conceptual-driven classification for coding advise in health insurance reimbursement.

    Science.gov (United States)

    Li, Sheng-Tun; Chen, Chih-Chuan; Huang, Fernando

    2011-01-01

    With the non-stop increases in medical treatment fees, the economic survival of a hospital in Taiwan relies on the reimbursements received from the Bureau of National Health Insurance, which in turn depend on the accuracy and completeness of the discharge summaries as well as the correctness of their International Classification of Diseases (ICD) codes. The purpose of this research is to reinforce the entire disease classification framework by supporting disease classification specialists in the coding process. This study developed an ICD code advisory system (ICD-AS) that performed knowledge discovery from discharge summaries and suggested ICD codes. Natural language processing and information retrieval techniques based on Zipf's Law were applied to process the content of discharge summaries, and fuzzy formal concept analysis was used to analyze and represent the relationships between the medical terms identified by MeSH. In addition, a certainty factor used as a reference during the coding process was calculated to account for uncertainty and strengthen the credibility of the outcome. Two sets of 360 and 2579 textual discharge summaries of patients suffering from cerebrovascular disease were processed to build ICD-AS and to evaluate the prediction performance. A number of experiments were conducted to investigate the impact of system parameters on accuracy and to compare the proposed model to traditional classification techniques, including linear-kernel support vector machines. The comparison results showed that the proposed system achieves better overall performance in terms of several measures. In addition, some useful implication rules were obtained, which improve comprehension of the field of cerebrovascular disease and give insights into the relationships between relevant medical terms. Our system contributes valuable guidance to disease classification specialists in the process of coding discharge summaries, which consequently brings benefits in

  14. 49 CFR 173.52 - Classification codes and compatibility groups of explosives.

    Science.gov (United States)

    2010-10-01

    ... Article containing both an explosive substance and a flammable liquid or gel: compatibility group J, classification codes 1.1J, 1.2J, 1.3J. Article containing both an... classification codes for substances and articles described in the first column of Table 1. Table 2 shows the... possible classification codes for explosives. Table 1—Classification Codes: Description of substances or...

  15. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

    Performance growth of single-core processors came to a halt in the past decade, and was re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with graphical processing units, have broadly enhanced parallelism. Several compilers have been updated to address the challenges of synchronization and threading. Appropriate program and algorithm classification gives software engineers a considerable advantage in finding opportunities for effective parallelization. In the present work we investigate current species for the classification of algorithms; related work on classification is discussed, along with a comparison of the issues that challenge classification. A set of algorithms is chosen that matches the structure of different issues and performs a given task. We tested these algorithms using existing automatic species-extraction tools along with the Bones compiler. We added functionalities to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants and mathematical functions. With this, we can retain significant data that is not captured by the original species of algorithms. We implemented the new theories in the tool, enabling automatic characterization of program code.

  16. On the classification of long non-coding RNAs

    KAUST Repository

    Ma, Lina

    2013-06-01

    Long non-coding RNAs (lncRNAs) have been found to perform various functions in a wide variety of important biological processes. To make interpretation of lncRNA functionality easier and to enable deep mining of these transcribed sequences, it is convenient to classify lncRNAs into different groups. Here, we summarize classification methods for lncRNAs according to their four major features, namely, genomic location and context, effect exerted on DNA sequences, mechanism of functioning, and targeting mechanism. In combination with the presently available function annotations, we explore potential relationships between the different classification categories, and generalize and compare the biological features of different lncRNAs within each category. Finally, we present our view on potential further studies. We believe that the classifications of lncRNAs indicated above are of fundamental importance for lncRNA studies, helpful for further investigation of specific lncRNAs, for formulation of new hypotheses based on the different features of lncRNAs, and for exploration of the underlying lncRNA functional mechanisms.

  17. Significance of perceptually relevant image decolorization for scene classification

    Science.gov (United States)

    Viswanathan, Sowmya; Divakaran, Govind; Soman, Kutti Padanyl

    2017-11-01

    Color images contain luminance and chrominance components representing the intensity and color information, respectively. The objective of this paper is to show the significance of incorporating chrominance information to the task of scene classification. An improved color-to-grayscale image conversion algorithm that effectively incorporates chrominance information is proposed using the color-to-gray structure similarity index and singular value decomposition to improve the perceptual quality of the converted grayscale images. The experimental results based on an image quality assessment for image decolorization and its success rate (using the Cadik and COLOR250 datasets) show that the proposed image decolorization technique performs better than eight existing benchmark algorithms for image decolorization. In the second part of the paper, the effectiveness of incorporating the chrominance component for scene classification tasks is demonstrated using a deep belief network-based image classification system developed using dense scale-invariant feature transforms. The amount of chrominance information incorporated into the proposed image decolorization technique is confirmed with the improvement to the overall scene classification accuracy. Moreover, the overall scene classification performance improved by combining the models obtained using the proposed method and conventional decolorization methods.

  18. Joint Concept Correlation and Feature-Concept Relevance Learning for Multilabel Classification.

    Science.gov (United States)

    Zhao, Xiaowei; Ma, Zhigang; Li, Zhi; Li, Zhihui

    2018-02-01

    In recent years, multilabel classification has attracted significant attention in multimedia annotation. However, most of the multilabel classification methods focus only on the inherent correlations existing among multiple labels and concepts and ignore the relevance between features and the target concepts. To obtain more robust multilabel classification results, we propose a new multilabel classification method aiming to capture the correlations among multiple concepts by leveraging hypergraph that is proved to be beneficial for relational learning. Moreover, we consider mining feature-concept relevance, which is often overlooked by many multilabel learning algorithms. To better show the feature-concept relevance, we impose a sparsity constraint on the proposed method. We compare the proposed method with several other multilabel classification methods and evaluate the classification performance by mean average precision on several data sets. The experimental results show that the proposed method outperforms the state-of-the-art methods.

  19. Relevance of the formal red meat classification system to the South ...

    African Journals Online (AJOL)

    Relevance of the formal red meat classification system to the South African ... to market information make them less willing to sell their animals through the formal market. ... Keywords: Communal farmers, marketing system, meat industry ...

  20. Pathohistological classification systems in gastric cancer: diagnostic relevance and prognostic value.

    Science.gov (United States)

    Berlth, Felix; Bollschweiler, Elfriede; Drebber, Uta; Hoelscher, Arnulf H; Moenig, Stefan

    2014-05-21

    Several pathohistological classification systems exist for the diagnosis of gastric cancer. Many studies have investigated the correlation between the pathohistological characteristics in gastric cancer and patient characteristics, disease specific criteria and overall outcome. It is still controversial as to which classification system imparts the most reliable information, and therefore, the choice of system may vary in clinical routine. In addition to the most common classification systems, such as the Laurén and the World Health Organization (WHO) classifications, other authors have tried to characterize and classify gastric cancer based on the microscopic morphology and in reference to the clinical outcome of the patients. In more than 50 years of systematic classification of the pathohistological characteristics of gastric cancer, there is no sole classification system that is consistently used worldwide in diagnostics and research. However, several national guidelines for the treatment of gastric cancer refer to the Laurén or the WHO classifications regarding therapeutic decision-making, which underlines the importance of a reliable classification system for gastric cancer. The latest results from gastric cancer studies indicate that it might be useful to integrate DNA- and RNA-based features of gastric cancer into the classification systems to establish prognostic relevance. This article reviews the diagnostic relevance and the prognostic value of different pathohistological classification systems in gastric cancer.

  1. 78 FR 21612 - Medical Device Classification Product Codes; Guidance for Industry and Food and Drug...

    Science.gov (United States)

    2013-04-11

    ... driving force for CDRH's internal organizational structure as well. These Panels were established with the... guidance represents the Agency's current thinking on medical device classification product codes. It does...

  2. Defining pediatric traumatic brain injury using International Classification of Diseases Version 10 Codes: a systematic review.

    Science.gov (United States)

    Chan, Vincy; Thurairajah, Pravheen; Colantonio, Angela

    2015-02-04

    Although healthcare administrative data are commonly used for traumatic brain injury (TBI) research, there is currently no consensus or consistency on the International Classification of Diseases Version 10 (ICD-10) codes used to define TBI among children and youth internationally. This study systematically reviewed the literature to explore the range of ICD-10 codes that are used to define TBI in this population. Identifying the range of ICD-10 codes used to define this population in administrative data is crucial, as it has implications for policy, resource allocation, planning of healthcare services, and prevention strategies. The databases MEDLINE, MEDLINE In-Process, Embase, PsychINFO, CINAHL, SPORTDiscus, and Cochrane Database of Systematic Reviews were systematically searched. Grey literature was searched using Grey Matters and Google. Reference lists of included articles were also searched for relevant studies. Two reviewers independently screened all titles and abstracts using pre-defined inclusion and exclusion criteria. A full-text screen was conducted on articles that met the first-screen inclusion criteria. All full-text articles that met the pre-defined inclusion criteria were included for analysis in this systematic review. A total of 1,326 publications were identified through the predetermined search strategy and 32 articles/reports met all eligibility criteria for inclusion in this review. Five articles specifically examined children and youth aged 19 years or under with TBI. ICD-10 case definitions ranged from the broad injury-to-the-head codes (ICD-10 S00 to S09) to concussion only (S06.0). There was overwhelming consensus on the inclusion of ICD-10 code S06, intracranial injury, while codes S00 (superficial injury of the head), S03 (dislocation, sprain, and strain of joints and ligaments of head), and S05 (injury of eye and orbit) were used only by articles that examined head injury, none of which specifically examined children and
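
To make the spread of case definitions concrete, here is a small helper contrasting the two extremes reported in the review: any injury-to-the-head code (S00 to S09) versus concussion only (S06.0). The function is purely illustrative and is not drawn from any of the reviewed studies.

```python
# Contrast the broadest and narrowest ICD-10 TBI case definitions found
# by the review: any S00-S09 code versus concussion (S06.0) only.
import re

def is_tbi(icd10_code: str, definition: str = "broad") -> bool:
    code = icd10_code.upper().strip()
    if definition == "broad":
        return re.fullmatch(r"S0[0-9](\.\d+)?", code) is not None
    if definition == "concussion_only":
        return code == "S06.0"
    raise ValueError(f"unknown definition: {definition}")

print(is_tbi("S06.5"))                     # True under the broad definition
print(is_tbi("S06.5", "concussion_only"))  # False
```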

  3. Re-enacting the relevance of the moral codes of the un-shrined ...

    African Journals Online (AJOL)

    They also maintain social justice, confer cultural identity and encourage patriotism. The work will identify the moral codes and analyze the methods of enforcing them. It will also delve into how the gods are appeased. The socio-cultural significance will be treated for a clearer view of the relevance of the moral codes of the ...

  4. On the classification of long non-coding RNAs

    KAUST Repository

    Ma, Lina; Bajic, Vladimir B.; Zhang, Zhang

    2013-01-01

    Long non-coding RNAs (lncRNAs) have been found to perform various functions in a wide variety of important biological processes. To make easier interpretation of lncRNA functionality and conduct deep mining on these transcribed sequences

  5. Classification of working processes to facilitate occupational hazard coding on industrial trawlers

    DEFF Research Database (Denmark)

    Jensen, Olaf C; Stage, Søren; Noer, Preben

    2003-01-01

    BACKGROUND: Commercial fishing is an extremely dangerous economic activity. In order to more accurately describe the risks involved, a specific injury coding based on the working process was developed. METHOD: Observation on six different types of vessels was conducted and allowed a description and a classification of the principal working processes on all kinds of vessels, and a detailed classification for industrial trawlers. In industrial trawling, fish are landed for processing purposes, for example, for the production of fish oil and fish meal. The classification was subsequently used to code the injuries reported to the Danish Maritime Authority over a 5-year period. RESULTS: On industrial trawlers, 374 of 394 (95%) injuries were captured by the classification. Setting out and hauling in the gear and nets were the processes with the most injuries and accounted for 58.9% of all injuries...

  6. Stratification and prognostic relevance of Jass’s molecular classification of colorectal cancer

    OpenAIRE

    Inti Zlobec; Michel P. Bihl; Anja Foerster; Alex Rufle; Luigi Terracciano; Alessandro Lugli

    2012-01-01

    Background: The current proposed model of colorectal tumorigenesis is based primarily on CpG island methylator phenotype (CIMP), microsatellite instability (MSI), KRAS, BRAF, and the methylation status of O6-Methylguanine DNA Methyltransferase (MGMT), and classifies tumors into 5 subgroups. The aim of this study is to validate this molecular classification and test its prognostic relevance. Methods: 302 patients were included in this study. Molecular analysis was performed for 5 CIMP-related pro...

  7. Video coding and decoding devices and methods preserving ppg relevant information

    NARCIS (Netherlands)

    2013-01-01

    The present invention relates to a video encoding device (10) for encoding video data and a corresponding video decoding device, wherein during decoding PPG relevant information shall be preserved. For this purpose the video coding device (10) comprises a first encoder (20) for encoding input video

  8. Local coding based matching kernel method for image classification.

    Directory of Open Access Journals (Sweden)

    Yan Song

    This paper mainly focuses on how to effectively and efficiently measure visual similarity for local feature based representation. Among existing methods, metrics based on Bag of Visual Words (BoV) techniques are efficient and conceptually simple, at the expense of effectiveness. By contrast, kernel based metrics are more effective, but at the cost of greater computational complexity and increased storage requirements. We show that a unified visual matching framework can be developed to encompass both BoV and kernel based metrics, in which the local kernel plays an important role between feature pairs or between features and their reconstruction. Generally, local kernels are defined using Euclidean distance or its derivatives, based either explicitly or implicitly on an assumption of Gaussian noise. However, local features such as SIFT and HoG often follow a heavy-tailed distribution, which tends to undermine the motivation behind Euclidean metrics. Motivated by recent advances in feature coding techniques, a novel efficient local coding based matching kernel (LCMK) method is proposed. This exploits the manifold structures in Hilbert space derived from local kernels. The proposed method combines advantages of both BoV and kernel based metrics, and achieves a linear computational complexity. This enables efficient and scalable visual matching to be performed on large-scale image sets. To evaluate the effectiveness of the proposed LCMK method, we conduct extensive experiments with widely used benchmark datasets, including 15-Scenes, Caltech101/256, PASCAL VOC 2007 and 2011 datasets. Experimental results confirm the effectiveness of the relatively efficient LCMK method.

  9. Automatic classification and detection of clinically relevant images for diabetic retinopathy

    Science.gov (United States)

    Xu, Xinyu; Li, Baoxin

    2008-03-01

    We propose a novel approach to automatic classification of Diabetic Retinopathy (DR) images and retrieval of clinically-relevant DR images from a database. Given a query image, our approach first classifies the image into one of the three categories: microaneurysm (MA), neovascularization (NV) and normal, and then it retrieves DR images that are clinically-relevant to the query image from an archival image database. In the classification stage, the query DR images are classified by the Multi-class Multiple-Instance Learning (McMIL) approach, where images are viewed as bags, each of which contains a number of instances corresponding to non-overlapping blocks, and each block is characterized by low-level features including color, texture, histogram of edge directions, and shape. McMIL first learns a collection of instance prototypes for each class that maximizes the Diverse Density function using an Expectation-Maximization algorithm. A nonlinear mapping is then defined using the instance prototypes and maps every bag to a point in a new multi-class bag feature space. Finally, a multi-class Support Vector Machine is trained in the multi-class bag feature space. In the retrieval stage, we retrieve images from the archival database that bear the same label as the query image and that are the top K nearest neighbors of the query image in terms of similarity in the multi-class bag feature space. The classification approach achieves high classification accuracy, and the retrieval of clinically-relevant images not only facilitates utilization of the vast amount of hidden diagnostic knowledge in the database, but also improves the efficiency and accuracy of DR lesion diagnosis and assessment.

  10. Average Likelihood Methods of Classification of Code Division Multiple Access (CDMA)

    Science.gov (United States)

    2016-05-01

    Subject to code matrices that follow the structure given by (113):

$$\begin{bmatrix} \vec{y}_R \\ \vec{y}_I \end{bmatrix} = \sqrt{\frac{E_s}{2L}} \begin{bmatrix} G_{R1} & -G_{I1} \\ G_{I2} & G_{R2} \end{bmatrix} \begin{bmatrix} Q_R & -Q_I \\ Q_I & Q_R \end{bmatrix} \begin{bmatrix} \vec{b}_R \\ \vec{b}_I \end{bmatrix} + \begin{bmatrix} \vec{n}_R \\ \vec{n}_I \end{bmatrix} \quad \cdots \quad \begin{bmatrix} \vec{b}_+ \\ \vec{b}_- \end{bmatrix} + \begin{bmatrix} \vec{n}_+ \\ \vec{n}_- \end{bmatrix} \quad (115)$$

    The average likelihood for type 4 CDMA (116) is a special case of type 1 CDMA with twice the code length and... AVERAGE LIKELIHOOD METHODS OF CLASSIFICATION OF CODE DIVISION MULTIPLE ACCESS (CDMA). MAY 2016. FINAL TECHNICAL REPORT. APPROVED FOR PUBLIC RELEASE

  11. Maximum relevance, minimum redundancy band selection based on neighborhood rough set for hyperspectral data classification

    International Nuclear Information System (INIS)

    Liu, Yao; Chen, Yuehua; Tan, Kezhu; Xie, Hong; Wang, Liguo; Xie, Wu; Yan, Xiaozhen; Xu, Zhen

    2016-01-01

    Band selection is considered to be an important processing step in handling hyperspectral data. In this work, we selected informative bands according to the maximal relevance minimal redundancy (MRMR) criterion based on neighborhood mutual information. Two measures, MRMR difference and MRMR quotient, were defined, and a forward greedy search for band selection was constructed. The performance of the proposed algorithm, along with a comparison with other methods (the neighborhood dependency measure based algorithm, the genetic algorithm and the uninformative variable elimination algorithm), was studied using the classification accuracy of extreme learning machine (ELM) and random forests (RF) classifiers on soybean hyperspectral datasets. The results show that the proposed MRMR algorithm leads to promising improvements in band selection and classification accuracy.
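
A compact sketch of the forward greedy search with the MRMR-difference criterion; plain mutual information and an absolute-correlation redundancy proxy stand in for the paper's neighborhood mutual information, and the data are synthetic placeholders.

```python
# Forward greedy band selection with an MRMR-difference style criterion.
# Absolute correlation is used as a cheap redundancy proxy here; the paper
# uses neighborhood mutual information for both terms.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def mrmr_difference(X, y, n_select):
    relevance = mutual_info_classif(X, y)  # I(band; class) per band
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_select:
        best, best_score = None, -np.inf
        for j in remaining:
            red = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                           for s in selected]) if selected else 0.0
            score = relevance[j] - red     # relevance minus mean redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 30))             # 100 samples x 30 spectral bands
y = rng.integers(0, 2, size=100)
print(mrmr_difference(X, y, n_select=5))
```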

  12. Prior-to-Secondary School Course Classification System: School Codes for the Exchange of Data (SCED). NFES 2011-801

    Science.gov (United States)

    National Forum on Education Statistics, 2011

    2011-01-01

    In this handbook, "Prior-to-Secondary School Course Classification System: School Codes for the Exchange of Data" (SCED), the National Center for Education Statistics (NCES) and the National Forum on Education Statistics have extended the existing secondary course classification system with codes and descriptions for courses offered at…

  13. Feature-selective Attention in Frontoparietal Cortex: Multivoxel Codes Adjust to Prioritize Task-relevant Information.

    Science.gov (United States)

    Jackson, Jade; Rich, Anina N; Williams, Mark A; Woolgar, Alexandra

    2017-02-01

    Human cognition is characterized by astounding flexibility, enabling us to select appropriate information according to the objectives of our current task. A circuit of frontal and parietal brain regions, often referred to as the frontoparietal attention network or multiple-demand (MD) regions, are believed to play a fundamental role in this flexibility. There is evidence that these regions dynamically adjust their responses to selectively process information that is currently relevant for behavior, as proposed by the "adaptive coding hypothesis" [Duncan, J. An adaptive coding model of neural function in prefrontal cortex. Nature Reviews Neuroscience, 2, 820-829, 2001]. Could this provide a neural mechanism for feature-selective attention, the process by which we preferentially process one feature of a stimulus over another? We used multivariate pattern analysis of fMRI data during a perceptually challenging categorization task to investigate whether the representation of visual object features in the MD regions flexibly adjusts according to task relevance. Participants were trained to categorize visually similar novel objects along two orthogonal stimulus dimensions (length/orientation) and performed short alternating blocks in which only one of these dimensions was relevant. We found that multivoxel patterns of activation in the MD regions encoded the task-relevant distinctions more strongly than the task-irrelevant distinctions: The MD regions discriminated between stimuli of different lengths when length was relevant and between the same objects according to orientation when orientation was relevant. The data suggest a flexible neural system that adjusts its representation of visual objects to preferentially encode stimulus features that are currently relevant for behavior, providing a neural mechanism for feature-selective attention.

  14. Fast Binary Coding for the Scene Classification of High-Resolution Remote Sensing Imagery

    Directory of Open Access Journals (Sweden)

    Fan Hu

    2016-06-01

    Scene classification of high-resolution remote sensing (HRRS) imagery is an important task in the intelligent processing of remote sensing images and has attracted much attention in recent years. Although the existing scene classification methods, e.g., the bag-of-words (BOW) model and its variants, can achieve acceptable performance, these approaches rely strongly on the extraction of local features and complicated coding strategies, which are usually time consuming and demand much expert effort. In this paper, we propose a fast binary coding (FBC) method to effectively generate efficient discriminative scene representations of HRRS images. The main idea is inspired by unsupervised feature learning techniques and binary feature descriptions. More precisely, equipped with an unsupervised feature learning technique, we first learn a set of optimal "filters" from large quantities of randomly-sampled image patches and then obtain feature maps by convolving the image scene with the learned filters. After binarizing the feature maps, we perform a simple hashing step to convert the binary-valued feature map to an integer-valued feature map. Finally, statistical histograms computed on the integer-valued feature map are used as global feature representations of the scenes of HRRS images, similar to the conventional BOW model. The analysis of the algorithm's complexity and experiments on HRRS image datasets demonstrate that, in contrast with existing scene classification approaches, the proposed FBC has much faster computational speed and achieves comparable classification performance. In addition, we also propose two extensions to FBC, i.e., the spatial co-occurrence matrix and different visual saliency maps, for further improving its final classification accuracy.
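
A minimal numpy sketch of the FBC steps (convolve, binarize, hash to integers, histogram); the filters below are random rather than learned from sampled patches, and all shapes are illustrative.

```python
# Sketch of the FBC pipeline: convolve with a filter bank, binarize the
# feature maps, hash per-pixel bit vectors to integers, then histogram.
# The filters are random here; the paper learns them from image patches.
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(2)
image = rng.random((64, 64))          # a toy grayscale scene
filters = rng.normal(size=(8, 5, 5))  # 8 stand-in "learned" 5x5 filters

maps = np.stack([convolve2d(image, f, mode="same") for f in filters])
bits = (maps > 0).astype(np.uint8)    # binarize each feature map

# Hashing: read the 8 bits at each pixel as one integer in [0, 255].
weights = (2 ** np.arange(8)).reshape(8, 1, 1)
integer_map = (bits * weights).sum(axis=0)

# Global representation: a histogram of codes, as in a BOW-style model.
hist, _ = np.histogram(integer_map, bins=256, range=(0, 256), density=True)
print(hist.shape)  # (256,)
```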

  15. Classification and prediction of river network ephemerality and its relevance for waterborne disease epidemiology

    Science.gov (United States)

    Perez-Saez, Javier; Mande, Theophile; Larsen, Joshua; Ceperley, Natalie; Rinaldo, Andrea

    2017-12-01

    The transmission of waterborne diseases hinges on the interactions between hydrology and the ecology of hosts, vectors and parasites, with the long-term absence of water constituting a strict lower bound. However, the link between spatio-temporal patterns of hydrological ephemerality and waterborne disease transmission is poorly understood and difficult to account for. The use of limited biophysical and hydroclimate information from otherwise data-scarce regions is therefore needed to characterize, classify, and predict river network ephemerality in a spatially explicit framework. Here, we develop a novel large-scale ephemerality classification and prediction methodology based on monthly discharge data, water and energy availability, and remote-sensing measures of vegetation, that is relevant to epidemiology and maintains a mechanistic link to catchment hydrologic processes. Specifically, with reference to the context of Burkina Faso in sub-Saharan Africa, we extract a relevant set of catchment covariates that include the aridity index, annual runoff estimation using the Budyko framework, and hysteretic relations between precipitation and vegetation. Five ephemerality classes, from permanent to strongly ephemeral, are defined from the duration of zero-flow periods, in a way that also accounts for the sensitivity of river discharge to the long-lasting drought of the 1970s-80s in West Africa. Using these classes, a gradient-boosted tree-based prediction yielded three distinct geographic regions of ephemerality. Importantly, we observe a strong epidemiological association between our predictions of hydrologic ephemerality and the known spatial patterns of schistosomiasis, an endemic parasitic waterborne disease in which infection occurs through human-water contact and which requires aquatic snails as an intermediate host. The general nature of our approach and its relevance for predicting the hydrologic controls on schistosomiasis occurrence provides a pathway for the explicit inclusion of

  16. The relevance of the International Classification of Functioning, Disability and Health (ICF) in monitoring and evaluating Community-based Rehabilitation (CBR).

    Science.gov (United States)

    Madden, Rosamond H; Dune, Tinashe; Lukersmith, Sue; Hartley, Sally; Kuipers, Pim; Gargett, Alexandra; Llewellyn, Gwynnyth

    2014-01-01

    To examine the relevance of the International Classification of Functioning, Disability and Health (ICF) to CBR monitoring and evaluation by investigating the relationship between the ICF and information in published CBR monitoring and evaluation reports. A three-stage literature search and analysis method was employed. Studies were identified via online database searches for peer-reviewed journal articles, and hand-searching of CBR network resources, NGO websites and specific journals. From each study "information items" were extracted; extraction consistency among authors was established. Finally, the resulting information items were coded to ICF domains and categories, with consensus on coding being achieved. Thirty-six articles relating to monitoring and evaluating CBR were selected for analysis. Approximately one third of the 2495 information items identified in these articles (788 or 32%) related to concepts of functioning, disability and environment, and could be coded to the ICF. These information items were spread across the entire ICF classification with a concentration on Activities and Participation (49% of the 788 information items) and Environmental Factors (42%). The ICF is a relevant and potentially useful framework and classification, providing building blocks for the systematic recording of information pertaining to functioning and disability, for CBR monitoring and evaluation. Implications for Rehabilitation The application of the ICF, as one of the building blocks for CBR monitoring and evaluation, is a constructive step towards an evidence-base on the efficacy and outcomes of CBR programs. The ICF can be used to provide the infrastructure for functioning and disability information to inform service practitioners and enable national and international comparisons.

  17. Stratification and Prognostic Relevance of Jass’s Molecular Classification of Colorectal Cancer

    International Nuclear Information System (INIS)

    Zlobec, Inti; Bihl, Michel P.; Foerster, Anja; Rufle, Alex; Terracciano, Luigi; Lugli, Alessandro

    2012-01-01

    Background: The current proposed model of colorectal tumorigenesis is based primarily on CpG island methylator phenotype (CIMP), microsatellite instability (MSI), KRAS, BRAF, and the methylation status of O6-Methylguanine DNA Methyltransferase (MGMT), and classifies tumors into five subgroups. The aim of this study is to validate this molecular classification and test its prognostic relevance. Methods: Three hundred two patients were included in this study. Molecular analysis was performed for five CIMP-related promoters (CRABP1, MLH1, p16INK4a, CACNA1G, NEUROG1), MGMT, MSI, KRAS, and BRAF. Methylation in at least four promoters or in one to three promoters was considered CIMP-high and CIMP-low (CIMP-H/L), respectively. Results: CIMP-H, CIMP-L, and CIMP-negative were found in 7.1, 43, and 49.9% of cases, respectively. One hundred twenty-three tumors (41%) could not be classified into any one of the proposed molecular subgroups, including 107 CIMP-L, 14 CIMP-H, and two CIMP-negative cases. The 10-year survival rate for CIMP-high patients [22.6% (95%CI: 7–43)] was significantly lower than for CIMP-L or CIMP-negative (p = 0.0295). Only the combined analysis of BRAF and CIMP (negative versus L/H) led to distinct prognostic subgroups. Conclusion: Although CIMP status has an effect on outcome, our results underline the need for standardized definitions of low- and high-level CIMP, the lack of which clearly hinders an effective prognostic and molecular classification of colorectal cancer.
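
The grouping rule stated in the Methods can be written out directly; a minimal sketch, with the promoter panel taken from the abstract:

```python
# CIMP grouping rule as stated above: count methylated CIMP-related
# promoters; >=4 is CIMP-high, 1-3 is CIMP-low, 0 is CIMP-negative.
CIMP_PROMOTERS = ("CRABP1", "MLH1", "p16INK4a", "CACNA1G", "NEUROG1")

def cimp_status(methylated: set[str]) -> str:
    n = sum(p in methylated for p in CIMP_PROMOTERS)
    if n >= 4:
        return "CIMP-high"
    return "CIMP-low" if n >= 1 else "CIMP-negative"

print(cimp_status({"MLH1", "CACNA1G"}))  # -> CIMP-low
```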

  18. Stratification and Prognostic Relevance of Jass’s Molecular Classification of Colorectal Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Zlobec, Inti [Institute of Pathology, University of Bern, Bern (Switzerland); Institute for Pathology, University Hospital Basel, Basel (Switzerland); Bihl, Michel P.; Foerster, Anja; Rufle, Alex; Terracciano, Luigi [Institute for Pathology, University Hospital Basel, Basel (Switzerland); Lugli, Alessandro, E-mail: inti.zlobec@pathology.unibe.ch [Institute of Pathology, University of Bern, Bern (Switzerland); Institute for Pathology, University Hospital Basel, Basel (Switzerland)

    2012-02-27

    Background: The current proposed model of colorectal tumorigenesis is based primarily on CpG island methylator phenotype (CIMP), microsatellite instability (MSI), KRAS, BRAF, and the methylation status of O6-Methylguanine DNA Methyltransferase (MGMT), and classifies tumors into five subgroups. The aim of this study is to validate this molecular classification and test its prognostic relevance. Methods: Three hundred two patients were included in this study. Molecular analysis was performed for five CIMP-related promoters (CRABP1, MLH1, p16INK4a, CACNA1G, NEUROG1), MGMT, MSI, KRAS, and BRAF. Methylation in at least four promoters or in one to three promoters was considered CIMP-high and CIMP-low (CIMP-H/L), respectively. Results: CIMP-H, CIMP-L, and CIMP-negative were found in 7.1, 43, and 49.9% of cases, respectively. One hundred twenty-three tumors (41%) could not be classified into any one of the proposed molecular subgroups, including 107 CIMP-L, 14 CIMP-H, and two CIMP-negative cases. The 10-year survival rate for CIMP-high patients [22.6% (95%CI: 7–43)] was significantly lower than for CIMP-L or CIMP-negative (p = 0.0295). Only the combined analysis of BRAF and CIMP (negative versus L/H) led to distinct prognostic subgroups. Conclusion: Although CIMP status has an effect on outcome, our results underline the need for standardized definitions of low- and high-level CIMP, the lack of which clearly hinders an effective prognostic and molecular classification of colorectal cancer.

  19. Stratification and prognostic relevance of Jass’s molecular classification of colorectal cancer

    Directory of Open Access Journals (Sweden)

    Inti Zlobec

    2012-02-01

    Background: The current proposed model of colorectal tumorigenesis is based primarily on CpG island methylator phenotype (CIMP), microsatellite instability (MSI), KRAS, BRAF, and the methylation status of O6-Methylguanine DNA Methyltransferase (MGMT), and classifies tumors into 5 subgroups. The aim of this study is to validate this molecular classification and test its prognostic relevance. Methods: 302 patients were included in this study. Molecular analysis was performed for 5 CIMP-related promoters (CRABP1, MLH1, p16INK4a, CACNA1G, NEUROG1), MGMT, MSI, KRAS and BRAF. Tumors were CIMP-high or CIMP-low if ≥4 or 1-3 promoters were methylated, respectively. Results: CIMP-high, CIMP-low and CIMP-negative were found in 7.1%, 43% and 49.9% of cases, respectively. 123 tumors (41%) could not be classified into any one of the proposed molecular subgroups, including 107 CIMP-low, 14 CIMP-high and 2 CIMP-negative cases. The 10-year survival rate for CIMP-high patients [22.6% (95%CI: 7-43)] was significantly lower than for CIMP-low or CIMP-negative (p=0.0295). Only the combined analysis of BRAF and CIMP (negative versus low/high) led to distinct prognostic subgroups. Conclusion: Although CIMP status has an effect on outcome, our results underline the need for standardized definitions of low- and high-level CIMP, the lack of which clearly hinders an effective prognostic and molecular classification of colorectal cancer.

  20. LAMOST OBSERVATIONS IN THE KEPLER FIELD: SPECTRAL CLASSIFICATION WITH THE MKCLASS CODE

    Energy Technology Data Exchange (ETDEWEB)

    Gray, R. O. [Department of Physics and Astronomy, Appalachian State University, Boone, NC 28608 (United States); Corbally, C. J. [Vatican Observatory Research Group, Steward Observatory, Tucson, AZ 85721-0065 (United States); Cat, P. De [Royal Observatory of Belgium, Ringlaan 3, B-1180 Brussel (Belgium); Fu, J. N.; Ren, A. B. [Department of Astronomy, Beijing Normal University, 19 Avenue Xinjiekouwai, Beijing 100875 (China); Shi, J. R.; Luo, A. L.; Zhang, H. T.; Wu, Y.; Cao, Z.; Li, G. [Key Laboratory for Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China); Zhang, Y.; Hou, Y.; Wang, Y. [Nanjing Institute of Astronomical Optics and Technology, National Astronomical Observatories, Chinese Academy of Sciences, Nanjing 210042 (China)

    2016-01-15

    The LAMOST-Kepler project was designed to obtain high-quality, low-resolution spectra of many of the stars in the Kepler field with the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST). To date, 101,086 spectra of 80,447 objects over the entire Kepler field have been acquired. Physical parameters, radial velocities, and rotational velocities of these stars will be reported in other papers. In this paper we present MK spectral classifications for these spectra, determined with the automatic classification code MKCLASS. We discuss the quality and reliability of the spectral types and present histograms showing the frequency of the spectral types in the main table, organized according to luminosity class. Finally, as examples of the use of this spectral database, we compute the proportion of A-type stars that are Am stars, and identify 32 new barium dwarf candidates.

  1. New Site Coefficients and Site Classification System Used in Recent Building Seismic Code Provisions

    Science.gov (United States)

    Dobry, R.; Borcherdt, R.D.; Crouse, C.B.; Idriss, I.M.; Joyner, W.B.; Martin, G.R.; Power, M.S.; Rinne, E.E.; Seed, R.B.

    2000-01-01

    Recent code provisions for buildings and other structures (1994 and 1997 NEHRP Provisions, 1997 UBC) have adopted new site amplification factors and a new procedure for site classification. Two amplitude-dependent site amplification factors are specified: Fa for short periods and Fv for longer periods. Previous codes included only a long-period factor S and did not provide for a short-period amplification factor. The new site classification system is based on definitions of five site classes in terms of a representative average shear wave velocity to a depth of 30 m (v̄s). This definition permits sites to be classified unambiguously. When the shear wave velocity is not available, other soil properties such as standard penetration resistance or undrained shear strength can be used. The new site classes, denoted by letters A–E, replace the site classes in previous codes denoted by S1–S4. Site Classes A and B correspond to hard rock and rock, Site Class C corresponds to soft rock and very stiff/very dense soil, and Site Classes D and E correspond to stiff soil and soft soil. A sixth site class, F, is defined for soils requiring site-specific evaluations. Both Fa and Fv are functions of the site class, and also of the level of seismic hazard on rock, defined by parameters such as Aa and Av (1994 NEHRP Provisions), Ss and S1 (1997 NEHRP Provisions) or Z (1997 UBC). The values of Fa and Fv decrease as the seismic hazard on rock increases, due to soil nonlinearity. The greatest impact of the new factors Fa and Fv as compared with the old S factors occurs in areas of low-to-medium seismic hazard. This paper summarizes the new site provisions, explains the basis for them, and discusses ongoing studies of site amplification in recent earthquakes that may influence future code developments.
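
A small sketch of the velocity-based classification; the class boundaries below are the standard NEHRP values (A above 1500 m/s, B 760-1500 m/s, C 360-760 m/s, D 180-360 m/s, E below 180 m/s), and Site Class F cannot be assigned from velocity alone.

```python
# NEHRP site classification from the average shear wave velocity over the
# top 30 m. Site Class F (soils requiring site-specific evaluation) cannot
# be assigned from velocity alone.
def nehrp_site_class(vs30_m_per_s: float) -> str:
    if vs30_m_per_s > 1500:
        return "A"  # hard rock
    if vs30_m_per_s > 760:
        return "B"  # rock
    if vs30_m_per_s > 360:
        return "C"  # very dense soil / soft rock
    if vs30_m_per_s > 180:
        return "D"  # stiff soil
    return "E"      # soft soil

print(nehrp_site_class(540))  # -> "C"
```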

  2. HCPB TBM thermo-mechanical design: Assessment with respect to codes and standards and DEMO relevancy

    International Nuclear Information System (INIS)

    Cismondi, F.; Kecskes, S.; Aiello, G.

    2011-01-01

    In the frame of the activities of the European TBM Consortium of Associates, the Helium Cooled Pebble Bed Test Blanket Module (HCPB-TBM) is developed at the Karlsruhe Institute of Technology (KIT). After performing detailed thermal and fluid dynamic analyses of the preliminary HCPB TBM design, the thermo-mechanical behaviour of the TBM under typical ITER loads has to be assessed. A synthesis of the different design options proposed has been realized by building two different assemblies of the HCPB-TBM; these two assemblies and the analyses performed on them are presented in this paper. Finite element thermo-mechanical analyses of two detailed 1/4-scale models of the proposed HCPB-TBM assemblies have been performed, with the aim of verifying that the mechanical behaviour accords with the criteria of the design codes and standards. The structural design limits specified in the codes and standards are discussed in relation to the available EUROFER data and possible damage modes. Solutions to improve the weak structural points of the present design are identified, and the DEMO relevancy of the present thermal and structural design parameters is discussed.

  3. A physiologically-inspired model of numerical classification based on graded stimulus coding

    Directory of Open Access Journals (Sweden)

    John Pearson

    2010-01-01

    In most natural decision contexts, the process of selecting among competing actions takes place in the presence of informative, but potentially ambiguous, stimuli. Decisions about magnitudes—quantities like time, length, and brightness that are linearly ordered—constitute an important subclass of such decisions. It has long been known that perceptual judgments about such quantities obey Weber's Law, wherein the just-noticeable difference in a magnitude is proportional to the magnitude itself. Current physiologically inspired models of numerical classification assume discriminations are made via a labeled-line code of neurons selectively tuned for numerosity, a pattern observed in the firing rates of neurons in the ventral intraparietal area (VIP) of the macaque. By contrast, neurons in the contiguous lateral intraparietal area (LIP) signal numerosity in a graded fashion, suggesting the possibility that numerical classification could be achieved in the absence of neurons tuned for number. Here, we consider the performance of a decision model based on this analog coding scheme in a paradigmatic discrimination task—numerosity bisection. We demonstrate that a basic two-neuron classifier model, derived from experimentally measured monotonic responses of LIP neurons, is sufficient to reproduce the numerosity bisection behavior of monkeys, and that the threshold of the classifier can be set by reward maximization via a simple learning rule. In addition, our model predicts deviations from Weber's Law scaling of choice behavior at high numerosity. Together, these results suggest both a generic neuronal framework for magnitude-based decisions and a role for reward contingency in the classification of such stimuli.
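
A minimal sketch of a two-unit classifier over such a graded code: one unit's rate increases with numerosity, the other's decreases, and the choice compares their difference to a criterion adjusted by a simple reward-driven rule. The rate functions and the learning rule are placeholder assumptions, not fits to LIP data.

```python
# Two-unit classifier over a graded (analog) numerosity code: one unit's
# rate rises with numerosity, the other's falls; the choice compares their
# difference to a learned criterion. Rate functions are placeholders.
import numpy as np

def rates(n):
    return np.log1p(n), 1.0 / (1.0 + n)  # increasing unit, decreasing unit

def classify(n, bias):
    up, down = rates(n)
    return "large" if up - down > bias else "small"

# Reward-driven criterion learning for bisecting numerosities 1..9 at 5.
bias, lr = 0.0, 0.05
rng = np.random.default_rng(3)
for _ in range(2000):
    n = int(rng.integers(1, 10))
    up, down = rates(n)
    target = 1 if n > 5 else -1           # the task's reward contingency
    if (up - down - bias) * target <= 0:  # unrewarded trial: shift criterion
        bias -= lr * target

print(classify(3, bias), classify(8, bias))  # expected: small large
```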

  4. ColorPhylo: A Color Code to Accurately Display Taxonomic Classifications.

    Science.gov (United States)

    Lespinats, Sylvain; Fertil, Bernard

    2011-01-01

    Color may be very useful to visualise complex data. As far as taxonomy is concerned, color may help observing various species' characteristics in correlation with classification. However, choosing the number of subclasses to display is often a complex task: on the one hand, assigning a limited number of colors to taxa of interest hides the structure embedded in the subtrees of the taxonomy; on the other hand, differentiating a high number of taxa by giving them specific colors, without considering the underlying taxonomy, may lead to unreadable results since relationships between displayed taxa would not be supported by the color code. In the present paper, an automatic color coding scheme is proposed to visualise the levels of taxonomic relationships displayed as an overlay on any kind of data plot. To achieve this goal, a dimensionality reduction method allows displaying taxonomic "distances" onto a Euclidean two-dimensional space. The resulting map is projected onto a 2D color space (the Hue, Saturation, Brightness colorimetric space with brightness set to 1). Proximity in the taxonomic classification corresponds to proximity on the map and is therefore materialised by color proximity. As a result, each species is related to a color code showing its position in the taxonomic tree. The so-called ColorPhylo displays taxonomic relationships intuitively and can be combined with any biological result. A Matlab version of ColorPhylo is available at http://sy.lespi.free.fr/ColorPhylo-homepage.html. In addition, an ad hoc distance is proposed for taxonomies with unknown edge lengths.
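
    The paper does not spell out the projection formula, but one plausible reading of the described mapping (hue from the map angle, saturation from the normalised radius, brightness fixed at 1) can be sketched as follows; `max_radius` is an assumed normalisation constant:

```python
import math
import colorsys

def embedding_to_color(x, y, max_radius=1.0):
    """Map a 2D point from the taxonomic embedding to an RGB color via
    HSB: angle -> hue, normalised radius -> saturation, brightness = 1,
    so species that are close on the map receive similar colors."""
    hue = (math.atan2(y, x) / (2.0 * math.pi)) % 1.0
    saturation = min(math.hypot(x, y) / max_radius, 1.0)
    return colorsys.hsv_to_rgb(hue, saturation, 1.0)

print(embedding_to_color(0.3, -0.4))  # an (r, g, b) triple in [0, 1]
```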

  5. Ontological function annotation of long non-coding RNAs through hierarchical multi-label classification.

    Science.gov (United States)

    Zhang, Jingpu; Zhang, Zuping; Wang, Zixiang; Liu, Yuting; Deng, Lei

    2018-05-15

    Long non-coding RNAs (lncRNAs) are an enormous collection of functional non-coding RNAs. Over the past decades, a large number of novel lncRNA genes have been identified. However, most lncRNAs remain functionally uncharacterized at present. Computational approaches provide new insight into the potential functional implications of lncRNAs. Considering that each lncRNA may have multiple functions and a function may be further specialized into sub-functions, here we describe NeuraNetL2GO, a computational ontological function prediction approach for lncRNAs using a hierarchical multi-label classification strategy based on multiple neural networks. The neural networks are incrementally trained level by level, each performing the prediction of gene ontology (GO) terms belonging to a given level. In NeuraNetL2GO, we use topological features of the lncRNA similarity network as the input of the neural networks and employ the output results to annotate the lncRNAs. We show that NeuraNetL2GO achieves the best overall performance in maximum F-measure and coverage on the manually annotated lncRNA2GO-55 dataset compared to other state-of-the-art methods. The source code and data are available at http://denglab.org/NeuraNetL2GO/. leideng@csu.edu.cn. Supplementary data are available at Bioinformatics online.
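
    A rough sketch of the level-by-level training strategy is given below, with synthetic stand-in data; the real NeuraNetL2GO uses topological features of the lncRNA similarity network as input and additionally handles consistency across the GO hierarchy:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy stand-ins: rows are lncRNAs, columns are network-topology features;
# labels_by_level[k] is a binary indicator matrix for GO terms at level k.
rng = np.random.default_rng(0)
X = rng.random((200, 16))
labels_by_level = [rng.integers(0, 2, (200, n_terms)) for n_terms in (3, 5)]

# One multi-label network per GO level, trained incrementally level by level
models = []
for Y in labels_by_level:
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    clf.fit(X, Y)  # MLPClassifier accepts a multi-label indicator matrix
    models.append(clf)

# Predicted GO terms for a new lncRNA, one level at a time
predictions = [model.predict(X[:1]) for model in models]
print(predictions)
```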

  6. A Crosswalk of Mineral Commodity End Uses and North American Industry Classification System (NAICS) codes

    Science.gov (United States)

    Barry, James J.; Matos, Grecia R.; Menzie, W. David

    2015-09-14

    This crosswalk is based on the premise that there is a connection between the way mineral commodities are used and how this use is reflected in the economy. Raw mineral commodities are the basic materials from which goods, finished products, or intermediate materials are manufactured. Mineral commodities are vital to the development of the U.S. economy and they impact nearly every industrial segment of the economy, representing 12.2 percent of the U.S. gross domestic product (GDP) in 2010 (U.S. Bureau of Economic Analysis, 2014). In an effort to better understand the distribution of mineral commodities in the economy, the U.S. Geological Survey (USGS) attempts to link the end uses of mineral commodities to the corresponding North American Industry Classification System (NAICS) codes.

  7. Should International Classification of Diseases codes be used to survey hospital-acquired pneumonia?

    Science.gov (United States)

    Wolfensberger, A; Meier, A H; Kuster, S P; Mehra, T; Meier, M-T; Sax, H

    2018-05-01

    As surveillance of hospital-acquired pneumonia (HAP) is very resource intensive, alternatives for HAP surveillance are needed urgently. This study compared HAP rates according to routine discharge diagnostic codes of the International Classification of Diseases, 10th Revision (ICD-10; ICD-HAP) with HAP rates according to the validated surveillance definitions of the Hospitals in Europe Link for Infection Control through Surveillance (HELICS/IPSE; HELICS-HAP) by manual retrospective re-evaluation of patient records. The positive predictive value of ICD-HAP for HELICS-HAP was 0.35, and sensitivity was 0.59. Therefore, the currently available ICD-10-based routine discharge data do not allow reliable identification of patients with HAP. Copyright © 2018 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  8. Call for consistent coding in diabetes mellitus using the Royal College of General Practitioners and NHS pragmatic classification of diabetes

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2013-03-01

    Full Text Available Background The prevalence of diabetes is increasing with growing levels of obesity and an aging population. New practical guidelines for diabetes provide an applicable classification. Inconsistent coding of diabetes hampers the use of computerised disease registers for quality improvement, and limits the monitoring of disease trends. Objective To develop a consensus set of codes that should be used when recording diabetes diagnostic data. Methods The consensus approach was hierarchical, with a preference for diagnostic/disorder codes, to define each type of diabetes and non-diabetic hyperglycaemia, which were listed as being completely, partially or not readily mapped to available codes. The practical classification divides diabetes into type 1 (T1DM), type 2 (T2DM), genetic, other, unclassified and non-diabetic fasting hyperglycaemia. We mapped the classification to Read version 2, Clinical Terms version 3 and SNOMED CT. Results T1DM and T2DM were completely mapped to appropriate codes. However, in other areas only partial mapping is possible. Genetics is a fast-moving field and there were considerable gaps in the available labels for genetic conditions; what the classification calls 'other' the coding system labels 'secondary' diabetes. The biggest gap was the lack of a code for diabetes where the type of diabetes was uncertain. Notwithstanding these limitations we were able to develop a consensus list. Conclusions It is a challenge to develop codes that readily map to contemporary clinical concepts. However, clinicians should adopt the standard recommended codes; and audit the quality of their existing records.

  9. A full computation-relevant topological dynamics classification of elementary cellular automata.

    Science.gov (United States)

    Schüle, Martin; Stoop, Ruedi

    2012-12-01

    Cellular automata are both computational and dynamical systems. We give a complete classification of the dynamic behaviour of elementary cellular automata (ECA) in terms of fundamental dynamical system notions such as sensitivity and chaoticity. The "complex" ECA turn out to be sensitive, but neither chaotic nor eventually weakly periodic. Based on this classification, we conjecture that elementary cellular automata capable of carrying out complex computations, such as those needed for Turing-universality, are at the "edge of chaos."
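
    For readers unfamiliar with the model class, a minimal simulation of an elementary cellular automaton is easy to write; here rule 110, one of the "complex", Turing-universal rules, is run from a single seeded cell:

```python
def eca_step(cells, rule):
    """One synchronous update of an elementary cellular automaton with
    periodic boundaries; `rule` is the Wolfram rule number (0-255)."""
    n = len(cells)
    table = [(rule >> k) & 1 for k in range(8)]
    return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]

row = [0] * 40
row[20] = 1
for _ in range(20):
    print("".join(".#"[c] for c in row))
    row = eca_step(row, 110)
```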

  10. Event classification and optimization methods using artificial intelligence and other relevant techniques: Sharing the experiences

    Science.gov (United States)

    Mohamed, Abdul Aziz; Hasan, Abu Bakar; Ghazali, Abu Bakar Mhd.

    2017-01-01

    Classification of large data into respective classes or groups can be carried out with the help of artificial intelligence (AI) tools readily available in the market. To obtain the best results, optimization tools can then be applied to those data. Classification and optimization have been used by researchers throughout their work, and the outcomes have been very encouraging. Here, the authors share their experience in three different areas of applied research.

  11. Comparison study on flexible pavement design using FAA (Federal Aviation Administration) and LCN (Load Classification Number) code in Ahmad Yani international airport’s runway

    Science.gov (United States)

    Santoso, S. E.; Sulistiono, D.; Mawardi, A. F.

    2017-11-01

    The FAA code for airport design has been broadly used by the Indonesian Ministry of Aviation for decades. However, there has been little comprehensive study of its relevance and efficiency in the current Indonesian situation. A comparison study on flexible pavement design for airport runways using comparable methods has therefore become essential. The main focus of this study is to compare which of the FAA and LCN methods offers the most efficient and effective approach to runway pavement planning. The two methods differ mainly in the variables they use: the FAA code works from the aircraft's maximum take-off weight and annual departures, whereas the LCN code uses the equivalent single wheel load and tire pressure. Based on these variables, a classification and rating procedure was used to determine which code is best implemented. According to the analysis, the FAA method is the most effective way to plan runway design in Indonesia, giving a total pavement thickness of 127 cm against 70 cm for the LCN method. Although the FAA pavement is thicker than the LCN one, its robustness under future conditions is an essential aspect to consider in design and planning.

  12. An analysis of feature relevance in the classification of astronomical transients with machine learning methods

    Science.gov (United States)

    D'Isanto, A.; Cavuoti, S.; Brescia, M.; Donalek, C.; Longo, G.; Riccio, G.; Djorgovski, S. G.

    2016-04-01

    The exploitation of present and future synoptic (multiband and multi-epoch) surveys requires an extensive use of automatic methods for data processing and data interpretation. In this work, using data extracted from the Catalina Real Time Transient Survey (CRTS), we investigate the classification performance of several well-tested methods: Random Forest, MultiLayer Perceptron with Quasi Newton Algorithm and K-Nearest Neighbours, paying special attention to the feature selection phase. To this end, several classification experiments were performed, namely: identification of cataclysmic variables, separation between galactic and extragalactic objects, and identification of supernovae.
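
    As an illustration of the feature selection phase (on synthetic stand-in data rather than the CRTS features), impurity-based importances from a Random Forest provide a common ranking heuristic:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for transient feature vectors
X, y = make_classification(n_samples=500, n_features=12, n_informative=4,
                           random_state=0)

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank features by impurity-based importance
for idx in np.argsort(forest.feature_importances_)[::-1][:5]:
    print(f"feature {idx}: importance {forest.feature_importances_[idx]:.3f}")
```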

  13. Classification and coding of commercial fishing injuries by work processes: an experience in the Danish fresh market fishing industry

    DEFF Research Database (Denmark)

    Jensen, Olaf Chresten; Stage, Søren; Noer, Preben

    2005-01-01

    BACKGROUND: Work-related injuries in commercial fishing are of concern internationally. To better identify the causes of injury, this study coded occupational injuries by working processes in commercial fishing for fresh market fish. METHODS: A classification system of the work processes was developed... ...to working with the gear and nets vary greatly in the different fishing methods. Coding of the injuries to the specific working processes allows for targeted prevention efforts.

  14. Use of a New International Classification of Health Interventions for Capturing Information on Health Interventions Relevant to People with Disabilities.

    Science.gov (United States)

    Fortune, Nicola; Madden, Richard; Almborg, Ann-Helene

    2018-01-17

    Development of the World Health Organization's International Classification of Health Interventions (ICHI) is currently underway. Once finalised, ICHI will provide a standard basis for collecting, aggregating, analysing, and comparing data on health interventions across all sectors of the health system. In this paper, we introduce the classification, describing its underlying tri-axial structure, organisation and content. We then discuss the potential value of ICHI for capturing information on met and unmet need for health interventions relevant to people with a disability, with a particular focus on interventions to support functioning and health promotion interventions. Early experiences of use of the Swedish National Classification of Social Care Interventions and Activities, which is based closely on ICHI, illustrate the value of a standard classification to support practice and collect statistical data. Testing of the ICHI beta version in a wide range of countries and contexts is now needed so that improvements can be made before it is finalised. Input from those with an interest in the health of people with disabilities and health promotion more broadly is welcomed.

  15. Multi-information fusion sparse coding with preserving local structure for hyperspectral image classification

    Science.gov (United States)

    Wei, Xiaohui; Zhu, Wen; Liao, Bo; Gu, Changlong; Li, Weibiao

    2017-10-01

    The key question in sparse coding (SC) is how to exploit the information that already exists to acquire robust sparse representations (SRs) that distinguish different objects for hyperspectral image (HSI) classification. We propose a multi-information fusion SC framework, which fuses spectral, spatial, and label information at the same level, to address this question. In particular, pixels from disjoint spatial clusters, which are obtained by segmenting the given HSI in space, are individually and sparsely encoded. Then, because spatial structure is important, graph- and hypergraph-based regularizers are enforced to encourage smoothness of the obtained representations and to preserve local consistency within each spatial cluster. The latter simultaneously considers the spectral, spatial, and label information of multiple pixels that are highly likely to share the same label. Finally, a linear support vector machine is selected as the final classifier, with the learned SRs as input. Experiments conducted on three frequently used real HSIs show that our methods achieve satisfactory results compared with other state-of-the-art methods.
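
    A stripped-down sketch of the pipeline's first and last stages (dictionary learning, sparse encoding, then a linear SVM) is shown below on synthetic stand-in data; the paper's spatial clustering and graph/hypergraph regularizers are omitted:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.svm import LinearSVC

# Synthetic stand-in for HSI pixels: rows are pixels, columns spectral bands
rng = np.random.default_rng(0)
X = rng.random((300, 50))
y = rng.integers(0, 3, 300)

# Learn a dictionary and sparse-code each pixel
dico = MiniBatchDictionaryLearning(n_components=40, alpha=1.0, random_state=0)
codes = dico.fit_transform(X)

# Linear SVM on the learned sparse representations, as in the final stage
clf = LinearSVC().fit(codes, y)
print(f"training accuracy: {clf.score(codes, y):.2f}")
```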

  16. The Classification of Complementary Information Set Codes of Lengths 14 and 16

    OpenAIRE

    Freibert, Finley

    2012-01-01

    In the paper "A new class of codes for Boolean masking of cryptographic computations," Carlet, Gaborit, Kim, and Sol\\'{e} defined a new class of rate one-half binary codes called \\emph{complementary information set} (or CIS) codes. The authors then classified all CIS codes of length less than or equal to 12. CIS codes have relations to classical Coding Theory as they are a generalization of self-dual codes. As stated in the paper, CIS codes also have important practical applications as they m...

  17. The National Ecosystem Services Classification System: A Framework for Identifying and Reducing Relevant Uncertainties

    Science.gov (United States)

    Rhodes, C. R.; Sinha, P.; Amanda, N.

    2013-12-01

    In recent years, the gap between what scientists know and what policymakers should appreciate in environmental decision making has received more attention, as the costs of the disconnect have become more apparent to both groups. Particularly for water-related policies, the EPA's Office of Water has struggled with benefit estimates held low by the inability to quantify ecological and economic effects that theory, modeling, and anecdotal or isolated case evidence suggest may prove to be larger. Better coordination with ecologists and hydrologists is being explored as a solution. The ecosystem services (ES) concept, now nearly two decades old, links ecosystem functions and processes to the human value system. But there remains no clear mapping of which ecosystem goods and services affect which individual or economic values. The National Ecosystem Services Classification System (NESCS, 'nexus') project brings together ecologists, hydrologists, and social scientists to do this mapping for aquatic and other ecosystem service-generating systems. The objective is to greatly reduce the uncertainty in water-related policy making by mapping and ultimately quantifying the various functions and products of aquatic systems, as well as how changes to aquatic systems impact the human economy and individual levels of non-monetary appreciation for those functions and products. Primary challenges to fostering interaction between scientists, social scientists, and policymakers are the lack of a common vocabulary and the need for a cohesive, comprehensive framework that organizes concepts across disciplines and accommodates scientific data from a range of sources. NESCS builds the vocabulary and the framework so both may inform a scalable transdisciplinary policy-making application. This talk presents for discussion the process and progress in developing both this vocabulary and a classifying framework capable of bridging the gap between a newer but existing ecosystem services classification

  18. Pelvic Arterial Anatomy Relevant to Prostatic Artery Embolisation and Proposal for Angiographic Classification

    Energy Technology Data Exchange (ETDEWEB)

    Assis, André Moreira de, E-mail: andre.maa@gmail.com; Moreira, Airton Mota, E-mail: motamoreira@gmail.com; Paula Rodrigues, Vanessa Cristina de, E-mail: vanessapaular@yahoo.com.br [University of Sao Paulo Medical School, Interventional Radiology and Endovascular Surgery Department, Radiology Institute (Brazil); Harward, Sardis Honoria, E-mail: sardis.harward@merit.com [The Dartmouth Center for Health Care Delivery Science (United States); Antunes, Alberto Azoubel, E-mail: antunesuro@uol.com.br; Srougi, Miguel, E-mail: srougi@usp.br [University of Sao Paulo Medical School, Urology Department (Brazil); Carnevale, Francisco Cesar, E-mail: fcarnevale@uol.com.br [University of Sao Paulo Medical School, Interventional Radiology and Endovascular Surgery Department, Radiology Institute (Brazil)

    2015-08-15

    Purpose To describe and categorize the angiographic findings regarding prostatic vascularization, propose an anatomic classification, and discuss its implications for the PAE procedure. Methods Angiographic findings from 143 PAE procedures were reviewed retrospectively, and the origin of the inferior vesical artery (IVA) was classified into five subtypes as follows: type I: IVA originating from the anterior division of the internal iliac artery (IIA), from a common trunk with the superior vesical artery (SVA); type II: IVA originating from the anterior division of the IIA, inferior to the SVA origin; type III: IVA originating from the obturator artery; type IV: IVA originating from the internal pudendal artery; and type V: less common origins of the IVA. Incidences were calculated by percentage. Results Two hundred eighty-six pelvic sides (n = 286) were analyzed, and 267 (93.3 %) were classified into types I-IV. Among them, the most common origin was type IV (n = 89, 31.1 %), followed by type I (n = 82, 28.7 %), type III (n = 54, 18.9 %), and type II (n = 42, 14.7 %). Type V anatomy was seen in 16 cases (5.6 %). Double vascularization, defined as two independent prostatic branches on one pelvic side, was seen in 23 cases (8.0 %). Conclusions Despite the large number of possible anatomical variations of the male pelvis, four main patterns corresponded to almost 95 % of the cases. Evaluating the anatomy in a systematic fashion, following a standard classification, will make PAE a faster, safer, and more effective procedure.

  19. SWeRF--A method for estimating the relevant fine particle fraction in bulk materials for classification and labelling purposes.

    Science.gov (United States)

    Pensis, Ingeborg; Luetzenkirchen, Frank; Friede, Bernd

    2014-05-01

    In accordance with the European regulation for classification, labelling and packaging of substances and mixtures (CLP), as well as the criteria set out in the Globally Harmonized System (GHS), the fine fraction of crystalline silica (CS) has been classified as a specific target organ toxicity, the specific organ in this case being the lung. Generic cut-off values for products containing a fine fraction of CS trigger the need for a method to quantify the fine fraction of CS in bulk materials. This article describes the so-called SWeRF method, the size-weighted relevant fine fraction. The SWeRF method combines the particle size distribution of a powder with probability factors from the EN 481 standard, allowing the relevant fine fraction of a material to be calculated. The SWeRF method has been validated with a number of industrial minerals. This will enable manufacturers and blenders to apply the CLP and GHS criteria for the classification of mineral products containing a fine fraction of CS.
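
    As an illustrative sketch, assuming the EN 481 respirable convention supplies the probability factors (inhalable fraction 0.5·(1+e^(-0.06d)) multiplied by the upper tail of a lognormal with median 4.25 μm and geometric standard deviation 1.5), a SWeRF-style calculation over a hypothetical size distribution might read:

```python
import math

def respirable_probability(d_um):
    """EN 481 respirable convention (as commonly stated): the inhalable
    fraction times the upper tail of a lognormal with median 4.25 um
    and geometric standard deviation 1.5."""
    inhalable = 0.5 * (1.0 + math.exp(-0.06 * d_um))
    z = (math.log(d_um) - math.log(4.25)) / math.log(1.5)
    lognormal_cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return inhalable * (1.0 - lognormal_cdf)

def swerf(size_distribution):
    """Size-weighted relevant fine fraction: mass fraction in each bin
    weighted by the respirable probability at the bin midpoint (um)."""
    return sum(mass * respirable_probability(d) for d, mass in size_distribution)

# Hypothetical distribution: (bin midpoint diameter in um, mass fraction)
psd = [(1.0, 0.05), (3.0, 0.10), (6.0, 0.20), (15.0, 0.30), (40.0, 0.35)]
print(f"SWeRF = {swerf(psd):.3f}")
```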

  20. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  1. Five-way Smoking Status Classification Using Text Hot-Spot Identification and Error-correcting Output Codes

    OpenAIRE

    Cohen, Aaron M.

    2008-01-01

    We participated in the i2b2 smoking status classification challenge task. The purpose of this task was to evaluate the ability of systems to automatically identify patient smoking status from discharge summaries. Our submission included several techniques that we compared and studied, including hot-spot identification, zero-vector filtering, inverse class frequency weighting, error-correcting output codes, and post-processing rules. We evaluated our approaches using the same methods as the i2b2 task organizers, using micro- and macro-averaged F1 as the primary performance metric.

  2. RELEVANT ISSUES CONCERNING THE RELOCATION OF CIVIL PROCEEDINGS UNDER THE NEW CODE OF CIVIL PROCEDURE (NCPC

    Directory of Open Access Journals (Sweden)

    Andrei Costin GRIMBERG

    2015-07-01

    Full Text Available The adoption of the New Code of Civil Procedure, and in particular the entry into force of its new provisions on 15 February 2013, was intended to accelerate judicial proceedings and noticeably simplify procedure, with the aim of unifying case law and lowering the costs generated by lawsuits, costs borne both by the State and by the citizens whose cases come before the courts. The implementation of the New Code of Civil Procedure was thus meant to secure the right to a fair trial within an optimal and predictable time, with trials judged speedily and the often excessive, unjustified delays of pending cases and newly introduced petitions avoided. Among the notable changes that followed the entry into force of the new Code of Civil Procedure are the identified and amended provisions regarding requests for relocation, in terms of the grounds on which such a petition may be formulated and of the court competent to hear such an application.

  3. Validity of International Classification of Diseases (ICD) coding for dengue infections in hospital discharge records in Malaysia.

    Science.gov (United States)

    Woon, Yuan-Liang; Lee, Keng-Yee; Mohd Anuar, Siti Fatimah Zahra; Goh, Pik-Pin; Lim, Teck-Onn

    2018-04-20

    Hospitalization due to dengue illness is an important measure of dengue morbidity. However, few studies are based on administrative databases, because the validity of the diagnosis codes is unknown. We validated the International Classification of Diseases, 10th revision (ICD) diagnosis coding for dengue infections in the Malaysian Ministry of Health's (MOH) hospital discharge database. This validation study involves a retrospective review of available hospital discharge records and a hand-search of medical records for the years 2010 and 2013. We randomly selected 3219 hospital discharge records coded with dengue and non-dengue infections as their discharge diagnoses from the national hospital discharge database. We then randomly sampled 216 and 144 records for patients with and without codes for dengue, respectively, in keeping with their relative frequency in the MOH database, for chart review. The ICD codes for dengue were validated against a lab-based diagnostic standard (NS1 or IgM). The ICD-10-CM codes for dengue had a sensitivity of 94%, a modest specificity of 83%, a positive predictive value of 87% and a negative predictive value of 92%. These results were stable between 2010 and 2013. However, specificity decreased substantially when patients presented with bleeding or a low platelet count. The diagnostic performance of the ICD codes for dengue in the MOH's hospital discharge database is adequate for use in health services research on dengue.
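
    The four reported metrics follow directly from the 2x2 table of code-positive/negative against laboratory-positive/negative results; a small helper (with hypothetical counts, not the study's data) makes the arithmetic explicit:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard validation metrics for a diagnosis code against a
    laboratory reference standard."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts, for illustration only
print(diagnostic_metrics(tp=188, fp=28, fn=12, tn=132))
```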

  4. Video coding and decoding devices and methods preserving PPG relevant information

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a video encoding device (10, 10', 10") and method for encoding video data and to a corresponding video decoding device (60, 60') and method. To preserve PPG relevant information after encoding without requiring a large amount of additional data for the video encoder

  5. Video coding and decoding devices and methods preserving ppg relevant information

    NARCIS (Netherlands)

    2013-01-01

    The present invention relates to a video encoding device (10, 10', 10'') and method for encoding video data and to a corresponding video decoding device (60, 60') and method. To preserve PPG relevant information after encoding without requiring a large amount of additional data for the video encoder

  6. Classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    This article presents and discusses definitions of the term "classification" and the related concepts "concept/conceptualization," "categorization," "ordering," "taxonomy" and "typology." It further presents and discusses theories of classification, including the influences of Aristotle and Wittgenstein. It presents different views on forming classes, including logical division, numerical taxonomy, historical classification, and hermeneutical and pragmatic/critical views. Finally, issues related to artificial versus natural classification and taxonomic monism versus taxonomic pluralism are briefly discussed.

  7. ncRNA-class Web Tool: Non-coding RNA feature extraction and pre-miRNA classification web tool

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Theofilatos, Konstantinos A.; Papadimitriou, Stergios; Tsakalidis, Athanasios K.; Likothanassis, Spiridon D.; Mavroudi, Seferina P.

    2012-01-01

    Until recently, it was commonly accepted that most genetic information is transacted by proteins. Recent evidence suggests that the majority of the genomes of mammals and other complex organisms are in fact transcribed into non-coding RNAs (ncRNAs), many of which are alternatively spliced and/or processed into smaller products. Analysis of non-coding RNA genes requires the calculation of several sequential, thermodynamical and structural features. Many independent tools have already been developed for the efficient calculation of such features, but to the best of our knowledge no integrative approach exists for this task. The most significant amount of existing work relates to the miRNA class of non-coding RNAs. MicroRNAs (miRNAs) are small non-coding RNAs that play a significant role in gene regulation, and their prediction is a challenging bioinformatics problem. The Non-coding RNA feature extraction and pre-miRNA classification Web Tool (ncRNA-class Web Tool) is a publicly available web tool ( http://150.140.142.24:82/Default.aspx ) which provides a user-friendly and efficient environment for the effective calculation of a set of 58 sequential, thermodynamical and structural features of non-coding RNAs, plus a tool for the accurate prediction of miRNAs. © 2012 IFIP International Federation for Information Processing.
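
    To give a flavour of what the 'sequential' features involve (a toy sketch only; thermodynamical and structural features would require folding tools beyond plain Python), length, GC content and dinucleotide frequencies can be computed directly from the sequence:

```python
from collections import Counter
from itertools import product

def sequence_features(rna):
    """Simple sequential features of an RNA string: length, GC content
    and the 16 dinucleotide frequencies."""
    rna = rna.upper().replace("T", "U")
    counts = Counter(rna)
    gc = (counts["G"] + counts["C"]) / len(rna)
    pairs = Counter(rna[i:i + 2] for i in range(len(rna) - 1))
    total = max(len(rna) - 1, 1)
    dinuc = {a + b: pairs[a + b] / total for a, b in product("ACGU", repeat=2)}
    return {"length": len(rna), "gc": gc, **dinuc}

print(sequence_features("AUGGCGCUAAGGCUAG")["gc"])
```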

  8. The accuracy of International Classification of Diseases coding for dental problems not associated with trauma in a hospital emergency department.

    Science.gov (United States)

    Figueiredo, Rafael L F; Singhal, Sonica; Dempster, Laura; Hwang, Stephen W; Quinonez, Carlos

    2015-01-01

    Emergency department (ED) visits for nontraumatic dental conditions (NTDCs) may be a sign of unmet need for dental care. The objective of this study was to determine the accuracy of International Classification of Diseases codes (ICD-10-CA) for ED visits for NTDCs. ED visits in 2008-2009 at one hospital in Toronto were identified if the discharge diagnosis in the administrative database system was an ICD-10-CA code for an NTDC (K00-K14). A random sample of 100 visits was selected, and the medical records for these visits were reviewed by a dentist. The descriptions of the clinical signs and symptoms were evaluated, and a diagnosis was assigned. This diagnosis was compared with the diagnosis assigned by the physician and the code assigned to the visit. The 100 ED visits reviewed were associated with 16 different ICD-10-CA codes for NTDCs. Only 2 percent of these visits were clearly caused by trauma. The code K0887 (toothache) was the most frequent diagnostic code (31 percent). We found 43.3 percent disagreement on the discharge diagnosis reported by the physician, and 58.0 percent disagreement on the code in the administrative database assigned by the abstractor, compared with what was suggested by the dentist reviewing the chart. There are substantial discrepancies between the ICD-10-CA diagnosis assigned in administrative databases and the diagnosis assigned by a dentist reviewing the chart retrospectively. However, ICD-10-CA codes can be used to accurately identify ED visits for NTDCs. © 2015 American Association of Public Health Dentistry.

  9. Use of the Coding Causes of Death in HIV in the classification of deaths in Northeastern Brazil.

    Science.gov (United States)

    Alves, Diana Neves; Bresani-Salvi, Cristiane Campello; Batista, Joanna d'Arc Lyra; Ximenes, Ricardo Arraes de Alencar; Miranda-Filho, Demócrito de Barros; Melo, Heloísa Ramos Lacerda de; Albuquerque, Maria de Fátima Pessoa Militão de

    2017-01-01

    Describe the coding process of death causes for people living with HIV/AIDS, and classify deaths as related or unrelated to immunodeficiency by applying the Coding Causes of Death in HIV (CoDe) system. A cross-sectional study that codifies and classifies the causes of deaths occurring in a cohort of 2,372 people living with HIV/AIDS, monitored between 2007 and 2012, in two specialized HIV care services in Pernambuco. The causes of death already codified according to the International Classification of Diseases were recoded and classified as deaths related and unrelated to immunodeficiency by the CoDe system. We calculated the frequencies of the CoDe codes for the causes of death in each classification category. There were 315 (13%) deaths during the study period; 93 (30%) were caused by an AIDS-defining illness on the Centers for Disease Control and Prevention list. A total of 232 deaths (74%) were related to immunodeficiency after application of the CoDe. Infections were the most common cause, both related (76%) and unrelated (47%) to immunodeficiency, followed by malignancies (5%) in the first group and external causes (16%), malignancies (12%) and cardiovascular diseases (11%) in the second group. Tuberculosis comprised 70% of the immunodeficiency-defining infections. Opportunistic infections and aging diseases were the most frequent causes of death, adding multiple disease burdens on health services. The CoDe system increases the probability of classifying deaths more accurately in people living with HIV/AIDS.

  10. Using Administrative Mental Health Indicators in Heart Failure Outcomes Research: Comparison of Clinical Records and International Classification of Disease Coding.

    Science.gov (United States)

    Bender, Miriam; Smith, Tyler C

    2016-01-01

    The use of mental health indicators in health outcomes research is of growing interest to researchers. This study, as part of a larger research program, quantified agreement between administrative International Classification of Disease (ICD-9) coding for, and "gold standard" clinician documentation of, mental health issues (MHIs) in hospitalized heart failure (HF) patients to determine the validity of mental health administrative data for use in HF outcomes research. A 13% random sample (n = 504) was selected from all unique patients (n = 3,769) hospitalized with a primary HF diagnosis at 4 San Diego County community hospitals during 2009-2012. MHI was defined as ICD-9 discharge diagnostic coding 290-319. Records were audited for clinician documentation of MHI. A total of 43% (n = 216) had mental health clinician documentation; 33% (n = 164) had ICD-9 coding for MHI. The ICD-9 code bundle 290-319 had 0.70 sensitivity, 0.97 specificity, and kappa 0.69 (95% confidence interval 0.61-0.79). More specific ICD-9 MHI code bundles had kappas ranging from 0.44 to 0.82 and sensitivities ranging from 42% to 82%. Agreement between ICD-9 coding and clinician documentation for a broadly defined MHI is substantial, and can validly "rule in" MHI for hospitalized patients with heart failure. More specific MHI code bundles had fair to almost perfect agreement, with a wide range of sensitivities for identifying patients with an MHI. Copyright © 2016 Elsevier Inc. All rights reserved.
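
    Agreement statistics of this kind reduce to a 2x2 table plus Cohen's kappa; with hypothetical per-patient flags (not the study's data), the calculation looks like this:

```python
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Hypothetical flags: 1 = MHI present, 0 = absent
chart_documented = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0]
icd9_coded       = [1, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0]

print(confusion_matrix(chart_documented, icd9_coded))
print(f"kappa = {cohen_kappa_score(chart_documented, icd9_coded):.2f}")
```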

  11. Association of Postoperative Readmissions With Surgical Quality Using a Delphi Consensus Process to Identify Relevant Diagnosis Codes.

    Science.gov (United States)

    Mull, Hillary J; Graham, Laura A; Morris, Melanie S; Rosen, Amy K; Richman, Joshua S; Whittle, Jeffery; Burns, Edith; Wagner, Todd H; Copeland, Laurel A; Wahl, Tyler; Jones, Caroline; Hollis, Robert H; Itani, Kamal M F; Hawn, Mary T

    2018-04-18

    Postoperative readmission data are used to measure hospital performance, yet the extent to which these readmissions reflect surgical quality is unknown. To establish expert consensus on whether reasons for postoperative readmission are associated with the quality of surgery in the index admission. In a modified Delphi process, a panel of 14 experts in medical and surgical readmissions comprising physicians and nonphysicians from Veterans Affairs (VA) and private-sector institutions reviewed 30-day postoperative readmissions from fiscal years 2008 through 2014 associated with inpatient surgical procedures performed at a VA medical center between October 1, 2007, and September 30, 2014. The consensus process was conducted from January through May 2017. Reasons for readmission were grouped into categories based on International Classification of Diseases, Ninth Revision (ICD-9) diagnosis codes. Panelists were given the proportion of readmissions coded by each reason and median (interquartile range) days to readmission. They answered the question, "Does the readmission reason reflect possible surgical quality of care problems in the index admission?" on a scale of 1 (never related) to 5 (directly related) in 3 rounds of consensus building. The consensus process was completed in May 2017 and data were analyzed in June 2017. Consensus on proportion of ICD-9-coded readmission reasons that reflected quality of surgical procedure. In 3 Delphi rounds, the 14 panelists achieved consensus on 50 reasons for readmission; 12 panelists also completed group telephone calls between rounds 1 and 2. Readmissions with diagnoses of infection, sepsis, pneumonia, hemorrhage/hematoma, anemia, ostomy complications, acute renal failure, fluid/electrolyte disorders, or venous thromboembolism were considered associated with surgical quality and accounted for 25 521 of 39 664 readmissions (64% of readmissions; 7.5% of 340 858 index surgical procedures). The proportion of readmissions

  12. Identifying Adverse Events Using International Classification of Diseases, Tenth Revision Y Codes in Korea: A Cross-sectional Study

    Directory of Open Access Journals (Sweden)

    Minsu Ock

    2018-01-01

    Full Text Available Objectives The use of administrative data is an affordable alternative to conducting a difficult large-scale medical-record review to estimate the scale of adverse events. We identified adverse events from 2002 to 2013 on the national level in Korea, using International Classification of Diseases, tenth revision (ICD-10) Y codes. Methods We used data from the National Health Insurance Service-National Sample Cohort (NHIS-NSC). We relied on medical treatment databases to extract information on ICD-10 Y codes for each participant in the NHIS-NSC. We classified adverse events in the ICD-10 Y codes into 6 types: those related to drugs, transfusions, and fluids; those related to vaccines and immunoglobulin; those related to surgery and procedures; those related to infections; those related to devices; and others. Results Over 12 years, a total of 20 817 adverse events were identified using ICD-10 Y codes, and the estimated total adverse event rate was 0.20%. Between 2002 and 2013, the total number of such events increased by 131.3%, from 1366 in 2002 to 3159 in 2013. The total rate increased by 103.9%, from 0.17% in 2002 to 0.35% in 2013. Events related to drugs, transfusions, and fluids were the most common (19 446, 93.4%), followed by those related to surgery and procedures (1209, 5.8%) and those related to vaccines and immunoglobulin (72, 0.3%). Conclusions Based on a comparison with the results of other studies, the total adverse event rate in this study was significantly underestimated. Improving coding practices for ICD-10 Y codes is necessary to precisely monitor the scale of adverse events in Korea.

  13. Development and ITER relevant application of a user friendly interface (TEM) for use with the TMAP4 code

    International Nuclear Information System (INIS)

    Tanaka, M.R.; Fong, C.; Kalyanam, K.M.; Sood, S.K.; Delisle, M.; Natalizio, A.

    1995-01-01

    The Tritium Enclosure Model (TEM) has been developed as a user-friendly interface to facilitate the application of the previously validated, verified and ITER-approved TMAP4 code. TEM (and TMAP4) dynamically analyzes the movement of tritium through structures and between structures and adjoining enclosures. Credible ITER-relevant accident scenarios were developed and analyzed. The analyses considered scenarios with the cleanup system active or inactive, and with and without surface interactions. For the surface interaction cases, the epoxy characteristics reported in the TMAP4 User Manual were used. Typical applications of TEM are the estimation of time-dependent tritium inventories in enclosures, as well as emissions to the environment following an accidental spill into any set of enclosures connected to cleanup systems. This paper outlines the various features of TEM and reports on the application of TEM to determine environmental source terms for the ITER Fuel Cycle and Cooling Systems, under chronic and accidental tritium releases. 3 refs., 2 figs., 1 tab

  14. Benchmark of coupling codes (ALOHA, TOPLHA and GRILL3D) with ITER-relevant Lower Hybrid antenna

    International Nuclear Information System (INIS)

    Milanesio, D.; Hillairet, J.; Panaccione, L.; Maggiora, R.; Artaud, J.F.; Bae, Y.S.; Barbera, A.M.A.; Belo, J.; Berger-By, G.; Bernard, J.M.; Cara, Ph.; Cardinali, A.; Castaldo, C.; Ceccuzzi, S.; Cesario, R.; Decker, J.; Delpech, L.; Ekedahl, A.; Garcia, J.; Garibaldi, P.

    2011-01-01

    In order to assist the design of the future ITER Lower Hybrid launcher, the coupling codes ALOHA, from CEA/IRFM, TOPLHA, from Politecnico di Torino, and GRILL3D, developed by Dr. Mikhail Irzak (A.F. Ioffe Physico-Technical Institute, St. Petersburg, Russia) and operated by ENEA Frascati, have been compared on the updated (six modules with four active waveguides per module) Passive-Active Multi-junction (PAM) Lower Hybrid antenna. Both ALOHA and GRILL3D formulate the problem in terms of rectangular waveguide modes, while TOPLHA is based on a boundary-value problem formulation with a triangular cell mesh to represent the relevant waveguide surfaces. Several plasma profiles, with varying edge density and density increase, have been adopted to provide a complete description of the simulated launcher in terms of the reflection coefficient, computed at the beginning of each LH module, and of the power spectra. Good agreement is observed among the codes for all the simulated profiles.

  15. LZW-Kernel: fast kernel utilizing variable length code blocks from LZW compressors for protein sequence classification.

    Science.gov (United States)

    Filatov, Gleb; Bauwens, Bruno; Kertész-Farkas, Attila

    2018-05-07

    Bioinformatics studies often rely on similarity measures between sequence pairs, which often pose a bottleneck in large-scale sequence analysis. Here, we present a new convolutional kernel function for protein sequences called the LZW-Kernel. It is based on code words identified with the Lempel-Ziv-Welch (LZW) universal text compressor. The LZW-Kernel is an alignment-free method; it is always symmetric, is positive, always provides 1.0 for self-similarity, and can directly be used with Support Vector Machines (SVMs) in classification problems, in contrast to the normalized compression distance (NCD), which often violates the distance metric properties in practice and requires further techniques to be used with SVMs. The LZW-Kernel is a one-pass algorithm, which makes it particularly suitable for big data applications. Our experimental studies on remote protein homology detection and protein classification tasks reveal that the LZW-Kernel closely approaches the performance of the Local Alignment Kernel (LAK) and the SVM-pairwise method combined with Smith-Waterman (SW) scoring at a fraction of the time. Moreover, the LZW-Kernel outperforms the SVM-pairwise method when combined with BLAST scores, which indicates that the LZW code words might be a better basis for similarity measures than local alignment approximations found with BLAST. In addition, the LZW-Kernel outperforms n-gram based mismatch kernels, hidden Markov model based SAM and Fisher kernel, and protein family based PSI-BLAST, among others. Further advantages include the LZW-Kernel's reliance on a simple idea, its ease of implementation, and its high speed, three times faster than BLAST and several magnitudes faster than SW or LAK in our tests. LZW-Kernel is implemented as standalone C code and is a free open-source program distributed under the GPLv3 license and can be downloaded from https://github.com/kfattila/LZW-Kernel. akerteszfarkas@hse.ru. Supplementary data are available at Bioinformatics Online.
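
    The dictionary-building pass of LZW is the core ingredient. The toy sketch below extracts the code words of two sequences and scores their overlap; note that the published LZW-Kernel is a different, convolutional construction, so this Jaccard score is only illustrative:

```python
def lzw_code_words(text):
    """Collect the phrases an LZW compressor would add to its dictionary
    while scanning `text` (single characters seed the dictionary)."""
    dictionary = set(text)
    phrase = ""
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch
        else:
            dictionary.add(phrase + ch)  # a new code word
            phrase = ch
    return dictionary

def codeword_similarity(a, b):
    """Toy similarity from shared LZW code words (Jaccard overlap)."""
    wa, wb = lzw_code_words(a), lzw_code_words(b)
    return len(wa & wb) / len(wa | wb)

print(f"{codeword_similarity('MKVLAAGLLAA', 'MKVLSAGLLSA'):.2f}")
```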

  16. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
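
    As a generic illustration of one of the three techniques (principal component analysis, here on synthetic stand-in data rather than object-code features), classification accuracy can be tracked as the number of retained dimensions grows:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for feature vectors extracted from compiled code
X, y = make_classification(n_samples=400, n_features=64, n_informative=10,
                           random_state=0)

# Accuracy of a simple classifier versus the number of retained dimensions
for k in (2, 4, 8, 16, 32):
    Xk = PCA(n_components=k, random_state=0).fit_transform(X)
    acc = cross_val_score(GaussianNB(), Xk, y, cv=5).mean()
    print(f"{k:2d} components: accuracy {acc:.3f}")
```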

  17. Quality improvement of International Classification of Diseases, 9th revision, diagnosis coding in radiation oncology: single-institution prospective study at University of California, San Francisco.

    Science.gov (United States)

    Chen, Chien P; Braunstein, Steve; Mourad, Michelle; Hsu, I-Chow J; Haas-Kogan, Daphne; Roach, Mack; Fogh, Shannon E

    2015-01-01

    Accurate International Classification of Diseases (ICD) diagnosis coding is critical for patient care, billing purposes, and research endeavors. In this single-institution study, we evaluated our baseline ICD-9 (9th revision) diagnosis coding accuracy, identified the most common errors contributing to inaccurate coding, and implemented a multimodality strategy to improve radiation oncology coding. We prospectively studied ICD-9 coding accuracy in our radiation therapy-specific electronic medical record system. Baseline ICD-9 coding accuracy was obtained from a chart review of all patients treated at our institution between March and June of 2010. To improve performance, an educational session highlighted common coding errors, and a user-friendly software tool, RadOnc ICD Search, version 1.0, for coding radiation oncology-specific diagnoses was implemented. We then prospectively analyzed ICD-9 coding accuracy for all patients treated from July 2010 to June 2011, with the goal of maintaining 80% or higher coding accuracy. Data on coding accuracy were analyzed and fed back monthly to individual providers. Baseline coding accuracy for physicians was 463 of 661 (70%) cases. Only 46% of physicians had coding accuracy above 80%. The most common errors involved metastatic cases, whereby primary or secondary site ICD-9 codes were either incorrect or missing, and special procedures such as stereotactic radiosurgery cases. After implementing our project, overall coding accuracy rose to 92% (range, 86%-96%). The median accuracy for all physicians was 93% (range, 77%-100%), with only 1 attending having accuracy below 80%. Incorrect primary and secondary ICD-9 codes in metastatic cases showed the most significant improvement (10% vs 2% after intervention). Identifying common coding errors and implementing both education and systems changes led to significantly improved coding accuracy. This quality assurance project highlights the potential problem

  18. Automatic Modulation Classification of LFM and Polyphase-coded Radar Signals

    Directory of Open Access Journals (Sweden)

    S. B. S. Hanbali

    2017-12-01

    Full Text Available There are several techniques for detecting and classifying low probability of intercept radar signals, such as the Wigner distribution, Choi-Williams distribution and time-frequency rate distribution, but these distributions require high SNR. To overcome this problem, we propose a new technique for detecting and classifying linear frequency modulation (LFM) signals and polyphase-coded signals using the optimum fractional Fourier transform at low SNR. The theoretical analysis and simulation experiments demonstrate the validity and efficiency of the proposed method.
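
    The LFM case can be illustrated without the fractional-Fourier machinery: dechirping with a conjugate quadratic-phase reference compresses the chirp to a single tone, which an FFT then concentrates (the optimum fractional Fourier transform generalizes this idea to unknown chirp rates). A sketch with assumed pulse parameters:

```python
import numpy as np

fs, T, f0, bw = 1e6, 1e-3, 50e3, 200e3        # sample rate, width, start, sweep
t = np.arange(0, T, 1 / fs)
k = bw / T                                     # chirp rate in Hz/s
lfm = np.exp(1j * 2 * np.pi * (f0 * t + 0.5 * k * t**2))
noisy = lfm + 0.7 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))

# Dechirp: multiply by the conjugate quadratic phase, leaving a tone at f0
reference = np.exp(-1j * 2 * np.pi * 0.5 * k * t**2)
spectrum = np.abs(np.fft.fft(noisy * reference))
peak_hz = np.fft.fftfreq(t.size, 1 / fs)[spectrum.argmax()]
print(f"dechirped tone near {peak_hz / 1e3:.1f} kHz (expected {f0 / 1e3:.0f} kHz)")
```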

  19. Classification of quantum phases and topology of logical operators in an exactly solved model of quantum codes

    International Nuclear Information System (INIS)

    Yoshida, Beni

    2011-01-01

    Searches for possible new quantum phases and classifications of quantum phases have been central problems in physics. Yet, they are indeed challenging problems due to the computational difficulties in analyzing quantum many-body systems and the lack of a general framework for classifications. While frustration-free Hamiltonians, which appear as fixed point Hamiltonians of renormalization group transformations, may serve as representatives of quantum phases, it is still difficult to analyze and classify quantum phases of arbitrary frustration-free Hamiltonians exhaustively. Here, we address these problems by sharpening our considerations to a certain subclass of frustration-free Hamiltonians, called stabilizer Hamiltonians, which have been actively studied in quantum information science. We propose a model of frustration-free Hamiltonians which covers a large class of physically realistic stabilizer Hamiltonians, constrained to only three physical conditions; the locality of interaction terms, translation symmetries and scale symmetries, meaning that the number of ground states does not grow with the system size. We show that quantum phases arising in two-dimensional models can be classified exactly through certain quantum coding theoretical operators, called logical operators, by proving that two models with topologically distinct shapes of logical operators are always separated by quantum phase transitions.

  20. Population-based evaluation of a suggested anatomic and clinical classification of congenital heart defects based on the International Paediatric and Congenital Cardiac Code

    Directory of Open Access Journals (Sweden)

    Goffinet François

    2011-10-01

    Full Text Available Abstract Background Classification of the overall spectrum of congenital heart defects (CHD) has always been challenging, in part because of the diversity of the cardiac phenotypes, but also because of the oft-complex associations. The purpose of our study was to establish a comprehensive and easy-to-use classification of CHD for clinical and epidemiological studies based on the long list of the International Paediatric and Congenital Cardiac Code (IPCCC). Methods We coded each individual malformation using six-digit codes from the long list of IPCCC. We then regrouped all lesions into 10 categories and 23 subcategories according to a multi-dimensional approach encompassing anatomic, diagnostic and therapeutic criteria. This anatomic and clinical classification of congenital heart disease (ACC-CHD) was then applied to data acquired from a population-based cohort of patients with CHD in France, made up of 2867 cases (82% live births, 1.8% stillbirths and 16.2% pregnancy terminations). Results The majority of cases (79.5%) could be identified with a single IPCCC code. The category "Heterotaxy, including isomerism and mirror-imagery" was the only one that typically required more than one code for identification of cases. The two largest categories were "ventricular septal defects" (52%) and "anomalies of the outflow tracts and arterial valves" (20% of cases). Conclusion Our proposed classification is not new, but rather a regrouping of the known spectrum of CHD into a manageable number of categories based on anatomic and clinical criteria. The classification is designed to use the code numbers of the long list of IPCCC but can accommodate ICD-10 codes. Its exhaustiveness, simplicity, and anatomic basis make it useful for clinical and epidemiologic studies, including those aimed at assessment of risk factors and outcomes.

  1. Five-way smoking status classification using text hot-spot identification and error-correcting output codes.

    Science.gov (United States)

    Cohen, Aaron M

    2008-01-01

    We participated in the i2b2 smoking status classification challenge task. The purpose of this task was to evaluate the ability of systems to automatically identify patient smoking status from discharge summaries. Our submission included several techniques that we compared and studied, including hot-spot identification, zero-vector filtering, inverse class frequency weighting, error-correcting output codes, and post-processing rules. We evaluated our approaches using the same methods as the i2b2 task organizers, using micro- and macro-averaged F1 as the primary performance metric. Our best performing system achieved a micro-F1 of 0.9000 on the test collection, equivalent to the best performing system submitted to the i2b2 challenge. Hot-spot identification, zero-vector filtering, classifier weighting, and error correcting output coding contributed additively to increased performance, with hot-spot identification having by far the largest positive effect. High performance on automatic identification of patient smoking status from discharge summaries is achievable with the efficient and straightforward machine learning techniques studied here.
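
    Error-correcting output codes reduce the five-way problem to several binary ones: each class receives a binary code word, one binary classifier is trained per bit, and prediction picks the class whose code word is nearest in Hamming distance. A self-contained sketch (toy corpus and hand-picked code book, not the authors' system):

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Tiny stand-in corpus; the real task used de-identified discharge summaries
docs = ["patient denies tobacco use", "quit smoking two years ago",
        "smokes one pack per day", "never smoked cigarettes",
        "smoking status not documented", "current smoker counseled to quit",
        "former smoker quit in 2001", "no history of smoking",
        "tobacco history unknown", "active smoker one pack daily"]
labels = ["non-smoker", "past", "current", "never", "unknown",
          "current", "past", "never", "unknown", "current"]

classes = sorted(set(labels))
# Hand-picked 6-bit code words: rows distinct, no constant column
code_book = np.array([[0, 0, 1, 1, 0, 1],
                      [0, 1, 0, 1, 1, 0],
                      [1, 0, 0, 1, 0, 1],
                      [1, 1, 0, 0, 1, 0],
                      [1, 0, 1, 0, 1, 1]])

vec = TfidfVectorizer().fit(docs)
X = vec.transform(docs)
y = np.array([classes.index(label) for label in labels])

# One binary classifier per code bit
bit_clfs = [LinearSVC().fit(X, code_book[y, b]) for b in range(code_book.shape[1])]

def predict(texts):
    bits = np.column_stack([c.predict(vec.transform(texts)) for c in bit_clfs])
    dists = (bits[:, None, :] != code_book[None, :, :]).sum(axis=2)
    return [classes[i] for i in dists.argmin(axis=1)]

print(predict(["patient quit smoking last year"]))
```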

  2. Coding and classification in drug statistics – From national to global application

    Directory of Open Access Journals (Sweden)

    Marit Rønning

    2009-11-01

    Full Text Available SUMMARY The Anatomical Therapeutic Chemical (ATC) classification system and the defined daily dose (DDD) were developed in Norway in the early seventies. The creation of the ATC/DDD methodology was an important basis for presenting drug utilisation statistics in a sensible way. In 1977, Norway was also the first country to publish national drug utilisation statistics from wholesalers on an annual basis. The combination of these activities in Norway in the seventies made us a pioneer country in the area of drug utilisation research. Over the years, the use of the ATC/DDD methodology has gradually increased in countries outside Norway. Since 1996, the methodology has been recommended by WHO for use in international drug utilisation studies. The WHO Collaborating Centre for Drug Statistics Methodology in Oslo handles the maintenance and development of the ATC/DDD system. The Centre is now responsible for the global co-ordination. After nearly 30 years of experience with ATC/DDD, the methodology has demonstrated its suitability in drug use research. The main challenge in the coming years is to educate the users worldwide in how to use the methodology properly.

  3. Code Sharing and Collaboration: Experiences From the Scientist's Expert Assistant Project and Their Relevance to the Virtual Observatory

    Science.gov (United States)

    Korathkar, Anuradha; Grosvenor, Sandy; Jones, Jeremy; Li, Connie; Mackey, Jennifer; Neher, Ken; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory- and system-independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing among groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for the SIRTF (Space Infrared Telescope Facility) planning tools and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA--both successes and failures, and offer some lessons learned that might promote further successes in collaboration and re-use.

  4. Administrative database concerns: accuracy of International Classification of Diseases, Ninth Revision coding is poor for preoperative anemia in patients undergoing spinal fusion.

    Science.gov (United States)

    Golinvaux, Nicholas S; Bohl, Daniel D; Basques, Bryce A; Grauer, Jonathan N

    2014-11-15

    Cross-sectional study. To objectively evaluate the ability of International Classification of Diseases, Ninth Revision (ICD-9) codes, which are used as the foundation for administratively coded national databases, to identify preoperative anemia in patients undergoing spinal fusion. National database research in spine surgery continues to rise. However, the validity of studies based on administratively coded data, such as the Nationwide Inpatient Sample, is dependent on the accuracy of ICD-9 coding. Such coding has previously been found to have poor sensitivity to conditions such as obesity and infection. A cross-sectional study was performed at an academic medical center. Hospital-reported anemia ICD-9 codes (those used for administratively coded databases) were directly compared with the chart-documented preoperative hematocrits (true laboratory values). A patient was deemed to have preoperative anemia if the preoperative hematocrit was less than the lower end of the normal range (36.0% for females and 41.0% for males). The study included 260 patients. Of these, 37 patients (14.2%) were anemic; however, only 10 patients (3.8%) received an "anemia" ICD-9 code. Of the 10 patients coded as anemic, 7 were anemic by definition, whereas 3 were not, and thus were miscoded. This equates to an ICD-9 code sensitivity of 0.19, with a specificity of 0.99, and positive and negative predictive values of 0.70 and 0.88, respectively. This study uses preoperative anemia to demonstrate the potential inaccuracies of ICD-9 coding. These results have implications for publications using databases that are compiled from ICD-9 coding data. Furthermore, the findings of the current investigation raise concerns regarding the accuracy of additional comorbidities. Although administrative databases are powerful resources that provide large sample sizes, it is crucial that we further consider the quality of the data source relative to its intended purpose.
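
    The reported validity statistics can be reproduced directly from the counts given in the abstract (260 patients, 37 anemic by hematocrit, 10 coded as anemic, 7 of them correctly), for example in Python:

      tp = 7                    # coded anemic and truly anemic
      fp = 3                    # coded anemic but not anemic
      fn = 37 - tp              # anemic by hematocrit but never coded
      tn = 260 - tp - fp - fn   # neither anemic nor coded

      sensitivity = tp / (tp + fn)   # 7/37    -> 0.19
      specificity = tn / (tn + fp)   # 220/223 -> 0.99
      ppv = tp / (tp + fp)           # 7/10    -> 0.70
      npv = tn / (tn + fn)           # 220/250 -> 0.88
      print(sensitivity, specificity, ppv, npv)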

  5. Revision, uptake and coding issues related to the open access Orchard Sports Injury Classification System (OSICS versions 8, 9 and 10.1

    Directory of Open Access Journals (Sweden)

    John Orchard

    2010-10-01

    Full Text Available John Orchard (Sports Medicine at Sydney University, Sydney, NSW, Australia), Katherine Rae (Sports Medicine at Sydney University), John Brooks (Rugby Football Union, Twickenham, England, UK), Martin Hägglund (Department of Medical and Health Sciences, Linköping University, Linköping, Sweden), Lluis Til (FC Barcelona, Barcelona, Catalonia, Spain), David Wales (Arsenal FC, Highbury, England, UK), Tim Wood (Tennis Australia, Melbourne, Vic, Australia). Abstract: The Orchard Sports Injury Classification System (OSICS) is one of the world’s most commonly used systems for coding injury diagnoses in sports injury surveillance systems. Its major strengths are that it has wide usage, has codes specific to sports medicine and that it is free to use. Literature searches and stakeholder consultations were made to assess the uptake of OSICS and to develop new versions. OSICS was commonly used in the sports of football (soccer), Australian football, rugby union, cricket and tennis. It is referenced in international papers in three sports and used in four commercially available computerised injury management systems. Suggested injury categories for the major sports are presented. New versions OSICS 9 (three digit codes) and OSICS 10.1 (four digit codes) are presented. OSICS is a potentially helpful component of a comprehensive sports injury surveillance system, but many other components are required. Choices made in developing these components should ideally be agreed upon by groups of researchers in consensus statements. Keywords: sports injury classification, epidemiology, surveillance, coding

  6. Possible Relevance of Receptor-Receptor Interactions between Viral- and Host-Coded Receptors for Viral-Induced Disease

    Directory of Open Access Journals (Sweden)

    Luigi F. Agnati

    2007-01-01

    Full Text Available It has been demonstrated that some viruses, such as the cytomegalovirus, code for G-protein coupled receptors not only to elude the immune system, but also to redirect cellular signaling in the receptor networks of the host cells. In view of the existence of receptor-receptor interactions, the hypothesis is introduced that these viral-coded receptors not only operate as constitutively active monomers, but can also affect the function of other receptors by interacting with receptors of the host cell. Furthermore, it is suggested that viruses could also insert not single receptors (monomers), but clusters of receptors (receptor mosaics), altering the cell metabolism in a profound way. The prevention of viral receptor-induced changes in host receptor networks may give rise to novel antiviral drugs that counteract viral-induced disease.

  7. The Use of System Codes in Scaling Studies: Relevant Techniques for Qualifying NPP Nodalizations for Particular Scenarios

    Directory of Open Access Journals (Sweden)

    V. Martinez-Quiroga

    2014-01-01

    Full Text Available System codes along with necessary nodalizations are valuable tools for thermal hydraulic safety analysis. Qualifying both codes and nodalizations is an essential step prior to their use in any significant study involving code calculations. Since most existing experimental data come from tests performed on the small scale, any qualification process must address scale considerations. This paper describes the methodology developed at the Technical University of Catalonia in order to contribute to the qualification of Nuclear Power Plant nodalizations by means of scale disquisitions. The techniques that are presented include the so-called Kv-scaled calculation approach as well as the use of “hybrid nodalizations” and “scaled-up nodalizations.” These methods have proved very helpful in producing the required qualification and in promoting further improvements in nodalization. The paper explains both the concepts and the general guidelines of the method, while an accompanying paper will complete the presentation of the methodology as well as show the results of the analysis of scaling discrepancies that appeared during the posttest simulations of PKL-LSTF counterpart tests performed in the PKL-III and ROSA-2 OECD/NEA Projects. Both articles together provide a complete description of the methodology that has been developed in the framework of the use of NPP nodalizations in support of plant operation and control.

  8. Development of a simple computer code to obtain relevant data on H2 and CO combustion in severe accidents and to aid in PSA-2 assessments

    International Nuclear Information System (INIS)

    Robledo, F.; Martin-Valdepenas, J.M.; Jimenez, M.A.; Martin-Fuertes, F.

    2007-01-01

    By following Consejo de Seguridad Nuclear (CSN) requirements, all of the Spanish NPPs performed plant-specific PSA level 2 studies and implemented Severe Accident Management Guidelines during the first years of this century. CSN and contractors made an independent detailed review of these PSA level 2 studies. This independent review included the performance of plant-specific calculations by using the MELCOR code and some other stand-alone codes, and the calculation of the fission product release frequencies for each plant. One of the aspects treated in detail by CSN evaluations was the calculation of the containment failure probability due to the burn of combustible gases generated during a severe accident. It was shown that it would be useful to have a fast running code with the capability to provide the most relevant data concerning H2 and CO combustion. Therefore, the Polytechnic University of Madrid (UPM) developed the CPPC code for the CSN. This stand-alone module makes fast calculations of maximum static pressures in the containment building generated by H2 and CO combustion in severe accidents, considering well-mixed atmospheres, and includes the most recent advances and developments in the field of H2 and CO combustion. Code input is simple: mass of H2 and CO, initial environmental conditions inside the containment before the combustion, and simple geometric data, such as the volume of the building enclosing the combustible gases. The code calculates the containment temperature assuming a steam-saturated atmosphere and provides the following output: - Combustion completeness (CC); - Adiabatic and isochoric combustion pressure (p_AICC); - Chapman-Jouguet pressure (p_CJ); - Chapman-Jouguet reflected pressure (p_CJrefl). When the combustion regime results in dynamic pressure loads, the CPPC code calculates the equivalent static pressure (effective pressure, p_eff) by modeling the containment structure as a simple harmonic oscillator. Additionally, the code
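
    For orientation, the adiabatic isochoric complete combustion (AICC) pressure mentioned above can be approximated in a few lines under strong simplifications: ideal gas, complete combustion of a lean H2-air mixture, one constant average heat capacity, and no dissociation or steam effects. This is an illustrative estimate, not the CPPC model, and the property values are rough assumptions:

      DU_H2 = 242e3    # J released per mol H2 burned (approximate value)
      CV_AVG = 28.0    # J/(mol*K), rough average for hot combustion products

      def p_aicc(p0_bar, t0_k, x_h2):
          """AICC pressure for a lean H2-air mixture (H2 + 0.5 O2 -> H2O)."""
          n0 = 1.0                          # basis: 1 mol of initial mixture
          n_h2 = x_h2 * n0
          q = n_h2 * DU_H2                  # energy released at constant volume
          n1 = n0 - 0.5 * n_h2              # combustion reduces the mole count
          t1 = t0_k + q / (n1 * CV_AVG)     # adiabatic isochoric temperature
          return p0_bar * (n1 * t1) / (n0 * t0_k)

      # 10 vol% H2 in air at 1 bar and 298 K gives roughly 3.9 bar:
      print(round(p_aicc(1.0, 298.0, 0.10), 1), "bar")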

  9. Positive Predictive Values of International Classification of Diseases, 10th Revision Coding Algorithms to Identify Patients With Autosomal Dominant Polycystic Kidney Disease

    Directory of Open Access Journals (Sweden)

    Vinusha Kalatharan

    2016-12-01

    Full Text Available Background: International Classification of Diseases, 10th Revision (ICD-10) codes for autosomal dominant polycystic kidney disease (ADPKD) are used within several administrative health care databases. It is unknown whether these codes identify patients who meet strict clinical criteria for ADPKD. Objective: The objectives of this study are (1) to determine whether different ICD-10 coding algorithms identify adult patients who meet strict clinical criteria for ADPKD as assessed through medical chart review and (2) to assess the number of patients identified with different ADPKD coding algorithms in Ontario. Design: Validation study of health care database codes, and prevalence. Setting: Ontario, Canada. Patients: For the chart review, 201 adult patients with hospital encounters between April 1, 2002, and March 31, 2014, assigned either ICD-10 code Q61.2 or Q61.3. Measurements: This study measured the positive predictive value of the ICD-10 coding algorithms and the number of Ontarians identified with different coding algorithms. Methods: We manually reviewed a random sample of medical charts in London, Ontario, Canada, and determined whether or not ADPKD was present according to strict clinical criteria. Results: The presence of either ICD-10 code Q61.2 or Q61.3 in a hospital encounter had a positive predictive value of 85% (95% confidence interval [CI], 79%-89%) and identified 2981 Ontarians (0.02% of the Ontario adult population). The presence of ICD-10 code Q61.2 in a hospital encounter had a positive predictive value of 97% (95% CI, 86%-100%) and identified 394 adults in Ontario (0.003% of the Ontario adult population). Limitations: (1) We could not calculate other measures of validity; (2) the coding algorithms do not identify patients without hospital encounters; and (3) coding practices may differ between hospitals. Conclusions: Most patients with ICD-10 code Q61.2 or Q61.3 assigned during their hospital encounters have ADPKD according to the clinical

  10. Identification of aspects of functioning, disability and health relevant to patients experiencing vertigo: a qualitative study using the international classification of functioning, disability and health

    Science.gov (United States)

    2012-01-01

    Purpose: Aims of this study were to identify aspects of functioning and health relevant to patients with vertigo expressed by ICF categories and to explore the potential of the ICF to describe the patient perspective in vertigo. Methods: We conducted a series of qualitative semi-structured face-to-face interviews using a descriptive approach. Data were analyzed using the meaning condensation procedure and then linked to categories of the International Classification of Functioning, Disability and Health (ICF). Results: From May to July 2010, 12 interviews were carried out until saturation was reached. Four hundred and seventy-one single concepts were extracted, which were linked to 142 different ICF categories. Of those, 40 belonged to the component body functions, 62 to the component activities and participation, and 40 to the component environmental factors. Besides the most prominent aspect, “dizziness”, most participants reported problems within “Emotional functions (b152)”, problems related to mobility, and problems carrying out the daily routine. Almost all participants reported “Immediate family (e310)” as a relevant modifying environmental factor. Conclusions: From the patients’ perspective, vertigo has an impact on multifaceted aspects of functioning and disability, mainly body functions and activities and participation. Modifying contextual factors have to be taken into account to cover the complex impact of the health condition of vertigo on individuals’ daily life. The results of this study will contribute to developing standards for the measurement of functioning, disability and health relevant for patients suffering from vertigo. PMID:22738067

  11. Relevance and Effectiveness of the WHO Global Code Practice on the International Recruitment of Health Personnel--Ethical and Systems Perspectives.

    Science.gov (United States)

    Brugha, Ruairí; Crowe, Sophie

    2015-05-20

    The relevance and effectiveness of the World Health Organization's (WHO's) Global Code of Practice on the International Recruitment of Health Personnel is being reviewed in 2015. The Code, which is a set of ethical norms and principles adopted by the World Health Assembly (WHA) in 2010, urges member states to train and retain the health personnel they need, thereby limiting demand for international migration, especially from the under-staffed health systems in low- and middle-income countries. Most countries failed to submit a first report in 2012 on implementation of the Code, including those source countries whose health systems are most under threat from the recruitment of their doctors and nurses, often to work in 4 major destination countries: the United States, United Kingdom, Canada and Australia. Political commitment by source country Ministers of Health needs to have been achieved at the May 2015 WHA to ensure better reporting by these countries on Code implementation for it to be effective. This paper uses ethics and health systems perspectives to analyse some of the drivers of international recruitment. The balance of competing ethics principles, which are contained in the Code's articles, reflects a tension that was evident during the drafting of the Code between 2007 and 2010. In 2007-2008, the right of health personnel to migrate was seen as a preeminent principle by US representatives on the Global Council which co-drafted the Code. Consensus on how to balance competing ethical principles--giving due recognition on the one hand to the obligations of health workers to the countries that trained them and the need for distributive justice given the global inequities of health workforce distribution in relation to need, and the right to migrate on the other hand--was only possible after President Obama took office in January 2009. It is in the interests of all countries to implement the Global Code and not just those that are losing their health

  12. Positive predictive values of the International Classification of Disease, 10th edition diagnoses codes for diverticular disease in the Danish National Registry of Patients

    Directory of Open Access Journals (Sweden)

    Rune Erichsen

    2010-10-01

    Full Text Available Rune Erichsen (Department of Clinical Epidemiology, Aarhus University Hospital, Denmark), Lisa Strate (Division of Gastroenterology, University of Washington, Seattle, WA, USA), Henrik Toft Sørensen (Department of Clinical Epidemiology, Aarhus University Hospital, Denmark), John A Baron (Departments of Medicine and of Community and Family Medicine, Dartmouth Medical School, NH, USA). Objective: To investigate the accuracy of diagnostic coding for diverticular disease in the Danish National Registry of Patients (NRP). Study design and setting: At Aalborg Hospital, Denmark, with a catchment area of 640,000 inhabitants, we identified 100 patients recorded in the NRP with a diagnosis of diverticular disease (International Classification of Disease codes, 10th revision [ICD-10] K572–K579) during the 1999–2008 period. We assessed the positive predictive value (PPV) as a measure of the accuracy of discharge codes for diverticular disease using information from discharge abstracts and outpatient notes as the reference standard. Results: Of the 100 patients coded with diverticular disease, 49 had complicated diverticular disease, whereas 51 had uncomplicated diverticulosis. For the overall diagnosis of diverticular disease (K57), the PPV was 0.98 (95% confidence interval [CI]: 0.93, 0.99). For the more detailed subgroups of diagnosis indicating the presence or absence of complications (K573–K579), the PPVs ranged from 0.67 (95% CI: 0.09, 0.99) to 0.92 (95% CI: 0.52, 1.00). The diagnosis codes did not allow accurate identification of uncomplicated disease or any specific complication. However, the combined ICD-10 codes K572, K574, and K578 had a PPV of 0.91 (95% CI: 0.71, 0.99) for any complication. Conclusion: The diagnosis codes in the NRP can be used to identify patients with diverticular disease in general; however, they do not accurately discern patients with uncomplicated diverticulosis or with specific diverticular complications. Keywords: diverticulum, colon, diverticulitis, validation studies

  13. Classification of high-resolution multi-swath hyperspectral data using Landsat 8 surface reflectance data as a calibration target and a novel histogram based unsupervised classification technique to determine natural classes from biophysically relevant fit parameters

    Science.gov (United States)

    McCann, C.; Repasky, K. S.; Morin, M.; Lawrence, R. L.; Powell, S. L.

    2016-12-01

    Compact, cost-effective, flight-based hyperspectral imaging systems can provide scientifically relevant data over large areas for a variety of applications such as ecosystem studies, precision agriculture, and land management. To fully realize this capability, unsupervised classification techniques based on radiometrically-calibrated data that cluster based on biophysical similarity rather than simply spectral similarity are needed. An automated technique to produce high-resolution, large-area, radiometrically-calibrated hyperspectral data sets based on the Landsat surface reflectance data product as a calibration target was developed and applied to three subsequent years of data covering approximately 1850 hectares. The radiometrically-calibrated data allow inter-comparison of the temporal series. Advantages of the radiometric calibration technique include the need for minimal site access, no ancillary instrumentation, and automated processing. Fitting the reflectance spectra of each pixel using a set of biophysically relevant basis functions reduces the data from 80 spectral bands to 9 parameters, providing noise reduction and data compression. Examination of histograms of these parameters allows for determination of natural splitting into biophysically similar clusters. This method creates clusters that are similar in terms of biophysical parameters, not simply spectral proximity. Furthermore, this method can be applied to other data sets, such as urban scenes, by developing other physically meaningful basis functions. The ability to use hyperspectral imaging for a variety of important applications requires the development of data processing techniques that can be automated. The radiometric calibration combined with the histogram-based unsupervised classification technique presented here provides one potential avenue for managing the big data associated with hyperspectral imaging.
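
    The processing chain described here (per-pixel basis-function fitting, then histogram-based splitting) can be sketched schematically as follows; this Python stand-in is not the authors' code, and the basis functions, band count, and random stand-in data are assumptions:

      import numpy as np

      def gaussian(wl, center, width):
          return np.exp(-0.5 * ((wl - center) / width) ** 2)

      wavelengths = np.linspace(400, 1000, 80)             # nm, 80 bands
      cube = np.random.default_rng(0).random((5000, 80))   # stand-in reflectances

      basis = np.stack([                         # small illustrative basis set
          np.ones_like(wavelengths),             # albedo offset
          wavelengths / wavelengths.max(),       # broad spectral slope
          gaussian(wavelengths, 680, 30),        # chlorophyll absorption feature
          gaussian(wavelengths, 970, 40),        # water absorption feature
      ], axis=1)                                 # shape (80, 4)

      # least-squares fit per pixel: 80 bands -> 4 fit parameters
      coeffs, *_ = np.linalg.lstsq(basis, cube.T, rcond=None)

      # split one parameter at histogram valleys (candidate class boundaries)
      counts, edges = np.histogram(coeffs[2], bins=64)
      valleys = [i for i in range(1, 63)
                 if counts[i] < counts[i - 1] and counts[i] < counts[i + 1]]
      labels = np.digitize(coeffs[2], edges[valleys])
      print(len(valleys), "thresholds ->", labels.max() + 1, "clusters")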

  14. Relevance and Effectiveness of the WHO Global Code Practice on the International Recruitment of Health Personnel – Ethical and Systems Perspectives

    Directory of Open Access Journals (Sweden)

    Ruairi Brugha

    2015-06-01

    Full Text Available The relevance and effectiveness of the World Health Organization’s (WHO’s) Global Code of Practice on the International Recruitment of Health Personnel is being reviewed in 2015. The Code, which is a set of ethical norms and principles adopted by the World Health Assembly (WHA) in 2010, urges member states to train and retain the health personnel they need, thereby limiting demand for international migration, especially from the under-staffed health systems in low- and middle-income countries. Most countries failed to submit a first report in 2012 on implementation of the Code, including those source countries whose health systems are most under threat from the recruitment of their doctors and nurses, often to work in 4 major destination countries: the United States, United Kingdom, Canada and Australia. Political commitment by source country Ministers of Health needs to have been achieved at the May 2015 WHA to ensure better reporting by these countries on Code implementation for it to be effective. This paper uses ethics and health systems perspectives to analyse some of the drivers of international recruitment. The balance of competing ethics principles, which are contained in the Code’s articles, reflects a tension that was evident during the drafting of the Code between 2007 and 2010. In 2007-2008, the right of health personnel to migrate was seen as a preeminent principle by US representatives on the Global Council which co-drafted the Code. Consensus on how to balance competing ethical principles – giving due recognition on the one hand to the obligations of health workers to the countries that trained them and the need for distributive justice given the global inequities of health workforce distribution in relation to need, and the right to migrate on the other hand – was only possible after President Obama took office in January 2009. It is in the interests of all countries to implement the Global Code and not just those that

  15. A New Coding System for Metabolic Disorders Demonstrates Gaps in the International Disease Classifications ICD-10 and SNOMED-CT, Which Can Be Barriers to Genotype-Phenotype Data Sharing

    NARCIS (Netherlands)

    Sollie, Annet; Sijmons, Rolf H.; Lindhout, Dick; van der Ploeg, Ans T.; Gozalbo, M. Estela Rubio; Smit, G. Peter A.; Verheijen, Frans; Waterham, Hans R.; van Weely, Sonja; Wijburg, Frits A.; Wijburg, Rudolph; Visser, Gepke

    Data sharing is essential for a better understanding of genetic disorders. Good phenotype coding plays a key role in this process. Unfortunately, the two most widely used coding systems in medicine, ICD-10 and SNOMED-CT, lack information necessary for the detailed classification and annotation of

  16. Revision, uptake and coding issues related to the open access Orchard Sports Injury Classification System (OSICS) versions 8, 9 and 10.1

    Science.gov (United States)

    Orchard, John; Rae, Katherine; Brooks, John; Hägglund, Martin; Til, Lluis; Wales, David; Wood, Tim

    2010-01-01

    The Orchard Sports Injury Classification System (OSICS) is one of the world’s most commonly used systems for coding injury diagnoses in sports injury surveillance systems. Its major strengths are that it has wide usage, has codes specific to sports medicine and that it is free to use. Literature searches and stakeholder consultations were made to assess the uptake of OSICS and to develop new versions. OSICS was commonly used in the sports of football (soccer), Australian football, rugby union, cricket and tennis. It is referenced in international papers in three sports and used in four commercially available computerised injury management systems. Suggested injury categories for the major sports are presented. New versions OSICS 9 (three digit codes) and OSICS 10.1 (four digit codes) are presented. OSICS is a potentially helpful component of a comprehensive sports injury surveillance system, but many other components are required. Choices made in developing these components should ideally be agreed upon by groups of researchers in consensus statements. PMID:24198559

  17. Nomenclature for congenital and paediatric cardiac disease: the International Paediatric and Congenital Cardiac Code (IPCCC) and the Eleventh Iteration of the International Classification of Diseases (ICD-11).

    Science.gov (United States)

    Franklin, Rodney C G; Béland, Marie J; Colan, Steven D; Walters, Henry L; Aiello, Vera D; Anderson, Robert H; Bailliard, Frédérique; Boris, Jeffrey R; Cohen, Meryl S; Gaynor, J William; Guleserian, Kristine J; Houyel, Lucile; Jacobs, Marshall L; Juraszek, Amy L; Krogmann, Otto N; Kurosawa, Hiromi; Lopez, Leo; Maruszewski, Bohdan J; St Louis, James D; Seslar, Stephen P; Srivastava, Shubhika; Stellin, Giovanni; Tchervenkov, Christo I; Weinberg, Paul M; Jacobs, Jeffrey P

    2017-12-01

    An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many "short list" versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various "short lists". In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the "short list" for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses

  18. Use of International Classification of Functioning, Disability and Health (ICF) to describe patient-reported disability in multiple sclerosis and identification of relevant environmental factors.

    Science.gov (United States)

    Khan, Fary; Pallant, Julie F

    2007-01-01

    To use the International Classification of Functioning, Disability and Health (ICF) to describe patient-reported disability in multiple sclerosis and identify relevant environmental factors. Cross-sectional survey of 101 participants in the community. Their multiple sclerosis-related problems were linked with ICF categories (second level) using a checklist, consensus between health professionals and the "linking rules". The impact of multiple sclerosis on health areas corresponding to 48 ICF categories was also assessed. A total of 170 ICF categories were identified (participants: mean age 49 years; 72 were female). The average number of problems reported was 18. The categories comprised 48 (42%) for body functions, 16 (34%) for body structures, 68 (58%) for activities and participation and 38 (51%) for environmental factors. Extreme impact in health areas corresponding to ICF categories for activities and participation was reported for mobility, work, everyday home activities, and community and social activities, while the environmental factors (barriers) included products for mobility, attitudes of the extended family, and restrictions in accessing social security and health resources. This study is a first step in the use of the ICF in persons with multiple sclerosis and towards development of the ICF Core Set for multiple sclerosis from a broader international perspective.

  19. Classification of multispectral or hyperspectral satellite imagery using clustering of sparse approximations on sparse representations in learned dictionaries obtained using efficient convolutional sparse coding

    Science.gov (United States)

    Moody, Daniela; Wohlberg, Brendt

    2018-01-02

    An approach for land cover classification, seasonal and yearly change detection and monitoring, and identification of changes in man-made features may use a clustering of sparse approximations (CoSA) on sparse representations in learned dictionaries. The learned dictionaries may be derived using efficient convolutional sparse coding to build multispectral or hyperspectral, multiresolution dictionaries that are adapted to regional satellite image data. Sparse image representations of images over the learned dictionaries may be used to perform unsupervised k-means clustering into land cover categories. The clustering process behaves as a classifier in detecting real variability. This approach may combine spectral and spatial textural characteristics to detect geologic, vegetative, hydrologic, and man-made features, as well as changes in these features over time.
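
    A rough, non-convolutional stand-in for this pipeline can be written with off-the-shelf tools: learn a patch dictionary, sparse-code each patch, then k-means the codes into land cover clusters. The patch size, dictionary size, sparsity level, and random stand-in data below are invented placeholders:

      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning
      from sklearn.cluster import KMeans

      # e.g. 5x5-pixel patches x 3 bands, flattened (stand-in for imagery)
      patches = np.random.default_rng(0).random((2000, 75))

      dico = MiniBatchDictionaryLearning(
          n_components=64,                # number of dictionary atoms
          transform_algorithm="omp",      # sparse coding via orthogonal matching pursuit
          transform_n_nonzero_coefs=5,    # sparsity level per patch
          random_state=0,
      )
      codes = dico.fit(patches).transform(patches)   # (2000, 64) sparse features

      labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(codes)
      print(np.bincount(labels))          # patches per land cover cluster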

  20. Decoding the encoding of functional brain networks: An fMRI classification comparison of non-negative matrix factorization (NMF), independent component analysis (ICA), and sparse coding algorithms.

    Science.gov (United States)

    Xie, Jianwen; Douglas, Pamela K; Wu, Ying Nian; Brody, Arthur L; Anderson, Ariana E

    2017-04-15

    Brain networks in fMRI are typically identified using spatial independent component analysis (ICA), yet other mathematical constraints provide alternate biologically-plausible frameworks for generating brain networks. Non-negative matrix factorization (NMF) would suppress negative BOLD signal by enforcing positivity. Spatial sparse coding algorithms (L1 Regularized Learning and K-SVD) would impose local specialization and a discouragement of multitasking, where the total observed activity in a single voxel originates from a restricted number of possible brain networks. The assumptions of independence, positivity, and sparsity to encode task-related brain networks are compared; the resulting brain networks within scan for different constraints are used as basis functions to encode observed functional activity. These encodings are then decoded using machine learning, by using the time series weights to predict within scan whether a subject is viewing a video, listening to an audio cue, or at rest, in 304 fMRI scans from 51 subjects. The sparse coding algorithm of L1 Regularized Learning outperformed 4 variations of ICA and the other sparse coding algorithms. Holding constant the effect of the extraction algorithm, encodings using sparser spatial networks (containing more zero-valued voxels) had higher classification accuracy. The success of sparse coding algorithms suggests that algorithms which enforce sparsity, discourage multitasking, and promote local specialization may better capture the underlying source processes than those which allow inexhaustible local processes such as ICA. Negative BOLD signal may capture task-related activations. Copyright © 2017 Elsevier B.V. All rights reserved.
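
    The encode-then-decode comparison can be sketched as follows; the data here are random stand-ins (so accuracy will sit near chance), and the component counts and classifier are assumptions rather than the authors' choices:

      import numpy as np
      from sklearn.decomposition import FastICA, NMF
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = np.abs(rng.normal(size=(300, 5000)))  # 300 timepoints x 5000 voxels
      y = rng.integers(0, 3, size=300)          # video / audio / rest labels

      for name, model in [("ICA", FastICA(n_components=20, random_state=0)),
                          ("NMF", NMF(n_components=20, init="nndsvda", random_state=0))]:
          weights = model.fit_transform(X)      # time series weights per network
          acc = cross_val_score(LogisticRegression(max_iter=1000), weights, y, cv=5)
          print(name, "mean decoding accuracy:", round(acc.mean(), 2))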

  1. Correlation between patients' reasons for encounters/health problems and population density in Japan: a systematic review of observational studies coded by the International Classification of Health Problems in Primary Care (ICHPPC) and the International Classification of Primary care (ICPC).

    Science.gov (United States)

    Kaneko, Makoto; Ohta, Ryuichi; Nago, Naoki; Fukushi, Motoharu; Matsushima, Masato

    2017-09-13

    The Japanese health care system has yet to establish structured training for primary care physicians; therefore, physicians who received internal medicine-based training continue to play a principal role in the primary care setting. To promote the development of a more efficient primary health care system, assessment of its current status with regard to the spectrum of patients' reasons for encounters (RFEs) and health problems is an important step. Recognizing the proportions of patients' RFEs and health problems which are not generally covered by an internist can provide valuable information to promote the development of a primary care physician-centered system. We conducted a systematic review in which we searched six databases (PubMed, the Cochrane Library, Google Scholar, Ichushi-Web, JDreamIII and CiNii) for observational studies in Japan coded by the International Classification of Health Problems in Primary Care (ICHPPC) and the International Classification of Primary Care (ICPC) up to March 2015. We employed population density as an index of accessibility. We calculated Spearman's rank correlation coefficient to examine the correlation between the proportion of "non-internal medicine-related" RFEs and health problems in each study area and the population density. We found 17 studies with diverse designs and settings. Among these studies, "non-internal medicine-related" RFEs, which were not thought to be covered by internists, ranged from about 4% to 40%. In addition, "non-internal medicine-related" health problems ranged from about 10% to 40%. However, no significant correlation was found between population density and the proportion of "non-internal medicine-related" RFEs and health problems. This is the first systematic review on RFEs and health problems coded by ICHPPC and ICPC undertaken to reveal the diversity of health problems in Japanese primary care. These results suggest that primary care physicians in some rural areas of Japan

  2. Relevant areas of functioning in patients with adolescent idiopathic scoliosis on the International Classification of Functioning, Disability and Health: The patients' perspective.

    Science.gov (United States)

    Du, Chunping; Yu, Jiadan; Zhang, Jiaqi; Jiang, Jiaojiao; Lai, Huabin; Liu, Wei; Liu, Yang; Li, Hao; Wang, Pu

    2016-10-12

    To investigate relevant aspects of functioning and disability, and environmental factors in people with adolescent idiopathic scoliosis according to patients' self-reports based on the International Classification of Functioning, Disability and Health for Children and Youth (ICF-CY). Multicentre, empirical, cross-sectional study. Four departments of orthopaedics in 4 hospitals, and 5 departments of rehabilitation medicine in 5 hospitals. Semi-structured interviews were conducted with 975 patients with adolescent idiopathic scoliosis from 5 hospitals according to the patients' self-reporting. In addition, patients were divided into 3 groups according to clinical outcome. Participant information included demographic and disease-related characteristics. Three adolescent idiopathic scoliosis groups were then compared with respect to the problems identified. Interviews were transcribed verbatim. Categories identified by qualitative analysis were subsequently mapped to the ICF-CY using established linking rules. In order to enrich these findings, we also translated the Scoliosis Research Society 22 Patient Questionnaire (SRS-22 PQ) into the language of the ICF-CY, based on ICF linking rules. A total of 1278 themes that linked to 54 ICF-CY categories from 18 chapters were identified. Twenty-two (41%) categories were identified as Body Functions, 7 (13%) as Body Structures, 15 (27%) as Activities and Participation, and 10 (19%) as Environmental Factors. Of the 54 categories, 45 (83%) were second-level, 5 (9%) were third-level, and 4 (7%) were fourth-level. Differences between the SRS-22 PQ results and our findings were observed for several ICF-CY categories. Patients with AIS reported activity limitations and participation restrictions combined with impaired body structures and functions. Environmental factors may act as a barrier to, or facilitator of, patient functioning in daily life. The ICF-CY provides a valuable framework for representing the complexity and

  3. Classification of radiological procedures

    International Nuclear Information System (INIS)

    1989-01-01

    A classification is presented for departments in Danish hospitals that use radiological procedures. The classification codes consist of 4 digits, of which the first 2 define the main groups: the first digit represents the procedure's topographical object and the second the technique. The last 2 digits identify individual procedures. (CLS)
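
    A minimal Python sketch of decoding such a 4-digit code is given below; the digit positions follow the scheme just described, while the lookup tables are invented placeholders rather than the actual Danish tables:

      TOPOGRAPHY = {"1": "head/neck", "2": "thorax"}      # hypothetical values
      TECHNIQUE = {"1": "plain radiography", "2": "CT"}   # hypothetical values

      def decode(code):
          assert len(code) == 4 and code.isdigit()
          return {
              "topography": TOPOGRAPHY.get(code[0], "unknown"),  # 1st digit
              "technique": TECHNIQUE.get(code[1], "unknown"),    # 2nd digit
              "procedure": code[2:],   # last 2 digits: individual procedure
          }

      print(decode("2104"))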

  4. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Byamukama, Abdul [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Jung, Haiyong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

    Radioactive materials are utilized in industries, agriculture, research, medical facilities and academic institutions for numerous purposes that are useful in the daily life of mankind. To manage radioactive waste effectively and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions and enforce compliance, and communicate effectively with stakeholders, among other things. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for distinguishing between the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those of Australia and were found to correlate, given the differences in site parameters and consumption habits of the residents in the two countries.

  5. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    International Nuclear Information System (INIS)

    Byamukama, Abdul; Jung, Haiyong

    2014-01-01

    Radioactive materials are utilized in industries, agriculture, research, medical facilities and academic institutions for numerous purposes that are useful in the daily life of mankind. To manage radioactive waste effectively and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions and enforce compliance, and communicate effectively with stakeholders, among other things. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for distinguishing between the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those of Australia and were found to correlate, given the differences in site parameters and consumption habits of the residents in the two countries.
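
    As a schematic illustration of the kind of activity-concentration criteria these studies derive, the Python sketch below applies the sum-of-fractions rule commonly used in waste classification; the nuclides and class boundary values are invented placeholders, not Ugandan regulatory limits or RESRAD output:

      EXEMPT_LIMITS = {"Cs-137": 0.1, "Co-60": 0.1, "Sr-90": 1.0}   # Bq/g, hypothetical
      LLW_LIMITS = {"Cs-137": 1e4, "Co-60": 1e5, "Sr-90": 1e3}      # Bq/g, hypothetical

      def sum_of_fractions(conc, limits):
          """Sum of C_i / L_i over nuclides; <= 1 means the class fits."""
          return sum(c / limits[n] for n, c in conc.items())

      def classify(conc):
          if sum_of_fractions(conc, EXEMPT_LIMITS) <= 1:
              return "exempt"
          if sum_of_fractions(conc, LLW_LIMITS) <= 1:
              return "low-level waste"
          return "higher class (e.g. intermediate-level waste)"

      print(classify({"Cs-137": 50.0, "Co-60": 20.0}))   # -> low-level waste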

  6. Parents' Assessments of Disability in Their Children Using World Health Organization International Classification of Functioning, Disability and Health, Child and Youth Version Joined Body Functions and Activity Codes Related to Everyday Life

    DEFF Research Database (Denmark)

    Illum, Niels Ove; Gradel, Kim Oren

    2017-01-01

    AIM: To help parents assess disability in their own children using World Health Organization (WHO) International Classification of Functioning, Disability and Health, Child and Youth Version (ICF-CY) code qualifier scoring and to assess the validity and reliability of the data sets obtained. METHOD: Parents of 162 children with spina bifida, spinal muscular atrophy, muscular disorders, cerebral palsy, visual impairment, hearing impairment, mental disability, or disability following brain tumours performed scoring for 26 body functions qualifiers (b codes) and activities and participation qualifiers (d codes). RESULTS: Code qualifier infit mean square (MNSQ) had a mean of 1.01 and 1.00; the mean corresponding outfit MNSQ was 1.05 and 1.01. The ICF-CY code τ thresholds and category measures were continuous when assessed and reassessed by parents. Participating children had a mean of 56 code scores (range: 26-130) before and a mean of 55.9 scores (range: 25-125) after repeat.

  7. International Classification of Primary Care-2 coding of primary care data at the general out-patients' clinic of General Hospital, Lagos, Nigeria.

    Science.gov (United States)

    Olagundoye, Olawunmi Abimbola; van Boven, Kees; van Weel, Chris

    2016-01-01

    Primary care serves as an integral part of the health systems of nations, especially on the African continent. It is the portal of entry for nearly all patients into the health care system. Paucity of accurate data for health statistics remains a challenge in most parts of Africa because of inadequate technical manpower and infrastructure. Inadequate quality of data systems contributes to inaccurate data. A simple-to-use classification system such as the International Classification of Primary Care (ICPC) may be a solution to this problem at the primary care level. To apply ICPC-2 for secondary coding of reasons for encounter (RfE), problems managed and processes of care in a Nigerian primary care setting, and furthermore to analyze the value of selected presented symptoms as predictors of the most common diagnoses encountered in the study setting. Content analysis of randomly selected patients' paper records for data collection at the end of clinic sessions conducted by family physicians at the general out-patients' clinics. Contents of clinical consultations were secondarily coded with the ICPC-2 and recorded into Excel spreadsheets with fields for sociodemographic data such as age, sex, occupation, religion, and ICPC elements of an encounter: RfE/complaints, diagnoses/problems, and interventions/processes of care. The 401 encounters considered in this study yielded 915 RfEs, 546 diagnoses, and 1221 processes, implying an average of 2.3 RfEs, 1.4 diagnoses, and 3.0 processes per encounter. The top 10 RfEs, diagnoses/common illnesses, and processes were determined. Through determination of the probability of the occurrence of certain diseases given a presenting RfE/complaint, the top five diagnoses that resulted from each of the top five RfEs were also obtained. The top five RfEs were: headache, fever, pain general/multiple sites, visual disturbance other, and abdominal pain/cramps general. The top five diagnoses were: Malaria, hypertension
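
    The probability computation described above (the most likely diagnoses given a presenting RfE) reduces to conditional frequencies over coded encounters. A minimal Python sketch with made-up encounter rows, using real ICPC-2 rubrics such as A03 (fever), N01 (headache) and A73 (malaria):

      from collections import Counter, defaultdict

      # (RfE code, diagnosis code) per encounter; the rows are illustrative only
      encounters = [("N01", "A73"), ("A03", "A73"), ("A03", "A73"),
                    ("N01", "K86"), ("A03", "R74"), ("N01", "A73")]

      by_rfe = defaultdict(Counter)
      for rfe, dx in encounters:
          by_rfe[rfe][dx] += 1

      for rfe, counts in by_rfe.items():
          total = sum(counts.values())
          top = [(dx, round(n / total, 2)) for dx, n in counts.most_common(3)]
          print(rfe, "->", top)   # e.g. N01 -> [('A73', 0.67), ('K86', 0.33)]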

  8. A comparative analysis of moral principles and behavioral norms in eight ethical codes relevant to health sciences librarianship, medical informatics, and the health professions.

    Science.gov (United States)

    Byrd, Gary D; Winkelstein, Peter

    2014-10-01

    Based on the authors' shared interest in the interprofessional challenges surrounding health information management, this study explores the degree to which librarians, informatics professionals, and core health professionals in medicine, nursing, and public health share common ethical behavior norms grounded in moral principles. Using the "Principlism" framework from a widely cited textbook of biomedical ethics, the authors analyze the statements in the ethical codes for associations of librarians (Medical Library Association [MLA], American Library Association, and Special Libraries Association), informatics professionals (American Medical Informatics Association [AMIA] and American Health Information Management Association), and core health professionals (American Medical Association, American Nurses Association, and American Public Health Association). This analysis focuses on whether and how the statements in these eight codes specify core moral norms (Autonomy, Beneficence, Non-Maleficence, and Justice), core behavioral norms (Veracity, Privacy, Confidentiality, and Fidelity), and other norms that are empirically derived from the code statements. These eight ethical codes share a large number of common behavioral norms based most frequently on the principle of Beneficence, then on Autonomy and Justice, but rarely on Non-Maleficence. The MLA and AMIA codes share the largest number of common behavioral norms, and these two associations also share many norms with the other six associations. The shared core of behavioral norms among these professions, all grounded in core moral principles, point to many opportunities for building effective interprofessional communication and collaboration regarding the development, management, and use of health information resources and technologies.

  9. A new coding system for metabolic disorders demonstrates gaps in the international disease classifications ICD-10 and SNOMED-CT, which can be barriers to genotype-phenotype data sharing.

    Science.gov (United States)

    Sollie, Annet; Sijmons, Rolf H; Lindhout, Dick; van der Ploeg, Ans T; Rubio Gozalbo, M Estela; Smit, G Peter A; Verheijen, Frans; Waterham, Hans R; van Weely, Sonja; Wijburg, Frits A; Wijburg, Rudolph; Visser, Gepke

    2013-07-01

    Data sharing is essential for a better understanding of genetic disorders. Good phenotype coding plays a key role in this process. Unfortunately, the two most widely used coding systems in medicine, ICD-10 and SNOMED-CT, lack information necessary for the detailed classification and annotation of rare and genetic disorders. This prevents the optimal registration of such patients in databases and thus data-sharing efforts. To improve care and to facilitate research for patients with metabolic disorders, we developed a new coding system for metabolic diseases with a dedicated group of clinical specialists. Next, we compared the resulting codes with those in ICD and SNOMED-CT. No matches were found in 76% of cases in ICD-10 and in 54% in SNOMED-CT. We conclude that there are sizable gaps in the SNOMED-CT and ICD coding systems for metabolic disorders. There may be similar gaps for other classes of rare and genetic disorders. We have demonstrated that expert groups can help in addressing such coding issues. Our coding system has been made available to the ICD and SNOMED-CT organizations as well as to the Orphanet and HPO organizations for further public application and updates will be published online (www.ddrmd.nl and www.cineas.org). © 2013 WILEY PERIODICALS, INC.

  10. An imprinted non-coding genomic cluster at 14q32 defines clinically relevant molecular subtypes in osteosarcoma across multiple independent datasets

    OpenAIRE

    Hill, Katherine E.; Kelly, Andrew D.; Kuijjer, Marieke L.; Barry, William; Rattani, Ahmed; Garbutt, Cassandra C.; Kissick, Haydn; Janeway, Katherine; Perez-Atayde, Antonio; Goldsmith, Jeffrey; Gebhardt, Mark C.; Arredouani, Mohamed S.; Cote, Greg; Hornicek, Francis; Choy, Edwin

    2017-01-01

    Background: A microRNA (miRNA) collection on the imprinted 14q32 MEG3 region has been associated with outcome in osteosarcoma. We assessed the clinical utility of this miRNA set and their association with methylation status. Methods: We integrated coding and non-coding RNA data from three independent annotated clinical osteosarcoma cohorts (n = 65, n = 27, and n = 25) and miRNA and methylation data from one in vitro (19 cell lines) and one clinical (NCI Therapeutically Applicable Research to ...

  11. Application of Relevance Maps in Multidimensional Classification of Coal Types / Zastosowanie Map Odniesienia W Wielowymiarowej Klasyfikacji Typów Węgla

    Science.gov (United States)

    Niedoba, Tomasz

    2015-03-01

    Multidimensional data visualization methods are a modern tool for classifying analyzed objects. In the case of grained materials, e.g. coal, many characteristics have an influence on the material quality. In the case of coal, apart from the most obvious features like particle size, particle density or ash content, there are many others which cause significant differences between the considered types of material. The paper presents the possibility of applying visualization techniques for coal type identification and determination of significant differences between various types of coal. The author decided to apply relevance maps to achieve this purpose. Three types of coal - 31, 34.2 and 35 (according to the Polish classification of coal types) - were investigated; they were initially screened on sieves and then divided into density fractions, and each size-density fraction was then chemically analyzed to obtain further characteristics. It was found that the applied methodology allows certain coal types to be identified efficiently and can be used as a qualitative criterion for grained materials. However, it was impossible to achieve such identification when comparing all three types of coal together. The presented methodology is a new way of analyzing data concerning widely understood mineral processing. Mineral raw materials that undergo beneficiation for their better utilisation can be characterized by many indicators describing those of their features that are of interest to the mineral processing engineer. The basic features are particle size and particle density, which determine the course of the separation of particle collections (feeds) and the effects of such separation. Separation is usually carried out in order to obtain products with differentiated mean values of a selected feature, which is usually characterized by the content of a specific component of the raw material, determined by chemical analysis. Such an approach to a mineral raw material leads to treating it as

  12. Searching bioremediation patents through Cooperative Patent Classification (CPC).

    Science.gov (United States)

    Prasad, Rajendra

    2016-03-01

    Patent classification systems have traditionally evolved independently at each patent jurisdiction to classify the patents handled by their examiners, so that previous patents can be searched while dealing with new patent applications. As the patent databases maintained by these offices went online - for free public access as well as for global prior-art searching by examiners - the need arose for a common platform and a uniform structure of patent databases. The diversity of the different classifications, however, posed problems for integrating and searching relevant patents across patent jurisdictions. To address this problem of comparability of data from different sources and of searching patents, WIPO in the recent past developed what is known as the International Patent Classification (IPC) system, which most countries readily adopted, coding their patents with IPC codes along with their own codes. The Cooperative Patent Classification (CPC) is the latest patent classification system, based on the IPC/European Classification (ECLA) system and developed by the European Patent Office (EPO) and the United States Patent and Trademark Office (USPTO), and it is likely to become a global standard. This paper discusses this new classification system with reference to patents on bioremediation.
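
    For readers searching CPC-coded collections, a CPC symbol such as C02F 3/34 (biological treatment of water, an area relevant to bioremediation) decomposes into section, class, subclass and main group/subgroup. A small Python sketch follows, with the regular expression as an approximation of the official symbol format:

      import re

      CPC_PATTERN = re.compile(
          r"^(?P<section>[A-HY])"      # section letter: A-H, plus Y
          r"(?P<cls>\d{2})"            # class: two digits
          r"(?P<subclass>[A-Z])\s*"    # subclass: one letter
          r"(?P<group>\d{1,4})/(?P<subgroup>\d{2,6})$"   # main group/subgroup
      )

      def parse_cpc(symbol):
          m = CPC_PATTERN.match(symbol.strip())
          if m is None:
              raise ValueError("not a CPC symbol: " + symbol)
          return m.groupdict()

      print(parse_cpc("C02F 3/34"))
      # {'section': 'C', 'cls': '02', 'subclass': 'F', 'group': '3', 'subgroup': '34'}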

  13. Munitions Classification Library

    Science.gov (United States)

    2016-04-04

    members of the community to make their own additions to any, or all, of the classification libraries. The next phase entailed data collection over less... MUNITIONS CLASSIFICATION LIBRARY: Final Report, August 2014 - August 2015. Craig Murray, Parsons; Thomas H. Bell, Leidos.

  14. Parents' Assessments of Disability in Their Children Using World Health Organization International Classification of Functioning, Disability and Health, Child and Youth Version Joined Body Functions and Activity Codes Related to Everyday Life.

    Science.gov (United States)

    Illum, Niels Ove; Gradel, Kim Oren

    2017-01-01

    To help parents assess disability in their own children using World Health Organization (WHO) International Classification of Functioning, Disability and Health, Child and Youth Version (ICF-CY) code qualifier scoring and to assess the validity and reliability of the data sets obtained. Parents of 162 children with spina bifida, spinal muscular atrophy, muscular disorders, cerebral palsy, visual impairment, hearing impairment, mental disability, or disability following brain tumours performed scoring for 26 body functions qualifiers (b codes) and activities and participation qualifiers (d codes). Scoring was repeated after 6 months. Psychometric and Rasch data analysis was undertaken. The initial and repeated data had Cronbach α of 0.96 and 0.97, respectively. Inter-code correlation was 0.54 (range: 0.23-0.91) and 0.76 (range: 0.20-0.92). The corrected code-total correlations were 0.72 (range: 0.49-0.83) and 0.75 (range: 0.50-0.87). When repeated, the ICF-CY code qualifier scoring showed a correlation R of 0.90. Rasch analysis of the selected ICF-CY code data demonstrated a mean measure of 0.00 and 0.00, respectively. Code qualifier infit mean square (MNSQ) had a mean of 1.01 and 1.00. The mean corresponding outfit MNSQ was 1.05 and 1.01. The ICF-CY code τ thresholds and category measures were continuous when assessed and reassessed by parents. Participating children had a mean of 56 code scores (range: 26-130) before and a mean of 55.9 scores (range: 25-125) after repeat. Corresponding measures were -1.10 (range: -5.31 to 5.25) and -1.11 (range: -5.42 to 5.36), respectively. Based on measures obtained at the 2 occasions, the correlation coefficient R was 0.84. The child code map showed coherence of ICF-CY codes at each level. There was continuity in covering the range across disabilities. And, first and foremost, the distribution of codes reflected a true continuity in disability, with codes for motor functions activated first, then codes for cognitive functions

  15. Clinical relevance and effect of surgical wound classification in appendicitis: Retrospective evaluation of wound classification discrepancies between surgeons, Swissnoso-trained infection control nurse, and histology as well as surgical site infection rates by wound class.

    Science.gov (United States)

    Wang-Chan, Anastasija; Gingert, Christian; Angst, Eliane; Hetzer, Franc Heinrich

    2017-07-01

    Surgical wound classification (SWC) is used for risk stratification of surgical site infection (SSI) and serves as the basis for measuring quality of care. The objective was to examine the accuracy and reliability of SWC. This study was designed to evaluate the discrepancies in SWC as assessed by three groups: surgeons, an infection control nurse, and histopathologic evaluation. The secondary aim was to compare the risk-stratified SSI rates using the different SWC methods for 30 d postoperatively. An analysis was performed of the appendectomies from January 2013 to June 2014 in the Cantonal Hospital of Schaffhausen. SWC was assigned by the operating surgeon at the end of the procedure and retrospectively reviewed by a Swissnoso-trained infection control nurse after reading the operative and pathology report. The level of agreement among the three different SWC assessment groups was determined using the kappa statistic. SSI rates were analyzed using a chi-square test. In the 246 evaluated cases, the kappa scores for interrater reliability among the SWC assessments across the three groups ranged from 0.05 to 0.2, signifying only slight agreement between the groups. SSIs were more frequently associated with the SWC assigned by the trained infection control nurse than with the surgeon-assigned SWC. Our study demonstrated considerable discordance in the SWC assessments performed by the three groups. Unfortunately, the currently practiced SWC system suffers from ambiguity in its definitions and in how those definitions are to be implemented. This lack of reliability is problematic and may lead to inappropriate comparisons within and between hospitals and surgeons. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
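
    The kappa statistic used above measures agreement beyond chance. A minimal Python sketch with invented wound-class ratings for two of the three assessment groups:

      from sklearn.metrics import cohen_kappa_score

      # wound classes I-IV for 10 appendectomies, one assignment per rater
      surgeon = ["II", "II", "III", "II", "IV", "II", "III", "II", "II", "III"]
      nurse   = ["III", "II", "IV", "III", "IV", "III", "III", "II", "III", "IV"]

      kappa = cohen_kappa_score(surgeon, nurse)
      print(round(kappa, 2))   # ~0.14 here: "slight" agreement, as in the study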

  16. INF Code related matters. Joint IAEA/IMO literature survey on potential consequences of severe maritime accidents involving the transport of radioactive material. 2 volumes. Vol. I - Report and publication titles. Vol. II - Relevant abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-10

    This literature survey was undertaken jointly by the International Maritime Organization (IMO) and the International Atomic Energy Agency (IAEA) as a step in addressing the subject of environmental impact of accidents involving materials subject to the IMO's Code for the Safe Carriage of Irradiated Nuclear Fuel, Plutonium and High-Level Radioactive Wastes in Flasks on Board Ships, also known as the INF Code. The results of the survey are provided in two volumes: the first one containing the description of the search and search results with the list of generated publication titles, and the second volume containing the abstracts of those publications deemed relevant for the purposes of the literature survey. Literature published between 1980 and mid-1999 was reviewed by two independent consultants who generated publication titles by performing searches of appropriate databases, and selected the abstracts of relevant publications for inclusion in this survey. The IAEA operates INIS, the world's leading computerised bibliographical information system on the peaceful uses of nuclear energy. The acronym INIS stands for International Nuclear Information System. INIS Members are responsible for determining the relevant nuclear literature produced within their borders or organizational confines, and then preparing the associated input in accordance with INIS rules. INIS records are included in other major databases such as the Energy, Science and Technology database of the DIALOG service. Because it is the INIS Members, rather than the IAEA Secretariat, who are responsible for its contents, it was considered appropriate that INIS be the primary source of information for this literature review. Selected unpublished reports were also reviewed, e.g. Draft Proceedings of the Special Consultative Meeting of Entities involved in the maritime transport of materials covered by the INF Code (SCM 5), March 1996. Many of the formal papers at SCM 5 were included in the literature

  18. Nuclear code abstracts (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-02-01

    Nuclear Code Abstracts is compiled by the Nuclear Code Committee to exchange information on nuclear code developments among the members of the committee. Enlarging the collection, the present edition includes nuclear code abstracts obtained in 1975 through liaison officers of the organizations in Japan participating in the Nuclear Energy Agency's Computer Program Library at Ispra, Italy. The classification of nuclear codes and the format of the code abstracts are the same as those used in the library. (auth.)

  19. KWIC Index of nuclear codes (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-01-01

    This is a KWIC index for the 254 nuclear codes in Nuclear Code Abstracts (1975 edition). The classification of nuclear codes and the format of the index are the same as those used in the Computer Programme Library at Ispra, Italy. (auth.)

  20. Classification, disease, and diagnosis.

    Science.gov (United States)

    Jutel, Annemarie

    2011-01-01

    Classification shapes medicine and guides its practice. Understanding classification must be part of the quest to better understand the social context and implications of diagnosis. Classifications are part of the human work that provides a foundation for the recognition and study of illness: deciding how the vast expanse of nature can be partitioned into meaningful chunks, stabilizing and structuring what is otherwise disordered. This article explores the aims of classification, their embodiment in medical diagnosis, and the historical traditions of medical classification. It provides a brief overview of the aims and principles of classification and their relevance to contemporary medicine. It also demonstrates how classifications operate as social framing devices that enable and disable communication, assert and refute authority, and are important items for sociological study.

  1. Revision, uptake and coding issues related to the open access Orchard Sports Injury Classification System (OSICS) versions 8, 9 and 10.1

    OpenAIRE

    Orchard, John; Rae, Katherine; Brooks, John; Hägglund, Martin; Til, Lluis; Wales, David; Wood, Tim

    2010-01-01

    John Orchard (1), Katherine Rae (1), John Brooks (2), Martin Hägglund (3), Lluis Til (4), David Wales (5), Tim Wood (6). 1 Sports Medicine at Sydney University, Sydney, NSW, Australia; 2 Rugby Football Union, Twickenham, England, UK; 3 Department of Medical and Health Sciences, Linköping University, Linköping, Sweden; 4 FC Barcelona, Barcelona, Catalonia, Spain; 5 Arsenal FC, Highbury, England, UK; 6 Tennis Australia, Melbourne, Vic, Australia. Abstract: The Orchard Sports Injury Classification Sys...

  2. Classification and modelling of functional outputs of computation codes. Application to accidental thermal-hydraulic calculations in pressurized water reactor (PWR)

    International Nuclear Information System (INIS)

    Auder, Benjamin

    2011-01-01

    This research thesis was carried out within the framework of a project on nuclear reactor vessel lifetime. It deals with the use of numerical codes whose input parameters are described by estimated probability densities, in order to calculate probability margins at the output level; more precisely, it deals with codes whose responses are one-dimensional functions. The author studies the numerical simulation of a pressurized thermal shock on a nuclear reactor vessel, one of the possible accident types. The study of vessel integrity relies on a thermal-hydraulic analysis and on a mechanical analysis, and algorithms are developed and proposed for each of them. Input-output data are classified using a clustering technique and a graph-based representation. A method for output dimension reduction is proposed, and a regression is applied between the inputs and the reduced representations. Applications are discussed for modelling and sensitivity analysis of the CATHARE code (a code used at the CEA for thermal-hydraulic analysis)
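
    A rough sketch of the last two steps described above (output dimension reduction followed by input-to-representation regression), using synthetic stand-ins for the code's functional outputs; the clustering step and the actual CATHARE data are omitted, and all shapes are assumed for illustration:

        # Reduce 1-D functional outputs with PCA, then regress inputs -> reduced space.
        # All data are synthetic; this is not the thesis' algorithm, only its outline.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        X = rng.uniform(size=(200, 4))                   # 200 code runs, 4 input parameters
        t = np.linspace(0, 1, 100)
        Y = np.outer(X @ np.array([1.0, -0.5, 0.3, 0.8]),
                     np.sin(2 * np.pi * t))              # functional outputs (transients)

        pca = PCA(n_components=3).fit(Y)                 # output dimension reduction
        Z = pca.transform(Y)                             # reduced representations
        model = LinearRegression().fit(X, Z)             # regression inputs -> reduced outputs

        Y_pred = pca.inverse_transform(model.predict(X)) # reconstruct predicted curves
        print("reconstruction RMSE:", np.sqrt(np.mean((Y - Y_pred) ** 2)))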

  3. A Tale of Two Disability Coding Systems: The Veterans Administration Schedule for Rating Disabilities (VASRD) vs. Diagnostic Coding Using the International Classification of Diseases, 9th Edition, Clinical Modification (ICD-9-CM)

    Science.gov (United States)

    2008-01-01

    ...of ear and other sense organ disability cases with a disability-related CRO hospital record (N=234). Sickle-cell anemia was the most common... [flattened table fragment: Hemic and Lymphatic Systems VASRD Group, 1984-1999; columns: ICD-9-CM code (number and title), frequency, percent of total; first row: 282.6 Sickle-Cell Anemia, 14...] ...heterogeneous, and ICD-9-CM conditions linked to this VASRD include conditions that may be experienced by men and women. For example, abdominal pain was...

  4. Classifying Classifications

    DEFF Research Database (Denmark)

    Debus, Michael S.

    2017-01-01

    This paper critically analyzes seventeen game classifications. The classifications were chosen on the basis of diversity, ranging from pre-digital classification (e.g. Murray 1952), over game studies classifications (e.g. Elverdam & Aarseth 2007), to classifications of drinking games (e.g. LaBrie et al. 2013). The analysis aims at three goals: the classifications' internal consistency, the abstraction of classification criteria and the identification of differences in classification across fields and/or time. Especially the abstraction of classification criteria can be used in future endeavors into the topic of game classifications.

  5. Issues in Developing a Surveillance Case Definition for Nonfatal Suicide Attempt and Intentional Self-harm Using International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) Coded Data.

    Science.gov (United States)

    Hedegaard, Holly; Schoenbaum, Michael; Claassen, Cynthia; Crosby, Alex; Holland, Kristin; Proescholdbell, Scott

    2018-02-01

    Suicide and intentional self-harm are among the leading causes of death in the United States. To study this public health issue, epidemiologists and researchers often analyze data coded using the International Classification of Diseases (ICD). Prior to October 1, 2015, health care organizations and providers used the clinical modification of the Ninth Revision of ICD (ICD-9-CM) to report medical information in electronic claims data. The transition in October 2015 to use of the clinical modification of the Tenth Revision of ICD (ICD-10-CM) resulted in the need to update methods and selection criteria previously developed for ICD-9-CM coded data. This report provides guidance on the use of ICD-10-CM codes to identify cases of nonfatal suicide attempts and intentional self-harm in ICD-10-CM coded data sets. ICD-10-CM codes for nonfatal suicide attempts and intentional self-harm include: X71-X83, intentional self-harm due to drowning and submersion, firearms, explosive or thermal material, sharp or blunt objects, jumping from a high place, jumping or lying in front of a moving object, crashing of motor vehicle, and other specified means; T36-T50 with a 6th character of 2 (except for T36.9, T37.9, T39.9, T41.4, T42.7, T43.9, T45.9, T47.9, and T49.9, which are included if the 5th character is 2), intentional self-harm due to drug poisoning (overdose); T51-T65 with a 6th character of 2 (except for T51.9, T52.9, T53.9, T54.9, T56.9, T57.9, T58.0, T58.1, T58.9, T59.9, T60.9, T61.0, T61.1, T61.9, T62.9, T63.9, T64.0, T64.8, and T65.9, which are included if the 5th character is 2), intentional self-harm due to toxic effects of nonmedicinal substances; T71 with a 6th character of 2, intentional self-harm due to asphyxiation, suffocation, strangulation; and T14.91, Suicide attempt. Issues to consider when selecting records for nonfatal suicide attempts and intentional self-harm from ICD-10-CM coded administrative data sets are also discussed.
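
    As a rough illustration of how such a case definition might be applied to coded claims data, here is a simplified matcher for the main ranges above. It deliberately ignores the exception lists for the .9 subcategories, so it is a sketch of the selection logic, not the full surveillance definition:

        # Simplified flag for intentional self-harm ICD-10-CM codes (sketch only;
        # the 5th-character exception rules quoted in the report are omitted).
        def is_self_harm(code: str) -> bool:
            c = code.replace(".", "").upper()
            if c == "T1491":                                   # T14.91, suicide attempt
                return True
            if c[0] == "X" and c[1:3].isdigit() and 71 <= int(c[1:3]) <= 83:
                return True                                    # X71-X83
            if c[0] == "T" and c[1:3].isdigit():
                n = int(c[1:3])
                if (36 <= n <= 65 or n == 71) and len(c) >= 6:
                    return c[5] == "2"                         # 6th character of 2
            return False

        print([is_self_harm(c) for c in ["X78.9", "T39.012", "T14.91", "W19"]])
        # -> [True, True, True, False]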

  6. Information gathering for CLP classification

    Directory of Open Access Journals (Sweden)

    Ida Marcello

    2011-01-01

    Regulation 1272/2008 includes provisions for two types of classification: harmonised classification and self-classification. The harmonised classification of substances is decided at Community level, and a list of harmonised classifications is included in Annex VI of the classification, labelling and packaging (CLP) Regulation. If a chemical substance is not included in the harmonised classification list, it must be self-classified on the basis of available information, according to the requirements of Annex I of the CLP Regulation. CLP stipulates that harmonised classification be performed for carcinogenic, mutagenic or toxic-to-reproduction substances (CMR substances), for respiratory sensitisers (category 1), and for other hazard classes on a case-by-case basis. The first step of classification is the gathering of available and relevant information. This paper presents the procedure for gathering information and obtaining data; data quality is also discussed.

  7. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and water-moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. The flow distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flow-rate conditions, constant or varying in time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety-rod behavior, is a one-dimensional, multi-channel code; its complement, FLID, is a one-channel, two-dimensional code. (authors)

  8. Remote-Handled Transuranic Content Codes

    International Nuclear Information System (INIS)

    2001-01-01

    The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document represents the development of a uniform content code system for RH-TRU waste to be transported in the 72-B cask. It will be used to convert existing waste form numbers, content codes, and site-specific identification codes into a system that is uniform across the U.S. Department of Energy (DOE) sites. The existing waste codes at the sites can be grouped under uniform content codes without any loss of waste characterization information. The RH-TRUCON document provides an all-encompassing description for each content code and compiles this information for all DOE sites. Compliance with waste generation, processing, and certification procedures at the sites (outlined in this document for each content code) ensures that prohibited waste forms are not present in the waste. The content code gives an overall description of the RH-TRU waste material in terms of processes and packaging, as well as the generation location. This helps to provide cradle-to-grave traceability of the waste material so that the various actions required to assess its qualification as payload for the 72-B cask can be performed. The content codes also impose restrictions and requirements on the manner in which a payload can be assembled. The RH-TRU Waste Authorized Methods for Payload Control (RH-TRAMPAC), Appendix 1.3.7 of the 72-B Cask Safety Analysis Report (SAR), describes the current governing procedures applicable for the qualification of waste as payload for the 72-B cask. The logic for this classification is presented in the 72-B Cask SAR. Together, these documents (RH-TRUCON, RH-TRAMPAC, and relevant sections of the 72-B Cask SAR) present the foundation and justification for classifying RH-TRU waste into content codes. Only content codes described in this document can be considered for transport in the 72-B cask. Revisions to this document will be made as additional waste qualifies for transport. Each content code uniquely

  9. nRC: non-coding RNA Classifier based on structural features.

    Science.gov (United States)

    Fiannaca, Antonino; La Rosa, Massimo; La Paglia, Laura; Rizzo, Riccardo; Urso, Alfonso

    2017-01-01

    Non-coding RNAs (ncRNAs) are small non-coding sequences involved in gene expression regulation of many biological processes and diseases. The recent discovery of a large set of different ncRNAs with biologically relevant roles has opened the way to developing methods able to discriminate between the different ncRNA classes. Moreover, the lack of knowledge about the complete mechanisms of regulative processes, together with the development of high-throughput technologies, has made bioinformatics tools necessary for providing biologists and clinicians with a deeper comprehension of the functional roles of ncRNAs. In this work, we introduce a new ncRNA classification tool, nRC (non-coding RNA Classifier). Our approach is based on feature extraction from the ncRNA secondary structure together with a supervised classification algorithm implementing a deep learning architecture based on convolutional neural networks. We tested our approach on the classification of 13 different ncRNA classes and evaluated it using the most common statistical measures; in particular, we reach an accuracy and sensitivity of about 74%. The proposed method outperforms other similar classification methods based on secondary structure features and machine learning algorithms, including the RNAcon tool that, to date, is the reference classifier. The nRC tool is freely available as a docker image at https://hub.docker.com/r/tblab/nrc/. The source code of the nRC tool is also available at https://github.com/IcarPA-TBlab/nrc.
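
    To make the architecture concrete, below is a toy 1-D convolutional network of the general kind nRC describes. It is not the authors' network: the layer sizes and the 100-element structural feature vectors are assumptions, and the feature-extraction step from secondary structure is omitted entirely:

        # Toy 1-D CNN over (assumed) structural feature vectors, 13 ncRNA classes.
        import torch
        import torch.nn as nn

        class TinyNcRnaCnn(nn.Module):
            def __init__(self, n_classes: int = 13):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
                    nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
                )
                self.classifier = nn.Linear(32 * 25, n_classes)  # assumes 100-long inputs

            def forward(self, x):
                h = self.features(x)            # (batch, 32, 25)
                return self.classifier(h.flatten(1))

        model = TinyNcRnaCnn()
        x = torch.randn(8, 1, 100)              # batch of 8 synthetic feature vectors
        print(model(x).shape)                   # -> torch.Size([8, 13])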

  10. Supernova Photometric Lightcurve Classification

    Science.gov (United States)

    Zaidi, Tayeb; Narayan, Gautham

    2016-01-01

    This is a preliminary report on photometric supernova classification. We first explore the properties of supernova light curves, and attempt to restructure the unevenly sampled and sparse data from assorted datasets to allow for processing and classification. The data were primarily drawn from the Dark Energy Survey (DES) simulated data, created for the Supernova Photometric Classification Challenge. This poster shows a method for producing a non-parametric representation of the light curve data, and applying a Random Forest classifier algorithm to distinguish between supernova types. We examine the impact of Principal Component Analysis to reduce the dimensionality of the dataset for future classification work. The classification code will be used in a stage of the ANTARES pipeline, created for use on the Large Synoptic Survey Telescope alert data and other wide-field surveys. The final figure-of-merit for the DES data in the r band was 60% for binary classification (Type I vs II). Zaidi was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation Research Experiences for Undergraduates Program (AST-1262829).
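
    The combination described (PCA for dimensionality reduction feeding a Random Forest) is straightforward to sketch with scikit-learn; the light-curve features and labels below are synthetic placeholders, not DES data:

        # PCA + Random Forest for binary classification, on synthetic stand-in data.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(42)
        X = rng.normal(size=(1000, 60))              # 1000 light curves, 60 sampled fluxes
        y = (X[:, :5].sum(axis=1) > 0).astype(int)   # toy Type I vs Type II labels

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = make_pipeline(PCA(n_components=10), RandomForestClassifier(random_state=0))
        clf.fit(X_tr, y_tr)
        print(f"holdout accuracy: {clf.score(X_te, y_te):.2f}")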

  11. An Analysis of the Relationship between IFAC Code of Ethics and CPI

    Directory of Open Access Journals (Sweden)

    Ayşe İrem Keskin

    2015-11-01

    Abstract: A code of ethics has become a significant concept in the business world, which is why professional organizations have developed their own codes of ethics over time. In this study, a compatibility classification of the accounting code of ethics of IFAC (the International Federation of Accountants) is first carried out on the basis of the action plans assessing the levels of usage by the 175 IFAC national accounting organizations. The classification shows that 60.6% of the member organizations apply the IFAC code in general, while the remaining 39.4% do not apply the code at all. With this classification, the hypothesis that "the national accounting organizations in highly corrupt countries would be less likely to adopt the IFAC ethics code than those in very clean countries" is tested using the Corruption Perception Index (CPI) data. The findings support this hypothesis.

  12. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrating software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
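
    For readers unfamiliar with the kind of annotation involved, here is a generic toy example (unrelated to AUTOBAYES) of a loop invariant. A certification tool would discharge such a property as a proof obligation; the sketch below merely checks it at run time:

        # Toy loop invariant, stated as a runtime assertion for illustration.
        # A verification condition generator would turn it into a proof obligation.
        def sum_below(n: int) -> int:
            total, i = 0, 0
            while i < n:
                assert total == i * (i - 1) // 2, "loop invariant"
                total += i
                i += 1
            return total  # equals n * (n - 1) // 2

        print(sum_below(10))  # -> 45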

  13. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding the various numeric and alpha-numeric codes for accurately billing dental and medically related services to private-pay or third-party insurance carriers. In the United States, Current Dental Terminology (CDT) codes are most commonly used by dentists to submit claims, whereas Current Procedural Terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD-9-CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD-9-CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  14. Computer codes used in particle accelerator design: First edition

    International Nuclear Information System (INIS)

    1987-01-01

    This paper contains a listing of more than 150 programs that have been used in the design and analysis of accelerators. Given for each citation are the person to contact, the classification of the computer code, publications describing the code, the computer and language it runs on, and a short description of the code. Codes are indexed by subject, person to contact, and code acronym

  15. FACET CLASSIFICATIONS OF E-LEARNING TOOLS

    Directory of Open Access Journals (Sweden)

    Olena Yu. Balalaieva

    2013-12-01

    The article deals with the classification of e-learning tools based on the facet method, which separates a parallel set of objects into independent classification groupings; no rigid classification structure or pre-built finite groups are assumed, and classification groups are formed by combining values taken from the relevant facets. An attempt to systematize the existing classifications of e-learning tools from the standpoint of classification theory is made for the first time. Modern Ukrainian and foreign facet classifications of e-learning tools are described, and their positive and negative features compared to classifications based on a hierarchical method are analyzed. The author's original facet classification of e-learning tools is proposed.
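
    The facet principle is easy to demonstrate in a few lines: classes are not fixed in advance but are generated as combinations of values from independent facets. The facets and values below are invented examples, not the author's classification:

        # Facet classification: every group is one combination of facet values.
        from itertools import product

        facets = {
            "function": ["authoring", "assessment", "communication"],
            "interactivity": ["static", "interactive"],
            "license": ["open", "proprietary"],
        }

        for group in product(*facets.values()):
            print(dict(zip(facets, group)))   # 3 * 2 * 2 = 12 classification groups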

  16. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of a coding partition. Such a notion generalizes that of a UD code and, for codes that are not UD, allows one to recover "unique decipherability" at the level of the classes of the partition. By taking into account the natural order between partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads us to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case where the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case where the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover, we conjecture that the canonical partition satisfies such a hypothesis. Finally, we also consider some relationships between coding partitions and varieties of codes.
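
    As background for the UD property the paper starts from, the classical Sardinas-Patterson algorithm decides whether a finite code is uniquely decipherable. The sketch below is a standard textbook formulation, not code from the paper, and the example codes are arbitrary:

        # Sardinas-Patterson test for unique decipherability of a finite code.
        def is_uniquely_decipherable(code: set) -> bool:
            def dangling(a, b):
                # suffixes left when a word of one set is a proper prefix of the other's
                out = set()
                for u in a:
                    for v in b:
                        if u != v:
                            if v.startswith(u):
                                out.add(v[len(u):])
                            if u.startswith(v):
                                out.add(u[len(v):])
                return out

            seen = set()
            current = dangling(code, code)
            while current:
                if any(w in code for w in current):
                    return False             # a dangling suffix is itself a codeword
                frozen = frozenset(current)
                if frozen in seen:
                    return True              # suffix sets repeat: no violation can appear
                seen.add(frozen)
                current = dangling(current, code)
            return True                      # prefix code: no dangling suffixes at all

        print(is_uniquely_decipherable({"0", "01", "11"}))  # True (UD, not prefix-free)
        print(is_uniquely_decipherable({"0", "01", "10"}))  # False ("010" parses two ways)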

  17. Combined genetic and splicing analysis of BRCA1 c.[594-2A>C; 641A>G] highlights the relevance of naturally occurring in-frame transcripts for developing disease gene variant classification algorithms

    OpenAIRE

    de la Hoya, Miguel; Soukarieh, Omar; López-Perolio, Irene; Vega, Ana; Walker, Logan C.; van Ierland, Yvette; Baralle, Diana; Santamariña, Marta; Lattimore, Vanessa; Wijnen, Juul; Whiley, Philip; Blanco, Ana; Raponi, Michela; Hauke, Jan; Wappenschmidt, Barbara

    2016-01-01

    A recent analysis using family history weighting and co-observation classification modeling indicated that BRCA1 c.594-2A > C (IVS9-2A > C), previously described to cause exon 10 skipping (a truncating alteration), displays characteristics inconsistent with those of a high risk pathogenic BRCA1 variant. We used large-scale genetic and clinical resources from the ENIGMA, CIMBA and BCAC consortia to assess pathogenicity of c.594-2A > C. The combined odds for causality considering case-control, ...

  18. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed.

  19. Land Cover - Minnesota Land Cover Classification System

    Data.gov (United States)

    Minnesota Department of Natural Resources — Land cover data set based on the Minnesota Land Cover Classification System (MLCCS) coding scheme. This data was produced using a combination of aerial photograph...

  20. Diagnosis of periodontal diseases using different classification ...

    African Journals Online (AJOL)

    The codes created for risk factors, periodontal data, and radiographic bone loss were formed as a matrix structure and regarded as inputs for the classification unit. A total of six periodontal conditions were the outputs of the classification unit. The accuracy of the suggested methods was compared according to their ...

  1. Relevant test set using feature selection algorithm for early detection ...

    African Journals Online (AJOL)

    The objective of feature selection is to find the most relevant features for classification. The dimensionality of the data is thereby reduced, which may improve classification accuracy. This paper proposes a minimum set of relevant questions that can be used for early detection of dyslexia. In this research, we ...

  2. Supervised Transfer Sparse Coding

    KAUST Repository

    Al-Shedivat, Maruan

    2014-07-27

    A combination of the sparse coding and transfer learning techniques was shown to be accurate and robust in classification tasks where training and testing objects have a shared feature space but are sampled from different underlying distributions, i.e., belong to different domains. The key assumption in such case is that in spite of the domain disparity, samples from different domains share some common hidden factors. Previous methods often assumed that all the objects in the target domain are unlabeled, and thus the training set solely comprised objects from the source domain. However, in real world applications, the target domain often has some labeled objects, or one can always manually label a small number of them. In this paper, we explore such possibility and show how a small number of labeled data in the target domain can significantly leverage classification accuracy of the state-of-the-art transfer sparse coding methods. We further propose a unified framework named supervised transfer sparse coding (STSC) which simultaneously optimizes sparse representation, domain transfer and classification. Experimental results on three applications demonstrate that a little manual labeling and then learning the model in a supervised fashion can significantly improve classification accuracy.
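
    STSC itself optimizes the three terms jointly; as a much simpler point of reference, the two ingredients can be sketched sequentially with scikit-learn: learn a sparse dictionary, encode the few labeled target samples, and train a classifier on the codes. All data below are synthetic, and this is not the STSC algorithm:

        # Sequential stand-in for the sparse-coding + supervised-classification idea.
        import numpy as np
        from sklearn.decomposition import DictionaryLearning
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(7)
        X_source = rng.normal(size=(100, 20))             # many unlabeled source samples
        X_target = rng.normal(loc=0.5, size=(10, 20))     # few labeled target samples
        y_target = np.array([0, 1] * 5)                   # toy labels

        coder = DictionaryLearning(n_components=15, random_state=0).fit(X_source)
        Z_target = coder.transform(X_target)              # sparse codes of target data
        clf = LogisticRegression().fit(Z_target, y_target)
        print(clf.predict(Z_target))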

  3. Tissue Classification

    DEFF Research Database (Denmark)

    Van Leemput, Koen; Puonti, Oula

    2015-01-01

    Computational methods for automatically segmenting magnetic resonance images of the brain have seen tremendous advances in recent years. So-called tissue classification techniques, aimed at extracting the three main brain tissue classes (white matter, gray matter, and cerebrospinal fluid), are now well established. In their simplest form, these methods classify voxels independently based on their intensity alone, although much more sophisticated models are typically used in practice. This article aims to give an overview of often-used computational techniques for brain tissue classification ...

  4. The Importance of Classification to Business Model Research

    OpenAIRE

    Susan Lambert

    2015-01-01

    Purpose: To bring to the fore the scientific significance of classification and its role in business model theory building. To propose a method by which existing classifications of business models can be analyzed and new ones developed. Design/Methodology/Approach: A review of the scholarly literature relevant to classifications of business models is presented along with a brief overview of classification theory applicable to business model research. Existing business model classification...

  5. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  6. Classification of hand eczema

    DEFF Research Database (Denmark)

    Agner, T; Aalto-Korte, K; Andersen, K E

    2015-01-01

    BACKGROUND: Classification of hand eczema (HE) is mandatory in epidemiological and clinical studies, and also important in clinical work. OBJECTIVES: The aim was to test a recently proposed classification system of HE in clinical practice in a prospective multicentre study. METHODS: Patients were recruited from nine different tertiary referral centres. All patients underwent examination by specialists in dermatology and were checked using relevant allergy testing. Patients were classified into one of the six diagnostic subgroups of HE: allergic contact dermatitis, irritant contact dermatitis, atopic ... The system investigated in the present study was useful, being able to give an appropriate main diagnosis for 89% of HE patients, and for another 7% when using two main diagnoses. The fact that more than half of the patients had one or more additional diagnoses illustrates that HE is a multifactorial disease.

  7. Transporter Classification Database (TCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Transporter Classification Database details a comprehensive classification system for membrane transport proteins known as the Transporter Classification (TC)...

  8. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed

    2018-04-08

    Convolutional Sparse Coding (CSC) is a well-established image representation model especially suited for image restoration tasks. In this work, we extend the applicability of this model by proposing a supervised approach to convolutional sparse coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements to be discriminative. Experimental results show that using supervised convolutional learning results in two key advantages. First, we learn more semantically relevant filters in the dictionary and second, we achieve improved image reconstruction on unseen data.

  9. The structure of dual Grassmann codes

    DEFF Research Database (Denmark)

    Beelen, Peter; Pinero, Fernando

    2016-01-01

    In this article we study the duals of Grassmann codes, certain codes coming from the Grassmannian variety. Exploiting their structure, we are able to count and classify all their minimum weight codewords. In this classification the lines lying on the Grassmannian variety play a central role. Rela...

  10. [Differentiation of coding quality in orthopaedics by special, illustration-oriented case group analysis in the G-DRG System 2005].

    Science.gov (United States)

    Schütz, U; Reichel, H; Dreinhöfer, K

    2007-01-01

    We introduce a grouping system for clinical practice which allows the separation of DRG coding into specific orthopaedic groups based on anatomical regions, operative procedures, therapeutic interventions and morbidity-equivalent diagnosis groups. With this, a differentiated, goal-oriented analysis of the internally mapped DRG data becomes possible. The group-specific difference in coding quality between the DRG groups following primary coding by the orthopaedic surgeon and final coding by medical controlling is analysed. In a consecutive series of 1600 patients, parallel documentation and group-specific comparison of the relevant DRG parameters were carried out in every case after primary and final coding. In the analysis of the group-specific share of additional CaseMix coding, the group "spine surgery" dominated, closely followed by the groups "arthroplasty" and "surgery due to infection, tumours, diabetes". Altogether, additional cost-weight-relevant coding was necessary most frequently in the latter group (84%), followed by the group "spine surgery" (65%). In DRGs representing conservative orthopaedic treatment, documented procedures had nearly no influence on the cost weight. The introduced system of case-group analysis in internal DRG documentation can lead to the detection of specific problems in primary coding and of cost-weight-relevant changes in the case mix. As an instrument for internal process control in the orthopaedic field, it can serve as a communicative interface between an economically oriented classification of hospital performance and specific problem-solving by the medical staff involved in department management.

  11. Classifications of Patterned Hair Loss: A Review.

    Science.gov (United States)

    Gupta, Mrinal; Mysore, Venkataram

    2016-01-01

    Patterned hair loss is the most common cause of hair loss seen in both the sexes after puberty. Numerous classification systems have been proposed by various researchers for grading purposes. These systems vary from the simpler systems based on recession of the hairline to the more advanced multifactorial systems based on the morphological and dynamic parameters that affect the scalp and the hair itself. Most of these preexisting systems have certain limitations. Currently, the Hamilton-Norwood classification system for males and the Ludwig system for females are most commonly used to describe patterns of hair loss. In this article, we review the various classification systems for patterned hair loss in both the sexes. Relevant articles were identified through searches of MEDLINE and EMBASE. Search terms included but were not limited to androgenic alopecia classification, patterned hair loss classification, male pattern baldness classification, and female pattern hair loss classification. Further publications were identified from the reference lists of the reviewed articles.

  12. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  13. Coding update of the SMFM definition of low risk for cesarean delivery from ICD-9-CM to ICD-10-CM.

    Science.gov (United States)

    Armstrong, Joanne; McDermott, Patricia; Saade, George R; Srinivas, Sindhu K

    2017-07-01

    In 2015, the Society for Maternal-Fetal Medicine developed a low-risk-for-cesarean-delivery definition based on administrative claims-based diagnosis codes described by the International Classification of Diseases, Ninth Revision, Clinical Modification. The Society for Maternal-Fetal Medicine definition is a clinical enrichment of 2 available measures from the Joint Commission and the Agency for Healthcare Research and Quality. The Society for Maternal-Fetal Medicine measure excludes diagnosis codes that represent clinically relevant risk factors that are absolute or relative contraindications to vaginal birth, while retaining diagnosis codes, such as labor disorders, that are discretionary risk factors for cesarean delivery. The introduction of the International Statistical Classification of Diseases, 10th Revision, Clinical Modification in October 2015 expanded the number of available diagnosis codes and enabled a greater depth and breadth of clinical description. These coding improvements further enhance the clinical validity of the Society for Maternal-Fetal Medicine definition and its potential utility in tracking progress toward the goal of safely lowering the US cesarean delivery rate. This report updates the Society for Maternal-Fetal Medicine definition of low risk for cesarean delivery using International Statistical Classification of Diseases, 10th Revision, Clinical Modification coding.

  14. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. Digital transmission, on the other hand, is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of a digital link becomes essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding usually refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
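
    A classical, concrete member of the waveform-coding family mentioned above is mu-law companding, as used in G.711 telephony. The sketch below implements the textbook companding formula and is not drawn from the article:

        # Mu-law companding: compress the dynamic range before quantization,
        # expand it again on decode. Standard textbook formula (mu = 255).
        import numpy as np

        MU = 255.0

        def mulaw_encode(x: np.ndarray) -> np.ndarray:
            """Compress samples in [-1, 1] with the mu-law characteristic."""
            return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

        def mulaw_decode(y: np.ndarray) -> np.ndarray:
            """Invert the companding curve."""
            return np.sign(y) * ((1 + MU) ** np.abs(y) - 1) / MU

        x = np.linspace(-1, 1, 5)
        print(mulaw_decode(mulaw_encode(x)).round(3))   # recovers the input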

  15. Why relevance theory is relevant for lexicography

    DEFF Research Database (Denmark)

    Bothma, Theo; Tarp, Sven

    2014-01-01

    This article starts by providing a brief summary of relevance theory in information science in relation to the function theory of lexicography, explaining the different types of relevance, viz. objective system relevance and the subjective types of relevance, i.e. topical, cognitive, situational ... It identifies a type of relevance that is very important for lexicography as well as for information science, viz. functional relevance. Since all lexicographic work is ultimately aimed at satisfying users' information needs, the article then discusses why the lexicographer should take note of all these types of relevance when planning a new dictionary project, identifying new tasks and responsibilities of the modern lexicographer. The article furthermore discusses how relevance theory impacts on teaching dictionary culture and reference skills. By integrating insights from lexicography and information science, the article contributes to new ...

  16. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  17. Classification differences and maternal mortality

    DEFF Research Database (Denmark)

    Salanave, B; Bouvier-Colle, M H; Varnoux, N

    1999-01-01

    OBJECTIVES: To compare the ways maternal deaths are classified in national statistical offices in Europe and to evaluate the ways classification affects published rates. METHODS: Data on pregnancy-associated deaths were collected in 13 European countries. Cases were classified by a European panel of experts into obstetric or non-obstetric causes. An ICD-9 code (International Classification of Diseases) was attributed to each case. These were compared to the codes given in each country. Correction indices were calculated, giving new estimates of maternal mortality rates. SUBJECTS: There were ... This change was substantial in three countries (P ...). Statistical offices appeared to attribute fewer deaths to obstetric causes. In the other countries, no differences were detected. According to official published data, the aggregated maternal mortality rate for participating countries was 7.7 per ...

  18. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and with the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  19. The Improved Relevance Voxel Machine

    DEFF Research Database (Denmark)

    Ganz, Melanie; Sabuncu, Mert; Van Leemput, Koen

    The concept of sparse Bayesian learning has received much attention in the machine learning literature as a means of achieving parsimonious representations of features used in regression and classification. It is an important family of algorithms for sparse signal recovery and compressed sensing, and enables basis selection from overcomplete dictionaries. One of the trailblazers of Bayesian learning is MacKay, who already worked on the topic in his PhD thesis in 1992 [1]. Later on, Tipping and Bishop developed the concept of sparse Bayesian learning [2, 3] and Tipping published the Relevance Vector ... Hence, in its current form it is reminiscent of a greedy forward feature selection algorithm. In this report, we aim to solve the problems of the original RVoxM algorithm in the spirit of [7] (FastRVM). We call the new algorithm Improved Relevance Voxel Machine (IRVoxM). Our contributions ...
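
    As a generic, runnable illustration of the sparse Bayesian idea behind the RVM/RVoxM family (not the IRVoxM algorithm itself), automatic relevance determination prunes the weights of irrelevant features; scikit-learn's ARDRegression is a convenient stand-in:

        # Sparse Bayesian regression via automatic relevance determination (ARD).
        # Synthetic data: only 2 of 10 features actually matter.
        import numpy as np
        from sklearn.linear_model import ARDRegression

        rng = np.random.default_rng(3)
        X = rng.normal(size=(100, 10))
        y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + 0.1 * rng.normal(size=100)

        model = ARDRegression().fit(X, y)
        print(np.round(model.coef_, 2))   # weights of irrelevant features shrink toward 0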

  20. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    ... a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other is a mix of computer programming syntax and human language. In this sense, queer code can be understood as both an object and subject of study that intervenes in the world's 'becoming' and in how material bodies are produced via human and nonhuman practices. Through mixing natural and computer language, this article presents a script in six parts from a performative lecture for two persons

  1. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault, their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a user's manual describing simulation procedures, input data preparation, output and example test cases

  2. 75 FR 78213 - Proposed Information Collection; Comment Request; 2012 Economic Census Classification Report for...

    Science.gov (United States)

    2010-12-15

    ... 8-digit North American Industry Classification System (NAICS) based code for use in the 2012... classification due to changes in NAICS for 2012. Collecting this classification information will ensure the... the reporting burden on sampled sectors. Proper NAICS classification data ensures high quality...

  3. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (material behaviour, large deformations, specific loads, unloading and loss-of-load-proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and the metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  4. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

    This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was launched in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator of the research and development environment Digitalisering i Skolen (DiS), Institut for Skole og Læring, Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor in learning technology, interaction design, design thinking and design pedagogy, from Forskningslab: It og Læringsdesign (ILD-LAB), Institut for kommunikation og psykologi, Aalborg University in Copenhagen. We followed the Coding Class project and carried out its evaluation and documentation in the period from November 2016 to May 2017...

  5. Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  6. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

    This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. Temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm. The functions of the algorithm's FORTRAN subroutines and variables are outlined

  7. Network Coding

    Indian Academy of Sciences (India)

    Network Coding. K V Rashmi, Nihar B Shah, P Vijay Kumar. General Article, Volume 15, Issue 7, July 2010, pp 604-621. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621

  8. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids

  9. Expander Codes

    Indian Academy of Sciences (India)

    Expander Codes: The Sipser–Spielman Construction. Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India. General Article, Resonance – Journal of Science Education, Volume 10, Issue 1.

  10. Evaluation of the Waste Isolation Pilot Plant classification of systems, structures and components

    International Nuclear Information System (INIS)

    1985-07-01

    A review of the classification system for systems, structures, and components at the Waste Isolation Pilot Plant (WIPP) was performed using the WIPP Safety Analysis Report (SAR) and Bechtel document D-76-D-03 as primary source documents. The regulations of the US Nuclear Regulatory Commission (NRC) covering "Disposal of High-Level Radioactive Wastes in Geologic Repositories," 10 CFR 60, and the regulations relevant to nuclear power plant siting and construction (10 CFR 50, 51, 100) were used as standards to evaluate the WIPP design classification system, although it is recognized that the US Department of Energy (DOE) is not required to comply with these NRC regulations in the design and construction of WIPP. The DOE General Design Criteria Manual (DOE Order 6430.1) and the Safety Analysis and Review System for AL Operations document (AL 5481.1A) were reviewed in part. This report includes a discussion of the historical basis for nuclear power plant requirements, a review of WIPP and nuclear power plant classification bases, and a comparison of the codes and standards applicable to each quality level. Observations made during the review of the WIPP SAR are noted in the text of this report. The conclusions reached by this review are: WIPP classification methodology is comparable to corresponding nuclear power procedures, and the classification levels assigned to WIPP systems are qualitatively the same as those assigned to nuclear power plant systems

  11. Deep learning relevance

    DEFF Research Database (Denmark)

    Lioma, Christina; Larsen, Birger; Petersen, Casper

    2016-01-01

    train a Recurrent Neural Network (RNN) on existing relevant information to that query. We then use the RNN to "deep learn" a single, synthetic, and we assume, relevant document for that query. We design a crowdsourcing experiment to assess how relevant the "deep learned" document is, compared...... to existing relevant documents. Users are shown a query and four wordclouds (of three existing relevant documents and our deep learned synthetic document). The synthetic document is ranked on average most relevant of all....

  12. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

    The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais (EAV/IAE/CTA). The codes are organized according to the classification given by the Argonne National Laboratory. For each code the following are given: author, institution of origin, abstract, programming language and existing bibliography. (Author) [pt

  13. Classification in context

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper surveys classification research literature, discusses various classification theories, and shows that the focus has traditionally been on establishing a scientific foundation for classification research. This paper argues that a shift has taken place, and suggests that contemporary...... classification research focus on contextual information as the guide for the design and construction of classification schemes....

  14. Classification of the web

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. This paper suggests that the classification of the Web meets challenges...... that call for inquiries into the theoretical foundation of bibliographic classification theory....

  15. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)

  16. CANAL code

    International Nuclear Information System (INIS)

    Gara, P.; Martin, E.

    1983-01-01

    The CANAL code presented here optimizes a realistic iron free extraction channel which has to provide a given transversal magnetic field law in the median plane: the current bars may be curved, have finite lengths and cooling ducts and move in a restricted transversal area; terminal connectors may be added, images of the bars in pole pieces may be included. A special option optimizes a real set of circular coils [fr

  17. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  18. The portal hypertension syndrome: etiology, classification, relevance, and animal models.

    Science.gov (United States)

    Bosch, Jaime; Iwakiri, Yasuko

    2018-02-01

    Portal hypertension is a key complication of cirrhosis, responsible for the development of varices, ascites, bleeding, and hepatic encephalopathy, which, in turn, cause high mortality and the need for liver transplantation. This review deals with the present-day, state-of-the-art preventative treatments of portal hypertension in cirrhosis according to disease stage. Two main disease stages are considered, compensated and decompensated cirrhosis, the first having a good prognosis and being mostly asymptomatic, and the second being heralded by the appearance of bleeding or non-bleeding complications of portal hypertension. The aim of treatment in compensated cirrhosis is preventing clinical decompensation, the most frequent event being ascites, followed by variceal bleeding and hepatic encephalopathy. Complications are mainly driven by an increase of the hepatic vein pressure gradient (HVPG) to values ≥10 mmHg (defining the presence of Clinically Significant Portal Hypertension, CSPH). Before CSPH, treatment is limited to etiologic treatment of cirrhosis and a healthy lifestyle (abstain from alcohol, avoid/correct obesity…). When CSPH is present, association of a non-selective beta-blocker (NSBB), including carvedilol, should be considered; NSBBs are mandatory if moderate/large varices are present. Patients should also enter a screening program for hepatocellular carcinoma. In decompensated patients, the goal is to prevent further bleeding if the only manifestation of decompensation was a bleeding episode, but to prevent liver transplantation and death in the common scenario where patients have manifested first non-bleeding complications. Treatment is based on the same principles (healthy lifestyle…) associated with administration of NSBBs, in combination if possible with endoscopic band ligation if there has been variceal bleeding, and complemented with simvastatin administration (20-40 mg per day in Child-Pugh A/B, 10-20 mg in Child C). Recurrence shall be treated with TIPS. TIPS might be indicated earlier in patients with: 1) difficult/refractory ascites, who are not the best candidates for NSBBs; 2) bleeding under NSBBs or no HVPG response (a decrease in HVPG of at least 20% from baseline or to values equal to or below 12 mmHg). All decompensated patients shall be considered potential candidates for liver transplantation. Treatment of portal hypertension has markedly improved in recent years; present-day therapy is based on accurate risk stratification according to disease stage.

  19. Classification of Clinically Relevant Microorganisms in Non-Medical Environments

    National Research Council Canada - National Science Library

    Bowers, Daniel

    2004-01-01

    .... He named them Staphylococcus aureus because of their golden color. S. aureus remains one of the most important pathogens in clinical settings largely due to the rapidity of its evolutionary response to treatment...

  20. Towards Product Lining Model-Driven Development Code Generators

    OpenAIRE

    Roth, Alexander; Rumpe, Bernhard

    2015-01-01

    A code generator systematically transforms compact models to detailed code. Today, code generation is regarded as an integral part of model-driven development (MDD). Despite its relevance, the development of code generators is an inherently complex task and common methodologies and architectures are lacking. Additionally, reuse and extension of existing code generators only exist on individual parts. A systematic development and reuse based on a code generator product line is still in its inf...

  1. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.; Bensmail, H.; Yao, N.; Gao, Xin

    2013-01-01

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding), learn codebooks and codes in an unsupervised manner and neglect class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds, that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutations identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. 2013 The Authors. All rights reserved.
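
    The class-conditioned codebook idea lends itself to a compact sketch. The following is a minimal stand-in, not the authors' margin-maximizing manifold matching: one dictionary is learned per class with scikit-learn's DictionaryLearning, and a test sample is assigned to the class whose codebook reconstructs it with the smallest error.

```python
# Sketch: class-conditioned sparse coding with reconstruction-error classification.
import numpy as np
from sklearn.decomposition import DictionaryLearning

def fit_class_codebooks(X, y, n_atoms=16, n_nonzero=3):
    """Learn one sparse-coding dictionary per class label."""
    books = {}
    for label in np.unique(y):
        dl = DictionaryLearning(n_components=n_atoms,
                                transform_algorithm="omp",
                                transform_n_nonzero_coefs=n_nonzero,
                                random_state=0)
        dl.fit(X[y == label])
        books[label] = dl
    return books

def predict(books, X):
    """Assign each sample to the class whose codebook reconstructs it best."""
    labels = sorted(books)
    residuals = []
    for label in labels:
        dl = books[label]
        codes = dl.transform(X)            # sparse codes w.r.t. this class's dictionary
        recon = codes @ dl.components_     # reconstruction from the codebook
        residuals.append(np.linalg.norm(X - recon, axis=1))
    return np.asarray(labels)[np.argmin(np.vstack(residuals), axis=0)]
```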

  2. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.

    2013-09-26

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding), learn codebooks and codes in an unsupervised manner and neglect class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds, that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutations identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. 2013 The Authors. All rights reserved.

  3. Causes of death and associated conditions (Codac): a utilitarian approach to the classification of perinatal deaths.

    Science.gov (United States)

    Frøen, J Frederik; Pinar, Halit; Flenady, Vicki; Bahrin, Safiah; Charles, Adrian; Chauke, Lawrence; Day, Katie; Duke, Charles W; Facchinetti, Fabio; Fretts, Ruth C; Gardener, Glenn; Gilshenan, Kristen; Gordijn, Sanne J; Gordon, Adrienne; Guyon, Grace; Harrison, Catherine; Koshy, Rachel; Pattinson, Robert C; Petersson, Karin; Russell, Laurie; Saastad, Eli; Smith, Gordon C S; Torabi, Rozbeh

    2009-06-10

    A carefully classified dataset of perinatal mortality will retain the most significant information on the causes of death. Such information is needed for health care policy development, surveillance and international comparisons, clinical services and research. For comparability purposes, we propose a classification system that could serve all these needs, and be applicable in both developing and developed countries. It is developed to adhere to basic concepts of underlying cause in the International Classification of Diseases (ICD), although gaps in ICD prevent classification of perinatal deaths solely on existing ICD codes. We tested the Causes of Death and Associated Conditions (Codac) classification for perinatal deaths in seven populations, including two developing country settings. We identified areas of potential improvements in the ability to retain existing information, ease of use and inter-rater agreement. After revisions to address these issues we propose Version II of Codac with detailed coding instructions. The ten main categories of Codac consist of three key contributors to global perinatal mortality (intrapartum events, infections and congenital anomalies), two crucial aspects of perinatal mortality (unknown causes of death and termination of pregnancy), a clear distinction of conditions relevant only to the neonatal period, and the remaining conditions arranged in the four anatomical compartments (fetal, cord, placental and maternal). For more detail there are 94 subcategories, further specified in 577 categories in the full version. Codac is designed to accommodate both the main cause of death as well as two associated conditions. We suggest reporting not only the main cause of death, but also the associated relevant conditions, so that scenarios of combined conditions and events are captured. The appropriately applied Codac system promises to better manage information on causes of perinatal deaths, the conditions associated with them, and the most
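
    The reporting structure described here, one main cause of death plus up to two associated conditions, maps naturally onto a small record type. Below is a hypothetical sketch of such a record; the field layout and placeholder codes are invented for illustration and are not actual Codac categories.

```python
# Hypothetical record type for a Codac-style report: one main cause of death
# plus up to two associated conditions, as the abstract describes.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CodacRecord:
    main_cause: str                      # a main-category/subcategory code (placeholder)
    associated_1: Optional[str] = None   # first associated condition, if any
    associated_2: Optional[str] = None   # second associated condition, if any
    icd_codes: List[str] = field(default_factory=list)  # supporting ICD codes

    def as_report_row(self) -> tuple:
        """Main cause plus associated conditions, for tabulated reporting."""
        return (self.main_cause, self.associated_1, self.associated_2)
```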

  4. Transcriptome classification reveals molecular subtypes in psoriasis

    Directory of Open Access Journals (Sweden)

    Ainali Chrysanthi

    2012-09-01

    Background: Psoriasis is an immune-mediated disease characterised by chronically elevated pro-inflammatory cytokine levels, leading to aberrant keratinocyte proliferation and differentiation. Although certain clinical phenotypes, such as plaque psoriasis, are well defined, it is currently unclear whether there are molecular subtypes that might impact on prognosis or treatment outcomes. Results: We present a pipeline for patient stratification through a comprehensive analysis of gene expression in paired lesional and non-lesional psoriatic tissue samples, compared with controls, to establish differences in RNA expression patterns across all tissue types. Ensembles of decision tree predictors were employed to cluster psoriatic samples on the basis of gene expression patterns and reveal gene expression signatures that best discriminate molecular disease subtypes. This multi-stage procedure was applied to several published psoriasis studies and a comparison of gene expression patterns across datasets was performed. Conclusion: Overall, classification of psoriasis gene expression patterns revealed distinct molecular sub-groups within the clinical phenotype of plaque psoriasis. Enrichment for TGFβ and ErbB signaling pathways, noted in one of the two psoriasis subgroups, suggested that this group may be more amenable to therapies targeting these pathways. Our study highlights the potential biological relevance of using ensemble decision tree predictors to determine molecular disease subtypes, in what may initially appear to be a homogeneous clinical group. The R code used in this paper is available upon request.
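
    One common way to realize the "ensembles of decision tree predictors" idea for sample stratification is to turn leaf co-occurrence in a forest into a sample-to-sample similarity and then cluster it. The sketch below is an assumed reconstruction of that general approach, not the paper's pipeline; the lesional/control setup and all names are illustrative.

```python
# Sketch: cluster samples by random-forest leaf co-occurrence (proximity).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def stratify(expr_lesional, expr_control, n_subtypes=2, n_trees=500):
    X = np.vstack([expr_lesional, expr_control])
    y = np.r_[np.ones(len(expr_lesional)), np.zeros(len(expr_control))]
    forest = RandomForestClassifier(n_estimators=n_trees, random_state=0).fit(X, y)
    # leaf indices per tree: shape (n_samples, n_trees)
    leaves = forest.apply(expr_lesional)
    # proximity = fraction of trees in which two samples land in the same leaf
    prox = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)
    dist = squareform(1.0 - prox, checks=False)
    # hierarchical clustering of the proximity-derived distance into subtypes
    return fcluster(linkage(dist, method="average"), n_subtypes, criterion="maxclust")
```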

  5. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  6. Classification Using Markov Blanket for Feature Selection

    DEFF Research Database (Denmark)

    Zeng, Yifeng; Luo, Jian

    2009-01-01

    Selecting relevant features is in demand when a large data set is of interest in a classification task. It produces a tractable number of features that are sufficient and possibly improve the classification performance. This paper studies a statistical method, the Markov blanket induction algorithm...... for filtering features and then applies a classifier using the Markov blanket predictors. The Markov blanket contains a minimal subset of relevant features that yields optimal classification performance. We experimentally demonstrate the improved performance of several classifiers using a Markov blanket...... induction as a feature selection method. In addition, we point out an important assumption behind the Markov blanket induction algorithm and show its effect on the classification performance....
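
    For intuition, here is a compact IAMB-style grow/shrink loop driven by conditional mutual information on discrete features. The CMI estimator and the fixed threshold are simplistic assumptions made for illustration; the paper's induction algorithm and its statistical tests are more careful.

```python
# Sketch: Markov-blanket-style feature filtering via conditional mutual information.
import numpy as np
from collections import Counter

def cmi(x, y, Z):
    """I(x; y | Z) in bits, for discrete 1-D arrays x, y and a list Z of 1-D arrays."""
    n = len(x)
    keys = list(zip(*Z)) if Z else [()] * n
    joint, xz, yz, zc = (Counter(zip(keys, x, y)), Counter(zip(keys, x)),
                         Counter(zip(keys, y)), Counter(keys))
    return sum(c / n * np.log2(c * zc[k] / (xz[(k, a)] * yz[(k, b)]))
               for (k, a, b), c in joint.items())

def markov_blanket(X, target, threshold=0.02):
    """IAMB-like grow/shrink; X is (n_samples, n_features), all values discrete."""
    mb, changed = [], True
    while changed:                                    # growing phase
        changed = False
        candidates = [j for j in range(X.shape[1]) if j not in mb]
        if not candidates:
            break
        score, j = max((cmi(X[:, j], target, [X[:, k] for k in mb]), j)
                       for j in candidates)
        if score > threshold:
            mb.append(j)
            changed = True
    for j in list(mb):                                # shrinking phase
        rest = [X[:, k] for k in mb if k != j]
        if cmi(X[:, j], target, rest) <= threshold:
            mb.remove(j)
    return mb
```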

  7. Comparison and analysis for item classifications between AP1000 and traditional PWR

    International Nuclear Information System (INIS)

    Luo Shuiyun; Liu Xiaoyan

    2012-01-01

    This paper presents a comparison and analysis of the safety classification, seismic category, code classification and QA classification between AP1000 and traditional PWRs. The AP1000 classification scheme allows safety to be guaranteed while cutting construction and manufacturing costs. It is suggested that a QA classification and QA requirements corresponding to national conditions be drafted in the process of AP1000 domestication. (authors)

  8. A novel risk classification system for 30-day mortality in children undergoing surgery

    Science.gov (United States)

    Walter, Arianne I.; Jones, Tamekia L.; Huang, Eunice Y.; Davis, Robert L.

    2018-01-01

    A simple, objective and accurate way of grouping children undergoing surgery into clinically relevant risk groups is needed. The purpose of this study is to develop and validate a preoperative risk classification system for postsurgical 30-day mortality for children undergoing a wide variety of operations. The National Surgical Quality Improvement Project-Pediatric participant use file data for calendar years 2012–2014 were analyzed to determine the preoperative variables most associated with death within 30 days of operation (D30). Risk groups were created using classification tree analysis based on these preoperative variables. The resulting risk groups were validated using 2015 data, and applied to neonates and higher-risk CPT codes to determine validity in high-risk subpopulations. A five-level risk classification was found to be most accurate. The preoperative need for ventilation, oxygen support, inotropic support, sepsis, the need for emergent surgery, and a do-not-resuscitate order defined non-overlapping groups with observed rates of D30 that vary from 0.075% (Very Low Risk) to 38.6% (Very High Risk). When CPT codes where death was never observed are eliminated, or when the system is applied to neonates, the groupings remain predictive of death in an ordinal manner. PMID:29351327
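
    As a toy illustration of how a classification tree yields a small number of ordered risk groups, the snippet below fits a five-leaf tree to synthetic binary preoperative flags and reports observed mortality per leaf. The predictor names and effect sizes are invented; this is not the published model.

```python
# Sketch: five ordered risk groups from a classification tree on synthetic data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 5000
# columns: ventilation, oxygen support, inotropes, sepsis, emergent, DNR (all invented)
X = rng.integers(0, 2, size=(n, 6))
logit = -7 + X @ np.array([2.5, 1.5, 2.0, 1.5, 1.0, 3.0])
died_30d = rng.random(n) < 1 / (1 + np.exp(-logit))   # synthetic D30 outcome

tree = DecisionTreeClassifier(max_leaf_nodes=5, min_samples_leaf=200, random_state=0)
tree.fit(X, died_30d)
# each leaf becomes a risk group, ordered by its observed mortality rate
leaf = tree.apply(X)
for rank, l in enumerate(sorted(set(leaf), key=lambda l: died_30d[leaf == l].mean())):
    print(f"risk group {rank + 1}: n={np.sum(leaf == l)}, "
          f"observed D30={died_30d[leaf == l].mean():.3%}")
```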

  9. MAD parsing and conversion code

    International Nuclear Information System (INIS)

    Mokhov, Dmitri N.

    2000-01-01

    The authors describe design and implementation issues while developing an embeddable MAD language parser. Two working applications of the parser are also described, namely, MAD-> C++ converter and C++ factory. The report contains some relevant details about the parser and examples of converted code. It also describes some of the problems that were encountered and the solutions found for them

  10. Survey of Codes Employing Nuclear Damage Assessment

    Science.gov (United States)

    1977-10-01

    surveyed codes were com- ... (battalion level and above); TALLEY/TOTEM: not nuclear; TARTARUS: too highly aggregated (battalion level and above); UNICORN: highly aggregated force allocation code. Vulnerability data can be input by the user as he receives them, and there is the ability to replay any situation using hindsight. The age of target

  11. Review of codes, standards, and regulations for natural gas locomotives.

    Science.gov (United States)

    2014-06-01

    This report identified, collected, and summarized relevant international codes, standards, and regulations with potential applicability to the use of natural gas as a locomotive fuel. Few international or country-specific codes, standards, and regu...

  12. Diagnosis of periodontal diseases using different classification ...

    African Journals Online (AJOL)

    2014-11-29

    Nov 29, 2014 ... Nigerian Journal of Clinical Practice • May-Jun 2015 • Vol 18 • Issue 3 ... Materials and Methods: A total of 150 patients was divided into two groups such as training ... functions from training data and DT learning is one of ... were represented as numerical codings for classification ..... tool within dentistry.

  13. High Order Tensor Formulation for Convolutional Sparse Coding

    KAUST Repository

    Bibi, Adel Aamer; Ghanem, Bernard

    2017-01-01

    Convolutional sparse coding (CSC) has gained attention for its successful role as a reconstruction and a classification tool in the computer vision and machine learning community. Current CSC methods can only reconstruct single-feature 2D images

  14. Cost reducing code implementation strategies

    International Nuclear Information System (INIS)

    Kurtz, Randall L.; Griswold, Michael E.; Jones, Gary C.; Daley, Thomas J.

    1995-01-01

    Sargent and Lundy's Code consulting experience reveals a wide variety of approaches toward implementing the requirements of various nuclear Codes and Standards. This paper will describe various Code implementation strategies which assure that Code requirements are fully met in a practical and cost-effective manner. Applications to be discussed include the following: new construction; repair, replacement and modifications; and assessments and life extensions. Lessons learned and illustrative examples will be included. Preferred strategies and specific recommendations will also be addressed. Sargent and Lundy appreciates the opportunity provided by the Korea Atomic Industrial Forum and the Korean Nuclear Society to share our ideas and enhance global cooperation through the exchange of information and views on relevant topics

  15. SAW Classification Algorithm for Chinese Text Classification

    OpenAIRE

    Xiaoli Guo; Huiyu Sun; Tiehua Zhou; Ling Wang; Zhaoyang Qu; Jiannan Zang

    2015-01-01

    Considering the explosive growth of data, the increasing amount of text data places ever higher demands on the performance of text categorization, demands that existing classification methods cannot satisfy. Based on a study of existing text classification technology and semantics, this paper puts forward a Chinese-text-classification-oriented SAW (Structural Auxiliary Word) algorithm. The algorithm uses the special space effect of Chinese text where words...

  16. Nevada Administrative Code for Special Education Programs.

    Science.gov (United States)

    Nevada State Dept. of Education, Carson City. Special Education Branch.

    This document presents excerpts from Chapter 388 of the Nevada Administrative Code, which concerns definitions, eligibility, and programs for students who are disabled or gifted/talented. The first section gathers together 36 relevant definitions from the Code for such concepts as "adaptive behavior," "autism," "gifted and…

  17. Cracking the code of oscillatory activity.

    Directory of Open Access Journals (Sweden)

    Philippe G Schyns

    2011-05-01

    Neural oscillations are ubiquitous measurements of cognitive processes and dynamic routing and gating of information. The fundamental and so far unresolved problem for neuroscience remains to understand how oscillatory activity in the brain codes information for human cognition. In a biologically relevant cognitive task, we instructed six human observers to categorize facial expressions of emotion while we measured the observers' EEG. We combined state-of-the-art stimulus control with statistical information theory analysis to quantify how the three parameters of oscillations (i.e., power, phase, and frequency) code the visual information relevant for behavior in a cognitive task. We make three points: First, we demonstrate that phase codes considerably more information (2.4 times) relating to the cognitive task than power. Second, we show that the conjunction of power and phase coding reflects detailed visual features relevant for behavioral response, that is, features of facial expressions predicted by behavior. Third, we demonstrate, in analogy to communication technology, that oscillatory frequencies in the brain multiplex the coding of visual features, increasing coding capacity. Together, our findings about the fundamental coding properties of neural oscillations will redirect the research agenda in neuroscience by establishing the differential role of frequency, phase, and amplitude in coding behaviorally relevant information in the brain.
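
    The paper's central comparison, how much stimulus information phase carries versus power, can be miniaturized as follows. This synthetic sketch plants the category information in the phase of a 10 Hz oscillation and estimates mutual information by crude binning; the study itself used far more careful information-theoretic estimators on real EEG.

```python
# Sketch: MI carried by phase vs. power of an oscillation about a stimulus label.
import numpy as np
from scipy.signal import hilbert
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(1)
n_trials, fs, f = 400, 256, 10                 # trials, samples per trial, frequency (Hz)
stim = rng.integers(0, 2, n_trials)            # two stimulus categories
t = np.arange(fs) / fs
# category information is planted in the oscillation's phase, not its amplitude
trials = np.array([np.sin(2 * np.pi * f * t + s * np.pi) + 0.5 * rng.standard_normal(fs)
                   for s in stim])
analytic = hilbert(trials, axis=1)
mid = fs // 2
phase = np.angle(analytic[:, mid])             # instantaneous phase at one time point
power = np.abs(analytic[:, mid]) ** 2          # instantaneous power at the same point

def mi(x, bins=8):
    edges = np.histogram_bin_edges(x, bins)
    return mutual_info_score(stim, np.digitize(x, edges))

print(f"MI(phase; stimulus) = {mi(phase):.3f} nats")   # high: phase is informative
print(f"MI(power; stimulus) = {mi(power):.3f} nats")   # near zero here
```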

  18. The classification of easement

    Directory of Open Access Journals (Sweden)

    Popov Danica D.

    2015-01-01

    Easement means a right enjoyed by the owner of land over the land of another, such as rights of way, rights of light, rights of support, or rights to a flow of air or water. The dominant tenement is the land owned by the possessor of the easement, and the servient tenement is the land over which the right is enjoyed. An easement must exist for the accommodation and better enjoyment of the land to which it is annexed; otherwise it may amount to a mere licence. An easement benefits and binds the land itself and therefore continues despite any change of ownership of either dominant or servient tenement, although it will be extinguished if the two tenements come into common ownership. An easement can only be enjoyed in respect of land, which means two parcels of land: a dominant tenement, to which the benefit of the easement attaches, and a servient tenement, which bears the burden of the easement. A positive easement consists of a right to do something on the land of another; a negative easement restricts the use that the owner of the servient tenement may make of his land. An easement may exist over the land itself or over a building erected on it; a further classification distinguishes easements on the ground from easements under the ground. An easement shall be exercised in accordance with the principle of restriction, meaning that it should burden the servient tenement as little as possible; where there is doubt about the extent of an easement, the reading easier on the servient tenement is adopted. New needs of the dominant estate do not result in the expansion of the servitude. The article compares the Draft Code of Property and Other Real Estate with the Draft Civil Code of Serbia.

  19. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist Hospital, since May 1992. The ACR dictionary files consist of 11 files, one for the organ code and the others for the pathology code. The organ code is obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one, which are simultaneously displayed on the screen. According to the first number of the selected organ code, the corresponding pathology code file is chosen automatically. In a similar fashion to the organ code selection, the proper pathology code is obtained. An example of an obtained ACR code is '131.3661'. This procedure is reproducible regardless of the number of fields of data. Because the program was written in 'User's Defined Function' form, decoding of the stored ACR code is achieved by the same program, and incorporation of this program into another data-processing program is possible. The program has the merits of simple operation, accurate and detailed coding, and easy adjustment for other programs. Therefore, it can be used for automation of routine work in the department of radiology
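
    The two-step lookup the abstract describes, organ code first, then a pathology table selected by the organ code's first digit, can be mimicked with plain dictionaries. All table entries below are invented placeholders; only the '131.3661' joining format comes from the abstract.

```python
# Sketch of the described lookup flow; dictionary contents are placeholders,
# not real ACR entries.
ORGANS = {"131": "some organ (placeholder)"}                  # organ-code dictionary
PATHOLOGY = {"1": {"3661": "some pathology (placeholder)"}}   # keyed by first digit of organ code

def acr_code(organ: str, pathology: str) -> str:
    if organ not in ORGANS:
        raise KeyError(f"unknown organ code {organ}")
    table = PATHOLOGY[organ[0]]        # pathology file chosen by the organ code's first digit
    if pathology not in table:
        raise KeyError(f"unknown pathology code {pathology}")
    return f"{organ}.{pathology}"      # e.g. '131.3661', as in the abstract

print(acr_code("131", "3661"))
```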

  20. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
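
    To make the "inner code inside an outer code" layering concrete, here is a deliberately tiny stand-in: a single-parity outer code wrapped around a rate-1/3 repetition inner code. The actual system under study concatenates a modulation block code with the interleaved RS(255,223) outer code; nothing below reproduces that, it only shows the encode/decode nesting.

```python
# Sketch: concatenated coding with a toy outer (parity) and inner (repetition) code.
def outer_encode(bits):                 # append even parity per 2-bit block
    return [b for i in range(0, len(bits), 2)
            for b in (*bits[i:i+2], bits[i] ^ bits[i+1])]

def inner_encode(bits):                 # rate-1/3 repetition code
    return [b for bit in bits for b in (bit, bit, bit)]

def inner_decode(bits):                 # majority vote per received triple
    return [int(sum(bits[i:i+3]) >= 2) for i in range(0, len(bits), 3)]

def outer_decode(bits):                 # strip parity (detection only, no correction)
    return [b for i in range(0, len(bits), 3) for b in bits[i:i+2]]

msg = [1, 0, 1, 1]
tx = inner_encode(outer_encode(msg))    # outer code first, inner code next to the channel
tx[4] ^= 1                              # flip one channel bit
assert outer_decode(inner_decode(tx)) == msg   # inner code absorbs the error
```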

  1. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
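
    For readers unfamiliar with Shannon coding: the static construction underlying the dynamic algorithm assigns each symbol of probability p a codeword of length ceil(-log2 p), read off the binary expansion of the running cumulative probability. A minimal static version follows; the paper's contribution, updating the model as symbols arrive, is not shown.

```python
# Sketch: static Shannon code construction (prefix-free by construction).
from math import ceil, log2

def shannon_code(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> binary codeword."""
    code, cum = {}, 0.0
    for sym, p in sorted(probs.items(), key=lambda kv: -kv[1]):
        length = ceil(-log2(p))
        # first `length` bits of the binary expansion of the cumulative probability
        bits, frac = [], cum
        for _ in range(length):
            frac *= 2
            bits.append("1" if frac >= 1 else "0")
            frac -= int(frac)
        code[sym] = "".join(bits)
        cum += p
    return code

# {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
```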

  2. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding, brings you a clear and comprehensive discussion of the basic principles of this field: two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding; Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes; distance properties of convolutional codes; and a downloadable solutions manual

  3. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.

  4. Lossless Compression of Classification-Map Data

    Science.gov (United States)

    Hua, Xie; Klimesh, Matthew

    2009-01-01

    A lossless image-data-compression algorithm intended specifically for application to classification-map data is based on prediction, context modeling, and entropy coding. The algorithm was formulated, in consideration of the differences between classification maps and ordinary images of natural scenes, so as to be capable of compressing classification-map data more effectively than do general-purpose image-data-compression algorithms. Classification maps are typically generated from remote-sensing images acquired by instruments aboard aircraft (see figure) and spacecraft. A classification map is a synthetic image that summarizes information derived from one or more original remote-sensing image(s) of a scene. The value assigned to each pixel in such a map is the index of a class that represents some type of content deduced from the original image data: for example, a type of vegetation, a mineral, or a body of water at the corresponding location in the scene. When classification maps are generated onboard the aircraft or spacecraft, it is desirable to compress the classification-map data in order to reduce the volume of data that must be transmitted to a ground station.
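
    The prediction/context-modeling/entropy-coding pipeline can be sketched for class-index maps as follows: a neighboring pixel serves as the context, an adaptive count table models the symbol distribution per context, and the ideal arithmetic-coding cost is accumulated. This is a generic illustration with assumptions throughout, not NASA's algorithm, and no bitstream is emitted.

```python
# Sketch: ideal adaptive context-model cost for a classification map.
import numpy as np
from collections import defaultdict

def ideal_code_length(cmap):
    """cmap: 2-D integer array of class indices. Returns total bits (ideal)."""
    K = int(cmap.max()) + 1                          # alphabet size
    counts = defaultdict(lambda: defaultdict(int))   # context -> symbol -> count
    bits = 0.0
    h, w = cmap.shape
    for r in range(h):
        for c in range(w):
            # context: left neighbor (or upper neighbor on the first column)
            ctx = cmap[r, c - 1] if c > 0 else (cmap[r - 1, c] if r > 0 else -1)
            s = cmap[r, c]
            table = counts[ctx]
            p = (table[s] + 1) / (sum(table.values()) + K)   # Laplace smoothing
            bits -= np.log2(p)                               # ideal entropy-coder cost
            table[s] += 1                                    # adapt the model
    return bits

cmap = np.zeros((64, 64), dtype=int); cmap[:, 32:] = 1       # two-class toy map
print(f"{ideal_code_length(cmap):.0f} bits vs {cmap.size} bits uncompressed")
```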

  5. Classifying Coding DNA with Nucleotide Statistics

    Directory of Open Access Journals (Sweden)

    Nicolas Carels

    2009-10-01

    In this report, we compared the success rate of classification of coding sequences (CDS) vs. introns by the Codon Structure Factor (CSF) and by a method that we called the Universal Feature Method (UFM). UFM is based on the scoring of purine bias (Rrr) and stop codon frequency. We show that the success rate of CDS/intron classification by UFM is higher than by CSF. UFM classifies ORFs as coding or non-coding through a score based on (i) the stop codon distribution, (ii) the product of purine probabilities in the three positions of nucleotide triplets, (iii) the product of Cytosine (C), Guanine (G), and Adenine (A) probabilities in the 1st, 2nd, and 3rd positions of triplets, respectively, (iv) the probabilities of G in the 1st and 2nd positions of triplets and (v) the distance of their GC3 vs. GC2 levels to the regression line of the universal correlation. More than 80% of CDSs (true positives) of Homo sapiens (>250 bp), Drosophila melanogaster (>250 bp) and Arabidopsis thaliana (>200 bp) are successfully classified with a false positive rate lower than or equal to 5%. The method releases coding sequences in their coding strand and coding frame, which allows their automatic translation into protein sequences with 95% confidence. The method is a natural consequence of the compositional bias of nucleotides in coding sequences.
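
    Two of the UFM ingredients listed above, internal stop-codon frequency and positional purine bias, are easy to compute directly. The thresholds and the decision rule in this sketch are invented for illustration; the published method combines five scores, including the GC3-vs-GC2 regression distance, which is not reproduced here.

```python
# Sketch: stop-codon frequency and positional purine bias for an ORF.
def triplet_stats(seq):
    seq = seq.upper()
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    stops = sum(c in ("TAA", "TAG", "TGA") for c in codons) / len(codons)
    # fraction of purines (A or G) at each of the three codon positions
    purine = [sum(c[pos] in "AG" for c in codons) / len(codons) for pos in range(3)]
    return stops, purine

def looks_coding(seq, max_stop=0.01, min_purine1=0.55):
    stops, purine = triplet_stats(seq)
    # coding frames tend to avoid internal stops and to be purine-rich at position 1
    return stops <= max_stop and purine[0] >= min_purine1

print(looks_coding("ATG" + "GAA" * 60))   # True for this purine-rich toy ORF
```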

  6. Research on quality assurance classification methodology for domestic AP1000 nuclear power projects

    International Nuclear Information System (INIS)

    Bai Jinhua; Jiang Huijie; Li Jingyan

    2012-01-01

    To meet the quality assurance classification requirements of domestic nuclear safety codes and standards, this paper analyzes the quality assurance classification methodology of domestic AP1000 nuclear power projects at present, and proposes the quality assurance classification methodology for subsequent AP1000 nuclear power projects. (authors)

  7. Making Deferred Taxes Relevant

    NARCIS (Netherlands)

    Brouwer, Arjan; Naarding, Ewout

    2018-01-01

    We analyse the conceptual problems in current accounting for deferred taxes and provide solutions derived from the literature in order to make International Financial Reporting Standards (IFRS) deferred tax numbers value-relevant. In our view, the empirical results concerning the value relevance of

  8. Parsimonious relevance models

    NARCIS (Netherlands)

    Meij, E.; Weerkamp, W.; Balog, K.; de Rijke, M.; Myang, S.-H.; Oard, D.W.; Sebastiani, F.; Chua, T.-S.; Leong, M.-K.

    2008-01-01

    We describe a method for applying parsimonious language models to re-estimate the term probabilities assigned by relevance models. We apply our method to six topic sets from test collections in five different genres. Our parsimonious relevance models (i) improve retrieval effectiveness in terms of

  9. Asteroid taxonomic classifications

    International Nuclear Information System (INIS)

    Tholen, D.J.

    1989-01-01

    This paper reports on three taxonomic classification schemes developed and applied to the body of available color and albedo data. Asteroid taxonomic classifications according to two of these schemes are reproduced

  10. Computer codes for designing proton linear accelerators

    International Nuclear Information System (INIS)

    Kato, Takao

    1992-01-01

    Computer codes for designing proton linear accelerators are discussed from the viewpoint of not only designing but also construction and operation of the linac. The codes are divided into three categories according to their purposes: 1) design code, 2) generation and simulation code, and 3) electric and magnetic fields calculation code. The role of each category is discussed on the basis of experience at KEK (the design of the 40-MeV proton linac and its construction and operation, and the design of the 1-GeV proton linac). We introduce our recent work relevant to three-dimensional calculation and supercomputer calculation: 1) tuning of MAFIA (three-dimensional electric and magnetic fields calculation code) for supercomputer, 2) examples of three-dimensional calculation of accelerating structures by MAFIA, 3) development of a beam transport code including space charge effects. (author)

  11. Application of Cocktail method in vegetation classification

    Directory of Open Access Journals (Sweden)

    Hamed Asadi

    2016-09-01

    This study assesses the application of the Cocktail method to the classification of large vegetation databases. For this purpose, a Buxus hyrcana dataset consisting of 442 relevés with 89 species was used together with modified TWINSPAN. To run the Cocktail method, a preliminary classification was first produced with modified TWINSPAN, and phi analysis of the resulting groups identified the five species with the highest fidelity values. Sociological species groups were then formed by examining the co-occurrence of these five species with the other species in the database. By assigning 379 relevés to the sociological species groups using logical formulas, 21 plant communities belonging to 6 variants, 17 subassociations, 11 associations, 4 alliances, 1 order and 1 class were recognized. The 63 relevés that were not assigned to any sociological species group by the logical formulas were assigned, using the FPFI index, to the group for which they had the highest index value. Given the 91% agreement between the Cocktail and Braun-Blanquet classifications, we suggest the Cocktail method to vegetation scientists as an efficient alternative to the Braun-Blanquet method for classifying large vegetation databases.

  12. Hand eczema classification

    DEFF Research Database (Denmark)

    Diepgen, T L; Andersen, Klaus Ejner; Brandao, F M

    2008-01-01

    of the disease is rarely evidence based, and a classification system for different subdiagnoses of hand eczema is not agreed upon. Randomized controlled trials investigating the treatment of hand eczema are called for. For this, as well as for clinical purposes, a generally accepted classification system...... A classification system for hand eczema is proposed. Conclusions It is suggested that this classification be used in clinical work and in clinical trials....

  13. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role to coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...
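
    The core mechanic, intermediate nodes forwarding GF(2) linear combinations that sinks then invert, fits in a few lines. Below, an L x L binary coding matrix that is its own inverse over GF(2) mixes packet a into the relay's output, and a sink that already holds b recovers a. The particular matrix choice is an illustrative assumption, not one of the paper's optimized constructions.

```python
# Sketch: vector network coding with an L x L coding matrix over GF(2).
import numpy as np

rng = np.random.default_rng(2)
L = 4
a, b = rng.integers(0, 2, L), rng.integers(0, 2, L)   # two source packets of length L

# a coding matrix over GF(2) that is its own inverse (block-diagonal involution)
M = np.kron(np.eye(L // 2, dtype=int), np.array([[1, 1], [0, 1]]))

combined = (M @ a + b) % 2              # relay output: M*a + I*b over GF(2)
a_hat = (M @ ((combined + b) % 2)) % 2  # a sink holding b strips it, then applies M again
assert np.array_equal(a_hat, a)
```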

  14. Classification with support hyperplanes

    NARCIS (Netherlands)

    G.I. Nalbantov (Georgi); J.C. Bioch (Cor); P.J.F. Groenen (Patrick)

    2006-01-01

    A new classification method is proposed, called Support Hyperplanes (SHs). To solve the binary classification task, SHs consider the set of all hyperplanes that do not make classification mistakes, referred to as semi-consistent hyperplanes. A test object is classified using

  15. Standard classification: Physics

    International Nuclear Information System (INIS)

    1977-01-01

    This is a draft standard classification of physics. The conception is based on the physics part of the systematic catalogue of the Bayerische Staatsbibliothek and on the classification given in standard textbooks. The ICSU-AB classification now used worldwide by physics information services was not taken into account. (BJ) [de

  16. Identification of ICD Codes Suggestive of Child Maltreatment

    Science.gov (United States)

    Schnitzer, Patricia G.; Slusher, Paula L.; Kruse, Robin L.; Tarleton, Molly M.

    2011-01-01

    Objective: In order to be reimbursed for the care they provide, hospitals in the United States are required to use a standard system to code all discharge diagnoses: the International Classification of Disease, 9th Revision, Clinical Modification (ICD-9). Although ICD-9 codes specific for child maltreatment exist, they do not identify all…

  17. Containment Code Validation Matrix

    International Nuclear Information System (INIS)

    Chin, Yu-Shan; Mathew, P.M.; Glowa, Glenn; Dickson, Ray; Liang, Zhe; Leitch, Brian; Barber, Duncan; Vasic, Aleks; Bentaib, Ahmed; Journeau, Christophe; Malet, Jeanne; Studer, Etienne; Meynet, Nicolas; Piluso, Pascal; Gelain, Thomas; Michielsen, Nathalie; Peillon, Samuel; Porcheron, Emmanuel; Albiol, Thierry; Clement, Bernard; Sonnenkalb, Martin; Klein-Hessling, Walter; Arndt, Siegfried; Weber, Gunter; Yanez, Jorge; Kotchourko, Alexei; Kuznetsov, Mike; Sangiorgi, Marco; Fontanet, Joan; Herranz, Luis; Garcia De La Rua, Carmen; Santiago, Aleza Enciso; Andreani, Michele; Paladino, Domenico; Dreier, Joerg; Lee, Richard; Amri, Abdallah

    2014-01-01

    The Committee on the Safety of Nuclear Installations (CSNI) formed the CCVM (Containment Code Validation Matrix) task group in 2002. The objective of this group was to define a basic set of available experiments for code validation, covering the range of containment (ex-vessel) phenomena expected in the course of light and heavy water reactor design basis accidents and beyond design basis accidents/severe accidents. It was to consider phenomena relevant to pressurised heavy water reactor (PHWR), pressurised water reactor (PWR) and boiling water reactor (BWR) designs of Western origin as well as of Eastern European VVER types. This work would complement the two existing CSNI validation matrices for thermal hydraulic code validation (NEA/CSNI/R(1993)14) and in-vessel core degradation (NEA/CSNI/R(2001)21). The report initially provides a brief overview of the main features of PWR, BWR, CANDU and VVER reactors. It also provides an overview of ex-vessel corium retention (core catcher). It then provides a general overview of the accident progression for light water and heavy water reactors. The main focus is to capture most of the phenomena and safety systems employed in these reactor types and to highlight the differences. This CCVM contains a description of 127 phenomena, broken down into 6 categories: Containment Thermal-hydraulics Phenomena; Hydrogen Behaviour (Combustion, Mitigation and Generation) Phenomena; Aerosol and Fission Product Behaviour Phenomena; Iodine Chemistry Phenomena; Core Melt Distribution and Behaviour in Containment Phenomena; and Systems Phenomena. A synopsis is provided for each phenomenon, including a description, references for further information, significance for DBA and SA/BDBA and a list of experiments that may be used for code validation. The report identified 213 experiments, broken down into the same six categories (as done for the phenomena). An experiment synopsis is provided for each test. Along with a test description

  18. GOC: General Orbit Code

    International Nuclear Information System (INIS)

    Maddox, L.B.; McNeilly, G.S.

    1979-08-01

    GOC (General Orbit Code) is a versatile program which will perform a variety of calculations relevant to isochronous cyclotron design studies. In addition to the usual calculations of interest (e.g., equilibrium and accelerated orbits, focusing frequencies, field isochronization, etc.), GOC has a number of options to calculate injections with a charge change. GOC provides both printed and plotted output, and will follow groups of particles to allow determination of finite-beam properties. An interactive PDP-10 program called GIP, which prepares input data for GOC, is available. GIP is a very easy and convenient way to prepare complicated input data for GOC. Enclosed with this report are several microfiche containing source listings of GOC and other related routines and the printed output from a multiple-option GOC run

  19. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: we show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs; we show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs; we find and classify all 2D homological stabilizer codes; and we find optimal codes among the homological stabilizer codes.
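
    As a concrete instance of a code "defined solely by the graph it resides on", the sketch below enumerates the stabilizer supports of Kitaev's toric code on an n x n periodic grid (qubits on edges, star X-stabilizers at vertices, plaquette Z-stabilizers on faces) and verifies the even-overlap condition that makes X- and Z-type generators commute. This illustrates the object of study, not the paper's classification argument.

```python
# Sketch: stabilizer supports of the toric code on an n x n torus.
from itertools import product

def toric_stabilizers(n):
    def h(r, c): return ("h", r % n, c % n)   # horizontal edge to the right of vertex (r, c)
    def v(r, c): return ("v", r % n, c % n)   # vertical edge below vertex (r, c)
    stars, plaquettes = [], []
    for r, c in product(range(n), repeat=2):
        # star: the four edges incident to vertex (r, c)  -> X-type generator
        stars.append({h(r, c), h(r, c - 1), v(r, c), v(r - 1, c)})
        # plaquette: the four edges bounding the face whose top-left corner is (r, c)  -> Z-type
        plaquettes.append({h(r, c), h(r + 1, c), v(r, c), v(r, c + 1)})
    return stars, plaquettes

stars, plaquettes = toric_stabilizers(4)
# X and Z generators commute iff every star/plaquette pair shares an even number of edges
assert all(len(s & p) % 2 == 0 for s in stars for p in plaquettes)
```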

  20. Measuring participation according to the International Classification of Functioning to

    NARCIS (Netherlands)

    Perenboom, R.J.M.; Chorus, A.M.J.

    2003-01-01

    Purpose of this study was to report which existing survey instruments assess participation according to the International Classification of Functioning. Disability and Health (ICF). A literature search for relevant survey instruments was conducted. Subsequently, survey instruments were evaluated of

  1. Fuel performance analysis code 'FAIR'

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1994-01-01

    For modelling nuclear reactor fuel rod behaviour of water cooled reactors under severe power maneuvering and high burnups, a mechanistic fuel performance analysis code FAIR has been developed. The code incorporates a finite element based thermomechanical module, a physically based fission gas release module and relevant models for modelling fuel-related phenomena, such as pellet cracking, densification and swelling, radial flux redistribution across the pellet due to the build up of plutonium near the pellet surface, pellet clad mechanical interaction/stress corrosion cracking (PCMI/SCC) failure of the sheath, etc. The code follows the established principles of fuel rod analysis programmes, such as coupling of thermal and mechanical solutions along with the fission gas release calculations, analysing different axial segments of the fuel rod simultaneously, and providing means for performing local analysis such as clad ridging analysis. The modular nature of the code offers flexibility in effecting modifications easily to the code for modelling MOX fuels and thorium based fuels. For performing analysis of fuel rods subjected to very long power histories within a reasonable amount of time, the code has been parallelised and is commissioned on the ANUPAM parallel processing system developed at Bhabha Atomic Research Centre (BARC). (author). 37 refs

  2. Classification of refrigerants; Classification des fluides frigorigenes

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    This document was made from the US standard ANSI/ASHRAE 34 published in 2001 and entitled 'designation and safety classification of refrigerants'. This classification makes it possible to organize clearly, in an internationally consistent way, all the refrigerants used in the world, thanks to a codification of the refrigerants in correspondence with their chemical composition. This note explains this codification: prefix, suffixes (hydrocarbons and derived fluids, azeotropic and non-azeotropic mixtures, various organic compounds, non-organic compounds), and safety classification (toxicity, flammability, case of mixtures). (J.S.)
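
    For the methane/ethane series, the codification works roughly like this: in R-xyz, x+1 gives the carbon count, y-1 the hydrogen count, z the fluorine count, and chlorine fills the remaining bonds of the saturated molecule. A toy decoder under that rule follows; isomer suffix letters and the R-4xx/R-5xx mixture series are deliberately out of scope.

```python
# Sketch: decode a methane/ethane-series refrigerant number into atom counts.
def decode_r_number(number: int):
    x, y, z = number // 100, (number // 10) % 10, number % 10
    c, h, f = x + 1, y - 1, z
    cl = (2 * c + 2) - h - f          # saturated C_nH_(2n+2) backbone fills with Cl
    if h < 0 or cl < 0:
        raise ValueError("not a saturated methane/ethane-series designation")
    return {"C": c, "H": h, "F": f, "Cl": cl}

print(decode_r_number(22))    # {'C': 1, 'H': 1, 'F': 2, 'Cl': 1} -> CHClF2
print(decode_r_number(134))   # {'C': 2, 'H': 2, 'F': 4, 'Cl': 0} -> C2H2F4
```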

  3. Classifications of patterned hair loss: a review

    Directory of Open Access Journals (Sweden)

    Mrinal Gupta

    2016-01-01

    Patterned hair loss is the most common cause of hair loss seen in both the sexes after puberty. Numerous classification systems have been proposed by various researchers for grading purposes. These systems vary from the simpler systems based on recession of the hairline to the more advanced multifactorial systems based on the morphological and dynamic parameters that affect the scalp and the hair itself. Most of these preexisting systems have certain limitations. Currently, the Hamilton-Norwood classification system for males and the Ludwig system for females are most commonly used to describe patterns of hair loss. In this article, we review the various classification systems for patterned hair loss in both the sexes. Relevant articles were identified through searches of MEDLINE and EMBASE. Search terms included but were not limited to androgenic alopecia classification, patterned hair loss classification, male pattern baldness classification, and female pattern hair loss classification. Further publications were identified from the reference lists of the reviewed articles.

  4. Causes of death and associated conditions (Codac – a utilitarian approach to the classification of perinatal deaths

    Directory of Open Access Journals (Sweden)

    Harrison Catherine

    2009-06-01

    A carefully classified dataset of perinatal mortality will retain the most significant information on the causes of death. Such information is needed for health care policy development, surveillance and international comparisons, clinical services and research. For comparability purposes, we propose a classification system that could serve all these needs, and be applicable in both developing and developed countries. It is developed to adhere to basic concepts of underlying cause in the International Classification of Diseases (ICD), although gaps in ICD prevent classification of perinatal deaths solely on existing ICD codes. We tested the Causes of Death and Associated Conditions (Codac) classification for perinatal deaths in seven populations, including two developing country settings. We identified areas of potential improvements in the ability to retain existing information, ease of use and inter-rater agreement. After revisions to address these issues we propose Version II of Codac with detailed coding instructions. The ten main categories of Codac consist of three key contributors to global perinatal mortality (intrapartum events, infections and congenital anomalies), two crucial aspects of perinatal mortality (unknown causes of death and termination of pregnancy), a clear distinction of conditions relevant only to the neonatal period and the remaining conditions are arranged in the four anatomical compartments (fetal, cord, placental and maternal). For more detail there are 94 subcategories, further specified in 577 categories in the full version. Codac is designed to accommodate both the main cause of death as well as two associated conditions. We suggest reporting not only the main cause of death, but also the associated relevant conditions so that scenarios of combined conditions and events are captured. The appropriately applied Codac system promises to better manage information on causes of perinatal deaths, the conditions

  5. Causes of death and associated conditions (Codac) – a utilitarian approach to the classification of perinatal deaths

    Science.gov (United States)

    Frøen, J Frederik; Pinar, Halit; Flenady, Vicki; Bahrin, Safiah; Charles, Adrian; Chauke, Lawrence; Day, Katie; Duke, Charles W; Facchinetti, Fabio; Fretts, Ruth C; Gardener, Glenn; Gilshenan, Kristen; Gordijn, Sanne J; Gordon, Adrienne; Guyon, Grace; Harrison, Catherine; Koshy, Rachel; Pattinson, Robert C; Petersson, Karin; Russell, Laurie; Saastad, Eli; Smith, Gordon CS; Torabi, Rozbeh

    2009-01-01

    A carefully classified dataset of perinatal mortality will retain the most significant information on the causes of death. Such information is needed for health care policy development, surveillance and international comparisons, clinical services and research. For comparability purposes, we propose a classification system that could serve all these needs, and be applicable in both developing and developed countries. It is developed to adhere to basic concepts of underlying cause in the International Classification of Diseases (ICD), although gaps in ICD prevent classification of perinatal deaths solely on existing ICD codes. We tested the Causes of Death and Associated Conditions (Codac) classification for perinatal deaths in seven populations, including two developing country settings. We identified areas of potential improvements in the ability to retain existing information, ease of use and inter-rater agreement. After revisions to address these issues we propose Version II of Codac with detailed coding instructions. The ten main categories of Codac consist of three key contributors to global perinatal mortality (intrapartum events, infections and congenital anomalies), two crucial aspects of perinatal mortality (unknown causes of death and termination of pregnancy), a clear distinction of conditions relevant only to the neonatal period and the remaining conditions are arranged in the four anatomical compartments (fetal, cord, placental and maternal). For more detail there are 94 subcategories, further specified in 577 categories in the full version. Codac is designed to accommodate both the main cause of death as well as two associated conditions. We suggest reporting not only the main cause of death, but also the associated relevant conditions so that scenarios of combined conditions and events are captured. The appropriately applied Codac system promises to better manage information on causes of perinatal deaths, the conditions associated with them, and the

  6. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  7. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  8. Statistical analysis of coding for molecular properties in the olfactory bulb

    Directory of Open Access Journals (Sweden)

    Benjamin Auffarth

    2011-07-01

    Full Text Available The relationship between molecular properties of odorants and neural activities is arguably one of the most important issues in olfaction, and the rules governing this relationship are still not clear. In the olfactory bulb (OB), glomeruli relay olfactory information to second-order neurons which in turn project to cortical areas. We investigate the relevance of odorant properties, the spatial localization of glomerular coding sites, and the size of coding zones in a dataset of 2-deoxyglucose images of glomeruli over the entire OB of the rat. We relate molecular properties to the activation of glomeruli in the OB using a nonparametric statistical test and a support-vector machine classification study. Our method makes it possible to systematically map the topographic representation of various classes of odorants in the OB. Our results suggest many localized coding sites for particular molecular properties, and some molecular properties that could form the basis for a spatial map of olfactory information. We found that alkynes, alkanes, alkenes, and amines affect activation maps very strongly as compared to other properties, and that amines, sulfur-containing compounds, and alkynes have small zones and high relevance to activation changes, while aromatics, alkanes, and carboxylic acids recruit very large zones in the dataset. The results suggest a local spatial encoding for molecular properties.
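
    For the classification step, the kind of support-vector analysis described can be sketched in a few lines of scikit-learn; the arrays below are randomly generated stand-ins, not the authors' 2-deoxyglucose measurements.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in data: one row per odorant, one column per glomerular coding
# site, and a binary label for the presence of one molecular property.
rng = np.random.default_rng(0)
activation = rng.normal(size=(120, 500))     # 120 odorants x 500 sites
has_property = rng.integers(0, 2, size=120)  # e.g., amine present or not

# Linear SVM scored by cross-validation, as in a property-vs-activation study.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, activation, has_property, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```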

  9. Benchmark problems for radiological assessment codes. Final report

    International Nuclear Information System (INIS)

    Mills, M.; Vogt, D.; Mann, B.

    1983-09-01

    This report describes benchmark problems to test computer codes used in the radiological assessment of high-level waste repositories. The problems presented in this report will test two types of codes. The first type of code calculates the time-dependent heat generation and radionuclide inventory associated with a high-level waste package. Five problems have been specified for this code type. The second code type addressed in this report involves the calculation of radionuclide transport and dose-to-man. For these codes, a comprehensive problem and two subproblems have been designed to test the relevant capabilities of these codes for assessing a high-level waste repository setting

  10. Culturally Relevant Cyberbullying Prevention

    OpenAIRE

    Phillips, Gregory John

    2017-01-01

    In this action research study, I, along with a student intervention committee of 14 members, developed a cyberbullying intervention for a large urban high school on the west coast. This high school contained a predominantly African American student population. I aimed to discover culturally relevant cyberbullying prevention strategies for African American students. The intervention committee selected video safety messages featuring African American actors as the most culturally relevant cyber...

  11. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role to coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...

  12. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  13. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematic way of constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.
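
    For reference, the quantum Hamming bound invoked above can be written out; a sketch in standard notation, not taken verbatim from the paper.

```latex
% Quantum Hamming bound: for a binary quantum code of length n encoding
% K states and correcting up to t errors, each correctable error pattern
% (any of the 3 Pauli errors on up to t of the n qubits) must map the
% code space onto mutually orthogonal subspaces of the 2^n-dimensional
% Hilbert space:
\[
  K \sum_{j=0}^{t} \binom{n}{j}\, 3^{j} \;\le\; 2^{n},
\]
% which for an [[n,k,2t+1]] code (K = 2^k) becomes
\[
  \sum_{j=0}^{t} \binom{n}{j}\, 3^{j} \;\le\; 2^{\,n-k}.
\]
```

    For the single-error-correcting codes constructed above (t = 1), the stabilizer form reduces to 1 + 3n <= 2^(n-k).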

  14. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback into LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.
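
    The feedback-adapted degree distributions themselves are the paper's contribution and are not reproduced here; as a baseline, the robust soliton distribution that standard LT codes draw their degrees from can be sketched as follows (the parameter names c and delta are conventional, not the paper's).

```python
import math

def robust_soliton(k, c=0.1, delta=0.5):
    """Robust soliton degree distribution over k input symbols,
    returned as a list p where p[d] is the probability of degree d."""
    # Ideal soliton part: rho(1) = 1/k, rho(d) = 1/(d(d-1)) for d >= 2.
    rho = [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    # Spike part tau, concentrated near k/R with R = c * ln(k/delta) * sqrt(k).
    R = c * math.log(k / delta) * math.sqrt(k)
    spike = max(1, min(k, int(round(k / R))))
    tau = [0.0] * (k + 1)
    for d in range(1, spike):
        tau[d] = R / (d * k)
    tau[spike] = R * math.log(R / delta) / k
    # Normalize the sum rho + tau into a probability distribution.
    beta = sum(rho[d] + tau[d] for d in range(1, k + 1))
    return [0.0] + [(rho[d] + tau[d]) / beta for d in range(1, k + 1)]

mu = robust_soliton(1000)
print(f"sums to {sum(mu):.6f}; most likely degree: {max(range(1, 1001), key=mu.__getitem__)}")
```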

  15. Pareto-optimal multi-objective dimensionality reduction deep auto-encoder for mammography classification.

    Science.gov (United States)

    Taghanaki, Saeid Asgari; Kawahara, Jeremy; Miles, Brandon; Hamarneh, Ghassan

    2017-07-01

    Feature reduction is an essential stage in computer aided breast cancer diagnosis systems. Multilayer neural networks can be trained to extract relevant features by encoding high-dimensional data into low-dimensional codes. Optimizing traditional auto-encoders works well only if the initial weights are close to a proper solution. They are also trained to only reduce the mean squared reconstruction error (MRE) between the encoder inputs and the decoder outputs, but do not address the classification error. The goal of the current work is to test the hypothesis that extending traditional auto-encoders (which only minimize reconstruction error) to multi-objective optimization for finding Pareto-optimal solutions provides more discriminative features that will improve classification performance when compared to single-objective and other multi-objective approaches (i.e. scalarized and sequential). In this paper, we introduce a novel multi-objective optimization of deep auto-encoder networks, in which the auto-encoder optimizes two objectives: MRE and mean classification error (MCE) for Pareto-optimal solutions, rather than just MRE. These two objectives are optimized simultaneously by a non-dominated sorting genetic algorithm. We tested our method on 949 X-ray mammograms categorized into 12 classes. The results show that the features identified by the proposed algorithm allow a classification accuracy of up to 98.45%, demonstrating favourable accuracy over the results of state-of-the-art methods reported in the literature. We conclude that adding the classification objective to the traditional auto-encoder objective and optimizing for finding Pareto-optimal solutions, using evolutionary multi-objective optimization, results in producing more discriminative features. Copyright © 2017 Elsevier B.V. All rights reserved.
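
    The multi-objective step ultimately keeps only non-dominated (reconstruction error, classification error) pairs; a minimal Pareto-filter sketch follows (the paper's non-dominated sorting genetic algorithm is not reproduced here).

```python
def pareto_front(points):
    """Return the non-dominated subset of (mre, mce) objective pairs,
    where both objectives are minimized. A point p is dominated if some
    other point is at least as good in both objectives."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

# Candidate auto-encoders scored on (reconstruction error, classification error):
candidates = [(0.10, 0.20), (0.08, 0.25), (0.12, 0.18), (0.11, 0.22)]
print(pareto_front(candidates))  # (0.11, 0.22) is dominated by (0.10, 0.20)
```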

  16. Security classification of information

    Energy Technology Data Exchange (ETDEWEB)

    Quist, A.S.

    1993-04-01

    This document is the second of a planned four-volume work that comprehensively discusses the security classification of information. The main focus of Volume 2 is on the principles for classification of information. Included herein are descriptions of the two major types of information that governments classify for national security reasons (subjective and objective information), guidance to use when determining whether information under consideration for classification is controlled by the government (a necessary requirement for classification to be effective), information disclosure risks and benefits (the benefits and costs of classification), standards to use when balancing information disclosure risks and benefits, guidance for assigning classification levels (Top Secret, Secret, or Confidential) to classified information, guidance for determining how long information should be classified (classification duration), classification of associations of information, classification of compilations of information, and principles for declassifying and downgrading information. Rules or principles of certain areas of our legal system (e.g., trade secret law) are sometimes mentioned to provide added support to some of those classification principles.

  17. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AVS2…

  18. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skills…

  19. Migration to the ICD-10 coding system: A primer for spine surgeons (Part 1).

    Science.gov (United States)

    Rahmathulla, Gazanfar; Deen, H Gordon; Dokken, Judith A; Pirris, Stephen M; Pichelmann, Mark A; Nottmeier, Eric W; Reimer, Ronald; Wharen, Robert E

    2014-01-01

    On 1 October 2015, a new federally mandated system goes into effect requiring the replacement of the International Classification of Disease-version 9-Clinical Modification (ICD-9-CM) with ICD-10-CM. These codes are required to be used for reimbursement and to substantiate medical necessity. ICD-10 comprises as many as 141,000 codes, an increase of 712% when compared to ICD-9. Execution of the ICD-10 system will require significant changes in clinical administrative and hospital-based practices. Through the transition, diminished productivity and practice revenue can be anticipated, the impacts of which the spine surgeon can minimize by appropriate education and planning. The advantages of the new system include increased clarity and more accurate definitions reflecting patient condition, information relevant to ambulatory and managed care encounters, expanded injury codes, laterality, specificity, precise data for safety and compliance reporting, data mining for research, and finally, enabling pay-for-performance programs. The disadvantages include the cost per physician, training administrative staff, revenue loss during the learning curve, confusion, the need to upgrade hardware along with software, and overall expense to the healthcare system. With the deadline rapidly approaching, gaps in implementation result in delayed billing, delayed or diminished reimbursements, and absence of quality and outcomes data. It is thereby essential for spine surgeons to understand their role in transitioning to this new environment. Part I of this article discusses the background, coding changes, and costs, as well as reviews the salient features of ICD-10 in spine surgery.

  20. On the classification of structures, systems and components of nuclear research and test reactors

    International Nuclear Information System (INIS)

    Mattar Neto, Miguel

    2009-01-01

    The classification of structures, systems and components of nuclear reactors is a relevant design issue because it is directly associated with their safety functions. There is an important statement regarding quality standards and records that says: 'Structures, systems, and components important to safety shall be designed, fabricated, erected, and tested to quality standards commensurate with the importance of the safety functions to be performed.' The definition of the codes, standards and technical requirements applied to nuclear reactor design, fabrication, inspection and tests may be seen as the main result of this statement. There are well established guides to classify structures, systems and components for nuclear power reactors, such as Pressurized Water Reactors, but one cannot say the same for nuclear research and test reactors. The nuclear reactor safety functions are those required for safe reactor operation, safe reactor shutdown and continued safe conditions, the response to anticipated transients, the response to potential accidents, and the control of radioactive material. Thus, this paper proposes an approach to the classification of structures, systems and components of these reactors based on their intended safety functions, in order to define the applicable set of codes, standards and technical requirements. (author)

  1. The Limits to Relevance

    Science.gov (United States)

    Averill, M.; Briggle, A.

    2006-12-01

    Science policy and knowledge production lately have taken a pragmatic turn. Funding agencies increasingly are requiring scientists to explain the relevance of their work to society. This stems in part from mounting critiques of the "linear model" of knowledge production in which scientists operating according to their own interests or disciplinary standards are presumed to automatically produce knowledge that is of relevance outside of their narrow communities. Many contend that funded scientific research should be linked more directly to societal goals, which implies a shift in the kind of research that will be funded. While both authors support the concept of useful science, we question the exact meaning of "relevance" and the wisdom of allowing it to control research agendas. We hope to contribute to the conversation by thinking more critically about the meaning and limits of the term "relevance" and the trade-offs implicit in a narrow utilitarian approach. The paper will consider which interests tend to be privileged by an emphasis on relevance and address issues such as whose goals ought to be pursued and why, and who gets to decide. We will consider how relevance, narrowly construed, may actually limit the ultimate utility of scientific research. The paper also will reflect on the worthiness of research goals themselves and their relationship to a broader view of what it means to be human and to live in society. Just as there is more to being human than the pragmatic demands of daily life, there is more at issue with knowledge production than finding the most efficient ways to satisfy consumer preferences or fix near-term policy problems. We will conclude by calling for a balanced approach to funding research that addresses society's most pressing needs but also supports innovative research with less immediately apparent application.

  2. Injuries of the Medial Clavicle: A Cohort Analysis in a Level-I-Trauma-Center. Concomitant Injuries. Management. Classification.

    Science.gov (United States)

    Bakir, Mustafa Sinan; Merschin, David; Unterkofler, Jan; Guembel, Denis; Langenbach, Andreas; Ekkernkamp, Axel; Schulz-Drost, Stefan

    2017-01-01

    Introduction: Although shoulder girdle injuries are frequent, those of the medial clavicle are widely unexplored. Neither an established classification nor a standard management approach is in common use. Methods: A retrospective analysis of medial clavicle injuries (MCI) over a 5-year term in a Level-1 trauma center. We analyzed, among other things, concomitant injuries, therapy strategies, and the classification following the AO standards. Results: 19 (2.5%) out of 759 clavicle injuries were medial ones (11 A-, 6 B- and 2 C-type fractures); of these, 27.8% were displaced and thus operatively treated. Locked plate osteosynthesis was employed in unstable fractures, and a reconstruction of the ligaments at the sternoclavicular joint (SCJ) was performed in case of their disruption. 84.2% of the patients sustained relevant concomitant injuries. Numerous midshaft fractures were miscoded as medial fractures, which limited the study population. Conclusions: MCI resulted from high-impact mechanisms of injury, often with relevant dislocation and concomitant injuries. Given the complexity of medial injuries, treatment should occur in specialized hospitals. Unstable fractures and injuries of the SCJ ligaments should be considered for operative treatment. Midshaft fractures should be clearly distinguished from medial ones in ICD-10 coding. Further studies are required, also with regard to a subtyping of the AO classification for medial clavicle fractures including ligamental injuries.

  3. Relevant Subspace Clustering

    DEFF Research Database (Denmark)

    Müller, Emmanuel; Assent, Ira; Günnemann, Stephan

    2009-01-01

    Subspace clustering aims at detecting clusters in any subspace projection of a high dimensional space. As the number of possible subspace projections is exponential in the number of dimensions, the result is often tremendously large. Recent approaches fail to reduce results to relevant subspace clusters. Their results are typically highly redundant, i.e. many clusters are detected multiple times in several projections. In this work, we propose a novel model for relevant subspace clustering (RESCU). We present a global optimization which detects the most interesting non-redundant subspace clusters. In thorough experiments, RESCU achieves top clustering quality while competing approaches show greatly varying performance.

  4. Classification of Flotation Frothers

    Directory of Open Access Journals (Sweden)

    Jan Drzymala

    2018-02-01

    Full Text Available In this paper, a scheme for the classification of flotation frothers is presented. The scheme first indicates the physical system in which a frother is present; four such systems are distinguished: pure state, aqueous solution, aqueous solution/gas, and aqueous solution/gas/solid. As a result, there are numerous possible classifications of flotation frothers, which can be organized into the scheme described in detail in this paper. A meaningful classification of frothers relies on first choosing the physical system and then the feature, trend, parameter or parameters according to which the classification is performed. The proposed classification can play a useful role in the characterization and evaluation of flotation frothers.

  5. Hyperspectral Image Classification Using Discriminative Dictionary Learning

    International Nuclear Information System (INIS)

    Zongze, Y; Hao, S; Kefeng, J; Huanxin, Z

    2014-01-01

    The hyperspectral image (HSI) processing community has witnessed a surge of papers focusing on the utilization of sparse priors for effective HSI classification. In sparse representation based HSI classification, there are two phases: sparse coding with an over-complete dictionary, and classification. In this paper, we first apply a novel Fisher discriminative dictionary learning method, which captures the relative differences between classes. The competitive selection strategy ensures that atoms in the resulting over-complete dictionary are the most discriminative. Secondly, motivated by the assumption that spatially adjacent samples are statistically related and may even belong to the same material (same class), we propose a majority voting scheme incorporating contextual information to predict the category label. Experimental results show that the proposed method effectively strengthens the relative discrimination of the constructed dictionary, and that incorporating the majority voting scheme generally achieves improved prediction performance.

  6. Ontologies vs. Classification Systems

    DEFF Research Database (Denmark)

    Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2009-01-01

    What is an ontology compared to a classification system? Is a taxonomy a kind of classification system or a kind of ontology? These are questions that we meet when working with people from industry and public authorities, who need methods and tools for concept clarification, for developing metadata sets or for obtaining advanced search facilities. In this paper we will present an attempt at answering these questions. We will give a presentation of various types of ontologies and briefly introduce terminological ontologies. Furthermore we will argue that classification systems, e.g. product classification systems and metadata taxonomies, should be based on ontologies.

  7. Iris Image Classification Based on Hierarchical Visual Codebook.

    Science.gov (United States)

    Zhenan Sun; Hui Zhang; Tieniu Tan; Jianyu Wang

    2014-06-01

    Iris recognition as a reliable method for personal identification has been well studied, with the objective of assigning the class label of each iris image to a unique subject. In contrast, iris image classification aims to classify an iris image into an application-specific category, e.g., iris liveness detection (classification of genuine and fake iris images), race classification (e.g., classification of iris images of Asian and non-Asian subjects), or coarse-to-fine iris identification (classification of all iris images in the central database into multiple categories). This paper proposes a general framework for iris image classification based on texture analysis. A novel texture pattern representation method called Hierarchical Visual Codebook (HVC) is proposed to encode the texture primitives of iris images. The proposed HVC method is an integration of two existing Bag-of-Words models, namely Vocabulary Tree (VT) and Locality-constrained Linear Coding (LLC). The HVC adopts a coarse-to-fine visual coding strategy and takes advantage of both VT and LLC for accurate and sparse representation of iris texture. Extensive experimental results demonstrate that the proposed iris image classification method achieves state-of-the-art performance for iris liveness detection, race classification, and coarse-to-fine iris identification. A comprehensive fake iris image database simulating four types of iris spoof attacks is developed as a benchmark for research on iris liveness detection.

  8. General regression and representation model for classification.

    Directory of Open Access Journals (Sweden)

    Jianjun Qian

    Full Text Available Recently, regularized coding-based classification methods (e.g. SRC and CRC) have shown great potential for pattern classification. However, most existing coding methods assume that the representation residuals are uncorrelated. In real-world applications, this assumption does not hold. In this paper, we take account of the correlations of the representation residuals and develop a general regression and representation model (GRR) for classification. GRR not only has the advantages of CRC, but also makes full use of prior information (e.g. the correlations between representation residuals and representation coefficients) and specific information (the weight matrix of image pixels) to enhance classification performance. GRR uses generalized Tikhonov regularization and K Nearest Neighbors to learn the prior information from the training data. Meanwhile, the specific information is obtained by using an iterative algorithm to update the feature (or image pixel) weights of the test sample. With the proposed model as a platform, we design two classifiers: the basic general regression and representation classifier (B-GRR) and the robust general regression and representation classifier (R-GRR). The experimental results demonstrate the performance advantages of the proposed methods over state-of-the-art algorithms.
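
    The generalized Tikhonov step mentioned above has a standard closed form; a sketch in generic notation, with dictionary D, test sample y, regularization matrix Γ and prior code α₀ standing in for the paper's learned prior.

```latex
% Generalized Tikhonov-regularized coding of a test sample y over a
% dictionary D: the regularizer \Gamma (e.g., built from K-nearest-
% neighbor prior information) generalizes the plain L2 penalty of CRC.
\[
  \hat{\alpha} \;=\; \arg\min_{\alpha}\; \|y - D\alpha\|_{2}^{2}
    \;+\; \lambda\,\|\Gamma(\alpha - \alpha_{0})\|_{2}^{2},
\]
% with the closed-form solution
\[
  \hat{\alpha} \;=\; \alpha_{0} +
    \bigl(D^{\top}D + \lambda\,\Gamma^{\top}\Gamma\bigr)^{-1}
    D^{\top}\bigl(y - D\alpha_{0}\bigr).
\]
```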

  9. Is Information Still Relevant?

    Science.gov (United States)

    Ma, Lia

    2013-01-01

    Introduction: The term "information" in information science does not share the characteristics of those of a nomenclature: it does not bear a generally accepted definition and it does not serve as the bases and assumptions for research studies. As the data deluge has arrived, is the concept of information still relevant for information…

  10. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  11. Acute pancreatitis: international classification and nomenclature

    International Nuclear Information System (INIS)

    Bollen, T.L.

    2016-01-01

    The incidence of acute pancreatitis (AP) is increasing and it represents a major healthcare concern. New insights into the pathophysiology, better imaging techniques, and novel treatment options for complicated AP prompted the update of the 1992 Atlanta Classification. Updated nomenclature for pancreatic collections based on imaging criteria is proposed. Adoption of the Revised Classification of Acute Pancreatitis 2012 by radiologists should help standardise reports and facilitate accurate conveyance of relevant findings to referring physicians involved in the care of patients with AP. This review will clarify the nomenclature of pancreatic collections in the setting of AP.

  12. On application of CFD codes to problems of nuclear reactor safety

    International Nuclear Information System (INIS)

    Muehlbauer, Petr

    2005-01-01

    The 'Exploratory Meeting of Experts to Define an Action Plan on the Application of Computational Fluid Dynamics (CFD) Codes to Nuclear Reactor Safety Problems', held in May 2002 at Aix-en-Provence, France, recommended the formation of writing groups to report on the need for guidelines for the use and assessment of CFD in single-phase nuclear reactor safety problems, and on recommended extensions to CFD codes to meet the needs of two-phase problems in nuclear reactor safety. This recommendation was also supported by the Working Group on the Analysis and Management of Accidents and led to the formation of three Writing Groups. The first Writing Group prepared a summary of existing best practice guidelines for single-phase CFD analysis and made a recommendation on the need for nuclear-reactor-safety-specific guidelines. The second Writing Group selected those nuclear reactor safety applications for which understanding requires, or is significantly enhanced by, single-phase CFD analysis, and proposed a methodology for establishing assessment matrices relevant to nuclear reactor safety applications. The third Writing Group performed a classification of nuclear reactor safety problems where extension of CFD to two-phase flow may bring real benefit, a classification of different modeling approaches, and a specification and analysis of needs in terms of physical and numerical assessments. This presentation provides a review of these activities with the most important conclusions and recommendations. (Authors)

  13. Coding Theory and Applications : 4th International Castle Meeting

    CERN Document Server

    Malonek, Paula; Vettori, Paolo

    2015-01-01

    The topics covered in this book, written by researchers at the forefront of their field, represent some of the most relevant research areas in modern coding theory: codes and combinatorial structures, algebraic geometric codes, group codes, quantum codes, convolutional codes, network coding and cryptography. The book includes a survey paper on the interconnections of coding theory with constrained systems, written by an invited speaker, as well as 37 cutting-edge research communications presented at the 4th International Castle Meeting on Coding Theory and Applications (4ICMCTA), held at the Castle of Palmela in September 2014. The event’s scientific program consisted of four invited talks and 39 regular talks by authors from 24 different countries. This conference provided an ideal opportunity for communicating new results, exchanging ideas, strengthening international cooperation, and introducing young researchers into the coding theory community.

  14. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks installed and is provided for 64-bit on Mac, Linux and Windows.

  15. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Among the earliest discovered codes that approach the Shannon limit of the channel were the low density parity check (LDPC) codes. The term low density arises from the property of the parity check matrix defining the code. We will now define this matrix and the role that it plays in decoding.
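
    As a concrete instance of such a matrix, here is the parity check matrix of the (7,4) Hamming code; it is dense, unlike a true LDPC matrix, but its role in decoding is identical.

```latex
% Parity check matrix H of the (7,4) Hamming code. A length-7 word c is
% a codeword iff H c^T = 0 (arithmetic mod 2); for a received word
% r = c + e, the syndrome s = H r^T = H e^T depends only on the error
% pattern e, which is what iterative LDPC decoders exploit.
\[
  H =
  \begin{pmatrix}
    1 & 0 & 1 & 0 & 1 & 0 & 1 \\
    0 & 1 & 1 & 0 & 0 & 1 & 1 \\
    0 & 0 & 0 & 1 & 1 & 1 & 1
  \end{pmatrix},
  \qquad
  H c^{\top} = 0 \pmod{2}.
\]
```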

  16. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  17. A neural network for noise correlation classification

    Science.gov (United States)

    Paitz, Patrick; Gokhberg, Alexey; Fichtner, Andreas

    2018-02-01

    We present an artificial neural network (ANN) for the classification of ambient seismic noise correlations into two categories, suitable and unsuitable for noise tomography. By using only a small manually classified data subset for network training, the ANN allows us to classify large data volumes with low human effort and to encode the valuable subjective experience of data analysts that cannot be captured by a deterministic algorithm. Based on a new feature extraction procedure that exploits the wavelet-like nature of seismic time-series, we efficiently reduce the dimensionality of noise correlation data, still keeping relevant features needed for automated classification. Using global- and regional-scale data sets, we show that classification errors of 20 per cent or less can be achieved when the network training is performed with as little as 3.5 per cent and 16 per cent of the data sets, respectively. Furthermore, the ANN trained on the regional data can be applied to the global data, and vice versa, without a significant increase of the classification error. An experiment in which four students manually classified the data revealed that the classification error they would assign to each other is substantially larger than the classification error of the ANN (>35 per cent). This indicates that reproducibility would be hampered more by human subjectivity than by imperfections of the ANN.

  18. Civil classification of the acquirer and operator of a photovoltaic power plant. Consumer or entrepreneur?; Zivilrechtliche Einordnung des Erwerbers und Betreibers einer Photovoltaikanlage. Verbraucher oder Unternehmer?

    Energy Technology Data Exchange (ETDEWEB)

    Schneidewindt, Holger [Verbraucherzentrale Nordrhein-Westfalen e.V., Duesseldorf (Germany)]

    2013-03-15

    With the prospect of revenue from feed-in remuneration and cost savings through self-consumption, private 'small investors' are being won over for the energy policy turnaround. Their protection under civil law when acquiring and operating a photovoltaic power plant largely depends on their classification under §§ 13, 14 of the German Civil Code (BGB): §§ 305 ff. BGB are fully applicable only to consumers, and consumer organizations can take action only where consumers are involved. Recent judgments show that establishing the relevant facts and assessing them properly in law is a major challenge. Against this background, choosing the feed-in tariff as the criterion of demarcation was a wrong decision.

  19. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  20. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…
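
    As a quick illustration, generating a QR code is a one-liner with the third-party Python qrcode package; the package and the URL below are assumptions of this sketch, not something discussed in the article.

```python
import qrcode  # third-party: pip install qrcode[pil]

# Encode a URL; the library picks the smallest QR version (symbol size)
# whose capacity fits the payload, up to the 7,089-character limit
# mentioned above (reached only for numeric-only payloads).
img = qrcode.make("https://example.org/article/qr-codes-101")
img.save("qr_example.png")
```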

  1. A Computer Oriented Scheme for Coding Chemicals in the Field of Biomedicine.

    Science.gov (United States)

    Bobka, Marilyn E.; Subramaniam, J.B.

    The chemical coding scheme of the Medical Coding Scheme (MCS), developed for use in the Comparative Systems Laboratory (CSL), is outlined and evaluated in this report. The chemical coding scheme provides a classification scheme and encoding method for drugs and chemical terms. Using the scheme, complicated chemical structures may be expressed…

  2. Using the International Classification of Functioning, Disability, and Health to identify outcome domains for a core outcome set for aphasia: a comparison of stakeholder perspectives.

    Science.gov (United States)

    Wallace, Sarah J; Worrall, Linda; Rose, Tanya; Le Dorze, Guylaine

    2017-11-12

    This study synthesised the findings of three separate consensus processes exploring the perspectives of key stakeholder groups about important aphasia treatment outcomes. This process was conducted to generate recommendations for outcome domains to be included in a core outcome set for aphasia treatment trials. International Classification of Functioning, Disability, and Health codes were examined to identify where the groups of: (1) people with aphasia, (2) family members, (3) aphasia researchers, and (4) aphasia clinicians/managers, demonstrated congruence in their perspectives regarding important treatment outcomes. Codes were contextualized using qualitative data. Congruence across three or more stakeholder groups was evident for ICF chapters: Mental functions; Communication; and Services, systems, and policies. Quality of life was explicitly identified by clinicians/managers and researchers, while people with aphasia and their families identified outcomes known to be determinants of quality of life. Core aphasia outcomes include: language, emotional wellbeing, communication, patient-reported satisfaction with treatment and impact of treatment, and quality of life. International Classification of Functioning, Disability, and Health coding can be used to compare stakeholder perspectives and identify domains for core outcome sets. Pairing coding with qualitative data may ensure important nuances of meaning are retained. Implications for rehabilitation: The outcomes measured in treatment research should be relevant to stakeholders and support health care decision making. Core outcome sets (agreed, minimum set of outcomes, and outcome measures) are increasingly being used to ensure the relevancy and consistency of the outcomes measured in treatment studies. Important aphasia treatment outcomes span all components of the International Classification of Functioning, Disability, and Health. Stakeholders demonstrated congruence in the identification of important…

  3. [Student nurses and the code of ethics].

    Science.gov (United States)

    Trolliet, Julie

    2017-09-01

    Student nurses, just like all practising professionals, are expected to be aware of and to respect the code of ethics governing their profession. Since the publication of this code, actions to raise awareness of it and explain it to all the relevant players have been put in place. The French National Federation of Student Nurses decided to survey future professionals regarding this new text. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  4. Monte Carlo codes use in neutron therapy

    International Nuclear Information System (INIS)

    Paquis, P.; Mokhtari, F.; Karamanoukian, D.; Pignol, J.P.; Cuendet, P.; Iborra, N.

    1998-01-01

    Monte Carlo calculation codes allow accurate study of all the parameters relevant to radiation effects, such as the dose deposition or the type of microscopic interactions, through particle-by-particle transport simulation. These features are very useful for neutron irradiations, from device development up to dosimetry. This paper illustrates some applications of these codes in Neutron Capture Therapy and in Neutron Capture Enhancement of fast neutron irradiations. (authors)

  5. An algorithm for the arithmetic classification of multilattices.

    Science.gov (United States)

    Indelicato, Giuliana

    2013-01-01

    A procedure for the construction and the classification of monoatomic multilattices in arbitrary dimension is developed. The algorithm allows one to determine the location of the points of all monoatomic multilattices with a given symmetry, or to determine whether two assigned multilattices are arithmetically equivalent. This approach is based on ideas from integral matrix theory, in particular the reduction to the Smith normal form, and can be coded to provide a classification software package.
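
    The Smith normal form reduction at the core of the algorithm is available off the shelf; a sketch with SymPy, assuming a version that exposes smith_normal_form (the matrix is an arbitrary integer example, not one from the paper).

```python
from sympy import Matrix, ZZ
from sympy.matrices.normalforms import smith_normal_form

# Integer matrix relating two candidate multilattice descriptions;
# arithmetic equivalence can be tested by comparing invariant factors.
M = Matrix([[2, 4, 4],
            [-6, 6, 12],
            [10, 4, 16]])

# The diagonal invariant factors d1 | d2 | d3 characterize M up to
# unimodular row/column operations.
print(smith_normal_form(M, domain=ZZ))
```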

  6. Classification Accuracy Increase Using Multisensor Data Fusion

    Science.gov (United States)

    Makarau, A.; Palubinskas, G.; Reinartz, P.

    2011-09-01

    The practical use of very high resolution visible and near-infrared (VNIR) data is still growing (IKONOS, Quickbird, GeoEye-1, etc.), but for classification purposes the number of bands is limited in comparison to full spectral imaging. These limitations may lead to the confusion of materials such as different roofs, pavements, roads, etc., and therefore may result in wrong interpretation and use of classification products. Employment of hyperspectral data is another solution, but their low spatial resolution (compared to multispectral data) restricts their usage for many applications. Another improvement can be achieved by fusion approaches of multisensory data, since this may increase the quality of scene classification. Integration of Synthetic Aperture Radar (SAR) and optical data is widely performed for automatic classification, interpretation, and change detection. In this paper we present an approach for very high resolution SAR and multispectral data fusion for automatic classification in urban areas. Single polarization TerraSAR-X (SpotLight mode) and multispectral data are integrated using the INFOFUSE framework, consisting of feature extraction (information fission), unsupervised clustering (data representation on a finite domain and dimensionality reduction), and data aggregation (Bayesian or neural network). This framework allows a relevant way of multisource data combination following consensus theory. The classification is not influenced by the limitations of dimensionality, and the calculation complexity primarily depends on the step of dimensionality reduction. Fusion of single polarization TerraSAR-X, WorldView-2 (VNIR or full set), and Digital Surface Model (DSM) data allows different types of urban objects to be classified into predefined classes of interest with increased accuracy. The comparison to classification results of WorldView-2 multispectral data (8 spectral bands) is provided, along with a numerical evaluation of the method in comparison to…

  7. Image Classification Workflow Using Machine Learning Methods

    Science.gov (United States)

    Christoffersen, M. S.; Roser, M.; Valadez-Vergara, R.; Fernández-Vega, J. A.; Pierce, S. A.; Arora, R.

    2016-12-01

    Recent increases in the availability and quality of remote sensing datasets have fueled an increasing number of scientifically significant discoveries based on land use classification and land use change analysis. However, much of the software made to work with remote sensing data products, specifically multispectral images, is commercial and often prohibitively expensive. The free-to-use solutions that are currently available come bundled as small parts of much larger programs that are very susceptible to bugs and difficult to install and configure. What is needed is a compact, easy-to-use set of tools to perform land use analysis on multispectral images. To address this need, we have developed software using the Python programming language with the sole function of land use classification and land use change analysis. We chose Python to develop our software because it is relatively readable, has a large body of relevant third-party libraries such as GDAL and Spectral Python, and is free to install and use on Windows, Linux, and Macintosh operating systems. In order to test our classification software, we performed a K-means unsupervised classification, a Gaussian Maximum Likelihood supervised classification, and a Mahalanobis Distance based supervised classification. The images used for testing were three Landsat rasters of Austin, Texas with a spatial resolution of 60 meters for the years 1984 and 1999, and 30 meters for the year 2015. The testing dataset was easily downloaded using the Earth Explorer application produced by the USGS. The software should be able to perform classification based on any set of multispectral rasters with little to no modification. Our software makes the ease of land use classification offered by commercial software available without an expensive license.
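
    A minimal sketch of the K-means step in such a workflow, combining GDAL for raster input with scikit-learn for clustering; the file name and cluster count are placeholders, not the study's settings.

```python
import numpy as np
from osgeo import gdal
from sklearn.cluster import KMeans

# Read all bands of a multispectral raster into (bands, rows, cols).
ds = gdal.Open("landsat_austin_2015.tif")  # placeholder path
stack = ds.ReadAsArray().astype(np.float32)
bands, rows, cols = stack.shape

# Reshape to one spectral vector per pixel and run unsupervised K-means.
pixels = stack.reshape(bands, rows * cols).T
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(pixels)

# Back to image shape: one land-use cluster id per pixel.
classified = labels.reshape(rows, cols)
print(np.bincount(labels))  # pixel count per cluster
```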

  8. Clinical Relevance of Adipokines

    Directory of Open Access Journals (Sweden)

    Matthias Blüher

    2012-10-01

    Full Text Available The incidence of obesity has increased dramatically during recent decades. Obesity increases the risk for metabolic and cardiovascular diseases and may therefore contribute to premature death. With increasing fat mass, secretion of adipose tissue derived bioactive molecules (adipokines) changes towards a pro-inflammatory, diabetogenic and atherogenic pattern. Adipokines are involved in the regulation of appetite and satiety, energy expenditure, activity, endothelial function, hemostasis, blood pressure, insulin sensitivity, energy metabolism in insulin sensitive tissues, adipogenesis, fat distribution and insulin secretion in pancreatic β-cells. Therefore, adipokines are clinically relevant as biomarkers for fat distribution, adipose tissue function, liver fat content, insulin sensitivity, chronic inflammation and have the potential for future pharmacological treatment strategies for obesity and its related diseases. This review focuses on the clinical relevance of selected adipokines as markers or predictors of obesity related diseases and as potential therapeutic tools or targets in metabolic and cardiovascular diseases.

  9. Colombia: Territorial classification

    International Nuclear Information System (INIS)

    Mendoza Morales, Alberto

    1998-01-01

    The article deals with approaches to territorial classification, thematic axes, principles of management and territorial occupation, political and administrative units, and administrative regions, among other topics. Territorial classification is understood as the spatial distribution, over the territory of the country, of geographical configurations, human communities, political-administrative units, and existing and proposed urban and rural land uses.

  10. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.

  11. Library Classification 2020

    Science.gov (United States)

    Harris, Christopher

    2013-01-01

    In this article the author explores how a new library classification system might be designed using some aspects of the Dewey Decimal Classification (DDC) and ideas from other systems to create something that works for school libraries in the year 2020. By examining what works well with the Dewey Decimal System, what features should be carried…

  12. Spectroscopic classification of transients

    DEFF Research Database (Denmark)

    Stritzinger, M. D.; Fraser, M.; Hummelmose, N. N.

    2017-01-01

    We report the spectroscopic classification of several transients based on observations taken with the Nordic Optical Telescope (NOT) equipped with ALFOSC, over the nights 23-25 August 2017.

  13. Information Needs/Relevance

    OpenAIRE

    Wildemuth, Barbara M.

    2009-01-01

    A user's interaction with a DL is often initiated as the result of the user experiencing an information need of some kind. Aspects of that experience and how it might affect the user's interactions with the DL are discussed in this module. In addition, users continuously make decisions about and evaluations of the materials retrieved from a DL, relative to their information needs. Relevance judgments, and their relationship to the user's information needs, are discussed in this module.

  14. DOE LLW classification rationale

    International Nuclear Information System (INIS)

    Flores, A.Y.

    1991-01-01

    This report presents the US Department of Energy's rationale for its low-level radioactive waste (LLW) classification, which is based on the Nuclear Regulatory Commission's classification system. DOE site operators met to review the qualifications and characteristics of the classification systems. They evaluated performance objectives, developed waste classification tables, and compiled dose limits on the waste. A goal of the LLW classification system was to allow each disposal site the freedom to develop limits on radionuclide inventories and concentrations according to its own site-specific characteristics. This goal was achieved with the adoption of a performance-objectives system based on a performance assessment, with site-specific environmental conditions and engineered disposal systems.

  15. Constructing criticality by classification

    DEFF Research Database (Denmark)

    Machacek, Erika

    2017-01-01

    " in the bureaucratic practice of classification: Experts construct material criticality in assessments as they allot information on the materials to the parameters of the assessment framework. In so doing, they ascribe a new set of connotations to the materials, namely supply risk, and their importance to clean energy......, legitimizing a criticality discourse.Specifically, the paper introduces a typology delineating the inferences made by the experts from their produced recommendations in the classification of rare earth element criticality. The paper argues that the classification is a specific process of constructing risk....... It proposes that the expert bureaucratic practice of classification legitimizes (i) the valorisation that was made in the drafting of the assessment framework for the classification, and (ii) political operationalization when enacted that might have (non-)distributive implications for the allocation of public...

  16. System for selecting relevant information for decision support.

    Science.gov (United States)

    Kalina, Jan; Seidl, Libor; Zvára, Karel; Grünfeldová, Hana; Slovák, Dalibor; Zvárová, Jana

    2013-01-01

    We implemented a prototype of a decision support system called SIR, which takes the form of a web-based classification service for diagnostic decision support. The system has the ability to select the most relevant variables and to learn a classification rule that is guaranteed to be suitable also for high-dimensional measurements. The classification system can be useful for clinicians in primary care to support their decision-making tasks with relevant information extracted from any available clinical study. The implemented prototype was tested on a sample of patients in a cardiological study and performs information extraction from a high-dimensional set containing both clinical and gene expression data.
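
    The abstract does not specify SIR's selection algorithm; one common way to pair variable selection with rule learning in high-dimensional clinical data is L1-regularized logistic regression, sketched here on synthetic stand-in data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical high-dimensional study data: 100 patients, 2000
# clinical + gene-expression variables, and a binary diagnosis.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2000))
y = rng.integers(0, 2, size=100)

# The L1 penalty drives most coefficients to exactly zero, so the
# surviving variables double as the "most relevant" selection.
model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
)
model.fit(X, y)

coef = model.named_steps["logisticregression"].coef_.ravel()
selected = np.flatnonzero(coef)
print(len(selected), "relevant variables selected")
```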

  17. A comparative evaluation of sequence classification programs

    Directory of Open Access Journals (Sweden)

    Bazinet Adam L

    2012-05-01

    Full Text Available Abstract Background A fundamental problem in modern genomics is to taxonomically or functionally classify DNA sequence fragments derived from environmental sampling (i.e., metagenomics. Several different methods have been proposed for doing this effectively and efficiently, and many have been implemented in software. In addition to varying their basic algorithmic approach to classification, some methods screen sequence reads for 'barcoding genes' like 16S rRNA, or various types of protein-coding genes. Due to the sheer number and complexity of methods, it can be difficult for a researcher to choose one that is well-suited for a particular analysis. Results We divided the very large number of programs that have been released in recent years for solving the sequence classification problem into three main categories based on the general algorithm they use to compare a query sequence against a database of sequences. We also evaluated the performance of the leading programs in each category on data sets whose taxonomic and functional composition is known. Conclusions We found significant variability in classification accuracy, precision, and resource consumption of sequence classification programs when used to analyze various metagenomics data sets. However, we observe some general trends and patterns that will be useful to researchers who use sequence classification programs.

  18. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...

  19. Greedy vs. L1 convex optimization in sparse coding

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    2015-01-01

    Sparse representation has been applied successfully in many image analysis applications, including abnormal event detection, in which a baseline is to learn a dictionary from the training data and detect anomalies from its sparse codes. During this procedure, sparse codes can be achieved by greedy algorithms or by L1 convex optimization, which yield different solutions. Considering the property of abnormal event detection, i.e., that only normal videos are used as training data for practical reasons, codes that are effective in classification applications may not perform well in abnormality detection. Therefore, we compare the sparse codes and comprehensively evaluate their performance from various aspects to better understand their applicability, including computation time, reconstruction error, sparsity, and detection…
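
    The two families under comparison can be tried side by side; a small scikit-learn sketch, with a random dictionary standing in for one learned from normal training videos.

```python
import numpy as np
from sklearn.linear_model import Lasso, OrthogonalMatchingPursuit

# Random overcomplete dictionary (64-dim signals, 256 atoms) with
# unit-norm columns, and one test signal to encode.
rng = np.random.default_rng(0)
D = rng.normal(size=(64, 256))
D /= np.linalg.norm(D, axis=0)
y = rng.normal(size=64)

# Greedy: pick at most 10 atoms.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=10).fit(D, y)
# L1 convex relaxation: sparsity controlled by alpha instead.
lasso = Lasso(alpha=0.05, max_iter=10000).fit(D, y)

for name, code in [("OMP", omp.coef_), ("Lasso", lasso.coef_)]:
    err = np.linalg.norm(y - D @ code)
    print(f"{name}: {np.count_nonzero(code)} nonzeros, residual {err:.3f}")
```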

  20. Mapping a classification system to architectural education

    DEFF Research Database (Denmark)

    Hermund, Anders; Klint, Lars; Rostrup, Nicolai

    2015-01-01

    This paper examines to what extent a new classification system, Cuneco Classification System, CCS, proves useful in the education of architects, and to what degree the aim of an architectural education, based rather on an arts and crafts approach than a polytechnic approach, benefits from the distinct terminology of the classification system. The method used to examine the relationship between education, practice and the CCS bifurcates into a quantitative and a qualitative exploration: a quantitative comparison of the curriculum with the students' own descriptions of their studies through a questionnaire survey among 88 students in graduate school, and qualitative interviews with a handful of practicing architects, to cross-check the relevance of the education with the profession. The examination indicates the need for a new definition, in addition to the CCS's scale, covering the earliest…

  1. Exploring different approaches for music genre classification

    Directory of Open Access Journals (Sweden)

    Antonio Jose Homsi Goulart

    2012-07-01

    Full Text Available In this letter, we present different approaches for music genre classification. The proposed techniques, which are composed of a feature extraction stage followed by a classification procedure, explore both the variations of parameters used as input and the classifier architecture. Tests were carried out with three styles of music, namely blues, classical, and lounge, which are considered informally by some musicians as being “big dividers” among music genres, showing the efficacy of the proposed algorithms and establishing a relationship between the relevance of each set of parameters for each music style and each classifier. In contrast to other works, entropies and fractal dimensions are the features adopted for the classifications.
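
    As an illustration of the entropy features mentioned, a sketch of the spectral entropy of a single audio frame; framing and the classifier stage are omitted, and the parameter values are arbitrary.

```python
import numpy as np

def spectral_entropy(frame):
    """Shannon entropy of the normalized power spectrum of one frame;
    broadband, noise-like frames score high, tonal frames score low."""
    power = np.abs(np.fft.rfft(frame)) ** 2
    p = power / power.sum()
    p = p[p > 0]  # drop zero bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

# Example: a pure tone vs. white noise, 1024-sample frames at 44.1 kHz.
t = np.arange(1024) / 44100.0
tone = np.sin(2 * np.pi * 440.0 * t)
noise = np.random.default_rng(0).normal(size=1024)
print(spectral_entropy(tone), "<", spectral_entropy(noise))
```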

  2. Biogeographic classification of the Caspian Sea

    DEFF Research Database (Denmark)

    Fendereski, F.; Vogt, M.; Payne, Mark

    2014-01-01

    Like other inland seas, the Caspian Sea (CS) has been influenced by climate change and anthropogenic disturbance during recent decades, yet the scientific understanding of this water body remains poor. In this study, an eco-geographical classification of the CS based on physical information deriv...... confirms the relevance of the ecoregions as proxies for habitats with common biological characteristics....... from space and in-situ data is developed and tested against a set of biological observations. We used a two-step classification procedure, consisting of (i) a data reduction with self-organizing maps (SOMs) and (ii) a synthesis of the most relevant features into a reduced number of marine ecoregions...... in phytoplankton phenology, with differences in the date of bloom initiation, its duration and amplitude between ecoregions. A first qualitative evaluation of differences in community composition based on recorded presence-absence patterns of 27 different species of plankton, fish and benthic invertebrate also...

  3. Classification of movement disorders.

    Science.gov (United States)

    Fahn, Stanley

    2011-05-01

    The classification of movement disorders has evolved. Even the terminology has shifted, from an anatomical one of extrapyramidal disorders to a phenomenological one of movement disorders. The history of how this shift came about is described. The history of both the definitions and the classifications of the various neurologic conditions is then reviewed. First is a review of movement disorders as a group; then, the evolving classifications for 3 of them--parkinsonism, dystonia, and tremor--are covered in detail. Copyright © 2011 Movement Disorder Society.

  4. [Relevant public health enteropathogens].

    Science.gov (United States)

    Riveros, Maribel; Ochoa, Theresa J

    2015-01-01

    Diarrhea remains the third leading cause of death in children under five years, despite recent advances in the management and prevention of this disease. It is caused by multiple pathogens; however, the prevalence of each varies by age group, geographical area and the scenario where cases (community vs hospital) are recorded. The most relevant pathogens in public health are those associated with the highest burden of disease, severity, complications and mortality. In our country, norovirus, Campylobacter and diarrheagenic E. coli are the most prevalent pathogens at the community level in children. In this paper we review the local epidemiology and potential areas of development in five selected pathogens: rotavirus, norovirus, Shiga toxin-producing E. coli (STEC), Shigella and Salmonella. Of these, rotavirus is the most important in the pediatric population and the main agent responsible for child mortality from diarrhea. The introduction of rotavirus vaccination in Peru will have a significant impact on disease burden and mortality from diarrhea. However, surveillance studies are needed to determine the impact of vaccination and changes in the epidemiology of diarrhea in Peru following the introduction of new vaccines, as well as antibiotic resistance surveillance of clinically relevant bacteria.

  5. Reply to the letters to the editor submitted by T. L. Ogden and K.T. Du Clos, and by R. Foster regarding the paper ‘SWeRF—a method for estimating the relevant fine particle fraction in bulk materials for classification and labelling purposes’.

    Science.gov (United States)

    Pensis, Ingeborg; Luetzenkirchen, Frank; Friede, Bernd

    2014-07-01

    The authors respond to the points raised in the Letters to the Editor by Ogden and Du Clos and by Foster. Ad 1: The debate over the classification of respirable crystalline silica is outside the scope of the technical paper. Ad 2: A standard for the determination of SWeRF is under development, in which indeed the provision is made that for a correct determination all quartz within the fine fraction needs to be liberated. Ad 3: Dustiness tests provide useful information for occupational hygienists, but are not suitable for fulfilling classification and labelling requirements. Ad 4: Pipette effects are not discussed in the paper because the difference between calculating the SWeRF from the particle size distribution and the SWeRF from sedimentation is very small.

  6. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

    This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN......, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms’ interfaces. These are important...... to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  7. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  8. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

    This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code is related to the statistics of extremes: extreme winds, extreme precipitation and flooding hazard risk analysis. (A.C.A.S.)
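
    The manual's algorithms are not reproduced in this abstract; as a generic statistics-of-extremes sketch (synthetic data and scipy's Gumbel fit, not SEVERO itself), an extreme-value distribution is fitted to annual maxima and a return level is read off:

      import numpy as np
      from scipy.stats import gumbel_r

      # Synthetic annual maximum wind speeds (m/s); real analyses would use observed maxima.
      annual_max_wind = gumbel_r.rvs(loc=25.0, scale=4.0, size=50, random_state=0)

      loc, scale = gumbel_r.fit(annual_max_wind)             # fit the Gumbel (Type I extreme value) law
      wind_100yr = gumbel_r.ppf(1 - 1.0 / 100, loc, scale)   # 100-year return level
      print(round(wind_100yr, 1), "m/s")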

  9. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  10. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    Documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and on providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples.
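
    As a generic least-squares-with-correlations sketch of the kind of evaluation described (synthetic numbers, not FERRET's implementation), measurements with a full covariance matrix are combined and the uncertainty is propagated to the parameter estimates:

      import numpy as np

      A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])   # design matrix: 3 measurements, 2 parameters
      y = np.array([1.1, 2.0, 2.9])                        # measured values
      C = np.array([[0.04, 0.01, 0.00],                    # measurement covariance (correlated errors)
                    [0.01, 0.04, 0.01],
                    [0.00, 0.01, 0.04]])

      Cinv = np.linalg.inv(C)
      cov_post = np.linalg.inv(A.T @ Cinv @ A)             # posterior parameter covariance
      beta = cov_post @ A.T @ Cinv @ y                     # generalized least-squares estimate
      print("estimate:", beta, "1-sigma:", np.sqrt(np.diag(cov_post)))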

  11. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visual-pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...
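
    Aesthetic QR techniques generally spend the code's built-in error-correction budget on artwork; a minimal sketch with the third-party qrcode package (not the method proposed in the paper) generates a code at the highest error-correction level H, which tolerates roughly 30% module damage and thus leaves room for embedded decoration:

      import qrcode  # third-party package: pip install qrcode[pil]

      qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H)  # level H: ~30% recovery
      qr.add_data("https://example.org")
      qr.make(fit=True)                        # choose the smallest version that fits the data
      img = qr.make_image(fill_color="black", back_color="white")
      img.save("qr_high_ecc.png")              # stylize within the error-correction budget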

  12. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response code opens possibility to convey data in a unique way yet insufficient prevention and protection might lead into QR code being exploited on behalf of attackers. This thesis starts by presenting a general introduction of background and stating two problems regarding QR code security, which followed by a comprehensive research on both QR code itself and related issues. From the research a solution taking advantages of cloud and cryptography together with an implementation come af...

  13. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  14. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  15. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

    textabstractTwo key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from
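
    As a small illustration of a test-specific refactoring (an invented example, not the authors'), duplicated inline setup in two tests is extracted into a shared pytest fixture, a test-code analogue of the "extract method" refactoring:

      import pytest

      class ShoppingCart:
          def __init__(self):
              self.items = []
          def add(self, name, price):
              self.items.append((name, price))
          def total(self):
              return sum(price for _, price in self.items)

      @pytest.fixture
      def cart():
          # Shared setup that was previously duplicated in each test.
          cart = ShoppingCart()
          cart.add("apple", 2)
          cart.add("pear", 3)
          return cart

      def test_total(cart):
          assert cart.total() == 5

      def test_item_count(cart):
          assert len(cart.items) == 2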

  16. Update on diabetes classification.

    Science.gov (United States)

    Thomas, Celeste C; Philipson, Louis H

    2015-01-01

    This article highlights the difficulties in creating a definitive classification of diabetes mellitus in the absence of a complete understanding of the pathogenesis of the major forms. This brief review shows the evolving nature of the classification of diabetes mellitus. No classification scheme is ideal, and all have some overlap and inconsistencies. The only diabetes in which it is possible to accurately diagnose by DNA sequencing, monogenic diabetes, remains undiagnosed in more than 90% of the individuals who have diabetes caused by one of the known gene mutations. The point of classification, or taxonomy, of disease, should be to give insight into both pathogenesis and treatment. It remains a source of frustration that all schemes of diabetes mellitus continue to fall short of this goal. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Learning Apache Mahout classification

    CERN Document Server

    Gupta, Ashish

    2015-01-01

    If you are a data scientist who has some experience with the Hadoop ecosystem and machine learning methods and want to try out classification on large datasets using Mahout, this book is ideal for you. Knowledge of Java is essential.

  18. CLASSIFICATION OF VIRUSES

    Indian Academy of Sciences (India)

    CLASSIFICATION OF VIRUSES. Viruses are classified on the basis of morphology; on the basis of chemical composition; on the basis of the structure of the genome; and on the basis of the mode of replication.

  19. Pitch Based Sound Classification

    DEFF Research Database (Denmark)

    Nielsen, Andreas Brinch; Hansen, Lars Kai; Kjems, U

    2006-01-01

    A sound classification model is presented that can classify signals into music, noise and speech. The model extracts the pitch of the signal using the harmonic product spectrum. Based on the pitch estimate and a pitch error measure, features are created and used in a probabilistic model with soft......-max output function. Both linear and quadratic inputs are used. The model is trained on 2 hours of sound and tested on publicly available data. A test classification error below 0.05 with 1 s classification windows is achieved. Further more it is shown that linear input performs as well as a quadratic......, and that even though classification gets marginally better, not much is achieved by increasing the window size beyond 1 s....
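
    A minimal numpy sketch of harmonic-product-spectrum pitch estimation (illustrative; the paper's feature set and probabilistic model are richer) shows the core idea: downsampled copies of the spectrum reinforce one another at the fundamental:

      import numpy as np

      def hps_pitch(frame, sr, n_harmonics=4):
          """Estimate pitch via the harmonic product spectrum."""
          spectrum = np.abs(np.fft.rfft(frame))
          hps = spectrum.copy()
          for h in range(2, n_harmonics + 1):
              decimated = spectrum[::h]             # spectrum downsampled by the harmonic index
              hps[:len(decimated)] *= decimated     # harmonics of the pitch reinforce one another
          peak = np.argmax(hps[1:]) + 1             # skip the DC bin
          return peak * sr / len(frame)             # convert bin index to Hz

      sr = 16000
      t = np.arange(2048) / sr
      frame = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)  # 220 Hz tone + harmonic
      print(round(hps_pitch(frame, sr), 1), "Hz")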

  20. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  1. Other relevant biological papers

    International Nuclear Information System (INIS)

    Shimizu, M.

    1989-01-01

    A considerable number of CRESP-relevant papers concerning deep-sea biology and radioecology have been published. It is the purpose of this study to call attention to them. They fall into three general categories. The first is papers of general interest. They are mentioned only briefly, and include text references to the global bibliography at the end of the volume. The second are papers that are not only mentioned and referenced, but for various reasons are described in abstract form. The last is a list of papers compiled by H.S.J. Roe specifically for this volume. They are listed in bibliographic form, and are also included in the global bibliography at the end of the volume

  2. Improved accuracy of co-morbidity coding over time after the introduction of ICD-10 administrative data.

    Science.gov (United States)

    Januel, Jean-Marie; Luthi, Jean-Christophe; Quan, Hude; Borst, François; Taffé, Patrick; Ghali, William A; Burnand, Bernard

    2011-08-18

    Co-morbidity information derived from administrative data needs to be validated to allow its regular use. We assessed evolution in the accuracy of coding for Charlson and Elixhauser co-morbidities at three time points over a 5-year period, following the introduction of the International Classification of Diseases, 10th Revision (ICD-10), coding of hospital discharges. This was a cross-sectional time trend evaluation study of coding accuracy using hospital chart data of 3,499 randomly selected patients who were discharged in 1999, 2001 and 2003, from two teaching and one non-teaching hospital in Switzerland. We measured sensitivity, positive predictive values, and kappa values for agreement between administrative data coded with ICD-10 and chart data as the 'reference standard' for recording 36 co-morbidities. For the 17 Charlson co-morbidities, the sensitivity - median (min-max) - was 36.5% (17.4-64.1) in 1999, 42.5% (22.2-64.6) in 2001 and 42.8% (8.4-75.6) in 2003. For the 29 Elixhauser co-morbidities, the sensitivity was 34.2% (1.9-64.1) in 1999, 38.6% (10.5-66.5) in 2001 and 41.6% (5.1-76.5) in 2003. Between 1999 and 2003, sensitivity estimates increased for 30 co-morbidities and decreased for 6 co-morbidities. The increase in sensitivities was statistically significant for six conditions and the decrease significant for one. Kappa values increased for 29 co-morbidities and decreased for seven. Accuracy of administrative data in recording clinical conditions improved slightly between 1999 and 2003. These findings are of relevance to all jurisdictions introducing new coding systems, because they demonstrate a phenomenon of improved administrative data accuracy that may relate to a coding 'learning curve' with the new coding system.
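
    For readers unfamiliar with the metrics, a toy computation of sensitivity, positive predictive value and Cohen's kappa for one hypothetical co-morbidity (invented labels, not study data), comparing administrative codes against the chart-review reference standard:

      import numpy as np
      from sklearn.metrics import cohen_kappa_score, confusion_matrix

      chart = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])   # reference standard: co-morbidity present?
      admin = np.array([1, 1, 0, 0, 0, 0, 0, 0, 1, 0])   # ICD-coded administrative data

      tn, fp, fn, tp = confusion_matrix(chart, admin).ravel()
      print("sensitivity:", tp / (tp + fn))              # coded among truly present
      print("PPV:", tp / (tp + fp))                      # truly present among coded
      print("kappa:", cohen_kappa_score(chart, admin))   # chance-corrected agreement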

  3. Case file coding of child maltreatment: Methods, challenges, and innovations in a longitudinal project of youth in foster care☆

    Science.gov (United States)

    Huffhines, Lindsay; Tunno, Angela M.; Cho, Bridget; Hambrick, Erin P.; Campos, Ilse; Lichty, Brittany; Jackson, Yo

    2016-01-01

    State social service agency case files are a common mechanism for obtaining information about a child’s maltreatment history, yet these documents are often challenging for researchers to access, and then to process in a manner consistent with the requirements of social science research designs. Specifically, accessing and navigating case files is an extensive undertaking, and a task that many researchers have had to maneuver with little guidance. Even after the files are in hand and the research questions and relevant variables have been clarified, case file information about a child’s maltreatment exposure can be idiosyncratic, vague, inconsistent, and incomplete, making coding such information into useful variables for statistical analyses difficult. The Modified Maltreatment Classification System (MMCS) is a popular tool used to guide the process, and though comprehensive, this coding system cannot cover all idiosyncrasies found in case files. It is not clear from the literature how researchers implement this system while accounting for issues outside of the purview of the MMCS or that arise during MMCS use. Finally, a large yet reliable file coding team is essential to the process, however, the literature lacks training guidelines and methods for establishing reliability between coders. In an effort to move the field toward a common approach, the purpose of the present discussion is to detail the process used by one large-scale study of child maltreatment, the Studying Pathways to Adjustment and Resilience in Kids (SPARK) project, a longitudinal study of resilience in youth in foster care. The article addresses each phase of case file coding, from accessing case files, to identifying how to measure constructs of interest, to dealing with exceptions to the coding system, to coding variables reliably, to training large teams of coders and monitoring for fidelity. Implications for a comprehensive and efficient approach to case file coding are discussed.

  4. Towards secondary fingerprint classification

    CSIR Research Space (South Africa)

    Msiza, IS

    2011-07-01

    Full Text Available an accuracy figure of 76.8%. This small difference between the two figures is indicative of the validity of the proposed secondary classification module. Keywords: fingerprint core; fingerprint delta; primary classification; secondary classification. ..., namely, the fingerprint core and the fingerprint delta. Forensically, a fingerprint core is defined as the innermost turning point where the fingerprint ridges form a loop, while the fingerprint delta is defined as the point where these ridges form a...

  5. Expected Classification Accuracy

    Directory of Open Access Journals (Sweden)

    Lawrence M. Rudner

    2005-08-01

    Full Text Available Every time we make a classification based on a test score, we should expect some number of misclassifications. Some examinees whose true ability is within a score range will have observed scores outside of that range. A procedure for providing a classification table of true and expected scores is developed for polytomously scored items under item response theory and applied to state assessment data. A simplified procedure for estimating the table entries is also presented.
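
    The underlying idea can be illustrated with a normal measurement-error model (a simplification of the IRT-based procedure in the article): given a true score, a standard error of measurement, and a cut score, the chance of each classification is a normal tail probability:

      from scipy.stats import norm

      cut, sem = 60.0, 4.0                       # cut score and standard error of measurement
      for true_score in (55, 58, 62, 70):
          p_pass = 1 - norm.cdf(cut, loc=true_score, scale=sem)   # P(observed score >= cut)
          expected = "pass" if true_score >= cut else "fail"
          print(true_score, expected, "P(classified as pass) =", round(p_pass, 3))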

  6. Latent classification models

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre

    2005-01-01

    parametric family of distributions. In this paper we propose a new set of models for classification in continuous domains, termed latent classification models. The latent classification model can roughly be seen as combining the naive Bayes (NB) model with a mixture of factor analyzers, thereby relaxing the assumptions...... classification model, and we demonstrate empirically that the accuracy of the proposed model is significantly higher than the accuracy of other probabilistic classifiers....

  7. Audit of Clinical Coding of Major Head and Neck Operations

    Science.gov (United States)

    Mitra, Indu; Malik, Tass; Homer, Jarrod J; Loughran, Sean

    2009-01-01

    INTRODUCTION Within the NHS, operations are coded using the Office of Population Censuses and Surveys (OPCS) classification system. These codes, together with diagnostic codes, are used to generate Healthcare Resource Group (HRG) codes, which correlate to a payment bracket. The aim of this study was to determine whether allocated procedure codes for major head and neck operations were correct and reflective of the work undertaken. HRG codes generated were assessed to determine accuracy of remuneration. PATIENTS AND METHODS The coding of consecutive major head and neck operations undertaken in a tertiary referral centre over a retrospective 3-month period were assessed. Procedure codes were initially ascribed by professional hospital coders. Operations were then recoded by the surgical trainee in liaison with the head of clinical coding. The initial and revised procedure codes were compared and used to generate HRG codes, to determine whether the payment banding had altered. RESULTS A total of 34 cases were reviewed. The number of procedure codes generated initially by the clinical coders was 99, whereas the revised codes generated 146. Of the original codes, 47 of 99 (47.4%) were incorrect. In 19 of the 34 cases reviewed (55.9%), the HRG code remained unchanged, thus resulting in the correct payment. Six cases were never coded, equating to £15,300 loss of payment. CONCLUSIONS These results highlight the inadequacy of this system to reward hospitals for the work carried out within the NHS in a fair and consistent manner. The current coding system was found to be complicated, ambiguous and inaccurate, resulting in loss of remuneration. PMID:19220944

  8. 78 FR 68983 - Cotton Futures Classification: Optional Classification Procedure

    Science.gov (United States)

    2013-11-18

    ...-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing... regulations to allow for the addition of an optional cotton futures classification procedure--identified and... response to requests from the U.S. cotton industry and ICE, AMS will offer a futures classification option...

  9. Classification across gene expression microarray studies

    Directory of Open Access Journals (Sweden)

    Kuner Ruprecht

    2009-12-01

    Full Text Available Abstract Background The increasing number of gene expression microarray studies represents an important resource in biomedical research. As a result, gene expression based diagnosis has entered clinical practice for patient stratification in breast cancer. However, the integration and combined analysis of microarray studies still remains a challenge. We assessed the potential benefit of data integration on the classification accuracy and systematically evaluated the generalization performance of selected methods on four breast cancer studies comprising almost 1000 independent samples. To this end, we introduced an evaluation framework which aims to establish good statistical practice and a graphical way to monitor differences. The classification goal was to correctly predict estrogen receptor status (negative/positive) and histological grade (low/high) of each tumor sample in an independent study which was not used for the training. For the classification we chose support vector machines (SVM), predictive analysis of microarrays (PAM), random forest (RF) and k-top scoring pairs (kTSP). Guided by considerations relevant for classification across studies we developed a generalization of kTSP which we evaluated in addition. Our derived version (DV) aims to improve the robustness of the intrinsic invariance of kTSP with respect to technologies and preprocessing. Results For each individual study the generalization error was benchmarked via complete cross-validation and was found to be similar for all classification methods. The misclassification rates were substantially higher in classification across studies, when each single study was used as an independent test set while all remaining studies were combined for the training of the classifier. However, with increasing number of independent microarray studies used in the training, the overall classification performance improved. DV performed better than the average and showed slightly less variance. In
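
    The train-on-all-but-one-study protocol described here maps directly onto group-aware cross-validation; a schematic scikit-learn sketch with synthetic data follows (each study is treated as a group; the paper's actual classifiers and preprocessing differ):

      import numpy as np
      from sklearn.model_selection import LeaveOneGroupOut
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.standard_normal((400, 50))       # 400 samples, 50 expression features (synthetic)
      y = rng.integers(0, 2, size=400)         # e.g. estrogen receptor status
      study = np.repeat([0, 1, 2, 3], 100)     # four microarray studies

      for train, test in LeaveOneGroupOut().split(X, y, groups=study):
          clf = SVC(kernel="linear").fit(X[train], y[train])   # train on three studies
          held_out = study[test][0]                            # evaluate on the held-out study
          print("held-out study", held_out, "accuracy:", round(clf.score(X[test], y[test]), 3))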

  10. The 2002 Revision of the American Psychological Association's Ethics Code: Implications for School Psychologists

    Science.gov (United States)

    Flanagan, Rosemary; Miller, Jeffrey A.; Jacob, Susan

    2005-01-01

    The Ethical Principles for Psychologists and Code of Conduct has been recently revised. The organization of the code changed, and the language was made more specific. A number of points relevant to school psychology are explicitly stated in the code. A clear advantage of including these items in the code is the assistance to school psychologists…

  11. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  12. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  13. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since this mask is to later act as a template for considerable number of dies on wafer. Defects on the initial blank substrate, and subsequent cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductors industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical defects or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, distinguishing defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment compared to the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of Automatic Defect Classification (ADC) product at MP Mask
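
    Calibre ADC's decision tree is proprietary; as a generic stand-in, the sketch below trains a small scikit-learn decision tree that maps defect attributes of the kind listed (size, signal polarity in transmitted and reflected review images) to hypothetical classification codes:

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      # Columns: defect size (nm), transmitted polarity (0=dark, 1=bright), reflected polarity.
      X = np.array([[120, 0, 0], [30, 1, 1], [200, 0, 1], [25, 0, 0], [150, 1, 0], [40, 1, 1]])
      y = ["pit", "particle", "pit", "nuisance", "particle", "particle"]  # invented class codes

      tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
      print(tree.predict([[180, 0, 1], [28, 1, 1]]))   # classify two new defects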

  14. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

    This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data is retrieved, the translation of stored coded information into plain language is done automatically by computer. Three keys list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically); 2. with the codes listed alphabetically; and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval; economy of storage requirements; and the standardisation of terminology. The nature of this thesaurus-type 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring-binder system which can be updated by an organised (updating) service. (author)

  15. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named 'XSOR'. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  16. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

    The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated with the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can easily be adapted to any other lattice code. The description of the code assumes a basic knowledge of reactor lattices, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  17. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.

  18. Classification of parotidectomy: a proposed modification to the European Salivary Gland Society classification system.

    Science.gov (United States)

    Wong, Wai Keat; Shetty, Subhaschandra

    2017-08-01

    Parotidectomy remains the mainstay of treatment for both benign and malignant lesions of the parotid gland. There exists a wide range of possible surgical options in parotidectomy in terms of the extent of parotid tissue removed. There is an increasing need for uniformity of terminology resulting from growing interest in modifications of the conventional parotidectomy. It is, therefore, of paramount importance to have a standardized classification system for describing the extent of parotidectomy. Recently, the European Salivary Gland Society (ESGS) proposed a novel classification system for parotidectomy. The aim of this study is to evaluate this system. The classification system proposed by the ESGS was critically re-evaluated and modified to increase its accuracy and its acceptability. Modifications mainly focused on subdividing Levels I and II into IA, IB, IIA, and IIB. From June 2006 to June 2016, 126 patients underwent 130 parotidectomies at our hospital. The classification system was tested in that cohort of patients. While the ESGS classification system is comprehensive, it does not cover all possibilities. The addition of Sublevels IA, IB, IIA, and IIB may help to address some of the clinical situations seen and is clinically relevant. We aim to test the modified classification system for partial parotidectomy to address some of the challenges mentioned.

  19. A New Classification Approach Based on Multiple Classification Rules

    OpenAIRE

    Zhongmei Zhou

    2014-01-01

    A good classifier can correctly predict new data for which the class label is unknown, so it is important to construct a high accuracy classifier. Hence, classification techniques are much useful in ubiquitous computing. Associative classification achieves higher classification accuracy than some traditional rule-based classification approaches. However, the approach also has two major deficiencies. First, it generates a very large number of association classification rules, especially when t...

  20. Genetic coding and gene expression - new Quadruplet genetic coding model

    Science.gov (United States)

    Shankar Singh, Rama

    2012-07-01

    Successful demonstration of human genome project has opened the door not only for developing personalized medicine and cure for genetic diseases, but it may also answer the complex and difficult question of the origin of life. It may lead to making 21st century, a century of Biological Sciences as well. Based on the central dogma of Biology, genetic codons in conjunction with tRNA play a key role in translating the RNA bases forming sequence of amino acids leading to a synthesized protein. This is the most critical step in synthesizing the right protein needed for personalized medicine and curing genetic diseases. So far, only triplet codons involving three bases of RNA, transcribed from DNA bases, have been used. Since this approach has several inconsistencies and limitations, even the promise of personalized medicine has not been realized. The new Quadruplet genetic coding model proposed and developed here involves all four RNA bases which in conjunction with tRNA will synthesize the right protein. The transcription and translation process used will be the same, but the Quadruplet codons will help overcome most of the inconsistencies and limitations of the triplet codes. Details of this new Quadruplet genetic coding model and its subsequent potential applications including relevance to the origin of life will be presented.
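
    The combinatorial contrast at the heart of the proposal is simple arithmetic: triplet codons give 4^3 = 64 possibilities for 20 amino acids plus stop signals, while quadruplet codons would give 4^4 = 256, greatly increasing redundancy; enumerated in code:

      from itertools import product

      bases = "ACGU"
      triplets = list(product(bases, repeat=3))
      quadruplets = list(product(bases, repeat=4))
      print(len(triplets), len(quadruplets))   # 64 vs 256 possible codons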

  1. User perspectives on relevance criteria

    DEFF Research Database (Denmark)

    Maglaughlin, Kelly L.; Sonnenwald, Diane H.

    2002-01-01

    , partially relevant, or not relevant to their information need; and explained their decisions in an interview. Analysis revealed 29 criteria, discussed positively and negatively, that were used by the participants when selecting passages that contributed or detracted from a document's relevance......This study investigates the use of criteria to assess relevant, partially relevant, and not-relevant documents. Study participants identified passages within 20 document representations that they used to make relevance judgments; judged each document representation as a whole to be relevant...... matter, thought catalyst), full text (e.g., audience, novelty, type, possible content, utility), journal/publisher (e.g., novelty, main focus, perceived quality), and personal (e.g., competition, time requirements). Results further indicate that multiple criteria are used when making relevant, partially...

  2. Monte Carlo codes use in neutron therapy; Application de codes Monte Carlo en neutrontherapie

    Energy Technology Data Exchange (ETDEWEB)

    Paquis, P.; Mokhtari, F.; Karamanoukian, D. [Hopital Pasteur, 06 - Nice (France); Pignol, J.P. [Hopital du Hasenrain, 68 - Mulhouse (France); Cuendet, P. [CEA Centre d' Etudes de Saclay, 91 - Gif-sur-Yvette (France). Direction des Reacteurs Nucleaires; Fares, G.; Hachem, A. [Faculte des Sciences, 06 - Nice (France); Iborra, N. [Centre Antoine-Lacassagne, 06 - Nice (France)

    1998-04-01

    Monte Carlo calculation codes allow one to study accurately all the parameters relevant to radiation effects, such as dose deposition or the type of microscopic interactions, through particle-by-particle transport simulation. These features are very useful for neutron irradiations, from device development up to dosimetry. This paper illustrates some applications of these codes in Neutron Capture Therapy and Neutron Capture Enhancement of fast neutron irradiations. (authors)

  3. Comparison of codes and standards for radiographic inspection

    International Nuclear Information System (INIS)

    Bingoeldag, M. M.; Aksu, M.; Akguen, A. F.

    1995-01-01

    This report compares the procedural requirements and acceptance criteria for radiographic inspections specified in the relevant national and international codes and standards. In particular, a detailed analysis of inspection conditions, such as exposure arrangements and contrast requirements, is given.

  4. Diabetes Mellitus Coding Training for Family Practice Residents.

    Science.gov (United States)

    Urse, Geraldine N

    2015-07-01

    Although physicians regularly use numeric coding systems such as the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to describe patient encounters, coding errors are common. One of the most complicated diagnoses to code is diabetes mellitus. The ICD-9-CM currently has 39 separate codes for diabetes mellitus; this number will be expanded to more than 50 with the introduction of ICD-10-CM in October 2015. To assess the effect of a 1-hour focused presentation on ICD-9-CM codes on diabetes mellitus coding. A 1-hour focused lecture on the correct use of diabetes mellitus codes for patient visits was presented to family practice residents at Doctors Hospital Family Practice in Columbus, Ohio. To assess resident knowledge of the topic, a pretest and posttest were given to residents before and after the lecture, respectively. Medical records of all patients with diabetes mellitus who were cared for at the hospital 6 weeks before and 6 weeks after the lecture were reviewed and compared for the use of diabetes mellitus ICD-9 codes. Eighteen residents attended the lecture and completed the pretest and posttest. The mean (SD) percentage of correct answers was 72.8% (17.1%) for the pretest and 84.4% (14.6%) for the posttest, for an improvement of 11.6 percentage points (P≤.035). The percentage of total available codes used did not substantially change from before to after the lecture, but the use of the generic ICD-9-CM code for diabetes mellitus type II controlled (250.00) declined (58 of 176 [33%] to 102 of 393 [26%]) and the use of other codes increased, indicating a greater variety in codes used after the focused lecture. After a focused lecture on diabetes mellitus coding, resident coding knowledge improved. Review of medical record data did not reveal an overall change in the number of diabetic codes used after the lecture but did reveal a greater variety in the codes used.

  5. A complete electrical hazard classification system and its application

    Energy Technology Data Exchange (ETDEWEB)

    Gordon, Lloyd B [Los Alamos National Laboratory; Cartelli, Laura [Los Alamos National Laboratory

    2009-01-01

    The Standard for Electrical Safety in the Workplace, NFPA 70E, and relevant OSHA electrical safety standards evolved to address the hazards of 60-Hz power that are faced primarily by electricians, linemen, and others performing facility and utility work. This leaves a substantial gap in the management of electrical hazards in Research and Development (R&D) and specialized high voltage and high power equipment. Examples include lasers, accelerators, capacitor banks, electroplating systems, induction and dielectric heating systems, etc. Although all such systems are fed by 50/60 Hz alternating current (ac) power, we find substantial use of direct current (dc) electrical energy, and the use of capacitors, inductors, batteries, and radiofrequency (RF) power. The electrical hazards of these forms of electricity and their systems are different from those of 50/60 Hz power. Over the past 10 years there has been an effort to develop a method of classifying all of the electrical hazards found in all types of R&D and utilization equipment. Examples of the variation of these hazards from NFPA 70E include (a) high voltage can be harmless if the available current is sufficiently low, (b) low voltage can be harmful if the available current/power is high, (c) high voltage capacitor hazards are unique and include severe reflex action, effects on the heart, and tissue damage, and (d) arc flash hazard analysis for dc and capacitor systems is not provided in existing standards. This work has led to a comprehensive electrical hazard classification system that is based on various research conducted over the past 100 years, on analysis of such systems in R&D, and on decades of experience. Initially, national electrical safety codes required the qualified worker only to know the source voltage to determine the shock hazard. Later, as arc flash hazards were understood, the fault current and clearing time were needed. These items are still insufficient to fully characterize all types of

  6. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank P.W. Shor and A.......M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes....

  7. EOG feature relevance determination for microsleep detection

    Directory of Open Access Journals (Sweden)

    Golz Martin

    2017-09-01

    Full Text Available Automatic relevance determination (ARD) was applied to two-channel EOG recordings for microsleep event (MSE) recognition. The 10 s immediately before MSE and also before counterexamples of fatigued, but attentive driving were analysed. Two types of signal features were extracted: the maximum cross correlation (MaxCC) and logarithmic power spectral densities (PSD) averaged in spectral bands of 0.5 Hz width ranging between 0 and 8 Hz. Generalised relevance learning vector quantisation (GRLVQ) was used as the ARD method to show the potential of feature reduction. This is compared to support-vector machines (SVM), in which feature reduction plays a much smaller role. Cross validation yielded mean normalised relevancies of PSD features in the range of 1.6-4.9% and 1.9-10.4% for horizontal and vertical EOG, respectively. MaxCC relevancies were 0.002-0.006% and 0.002-0.06%, respectively. This shows that PSD features of vertical EOG are indispensable, whereas MaxCC can be neglected. Mean classification accuracies were estimated at 86.6 ± 1.3% and 92.3 ± 0.2% for GRLVQ and SVM, respectively. GRLVQ permits objective feature reduction by inclusion of all processing stages, but is not as accurate as SVM.
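
    GRLVQ implementations are less widely packaged; as an alternative, purely illustrative way to rank feature relevance around an SVM (not the paper's ARD method), scikit-learn's permutation importance can be used on synthetic features of the same shape:

      import numpy as np
      from sklearn.inspection import permutation_importance
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.standard_normal((300, 18))     # e.g. 16 PSD band features plus 2 MaxCC features
      y = (X[:, 3] + 0.5 * X[:, 7] + 0.3 * rng.standard_normal(300) > 0).astype(int)

      clf = SVC(kernel="rbf").fit(X, y)
      result = permutation_importance(clf, X, y, n_repeats=10, random_state=0)
      ranking = np.argsort(result.importances_mean)[::-1]
      print("most relevant features:", ranking[:5])   # features 3 and 7 should rank highest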

  9. Vulnerabilities Classification for Safe Development on Android

    Directory of Open Access Journals (Sweden)

    Ricardo Luis D. M. Ferreira

    2016-06-01

    Full Text Available The global sales market is currently led by devices with the Android operating system. In 2015, more than 1 billion smartphones were sold, of which 81.5% were operated by the Android platform. In 2017, it is estimated that 267.78 billion applications will be downloaded from Google Play. According to Qian, 90% of applications are vulnerable, despite the recommendations of rules and standards for safe software development. This study presents a classification of vulnerabilities, indicating for each the vulnerability itself, the security aspect defined by the Brazilian Association of Technical Standards (Associação Brasileira de Normas Técnicas - ABNT) norm NBR ISO/IEC 27002 that is violated, which lines of code generate the vulnerability and what should be done to avoid it, and the threat agent that exploits it. This classification allows the identification of possible points of vulnerability, allowing the developer to correct the identified gaps.

  10. Classification, (big) data analysis and statistical learning

    CERN Document Server

    Conversano, Claudio; Vichi, Maurizio

    2018-01-01

    This edited book focuses on the latest developments in classification, statistical learning, data analysis and related areas of data science, including statistical analysis of large datasets, big data analytics, time series clustering, integration of data from different sources, as well as social networks. It covers both methodological aspects as well as applications to a wide range of areas such as economics, marketing, education, social sciences, medicine, environmental sciences and the pharmaceutical industry. In addition, it describes the basic features of the software behind the data analysis results, and provides links to the corresponding codes and data sets where necessary. This book is intended for researchers and practitioners who are interested in the latest developments and applications in the field. The peer-reviewed contributions were presented at the 10th Scientific Meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in Santa Margherita di Pul...

  11. Improving the accuracy of operation coding in surgical discharge summaries

    Science.gov (United States)

    Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine

    2014-01-01

    Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286

  12. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of the network coding approach, which focuses on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  13. Code of practice for ionizing radiation

    International Nuclear Information System (INIS)

    Khoo Boo Huat

    1995-01-01

    Prior to 1984, the use of ionizing radiation in Malaysia was governed by the Radioactive Substances Act of 1968. After 1984, its use came under the control of Act 304, called the Atomic Energy Licensing Act 1984. Under powers vested by the Act, the Radiation Protection (Basic Safety Standards) Regulations 1988 were formulated to regulate its use. These Acts do not provide information on proper working procedures. With the publication of the codes of Practice by The Standards and Industrial Research Institute of Malaysia (SIRIM), the users are now able to follow proper guidelines and use ionizing radiation safely and beneficially. This paper discusses the relevant sections in the following codes: 1. Code of Practice for Radiation Protection (Medical X-ray Diagnosis) MS 838:1983. 2. Code of Practice for Safety in Laboratories Part 4: Ionizing radiation MS 1042: Part 4: 1992. (author)

  14. Code of practice for food handler activities.

    Science.gov (United States)

    Smith, T A; Kanas, R P; McCoubrey, I A; Belton, M E

    2005-08-01

    The food industry regulates various aspects of food handler activities, according to legislation and customer expectations. The purpose of this paper is to provide a code of practice which delineates a set of working standards for food handler hygiene, handwashing, use of protective equipment, wearing of jewellery and body piercing. The code was developed by a working group of occupational physicians with expertise in both food manufacturing and retail, using a risk assessment approach. Views were also obtained from other occupational physicians working within the food industry and the relevant regulatory bodies. The final version of the code (available in full as Supplementary data in Occupational Medicine Online) therefore represents a broad consensus of opinion. The code of practice represents a set of minimum standards for food handler suitability and activities, based on a practical assessment of risk, for application in food businesses. It aims to provide useful working advice to food businesses of all sizes.

  15. Land Cover and Land Use Classification with TWOPAC: towards Automated Processing for Pixel- and Object-Based Image Classification

    Directory of Open Access Journals (Sweden)

    Stefan Dech

    2012-09-01

    Full Text Available We present a novel and innovative automated processing environment for the derivation of land cover (LC) and land use (LU) information. This processing framework, named TWOPAC (TWinned Object and Pixel based Automated classification Chain), enables the standardized, independent, user-friendly, and comparable derivation of LC and LU information, with minimized manual classification labor. TWOPAC allows classification of multi-spectral and multi-temporal remote sensing imagery from different sensor types. TWOPAC enables not only pixel-based classification, but also allows classification based on object-based characteristics. Classification is based on a decision tree (DT) approach, for which the well-known C5.0 code has been implemented, which builds decision trees based on the concept of information entropy. TWOPAC enables automatic generation of the decision tree classifier based on a C5.0-retrieved ASCII file, as well as fully automatic validation of the classification output via sample-based accuracy assessment. Envisaging the automated generation of standardized land cover products, as well as area-wide classification of large amounts of data in preferably a short processing time, standardized interfaces for process control, Web Processing Services (WPS), as introduced by the Open Geospatial Consortium (OGC), are utilized. TWOPAC’s functionality to process geospatial raster or vector data via web resources (server, network) enables TWOPAC’s usability independent of any commercial client or desktop software and allows for large-scale data processing on servers. Furthermore, the components of TWOPAC were built using open-source code components and are implemented as a plug-in for Quantum GIS software for easy handling of the classification process from the user’s perspective.
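
    A rough scikit-learn analogue of the entropy-based decision-tree step and the sample-based accuracy assessment (synthetic pixels; TWOPAC itself wraps the original C5.0 code) looks as follows:

      import numpy as np
      from sklearn.metrics import accuracy_score
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      pixels = rng.random((1000, 6))                             # 6 spectral bands per pixel (synthetic)
      labels = (pixels[:, 0] + pixels[:, 3] > 1.0).astype(int)   # two land-cover classes

      X_train, X_test, y_train, y_test = train_test_split(pixels, labels, random_state=0)
      tree = DecisionTreeClassifier(criterion="entropy").fit(X_train, y_train)  # information-entropy splits
      print("accuracy:", accuracy_score(y_test, tree.predict(X_test)))          # sample-based assessment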

  16. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  17. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  18. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code: the combinatorial to coding the identity of units, the non-combinatorial to coding the production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  19. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)

  20. A method for modeling co-occurrence propensity of clinical codes with application to ICD-10-PCS auto-coding.

    Science.gov (United States)

    Subotin, Michael; Davis, Anthony R

    2016-09-01

    Natural language processing methods for medical auto-coding, or automatic generation of medical billing codes from electronic health records, generally assign each code independently of the others. They may thus assign codes for closely related procedures or diagnoses to the same document, even when they do not tend to occur together in practice, simply because the right choice can be difficult to infer from the clinical narrative. We propose a method that injects awareness of the propensities for code co-occurrence into this process. First, a model is trained to estimate the conditional probability that one code is assigned by a human coder, given that another code is known to have been assigned to the same document. Then, at runtime, an iterative algorithm is used to apply this model to the output of an existing statistical auto-coder to modify the confidence scores of the codes. We tested this method in combination with a primary auto-coder for International Statistical Classification of Diseases-10 procedure codes, achieving a 12% relative improvement in F-score over the primary auto-coder baseline. The proposed method can be used, with appropriate features, in combination with any auto-coder that generates codes with different levels of confidence. The promising results obtained for International Statistical Classification of Diseases-10 procedure codes suggest that the proposed method may have wider applications in auto-coding.
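
    A minimal sketch of the rescoring idea, not the authors' implementation: base confidence scores from a hypothetical primary auto-coder are iteratively blended with co-occurrence support. The codes, probabilities and blend weights below are all invented for illustration.

      # cooc[(a, b)] approximates P(a assigned | b assigned); every code and
      # number here is invented for illustration.
      base_scores = {"0DB64Z3": 0.55, "0DJ08ZZ": 0.50, "0FT44ZZ": 0.48}
      cooc = {
          ("0DB64Z3", "0DJ08ZZ"): 0.80, ("0DJ08ZZ", "0DB64Z3"): 0.75,
          ("0DB64Z3", "0FT44ZZ"): 0.10, ("0FT44ZZ", "0DB64Z3"): 0.10,
          ("0DJ08ZZ", "0FT44ZZ"): 0.05, ("0FT44ZZ", "0DJ08ZZ"): 0.05,
      }

      scores = dict(base_scores)
      for _ in range(10):  # iterate until the scores settle
          updated = {}
          for code, base in base_scores.items():
              others = [c for c in scores if c != code]
              # Support: co-occurrence propensity with the other candidate
              # codes, weighted by their current confidence.
              support = sum(scores[o] * cooc[(code, o)] for o in others) / len(others)
              updated[code] = 0.7 * base + 0.3 * support  # blend weights arbitrary
          scores = updated
      print(scores)  # codes that tend to co-occur in practice reinforce each other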

  1. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  2. Interobserver variation in classification of malleolar fractures

    International Nuclear Information System (INIS)

    Verhage, S.M.; Hoogendoorn, J.M.; Rhemrev, S.J.; Keizer, S.B.; Quarles van Ufford, H.M.E.

    2015-01-01

    Classification of malleolar fractures is a matter of debate. In the ideal situation, a classification system is easy to use, shows good inter- and intraobserver agreement, and has implications for treatment or research. Interobserver study. Four observers classified 100 X-rays according to the Weber, AO and Lauge-Hansen systems. In case of a trimalleolar fracture, the size of the posterior fragment was measured. Interobserver agreement was calculated with Cohen's kappa. Agreement on the size of the posterior fragment was calculated with the intraclass correlation coefficient. Moderate agreement was found with all classification systems: Weber (K = 0.49), AO (K = 0.45) and Lauge-Hansen (K = 0.47). Interobserver agreement on the presence of a posterior fracture was substantial (K = 0.63). Estimation of the size of the fragment showed moderate agreement (ICC = 0.57). Classification according to the classical systems showed moderate interobserver agreement, probably due to an unclear trauma mechanism or the difficult relation between the level of the fibular fracture and the syndesmosis. Substantial agreement on posterior malleolar fractures is mostly due to small (<5%) posterior fragments. A classification system that describes the presence and location of fibular fractures, the presence of medial malleolar fractures or deep deltoid ligament injury, and the presence of relevant and dislocated posterior malleolar fractures is more useful in the daily setting than the traditional systems. In case of a trimalleolar fracture, a CT scan is in our opinion very useful for the detection of small posterior fragments and for preoperative planning. (orig.)

  3. Interobserver variation in classification of malleolar fractures

    Energy Technology Data Exchange (ETDEWEB)

    Verhage, S.M.; Hoogendoorn, J.M. [MC Haaglanden, Department of Surgery, The Hague (Netherlands); Secretariaat Heelkunde, MC Haaglanden, locatie Westeinde, Postbus 432, CK, The Hague (Netherlands); Rhemrev, S.J. [MC Haaglanden, Department of Surgery, The Hague (Netherlands); Keizer, S.B. [MC Haaglanden, Department of Orthopaedic Surgery, The Hague (Netherlands); Quarles van Ufford, H.M.E. [MC Haaglanden, Department of Radiology, The Hague (Netherlands)

    2015-10-15

    Classification of malleolar fractures is a matter of debate. In the ideal situation, a classification system is easy to use, shows good inter- and intraobserver agreement, and has implications for treatment or research. Interobserver study. Four observers classified 100 X-rays according to the Weber, AO and Lauge-Hansen systems. In case of a trimalleolar fracture, the size of the posterior fragment was measured. Interobserver agreement was calculated with Cohen's kappa. Agreement on the size of the posterior fragment was calculated with the intraclass correlation coefficient. Moderate agreement was found with all classification systems: Weber (K = 0.49), AO (K = 0.45) and Lauge-Hansen (K = 0.47). Interobserver agreement on the presence of a posterior fracture was substantial (K = 0.63). Estimation of the size of the fragment showed moderate agreement (ICC = 0.57). Classification according to the classical systems showed moderate interobserver agreement, probably due to an unclear trauma mechanism or the difficult relation between the level of the fibular fracture and the syndesmosis. Substantial agreement on posterior malleolar fractures is mostly due to small (<5%) posterior fragments. A classification system that describes the presence and location of fibular fractures, the presence of medial malleolar fractures or deep deltoid ligament injury, and the presence of relevant and dislocated posterior malleolar fractures is more useful in the daily setting than the traditional systems. In case of a trimalleolar fracture, a CT scan is in our opinion very useful for the detection of small posterior fragments and for preoperative planning. (orig.)
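
    Both records above rest on Cohen's kappa as the interobserver agreement statistic. The sketch below computes it for two hypothetical observers assigning Weber classes to ten radiographs; the ratings are invented for illustration.

      # Cohen's kappa: observed agreement corrected for chance agreement.
      from collections import Counter

      rater1 = ["A", "B", "B", "C", "A", "B", "C", "C", "A", "B"]
      rater2 = ["A", "B", "C", "C", "A", "B", "B", "C", "A", "A"]

      n = len(rater1)
      p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n

      # Chance agreement: product of each rater's marginal class frequencies.
      c1, c2 = Counter(rater1), Counter(rater2)
      p_chance = sum((c1[k] / n) * (c2[k] / n) for k in set(c1) | set(c2))

      kappa = (p_observed - p_chance) / (1 - p_chance)
      print(f"kappa = {kappa:.2f}")  # ~0.55 here, i.e. 'moderate' agreement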

  4. Cellular image classification

    CERN Document Server

    Xu, Xiang; Lin, Feng

    2017-01-01

    This book introduces new techniques for cellular image feature extraction, pattern recognition and classification. The authors use the antinuclear antibodies (ANAs) in patient serum as the subjects and the Indirect Immunofluorescence (IIF) technique as the imaging protocol to illustrate the applications of the described methods. Throughout the book, the authors provide evaluations for the proposed methods on two publicly available human epithelial (HEp-2) cell datasets: ICPR2012 dataset from the ICPR'12 HEp-2 cell classification contest and ICIP2013 training dataset from the ICIP'13 Competition on cells classification by fluorescent image analysis. First, the reading of imaging results is significantly influenced by one’s qualification and reading systems, causing high intra- and inter-laboratory variance. The authors present a low-order LP21 fiber mode for optical single cell manipulation and imaging staining patterns of HEp-2 cells. A focused four-lobed mode distribution is stable and effective in optical...

  5. Bosniak classification system

    DEFF Research Database (Denmark)

    Graumann, Ole; Osther, Susanne Sloth; Karstoft, Jens

    2016-01-01

    BACKGROUND: The Bosniak classification was originally based on computed tomographic (CT) findings. Magnetic resonance (MR) and contrast-enhanced ultrasonography (CEUS) imaging may demonstrate findings that are not depicted at CT, and there may not always be a clear correlation between the findings...... at MR and CEUS imaging and those at CT. PURPOSE: To compare diagnostic accuracy of MR, CEUS, and CT when categorizing complex renal cystic masses according to the Bosniak classification. MATERIAL AND METHODS: From February 2011 to June 2012, 46 complex renal cysts were prospectively evaluated by three...... readers. Each mass was categorized according to the Bosniak classification and CT was chosen as gold standard. Kappa was calculated for diagnostic accuracy and data was compared with pathological results. RESULTS: CT images found 27 BII, six BIIF, seven BIII, and six BIV. Forty-three cysts could...

  6. Bosniak Classification system

    DEFF Research Database (Denmark)

    Graumann, Ole; Osther, Susanne Sloth; Karstoft, Jens

    2014-01-01

    Background: The Bosniak classification is a diagnostic tool for the differentiation of cystic changes in the kidney. The process of categorizing renal cysts may be challenging, involving a series of decisions that may affect the final diagnosis and clinical outcome such as surgical management.... Purpose: To investigate the inter- and intra-observer agreement among experienced uroradiologists when categorizing complex renal cysts according to the Bosniak classification. Material and Methods: The original categories of 100 cystic renal masses were chosen as “Gold Standard” (GS), established...... to the calculated weighted κ all readers performed “very good” for both inter-observer and intra-observer variation. Most variation was seen in cysts categorized as Bosniak II, IIF, and III. These results show that radiologists who evaluate complex renal cysts routinely may apply the Bosniak classification...

  7. Expressing Youth Voice through Video Games and Coding

    Science.gov (United States)

    Martin, Crystle

    2017-01-01

    A growing body of research focuses on the impact of video games and coding on learning. The research often elevates learning the technical skills associated with video games and coding or the importance of problem solving and computational thinking, which are, of course, necessary and relevant. However, the literature less often explores how young…

  8. Diglossia and Code Switching in Nigeria: Implications for English ...

    African Journals Online (AJOL)

    This paper discusses a sociolinguistic phenomenon, 'diglossia' and how it relates to code-switching in bilingual and multilingual societies like Nigeria. First, it examines the relevance of the social and linguistic contexts in determining the linguistic code that bilinguals and multilinguals use in various communicative ...

  9. IRSN Code of Ethics and Professional Conduct. Annex VII [TSO Mission Statement and Code of Ethics

    International Nuclear Information System (INIS)

    2018-01-01

    IRSN has adopted, in 2013, a Code of Ethics and Professional Conduct, the contents of which are summarized. As a preamble, it is indicated that the Code, which was adopted in 2013 by the Ethics Commission of IRSN and the Board of IRSN, complies with relevant constitutional and legal requirements. The introduction to the Code presents the role and missions of IRSN in the French system, as well as the various conditions and constraints that frame its action, in particular with respect to ethical issues. It states that the Code sets principles and establishes guidance for addressing these constraints and resolving conflicts that may arise, thus constituting references for the Institute and its staff, and helping IRSN’s partners in their interaction with the Institute. The stipulations of the Code are organized in four articles, reproduced and translated.

  10. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  11. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes. QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown in.

  12. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged

  13. IR-360 nuclear power plant safety functions and component classification

    Energy Technology Data Exchange (ETDEWEB)

    Yousefpour, F., E-mail: fyousefpour@snira.co [Management of Nuclear Power Plant Construction Company (MASNA) (Iran, Islamic Republic of); Shokri, F.; Soltani, H. [Management of Nuclear Power Plant Construction Company (MASNA) (Iran, Islamic Republic of)

    2010-10-15

    The IR-360 nuclear power plant, a 2-loop PWR with 360 MWe of power generation capacity, is under design at the MASNA Company. For the design of the IR-360 structures, systems and components (SSCs), the applicable codes and standards and their design requirements must be determined. Correctly classifying the IR-360 safety functions and the safety grade of structures, systems and components is a prerequisite for selecting and adopting suitable design codes and standards. This paper refers to the IAEA nuclear safety codes and standards as well as the USNRC standard system to determine the IR-360 safety functions and to formulate the principles of IR-360 component classification in accordance with the safety philosophy and features of the IR-360. By implementing the defined classification procedures for the IR-360 SSCs, the appropriate design codes and standards are specified. The requirements of the specific codes and standards are used in the design process of IR-360 SSCs by the design engineers of the MASNA Company. In this paper, the individual determination of the IR-360 safety functions and the definition of the classification procedures and rules are presented. Implementation of this work, which is described with an example, ensures the safety and reliability of the IR-360 nuclear power plant.

  14. IR-360 nuclear power plant safety functions and component classification

    International Nuclear Information System (INIS)

    Yousefpour, F.; Shokri, F.; Soltani, H.

    2010-01-01

    The IR-360 nuclear power plant, a 2-loop PWR with 360 MWe of power generation capacity, is under design at the MASNA Company. For the design of the IR-360 structures, systems and components (SSCs), the applicable codes and standards and their design requirements must be determined. Correctly classifying the IR-360 safety functions and the safety grade of structures, systems and components is a prerequisite for selecting and adopting suitable design codes and standards. This paper refers to the IAEA nuclear safety codes and standards as well as the USNRC standard system to determine the IR-360 safety functions and to formulate the principles of IR-360 component classification in accordance with the safety philosophy and features of the IR-360. By implementing the defined classification procedures for the IR-360 SSCs, the appropriate design codes and standards are specified. The requirements of the specific codes and standards are used in the design process of IR-360 SSCs by the design engineers of the MASNA Company. In this paper, the individual determination of the IR-360 safety functions and the definition of the classification procedures and rules are presented. Implementation of this work, which is described with an example, ensures the safety and reliability of the IR-360 nuclear power plant.

  15. Benchmarking the Multidimensional Stellar Implicit Code MUSIC

    Science.gov (United States)

    Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.

    2017-04-01

    We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton-Krylov method. A physics-based preconditioning technique, which can be adjusted to target varying physics, is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities and the decay of the Taylor-Green vortex. Additionally, we show a test of hydrostatic equilibrium in a stellar environment that is dominated by radiative effects. In this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each of these test cases was analysed with a simple scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes including ATHENA and the PENCIL code. MUSIC is able to reproduce behaviour from established and widely used codes as well as results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.

  16. Efficient Coding of Information: Huffman Coding -RE ...

    Indian Academy of Sciences (India)

    to a stream of equally-likely symbols so as to recover the original stream in the event of errors. The for- ... The source-coding problem is one of finding a mapping from U to a ... probability that the random variable X takes the value x written as ...
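>
    The snippet above is truncated by the indexing service, but the topic, Huffman coding, lends itself to a compact illustration: build the optimal prefix code for a set of symbol probabilities and read off the codewords. The probabilities below are invented.

      # Huffman coding: repeatedly merge the two least likely subtrees.
      import heapq
      from itertools import count

      def huffman(freqs):
          """Return a symbol -> codeword dict for a {symbol: probability} map."""
          tick = count()  # tie-breaker so the heap never compares dicts
          heap = [(p, next(tick), {s: ""}) for s, p in freqs.items()]
          heapq.heapify(heap)
          while len(heap) > 1:
              p1, _, c1 = heapq.heappop(heap)  # two least likely subtrees
              p2, _, c2 = heapq.heappop(heap)
              merged = {s: "0" + w for s, w in c1.items()}
              merged.update({s: "1" + w for s, w in c2.items()})
              heapq.heappush(heap, (p1 + p2, next(tick), merged))
          return heap[0][2]

      print(huffman({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}))
      # -> {'a': '0', 'b': '10', 'd': '110', 'c': '111'}: more probable symbols
      # get shorter codewords, so the mean length approaches the source entropy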

  17. NR-code: Nonlinear reconstruction code

    Science.gov (United States)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  18. Acoustic classification of dwellings

    DEFF Research Database (Denmark)

    Berardi, Umberto; Rasmussen, Birgit

    2014-01-01

    insulation performance, national schemes for sound classification of dwellings have been developed in several European countries. These schemes define acoustic classes according to different levels of sound insulation. Due to the lack of coordination among countries, a significant diversity in terms...... exchanging experiences about constructions fulfilling different classes, reducing trade barriers, and finally increasing the sound insulation of dwellings.......Schemes for the classification of dwellings according to different building performances have been proposed in the last years worldwide. The general idea behind these schemes relates to the positive impact a higher label, and thus a better performance, should have. In particular, focusing on sound...

  19. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

    This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners will also find in the book a detailed presentation of practical data classifiers using MEE. These include multi-layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using an MEE-like concept is also presented. Examples, tests, evaluation experiments and comparisons with similar machines using classic approaches complement the descriptions.

  20. Classification of iconic images

    OpenAIRE

    Zrianina, Mariia; Kopf, Stephan

    2016-01-01

    Iconic images represent an abstract topic and use a presentation that is intuitively understood within a certain cultural context. For example, the abstract topic “global warming” may be represented by a polar bear standing alone on an ice floe. Such images are widely used in media and their automatic classification can help to identify high-level semantic concepts. This paper presents a system for the classification of iconic images. It uses a variation of the Bag of Visual Words approach wi...

  1. Casemix classification systems.

    Science.gov (United States)

    Fetter, R B

    1999-01-01

    The idea of using casemix classification to manage hospital services is not new, but has been limited by available technology. It was not until after the introduction of Medicare in the United States in 1965 that serious attempts were made to measure hospital production in order to contain spiralling costs. This resulted in a system of casemix classification known as diagnosis related groups (DRGs). This paper traces the development of DRGs and their evolution from the initial version to the All Patient Refined DRGs developed in 1991.

  2. The Search for Symmetries in the Genetic Code:

    Science.gov (United States)

    Antoneli, Fernando; Forger, Michael; Hornos, José Eduardo M.

    We give a full classification of the possible schemes for obtaining the distribution of multiplets observed in the standard genetic code by symmetry breaking in the context of finite groups, based on an extended notion of partial symmetry breaking that incorporates the intuitive idea of "freezing" first proposed by Francis Crick, which is given a precise mathematical meaning.

  3. Subordinate-level object classification reexamined.

    Science.gov (United States)

    Biederman, I; Subramaniam, S; Bar, M; Kalocsai, P; Fiser, J

    1999-01-01

    The classification of a table as round rather than square, a car as a Mazda rather than a Ford, a drill bit as 3/8-inch rather than 1/4-inch, and a face as Tom have all been regarded as a single process termed "subordinate classification." Despite the common label, the considerable heterogeneity of the perceptual processing required to achieve such classifications requires, minimally, a more detailed taxonomy. Perceptual information relevant to subordinate-level shape classifications can be presumed to vary on continua of (a) the type of distinctive information that is present, nonaccidental or metric, (b) the size of the relevant contours or surfaces, and (c) the similarity of the to-be-discriminated features, such as whether a straight contour has to be distinguished from a contour of low curvature versus high curvature. We consider three, relatively pure cases. Case 1 subordinates may be distinguished by a representation, a geon structural description (GSD), specifying a nonaccidental characterization of an object's large parts and the relations among these parts, such as a round table versus a square table. Case 2 subordinates are also distinguished by GSDs, except that the distinctive GSDs are present at a small scale in a complex object so the location and mapping of the GSDs are contingent on an initial basic-level classification, such as when we use a logo to distinguish various makes of cars. Expertise for Cases 1 and 2 can be easily achieved through specification, often verbal, of the GSDs. Case 3 subordinates, which have furnished much of the grist for theorizing with "view-based" template models, require fine metric discriminations. Cases 1 and 2 account for the overwhelming majority of shape-based basic- and subordinate-level object classifications that people can and do make in their everyday lives. These classifications are typically made quickly, accurately, and with only modest costs of viewpoint changes. Whereas the activation of an array of

  4. The paradox of atheoretical classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2016-01-01

    A distinction can be made between “artificial classifications” and “natural classifications,” where artificial classifications may adequately serve some limited purposes, but natural classifications are overall most fruitful by allowing inference and thus many different purposes. There is strong...... support for the view that a natural classification should be based on a theory (and, of course, that the most fruitful theory provides the most fruitful classification). Nevertheless, atheoretical (or “descriptive”) classifications are often produced. Paradoxically, atheoretical classifications may...... be very successful. The best example of a successful “atheoretical” classification is probably the prestigious Diagnostic and Statistical Manual of Mental Disorders (DSM) since its third edition from 1980. Based on such successes one may ask: Should the claim that classifications ideally are natural...

  5. A Proposal for Cardiac Arrhythmia Classification using Complexity Measures

    Directory of Open Access Journals (Sweden)

    AROTARITEI, D.

    2017-08-01

    Full Text Available Cardiovascular diseases are one of the major problems of humanity, and therefore one of their components, arrhythmia detection and classification, has drawn increased attention worldwide. The presence of randomness in discrete time series, like those arising in electrophysiology, is firmly connected with computational complexity measures. This connection can be used, for instance, in the analysis of RR-intervals of the electrocardiographic (ECG) signal, coded as a binary string, to detect and classify arrhythmia. Our approach uses three algorithms (Lempel-Ziv, Sample Entropy and T-Code) to compute the information complexity, and a classification tree to detect 13 types of arrhythmia, with encouraging results. To overcome the computational effort required for the complexity calculus, a cloud computing solution with executable code deployment is also proposed.
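
    To make the pipeline concrete, the sketch below computes one of the three named measures, Lempel-Ziv complexity, on an RR-interval series coded as a binary string. The binarization rule (1 if an interval exceeds its predecessor) and the interval values are invented; the abstract does not specify the coding used.

      def lempel_ziv(s):
          """Count the number of distinct phrases in s (LZ76-style parsing)."""
          phrases, i = set(), 0
          while i < len(s):
              j = i + 1
              # extend the phrase until it is one we have not seen before
              while s[i:j] in phrases and j <= len(s):
                  j += 1
              phrases.add(s[i:j])
              i = j
          return len(phrases)

      rr = [800, 810, 790, 805, 800, 820, 815, 830]  # invented RR intervals (ms)
      bits = "".join("1" if b > a else "0" for a, b in zip(rr, rr[1:]))
      print(bits, lempel_ziv(bits))  # regular rhythms yield low complexity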

  6. A search for symmetries in the genetic code

    International Nuclear Information System (INIS)

    Hornos, J.E.M.; Hornos, Y.M.M.

    1991-01-01

    A search for symmetries based on the classification theorem of Cartan for the compact simple Lie algebras is performed to verify to what extent the genetic code is a manifestation of some underlying symmetry. An exact continuous symmetry group cannot be found to reproduce the present, universal code. However a unique approximate symmetry group is compatible with codon assignment for the fundamental amino acids and the termination codon. In order to obtain the actual genetic code, the symmetry must be slightly broken. (author). 27 refs, 3 figs, 6 tabs

  7. Classification of Polarimetric SAR Data Using Dictionary Learning

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg; Dahl, Anders Lindbjerg

    2012-01-01

    This contribution deals with classification of multilook fully polarimetric synthetic aperture radar (SAR) data by learning a dictionary of crop types present in the Foulum test site. The Foulum test site contains a large number of agricultural fields, as well as lakes, forests, natural vegetation......, grasslands and urban areas, which make it ideally suited for evaluation of classification algorithms. Dictionary learning centers around building a collection of image patches typical for the classification problem at hand. This requires initial manual labeling of the classes present in the data and is thus...... a method for supervised classification. Sparse coding of these image patches aims to maintain a proficient number of typical patches and associated labels. Data is consecutively classified by a nearest neighbor search of the dictionary elements and labeled with probabilities of each class. Each dictionary...
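
    As a toy illustration of the dictionary idea (not the authors' algorithm, which sparse-codes multilook polarimetric patches), the sketch below stores labelled prototype vectors and classifies new samples by nearest dictionary element with soft class probabilities. All values and crop names are invented.

      import numpy as np

      # Hypothetical dictionary: prototype feature vectors with crop labels.
      dictionary = np.array([[0.9, 0.1, 0.2],   # "barley"
                             [0.2, 0.8, 0.3],   # "forest"
                             [0.1, 0.2, 0.9]])  # "lake"
      labels = ["barley", "forest", "lake"]

      def classify(patch):
          d = np.linalg.norm(dictionary - patch, axis=1)  # distance to each element
          w = np.exp(-d) / np.exp(-d).sum()               # soft class probabilities
          return labels[int(np.argmin(d))], dict(zip(labels, w.round(2)))

      print(classify(np.array([0.15, 0.75, 0.35])))  # -> ('forest', {...})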

  8. Formalization of the classification pattern: survey of classification modeling in information systems engineering.

    Science.gov (United States)

    Partridge, Chris; de Cesare, Sergio; Mitchell, Andrew; Odell, James

    2018-01-01

    Formalization is becoming more common in all stages of the development of information systems, as a better understanding of its benefits emerges. Classification systems are ubiquitous, no more so than in domain modeling. The classification pattern that underlies these systems provides a good case study of the move toward formalization in part because it illustrates some of the barriers to formalization, including the formal complexity of the pattern and the ontological issues surrounding the "one and the many." Powersets are a way of characterizing the (complex) formal structure of the classification pattern, and their formalization has been extensively studied in mathematics since Cantor's work in the late nineteenth century. One can use this formalization to develop a useful benchmark. There are various communities within information systems engineering (ISE) that are gradually working toward a formalization of the classification pattern. However, for most of these communities, this work is incomplete, in that they have not yet arrived at a solution with the expressiveness of the powerset benchmark. This contrasts with the early smooth adoption of powerset by other information systems communities to, for example, formalize relations. One way of understanding the varying rates of adoption is recognizing that the different communities have different historical baggage. Many conceptual modeling communities emerged from work done on database design, and this creates hurdles to the adoption of the high level of expressiveness of powersets. Another relevant factor is that these communities also often feel, particularly in the case of domain modeling, a responsibility to explain the semantics of whatever formal structures they adopt. This paper aims to make sense of the formalization of the classification pattern in ISE and surveys its history through the literature, starting from the relevant theoretical works of the mathematical literature and gradually shifting focus

  9. Web Page Classification Method Using Neural Networks

    Science.gov (United States)

    Selamat, Ali; Omatu, Sigeru; Yanagimoto, Hidekazu; Fujinaka, Toru; Yoshioka, Michifumi

    Automatic categorization is the only viable method to deal with the scaling problem of the World Wide Web (WWW). In this paper, we propose a news web page classification method (WPCM). The WPCM uses a neural network with inputs obtained from both the principal components and class profile-based features (CPBF). Each news web page is represented by a term-weighting scheme. As the number of unique words in the collection set is large, principal component analysis (PCA) has been used to select the most relevant features for the classification. The final output of the PCA is then combined with the feature vectors from the class profile, which contains the most regular words in each class, before being fed to the neural networks. We have manually selected the most regular words that exist in each class and weighted them using an entropy weighting scheme. A fixed number of regular words from each class is used as a feature vector together with the reduced principal components from the PCA. These feature vectors are then used as the input to the neural networks for classification. The experimental evaluation demonstrates that the WPCM method provides acceptable classification accuracy with the sports news datasets.
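
    A rough sketch of a pipeline in this spirit: term weighting, a PCA-style reduction of the term space, and a small neural network. The class-profile features are omitted for brevity, and the documents, labels and layer sizes are invented; this is not the authors' implementation.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline

      docs = ["striker scores late winner in cup final",
              "central bank raises interest rates again",
              "midfielder signs new three year contract",
              "markets rally after inflation report"]
      labels = ["sport", "business", "sport", "business"]

      model = make_pipeline(
          TfidfVectorizer(),             # term-weighting scheme
          TruncatedSVD(n_components=2),  # PCA-style reduction of the term space
          MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
      )
      model.fit(docs, labels)
      # Toy data, so the result is illustrative only; expected: ['sport']
      print(model.predict(["striker signs new contract"]))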

  10. Validation of a new classification for periprosthetic shoulder fractures.

    Science.gov (United States)

    Kirchhoff, Chlodwig; Beirer, Marc; Brunner, Ulrich; Buchholz, Arne; Biberthaler, Peter; Crönlein, Moritz

    2018-06-01

    Successful treatment of periprosthetic shoulder fractures depends on the right strategy, starting with a well-structured classification of the fracture. Unfortunately, clinically relevant factors for treatment planning are missing from the pre-existing classifications. Therefore, the aim of the present study was to describe a new specific classification system for periprosthetic shoulder fractures, including a structured treatment algorithm for this important fragility fracture issue. The classification was established focussing on five relevant items, namely the prosthesis type, the fracture localisation, the rotator cuff status, the anatomical fracture region and the stability of the implant. After considering each single item, the individual treatment concept can be assessed in one last step. To evaluate the introduced classification, a retrospective analysis of pre- and post-operative data of patients treated with periprosthetic shoulder fractures was conducted by two board-certified trauma surgery consultants. The data of 19 patients (8 male, 11 female) with a mean age of 74 ± 5 years were analysed in our study. The suggested treatment algorithm proved reliable, as reflected by good clinical outcomes in 15 of 16 (94%) cases where the suggested treatment was maintained. Only one case resulted in poor outcome due to post-operative wound infection and had to be revised. The newly developed six-step classification is easy to use and extends the pre-existing classification systems in terms of clinically relevant information. This classification should serve as a simple tool for the surgeon to consider the optimal treatment for patients.

  11. Deep Recurrent Neural Networks for Supernovae Classification

    Science.gov (United States)

    Charnock, Tom; Moss, Adam

    2017-03-01

    We apply deep recurrent neural networks, which are capable of learning complex sequential information, to classify supernovae (code available at https://github.com/adammoss/supernovae). The observational time and filter fluxes are used as inputs to the network, but since the inputs are agnostic, additional data such as host galaxy information can also be included. Using the Supernovae Photometric Classification Challenge (SPCC) data, we find that deep networks are capable of learning about light curves; however, the performance of the network is highly sensitive to the amount of training data. For a training size of 50% of the representational SPCC data set (around 10^4 supernovae) we obtain a type-Ia versus non-type-Ia classification accuracy of 94.7%, an area under the Receiver Operating Characteristic curve (AUC) of 0.986 and an SPCC figure-of-merit F1 = 0.64. When using only the data for the early-epoch challenge defined by the SPCC, we achieve a classification accuracy of 93.1%, AUC of 0.977, and F1 = 0.58, results almost as good as with the whole light curve. By employing bidirectional neural networks, we can acquire impressive classification results between supernovae types I, II and III at an accuracy of 90.4% and AUC of 0.974. We also apply a pre-trained model to obtain classification probabilities as a function of time and show that it can give early indications of supernova type. Our method is competitive with existing algorithms and has applications for future large-scale photometric surveys.
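
    A minimal sketch (not the authors' code, which is linked above) of a bidirectional recurrent classifier for light curves: each time step carries (time, fluxes in several filters), and each sequence receives a supernova type. All array sizes and the training data below are placeholders.

      import numpy as np
      from tensorflow import keras

      n_sne, n_steps, n_features, n_types = 100, 50, 5, 3  # invented sizes
      X = np.random.rand(n_sne, n_steps, n_features)       # placeholder light curves
      y = np.random.randint(0, n_types, size=n_sne)        # placeholder labels

      model = keras.Sequential([
          keras.layers.Input(shape=(n_steps, n_features)),
          keras.layers.Bidirectional(keras.layers.LSTM(16)),  # reads both directions
          keras.layers.Dense(n_types, activation="softmax"),  # type probabilities
      ])
      model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])
      model.fit(X, y, epochs=2, verbose=0)  # real training needs real light curves
      print(model.predict(X[:1]).round(2))  # class probabilities for one object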

  12. Code of Ethics

    Science.gov (United States)

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  13. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low-weight codewords, which gives a further advantage in the error floor region.

  14. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  15. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Lene; Pries-Heje, Jan; Dalgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  16. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  17. Error Correcting Codes

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 2, Issue 3, March ... Series Article: Error Correcting Codes - Reed Solomon Codes. Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  18. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code. International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  19. Conformal radiotherapy: principles and classification

    International Nuclear Information System (INIS)

    Rosenwald, J.C.; Gaboriaud, G.; Pontvert, D.

    1999-01-01

    'Conformal radiotherapy' is the name fixed by usage and given to a new form of radiotherapy resulting from the technological improvements observed during the last ten years. While this terminology is now widely used, no precise definition can be found in the literature. Conformal radiotherapy refers to an approach in which the dose distribution is more closely 'conformed' or adapted to the actual shape of the target volume. However, the achievement of a consensus on a more specific definition is hampered by various difficulties, namely in characterizing the degree of 'conformality'. We have therefore suggested a classification scheme be established on the basis of the tools and the procedures actually used for all steps of the process, i.e., from prescription to treatment completion. Our classification consists of four levels: schematically, at level 0, there is no conformation (rectangular fields); at level 1, a simple conformation takes place, on the basis of conventional 2D imaging; at level 2, a 3D reconstruction of the structures is used for a more accurate conformation; and level 3 includes research and advanced dynamic techniques. We have used our personal experience, contacts with colleagues and data from the literature to analyze all the steps of the planning process, and to define the tools and procedures relevant to a given level. The corresponding tables have been discussed and approved at the European level within the Dynarad concerted action. It is proposed that the term 'conformal radiotherapy' be restricted to procedures where all steps are at least at level 2. (author)

  20. Ecosystem classification, Chapter 2

    Science.gov (United States)

    M.J. Robin-Abbott; L.H. Pardo

    2011-01-01

    The ecosystem classification in this report is based on the ecoregions developed through the Commission for Environmental Cooperation (CEC) for North America (CEC 1997). Only ecosystems that occur in the United States are included. CEC ecoregions are described, with slight modifications, below (CEC 1997) and shown in Figures 2.1 and 2.2. We chose this ecosystem...

  1. The classification of phocomelia.

    Science.gov (United States)

    Tytherleigh-Strong, G; Hooper, G

    2003-06-01

    We studied 24 patients with 44 phocomelic upper limbs. Only 11 limbs could be grouped in the classification system of Frantz and O'Rahilly. The non-classifiable limbs were further studied and their characteristics identified. It is confirmed that phocomelia is not an intercalary defect.

  2. Principles for ecological classification

    Science.gov (United States)

    Dennis H. Grossman; Patrick Bourgeron; Wolf-Dieter N. Busch; David T. Cleland; William Platts; G. Ray; C. Robins; Gary Roloff

    1999-01-01

    The principal purpose of any classification is to relate common properties among different entities to facilitate understanding of evolutionary and adaptive processes. In the context of this volume, it is to facilitate ecosystem stewardship, i.e., to help support ecosystem conservation and management objectives.

  3. Mimicking human texture classification

    NARCIS (Netherlands)

    Rogowitz, B.E.; van Rikxoort, Eva M.; van den Broek, Egon; Pappas, T.N.; Schouten, Theo E.; Daly, S.J.

    2005-01-01

    In an attempt to mimic human (color) texture classification with a clustering algorithm, three lines of research were pursued, in which a test set of 180 texture images (both color and gray-scale versions) was drawn from the OuTex and VisTex databases. First, a k-means algorithm was

  4. Classification, confusion and misclassification

    African Journals Online (AJOL)

    The classification of objects and phenomena in science and nature has fascinated academics since Carl Linnaeus, the Swedish botanist and zoologist, created his binomial description of living things in the 1700s and probably long before in accounts of others in textbooks long since gone. It must have concerned human ...

  5. Classifications in popular music

    NARCIS (Netherlands)

    van Venrooij, A.; Schmutz, V.; Wright, J.D.

    2015-01-01

    The categorical system of popular music, such as genre categories, is a highly differentiated and dynamic classification system. In this article we present work that studies different aspects of these categorical systems in popular music. Following the work of Paul DiMaggio, we focus on four

  6. Shark Teeth Classification

    Science.gov (United States)

    Brown, Tom; Creel, Sally; Lee, Velda

    2009-01-01

    On a recent autumn afternoon at Harmony Leland Elementary in Mableton, Georgia, students in a fifth-grade science class investigated the essential process of classification--the act of putting things into groups according to some common characteristics or attributes. While they may have honed these skills earlier in the week by grouping their own…

  7. Text document classification

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana

    No. 62 (2005), pp. 53-54. ISSN 0926-4981. R&D Projects: GA AV ČR IAA2075302; GA AV ČR KSK1019101; GA MŠk 1M0572. Institutional research plan: CEZ:AV0Z10750506. Keywords: document representation * categorization * classification. Subject RIV: BD - Theory of Information

  8. Classification in Medical Imaging

    DEFF Research Database (Denmark)

    Chen, Chen

    Classification is extensively used in the context of medical image analysis for the purpose of diagnosis or prognosis. In order to classify image content correctly, one needs to extract efficient features with discriminative properties and build classifiers based on these features. In addition...... on characterizing human faces and emphysema disease in lung CT images....

  9. Improving Student Question Classification

    Science.gov (United States)

    Heiner, Cecily; Zachary, Joseph L.

    2009-01-01

    Students in introductory programming classes often articulate their questions and information needs incompletely. Consequently, the automatic classification of student questions to provide automated tutorial responses is a challenging problem. This paper analyzes 411 questions from an introductory Java programming course by reducing the natural…

  10. NOUN CLASSIFICATION IN ESAHIE

    African Journals Online (AJOL)

    The present work deals with noun classification in Esahie (Kwa, Niger ... phonological information influences the noun (form) class system of Esahie. ... between noun classes and (grammatical) Gender is interrogated (in the light of ..... the (A) argument precedes the verb and the (P) argument follows the verb in a simple.

  11. Dynamic Latent Classification Model

    DEFF Research Database (Denmark)

    Zhong, Shengtong; Martínez, Ana M.; Nielsen, Thomas Dyhre

    as possible. Motivated by this problem setting, we propose a generative model for dynamic classification in continuous domains. At each time point the model can be seen as combining a naive Bayes model with a mixture of factor analyzers (FA). The latent variables of the FA are used to capture the dynamics...

  12. Classification of myocardial infarction

    DEFF Research Database (Denmark)

    Saaby, Lotte; Poulsen, Tina Svenstrup; Hosbond, Susanne Elisabeth

    2013-01-01

    The classification of myocardial infarction into 5 types was introduced in 2007 as an important component of the universal definition. In contrast to the plaque rupture-related type 1 myocardial infarction, type 2 myocardial infarction is considered to be caused by an imbalance between demand...

  13. Event Classification using Concepts

    NARCIS (Netherlands)

    Boer, M.H.T. de; Schutte, K.; Kraaij, W.

    2013-01-01

    The semantic gap is one of the challenges in the GOOSE project. In this paper a Semantic Event Classification (SEC) system is proposed as an initial step in tackling the semantic gap challenge in the GOOSE project. This system uses semantic text analysis, multiple feature detectors using the BoW

  14. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang

    2011-06-07

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.

  15. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang; Yu, Jun

    2011-01-01

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.
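
    As a small illustration of the quantities tallied in this study, the sketch below counts codon usage and GC content for a single toy coding sequence; the genome-scale analysis described above applies the same tally across all coding sequences of 917 genomes. The sequence is invented.

      from collections import Counter

      seq = "ATGGCGTGCGGAAGCTAA"  # invented CDS: start codon ... stop codon
      codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
      usage = Counter(codons)
      gc = (seq.count("G") + seq.count("C")) / len(seq)

      print(usage.most_common())   # e.g. [('ATG', 1), ('GCG', 1), ...]
      print(f"GC content = {gc:.2f}")  # usages correlate with GC across genomes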

  16. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to incorrect estimation of the consequences of accidents, and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. ''agreement is within 10%''. A quantitative method is preferable, especially when several competing codes are available. The codes can then be ranked in order of merit. Such a method is described. (Author)
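
    The abstract does not give the specific merit figure, so the sketch below uses one plausible metric, the RMS relative deviation from experiment, to rank competing codes; it is not the paper's method, and all numbers are invented.

      import math

      experiment = [10.0, 12.5, 15.0, 20.0]   # measured values
      predictions = {
          "codeA": [10.5, 12.0, 15.5, 21.0],  # hypothetical code outputs
          "codeB": [11.0, 14.0, 13.5, 22.5],
      }

      def rms_rel_dev(pred, exp):
          """Root-mean-square relative deviation of predictions from experiment."""
          return math.sqrt(sum(((p - e) / e) ** 2 for p, e in zip(pred, exp)) / len(exp))

      ranking = sorted(predictions, key=lambda c: rms_rel_dev(predictions[c], experiment))
      for code in ranking:
          print(code, f"{rms_rel_dev(predictions[code], experiment):.3f}")
      # -> codeA first: smaller deviation means higher merit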

  17. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W.; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex and cannot be fully verified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of the geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  18. Tumor taxonomy for the developmental lineage classification of neoplasms

    International Nuclear Information System (INIS)

    Berman, Jules J

    2004-01-01

    The new 'Developmental lineage classification of neoplasms' was described in a prior publication. The classification is simple (the entire hierarchy is described with just 39 classifiers), comprehensive (providing a place for every tumor of man), and consistent with recent attempts to characterize tumors by cytogenetic and molecular features. A taxonomy is a list of the instances that populate a classification. The taxonomy of neoplasia attempts to list every known term for every known tumor of man. The taxonomy provides each concept with a unique code and groups synonymous terms under the same concept. A Perl script validated successive drafts of the taxonomy, ensuring that: 1) each term occurs only once in the taxonomy; 2) each term occurs in only one tumor class; 3) each concept code occurs in one and only one hierarchical position in the classification; and 4) the file containing the classification and taxonomy is a well-formed XML (eXtensible Markup Language) document. The taxonomy currently contains 122,632 different terms encompassing 5,376 neoplasm concepts. Each concept has, on average, 23 synonyms. The taxonomy populates 'The developmental lineage classification of neoplasms,' and is available as an XML file, currently 9+ Megabytes in length. A representation of the classification/taxonomy listing each term followed by its code, followed by its full ancestry, is available as a flat file, 19+ Megabytes in length. The taxonomy is the largest nomenclature of neoplasms, with more than twice the number of neoplasm names found in other medical nomenclatures, including the 2004 version of the Unified Medical Language System, the Systematized Nomenclature of Medicine Clinical Terminology, the National Cancer Institute's Thesaurus, and the oncology version of the International Classification of Diseases. This manuscript describes a comprehensive taxonomy of neoplasia that collects synonymous terms under a unique code number and assigns each

  19. NEW CLASSIFICATION OF ECOPOLISES

    Directory of Open Access Journals (Sweden)

    VOROBYOV V. V.

    2016-09-01

    Full Text Available Problem statement. Ecopolises are the newest stage of urban planning. They have to be considered as material-energy-informational structures included in the dynamic-evolutionary matrix networks of exchange processes in ecosystems. However, no ecopolis classifications have yet been developed on the basis of such approaches, and this determines the topicality of the article. An analysis of publications on theoretical and applied aspects of ecopolis formation showed that work on them is conducted mainly in the context of the latest scientific and technological achievements in various fields of knowledge. These settlements are technocratic. They are tied to the morphology of space and the network structures of regional and local natural ecosystems, lack independent stability, and cannot exist without continuous human support. In other words, they do not accord with the ecopolis idea. An objective, symbiotic search for an ecopolis concept, together with the development of classifications, is therefore overdue. Purpose statement. To develop an objective rationale for ecopolises and to propose their new classification. Conclusion. The basis of an ecopolis classification should be the idea of correlating the elements of their general plans and the types of human activity with the natural mechanisms of receiving, processing and transmitting matter, energy and information between geo-ecosystems, the planet, man, the material part of the ecopolis, and the Cosmos. A new ecopolis classification should be based on principles of multi-dimensional, time-space symbiotic coherence with ecosystem exchange networks. With this approach, the ecopolis function derives not from a subjective anthropocentric economy but from the holistic objective of the Genesis paradigm; or, in other words, not from the Consequence but from the Cause.

  20. Efficient Fingercode Classification

    Science.gov (United States)

    Sun, Hong-Wei; Law, Kwok-Yan; Gollmann, Dieter; Chung, Siu-Leung; Li, Jian-Bin; Sun, Jia-Guang

    In this paper, we present an efficient fingerprint classification algorithm, an essential component in many critical security application systems, e.g., systems in the e-government and e-finance domains. Fingerprint identification is one of the most important security requirements in homeland security systems such as personnel screening and anti-money laundering. The problem of fingerprint identification involves searching (matching) the fingerprint of a person against each of the fingerprints of all registered persons. To enhance performance and reliability, a common approach is to reduce the search space by first classifying the fingerprints and then performing the search in the respective class. Jain et al. proposed a fingerprint classification algorithm based on a two-stage classifier, which uses a K-nearest neighbor classifier in its first stage. The fingerprint classification algorithm is based on the fingercode representation, an encoding of fingerprints that has been demonstrated to be an effective fingerprint biometric scheme because of its ability to capture both local and global details in a fingerprint image. We enhance this approach by improving the efficiency of the K-nearest neighbor classifier for fingercode-based fingerprint classification. Our research first investigates various fast search algorithms in vector quantization (VQ) and their potential application in fingerprint classification, and then proposes two efficient algorithms based on the pyramid-based search algorithms in VQ. Experimental results on DB1 of FVC 2004 demonstrate that our algorithms can outperform the full search algorithm and the original pyramid-based search algorithms in terms of computational efficiency without sacrificing accuracy.
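
    A minimal sketch of the first-stage K-nearest-neighbor step on fingercode-like feature vectors is shown below; the data are random stand-ins (a real fingercode is a much longer vector of Gabor-filter sector responses), and the sketch uses a plain scikit-learn classifier rather than the pyramid-based fast search proposed in the paper.

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(0)
      X = rng.random((200, 64))        # stand-in fingercode vectors
      y = rng.integers(0, 5, 200)      # five fingerprint classes (whorl, arch, ...)

      knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
      print(knn.predict(X[:3]))        # candidate classes for three query prints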

  1. Differential Classification of Dementia

    Directory of Open Access Journals (Sweden)

    E. Mohr

    1995-01-01

    Full Text Available In the absence of biological markers, dementia classification remains complex both in terms of characterization as well as early detection of the presence or absence of dementing symptoms, particularly in diseases with possible secondary dementia. An empirical, statistical approach using neuropsychological measures was therefore developed to distinguish demented from non-demented patients and to identify differential patterns of cognitive dysfunction in neurodegenerative disease. Age-scaled neurobehavioral test results (Wechsler Adult Intelligence Scale—Revised and Wechsler Memory Scale) from Alzheimer's (AD) and Huntington's (HD) patients, matched for intellectual disability, as well as normal controls were used to derive a classification formula. Stepwise discriminant analysis accurately (99% correct) distinguished controls from demented patients, and separated the two patient groups (79% correct). Variables discriminating between HD and AD patient groups consisted of complex psychomotor tasks, visuospatial function, attention and memory. The reliability of the classification formula was demonstrated with a new, independent sample of AD and HD patients, which yielded virtually identical results (classification accuracy for dementia: 96%; AD versus HD: 78%). To validate the formula, the discriminant function was applied to Parkinson's (PD) patients, 38% of whom were classified as demented. The validity of the classification was demonstrated by significant PD subgroup differences on measures of dementia not included in the discriminant function. Moreover, a majority of demented PD patients (65%) were classified as having an HD-like pattern of cognitive deficits, in line with previous reports of the subcortical nature of PD dementia. This approach may thus be useful in classifying presence or absence of dementia and in discriminating between dementia subtypes in cases of secondary or coincidental dementia.
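
    For readers unfamiliar with discriminant classification, a minimal sketch in the spirit of the approach is given below; the scores are randomly generated stand-ins for age-scaled WAIS-R/WMS results, and scikit-learn's plain linear discriminant analysis stands in for the stepwise procedure used in the study.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(5)
      # 0 = control, 1 = AD-like, 2 = HD-like; eight stand-in test scores each
      X = np.vstack([rng.normal(mu, 1.0, (40, 8)) for mu in (0.0, -1.5, -1.0)])
      y = np.repeat([0, 1, 2], 40)

      lda = LinearDiscriminantAnalysis().fit(X, y)
      print(lda.score(X, y))           # resubstitution classification accuracy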

  2. COOLII code conversion from Cyber to HP

    International Nuclear Information System (INIS)

    Lee, Hae Cho; Kim, Hee Kyung

    1996-06-01

    The COOLII computer code determines the time required to cool down the plant from the shutdown cooling system initiation condition to the cold shutdown or refueling condition. The required cooldown time is calculated under various assumptions on shutdown cooling heat exchanger (SDCHX) availability, reactor coolant system (RCS) low pressure safety injection (LPSI) flow rate, RCS cooldown rate and component cooling system flow rates. This report first describes the detailed work carried out for installation of COOLII on the HP 9000/700 series, as well as the relevant code validation results. Attached is a report on software verification and validation results. 7 refs. (Author)

  3. 78 FR 54970 - Cotton Futures Classification: Optional Classification Procedure

    Science.gov (United States)

    2013-09-09

    ... Service 7 CFR Part 27 [AMS-CN-13-0043] RIN 0581-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing Service, USDA. ACTION: Proposed rule. SUMMARY: The... optional cotton futures classification procedure--identified and known as "registration" by the U.S...

  4. Excitation equilibria in plasmas: a classification

    International Nuclear Information System (INIS)

    Mullen, J.-J.A.M. van der.

    1986-01-01

    In this thesis the author presents a classification of plasmas based on the atomic state distribution function. The study is based on the relation between the distribution function and the underlying processes, and starts with a proper understanding of thermodynamic equilibrium (TE). Four types of proper balances are relevant: the 'Maxwell balance' of kinetic energy transfer, the 'Boltzmann balance' of excitation/deexcitation, the 'Saha balance' of ionization/recombination and the 'Planck balance' for the interaction of atoms with radiation. Special attention is paid to the distribution function of the ionizing excitation saturation balance. The classification theory of the distribution functions in relation to the underlying balances is supported by experimental evidence in an ionizing argon plasma. The Ar I system provides pertinent support for the theory. Experimental facts found in the Ar II system can be interpreted in global terms. (Auth.)

  5. Prediction and classification of respiratory motion

    CERN Document Server

    Lee, Suk Jin

    2014-01-01

    This book describes recent radiotherapy technologies, including tools for measuring target position during radiotherapy and tracking-based delivery systems. It presents a customized prediction of respiratory motion with clustering from multiple patient interactions. The proposed method contributes to the improvement of patient treatments by considering the breathing pattern for accurate dose calculation in radiotherapy systems. Real-time tumor tracking, where the prediction of irregularities becomes relevant, has yet to be clinically established. Statistical quantitative modeling for irregular breathing classification, in which commercial respiration traces are retrospectively categorized into several classes based on breathing pattern, is discussed as well. The proposed statistical classification may provide clinical advantages for adjusting the dose rate before and during external beam radiotherapy, minimizing the safety margin. In the first chapter following the Introduction to this book, we...

  6. Drug safety: Pregnancy rating classifications and controversies.

    Science.gov (United States)

    Wilmer, Erin; Chai, Sandy; Kroumpouzos, George

    2016-01-01

    This contribution consolidates data on international pregnancy rating classifications, including the former US Food and Drug Administration (FDA), Swedish, and Australian classification systems, as well as the evidence-based medicine system, and discusses discrepancies among them. It reviews the new Pregnancy and Lactation Labeling Rule (PLLR) that replaced the former FDA labeling system with narrative-based labeling requirements. The PLLR emphasizes human data and highlights pregnancy exposure registry information. In this context, the review discusses important data on the safety of most medications used in the management of skin disease in pregnancy. Controversies relevant to the safety of certain dermatologic medications during gestation are also discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Lattice-Like Total Perfect Codes

    Directory of Open Access Journals (Sweden)

    Araujo Carlos

    2014-02-01

    Full Text Available A contribution is made to the classification of lattice-like total perfect codes in integer lattices Λ^n via pairs (G, Φ) formed by abelian groups G and homomorphisms Φ: Z^n → G. A conjecture is posed that the cited contribution covers all possible cases. A related conjecture is posed as well, concerning unfinished work on open problems on lattice-like perfect dominating sets in Λ^n with induced components that are parallel paths of length > 1.

  8. Radon Protection in the Technical Building Code

    International Nuclear Information System (INIS)

    Frutos, B.; Garcia, J. P.; Martin, J. L.; Olaya, M.; Serrano, J I.; Suarez, E.; Fernandez, J. A.; Rodrigo, F.

    2003-01-01

    Building construction in areas with high radon concentration in the ground requires the incorporation of certain measures in order to prevent the accumulation of this gas inside buildings. These measures should be considered primarily in the design and construction phases, and should take into account the area of the country where the construction will take place, depending on the potential risk of radon entry. Within the Technical Building Code, radon protection has been addressed through a general classification of the country and of the specific areas where building construction is to take place into different risk categories, and through the introduction of building techniques appropriate for each area. (Author) 17 refs

  9. Huffman coding in advanced audio coding standard

    Science.gov (United States)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures of the Advanced Audio Coding (AAC) Huffman noiseless encoder, their optimisations, and a working implementation. Much attention has been paid to optimising the demand on hardware resources, especially memory size. The design aim was to produce as short a binary stream as possible within this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.
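
    As background to the record above, the sketch below shows the textbook construction of a Huffman prefix code from symbol frequencies; note that the AAC standard itself uses fixed, pre-computed codebooks rather than building trees at run time, so this is an illustration of the principle only.

      import heapq

      def huffman_code(freqs):
          # each heap entry: [total weight, tie-breaker, {symbol: code}]
          heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freqs.items())]
          heapq.heapify(heap)
          while len(heap) > 1:
              lo = heapq.heappop(heap)   # two lightest subtrees
              hi = heapq.heappop(heap)
              for s in lo[2]:            # prepend a bit to every code
                  lo[2][s] = "0" + lo[2][s]
              for s in hi[2]:
                  hi[2][s] = "1" + hi[2][s]
              lo[2].update(hi[2])
              heapq.heappush(heap, [lo[0] + hi[0], lo[1], lo[2]])
          return heap[0][2]

      print(huffman_code({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))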

  10. 32 CFR 2700.22 - Classification guides.

    Science.gov (United States)

    2010-07-01

    ... SECURITY INFORMATION REGULATIONS Derivative Classification § 2700.22 Classification guides. OMSN shall... direct derivative classification, shall identify the information to be protected in specific and uniform...

  11. Performance Measures of Diagnostic Codes for Detecting Opioid Overdose in the Emergency Department.

    Science.gov (United States)

    Rowe, Christopher; Vittinghoff, Eric; Santos, Glenn-Milo; Behar, Emily; Turner, Caitlin; Coffin, Phillip O

    2017-04-01

    Opioid overdose mortality has tripled in the United States since 2000 and opioids are responsible for more than half of all drug overdose deaths, which reached an all-time high in 2014. Opioid overdoses resulting in death, however, represent only a small fraction of all opioid overdose events and efforts to improve surveillance of this public health problem should include tracking nonfatal overdose events. International Classification of Disease (ICD) diagnosis codes, increasingly used for the surveillance of nonfatal drug overdose events, have not been rigorously assessed for validity in capturing overdose events. The present study aimed to validate the use of ICD, 9th revision, Clinical Modification (ICD-9-CM) codes in identifying opioid overdose events in the emergency department (ED) by examining multiple performance measures, including sensitivity and specificity. Data on ED visits from January 1, 2012, to December 31, 2014, including clinical determination of whether the visit constituted an opioid overdose event, were abstracted from electronic medical records for patients prescribed long-term opioids for pain from any of six safety net primary care clinics in San Francisco, California. Combinations of ICD-9-CM codes were validated in the detection of overdose events as determined by medical chart review. Both sensitivity and specificity of different combinations of ICD-9-CM codes were calculated. Unadjusted logistic regression models with robust standard errors and accounting for clustering by patient were used to explore whether overdose ED visits with certain characteristics were more or less likely to be assigned an opioid poisoning ICD-9-CM code by the documenting physician. Forty-four (1.4%) of 3,203 ED visits among 804 patients were determined to be opioid overdose events. Opioid-poisoning ICD-9-CM codes (E850.2-E850.2, 965.00-965.09) identified overdose ED visits with a sensitivity of 25.0% (95% confidence interval [CI] = 13.6% to 37.8%) and
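
    The performance measures named above reduce to simple ratios over a 2x2 confusion matrix; the sketch below reproduces the reported 25% sensitivity from the abstract's counts (11 of 44 true overdose visits flagged), while the false-positive count is an invented illustration, not a figure from the study.

      def sens_spec(tp, fp, fn, tn):
          # sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)
          return tp / (tp + fn), tn / (tn + fp)

      tp, fn = 11, 33                  # 11/44 = 25% sensitivity, as reported
      fp = 5                           # hypothetical false positives
      tn = 3203 - 44 - fp              # remaining non-overdose visits
      sens, spec = sens_spec(tp, fp, fn, tn)
      print(f"sensitivity={sens:.1%}  specificity={spec:.1%}")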

  12. Biogeographic classification of the Caspian Sea

    Science.gov (United States)

    Fendereski, F.; Vogt, M.; Payne, M. R.; Lachkar, Z.; Gruber, N.; Salmanmahiny, A.; Hosseini, S. A.

    2014-11-01

    Like other inland seas, the Caspian Sea (CS) has been influenced by climate change and anthropogenic disturbance during recent decades, yet the scientific understanding of this water body remains poor. In this study, an eco-geographical classification of the CS based on physical information derived from space and in situ data is developed and tested against a set of biological observations. We used a two-step classification procedure, consisting of (i) a data reduction with self-organizing maps (SOMs) and (ii) a synthesis of the most relevant features into a reduced number of marine ecoregions using the hierarchical agglomerative clustering (HAC) method. From an initial set of 12 potential physical variables, 6 independent variables were selected for the classification algorithm, i.e., sea surface temperature (SST), bathymetry, sea ice, seasonal variation of sea surface salinity (DSSS), total suspended matter (TSM) and its seasonal variation (DTSM). The classification results reveal a robust separation between the northern and the middle/southern basins as well as a separation of the shallow nearshore waters from those offshore. The observed patterns in ecoregions can be attributed to differences in climate and geochemical factors such as distance from rivers, water depth and currents. A comparison of the annual and monthly mean Chl a concentrations between the different ecoregions shows significant differences (one-way ANOVA, P < 0.05). A qualitative evaluation of differences in community composition, based on recorded presence-absence patterns of 25 different species of plankton, fish and benthic invertebrates, also confirms the relevance of the ecoregions as proxies for habitats with common biological characteristics.
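
    A minimal sketch of the two-step classification procedure is given below on random stand-in data; for brevity, k-means prototypes stand in for the self-organizing map of step (i), followed by hierarchical agglomerative clustering as in step (ii).

      import numpy as np
      from sklearn.cluster import KMeans, AgglomerativeClustering

      rng = np.random.default_rng(1)
      X = rng.random((500, 6))   # grid cells x 6 variables (SST, depth, ice, ...)

      # step (i): reduce the data to a small set of prototype vectors
      protos = KMeans(n_clusters=30, n_init=10, random_state=0).fit(X).cluster_centers_

      # step (ii): agglomerate the prototypes into a few candidate ecoregions
      print(AgglomerativeClustering(n_clusters=4).fit_predict(protos))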

  13. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  14. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  15. Nonterminals, homomorphisms and codings in different variations of OL-systems. II. Nondeterministic systems

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Rozenberg, Grzegorz; Salomaa, Arto

    1974-01-01

    Continuing the work begun in Part I of this paper, we consider now variations of nondeterministic OL-systems. The present Part II of the paper contains a systematic classification of the effect of nonterminals, codings, weak codings, nonerasing homomorphisms and homomorphisms for all basic variations...

  16. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  17. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called the coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, and each instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early as possible.

  18. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with the neutron transport equation, covering one-dimensional plane geometry problems, one-dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers the problems which can be solved, eigenvalue problems, the outer iteration loop, the inner iteration loop, and finite difference solution procedures. The input and output data for ANISN are also discussed. Two-dimensional problems, such as those treated by the DOT code, are then presented. Finally, an overview of Monte Carlo methods and codes is given.

  19. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
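
    As an illustration of stabilized linear inversion of the kind the INVERT step performs, the sketch below solves a Tikhonov-regularized least-squares problem on a random stand-in kernel; the matrix, data and regularization weight are all hypothetical, not taken from the code's documentation.

      import numpy as np

      def stabilized_inverse(G, d, alpha=1e-2):
          # m = argmin ||G m - d||^2 + alpha ||m||^2
          n = G.shape[1]
          return np.linalg.solve(G.T @ G + alpha * np.eye(n), G.T @ d)

      rng = np.random.default_rng(2)
      G = rng.random((40, 25))               # stand-in gravity kernel
      d = G @ rng.random(25) + 0.01 * rng.standard_normal(40)  # noisy data
      print(stabilized_inverse(G, d)[:5])    # first few recovered densities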

  20. IAEA Classification of Uranium Deposits

    International Nuclear Information System (INIS)

    Bruneton, Patrice

    2014-01-01

    Classifications of uranium deposits follow two general approaches, focusing either on descriptive features such as the geotectonic position, the host rock type and the orebody morphology ("geologic classification"), or on genetic aspects ("genetic classification").

  1. Classification of Osteogenesis Imperfecta revisited

    NARCIS (Netherlands)

    van Dijk, F. S.; Pals, G.; van Rijn, R. R.; Nikkels, P. G. J.; Cobben, J. M.

    2010-01-01

    In 1979 Sillence proposed a classification of Osteogenesis Imperfecta (OI) in OI types I, II, III and IV. In 2004 and 2007 this classification was expanded with OI types V-VIII because of distinct clinical features and/or different causative gene mutations. We propose a revised classification of OI

  2. The future of general classification

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2013-01-01

    Discusses problems related to accessing multiple collections using a single retrieval language. Surveys the concepts of interoperability and switching language. Finds that mapping between more indexing languages will always be an approximation. Surveys the issues related to general classification and contrasts them with special classifications. Argues for the use of general classifications to provide access to collections nationally and internationally.

  3. Classification of Strawberry Fruit Shape by Machine Learning

    Science.gov (United States)

    Ishikawa, T.; Hayashi, A.; Nagamatsu, S.; Kyutoku, Y.; Dan, I.; Wada, T.; Oku, K.; Saeki, Y.; Uto, T.; Tanabata, T.; Isobe, S.; Kochi, N.

    2018-05-01

    Shape is one of the most important traits of agricultural products due to its relationships with the quality, quantity, and value of the products. For strawberries, nine types of fruit shape were defined and classified by humans based on sample patterns of the nine types. In this study, we tested the classification of strawberry shapes by machine learning in order to increase the accuracy of the classification, and we introduce the concept of computerization into this field. Four types of descriptors were extracted from the digital images of strawberries: (1) the Measured Values (MVs), including the length of the contour line, the area, the fruit length and width, and the fruit width/length ratio; (2) the Ellipse Similarity Index (ESI); (3) Elliptic Fourier Descriptors (EFDs); and (4) Chain Code Subtraction (CCS). We used these descriptors for the classification test along with the random forest approach, and eight of the nine shape types were classified with combinations of MVs + CCS + EFDs. CCS is a descriptor that adds human knowledge to the chain codes, and it showed higher robustness in classification than the other descriptors. Our results suggest machine learning's high ability to classify fruit shapes accurately. We will attempt to increase the classification accuracy and apply the machine learning methods to other plant species.
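
    A minimal sketch of the classification step is given below; the descriptor matrix is random stand-in data for the concatenated MV, ESI, EFD and CCS features, so the cross-validated score will be near chance rather than the accuracy reported in the study.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(3)
      X = rng.random((270, 40))     # stand-in shape descriptors per fruit image
      y = rng.integers(0, 9, 270)   # the nine defined strawberry shape types

      rf = RandomForestClassifier(n_estimators=200, random_state=0)
      print(cross_val_score(rf, X, y, cv=5).mean())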

  4. Recommendations for the classification of HIV associated neuromanifestations in the German DRG system.

    Science.gov (United States)

    Evers, Stefan; Fiori, W; Brockmeyer, N; Arendt, G; Husstedt, I-W

    2005-09-12

    HIV associated neuromanifestations are of growing importance in the in-patient treatment of HIV infected patients. In Germany, all in-patients have to be coded according to the ICD-10 classification and the German DRG system. We present recommendations on how to code the different primary and secondary neuromanifestations of HIV infection. These recommendations are based on the commentary on the German DRG procedures and are aimed at establishing uniform coding of neuromanifestations.

  5. A European classification of services for long-term care—the EU-project eDESDE-LTC

    Science.gov (United States)

    Weber, Germain; Brehmer, Barbara; Zeilinger, Elisabeth; Salvador-Carulla, Luis

    2009-01-01

    Purpose and theory The eDESDE-LTC project aims at developing an operational system for coding, mapping and comparing services for long-term care (LTC) across the EU. The project's strategy is to improve EU listing of and access to relevant sources of healthcare information via the development of semantic interoperability in eHealth (coding and listing of services for LTC); to increase access to relevant sources of information on LTC services and to improve linkages between national and regional websites; and to foster cooperation with international organizations (OECD). Methods This operational system will include a standard classification of the main types of care for persons with LTC needs and an instrument for mapping and standard description of services. These instruments are based on previous classification systems for mental health services (ESMS), disability services (DESDE) and ageing services (DESDAE). A Delphi panel formed by seven partners developed a DESDE-LTC beta version, which was translated into six languages. The feasibility of DESDE-LTC is tested in six countries using national focal groups. The final version will then be developed by the Delphi panel, and a webpage, training material and a course will be produced. Results and conclusions The eDESDE-LTC system will be piloted in two EU countries (Spain and Bulgaria). Evaluation will focus primarily on usability and impact analysis. Discussion The added value of this project is related to the right of EU citizens to "have access to high-quality healthcare when and where it is needed". Due to semantic variability and service complexity, existing national listings of services do not provide an adequate framework for patient mobility.

  6. AIRPORTS CLASSIFICATION AND PRIORITY OF THEIR RECONSTRUCTION

    Directory of Open Access Journals (Sweden)

    K. V. Marintseva

    2014-03-01

    Full Text Available Purpose. It is important for Ukraine to have a network of airports that would meet the current and long-term air transportation needs of the population and the economy. This study aims to establish criteria for airport classification in order to determine the role of airports in the development of the air transport system of Ukraine. Methodology. Methods of statistical analysis were used for processing data on airport productivity categories, and a geographic information system was used for data visualization. Findings. It is established that the existing division of Ukrainian airports into international and domestic, as well as into coordinated and non-coordinated ones, is not relevant for determining the role of an airport in the development of the country's air transport system and, accordingly, for prioritizing the financing of airport modernization. An approach to airport classification based on the analysis of performance categories was developed. Originality. Classification criteria for the airports of Ukraine are proposed: by type of activity and by the maintenance of the scheduled route network. By type of activity, it is proposed to classify airports as primary commercial, commercial, cargo primary commercial, cargo commercial and general aviation. By scheduled route network maintenance, it is proposed to classify airports as primary, non-primary and auxiliary hubs. An example of classification by the given criteria is presented. Practical value. The value of the obtained results lies in the possibility of using the proposed classification in the task of determining priorities for financing the country's airports. As opposed to the practice of directed funding within the framework of the state program of airport development, it is proposed to take into account the fact that the resumption of the functioning of an airport and/or its modernization should be a response to

  7. Binary Classification Method of Social Network Users

    Directory of Open Access Journals (Sweden)

    I. A. Poryadin

    2017-01-01

    Full Text Available The subject of research is a binary classification method for social network users based on analysis of the data they have posted. The relevance of the task of gaining information about a person by examining the content of his or her pages in social networks is exemplified. The most common approach to its solution is visual browsing. An order of a regional authority in our country illustrates that its use in school education is needed. The article shows the limitations of visual browsing of pupils' pages in social networks as a tool for the teacher and the school psychologist, and justifies that the analysis of social network users' data should be automated. It explores publications describing such data acquisition, processing and analysis methods, and considers their advantages and disadvantages. The article also gives arguments to support a proposal to study the classification method of social network users. One such method is credit scoring, which is used in banks and credit institutions to assess the solvency of clients. Based on the high efficiency of the method, there is a proposal to significantly expand its use in other areas of society. The possibility of using logistic regression as the mathematical apparatus of the proposed method of binary classification has been justified. Such an approach enables taking into account the different types of data extracted from social networks, among them: personal user data, information about hobbies, friends, graphic and text information, and behaviour characteristics. The article describes a number of existing methods of data transformation that can be applied to solve the problem. An experiment on binary gender-based classification of social network users is described. A logistic model obtained for this example includes multiple logical variables obtained by transforming the user surnames. This experiment confirms the feasibility of the proposed method. Further work is to define a system
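
    A minimal sketch of logistic-regression-based binary classification on profile-like features is given below; the feature matrix and labels are random stand-ins for the personal data, hobby indicators and text-derived variables the article describes.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      X = rng.random((300, 10))     # stand-in profile features
      y = (X[:, 0] + 0.3 * rng.standard_normal(300) > 0.5).astype(int)

      clf = LogisticRegression().fit(X, y)
      print(clf.predict_proba(X[:2]))   # class-membership probabilities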

  8. Intelligent Computer Vision System for Automated Classification

    International Nuclear Information System (INIS)

    Jordanov, Ivan; Georgieva, Antoniya

    2010-01-01

    In this paper we investigate an Intelligent Computer Vision System applied for recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (features number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: combination of feature generation techniques; application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and use of suitable NN design and learning method.
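
    A minimal sketch of a feature-reduction-plus-neural-network pipeline of the kind described above is given below; the texture features, class count and network settings are hypothetical, and scikit-learn's MLP with default training stands in for the authors' GLPτS-trained networks.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(6)
      X = rng.random((150, 30))     # stand-in texture features of tile images
      y = rng.integers(0, 3, 150)   # three hypothetical cork tile classes

      # PCA for dimensionality reduction feeding a neural network classifier
      clf = make_pipeline(PCA(n_components=10), MLPClassifier(max_iter=500))
      print(clf.fit(X, y).score(X, y))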

  9. An Integrated Approach to Battery Health Monitoring using Bayesian Regression, Classification and State Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — The application of the Bayesian theory of managing uncertainty and complexity to regression and classification in the form of Relevance Vector Machine (RVM), and to...

  10. Derivation and validation of the Systemic Lupus International Collaborating Clinics classification criteria for systemic lupus erythematosus

    DEFF Research Database (Denmark)

    Petri, Michelle; Orbai, Ana-Maria; Alarcón, Graciela S

    2012-01-01

    The Systemic Lupus International Collaborating Clinics (SLICC) group revised and validated the American College of Rheumatology (ACR) systemic lupus erythematosus (SLE) classification criteria in order to improve clinical relevance, meet stringent methodology requirements, and incorporate new...

  11. Classifications of Acute Scaphoid Fractures: A Systematic Literature Review.

    Science.gov (United States)

    Ten Berg, Paul W; Drijkoningen, Tessa; Strackee, Simon D; Buijze, Geert A

    2016-05-01

    Background In the absence of consensus, surgeon preference determines how acute scaphoid fractures are classified. There is a great variety of classification systems with considerable controversies. Purposes The purpose of this study was to provide an overview of the different classification systems, clarifying their subgroups and analyzing their popularity by comparing citation indexes. The intention was to improve data comparison between studies using heterogeneous fracture descriptions. Methods We performed a systematic review of the literature based on a search of medical literature from 1950 to 2015, and a manual search using the reference lists in relevant book chapters. Only original descriptions of classifications of acute scaphoid fractures in adults were included. Popularity was based on the citation index as reported in the databases of Web of Science (WoS) and Google Scholar. Articles that were cited <10 times in WoS were excluded. Results Our literature search resulted in 308 potentially eligible descriptive reports, of which 12 reports met the inclusion criteria. We distinguished 13 different (sub)classification systems based on (1) fracture location, (2) fracture plane orientation, and (3) fracture stability/displacement. Based on citation numbers, the Herbert classification was most popular, followed by the Russe and Mayo classifications. All classification systems were based on plain radiography. Conclusions Most classification systems were based on fracture location, displacement, or stability. Based on the controversy and limited reliability of current classification systems, suggested research areas for an updated classification include three-dimensional fracture pattern etiology and fracture fragment mobility assessed by dynamic imaging.

  12. [Headache: classification and diagnosis].

    Science.gov (United States)

    Carbaat, P A T; Couturier, E G M

    2016-11-01

    There are many types of headache and, moreover, many people have several types of headache at the same time. Adequate treatment is possible only on the basis of the correct diagnosis. Both technically and in terms of content, the current diagnostic process for headache is based on the 'International Classification of Headache Disorders' (ICHD-3-beta) produced under the auspices of the International Headache Society. This classification is based on a distinction between primary and secondary headaches. The most common primary headache types are tension-type headache, migraine and cluster headache. Application of uniform diagnostic concepts is essential for arriving at the most appropriate treatment of the various types of headache.

  13. Sound classification of dwellings

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2012-01-01

    National schemes for sound classification of dwellings exist in more than ten countries in Europe, typically published as national standards. The schemes define quality classes reflecting different levels of acoustical comfort. Main criteria concern airborne and impact sound insulation between dwellings, facade sound insulation and installation noise. The schemes have been developed, implemented and revised gradually since the early 1990s. However, due to lack of coordination between countries, there are significant discrepancies, and new standards and revisions continue to increase the diversity. Harmonization is needed, and a European COST Action TU0901 "Integrating and Harmonizing Sound Insulation Aspects in Sustainable Urban Housing Constructions" has been established and runs 2009-2013, one of the main objectives being to prepare a proposal for a European sound classification scheme with a number of quality

  14. CONTRANS 2 code conversion from Apollo to HP

    International Nuclear Information System (INIS)

    Lee, Hae Cho

    1996-01-01

    The CONTRANS2 computer code is used to calculate the transient thermal hydraulic responses of the containment building to loss of coolant and main steam line break accidents. Mass and energy release to the containment following an accident are code inputs. This report first describes the detailed work carried out for installation of CONTRANS2 on the Apollo DN10000 and the code validation results after installation. Secondly, a series of work is described in relation to installation of CONTRANS2 on the HP 9000/700 series, as well as relevant code validation results. Attached is a report on software verification and validation results. 7 refs. (Author)

  15. CESEC III code conversion from Apollo to HP9000

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hae Cho [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-01-01

    The CESEC code is a computer program used to analyze the transient behaviour of the reactor coolant systems of nuclear power plants. CESEC III is an extension of the original CESEC code that covers a wide range of accident analyses, including an ATWS model. Major parameters during the transients are calculated by CESEC. This report first describes the detailed work carried out for installation of CESEC III on the Apollo DN10000 and the code validation results after installation. Secondly, a series of work is described in relation to installation of CESEC III on the HP 9000/700 series, as well as relevant code validation results. Attached is a report on software verification and validation results. 7 refs. (Author)

  16. CESEC III code conversion from Apollo to HP9000

    International Nuclear Information System (INIS)

    Lee, Hae Cho

    1996-01-01

    The CESEC code is a computer program used to analyze the transient behaviour of the reactor coolant systems of nuclear power plants. CESEC III is an extension of the original CESEC code that covers a wide range of accident analyses, including an ATWS model. Major parameters during the transients are calculated by CESEC. This report first describes the detailed work carried out for installation of CESEC III on the Apollo DN10000 and the code validation results after installation. Secondly, a series of work is described in relation to installation of CESEC III on the HP 9000/700 series, as well as relevant code validation results. Attached is a report on software verification and validation results. 7 refs. (Author)

  17. Exploiting monotonicity constraints for active learning in ordinal classification

    NARCIS (Netherlands)

    Soons, Pieter; Feelders, Adrianus

    2014-01-01

    We consider ordinal classification and instance ranking problems where each attribute is known to have an increasing or decreasing relation with the class label or rank. For example, it stands to reason that the number of query terms occurring in a document has a positive influence on its relevance

  18. Towards an international classification for patient safety : the conceptual framework

    NARCIS (Netherlands)

    Sherman, H.; Castro, G.; Fletcher, M.; Hatlie, M.; Hibbert, P.; Jakob, R.; Koss, R.; Lewalle, P.; Loeb, J.; Perneger, Th.; Runciman, W.; Thomson, R.; Schaaf, van der T.W.; Virtanen, M.

    2009-01-01

    Global advances in patient safety have been hampered by the lack of a uniform classification of patient safety concepts. This is a significant barrier to developing strategies to reduce risk, performing evidence-based research and evaluating existing healthcare policies relevant to patient safety.

  19. Granular loess classification based

    International Nuclear Information System (INIS)

    Browzin, B.S.

    1985-01-01

    This paper discusses how loess might be identified by two index properties: the granulometric composition and the dry unit weight. These two indices are necessary but not always sufficient for identification of loess. On the basis of analyses of samples from three continents, it was concluded that the 0.01-0.5-mm fraction deserves the name loessial fraction. Based on the loessial fraction concept, a granulometric classification of loess is proposed. A triangular chart is used to classify loess

  20. Classification and regression trees

    CERN Document Server

    Breiman, Leo; Olshen, Richard A; Stone, Charles J

    1984-01-01

    The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.
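
    For illustration, the sketch below grows a small classification tree with scikit-learn's CART-style learner on the standard iris data and prints the resulting splitting rules; this is a generic example, not code from the monograph.

      from sklearn.datasets import load_iris
      from sklearn.tree import DecisionTreeClassifier, export_text

      X, y = load_iris(return_X_y=True)
      tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
      print(export_text(tree))   # the learned tree-structured rules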

  1. CLASSIFICATION OF CRIMINAL GROUPS

    OpenAIRE

    Natalia Romanova

    2013-01-01

    New types of criminal groups are emerging in modern society. These types have their own special criminal subculture. The research objective is to develop new parameters for the classification of modern criminal groups, create a new typology of criminal groups and identify some features of their subculture. The research methodology is based on the system approach, which includes using the method of analysis of documentary sources (materials of a criminal case), the method of conversations with the members of the...

  2. Decimal Classification Editions

    Directory of Open Access Journals (Sweden)

    Zenovia Niculescu

    2009-01-01

    Full Text Available The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology, reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme from the point of view of improving the informational offer, through basic index terms, revised and developed, as well as through valuing the auxiliary notations.

  3. Decimal Classification Editions

    OpenAIRE

    Zenovia Niculescu

    2009-01-01

    The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology, reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme from the point of view of improving the informational offer, through basic index terms, revised and developed, as well as through valuing the auxiliary notations.

  4. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase the number of dimensions seen by the network using a linear mapping. Receivers can trade off computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof.

  5. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2018-01-01

    coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements

  6. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  7. OCA Code Enforcement

    Data.gov (United States)

    Montgomery County of Maryland — The Office of the County Attorney (OCA) processes Code Violation Citations issued by County agencies. The citations can be viewed by issued department, issued date...

  8. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code, which is capable of determining structural loads on a flexible, teetering, horizontal axis wind turbine, is described, and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included, as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  9. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
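
    Levelization as defined above amounts to requiring that the package dependency graph be a DAG, which a topological sort can check; the package names below are hypothetical, and the sketch uses only the Python standard library.

      from graphlib import TopologicalSorter  # Python 3.9+

      # hypothetical package -> prerequisite-packages map
      deps = {
          "app":     {"physics", "io"},
          "physics": {"mesh", "util"},
          "io":      {"util"},
          "mesh":    {"util"},
          "util":    set(),
      }
      # static_order() raises CycleError if the set is not levelizable
      print(list(TopologicalSorter(deps).static_order()))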

  10. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs

  11. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  12. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  13. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out using the example of the WIMSD-5B code. The WIMS code in its various versions is the most widely recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by the NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally, the specific algorithm applied in fuel depletion calculations is outlined. (author)

  14. Transporter taxonomy - a comparison of different transport protein classification schemes.

    Science.gov (United States)

    Viereck, Michael; Gaulton, Anna; Digles, Daniela; Ecker, Gerhard F

    2014-06-01

    Currently, there are more than 800 well characterized human membrane transport proteins (including channels and transporters) and there are estimates that about 10% (approx. 2000) of all human genes are related to transport. Membrane transport proteins are of interest as potential drug targets, for drug delivery, and as a cause of side effects and drug–drug interactions. In light of the development of Open PHACTS, which provides an open pharmacological space, we analyzed selected membrane transport protein classification schemes (Transporter Classification Database, ChEMBL, IUPHAR/BPS Guide to Pharmacology, and Gene Ontology) for their ability to serve as a basis for pharmacology driven protein classification. A comparison of these membrane transport protein classification schemes by using a set of clinically relevant transporters as use-case reveals the strengths and weaknesses of the different taxonomy approaches.

  15. Nonlinear programming for classification problems in machine learning

    Science.gov (United States)

    Astorino, Annabella; Fuduli, Antonio; Gaudioso, Manlio

    2016-10-01

    We survey some nonlinear models for classification problems arising in machine learning. In recent years this field has become more and more relevant owing to many practical applications, such as text and web classification, object recognition in machine vision, gene expression profile analysis, DNA and protein analysis, medical diagnosis, customer profiling, etc. Classification deals with the separation of sets by means of appropriate separation surfaces, generally obtained by solving a numerical optimization model. While linear separability is the basis of the most popular approach to classification, the Support Vector Machine (SVM), in recent years the use of nonlinear separating surfaces has received some attention. The objective of this work is to recall some of these proposals, mainly in terms of the numerical optimization models. In particular we tackle the polyhedral, ellipsoidal, spherical and conical separation approaches and, for some of them, we also consider the semisupervised versions.
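
    As a concrete instance of the surveyed approaches, spherical separation can be sketched by fixing the sphere's centre (here the positive-class mean, a common simplification) and scanning for the radius with the fewest training errors, in place of a full numerical optimization model. Everything below is illustrative, not from the paper:

      import numpy as np

      def spherical_classifier(X_pos, X_neg):
          """Separate two point sets with a sphere centred on the positive mean."""
          center = X_pos.mean(axis=0)
          d_pos = np.linalg.norm(X_pos - center, axis=1)
          d_neg = np.linalg.norm(X_neg - center, axis=1)
          # Try every observed distance as a candidate radius; keep the best.
          candidates = np.concatenate([d_pos, d_neg])
          errors = [(d_pos > r).sum() + (d_neg <= r).sum() for r in candidates]
          return center, candidates[int(np.argmin(errors))]

      rng = np.random.default_rng(0)
      inside = rng.normal(0.0, 1.0, size=(50, 2))   # positive class, clustered
      outside = rng.normal(0.0, 4.0, size=(50, 2))  # negative class, spread out
      center, radius = spherical_classifier(inside, outside)
      print("radius:", round(float(radius), 3))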

  16. Classifications of track structures

    International Nuclear Information System (INIS)

    Paretzke, H.G.

    1984-01-01

    When ionizing particles interact with matter they produce random topological structures of primary activations which represent the initial boundary conditions for all subsequent physical, chemical and/or biological reactions. There are two important aspects of research on such track structures: their experimental or theoretical determination on the one hand, and the quantitative classification of these complex structures on the other, the latter being a basic prerequisite for understanding the mechanisms of radiation action. This paper deals only with the latter topic, i.e. the problems encountered in, and possible approaches to, the quantitative ordering and grouping of these multidimensional objects by their degree of similarity with respect to their efficiency in producing certain final radiation effects, i.e. their 'radiation quality'. Various attempts at taxonomic classification with respect to radiation efficiency have been made in basic and applied radiation research, including macro- and microdosimetric concepts as well as track-entity and stopping-power based theories. In this paper no review of those well-known approaches is given, but rather an outline and discussion of alternative methods new to this field of radiation research which have some very promising features and which could possibly solve at least some major classification problems

  17. Neuromuscular disease classification system

    Science.gov (United States)

    Sáez, Aurora; Acha, Begoña; Montero-Sánchez, Adoración; Rivas, Eloy; Escudero, Luis M.; Serrano, Carmen

    2013-06-01

    Diagnosis of neuromuscular diseases is based on subjective visual assessment of biopsies from patients by the specialist pathologist. A system for objective analysis and classification of muscular dystrophies and neurogenic atrophies through fluorescence microscopy images of muscle biopsies is presented. The procedure starts with an accurate segmentation of the muscle fibers using mathematical morphology and a watershed transform. A feature extraction step is carried out in two parts: 24 features that pathologists take into account when diagnosing the diseases, and 58 structural features invisible to the human eye, obtained by treating the biopsy as a graph in which each fiber is a node and two nodes are connected if the corresponding fibers are adjacent. Feature selection using sequential forward selection and sequential backward selection methods, classification using a Fuzzy ARTMAP neural network, and a study of severity grading are performed on these two sets of features. A database of 91 images was used: 71 images for training and 20 for testing. A classification error of 0% was obtained. It is concluded that the addition of features undetectable by human visual inspection improves the categorization of atrophic patterns.

  18. An automated cirrus classification

    Science.gov (United States)

    Gryspeerdt, Edward; Quaas, Johannes; Goren, Tom; Klocke, Daniel; Brueck, Matthias

    2018-05-01

    Cirrus clouds play an important role in determining the radiation budget of the earth, but many of their properties remain uncertain, particularly their response to aerosol variations and to warming. Part of the reason for this uncertainty is the dependence of cirrus cloud properties on the cloud formation mechanism, which itself is strongly dependent on the local meteorological conditions. In this work, a classification system (Identification and Classification of Cirrus or IC-CIR) is introduced to identify cirrus clouds by the cloud formation mechanism. Using reanalysis and satellite data, cirrus clouds are separated into four main types: orographic, frontal, convective and synoptic. Through a comparison to convection-permitting model simulations and back-trajectory-based analysis, it is shown that these observation-based regimes can provide extra information on the cloud-scale updraughts and the frequency of occurrence of liquid-origin ice, with the convective regime having higher updraughts and a greater occurrence of liquid-origin ice compared to the synoptic regimes. Despite having different cloud formation mechanisms, the radiative properties of the regimes are not distinct, indicating that retrieved cloud properties alone are insufficient to completely describe them. This classification is designed to be easily implemented in GCMs, helping improve future model-observation comparisons and leading to improved parametrisations of cirrus cloud processes.

  19. Critical Care Coding for Neurologists.

    Science.gov (United States)

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  20. Lattice Index Coding

    OpenAIRE

    Natarajan, Lakshmi; Hong, Yi; Viterbo, Emanuele

    2014-01-01

    The index coding problem involves a sender with K messages to be transmitted across a broadcast channel, and a set of receivers each of which demands a subset of the K messages while having prior knowledge of a different subset as side information. We consider the specific case of noisy index coding where the broadcast channel is Gaussian and every receiver demands all the messages from the source. Instances of this communication problem arise in wireless relay networks, sensor networks, and ...

  1. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  2. Cracking the Gender Codes

    DEFF Research Database (Denmark)

    Rennison, Betina Wolfgang

    2016-01-01

    extensive work to raise the proportion of women. This has helped slightly, but women remain underrepresented at the corporate top. Why is this so? What can be done to solve it? This article presents five different types of answers relating to five discursive codes: nature, talent, business, exclusion...... in leadership management, we must become more aware and take advantage of this complexity. We must crack the codes in order to crack the curve....

  3. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  4. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16-group Hansen-Roach cross sections and P_1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218-group set is distributed with the code) and has a general P_N scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k_eff, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes

  5. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like other techniques, needs standards. These standards are widely used, and the methods for applying them are well established; radiography testing is therefore practised only in accordance with documented regulations and guidelines, set out in codes, standards and specifications. In Malaysia, level one and basic radiographers carry out radiography work based on instructions given by a level two or level three radiographer. These instructions are produced according to the guidelines documented in the relevant standards, and the level two radiographer must follow the specifications in the standard when writing them. This makes clear that radiography is a type of work in which everything must follow the rules. As for codes, radiography follows the code of the American Society of Mechanical Engineers (ASME), and the only code in Malaysia at this time is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With this code in place, all radiography must follow the regulated rules and standards.

  6. Compensatory neurofuzzy model for discrete data classification in biomedical

    Science.gov (United States)

    Ceylan, Rahime

    2015-03-01

    Biomedical data fall into two main categories: signals and discrete data. Studies in this area therefore address either biomedical signal classification or biomedical discrete data classification. Artificial intelligence models exist for the classification of ECG, EMG or EEG signals; likewise, the literature contains many models for classifying discrete data, such as sample values obtained from blood analyses or biopsies in the medical process. No single algorithm achieves a high accuracy rate on both signal and discrete data classification. In this study, a compensatory neurofuzzy network model is presented for the classification of discrete data in the biomedical pattern recognition area. The compensatory neurofuzzy network is a hybrid, binary classifier in which the parameters of the fuzzy systems are updated by the backpropagation algorithm. The realized classifier model was applied to two benchmark datasets (the Wisconsin Breast Cancer dataset and the Pima Indian Diabetes dataset). Experimental studies show that the compensatory neurofuzzy network model achieved a 96.11% accuracy rate in classification of the breast cancer dataset and a 69.08% accuracy rate on the diabetes dataset with only 10 iterations.

  7. Effectiveness of Multivariate Time Series Classification Using Shapelets

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2015-01-01

    Full Text Available Typically, time series classifiers require signal pre-processing (filtering signals from noise, artifact removal, etc.), enhancement of signal features (amplitude, frequency, spectrum, etc.), and classification of signal features in space using classical techniques and classification algorithms for multivariate data. We consider a method of classifying time series which does not require enhancement of the signal features. The method uses time series shapelets, i.e. small fragments of the series that best reflect the properties of one of its classes. Despite the significant number of publications on the theory and applications of shapelets for the classification of time series, the task of evaluating the effectiveness of this technique remains relevant. The objective of this publication is to study the effectiveness of a number of modifications of the original shapelet method as applied to multivariate series classification, a little-studied problem. The paper presents the problem statement of multivariate time series classification using shapelets and describes the shapelet-based basic method of binary classification, as well as various generalizations and a proposed modification of the method. It also describes the software that implements the modified method and the results of computational experiments confirming the effectiveness of the algorithmic and software solutions. The paper shows that the modified method and its software implementation allow a classification accuracy of about 85% to be reached, at best. The shapelet search time increases in proportion to the input data dimension.
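
    The primitive underlying every shapelet method is the minimum distance from a series to a candidate subsequence, thresholded to give a binary rule. A minimal sketch on synthetic data (not the authors' modified method or software):

      import numpy as np

      def shapelet_distance(series, shapelet):
          """Smallest Euclidean distance between the shapelet and any
          equal-length sliding window of the series."""
          windows = np.lib.stride_tricks.sliding_window_view(series, len(shapelet))
          return float(np.min(np.linalg.norm(windows - shapelet, axis=1)))

      def classify(series, shapelet, threshold):
          """Binary shapelet rule: class 1 if the shapelet occurs somewhere
          in the series to within the distance threshold."""
          return int(shapelet_distance(series, shapelet) <= threshold)

      rng = np.random.default_rng(0)
      shapelet = np.exp(-np.linspace(-1, 1, 15) ** 2 * 20)  # a short bump
      class1 = rng.normal(0, 0.05, 100); class1[40:55] += shapelet
      class0 = rng.normal(0, 0.05, 100)
      print(classify(class1, shapelet, 0.5), classify(class0, shapelet, 0.5))  # 1 0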

  8. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  9. COAST code conversion from Cyber to HP

    International Nuclear Information System (INIS)

    Lee, Hae Cho

    1996-04-01

    The transient thermal-hydraulic behavior of the reactor coolant system in a nuclear power plant following loss of coolant flow is analyzed using the COAST digital computer code. COAST calculates individual loop flow rates and steam generator pressure drops as a function of time following coast-down of any number of reactor coolant pumps. This report first describes the detailed work carried out to install COAST on the HP 9000/700 series and the code validation results after installation. It then describes a similar series of work relating to the installation of COAST on the Apollo DN10000 series, along with the relevant code validation results. Attached is a report on software verification and validation results. 7 refs. (Author)

  10. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.

  11. Sandia National Laboratories analysis code data base

    Science.gov (United States)

    Peterson, C. W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.

  12. Comparability between the ninth and tenth revisions of the International Classification of Diseases applied to coding causes of death in Spain

    Directory of Open Access Journals (Sweden)

    M. Ruiz

    2002-12-01

    Objective: To analyze comparability between the ninth and tenth revisions of the International Classification of Diseases (ICD) applied to coding causes of death in Spain. Methods: According to the ninth and tenth revisions of the ICD, 80,084 statistical bulletins of mortality registered in 1999 were assigned the Basic Cause of Death. The statistical bulletins corresponded to the Autonomous Communities of Andalusia, Cantabria, Murcia, Navarre and the Basque Country, and the city of Barcelona. The underlying causes of death were classified into 17 groups. Simple correspondence, the Kappa index and the comparability ratio for major causes were calculated. Results: A total of 3.6% of deaths changed group due to an increase (36.4%) in infectious and parasitic diseases, mainly because of the inclusion of AIDS, and a corresponding decrease due to the exclusion of endocrine, nutritional and metabolic disorders. Furthermore, myelodysplastic syndrome was moved to the category of neoplasms. The group including nervous system diseases, eye and related diseases, and ear and mastoid apophysis diseases increased (14.7%) at the expense of mental and behavior disorders, due to the inclusion of senile and presenile organic psychosis. Poorly-defined entities increased (14.1%) due to the inclusion of cardiac arrest and its synonyms, together with heart failure, to the detriment of diseases of the vascular system. Diseases of the respiratory system increased (4.8%) due to the inclusion of respiratory failure, previously considered a poorly defined cause. The correspondence for all causes was 96.4% and the Kappa index was 94.9%. Conclusions: The introduction of ICD-10 affects the comparability of statistical series of mortality according to cause. The results of this study allow us to identify the main modifications and to quantify the changes in the major causes of mortality in Spain.

  13. A rule-based electronic phenotyping algorithm for detecting clinically relevant cardiovascular disease cases.

    Science.gov (United States)

    Esteban, Santiago; Rodríguez Tablado, Manuel; Ricci, Ricardo Ignacio; Terrasa, Sergio; Kopitowski, Karin

    2017-07-14

    The implementation of electronic medical records (EMR) is becoming increasingly common. Error and data loss reduction, patient-care efficiency gains, decision-making assistance and facilitation of event surveillance are some of the many processes that EMRs help improve. In addition, they show a lot of promise in terms of data collection to facilitate observational epidemiological studies, and their use for this purpose has increased significantly over recent years. Even though the quantity and availability of data are clearly improved thanks to EMRs, the problem of data quality remains. This is especially important when attempting to determine whether an event has actually occurred. We sought to assess the sensitivity, specificity, and agreement level of a codes-based algorithm for the detection of clinically relevant cardiovascular (CaVD) and cerebrovascular (CeVD) disease cases, using data from EMRs. Three family physicians from the research group selected clinically relevant CaVD and CeVD terms from the International Classification of Primary Care, Second Edition (ICPC-2), the ICD-10 version 2015 and the SNOMED CT 2015 Edition. These terms included signs, symptoms, diagnoses and procedures associated with CaVD and CeVD. Terms not related to symptoms, signs, diagnoses or procedures of CaVD or CeVD, and those describing incidental findings without clinical relevance, were excluded. The algorithm yielded a positive result if the patient had at least one of the selected terms in their medical records, as long as it was not recorded as an error; otherwise, if no terms were found, the patient was classified as negative. This algorithm was applied to a randomly selected sample of the active patients within the hospital's HMO as of 1/1/2005 who were 40-79 years old, had at least one year of seniority in the HMO and at least one clinical encounter. Thus, patients were classified into four groups: (1) Negative patients (2) Patients with Ca
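
    The decision rule itself reduces to a single pass over coded entries. A sketch under assumed data structures, with an illustrative term list rather than the study's actual selection:

      # Hypothetical subset of selected cardiovascular/cerebrovascular term codes
      # (illustrative only; the study drew its terms from ICPC-2, ICD-10 and
      # SNOMED CT).
      SELECTED_CODES = {"I21", "I63", "I64", "K74", "K90"}

      def classify_patient(records):
          """records: iterable of (code, recorded_as_error) tuples from the EMR.
          Positive if any selected term appears and was not recorded in error."""
          return any(code in SELECTED_CODES and not is_error
                     for code, is_error in records)

      print(classify_patient([("I21", False), ("Z00", False)]))  # True
      print(classify_patient([("I21", True)]))   # False: term recorded as error
      print(classify_patient([("Z00", False)]))  # False: no selected term found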

  14. An upper bound on the number of errors corrected by a convolutional code

    DEFF Research Database (Denmark)

    Justesen, Jørn

    2000-01-01

    The number of errors that a convolutional code can correct in a segment of the encoded sequence is upper bounded by the number of distinct syndrome sequences of the relevant length.

  15. Comparison of Danish dichotomous and BI-RADS classifications of mammographic density.

    Science.gov (United States)

    Hodge, Rebecca; Hellmann, Sophie Sell; von Euler-Chelpin, My; Vejborg, Ilse; Andersen, Zorana Jovanovic

    2014-06-01

    In the Copenhagen mammography screening program from 1991 to 2001, mammographic density was classified either as fatty or mixed/dense. This dichotomous mammographic density classification system is internationally unique and has not been validated before. To compare the Danish dichotomous mammographic density classification system from 1991 to 2001 with the BI-RADS density classification, in an attempt to validate the Danish classification system. The study sample consisted of 120 mammograms taken in Copenhagen in 1991-2001 which had tested false positive and which were re-assessed in 2012 and classified according to the BI-RADS classification system. We calculated inter-rater agreement between the Danish dichotomous mammographic classification (fatty or mixed/dense) and the four-level BI-RADS classification using the linearly weighted Kappa statistic. Of the 120 women, 32 (26.7%) were classified as having fatty and 88 (73.3%) as mixed/dense mammographic density according to the Danish dichotomous classification. According to the BI-RADS density classification, 12 (10.0%) women were classified as having predominantly fatty (BI-RADS code 1), 46 (38.3%) as having scattered fibroglandular (BI-RADS code 2), 57 (47.5%) as having heterogeneously dense (BI-RADS code 3), and five (4.2%) as having extremely dense (BI-RADS code 4) mammographic density. The inter-rater agreement assessed by the weighted Kappa statistic was substantial (0.75). The dichotomous mammographic density classification system utilized in the early years of Copenhagen's mammographic screening program (1991-2001) agreed well with the BI-RADS density classification system.
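
    The agreement statistic used here is the linearly weighted kappa. A sketch of the computation on made-up paired readings (not the study's data), collapsing BI-RADS to a binary scale as one plausible alignment of the two systems; with only two categories the linear weighting reduces to the unweighted kappa, but the function is written for a general ordinal scale:

      import numpy as np

      def linear_weighted_kappa(r1, r2, k):
          """Linearly weighted kappa for two raters on an ordinal scale 0..k-1."""
          O = np.zeros((k, k))
          for a, b in zip(r1, r2):
              O[a, b] += 1
          O /= O.sum()                                # observed proportions
          W = np.abs(np.subtract.outer(np.arange(k), np.arange(k))) / (k - 1)
          E = np.outer(O.sum(axis=1), O.sum(axis=0))  # expected by chance
          return 1 - (W * O).sum() / (W * E).sum()

      # Made-up paired readings: dichotomous (0 = fatty, 1 = mixed/dense) against
      # BI-RADS density collapsed to the same binary scale (codes 1-2 -> 0,
      # codes 3-4 -> 1), one plausible alignment of the two systems.
      dichotomous = [0, 0, 0, 1, 1, 1, 1, 1, 1, 1]
      birads = [1, 1, 2, 2, 3, 3, 3, 3, 4, 4]
      birads_binary = [0 if b <= 2 else 1 for b in birads]
      print(round(linear_weighted_kappa(dichotomous, birads_binary, k=2), 3))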

  16. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-09-07

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.

  17. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yi; Zhao, Shiguang; Gao, Xin

    2014-01-01

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
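
    The regularization term these two records describe is the mutual information between the classification response and the true label. The sketch below evaluates just that term for an ordinarily trained linear classifier, discretizing the response into bins; the paper's method additionally folds the term into the training objective itself:

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import mutual_info_score

      X, y = make_classification(n_samples=500, n_features=10, random_state=0)
      clf = LogisticRegression().fit(X, y)

      # Discretize the real-valued classification response into quartile bins,
      # then measure I(response; label), the quantity the method maximizes.
      response = clf.decision_function(X)
      bins = np.digitize(response, np.quantile(response, [0.25, 0.5, 0.75]))
      print("I(response; label) in nats:", round(mutual_info_score(y, bins), 4))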

  18. A computerized energy systems code and information library at Soreq

    Energy Technology Data Exchange (ETDEWEB)

    Silverman, I; Shapira, M; Caner, D; Sapier, D [Israel Atomic Energy Commission, Yavne (Israel). Soreq Nuclear Research Center

    1996-12-01

    In the framework of the contractual agreement between the Ministry of Energy and Infrastructure and the Division of Nuclear Engineering of the Israel Atomic Energy Commission, both Soreq-NRC and Ben-Gurion University agreed to establish, in 1991, a code center. This code center contains a library of computer codes and relevant data, with particular emphasis on nuclear power plant research and development support. The code center maintains existing computer codes and adapts them to the ever-changing computing environment, keeps track of new code developments in the field of nuclear engineering, and acquires the most recent revisions of computer codes of interest. An attempt is made to collect relevant codes developed in Israel and to ensure that proper documentation and application instructions are available. In addition to computer programs, the code center collects sample problems and international benchmarks to verify the codes and their applications to various areas of interest to nuclear power plant engineering and safety evaluation. Recently, the reactor simulation group at Soreq acquired, using funds provided by the Ministry of Energy and Infrastructure, a PC workstation running the Linux operating system to give users of the library an easy on-line way to access resources available at the library. These resources include the computer codes and their documentation, reports published by the reactor simulation group, and other information databases available at Soreq. Registered users set up a communication line, through a modem, between their computer and the new workstation at Soreq and use it to download codes and/or information or to solve their problems, using codes from the library, on the computer at Soreq (authors).

  19. A computerized energy systems code and information library at Soreq

    International Nuclear Information System (INIS)

    Silverman, I.; Shapira, M.; Caner, D.; Sapier, D.

    1996-01-01

    In the framework of the contractual agreement between the Ministry of Energy and Infrastructure and the Division of Nuclear Engineering of the Israel Atomic Energy Commission, both Soreq-NRC and Ben-Gurion University agreed to establish, in 1991, a code center. This code center contains a library of computer codes and relevant data, with particular emphasis on nuclear power plant research and development support. The code center maintains existing computer codes and adapts them to the ever-changing computing environment, keeps track of new code developments in the field of nuclear engineering, and acquires the most recent revisions of computer codes of interest. An attempt is made to collect relevant codes developed in Israel and to ensure that proper documentation and application instructions are available. In addition to computer programs, the code center collects sample problems and international benchmarks to verify the codes and their applications to various areas of interest to nuclear power plant engineering and safety evaluation. Recently, the reactor simulation group at Soreq acquired, using funds provided by the Ministry of Energy and Infrastructure, a PC workstation running the Linux operating system to give users of the library an easy on-line way to access resources available at the library. These resources include the computer codes and their documentation, reports published by the reactor simulation group, and other information databases available at Soreq. Registered users set up a communication line, through a modem, between their computer and the new workstation at Soreq and use it to download codes and/or information or to solve their problems, using codes from the library, on the computer at Soreq (authors)

  20. Mirror neurons and their clinical relevance.

    Science.gov (United States)

    Rizzolatti, Giacomo; Fabbri-Destro, Maddalena; Cattaneo, Luigi

    2009-01-01

    One of the most exciting events in neurosciences over the past few years has been the discovery of a mechanism that unifies action perception and action execution. The essence of this 'mirror' mechanism is as follows: whenever individuals observe an action being done by someone else, a set of neurons that code for that action is activated in the observers' motor system. Since the observers are aware of the outcome of their motor acts, they also understand what the other individual is doing without the need for intermediate cognitive mediation. In this Review, after discussing the most pertinent data concerning the mirror mechanism, we examine the clinical relevance of this mechanism. We first discuss the relationship between mirror mechanism impairment and some core symptoms of autism. We then outline the theoretical principles of neurorehabilitation strategies based on the mirror mechanism. We conclude by examining the relationship between the mirror mechanism and some features of the environmental dependency syndromes.

  1. Application of the International Classification of Functioning, Disability and Health (ICF) to people with dysphagia following non-surgical head and neck cancer management.

    Science.gov (United States)

    Nund, Rebecca L; Scarinci, Nerina A; Cartmill, Bena; Ward, Elizabeth C; Kuipers, Pim; Porceddu, Sandro V

    2014-12-01

    The International Classification of Functioning, Disability, and Health (ICF) is an internationally recognized framework which allows its user to describe the consequences of a health condition on an individual in the context of their environment. With growing recognition that dysphagia can have broad ranging physical and psychosocial impacts, the aim of this paper was to identify the ICF domains and categories that describe the full functional impact of dysphagia following non-surgical head and neck cancer (HNC) management, from the perspective of the person with dysphagia. A secondary analysis was conducted on previously published qualitative study data which explored the lived experiences of dysphagia of 24 individuals with self-reported swallowing difficulties following HNC management. Categories and sub-categories identified by the qualitative analysis were subsequently mapped to the ICF using the established linking rules to develop a set of ICF codes relevant to the impact of dysphagia following HNC management. The 69 categories and sub-categories that had emerged from the qualitative analysis were successfully linked to 52 ICF codes. The distribution of these codes across the ICF framework revealed that the components of Body Functions, Activities and Participation, and Environmental Factors were almost equally represented. The findings confirm that the ICF is a valuable framework for representing the complexity and multifaceted impact of dysphagia following HNC. This list of ICF codes, which reflect the diverse impact of dysphagia associated with HNC on the individual, can be used to guide more holistic assessment and management for this population.

  2. Environmental Monitoring, Water Quality - MO 2009 Water Quality Standards - Table G Lake Classifications and Use Designations (SHP)

    Data.gov (United States)

    NSGIC State | GIS Inventory — This data set contains Missouri Water Quality Standards (WQS) lake classifications and use designations described in the Missouri Code of State Regulations (CSR), 10...

  3. ILAE Classification of the Epilepsies Position Paper of the ILAE Commission for Classification and Terminology

    Science.gov (United States)

    Scheffer, Ingrid E; Berkovic, Samuel; Capovilla, Giuseppe; Connolly, Mary B; French, Jacqueline; Guilhoto, Laura; Hirsch, Edouard; Jain, Satish; Mathern, Gary W.; Moshé, Solomon L; Nordli, Douglas R; Perucca, Emilio; Tomson, Torbjörn; Wiebe, Samuel; Zhang, Yue-Hua; Zuberi, Sameer M

    2017-01-01

    Summary The ILAE Classification of the Epilepsies has been updated to reflect our gain in understanding of the epilepsies and their underlying mechanisms following the major scientific advances which have taken place since the last ratified classification in 1989. As a critical tool for the practising clinician, epilepsy classification must be relevant and dynamic to changes in thinking, yet robust and translatable to all areas of the globe. Its primary purpose is for diagnosis of patients, but it is also critical for epilepsy research, development of antiepileptic therapies and communication around the world. The new classification originates from a draft document submitted for public comments in 2013 which was revised to incorporate extensive feedback from the international epilepsy community over several rounds of consultation. It presents three levels, starting with seizure type where it assumes that the patient is having epileptic seizures as defined by the new 2017 ILAE Seizure Classification. After diagnosis of the seizure type, the next step is diagnosis of epilepsy type, including focal epilepsy, generalized epilepsy, combined generalized and focal epilepsy, and also an unknown epilepsy group. The third level is that of epilepsy syndrome where a specific syndromic diagnosis can be made. The new classification incorporates etiology along each stage, emphasizing the need to consider etiology at each step of diagnosis as it often carries significant treatment implications. Etiology is broken into six subgroups, selected because of their potential therapeutic consequences. New terminology is introduced such as developmental and epileptic encephalopathy. The term benign is replaced by the terms self-limited and pharmacoresponsive, to be used where appropriate. It is hoped that this new framework will assist in improving epilepsy care and research in the 21st century. PMID:28276062

  4. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double-weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is a variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Both theoretical analysis and simulation show that the EDW code provides much better performance than existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes.

  5. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
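
    For small parameters, the minimum distance $d$ of an $[n,k,d]_q$ code can be verified by brute force: enumerate all $q^k$ codewords produced by a generator matrix and take the minimum Hamming weight, which for a linear code equals the minimum distance. A sketch over $GF(3)$ with an arbitrary example generator matrix (not one of the paper's new codes):

      import itertools
      import numpy as np

      def min_distance(G, q=3):
          """Minimum Hamming weight over all nonzero codewords of the linear
          code with generator matrix G over GF(q)."""
          k, n = G.shape
          best = n
          for msg in itertools.product(range(q), repeat=k):
              if any(msg):
                  word = np.dot(msg, G) % q
                  best = min(best, int(np.count_nonzero(word)))
          return best

      # Illustrative generator matrix of a ternary [4, 2] code.
      G = np.array([[1, 0, 1, 1],
                    [0, 1, 1, 2]])
      print("d =", min_distance(G))  # prints "d = 3", i.e. a [4,2,3]_3 code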

  6. Classification of Cancer-related Death Certificates using Machine Learning

    Directory of Open Access Journals (Sweden)

    Luke Butt

    2013-05-01

    Full Text Available Background: Cancer monitoring and prevention relies on the critical aspect of timely notification of cancer cases. However, the abstraction and classification of cancer from the free text of pathology reports and other relevant documents, such as death certificates, are complex and time-consuming activities. Aims: In this paper, approaches for the automatic detection of notifiable cancer cases as the cause of death from free-text death certificates supplied to Cancer Registries are investigated. Method: A number of machine learning classifiers were studied. Features were extracted using natural language techniques and the Medtex toolkit. The numerous features encompassed stemmed words, bi-grams, and concepts from the SNOMED CT medical terminology. The baseline consisted of a keyword spotter using keywords extracted from the long descriptions of ICD-10 cancer-related codes. Results: Death certificates with notifiable cancer listed as the cause of death can be effectively identified with the methods studied in this paper. A Support Vector Machine (SVM) classifier achieved the best performance, with an overall F-measure of 0.9866 when evaluated on a set of 5,000 free-text death certificates using the token stem feature set. The SNOMED CT concept plus token stem feature set reached the lowest variance (0.0032) and false negative rate (0.0297) while achieving an F-measure of 0.9864. The SVM classifier accounts for the first 18 of the top 40 evaluated runs and is the most robust classifier, with a variance of 0.001141, half that of the other classifiers. Conclusion: The selection of features had the greatest influence on classifier performance, although the type of classifier employed also affects performance. In contrast, the feature weighting schema had a negligible effect on performance. Specifically, it is found that stemmed tokens with or without SNOMED CT concepts create the most effective feature when combined with
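
    A minimal stand-in for the kind of pipeline this record evaluates, token features feeding an SVM, using scikit-learn with invented certificate fragments in place of the Medtex feature extraction and real death certificates:

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import LinearSVC

      # Invented free-text fragments standing in for death certificate text.
      texts = ["metastatic carcinoma of the lung",
               "acute myocardial infarction",
               "malignant neoplasm of the colon",
               "chronic obstructive pulmonary disease",
               "lymphoma with hepatic involvement",
               "cerebrovascular accident"]
      labels = [1, 0, 1, 0, 1, 0]  # 1 = notifiable cancer as cause of death

      # Word and bi-gram features feeding a linear support vector machine.
      clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
      clf.fit(texts, labels)
      print(clf.predict(["carcinoma of the stomach", "myocardial infarction"]))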

  7. Relevance of Neuroinflammation and Encephalitis in Autism

    Directory of Open Access Journals (Sweden)

    Janet Kern

    2016-01-01

    Full Text Available In recent years, many studies have indicated that children with an autism spectrum disorder (ASD) diagnosis have brain pathology suggestive of ongoing neuroinflammation or encephalitis in different regions of their brains. Evidence of neuroinflammation or encephalitis in ASD includes: microglial and astrocytic activation, a unique and elevated proinflammatory profile of cytokines, and aberrant expression of nuclear factor kappa-light-chain-enhancer of activated B cells. A conservative estimate based on the research suggests that at least 69% of individuals with an ASD diagnosis have microglial activation or neuroinflammation. Encephalitis, which is defined as inflammation of the brain, is medical diagnosis code G04.90 in the International Classification of Diseases, 10th revision; however, children with an ASD diagnosis are not generally assessed for a possible medical diagnosis of encephalitis. This is unfortunate because if a child with ASD has neuroinflammation, then treating the underlying brain inflammation could lead to improved outcomes. The purpose of this review of the literature is to examine the evidence of neuroinflammation/encephalitis in those with an ASD diagnosis and to address how a medical diagnosis of encephalitis, when appropriate, could benefit these children by driving more immediate and targeted treatments.

  8. Self-organizing ontology of biochemically relevant small molecules.

    Science.gov (United States)

    Chepelev, Leonid L; Hastings, Janna; Ennis, Marcus; Steinbeck, Christoph; Dumontier, Michel

    2012-01-06

    The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publically release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI) are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontology which can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development. We conclude that the proposed methodology can ease the burden of chemical data annotators and

  9. Self-organizing ontology of biochemically relevant small molecules

    Science.gov (United States)

    2012-01-01

    Background The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publically release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI) are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. Results To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontology which can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development. Conclusions We conclude that the proposed methodology can ease the burden of

  10. Self-organizing ontology of biochemically relevant small molecules

    Directory of Open Access Journals (Sweden)

    Chepelev Leonid L

    2012-01-01

    Full Text Available Background: The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publically release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI) are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. Results: To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontology which can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development. Conclusions: We conclude that the proposed methodology

  11. ACE - Manufacturer Identification Code (MID)

    Data.gov (United States)

    Department of Homeland Security — The ACE Manufacturer Identification Code (MID) application is used to track and control identifications codes for manufacturers. A manufacturer is identified on an...

  12. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.

  13. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  14. Profiles of Dialogue for Relevance

    Directory of Open Access Journals (Sweden)

    Douglas Walton

    2016-12-01

    Full Text Available This paper uses argument diagrams, argumentation schemes, and some tools from formal argumentation systems developed in artificial intelligence to build a graph-theoretic model of relevance, shown to be applicable (with some extensions) as a practical method for helping a third party judge issues of relevance or irrelevance of an argument in real examples. Examples used to illustrate how the method works are drawn from disputes about relevance in natural language discourse, including a criminal trial and a parliamentary debate.

  15. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D division of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (material behaviour, large deformations, specific loads, unloading and loss-of-load proportionality indicators, global algorithm, contact and friction); fracture mechanics (energy release rate G, energy release rate in thermo-elasto-plasticity, local 3D energy release rate, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, fracture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and random dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  16. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
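    The syndrome idea at the heart of this scheme can be shown in miniature (our toy sketch; the paper itself uses LDPC syndromes, doping bits, and sum-product decoding rather than brute force): the encoder transmits only the syndrome of the source word, and the decoder returns the word in that syndrome's coset closest to its side information.

```python
import itertools
import numpy as np

# Parity-check matrix of a tiny code with distinct nonzero columns, so
# any single-bit discrepancy between source and side info is resolved.
H = np.array([[1, 0, 0, 1, 1],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1]])

def encode(x):
    """Encoder sends the 3-bit syndrome instead of the 5-bit source."""
    return (H @ x) % 2

def decode(syndrome, side_info):
    """Return the word with the given syndrome closest (in Hamming
    distance) to the decoder's side information, by brute force."""
    best, best_dist = None, None
    for cand in itertools.product([0, 1], repeat=H.shape[1]):
        cand = np.array(cand)
        if np.array_equal((H @ cand) % 2, syndrome):
            d = int(np.sum(cand != side_info))
            if best is None or d < best_dist:
                best, best_dist = cand, d
    return best

x = np.array([1, 0, 1, 1, 0])   # source seen by the encoder
y = np.array([1, 0, 1, 0, 0])   # correlated side info at the decoder
print(decode(encode(x), y))     # recovers x from 3 sent bits, not 5
```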

  17. The Nursing Code of Ethics: Its Value, Its History.

    Science.gov (United States)

    Epstein, Beth; Turner, Martha

    2015-05-31

    To practice competently and with integrity, today's nurses must have in place several key elements that guide the profession, such as an accreditation process for education, a rigorous system for licensure and certification, and a relevant code of ethics. The American Nurses Association has guided and supported nursing practice through creation and implementation of a nationally accepted Code of Ethics for Nurses with Interpretive Statements. This article will discuss ethics in society, professions, and nursing and illustrate how a professional code of ethics can guide nursing practice in a variety of settings. We also offer a brief history of the Code of Ethics, discuss the modern Code of Ethics, and describe the importance of periodic revision, including the inclusive and thorough process used to develop the 2015 Code and a summary of recent changes. Finally, the article provides implications for practicing nurses to ensure that this document is a dynamic, useful resource in a variety of healthcare settings.

  18. Speech coding: code-excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio, and/or digital signal processing. It provides a clear connection between the whys, hows, and whats, thus enabling a clear view of the necessity, purpose, and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: What do we want to achieve, and especially, why is this goal important? Resource (Information): What information is available, and how can it be useful? Resource (Platform): What kind of platforms are we working with, and what are their capabilities and restrictions? This includes computational, memory, and acoustic properties, and the transmission capacity of devices used. The book goes on to address Solutions: Which solutions have been proposed, and how can they be used to reach the stated goals, and ...
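    As a taste of the technique in the title (a bare-bones sketch under our own simplifications: no gain term, no perceptual weighting, a purely random codebook; not the book's reference implementation), CELP's analysis-by-synthesis loop fits short-term LPC coefficients and then searches a codebook for the excitation whose synthesized output best matches the frame:

```python
import numpy as np

rng = np.random.default_rng(0)

def lpc(signal, order=8):
    """Estimate LPC coefficients by the autocorrelation method
    (solve the normal equations R a = r)."""
    r = np.correlate(signal, signal, 'full')[len(signal) - 1:]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

def synthesize(excitation, a):
    """All-pole LPC synthesis filter: s[n] = e[n] + sum_k a[k]*s[n-1-k]."""
    out = np.zeros(len(excitation))
    for n in range(len(excitation)):
        past = sum(a[k] * out[n - 1 - k]
                   for k in range(len(a)) if n - 1 - k >= 0)
        out[n] = excitation[n] + past
    return out

# Analysis-by-synthesis: try every codebook entry, keep the best match.
# (Real CELP adds gain, perceptual weighting, and an adaptive codebook.)
frame = np.sin(0.3 * np.arange(160)) + 0.05 * rng.standard_normal(160)
a = lpc(frame)
codebook = rng.standard_normal((64, 160))     # stochastic codebook
errors = [np.sum((frame - synthesize(c, a)) ** 2) for c in codebook]
best_index = int(np.argmin(errors))           # only this index is transmitted
print("codebook entry chosen:", best_index)
```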

  19. Classification of IRAS asteroids

    International Nuclear Information System (INIS)

    Tedesco, E.F.; Matson, D.L.; Veeder, G.J.

    1989-01-01

    Albedos and spectral reflectances are essential for classifying asteroids. For example, classes E, M and P are indistinguishable without albedo data. Colorimetric data are available for about 1000 asteroids but, prior to IRAS, albedo data were available for only about 200. IRAS broke this bottleneck by providing albedo data on nearly 2000 asteroids. Hence, apart from absolute magnitudes, albedo and size are now the most commonly known asteroid physical parameters. In this chapter the authors present the results of analyses of IRAS-derived asteroid albedos, discuss their application to asteroid classification, and mention several studies which might be done to exploit this data set further.
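    As a crude illustration of why albedo breaks the E/M/P degeneracy (the thresholds and example albedos below are approximate round numbers chosen for illustration, not the chapter's actual class boundaries):

```python
# Crude E/M/P discriminator sketch: the three classes have similar flat
# spectra, so geometric albedo does the separating. Thresholds are
# illustrative round numbers, not the taxonomy's exact boundaries.
def classify_xtype(albedo):
    if albedo > 0.30:
        return 'E'   # high albedo (enstatite-like)
    if albedo > 0.10:
        return 'M'   # moderate albedo (metal-like)
    return 'P'       # dark (primitive)

# Approximate published geometric albedos, for illustration only.
for name, pv in [('44 Nysa', 0.48), ('16 Psyche', 0.16), ('87 Sylvia', 0.04)]:
    print(name, classify_xtype(pv))
```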

  20. SPORT FOOD ADDITIVE CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    I. P. Prokopenko

    2015-01-01

    Correctly organized nutritional and pharmacological support is an important component of an athlete's preparation for competition, maintenance of optimal form, and fast recovery and rehabilitation after injury and fatigue. Special products of enhanced biological value (BAS) are used in athletes' nutrition for this purpose. They supply the athlete's organism with readily available energy sources, building materials, and biologically active substances that regulate and activate metabolic reactions which proceed with difficulty during certain physical loads. The article presents a classification of sport supplements that can be used before warm-up and training, after training, and during breaks between competitions.