WorldWideScience

Sample records for ih classification codes

  1. Blind Signal Classification via Sparse Coding

    Science.gov (United States)

    2016-04-10

    Blind Signal Classification via Sparse Coding Youngjune Gwon MIT Lincoln Laboratory gyj@ll.mit.edu Siamak Dastangoo MIT Lincoln Laboratory sia...achieve blind signal classification with no prior knowledge about signals (e.g., MCS, pulse shaping) in an arbitrary RF channel. Since modulated RF...classification method. Our results indicate that we can separate different classes of digitally modulated signals from blind sampling with 70.3% recall and 24.6

  2. Breathing (and Coding?) a Bit Easier: Changes to International Classification of Disease Coding for Pulmonary Hypertension.

    Science.gov (United States)

    Mathai, Stephen C; Mathew, Sherin

    2018-04-20

    The International Classification of Disease (ICD) coding system is broadly utilized by healthcare providers, hospitals, healthcare payers, and governments to track health trends and statistics at the global, national, and local levels and to provide a reimbursement framework for medical care based upon diagnosis and severity of illness. The current iteration of the ICD system, ICD-10, was implemented in 2015. While many changes to the prior ICD-9 system were included in the ICD-10 system, the newer revision failed to adequately reflect advances in the clinical classification of certain diseases such as pulmonary hypertension (PH). Recently, a proposal to modify the ICD-10 codes for PH was considered and ultimately adopted for inclusion as updates to the ICD-10 coding system. While these revisions better reflect the current clinical classification of PH, further changes should be considered in the future to improve the accuracy and ease of coding for all forms of PH. Copyright © 2018. Published by Elsevier Inc.

  3. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu; Ghanem, Bernard; Liu, Si; Xu, Changsheng; Ahuja, Narendra

    2013-01-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.

  4. Low-Rank Sparse Coding for Image Classification

    KAUST Repository

    Zhang, Tianzhu

    2013-12-01

    In this paper, we propose a low-rank sparse coding (LRSC) method that exploits local structure information among features in an image for the purpose of image-level classification. LRSC represents densely sampled SIFT descriptors, in a spatial neighborhood, collectively as low-rank, sparse linear combinations of code words. As such, it casts the feature coding problem as a low-rank matrix learning problem, which is different from previous methods that encode features independently. This LRSC has a number of attractive properties. (1) It encourages sparsity in feature codes, locality in codebook construction, and low-rankness for spatial consistency. (2) LRSC encodes local features jointly by considering their low-rank structure information, and is computationally attractive. We evaluate the LRSC by comparing its performance on a set of challenging benchmarks with that of 7 popular coding and other state-of-the-art methods. Our experiments show that by representing local features jointly, LRSC not only outperforms the state-of-the-art in classification accuracy but also improves the time complexity of methods that use a similar sparse linear representation model for feature coding.
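
The joint encoding step described in the two records above can be illustrated with a small numerical sketch. This is not the authors' optimizer: it approximates the LRSC objective (reconstruction over a codebook plus an l1 sparsity term and a nuclear-norm low-rank term) with a simple proximal-gradient loop, and the codebook, data, and parameter values are all synthetic placeholders.

```python
import numpy as np

def lrsc_encode(X, D, lam=0.1, gamma=0.1, step=None, iters=100):
    """Sketch of low-rank sparse coding: jointly encode the descriptor
    matrix X (d x n, one column per local feature in a neighborhood)
    over codebook D (d x k), encouraging the code matrix A (k x n) to be
    both sparse (l1 term) and low-rank (nuclear-norm term).  This is a
    simplified proximal scheme, not the paper's solver."""
    k = D.shape[1]
    A = np.zeros((k, X.shape[1]))
    if step is None:
        # step below 1/Lipschitz constant of the smooth term's gradient
        step = 1.0 / (np.linalg.norm(D, 2) ** 2 + 1e-12)
    for _ in range(iters):
        grad = D.T @ (D @ A - X)              # gradient of 0.5*||X - DA||^2
        A = A - step * grad
        A = np.sign(A) * np.maximum(np.abs(A) - step * gamma, 0.0)  # l1 prox
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        s = np.maximum(s - step * lam, 0.0)   # singular-value thresholding
        A = (U * s) @ Vt
    return A

rng = np.random.default_rng(0)
D = rng.normal(size=(16, 32))
D /= np.linalg.norm(D, axis=0)                # unit-norm code words
X = D[:, :3] @ rng.normal(size=(3, 8))        # 8 descriptors sharing 3 atoms
A = lrsc_encode(X, D)
print(np.linalg.matrix_rank(A, tol=1e-3), (np.abs(A) > 1e-3).mean())
```

Because all eight descriptors are encoded in one matrix problem, the neighborhood's shared structure shows up as a low-rank, sparse code, which is the paper's core contrast with coding each feature independently.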

  5. What Is IH (Intracranial Hypertension)?

    Science.gov (United States)

    What is IH? Intracranial hypertension literally means that the pressure of cerebrospinal fluid ( ... is too high. “Intracranial” means “within the skull.” “Hypertension” means “high fluid pressure.” To understand how this ...

  6. 49 CFR 173.52 - Classification codes and compatibility groups of explosives.

    Science.gov (United States)

    2010-10-01

    ... containing both an explosive substance and flammable liquid or gel J 1.1J1.2J 1.3J Article containing both an... classification codes for substances and articles described in the first column of table 1. Table 2 shows the... possible classification codes for explosives. Table 1—Classification Codes Description of substances or...

  7. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Directory of Open Access Journals (Sweden)

    Mustafa Basthikodi

    2016-04-01

    Full Text Available Performance growth of single-core processors came to a halt in the past decade, but was re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with Graphical Processing Units, have broadly enhanced parallelism. Several compilers have been updated to address the emerging challenges of synchronization and threading. Appropriate classification of programs and algorithms can greatly help software engineers identify opportunities for effective parallelization. In the present work we investigated current algorithmic species for the classification of algorithms; related work on classification is discussed along with a comparison of the issues that challenge classification. A set of algorithms was chosen that matches the structure of different issues and performs a given task. We tested these algorithms using an existing automatic species extraction tool along with the Bones compiler. We added functionality to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants and mathematical functions. With this, we can retain significant data which is not captured by the original species of algorithms. We implemented these new capabilities in the tool, enabling automatic characterization of program code.

  8. On the classification of long non-coding RNAs

    KAUST Repository

    Ma, Lina

    2013-06-01

    Long non-coding RNAs (lncRNAs) have been found to perform various functions in a wide variety of important biological processes. To make easier interpretation of lncRNA functionality and conduct deep mining on these transcribed sequences, it is convenient to classify lncRNAs into different groups. Here, we summarize classification methods of lncRNAs according to their four major features, namely, genomic location and context, effect exerted on DNA sequences, mechanism of functioning and their targeting mechanism. In combination with the presently available function annotations, we explore potential relationships between different classification categories, and generalize and compare biological features of different lncRNAs within each category. Finally, we present our view on potential further studies. We believe that the classifications of lncRNAs as indicated above are of fundamental importance for lncRNA studies, helpful for further investigation of specific lncRNAs, for formulation of new hypothesis based on different features of lncRNA and for exploration of the underlying lncRNA functional mechanisms. © 2013 Landes Bioscience.
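
Of the four feature axes summarized above, genomic location and context is the most directly computable. A minimal sketch, assuming simplified single-chromosome (start, end, strand) coordinates and an illustrative 1 kb window for bidirectional promoters; the category names follow common usage, but the thresholds and interface are invented:

```python
def classify_lncrna(lnc, gene):
    """Classify a lncRNA by genomic location relative to the nearest
    protein-coding gene.  Inputs are (start, end, strand) tuples assumed
    to lie on the same chromosome; the 1 kb promoter window is illustrative."""
    (ls, le, lstr), (gs, ge, gstr) = lnc, gene
    if le < gs or ls > ge:                       # no overlap with the gene
        # divergent transcription from a shared promoter, opposite strands
        if lstr != gstr and min(abs(gs - le), abs(ls - ge)) < 1000:
            return "bidirectional"
        return "intergenic (lincRNA)"
    if ls >= gs and le <= ge and lstr == gstr:
        return "intronic/sense"                  # contained, same strand
    if lstr != gstr:
        return "antisense"
    return "sense-overlapping"

print(classify_lncrna((5000, 6000, "+"), (10000, 20000, "+")))
```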

  9. Effects Of Field Distortions In Ih-apf Linac

    CERN Document Server

    Kapin, Valery; Yamada, S

    2004-01-01

    The project on developing compact medical accelerators for tumor therapy using carbon ions has been started at the National Institute of Radiological Sciences (NIRS). An alternating-phase-focused (APF) linac using an interdigital H-mode (IH) cavity has been proposed for the injector linac. The IH-cavity is a doubly ridged circular resonator loaded by the drift-tubes mounted on ridges with supporting stems. The effects of intrinsic and random field distortions in a practical design of the 4-MeV/u 200 MHz IH-APF linac are considered. The intrinsic field distortions in the IH-cavity are caused by the asymmetry of the gap field due to the presence of the drift-tube supporting stems and the pair of ridges. The random field distortions are caused by drift-tube misalignments and non-regular deviations of the voltage distribution from the programmed law. The RF fields in the IH-cavity have been calculated using the Microwave Studio (MWS) code. The effects of field distortions on beam dynamics have been simulated numerically.

  10. A systematic literature review of automated clinical coding and classification systems.

    Science.gov (United States)

    Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.

  11. 78 FR 21612 - Medical Device Classification Product Codes; Guidance for Industry and Food and Drug...

    Science.gov (United States)

    2013-04-11

    ... driving force for CDRH's internal organizational structure as well. These Panels were established with the... guidance represents the Agency's current thinking on medical device classification product codes. It does...

  12. Diffuse scattering in Ih ice

    International Nuclear Information System (INIS)

    Wehinger, Björn; Krisch, Michael; Bosak, Alexeï; Chernyshov, Dmitry; Bulat, Sergey; Ezhov, Victor

    2014-01-01

    Single crystals of ice Ih, extracted from the subglacial Lake Vostok accretion ice layer (3621 m depth) were investigated by means of diffuse x-ray scattering and inelastic x-ray scattering. The diffuse scattering was identified as mainly inelastic and rationalized in the frame of ab initio calculations for the ordered ice XI approximant. Together with Monte-Carlo modelling, our data allowed reconsidering previously available neutron diffuse scattering data of heavy ice as the sum of thermal diffuse scattering and static disorder contribution. (paper)

  13. On the classification of long non-coding RNAs

    KAUST Repository

    Ma, Lina; Bajic, Vladimir B.; Zhang, Zhang

    2013-01-01

    Long non-coding RNAs (lncRNAs) have been found to perform various functions in a wide variety of important biological processes. To make easier interpretation of lncRNA functionality and conduct deep mining on these transcribed sequences

  14. Classification of working processes to facilitate occupational hazard coding on industrial trawlers

    DEFF Research Database (Denmark)

    Jensen, Olaf C; Stage, Søren; Noer, Preben

    2003-01-01

    BACKGROUND: Commercial fishing is an extremely dangerous economic activity. In order to more accurately describe the risks involved, a specific injury coding based on the working process was developed. METHOD: Observation on six different types of vessels was conducted and allowed a description and a classification of the principal working processes on all kinds of vessels and a detailed classification for industrial trawlers. In industrial trawling, fish are landed for processing purposes, for example, for the production of fish oil and fish meal. The classification was subsequently used to code the injuries reported to the Danish Maritime Authority over a 5-year period. RESULTS: On industrial trawlers, 374 of 394 (95%) injuries were captured by the classification. Setting out and hauling in the gear and nets were the processes with the most injuries and accounted for 58.9% of all injuries...

  15. Improving the coding and classification of ambulance data through the application of International Classification of Disease 10th revision.

    Science.gov (United States)

    Cantwell, Kate; Morgans, Amee; Smith, Karen; Livingston, Michael; Dietze, Paul

    2014-02-01

    This paper aims to examine whether an adaptation of the International Classification of Disease (ICD) coding system can be applied retrospectively to final paramedic assessment data in an ambulance dataset with a view to developing more fine-grained, clinically relevant case definitions than are available through point-of-call data. Over 1.2 million case records were extracted from the Ambulance Victoria data warehouse. Data fields included dispatch code, cause (CN) and final primary assessment (FPA). Each FPA was converted to an ICD-10-AM code using word matching or best fit. ICD-10-AM codes were then converted into Major Diagnostic Categories (MDC). CN was aligned with the ICD-10-AM codes for external cause of morbidity and mortality. The most accurate results were obtained when ICD-10-AM codes were assigned using information from both FPA and CN. Comparison of cases coded as unconscious at point-of-call with the associated paramedic assessment highlighted the extra clinical detail obtained when paramedic assessment data are used. Ambulance paramedic assessment data can be aligned with ICD-10-AM and MDC with relative ease, allowing retrospective coding of large datasets. Coding of ambulance data using ICD-10-AM allows for comparison of not only ambulance service users but also with other population groups. WHAT IS KNOWN ABOUT THE TOPIC? There is no reliable and standard coding and categorising system for paramedic assessment data contained in ambulance service databases. WHAT DOES THIS PAPER ADD? This study demonstrates that ambulance paramedic assessment data can be aligned with ICD-10-AM and MDC with relative ease, allowing retrospective coding of large datasets. Representation of ambulance case types using ICD-10-AM-coded information obtained after paramedic assessment is more fine grained and clinically relevant than point-of-call data, which uses caller information before ambulance attendance. WHAT ARE THE IMPLICATIONS FOR PRACTITIONERS? This paper describes ...
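
The word-matching step can be pictured with a toy mapper. The keyword table below is invented for illustration (the study matched full FPA text against the ICD-10-AM index), and the chapter-letter rollup is a crude stand-in for MDC grouping; the ICD-10 codes shown are standard chapter examples:

```python
# Hypothetical mini-mapping in the spirit of the study: final paramedic
# assessments (FPA) are matched to ICD-10 codes by keyword, then rolled
# up to a coarse diagnostic category.  The keyword table is invented.
FPA_TO_ICD = {
    "chest pain": "R07.4",     # chest pain, unspecified
    "stroke": "I64",           # stroke, not specified
    "overdose": "T50.9",       # poisoning, other/unspecified drugs
    "unconscious": "R40.2",    # coma, unspecified
}
ICD_TO_MDC = {"R": "Symptoms/signs", "I": "Circulatory system",
              "T": "Injury/poisoning"}

def code_case(fpa_text):
    """Best-fit word matching: first keyword found wins; unmatched -> None."""
    text = fpa_text.lower()
    for keyword, icd in FPA_TO_ICD.items():
        if keyword in text:
            return icd, ICD_TO_MDC[icd[0]]
    return None, None

print(code_case("Pt found unconscious, GCS 6"))
```

A real implementation would also consult the cause (CN) field, which the paper found improved accuracy over FPA alone.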

  16. Local coding based matching kernel method for image classification.

    Directory of Open Access Journals (Sweden)

    Yan Song

    Full Text Available This paper mainly focuses on how to effectively and efficiently measure visual similarity for local feature based representation. Among existing methods, metrics based on Bag of Visual Word (BoV) techniques are efficient and conceptually simple, at the expense of effectiveness. By contrast, kernel based metrics are more effective, but at the cost of greater computational complexity and increased storage requirements. We show that a unified visual matching framework can be developed to encompass both BoV and kernel based metrics, in which a local kernel plays an important role between feature pairs or between features and their reconstruction. Generally, local kernels are defined using Euclidean distance or its derivatives, based either explicitly or implicitly on an assumption of Gaussian noise. However, local features such as SIFT and HoG often follow a heavy-tailed distribution which tends to undermine the motivation behind Euclidean metrics. Motivated by recent advances in feature coding techniques, a novel efficient local coding based matching kernel (LCMK) method is proposed. This exploits the manifold structures in Hilbert space derived from local kernels. The proposed method combines the advantages of both BoV and kernel based metrics, and achieves a linear computational complexity. This enables efficient and scalable visual matching to be performed on large scale image sets. To evaluate the effectiveness of the proposed LCMK method, we conduct extensive experiments with widely used benchmark datasets, including 15-Scenes, Caltech101/256, PASCAL VOC 2007 and 2011 datasets. Experimental results confirm the effectiveness of the relatively efficient LCMK method.
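
The claimed linear complexity comes from replacing explicit feature-pair kernels with a kernel between coded and pooled features. The sketch below uses plain soft-assignment coding and mean pooling as simplified stand-ins for the paper's local coding and matching steps; the parameter beta and all data are illustrative:

```python
import numpy as np

def encode(features, codebook, beta=10.0):
    """Soft-assignment coding: each local feature (row) gets a code over
    the codebook; a simplified stand-in for the paper's local coding step."""
    # squared distances between features (n x d) and code words (k x d)
    d2 = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    w = np.exp(-beta * d2)
    return w / w.sum(axis=1, keepdims=True)

def lcmk_similarity(feats_a, feats_b, codebook):
    """Image-to-image similarity as a linear kernel between mean-pooled
    codes -- linear in the number of local features, unlike an explicit
    sum over all feature pairs."""
    za = encode(feats_a, codebook).mean(axis=0)
    zb = encode(feats_b, codebook).mean(axis=0)
    return float(za @ zb)

rng = np.random.default_rng(1)
codebook = rng.normal(size=(8, 2))
a = rng.normal(size=(20, 2))                # local features of one image
near = a + 0.01 * rng.normal(size=a.shape)  # near-duplicate image
far = rng.normal(loc=3.0, size=(20, 2))     # different scene
print(lcmk_similarity(a, near, codebook), lcmk_similarity(a, far, codebook))
```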

  17. Conceptual-driven classification for coding advise in health insurance reimbursement.

    Science.gov (United States)

    Li, Sheng-Tun; Chen, Chih-Chuan; Huang, Fernando

    2011-01-01

    With the non-stop increases in medical treatment fees, the economic survival of a hospital in Taiwan relies on the reimbursements received from the Bureau of National Health Insurance, which in turn depend on the accuracy and completeness of the content of the discharge summaries as well as the correctness of their International Classification of Diseases (ICD) codes. The purpose of this research is to reinforce the entire disease classification framework by supporting disease classification specialists in the coding process. This study developed an ICD code advisory system (ICD-AS) that performed knowledge discovery from discharge summaries and suggested ICD codes. Natural language processing and information retrieval techniques based on Zipf's Law were applied to process the content of discharge summaries, and fuzzy formal concept analysis was used to analyze and represent the relationships between the medical terms identified by MeSH. In addition, a certainty factor used as reference during the coding process was calculated to account for uncertainty and strengthen the credibility of the outcome. Two sets of 360 and 2579 textual discharge summaries of patients suffering from cerebrovascular disease were processed to build up ICD-AS and to evaluate the prediction performance. A number of experiments were conducted to investigate the impact of system parameters on accuracy and compare the proposed model to traditional classification techniques including linear-kernel support vector machines. The comparison results showed that the proposed system achieves better overall performance in terms of several measures. In addition, some useful implication rules were obtained, which improve comprehension of the field of cerebrovascular disease and give insights into the relationships between relevant medical terms. Our system contributes valuable guidance to disease classification specialists in the process of coding discharge summaries, which consequently brings benefits in ...
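
The Zipf's Law preprocessing and the certainty factor can be caricatured in a few lines. Both functions below are drastic simplifications of the paper's machinery (the real system works with MeSH terms and fuzzy formal concept analysis), and the toy summary and cutoffs are invented:

```python
from collections import Counter

def zipf_filter(tokens, upper=0.2, lower_count=2):
    """Zipf's-law-motivated filtering (a simplification of the paper's
    preprocessing): drop the most frequent tokens (function words and
    punctuation) and hapax-like rare ones, keeping the informative band."""
    counts = Counter(tokens)
    ranked = [t for t, _ in counts.most_common()]
    top = set(ranked[:max(1, int(upper * len(ranked)))])
    return [t for t in set(tokens) if t not in top and counts[t] >= lower_count]

def certainty_factor(matched, expected):
    """Toy certainty factor: fraction of an ICD concept's expected terms
    found in the summary (the paper's CF computation is more involved)."""
    return len(set(matched) & set(expected)) / len(set(expected))

summary = ("patient admitted with cerebral infarction . infarction confirmed "
           "by ct . the patient has hypertension . the the the").split()
terms = zipf_filter(summary)
print(sorted(terms))
print(certainty_factor(terms, ["infarction", "cerebral"]))
```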

  18. The IHS and the Role of Research Institutes (Das IHS und die Rolle der Forschungsinstitute)

    OpenAIRE

    Keuschnigg, Christian

    2014-01-01

    The Institut für Höhere Studien und wissenschaftliche Forschung (IHS, Institute for Advanced Studies) is an independent research institute that delivers research and education at the highest level for politics, business, and society. Among universities and applied research institutes, the IHS is unique in combining, under one roof, basic research and research-oriented teaching with applied research for scientific policy advice. With its three disciplines of economics, sociolo...

  19. Average Likelihood Methods of Classification of Code Division Multiple Access (CDMA)

    Science.gov (United States)

    2016-05-01

    Subject to code matrices that follow the structure given by (113):

    \[ \begin{bmatrix} \vec{y}_R \\ \vec{y}_I \end{bmatrix} = \sqrt{\frac{E_s}{2L}} \begin{bmatrix} G_{R1} & -G_{I1} \\ G_{I2} & G_{R2} \end{bmatrix} \begin{bmatrix} Q_R & -Q_I \\ Q_I & Q_R \end{bmatrix} \begin{bmatrix} \vec{b}_R \\ \vec{b}_I \end{bmatrix} + \begin{bmatrix} \vec{n}_R \\ \vec{n}_I \end{bmatrix} \]

    with a companion form in the \((\vec{b}_+, \vec{b}_-)\) basis (115). The average likelihood for type 4 CDMA (116) is a special case of type 1 CDMA with twice the code length and ... AVERAGE LIKELIHOOD METHODS OF CLASSIFICATION OF CODE DIVISION MULTIPLE ACCESS (CDMA), MAY 2016, FINAL TECHNICAL REPORT, APPROVED FOR PUBLIC RELEASE

  20. Prior-to-Secondary School Course Classification System: School Codes for the Exchange of Data (SCED). NFES 2011-801

    Science.gov (United States)

    National Forum on Education Statistics, 2011

    2011-01-01

    In this handbook, "Prior-to-Secondary School Course Classification System: School Codes for the Exchange of Data" (SCED), the National Center for Education Statistics (NCES) and the National Forum on Education Statistics have extended the existing secondary course classification system with codes and descriptions for courses offered at…

  1. Fast Binary Coding for the Scene Classification of High-Resolution Remote Sensing Imagery

    Directory of Open Access Journals (Sweden)

    Fan Hu

    2016-06-01

    Full Text Available Scene classification of high-resolution remote sensing (HRRS imagery is an important task in the intelligent processing of remote sensing images and has attracted much attention in recent years. Although the existing scene classification methods, e.g., the bag-of-words (BOW model and its variants, can achieve acceptable performance, these approaches strongly rely on the extraction of local features and the complicated coding strategy, which are usually time consuming and demand much expert effort. In this paper, we propose a fast binary coding (FBC method, to effectively generate efficient discriminative scene representations of HRRS images. The main idea is inspired by the unsupervised feature learning technique and the binary feature descriptions. More precisely, equipped with the unsupervised feature learning technique, we first learn a set of optimal “filters” from large quantities of randomly-sampled image patches and then obtain feature maps by convolving the image scene with the learned filters. After binarizing the feature maps, we perform a simple hashing step to convert the binary-valued feature map to the integer-valued feature map. Finally, statistical histograms computed on the integer-valued feature map are used as global feature representations of the scenes of HRRS images, similar to the conventional BOW model. The analysis of the algorithm complexity and experiments on HRRS image datasets demonstrate that, in contrast with existing scene classification approaches, the proposed FBC has much faster computational speed and achieves comparable classification performance. In addition, we also propose two extensions to FBC, i.e., the spatial co-occurrence matrix and different visual saliency maps, for further improving its final classification accuracy.
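
The FBC pipeline described above (convolve with learned filters, binarize, hash the binary maps to an integer map, histogram) is simple enough to sketch end-to-end. Here random filters stand in for the unsupervised-learned ones, and the naive convolution loop is for clarity, not speed:

```python
import numpy as np

def fbc_features(image, filters, bins=256):
    """Fast-binary-coding sketch: convolve with filters, binarize the
    feature maps at zero, hash the k binary maps into one integer map,
    then return its normalized histogram as the global scene descriptor.
    Filters here are random surrogates for the learned ones."""
    k, fh, fw = filters.shape
    H, W = image.shape
    maps = np.empty((k, H - fh + 1, W - fw + 1))
    for i, f in enumerate(filters):           # 'valid' convolution
        for y in range(H - fh + 1):
            for x in range(W - fw + 1):
                maps[i, y, x] = (image[y:y + fh, x:x + fw] * f).sum()
    binary = (maps > 0).astype(np.uint8)      # binarize feature maps
    weights = (2 ** np.arange(k)).reshape(k, 1, 1)
    integer_map = (binary * weights).sum(axis=0)   # hashing step
    hist, _ = np.histogram(integer_map, bins=bins, range=(0, 2 ** k))
    return hist / hist.sum()                  # normalized global descriptor

rng = np.random.default_rng(0)
filters = rng.normal(size=(8, 5, 5))          # 8 filters -> 256 hash values
scene = rng.normal(size=(32, 32))             # toy 'HRRS image'
h = fbc_features(scene, filters)
print(h.shape, h.sum())
```

With eight binary maps the hash takes at most 2^8 = 256 values, so the histogram plays the role the BOW histogram plays in the conventional pipeline.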

  2. LAMOST OBSERVATIONS IN THE KEPLER FIELD: SPECTRAL CLASSIFICATION WITH THE MKCLASS CODE

    Energy Technology Data Exchange (ETDEWEB)

    Gray, R. O. [Department of Physics and Astronomy, Appalachian State University, Boone, NC 28608 (United States); Corbally, C. J. [Vatican Observatory Research Group, Steward Observatory, Tucson, AZ 85721-0065 (United States); Cat, P. De [Royal Observatory of Belgium, Ringlaan 3, B-1180 Brussel (Belgium); Fu, J. N.; Ren, A. B. [Department of Astronomy, Beijing Normal University, 19 Avenue Xinjiekouwai, Beijing 100875 (China); Shi, J. R.; Luo, A. L.; Zhang, H. T.; Wu, Y.; Cao, Z.; Li, G. [Key Laboratory for Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China); Zhang, Y.; Hou, Y.; Wang, Y. [Nanjing Institute of Astronomical Optics and Technology, National Astronomical Observatories, Chinese Academy of Sciences, Nanjing 210042 (China)

    2016-01-15

    The LAMOST-Kepler project was designed to obtain high-quality, low-resolution spectra of many of the stars in the Kepler field with the Large Sky Area Multi Object Fiber Spectroscopic Telescope (LAMOST) spectroscopic telescope. To date 101,086 spectra of 80,447 objects over the entire Kepler field have been acquired. Physical parameters, radial velocities, and rotational velocities of these stars will be reported in other papers. In this paper we present MK spectral classifications for these spectra determined with the automatic classification code MKCLASS. We discuss the quality and reliability of the spectral types and present histograms showing the frequency of the spectral types in the main table organized according to luminosity class. Finally, as examples of the use of this spectral database, we compute the proportion of A-type stars that are Am stars, and identify 32 new barium dwarf candidates.

  3. Effects of Field Distortions in IH-APF Linac for a Compact Medical Accelerator

    CERN Document Server

    Kapin, Valery; Yamada, Satoru

    2004-01-01

    The project on developing compact medical accelerators for tumor therapy using carbon ions has been started at the National Institute of Radiological Sciences (NIRS). An alternating-phase-focused (APF) linac using an interdigital H-mode (IH) cavity has been proposed for the injector linac. The IH-cavity is a doubly ridged circular resonator loaded by the drift-tubes mounted on ridges with supporting stems. The effects of intrinsic and random field distortions in a practical design of the 4-MeV/u 200-MHz IH-APF linac are considered. The intrinsic field distortions in the IH-cavity are caused by an asymmetry of the gap fields due to the presence of the stems and pair of ridges. The random field distortions are caused by drift-tube misalignments and non-regular deviations of the gap voltages from programmed values. The RF fields in the IH-cavity have been calculated using the Microwave Studio (MWS) code. The effects of field distortions on beam dynamics have been simulated numerically. The intrinsic field distortions a...

  4. High dendritic expression of Ih in the proximity of the axon origin controls the integrative properties of nigral dopamine neurons.

    Science.gov (United States)

    Engel, Dominique; Seutin, Vincent

    2015-11-15

    are unknown. Using cell-attached patch-clamp recordings, we find a higher Ih current density in the axon-bearing dendrite than in the soma or in dendrites without axon in nigral dopamine neurons. Ih is mainly concentrated in the dendritic membrane area surrounding the axon origin and decreases with increasing distances from this site. Single EPSPs and temporal summation are similarly affected by blockade of Ih in axon- and non-axon-bearing dendrites. The presence of Ih close to the axon is pivotal to control the integrative functions and the output signal of dopamine neurons and may consequently influence the downstream coding of movement. © 2015 The Authors. The Journal of Physiology © 2015 The Physiological Society.

  5. New Site Coefficients and Site Classification System Used in Recent Building Seismic Code Provisions

    Science.gov (United States)

    Dobry, R.; Borcherdt, R.D.; Crouse, C.B.; Idriss, I.M.; Joyner, W.B.; Martin, G.R.; Power, M.S.; Rinne, E.E.; Seed, R.B.

    2000-01-01

    Recent code provisions for buildings and other structures (1994 and 1997 NEHRP Provisions, 1997 UBC) have adopted new site amplification factors and a new procedure for site classification. Two amplitude-dependent site amplification factors are specified: Fa for short periods and Fv for longer periods. Previous codes included only a long period factor S and did not provide for a short period amplification factor. The new site classification system is based on definitions of five site classes in terms of a representative average shear wave velocity to a depth of 30 m (V̄s). This definition permits sites to be classified unambiguously. When the shear wave velocity is not available, other soil properties such as standard penetration resistance or undrained shear strength can be used. The new site classes, denoted by letters A - E, replace the site classes in previous codes denoted by S1 - S4. Site Classes A and B correspond to hard rock and rock, Site Class C corresponds to soft rock and very stiff / very dense soil, and Site Classes D and E correspond to stiff soil and soft soil. A sixth site class, F, is defined for soils requiring site-specific evaluations. Both Fa and Fv are functions of the site class, and also of the level of seismic hazard on rock, defined by parameters such as Aa and Av (1994 NEHRP Provisions), Ss and S1 (1997 NEHRP Provisions) or Z (1997 UBC). The values of Fa and Fv decrease as the seismic hazard on rock increases due to soil nonlinearity. The greatest impact of the new factors Fa and Fv as compared with the old S factors occurs in areas of low-to-medium seismic hazard. This paper summarizes the new site provisions, explains the basis for them, and discusses ongoing studies of site amplification in recent earthquakes that may influence future code developments.
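
The V̄s-based classification is mechanical once the class boundaries are fixed. A sketch using the standard 1997 NEHRP / 1997 UBC velocity boundaries (1500, 760, 360, and 180 m/s); Class F cannot be assigned from the velocity alone:

```python
def nehrp_site_class(vs30_m_s):
    """Site class from the average shear-wave velocity in the top 30 m,
    per the 1997 NEHRP Provisions / 1997 UBC boundaries.  Class F
    (site-specific evaluation required) is not decidable from Vs30."""
    if vs30_m_s > 1500: return "A"   # hard rock
    if vs30_m_s > 760:  return "B"   # rock
    if vs30_m_s > 360:  return "C"   # very dense soil / soft rock
    if vs30_m_s > 180:  return "D"   # stiff soil
    return "E"                       # soft soil

print(nehrp_site_class(1000), nehrp_site_class(270))
```

When the velocity is unavailable, the provisions allow classification from standard penetration resistance or undrained shear strength instead; that branch is omitted here.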

  6. A physiologically-inspired model of numerical classification based on graded stimulus coding

    Directory of Open Access Journals (Sweden)

    John Pearson

    2010-01-01

    Full Text Available In most natural decision contexts, the process of selecting among competing actions takes place in the presence of informative, but potentially ambiguous, stimuli. Decisions about magnitudes—quantities like time, length, and brightness that are linearly ordered—constitute an important subclass of such decisions. It has long been known that perceptual judgments about such quantities obey Weber’s Law, wherein the just-noticeable difference in a magnitude is proportional to the magnitude itself. Current physiologically inspired models of numerical classification assume discriminations are made via a labeled line code of neurons selectively tuned for numerosity, a pattern observed in the firing rates of neurons in the ventral intraparietal area (VIP of the macaque. By contrast, neurons in the contiguous lateral intraparietal area (LIP signal numerosity in a graded fashion, suggesting the possibility that numerical classification could be achieved in the absence of neurons tuned for number. Here, we consider the performance of a decision model based on this analog coding scheme in a paradigmatic discrimination task—numerosity bisection. We demonstrate that a basic two-neuron classifier model, derived from experimentally measured monotonic responses of LIP neurons, is sufficient to reproduce the numerosity bisection behavior of monkeys, and that the threshold of the classifier can be set by reward maximization via a simple learning rule. In addition, our model predicts deviations from Weber Law scaling of choice behavior at high numerosity. Together, these results suggest both a generic neuronal framework for magnitude-based decisions and a role for reward contingency in the classification of such stimuli.
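
The two-neuron scheme can be sketched with a pair of monotonic (logistic-in-log-numerosity) units and a reward-driven threshold. The tuning curves, anchor values, starting bias, and learning rule below are invented stand-ins for the fitted LIP responses and the paper's reward-maximization rule:

```python
import math, random

def rate_up(n):
    """Monotonically increasing 'LIP-like' unit: logistic in log n,
    with an assumed midpoint at the bisection boundary n = 4."""
    return 1.0 / (1.0 + math.exp(-(math.log(n) - math.log(4))))

def rate_down(n):
    """Mirror-image monotonically decreasing unit."""
    return 1.0 - rate_up(n)

def train(trials=5000, lr=0.02, seed=0):
    """Reward-driven threshold learning for bisection with anchors 2 and 8:
    choosing 'large' is rewarded for n above the geometric mean sqrt(2*8)=4."""
    rng = random.Random(seed)
    theta = 0.3                               # deliberately biased start
    for _ in range(trials):
        n = rng.choice([2, 3, 5, 6, 8])
        evidence = rate_up(n) - rate_down(n)  # graded, not number-tuned
        choose_large = evidence > theta
        correct = (n > 4) == choose_large
        if not correct:                       # simple error-driven update
            theta += lr if choose_large else -lr
    return theta

theta = train()
decide = lambda n: rate_up(n) - rate_down(n) > theta
print(theta, [decide(n) for n in (2, 3, 5, 6, 8)])
```

No unit in the sketch is tuned to a particular numerosity; classification falls out of two graded signals and a learned criterion, which is the paper's central point.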

  7. Defining pediatric traumatic brain injury using International Classification of Diseases Version 10 Codes: a systematic review.

    Science.gov (United States)

    Chan, Vincy; Thurairajah, Pravheen; Colantonio, Angela

    2015-02-04

    Although healthcare administrative data are commonly used for traumatic brain injury (TBI) research, there is currently no consensus or consistency on the International Classification of Diseases Version 10 (ICD-10) codes used to define TBI among children and youth internationally. This study systematically reviewed the literature to explore the range of ICD-10 codes that are used to define TBI in this population. The identification of the range of ICD-10 codes to define this population in administrative data is crucial, as it has implications for policy, resource allocation, planning of healthcare services, and prevention strategies. The databases MEDLINE, MEDLINE In-Process, Embase, PsychINFO, CINAHL, SPORTDiscus, and Cochrane Database of Systematic Reviews were systematically searched. Grey literature was searched using Grey Matters and Google. Reference lists of included articles were also searched for relevant studies. Two reviewers independently screened all titles and abstracts using pre-defined inclusion and exclusion criteria. A full text screen was conducted on articles that met the first screen inclusion criteria. All full text articles that met the pre-defined inclusion criteria were included for analysis in this systematic review. A total of 1,326 publications were identified through the predetermined search strategy and 32 articles/reports met all eligibility criteria for inclusion in this review. Five articles specifically examined children and youth aged 19 years or under with TBI. ICD-10 case definitions ranged from the broad injuries to the head codes (ICD-10 S00 to S09) to concussion only (S06.0). There was overwhelming consensus on the inclusion of ICD-10 code S06, intracranial injury, while codes S00 (superficial injury of the head), S03 (dislocation, sprain, and strain of joints and ligaments of head), and S05 (injury of eye and orbit) were only used by articles that examined head injury, none of which specifically examined children and
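
Membership of a code in the competing case definitions reduces to ICD-10 prefix tests. A small helper contrasting the consensus (S06), broad (S00-S09), and concussion-only (S06.0) definitions found in the review; the function name and interface are invented:

```python
# The review found case definitions ranging from the broad S00-S09 block
# to S06.0 alone, with consensus on S06 (intracranial injury).
BROAD = tuple(f"S0{i}" for i in range(10))       # S00 ... S09

def in_tbi_definition(icd10, definition="consensus"):
    """Test an ICD-10 code against one of the case definitions by string
    prefix (dots stripped); illustrative only, not a clinical tool."""
    code = icd10.upper().replace(".", "")
    if definition == "consensus":
        return code.startswith("S06")            # intracranial injury
    if definition == "broad":
        return code.startswith(BROAD)            # any injury-to-head code
    if definition == "concussion_only":
        return code.startswith("S060")
    raise ValueError(definition)

print(in_tbi_definition("S06.5"), in_tbi_definition("S00.1"),
      in_tbi_definition("S00.1", "broad"))
```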

  8. ColorPhylo: A Color Code to Accurately Display Taxonomic Classifications.

    Science.gov (United States)

    Lespinats, Sylvain; Fertil, Bernard

    2011-01-01

Color may be very useful to visualise complex data. As far as taxonomy is concerned, color may help observing various species' characteristics in correlation with classification. However, choosing the number of subclasses to display is often a complex task: on the one hand, assigning a limited number of colors to taxa of interest hides the structure embedded in the subtrees of the taxonomy; on the other hand, differentiating a high number of taxa by giving them specific colors, without considering the underlying taxonomy, may lead to unreadable results, since relationships between displayed taxa would not be supported by the color code. In the present paper, an automatic color coding scheme is proposed to visualise the levels of taxonomic relationships displayed as an overlay on any kind of data plot. To achieve this goal, a dimensionality reduction method allows displaying taxonomic "distances" onto a Euclidean two-dimensional space. The resulting map is projected onto a 2D color space (the Hue, Saturation, Brightness colorimetric space with brightness set to 1). Proximity in the taxonomic classification corresponds to proximity on the map and is therefore materialised by color proximity. As a result, each species is related to a color code showing its position in the taxonomic tree. The so-called ColorPhylo displays taxonomic relationships intuitively and can be combined with any biological result. A Matlab version of ColorPhylo is available at http://sy.lespi.free.fr/ColorPhylo-homepage.html. Meanwhile, an ad-hoc distance in case of taxonomy with unknown edge lengths is proposed.
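The projection step described above — a 2D map position mapped to hue and saturation with brightness fixed at 1 — can be illustrated with a small sketch. The centring and normalisation choices here are assumptions for illustration, not ColorPhylo's exact scheme:

```python
import colorsys
import math

def map_to_color(x, y, xs, ys):
    """Map a 2D point to (r, g, b): angle -> hue, radius -> saturation,
    brightness fixed at 1, as on an HSB color wheel. Centring on the mean
    and normalising by the data's extent are illustrative choices."""
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)          # centre of the map
    dx, dy = x - cx, y - cy
    hue = (math.atan2(dy, dx) / (2 * math.pi)) % 1.0       # angle -> hue in [0, 1)
    rmax = max(math.hypot(px - cx, py - cy) for px, py in zip(xs, ys)) or 1.0
    sat = min(math.hypot(dx, dy) / rmax, 1.0)              # radius -> saturation
    return colorsys.hsv_to_rgb(hue, sat, 1.0)              # brightness = 1

# Nearby map positions get nearby colors; the centre is unsaturated (white).
xs, ys = [1.0, -1.0, 0.0, 0.0], [0.0, 0.0, 1.0, -1.0]
print(map_to_color(0.0, 0.0, xs, ys))   # (1.0, 1.0, 1.0) -- white at the centre
print(map_to_color(1.0, 0.0, xs, ys))   # (1.0, 0.0, 0.0) -- saturated red
```

Because color proximity mirrors map proximity, species close in the reduced taxonomic space receive similar colors, which is the property the paper relies on.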

  9. Ontological function annotation of long non-coding RNAs through hierarchical multi-label classification.

    Science.gov (United States)

    Zhang, Jingpu; Zhang, Zuping; Wang, Zixiang; Liu, Yuting; Deng, Lei

    2018-05-15

Long non-coding RNAs (lncRNAs) are an enormous collection of functional non-coding RNAs. Over the past decades, a large number of novel lncRNA genes have been identified. However, most lncRNAs remain functionally uncharacterized at present. Computational approaches provide new insight into the potential functional implications of lncRNAs. Considering that each lncRNA may have multiple functions and a function may be further specialized into sub-functions, here we describe NeuraNetL2GO, a computational ontological function prediction approach for lncRNAs using a hierarchical multi-label classification strategy based on multiple neural networks. The neural networks are incrementally trained level by level, each performing the prediction of gene ontology (GO) terms belonging to a given level. In NeuraNetL2GO, we use topological features of the lncRNA similarity network as the input of the neural networks and employ the output results to annotate the lncRNAs. We show that NeuraNetL2GO achieves the best performance, with an overall advantage in maximum F-measure and coverage, on the manually annotated lncRNA2GO-55 dataset compared to other state-of-the-art methods. The source code and data are available at http://denglab.org/NeuraNetL2GO/. Contact: leideng@csu.edu.cn. Supplementary data are available at Bioinformatics online.

  10. Total Hip Arthroplasty in Mucopolysaccharidosis Type IH

    Directory of Open Access Journals (Sweden)

    S. O'hEireamhoin

    2011-01-01

Full Text Available Children affected by mucopolysaccharidosis (MPS) type IH (Hurler syndrome), an autosomal recessive metabolic disorder, are known to experience a range of musculoskeletal manifestations including spinal abnormalities, hand abnormalities, generalised joint stiffness, genu valgum, hip dysplasia, and avascular necrosis. Enzyme therapy, in the form of bone marrow transplantation, significantly increases life expectancy but does not prevent the development of the associated musculoskeletal disorders. We present the case of a 23-year-old woman with a diagnosis of Hurler syndrome with a satisfactory result following uncemented total hip arthroplasty.

  11. A Crosswalk of Mineral Commodity End Uses and North American Industry Classification System (NAICS) codes

    Science.gov (United States)

    Barry, James J.; Matos, Grecia R.; Menzie, W. David

    2015-09-14

    This crosswalk is based on the premise that there is a connection between the way mineral commodities are used and how this use is reflected in the economy. Raw mineral commodities are the basic materials from which goods, finished products, or intermediate materials are manufactured or made. Mineral commodities are vital to the development of the U.S. economy and they impact nearly every industrial segment of the economy, representing 12.2 percent of the U.S. gross domestic product (GDP) in 2010 (U.S. Bureau of Economic Analysis, 2014). In an effort to better understand the distribution of mineral commodities in the economy, the U.S. Geological Survey (USGS) attempts to link the end uses of mineral commodities to the corresponding North American Industry Classification System (NAICS) codes.

  12. Should International Classification of Diseases codes be used to survey hospital-acquired pneumonia?

    Science.gov (United States)

    Wolfensberger, A; Meier, A H; Kuster, S P; Mehra, T; Meier, M-T; Sax, H

    2018-05-01

As surveillance of hospital-acquired pneumonia (HAP) is very resource intensive, alternatives for HAP surveillance are needed urgently. This study compared HAP rates according to routine discharge diagnostic codes of the International Classification of Diseases, 10th Revision (ICD-10; ICD-HAP) with HAP rates according to the validated surveillance definitions of the Hospitals in Europe Link for Infection Control through Surveillance (HELICS/IPSE; HELICS-HAP), by manual retrospective re-evaluation of patient records. The positive predictive value of ICD-HAP for HELICS-HAP was 0.35, and sensitivity was 0.59. Therefore, the currently available ICD-10-based routine discharge data do not allow reliable identification of patients with HAP. Copyright © 2018 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
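What a positive predictive value of 0.35 with a sensitivity of 0.59 means in practice is easy to see from a 2x2 validation table. The counts below are invented to reproduce those two figures and are not the study's data:

```python
def ppv_sensitivity(tp, fp, fn):
    """Positive predictive value and sensitivity from 2x2 validation counts."""
    ppv = tp / (tp + fp)    # of ICD-flagged cases, the fraction that truly are HAP
    sens = tp / (tp + fn)   # of true HAP cases, the fraction the ICD codes flag
    return ppv, sens

# Illustrative counts only (not the study's data): 35 true positives,
# 65 false positives, 24 false negatives give PPV 0.35 and sensitivity ~0.59.
ppv, sens = ppv_sensitivity(tp=35, fp=65, fn=24)
print(round(ppv, 2), round(sens, 2))   # 0.35 0.59
```

At a PPV of 0.35, roughly two out of three ICD-flagged "HAP" discharges would be false alarms, which is why the authors conclude against using the routine discharge data alone.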

  13. Call for consistent coding in diabetes mellitus using the Royal College of General Practitioners and NHS pragmatic classification of diabetes

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2013-03-01

Full Text Available Background The prevalence of diabetes is increasing with growing levels of obesity and an aging population. New practical guidelines for diabetes provide an applicable classification. Inconsistent coding of diabetes hampers the use of computerised disease registers for quality improvement, and limits the monitoring of disease trends. Objective To develop a consensus set of codes that should be used when recording diabetes diagnostic data. Methods The consensus approach was hierarchical, with a preference for diagnostic/disorder codes, to define each type of diabetes and non-diabetic hyperglycaemia, which were listed as being completely, partially or not readily mapped to available codes. The practical classification divides diabetes into type 1 (T1DM), type 2 (T2DM), genetic, other, unclassified and non-diabetic fasting hyperglycaemia. We mapped the classification to Read version 2, Clinical Terms version 3 and SNOMED CT. Results T1DM and T2DM were completely mapped to appropriate codes. However, in other areas only partial mapping is possible. Genetics is a fast-moving field and there were considerable gaps in the available labels for genetic conditions; what the classification calls 'other' the coding system labels 'secondary' diabetes. The biggest gap was the lack of a code for diabetes where the type of diabetes was uncertain. Notwithstanding these limitations we were able to develop a consensus list. Conclusions It is a challenge to develop codes that readily map to contemporary clinical concepts. However, clinicians should adopt the standard recommended codes, and audit the quality of their existing records.

  14. Post accelerator of the IH type structure

    International Nuclear Information System (INIS)

    Chen Ming

    2002-01-01

The principle, structure, gap-voltage adjustment, beam dynamics, RF system, and bunchers of the post-accelerator with an Interdigital-H (IH) type structure, developed by the author and the Technical University of Munich over four years, are described. The energy of ions with a mass of three was increased from 340 keV to 1.74 MeV at a resonant frequency of 84.2 MHz and an input RF power of 3 kW. The effective shunt impedance reached 408 MΩ/m. Commissioning succeeded with H3+ ion beams, whose output energy reached the design value. The two-harmonic double-drift buncher used by the IH structure compresses the beam into bunches with a width of 360 ps, increasing the acceptance of the IH structure to 240 degrees. Its shunt impedance is three times higher than that of the former single-gap bunchers used by TUM, and the buncher system is only one fifth the length of the former one, owing to the use of λ/4 coaxial cavities with double gaps.

  15. Review: Evolution of GnIH structure and function

    Directory of Open Access Journals (Sweden)

    Tomohiro eOsugi

    2014-08-01

Full Text Available Discovery of gonadotropin-inhibitory hormone (GnIH) in the Japanese quail in 2000 was the first demonstration of the existence of a hypothalamic neuropeptide inhibiting gonadotropin release. We now know that GnIH regulates reproduction by inhibiting gonadotropin synthesis and release via action on the gonadotropin-releasing hormone (GnRH) system and the gonadotrope in various vertebrates. GnIH peptides identified in birds and mammals have a common LPXRF-amide (X = L or Q) motif at the C-terminus and inhibit pituitary gonadotropin secretion. However, the function and structure of GnIH peptides are diverse in fish. Goldfish GnIHs possessing a C-terminal LPXRF-amide motif had both stimulatory and inhibitory effects on gonadotropin synthesis or release. The C-terminal sequences of grass puffer and medaka GnIHs were MPQRF-amide. To investigate the evolutionary origin of GnIH and its ancestral structure and function, we searched for GnIH in agnathans, the most ancient lineage of vertebrates. We identified the GnIH precursor gene and mature GnIH peptides with C-terminal QPQRF-amide or RPQRF-amide from the brain of sea lamprey. Lamprey GnIH fibers were in close proximity to GnRH-III neurons. Further, one of the lamprey GnIHs stimulated the expression of lamprey GnRH-III peptide in the hypothalamus and gonadotropic hormone β mRNA expression in the pituitary. We further identified the ancestral form of GnIH, which had a C-terminal RPQRF-amide, and its receptors in amphioxus, the most basal chordate species. The amphioxus GnIH inhibited cAMP signaling in vitro. In sum, the original forms of GnIH may date back to the time of the emergence of early chordates. GnIH peptides may have had various C-terminal structures slightly different from LPXRF-amide in basal chordates, which had stimulatory and/or inhibitory functions on reproduction.
The C-terminal LPXRF-amide structure and its inhibitory function on reproduction may have been selected in later-evolved vertebrates, such as

  16. Revealing topographic lineaments through IHS enhancement of DEM data. [Digital Elevation Model

    Science.gov (United States)

    Murdock, Gary

    1990-01-01

Intensity-hue-saturation (IHS) processing of slope (dip), aspect (dip direction), and elevation is used to enhance digital elevation model (DEM) data from northwestern Nevada, revealing subtle topographic lineaments that may not be obvious in the unprocessed data. This IHS method of lineament identification was applied to a mosaic of 12 square degrees using a Cray Y-MP8/864. Square arrays from 3 x 3 to 31 x 31 points were tested, as well as several different slope enhancements. When relatively few points are used to fit the plane, lineaments of various lengths are observed, and a mechanism for lineament classification is described. An area encompassing the gold deposits of the Carlin trend, extending from Rain in the southeast to Midas in the northwest, is investigated in greater detail. The orientation and density of lineaments may be determined on the gently sloping pediment surface as well as in the more steeply sloping ranges.
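The slope and aspect layers that feed (with elevation) into such an IHS composite can be derived from a DEM grid with simple finite differences. This numpy sketch uses a tiny synthetic surface and a plane-fitting shortcut (`np.gradient`), not the paper's moving-window plane fits:

```python
import numpy as np

def slope_aspect(dem, cell=1.0):
    """Slope (dip, degrees) and aspect (dip direction) from a DEM grid.
    Aspect here is the math-convention angle from +x, not a compass bearing."""
    dzdy, dzdx = np.gradient(dem, cell)                    # partial derivatives
    slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))    # steepness of the fitted plane
    aspect = np.degrees(np.arctan2(dzdy, dzdx)) % 360.0    # downslope direction
    return slope, aspect

# A plane dipping 45 degrees toward +x: z = x on a 4x4 grid.
dem = np.tile(np.arange(4, dtype=float), (4, 1))
slope, aspect = slope_aspect(dem)
print(slope[1, 1], aspect[1, 1])   # 45.0 0.0
```

Scaling slope into saturation or intensity and aspect into hue then produces the kind of IHS overlay the abstract describes.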

  17. Classification and coding of commercial fishing injuries by work processes: an experience in the Danish fresh market fishing industry

    DEFF Research Database (Denmark)

    Jensen, Olaf Chresten; Stage, Søren; Noer, Preben

    2005-01-01

BACKGROUND: Work-related injuries in commercial fishing are of concern internationally. To better identify the causes of injury, this study coded occupational injuries by working processes in commercial fishing for fresh market fish. METHODS: A classification system of the work processes was developed… …related to working with the gear and nets vary greatly in the different fishing methods. Coding of the injuries to the specific working processes allows for targeted prevention efforts.

  18. Multi-information fusion sparse coding with preserving local structure for hyperspectral image classification

    Science.gov (United States)

    Wei, Xiaohui; Zhu, Wen; Liao, Bo; Gu, Changlong; Li, Weibiao

    2017-10-01

The key question in sparse coding (SC) is how to exploit the information that already exists to acquire robust sparse representations (SRs) for distinguishing different objects in hyperspectral image (HSI) classification. We propose a multi-information fusion SC framework, which fuses the spectral, spatial, and label information at the same level, to address this question. In particular, pixels from disjoint spatial clusters, obtained by cutting the given HSI in space, are individually and sparsely encoded. Then, given the importance of spatial structure, graph- and hypergraph-based regularizers are enforced to encourage smoothness of the obtained representations and to preserve local consistency within each spatial cluster. The latter simultaneously considers the spectral, spatial, and label information of multiple pixels that have a high probability of sharing the same label. Finally, a linear support vector machine is selected as the final classifier, with the learned SRs as input. Experiments conducted on three frequently used real HSIs show that our methods achieve satisfactory results compared with other state-of-the-art methods.
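At its core, sparsely encoding a pixel means finding a few dictionary atoms whose combination reconstructs its spectrum. A minimal sketch of that step — plain lasso-style coding of one spectrum via iterative soft-thresholding (ISTA), with a toy dictionary, not the paper's fused multi-information framework or its regularizers:

```python
import numpy as np

def ista(D, x, lam=0.1, iters=500):
    """Sparse-code x over dictionary D by iterative soft-thresholding (ISTA),
    minimising 0.5*||x - D a||^2 + lam*||a||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(iters):
        g = a - (D.T @ (D @ a - x)) / L    # gradient step on the quadratic term
        a = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)   # shrinkage
    return a

# Toy "spectra": a dictionary of 3 unit-norm atoms; the pixel is atom 1 plus
# a little noise, so its sparse code should concentrate on coefficient 1.
rng = np.random.default_rng(0)
D = rng.normal(size=(8, 3))
D /= np.linalg.norm(D, axis=0)
x = 0.9 * D[:, 1] + 0.01 * rng.normal(size=8)
a = ista(D, x)
print(np.argmax(np.abs(a)))   # 1
```

The framework in the abstract then couples many such codes across a spatial cluster through graph regularizers, rather than solving each pixel independently as here.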

  19. The Classification of Complementary Information Set Codes of Lengths 14 and 16

    OpenAIRE

    Freibert, Finley

    2012-01-01

In the paper "A new class of codes for Boolean masking of cryptographic computations," Carlet, Gaborit, Kim, and Solé defined a new class of rate one-half binary codes called complementary information set (or CIS) codes. The authors then classified all CIS codes of length less than or equal to 12. CIS codes have relations to classical Coding Theory as they are a generalization of self-dual codes. As stated in the paper, CIS codes also have important practical applications as they m...

  20. Resonant vibrational energy transfer in ice Ih

    Energy Technology Data Exchange (ETDEWEB)

    Shi, L.; Li, F.; Skinner, J. L. [Theoretical Chemistry Institute and Department of Chemistry, University of Wisconsin, Madison, Wisconsin 53706 (United States)

    2014-06-28

Fascinating anisotropy decay experiments have recently been performed on H2O ice Ih by Timmer and Bakker [R. L. A. Timmer and H. J. Bakker, J. Phys. Chem. A 114, 4148 (2010)]. The very fast decay (on the order of 100 fs) is indicative of resonant energy transfer between OH stretches on different molecules. Isotope dilution experiments with deuterium show a dramatic dependence on the hydrogen mole fraction, which confirms the energy transfer picture. Timmer and Bakker have interpreted the experiments with a Förster incoherent hopping model, finding that energy transfer within the first solvation shell dominates the relaxation process. We have developed a microscopic theory of vibrational spectroscopy of water and ice, and herein we use this theory to calculate the anisotropy decay in ice as a function of hydrogen mole fraction. We obtain very good agreement with experiment. Interpretation of our results shows that four nearest-neighbor acceptors dominate the energy transfer, and that while the incoherent hopping picture is qualitatively correct, vibrational energy transport is partially coherent on the relevant timescale.

  1. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  2. Five-way Smoking Status Classification Using Text Hot-Spot Identification and Error-correcting Output Codes

    OpenAIRE

    Cohen, Aaron M.

    2008-01-01

    We participated in the i2b2 smoking status classification challenge task. The purpose of this task was to evaluate the ability of systems to automatically identify patient smoking status from discharge summaries. Our submission included several techniques that we compared and studied, including hot-spot identification, zero-vector filtering, inverse class frequency weighting, error-correcting output codes, and post-processing rules. We evaluated our approaches using the same methods as the i2...

  3. Validity of International Classification of Diseases (ICD) coding for dengue infections in hospital discharge records in Malaysia.

    Science.gov (United States)

    Woon, Yuan-Liang; Lee, Keng-Yee; Mohd Anuar, Siti Fatimah Zahra; Goh, Pik-Pin; Lim, Teck-Onn

    2018-04-20

Hospitalization due to dengue illness is an important measure of dengue morbidity. However, few studies are based on administrative databases, because the validity of the diagnosis codes is unknown. We validated the International Classification of Diseases, 10th revision (ICD-10) diagnosis coding for dengue infections in the Malaysian Ministry of Health's (MOH) hospital discharge database. This validation study involved a retrospective review of available hospital discharge records and a hand search of medical records for the years 2010 and 2013. We randomly selected 3219 hospital discharge records coded with dengue and non-dengue infections as their discharge diagnoses from the national hospital discharge database. We then randomly sampled 216 and 144 records for patients with and without codes for dengue, respectively, in keeping with their relative frequency in the MOH database, for chart review. The ICD codes for dengue were validated against a lab-based diagnostic standard (NS1 or IgM). The ICD-10-CM codes for dengue had a sensitivity of 94%, a modest specificity of 83%, a positive predictive value of 87%, and a negative predictive value of 92%. These results were stable between 2010 and 2013. However, specificity decreased substantially when patients manifested with bleeding or a low platelet count. The diagnostic performance of the ICD codes for dengue in the MOH's hospital discharge database is adequate for use in health services research on dengue.

  4. Classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

This article presents and discusses definitions of the term "classification" and the related concepts "concept/conceptualization," "categorization," "ordering," "taxonomy" and "typology." It further presents and discusses theories of classification, including the influences of Aristotle and Wittgenstein. It presents different views on forming classes, including logical division, numerical taxonomy, historical classification, and hermeneutical and pragmatic/critical views. Finally, issues related to artificial versus natural classification and taxonomic monism versus taxonomic pluralism are briefly discussed.

  5. ncRNA-class Web Tool: Non-coding RNA feature extraction and pre-miRNA classification web tool

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Theofilatos, Konstantinos A.; Papadimitriou, Stergios; Tsakalidis, Athanasios K.; Likothanassis, Spiridon D.; Mavroudi, Seferina P.

    2012-01-01

Until recently, it was commonly accepted that most genetic information is transacted by proteins. Recent evidence suggests that the majority of the genomes of mammals and other complex organisms are in fact transcribed into non-coding RNAs (ncRNAs), many of which are alternatively spliced and/or processed into smaller products. Analysis of non-coding RNA genes requires the calculation of several sequential, thermodynamical and structural features. Many independent tools have already been developed for the efficient calculation of such features, but to the best of our knowledge there does not exist any integrative approach for this task. The most significant amount of existing work is related to the miRNA class of non-coding RNAs. MicroRNAs (miRNAs) are small non-coding RNAs that play a significant role in gene regulation, and their prediction is a challenging bioinformatics problem. The Non-coding RNA feature extraction and pre-miRNA classification Web Tool (ncRNA-class Web Tool) is a publicly available web tool (http://150.140.142.24:82/Default.aspx) which provides a user-friendly and efficient environment for the effective calculation of a set of 58 sequential, thermodynamical and structural features of non-coding RNAs, plus a tool for the accurate prediction of miRNAs. © 2012 IFIP International Federation for Information Processing.

  6. The accuracy of International Classification of Diseases coding for dental problems not associated with trauma in a hospital emergency department.

    Science.gov (United States)

    Figueiredo, Rafael L F; Singhal, Sonica; Dempster, Laura; Hwang, Stephen W; Quinonez, Carlos

    2015-01-01

Emergency department (ED) visits for nontraumatic dental conditions (NTDCs) may be a sign of unmet need for dental care. The objective of this study was to determine the accuracy of International Classification of Diseases codes (ICD-10-CA) for ED visits for NTDCs. ED visits in 2008-2009 at one hospital in Toronto were identified if the discharge diagnosis in the administrative database system was an ICD-10-CA code for an NTDC (K00-K14). A random sample of 100 visits was selected, and the medical records for these visits were reviewed by a dentist. The descriptions of the clinical signs and symptoms were evaluated, and a diagnosis was assigned. This diagnosis was compared with the diagnosis assigned by the physician and the code assigned to the visit. The 100 ED visits reviewed were associated with 16 different ICD-10-CA codes for NTDCs. Only 2 percent of these visits were clearly caused by trauma. The code K0887 (toothache) was the most frequent diagnostic code (31 percent). We found 43.3 percent disagreement on the discharge diagnosis reported by the physician, and 58.0 percent disagreement on the code in the administrative database assigned by the abstractor, compared with what was suggested by the dentist reviewing the chart. There are substantial discrepancies between the ICD-10-CA diagnosis assigned in administrative databases and the diagnosis assigned by a dentist reviewing the chart retrospectively. However, ICD-10-CA codes can be used to accurately identify ED visits for NTDCs. © 2015 American Association of Public Health Dentistry.

  7. Use of the Coding Causes of Death in HIV in the classification of deaths in Northeastern Brazil.

    Science.gov (United States)

    Alves, Diana Neves; Bresani-Salvi, Cristiane Campello; Batista, Joanna d'Arc Lyra; Ximenes, Ricardo Arraes de Alencar; Miranda-Filho, Demócrito de Barros; Melo, Heloísa Ramos Lacerda de; Albuquerque, Maria de Fátima Pessoa Militão de

    2017-01-01

Describe the coding process of causes of death for people living with HIV/AIDS, and classify deaths as related or unrelated to immunodeficiency by applying the Coding Causes of Death in HIV (CoDe) system. A cross-sectional study that codifies and classifies the causes of deaths occurring in a cohort of 2,372 people living with HIV/AIDS, monitored between 2007 and 2012 in two specialized HIV care services in Pernambuco. The causes of death already codified according to the International Classification of Diseases were recoded and classified as deaths related or unrelated to immunodeficiency by the CoDe system. We calculated the frequencies of the CoDe codes for the causes of death in each classification category. There were 315 (13%) deaths during the study period; 93 (30%) were caused by an AIDS-defining illness on the Centers for Disease Control and Prevention list. A total of 232 deaths (74%) were related to immunodeficiency after application of the CoDe. Infections were the most common cause, both related (76%) and unrelated (47%) to immunodeficiency, followed by malignancies (5%) in the first group, and by external causes (16%), malignancies (12%) and cardiovascular diseases (11%) in the second group. Tuberculosis comprised 70% of the immunodeficiency-defining infections. Opportunistic infections and aging diseases were the most frequent causes of death, placing multiple disease burdens on health services. The CoDe system increases the probability of classifying deaths more accurately in people living with HIV/AIDS.

  8. Using Administrative Mental Health Indicators in Heart Failure Outcomes Research: Comparison of Clinical Records and International Classification of Disease Coding.

    Science.gov (United States)

    Bender, Miriam; Smith, Tyler C

    2016-01-01

Use of mental health indicators in health outcomes research is of growing interest to researchers. This study, as part of a larger research program, quantified agreement between administrative International Classification of Diseases (ICD-9) coding for, and "gold standard" clinician documentation of, mental health issues (MHIs) in hospitalized heart failure (HF) patients to determine the validity of mental health administrative data for use in HF outcomes research. A 13% random sample (n = 504) was selected from all unique patients (n = 3,769) hospitalized with a primary HF diagnosis at 4 San Diego County community hospitals during 2009-2012. MHI was defined as ICD-9 discharge diagnostic coding 290-319. Records were audited for clinician documentation of MHI. A total of 43% (n = 216) had mental health clinician documentation; 33% (n = 164) had ICD-9 coding for MHI. The ICD-9 code bundle 290-319 had 0.70 sensitivity, 0.97 specificity, and a kappa of 0.69 (95% confidence interval 0.61-0.79). More specific ICD-9 MHI code bundles had kappas ranging from 0.44 to 0.82 and sensitivities ranging from 42% to 82%. Agreement between ICD-9 coding and clinician documentation for a broadly defined MHI is substantial and can validly "rule in" MHI for hospitalized patients with heart failure. More specific MHI code bundles had fair to almost perfect agreement, with a wide range of sensitivities for identifying patients with an MHI. Copyright © 2016 Elsevier Inc. All rights reserved.
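The kappa statistic reported above corrects raw agreement for agreement expected by chance. A short sketch of Cohen's kappa for two binary raters (ICD coding vs. chart review); the 2x2 counts below are illustrative, chosen only to resemble the study's marginals, not its actual table:

```python
def cohens_kappa(both_yes, a_yes_b_no, a_no_b_yes, both_no):
    """Cohen's kappa for two binary raters from a 2x2 agreement table."""
    n = both_yes + a_yes_b_no + a_no_b_yes + both_no
    po = (both_yes + both_no) / n                           # observed agreement
    p_a_yes = (both_yes + a_yes_b_no) / n                   # rater A's "yes" rate
    p_b_yes = (both_yes + a_no_b_yes) / n                   # rater B's "yes" rate
    pe = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)  # chance agreement
    return (po - pe) / (1 - pe)

# Illustrative counts only: A = ICD coding, B = clinician documentation.
kappa = cohens_kappa(both_yes=151, a_yes_b_no=13, a_no_b_yes=65, both_no=275)
print(round(kappa, 2))   # 0.67
```

Values in the 0.61-0.80 band are conventionally read as "substantial" agreement, which matches the study's characterization of its kappa of 0.69.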

  9. The IHS diagnostic X-ray equipment radiation protection program

    International Nuclear Information System (INIS)

    Knapp, A.; Byrns, G.; Suleiman, O.

    1994-01-01

The Indian Health Service (IHS) operates, or contracts with Tribal groups to operate, 50 hospitals and approximately 165 primary ambulatory care centers. These facilities contain approximately 275 medical and 800 dental diagnostic x-ray machines. IHS environmental health personnel, in collaboration with the Food and Drug Administration's (FDA) Center for Devices and Radiological Health (CDRH), developed a diagnostic x-ray protection program including standard survey procedures and menu-driven calculation software. Important features of the program include the evaluation of equipment performance, the collection of average patient entrance skin exposure (ESE) measurements for selected procedures, and quality assurance. The ESE data, collected using the National Evaluation of X-ray Trends (NEXT) protocol, will be presented. The IHS Diagnostic X-ray Radiation Protection Program is dynamic and is adapting to changes in technology and workload.

  10. Identifying Adverse Events Using International Classification of Diseases, Tenth Revision Y Codes in Korea: A Cross-sectional Study

    Directory of Open Access Journals (Sweden)

    Minsu Ock

    2018-01-01

Full Text Available Objectives The use of administrative data is an affordable alternative to conducting a difficult large-scale medical-record review to estimate the scale of adverse events. We identified adverse events from 2002 to 2013 at the national level in Korea, using International Classification of Diseases, tenth revision (ICD-10) Y codes. Methods We used data from the National Health Insurance Service-National Sample Cohort (NHIS-NSC). We relied on medical treatment databases to extract information on ICD-10 Y codes from each participant in the NHIS-NSC. We classified adverse events in the ICD-10 Y codes into 6 types: those related to drugs, transfusions, and fluids; those related to vaccines and immunoglobulin; those related to surgery and procedures; those related to infections; those related to devices; and others. Results Over 12 years, a total of 20,817 adverse events were identified using ICD-10 Y codes, and the estimated total adverse event rate was 0.20%. Between 2002 and 2013, the total number of such events increased by 131.3%, from 1,366 in 2002 to 3,159 in 2013. The total rate increased by 103.9%, from 0.17% in 2002 to 0.35% in 2013. Events related to drugs, transfusions, and fluids were the most common (19,446; 93.4%), followed by those related to surgery and procedures (1,209; 5.8%) and those related to vaccines and immunoglobulin (72; 0.3%). Conclusions Based on a comparison with the results of other studies, the total adverse event rate in this study was significantly underestimated. Improving coding practices for ICD-10 Y codes is necessary to precisely monitor the scale of adverse events in Korea.

  11. LZW-Kernel: fast kernel utilizing variable length code blocks from LZW compressors for protein sequence classification.

    Science.gov (United States)

    Filatov, Gleb; Bauwens, Bruno; Kertész-Farkas, Attila

    2018-05-07

Bioinformatics studies often rely on similarity measures between sequence pairs, which often pose a bottleneck in large-scale sequence analysis. Here, we present a new convolutional kernel function for protein sequences called the LZW-Kernel. It is based on code words identified with the Lempel-Ziv-Welch (LZW) universal text compressor. The LZW-Kernel is an alignment-free method; it is always symmetric, positive, provides 1.0 for self-similarity, and can directly be used with Support Vector Machines (SVMs) in classification problems, contrary to normalized compression distance (NCD), which often violates the distance metric properties in practice and requires further techniques to be used with SVMs. The LZW-Kernel is a one-pass algorithm, which makes it particularly suitable for big data applications. Our experimental studies on remote protein homology detection and protein classification tasks reveal that the LZW-Kernel closely approaches the performance of the Local Alignment Kernel (LAK) and the SVM-pairwise method combined with Smith-Waterman (SW) scoring at a fraction of the time. Moreover, the LZW-Kernel outperforms the SVM-pairwise method when combined with BLAST scores, which indicates that the LZW code words might be a better basis for similarity measures than the local alignment approximations found with BLAST. In addition, the LZW-Kernel outperforms n-gram based mismatch kernels, the hidden Markov model based SAM and Fisher kernel, and the protein family based PSI-BLAST, among others. Further advantages include the LZW-Kernel's reliance on a simple idea, its ease of implementation, and its high speed: three times faster than BLAST and several orders of magnitude faster than SW or LAK in our tests. LZW-Kernel is implemented as standalone C code and is a free open-source program distributed under the GPLv3 license; it can be downloaded from https://github.com/kfattila/LZW-Kernel. Contact: akerteszfarkas@hse.ru. Supplementary data are available at Bioinformatics Online.

  12. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
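A rough sketch of this kind of experiment, assuming nothing about the paper's classifier beyond its simplicity: reduce the feature vectors with PCA (one of the three techniques compared) and measure accuracy as a function of the retained dimension k. NumPy-only, with a nearest-centroid classifier as an illustrative stand-in.

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def nearest_centroid_accuracy(X, y):
    """Toy training-set accuracy of a nearest-centroid classifier."""
    classes = np.unique(y)
    centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
    dists = ((X[:, None, :] - centroids[None]) ** 2).sum(-1)
    pred = classes[np.argmin(dists, axis=1)]
    return (pred == y).mean()
```

Sweeping k from 1 to the full dimensionality and plotting the resulting accuracy reproduces the shape of the comparison the abstract describes.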

  13. Quality improvement of International Classification of Diseases, 9th revision, diagnosis coding in radiation oncology: single-institution prospective study at University of California, San Francisco.

    Science.gov (United States)

    Chen, Chien P; Braunstein, Steve; Mourad, Michelle; Hsu, I-Chow J; Haas-Kogan, Daphne; Roach, Mack; Fogh, Shannon E

    2015-01-01

    Accurate International Classification of Diseases (ICD) diagnosis coding is critical for patient care, billing purposes, and research endeavors. In this single-institution study, we evaluated our baseline ICD-9 (9th revision) diagnosis coding accuracy, identified the most common errors contributing to inaccurate coding, and implemented a multimodality strategy to improve radiation oncology coding. We prospectively studied ICD-9 coding accuracy in our radiation therapy-specific electronic medical record system. Baseline ICD-9 coding accuracy was obtained from chart review of all patients treated at our institution between March and June of 2010. To improve performance, an educational session highlighted common coding errors, and a user-friendly software tool, RadOnc ICD Search, version 1.0, for coding radiation oncology-specific diagnoses was implemented. We then prospectively analyzed ICD-9 coding accuracy for all patients treated from July 2010 to June 2011, with the goal of maintaining 80% or higher coding accuracy. Data on coding accuracy were analyzed and fed back monthly to individual providers. Baseline coding accuracy for physicians was 463 of 661 (70%) cases. Only 46% of physicians had coding accuracy above 80%. The most common errors involved metastatic cases, whereby primary or secondary site ICD-9 codes were either incorrect or missing, and special procedures such as stereotactic radiosurgery cases. After implementing our project, overall coding accuracy rose to 92% (range, 86%-96%). The median accuracy for all physicians was 93% (range, 77%-100%), with only 1 attending having accuracy below 80%. Incorrect primary and secondary ICD-9 codes in metastatic cases showed the most significant improvement (10% vs 2% after intervention). Identifying common coding errors and implementing both education and systems changes led to significantly improved coding accuracy. This quality assurance project highlights the potential problem

  14. Automatic Modulation Classification of LFM and Polyphase-coded Radar Signals

    Directory of Open Access Journals (Sweden)

    S. B. S. Hanbali

    2017-12-01

    Full Text Available There are several techniques for detecting and classifying low probability of intercept radar signals, such as the Wigner distribution, the Choi-Williams distribution, and the time-frequency rate distribution, but these distributions require high SNR. To overcome this problem, we propose a new technique for detecting and classifying linear frequency modulation (LFM) signals and polyphase-coded signals using the optimum fractional Fourier transform at low SNR. The theoretical analysis and simulation experiments demonstrate the validity and efficiency of the proposed method.

  15. Assessment of the nutritional status of the elderly in a residential care institution; Ocena prehranskega stanja starejših v socialnovarstvenem zavodu

    Directory of Open Access Journals (Sweden)

    Nika Urh

    2017-09-01

    Full Text Available Introduction: Nutrition is one of the factors determining the quality of life of the elderly in residential care institutions. The aim of the study was to examine nutritional status and to offer suggestions for improving the diet of the elderly in a residential care institution. Methods: A quantitative descriptive research method was used. Daily menus at the institution were analysed, and actual nutrient intake and nutritional status were assessed in a sample of elderly residents (n = 48). Nutritional status was determined from the body mass index and the Mini Nutritional Assessment questionnaire. Participants received a diet requiring no special dietary modifications. Results: The average energy value of the menus offered was 8457 kJ (2021 kcal) per day, and 17 nutrients deviated from the recommendations. Men (M) and women (F) differed significantly in the proportion of leftover fat (μM = 16%, μF = 24%; p = 0.036), cholesterol (μM = 15%, μF = 26%; p = 0.035), vitamin D (μM = 15%, μF = 27%; p = 0.017) and vitamin B12 (μM = 17%, μF = 25%; p = 0.016). Body mass index showed that none of the elderly included in the study was malnourished or in the class III obesity category. Discussion and conclusion: The study confirmed the need for systematic monitoring of the quality of nutrition of the elderly in residential care institutions. To ensure adequate nutritional care for the elderly, an interprofessional team must collaborate with the elderly and their family members.

  16. Development of an IH-type linac for the acceleration of high current heavy ion beams

    Energy Technology Data Exchange (ETDEWEB)

    Haehnel, Jan Hendrik

    2017-07-20

    The Facility for Antiproton and Ion Research (FAIR) at GSI Darmstadt will provide unprecedented intensities of protons and heavy ions up to uranium at energies of up to 29 GeV for protons and 2.7 GeV/u for U{sup 28+}. To achieve high intensities in the synchrotron accelerators, high beam currents have to be provided by the injector linear accelerators. High current heavy ion beams are provided by the Universal Linear Accelerator (UNILAC), which in its current state will not be able to provide the required FAIR beam currents. This thesis deals with the development of upgrades for the UNILAC to ensure its high current capability. The first improvement is a matching section (MEBT) for the interface between the RFQ and the IH-DTL of the existing high current injector HSI at the UNILAC. With this new MEBT section, particle losses are eliminated and the overall beam quality is improved. As a second improvement, a complete replacement of the existing Alvarez-DTL is presented. A combination of efficient IH-type cavities and KONUS beam dynamics results in a reduction of the linac length from about 60 m (Alvarez) to just 23 m (new IH-DTL) while providing the same energy and fulfilling FAIR requirements of a high beam current and beam quality. This thesis contains a detailed beam dynamics design of the new linac including some fundamental investigations of the KONUS beam dynamics concept. A cross-check of the beam dynamics design was performed with two independent multi-particle simulation codes. Detailed error studies were conducted to investigate the influence of manufacturing, alignment and operating errors on the beam dynamics performance. Additionally, all five linac cavities were designed, optimized, and their RF parameters including power requirements calculated to provide a comprehensive linac design.

  17. Development of an IH-type linac for the acceleration of high current heavy ion beams

    International Nuclear Information System (INIS)

    Haehnel, Jan Hendrik

    2017-01-01

    The Facility for Antiproton and Ion Research (FAIR) at GSI Darmstadt will provide unprecedented intensities of protons and heavy ions up to uranium at energies of up to 29 GeV for protons and 2.7 GeV/u for U 28+ . To achieve high intensities in the synchrotron accelerators, high beam currents have to be provided by the injector linear accelerators. High current heavy ion beams are provided by the Universal Linear Accelerator (UNILAC), which in its current state will not be able to provide the required FAIR beam currents. This thesis deals with the development of upgrades for the UNILAC to ensure its high current capability. The first improvement is a matching section (MEBT) for the interface between the RFQ and the IH-DTL of the existing high current injector HSI at the UNILAC. With this new MEBT section, particle losses are eliminated and the overall beam quality is improved. As a second improvement, a complete replacement of the existing Alvarez-DTL is presented. A combination of efficient IH-type cavities and KONUS beam dynamics results in a reduction of the linac length from about 60 m (Alvarez) to just 23 m (new IH-DTL) while providing the same energy and fulfilling FAIR requirements of a high beam current and beam quality. This thesis contains a detailed beam dynamics design of the new linac including some fundamental investigations of the KONUS beam dynamics concept. A cross-check of the beam dynamics design was performed with two independent multi-particle simulation codes. Detailed error studies were conducted to investigate the influence of manufacturing, alignment and operating errors on the beam dynamics performance. Additionally, all five linac cavities were designed, optimized, and their RF parameters including power requirements calculated to provide a comprehensive linac design.

  18. Classification of quantum phases and topology of logical operators in an exactly solved model of quantum codes

    International Nuclear Information System (INIS)

    Yoshida, Beni

    2011-01-01

    Searches for possible new quantum phases and classifications of quantum phases have been central problems in physics. Yet, they are indeed challenging problems due to the computational difficulties in analyzing quantum many-body systems and the lack of a general framework for classifications. While frustration-free Hamiltonians, which appear as fixed point Hamiltonians of renormalization group transformations, may serve as representatives of quantum phases, it is still difficult to analyze and classify quantum phases of arbitrary frustration-free Hamiltonians exhaustively. Here, we address these problems by sharpening our considerations to a certain subclass of frustration-free Hamiltonians, called stabilizer Hamiltonians, which have been actively studied in quantum information science. We propose a model of frustration-free Hamiltonians which covers a large class of physically realistic stabilizer Hamiltonians, constrained by only three physical conditions: the locality of interaction terms, translation symmetries, and scale symmetries, meaning that the number of ground states does not grow with the system size. We show that quantum phases arising in two-dimensional models can be classified exactly through certain quantum coding theoretical operators, called logical operators, by proving that two models with topologically distinct shapes of logical operators are always separated by quantum phase transitions.

  19. 42 CFR 137.294 - What is the typical IHS environmental review process for construction projects?

    Science.gov (United States)

    2010-10-01

    ... SELF-GOVERNANCE Construction Nepa Process § 137.294 What is the typical IHS environmental review... impact on the environment, and therefore do not require environmental impact statements (EIS). Under current IHS procedures, an environmental review is performed on all construction projects. During the IHS...

  20. Population-based evaluation of a suggested anatomic and clinical classification of congenital heart defects based on the International Paediatric and Congenital Cardiac Code

    Directory of Open Access Journals (Sweden)

    Goffinet François

    2011-10-01

    Full Text Available Abstract Background: Classification of the overall spectrum of congenital heart defects (CHD) has always been challenging, in part because of the diversity of the cardiac phenotypes, but also because of the oft-complex associations. The purpose of our study was to establish a comprehensive and easy-to-use classification of CHD for clinical and epidemiological studies based on the long list of the International Paediatric and Congenital Cardiac Code (IPCCC). Methods: We coded each individual malformation using six-digit codes from the long list of IPCCC. We then regrouped all lesions into 10 categories and 23 subcategories according to a multi-dimensional approach encompassing anatomic, diagnostic and therapeutic criteria. This anatomic and clinical classification of congenital heart disease (ACC-CHD) was then applied to data acquired from a population-based cohort of patients with CHD in France, made up of 2867 cases (82% live births, 1.8% stillbirths and 16.2% pregnancy terminations). Results: The majority of cases (79.5%) could be identified with a single IPCCC code. The category "Heterotaxy, including isomerism and mirror-imagery" was the only one that typically required more than one code for identification of cases. The two largest categories were "ventricular septal defects" (52%) and "anomalies of the outflow tracts and arterial valves" (20% of cases). Conclusion: Our proposed classification is not new, but rather a regrouping of the known spectrum of CHD into a manageable number of categories based on anatomic and clinical criteria. The classification is designed to use the code numbers of the long list of IPCCC but can accommodate ICD-10 codes. Its exhaustiveness, simplicity, and anatomic basis make it useful for clinical and epidemiologic studies, including those aimed at assessment of risk factors and outcomes.
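Operationally, such a regrouping amounts to a lookup from IPCCC long-list codes to ACC-CHD categories, with a case allowed to carry several codes (as in heterotaxy). A sketch with placeholder code values; the published mapping is not reproduced here.

```python
# Hypothetical regrouping table: maps a six-digit IPCCC long-list code
# to an ACC-CHD category. The code strings below are illustrative
# placeholders, not the actual published assignments.
ACC_CHD_CATEGORY = {
    "070101": "Ventricular septal defects",
    "090501": "Anomalies of the outflow tracts and arterial valves",
}

def classify_case(ipccc_codes):
    """Return the set of ACC-CHD categories for one case.
    Most cases carry a single code; heterotaxy cases may carry several."""
    return {ACC_CHD_CATEGORY.get(code, "Unclassified") for code in ipccc_codes}
```

Returning a set rather than a single label is what lets the scheme handle the roughly 20% of cases that need more than one IPCCC code.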

  1. Five-way smoking status classification using text hot-spot identification and error-correcting output codes.

    Science.gov (United States)

    Cohen, Aaron M

    2008-01-01

    We participated in the i2b2 smoking status classification challenge task. The purpose of this task was to evaluate the ability of systems to automatically identify patient smoking status from discharge summaries. Our submission included several techniques that we compared and studied, including hot-spot identification, zero-vector filtering, inverse class frequency weighting, error-correcting output codes, and post-processing rules. We evaluated our approaches using the same methods as the i2b2 task organizers, using micro- and macro-averaged F1 as the primary performance metric. Our best performing system achieved a micro-F1 of 0.9000 on the test collection, equivalent to the best performing system submitted to the i2b2 challenge. Hot-spot identification, zero-vector filtering, classifier weighting, and error correcting output coding contributed additively to increased performance, with hot-spot identification having by far the largest positive effect. High performance on automatic identification of patient smoking status from discharge summaries is achievable with the efficient and straightforward machine learning techniques studied here.
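The error-correcting output codes component can be sketched as follows: each of the five smoking-status classes is assigned a binary codeword, one binary classifier is trained per codeword column, and a document is assigned to the class whose codeword is Hamming-nearest to the classifiers' combined output. The codebook below is an illustrative choice (minimum pairwise Hamming distance 3, so any single classifier error is corrected), not the one used in the submission.

```python
import numpy as np

# Toy codebook for a 5-way task (e.g., five smoking-status classes):
# each row is a class codeword; each column defines one binary subproblem.
CODEBOOK = np.array([
    [0, 0, 0, 1, 1, 1],
    [0, 1, 1, 0, 0, 1],
    [1, 0, 1, 0, 1, 0],
    [1, 1, 0, 1, 0, 0],
    [1, 1, 1, 1, 1, 1],
])

def ecoc_decode(bit_predictions):
    """Map the binary classifiers' outputs to the class whose codeword
    is nearest in Hamming distance; this corrects isolated bit errors."""
    bits = np.asarray(bit_predictions)
    distances = (CODEBOOK != bits).sum(axis=1)
    return int(np.argmin(distances))
```

With real data, each of the six columns would be a separate text classifier; the decoding step above is what gives the scheme its error-correcting behavior.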

  2. Coding and classification in drug statistics – From national to global application

    Directory of Open Access Journals (Sweden)

    Marit Rønning

    2009-11-01

    Full Text Available SUMMARY The Anatomical Therapeutic Chemical (ATC) classification system and the defined daily dose (DDD) were developed in Norway in the early seventies. The creation of the ATC/DDD methodology was an important basis for presenting drug utilisation statistics in a sensible way. In 1977, Norway was also the first country to publish national drug utilisation statistics from wholesalers on an annual basis. The combination of these activities in Norway in the seventies made us a pioneer country in the area of drug utilisation research. Over the years, the use of the ATC/DDD methodology has gradually increased in countries outside Norway. Since 1996, the methodology has been recommended by WHO for use in international drug utilisation studies. The WHO Collaborating Centre for Drug Statistics Methodology in Oslo handles the maintenance and development of the ATC/DDD system. The Centre is now responsible for the global co-ordination. After nearly 30 years of experience with ATC/DDD, the methodology has demonstrated its suitability in drug use research. The main challenge in the coming years is to educate users worldwide in how to use the methodology properly.
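The DDD methodology reduces sales or prescription volumes to a common unit; the standard utilisation figure is DDDs per 1000 inhabitants per day. A minimal sketch with made-up numbers (the DDD value used here is a placeholder, not an official WHO assignment):

```python
def ddd_per_1000_per_day(total_mg, ddd_mg, population, days):
    """Drug utilisation as DDDs per 1000 inhabitants per day:
    total amount dispensed, expressed in DDDs, scaled to the
    population and the observation period."""
    ddds = total_mg / ddd_mg
    return ddds * 1000 / (population * days)

# Illustrative only: 450 kg of a drug with a hypothetical 1500 mg DDD,
# dispensed to a population of 100,000 over one year.
rate = ddd_per_1000_per_day(450_000_000, 1500, 100_000, 365)  # ≈ 8.2
```

A figure of about 8 DDDs per 1000 inhabitants per day would be read as roughly 0.8% of the population receiving one defined daily dose of the drug on any given day.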

  3. Development of non-duct IH (induction heating) smokeless roaster; Nonduct IH (denjiha yudo kanetsu) muen roaster no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Matsunaga, K. [Chubu Electric Power Co. Inc., Nagoya (Japan)

    1996-11-01

    As a part of diffusion and promotion of electric cooking apparatuses, a non-duct IH smokeless roaster has been developed for roast meat restaurants in cooperation with Hitachi Home Tech. Smoke and odor in the room can be removed without using exhaust duct by incorporating smoke and odor processing equipment into the main body. For the developed roaster, IH was adopted as a heat source with less smoke and exhaust gas compared with city gas. Generated smoke and odor are removed by the electrical precipitator and the deodorant catalyst incorporated in the main body. After the treatment, exhaust gas is emitted in the room. This roaster has characteristics as follows. This roaster can be used at high-class roast meat restaurants for avoiding smoke and odor, and can be brought in banqueting hall of hotels. Since it does not have a duct, there is no danger of fire. It is easy to change the layout of guest room. Since the IH system has less exhaust heat than the gas system, increase in room temperature is moderate, which results in the reduction of air conditioning load. 8 figs., 3 tabs.

  4. Administrative database concerns: accuracy of International Classification of Diseases, Ninth Revision coding is poor for preoperative anemia in patients undergoing spinal fusion.

    Science.gov (United States)

    Golinvaux, Nicholas S; Bohl, Daniel D; Basques, Bryce A; Grauer, Jonathan N

    2014-11-15

    Cross-sectional study. To objectively evaluate the ability of International Classification of Diseases, Ninth Revision (ICD-9) codes, which are used as the foundation for administratively coded national databases, to identify preoperative anemia in patients undergoing spinal fusion. National database research in spine surgery continues to rise. However, the validity of studies based on administratively coded data, such as the Nationwide Inpatient Sample, is dependent on the accuracy of ICD-9 coding. Such coding has previously been found to have poor sensitivity to conditions such as obesity and infection. A cross-sectional study was performed at an academic medical center. Hospital-reported anemia ICD-9 codes (those used for administratively coded databases) were directly compared with the chart-documented preoperative hematocrits (true laboratory values). A patient was deemed to have preoperative anemia if the preoperative hematocrit was less than the lower end of the normal range (36.0% for females and 41.0% for males). The study included 260 patients. Of these, 37 patients (14.2%) were anemic; however, only 10 patients (3.8%) received an "anemia" ICD-9 code. Of the 10 patients coded as anemic, 7 were anemic by definition, whereas 3 were not, and thus were miscoded. This equates to an ICD-9 code sensitivity of 0.19, with a specificity of 0.99, and positive and negative predictive values of 0.70 and 0.88, respectively. This study uses preoperative anemia to demonstrate the potential inaccuracies of ICD-9 coding. These results have implications for publications using databases that are compiled from ICD-9 coding data. Furthermore, the findings of the current investigation raise concerns regarding the accuracy of additional comorbidities. Although administrative databases are powerful resources that provide large sample sizes, it is crucial that we further consider the quality of the data source relative to its intended purpose.
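The reported validity statistics follow directly from the record's own counts (260 patients, 37 anemic by hematocrit, 10 coded as anemic, 7 of them correctly) and can be checked with a few lines:

```python
# Reconstruct the 2x2 table from the abstract's counts.
tp, fp = 7, 3          # coded anemic: 7 truly anemic, 3 miscoded
fn = 37 - tp           # anemic by hematocrit but not coded
tn = 260 - 37 - fp     # neither anemic nor coded

sensitivity = tp / (tp + fn)   # 7/37  ≈ 0.19
specificity = tn / (tn + fp)   # 220/223 ≈ 0.99
ppv = tp / (tp + fp)           # 7/10  = 0.70
npv = tn / (tn + fn)           # 220/250 = 0.88
```

The very low sensitivity with near-perfect specificity is the typical signature of administrative under-coding: codes are rarely wrong when present, but most true cases are simply never coded.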

  5. Revision, uptake and coding issues related to the open access Orchard Sports Injury Classification System (OSICS) versions 8, 9 and 10.1

    Directory of Open Access Journals (Sweden)

    John Orchard

    2010-10-01

    Full Text Available John Orchard (1), Katherine Rae (1), John Brooks (2), Martin Hägglund (3), Lluis Til (4), David Wales (5), Tim Wood (6). (1) Sports Medicine at Sydney University, Sydney, NSW, Australia; (2) Rugby Football Union, Twickenham, England, UK; (3) Department of Medical and Health Sciences, Linköping University, Linköping, Sweden; (4) FC Barcelona, Barcelona, Catalonia, Spain; (5) Arsenal FC, Highbury, England, UK; (6) Tennis Australia, Melbourne, Vic, Australia. Abstract: The Orchard Sports Injury Classification System (OSICS) is one of the world's most commonly used systems for coding injury diagnoses in sports injury surveillance systems. Its major strengths are that it has wide usage, has codes specific to sports medicine, and is free to use. Literature searches and stakeholder consultations were made to assess the uptake of OSICS and to develop new versions. OSICS was commonly used in the sports of football (soccer), Australian football, rugby union, cricket and tennis. It is referenced in international papers in three sports and used in four commercially available computerised injury management systems. Suggested injury categories for the major sports are presented. New versions OSICS 9 (three-digit codes) and OSICS 10.1 (four-digit codes) are presented. OSICS is a potentially helpful component of a comprehensive sports injury surveillance system, but many other components are required. Choices made in developing these components should ideally be agreed upon by groups of researchers in consensus statements. Keywords: sports injury classification, epidemiology, surveillance, coding

  6. Ih current is necessary to maintain normal dopamine fluctuations and sleep consolidation in Drosophila.

    Directory of Open Access Journals (Sweden)

    Alicia Gonzalo-Gomez

    Full Text Available HCN channels are becoming pharmacological targets, mainly in cardiac diseases. But apart from their well-known role in heart pacemaking, these channels are widely expressed in the nervous system, where they contribute to the neuronal firing pattern. Consequently, abolishing the Ih current might have detrimental consequences for a broad repertoire of behavioral traits. Several studies in mammals have identified the Ih current as an important determinant of the firing activity of dopaminergic neurons, and recent evidence links alterations in this current to various dopamine-related disorders. We used the model organism Drosophila melanogaster to investigate how lack of the Ih current affects dopamine levels and the behavioral consequences for the sleep:activity pattern. Unlike mammals, Drosophila has only one gene encoding HCN channels. We generated a deficiency of the DmIh core gene region and measured dopamine levels by HPLC. Our data demonstrate daily variations of dopamine in wild-type fly heads. Lack of the Ih current dramatically alters the dopamine pattern, but different mechanisms seem to operate during light and dark conditions. Behaviorally, DmIh mutant flies display alterations in the rest:activity pattern and altered circadian rhythms. Our data strongly suggest that the Ih current is necessary to prevent dopamine overproduction in the dark, while light input allows cycling of dopamine in an Ih current-dependent manner. Moreover, lack of the Ih current results in behavioral defects that are consistent with altered dopamine levels.

  7. Review: evolution of GnIH and related peptides structure and function in the chordates.

    Science.gov (United States)

    Osugi, Tomohiro; Ubuka, Takayoshi; Tsutsui, Kazuyoshi

    2014-01-01

    Discovery of gonadotropin-inhibitory hormone (GnIH) in the Japanese quail in 2000 was the first to demonstrate the existence of a hypothalamic neuropeptide inhibiting gonadotropin release. We now know that GnIH regulates reproduction by inhibiting gonadotropin synthesis and release via action on the gonadotropin-releasing hormone (GnRH) system and the gonadotrope in various vertebrates. GnIH peptides identified in birds and mammals have a common LPXRF-amide (X = L or Q) motif at the C-terminus and inhibit pituitary gonadotropin secretion. However, the function and structure of GnIH peptides are diverse in fish. Goldfish GnIHs possessing a C-terminal LPXRF-amide motif have both stimulatory and inhibitory effects on gonadotropin synthesis or release. The C-terminal sequence of grass puffer and medaka GnIHs are MPQRF-amide. To investigate the evolutionary origin of GnIH and its ancestral structure and function, we searched for GnIH in agnathans, the most ancient lineage of vertebrates. We identified GnIH precursor gene and mature GnIH peptides with C-terminal QPQRF-amide or RPQRF-amide from the brain of sea lamprey. Lamprey GnIH fibers were in close proximity to GnRH-III neurons. Further, one of lamprey GnIHs stimulated the expression of lamprey GnRH-III peptide in the hypothalamus and gonadotropic hormone β mRNA expression in the pituitary. We further identified the ancestral form of GnIH, which had a C-terminal RPQRF-amide, and its receptors in amphioxus, the most basal chordate species. The amphioxus GnIH inhibited cAMP signaling in vitro. In sum, the original forms of GnIH may date back to the time of the emergence of early chordates. GnIH peptides may have had various C-terminal structures slightly different from LPXRF-amide in basal chordates, which had stimulatory and/or inhibitory functions on reproduction. 
The C-terminal LPXRF-amide structure and its inhibitory function on reproduction may have been selected in later-evolved vertebrates, such as birds and mammals.

  8. Evolutionary origin and divergence of GnIH and its homologous peptides.

    Science.gov (United States)

    Tsutsui, Kazuyoshi; Osugi, Tomohiro

    2009-03-01

    Probing undiscovered hypothalamic neuropeptides that play important roles in the regulation of pituitary function in vertebrates is essential for the progress of neuroendocrinology. In 2000, we discovered a novel hypothalamic dodecapeptide inhibiting gonadotropin release in quail and termed it gonadotropin-inhibitory hormone (GnIH). GnIH acts on the pituitary and gonadotropin-releasing hormone (GnRH) neurons in the hypothalamus via a novel G protein-coupled receptor for GnIH to inhibit gonadal development and maintenance by decreasing gonadotropin release and synthesis. Similar findings were observed in other avian species. Thus, GnIH is a key factor controlling avian reproduction. To give our findings a broader perspective, we also found GnIH homologous peptides in the hypothalamus of other vertebrates, such as mammals, reptiles, amphibians and teleosts. GnIH and its homologs share a common C-terminal LPXRFamide (X=L or Q) motif. A mammalian GnIH homolog also inhibited gonadotropin release in mammals like the GnIH action in birds. In contrast to higher vertebrates, hypophysiotropic activities of GnIH homologs were different in lower vertebrates. To clarify the evolutionary origin of GnIH and its homologs, we further sought to identify novel LPXRFamide peptides from the brain of sea lamprey and hagfish, two extant groups of the oldest lineage of vertebrates, Agnatha. In these agnathans, LPXRFamide peptide and its cDNA were identified only from the brain of hagfish. Based on these findings over the past decade, this paper summarizes the evolutionary origin and divergence of GnIH and its homologous peptides.

  9. Positive Predictive Values of International Classification of Diseases, 10th Revision Coding Algorithms to Identify Patients With Autosomal Dominant Polycystic Kidney Disease

    Directory of Open Access Journals (Sweden)

    Vinusha Kalatharan

    2016-12-01

    Full Text Available Background: International Classification of Diseases, 10th Revision (ICD-10) codes for autosomal dominant polycystic kidney disease (ADPKD) are used within several administrative health care databases. It is unknown whether these codes identify patients who meet strict clinical criteria for ADPKD. Objective: The objective of this study is (1) to determine whether different ICD-10 coding algorithms identify adult patients who meet strict clinical criteria for ADPKD as assessed through medical chart review and (2) to assess the number of patients identified with different ADPKD coding algorithms in Ontario. Design: Validation study of health care database codes, and prevalence. Setting: Ontario, Canada. Patients: For the chart review, 201 adult patients with hospital encounters between April 1, 2002, and March 31, 2014, assigned either ICD-10 code Q61.2 or Q61.3. Measurements: This study measured the positive predictive value of the ICD-10 coding algorithms and the number of Ontarians identified with different coding algorithms. Methods: We manually reviewed a random sample of medical charts in London, Ontario, Canada, and determined whether or not ADPKD was present according to strict clinical criteria. Results: The presence of either ICD-10 code Q61.2 or Q61.3 in a hospital encounter had a positive predictive value of 85% (95% confidence interval [CI], 79%-89%) and identified 2981 Ontarians (0.02% of the Ontario adult population). The presence of ICD-10 code Q61.2 in a hospital encounter had a positive predictive value of 97% (95% CI, 86%-100%) and identified 394 adults in Ontario (0.003% of the Ontario adult population). Limitations: (1) We could not calculate other measures of validity; (2) the coding algorithms do not identify patients without hospital encounters; and (3) coding practices may differ between hospitals. Conclusions: Most patients with ICD-10 code Q61.2 or Q61.3 assigned during their hospital encounters have ADPKD according to the clinical

  10. Single Ih channels in pyramidal neuron dendrites: properties, distribution, and impact on action potential output

    NARCIS (Netherlands)

    Kole, Maarten H. P.; Hallermann, Stefan; Stuart, Greg J.

    2006-01-01

    The hyperpolarization-activated cation current (Ih) plays an important role in regulating neuronal excitability, yet its native single-channel properties in the brain are essentially unknown. Here we use variance-mean analysis to study the properties of single Ih channels in the apical dendrites of

  11. 42 CFR 137.206 - Why does the IHS need this information?

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Why does the IHS need this information? 137.206 Section 137.206 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES INDIAN HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES TRIBAL SELF-GOVERNANCE Operational Provisions Health Status Reports § 137.206 Why does the IHS...

  12. Cell-Type Specific Development of the Hyperpolarization-Activated Current, Ih, in Prefrontal Cortical Neurons

    Directory of Open Access Journals (Sweden)

    Sha-Sha Yang

    2018-05-01

    Full Text Available H-current, also known as hyperpolarization-activated current (Ih), is an inward current generated by the hyperpolarization-activated cyclic nucleotide-gated (HCN) cation channels. Ih plays an essential role in regulating neuronal properties, synaptic integration and plasticity, and synchronous activity in the brain. As these biological factors change across development, the brain undergoes varying levels of vulnerability to disorders like schizophrenia that disrupt prefrontal cortex (PFC)-dependent function. However, developmental change of Ih in PFC neurons remains untested. Here, we examine Ih in pyramidal neurons vs. GABAergic (gamma-aminobutyric acid-releasing) parvalbumin-expressing (PV+) interneurons in the developing mouse PFC. Our findings show that the amplitudes of Ih in these cell types are identical during the juvenile period but differ at later time points. In pyramidal neurons, Ih amplitude increases significantly from the juvenile period to adolescence and follows a similar trend into adulthood. In contrast, the amplitude of Ih in PV+ interneurons decreases from the juvenile period to adolescence and does not change from adolescence to adulthood. Moreover, the kinetics of HCN channels in pyramidal neurons are significantly slower than in PV+ interneurons, with a gradual decrease in pyramidal neurons and a gradual increase in PV+ cells across development. Our study reveals distinct developmental trajectories of Ih in pyramidal neurons and PV+ interneurons. The cell-type specific alteration of Ih during the critical period from juvenile to adolescence reflects the contribution of Ih to the maturation of the PFC and of PFC-dependent function. These findings are essential for a better understanding of normal PFC function and for elucidating Ih's crucial role in the pathophysiology of neurodevelopmental disorders.

  13. Positive predictive values of the International Classification of Disease, 10th edition diagnoses codes for diverticular disease in the Danish National Registry of Patients

    Directory of Open Access Journals (Sweden)

    Rune Erichsen

    2010-10-01

    Full Text Available Rune Erichsen1, Lisa Strate2, Henrik Toft Sørensen1, John A Baron3. 1Department of Clinical Epidemiology, Aarhus University Hospital, Denmark; 2Division of Gastroenterology, University of Washington, Seattle, WA, USA; 3Departments of Medicine and of Community and Family Medicine, Dartmouth Medical School, NH, USA. Objective: To investigate the accuracy of diagnostic coding for diverticular disease in the Danish National Registry of Patients (NRP). Study design and setting: At Aalborg Hospital, Denmark, with a catchment area of 640,000 inhabitants, we identified 100 patients recorded in the NRP with a diagnosis of diverticular disease (International Classification of Disease codes, 10th revision [ICD-10] K572–K579) during the 1999–2008 period. We assessed the positive predictive value (PPV) as a measure of the accuracy of discharge codes for diverticular disease, using information from discharge abstracts and outpatient notes as the reference standard. Results: Of the 100 patients coded with diverticular disease, 49 had complicated diverticular disease, whereas 51 had uncomplicated diverticulosis. For the overall diagnosis of diverticular disease (K57), the PPV was 0.98 (95% confidence interval [CI]: 0.93, 0.99). For the more detailed subgroups of diagnosis indicating the presence or absence of complications (K573–K579), the PPVs ranged from 0.67 (95% CI: 0.09, 0.99) to 0.92 (95% CI: 0.52, 1.00). The diagnosis codes did not allow accurate identification of uncomplicated disease or any specific complication. However, the combined ICD-10 codes K572, K574, and K578 had a PPV of 0.91 (95% CI: 0.71, 0.99) for any complication. Conclusion: The diagnosis codes in the NRP can be used to identify patients with diverticular disease in general; however, they do not accurately discern patients with uncomplicated diverticulosis or with specific diverticular complications. Keywords: diverticulum, colon, diverticulitis, validation studies
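A PPV is simply TP / (TP + FP), and the study's headline figure can be reproduced with a few lines of arithmetic: a Wilson score interval for 98 confirmed cases out of 100 matches the reported (0.93, 0.99). A minimal sketch (the 98/2 split is inferred from the reported PPV of 0.98 on 100 patients, not stated explicitly in the abstract):

```python
import math

def ppv_with_wilson_ci(tp, fp, z=1.96):
    """Positive predictive value TP/(TP+FP) with a 95% Wilson score interval."""
    n = tp + fp
    p = tp / n
    centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / (1 + z**2 / n)
    return p, centre - half, centre + half

# 98 of 100 coded patients confirmed against the reference standard (inferred)
ppv, lo, hi = ppv_with_wilson_ci(tp=98, fp=2)
print(round(ppv, 2), round(lo, 2), round(hi, 2))  # → 0.98 0.93 0.99
```

The Wilson interval is one reasonable choice here; the authors may have used an exact (Clopper-Pearson) interval, which gives nearly the same bounds at this sample size.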

  14. Effect of cortisol on gonadotropin inhibitory hormone (GnIH) in the cinnamon clownfish, Amphiprion melanopus.

    Science.gov (United States)

    Choi, Young Jae; Habibi, Hamid R; Kil, Gyung-Suk; Jung, Min-Min; Choi, Cheol Young

    2017-04-01

    Hypothalamic peptides, gonadotropin-releasing hormone (GnRH) and gonadotropin inhibitory hormone (GnIH), play pivotal roles in the control of reproduction and gonadal maturation in fish. In the present study we tested the possibility that stress-mediated reproductive dysfunction in teleost may involve changes in GnRH and GnIH activity. We studied expression of brain GnIH, GnIH-R, seabream GnRH (sbGnRH), as well as circulating levels of follicle stimulating hormone (FSH), and luteinizing hormone (LH) in the cinnamon clownfish, Amphiprion melanopus. Treatment with cortisol increased GnIH mRNA level, but reduced sbGnRH mRNA and circulating levels of LH and FSH in cinnamon clownfish. Using double immunofluorescence staining, we found expression of both GnIH and GnRH in the diencephalon region of cinnamon clownfish brain. These findings support the hypothesis that cortisol, an indicator of stress, affects reproduction, in part, by increasing GnIH in cinnamon clownfish which contributes to hypothalamic suppression of reproductive function in A. melanopus, a protandrous hermaphroditic fish. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. A New Coding System for Metabolic Disorders Demonstrates Gaps in the International Disease Classifications ICD-10 and SNOMED-CT, Which Can Be Barriers to Genotype-Phenotype Data Sharing

    NARCIS (Netherlands)

    Sollie, Annet; Sijmons, Rolf H.; Lindhout, Dick; van der Ploeg, Ans T.; Gozalbo, M. Estela Rubio; Smit, G. Peter A.; Verheijen, Frans; Waterham, Hans R.; van Weely, Sonja; Wijburg, Frits A.; Wijburg, Rudolph; Visser, Gepke

    Data sharing is essential for a better understanding of genetic disorders. Good phenotype coding plays a key role in this process. Unfortunately, the two most widely used coding systems in medicine, ICD-10 and SNOMED-CT, lack information necessary for the detailed classification and annotation of

  16. Reversible pressure-induced crystal-amorphous structural transformation in ice Ih

    Science.gov (United States)

    English, Niall J.; Tse, John S.

    2014-08-01

    Molecular dynamics (MD) simulation of depressurised high-density amorphous ice (HDA) at 80 K and at negative pressures has been performed. Over several attempts, HDA recrystallised to a form close to hexagonal ice Ih, albeit with some defects. The results support the hypothesis that compression of ice-Ih to HDA is a reversible first-order phase transition, with a large hysteresis. Therefore, it would appear that LDA is not truly amorphous. The elastic energy estimated from the area of the hysteresis loop is ca. 4.5 kJ/mol, in some way consistent with experimentally-determined accumulated successive heats of transformations from recovered HDA → ice Ih.

  17. Review: Regulatory mechanisms of gonadotropin-inhibitory hormone (GnIH) synthesis and release in photoperiodic animals

    Directory of Open Access Journals (Sweden)

    Kazuyoshi eTsutsui

    2013-04-01

    Full Text Available Gonadotropin-inhibitory hormone (GnIH) is a novel hypothalamic neuropeptide that was discovered in quail as an inhibitory factor for gonadotropin release. GnIH inhibits gonadotropin synthesis and release in birds through actions on gonadotropin-releasing hormone (GnRH) neurons and gonadotropes, mediated via the GnIH receptor (GnIH-R, GPR147). Subsequently, GnIH was identified in mammals and other vertebrates. As in birds, mammalian GnIH inhibits gonadotropin secretion, indicating a conserved role for this neuropeptide in the control of the hypothalamic-pituitary-gonadal (HPG) axis across species. Identification of the regulatory mechanisms governing GnIH expression and release is important in understanding the physiological role of the GnIH system. A nocturnal hormone, melatonin, appears to act directly on GnIH neurons through its receptor to induce expression and release of GnIH in quail, a photoperiodic bird. Recently, a similar, but opposite, action of melatonin on the inhibition of expression of mammalian GnIH was shown in hamsters and sheep, photoperiodic mammals. These results in photoperiodic animals demonstrate that GnIH expression is photoperiodically modulated via a melatonin-dependent process. Recent findings indicate that GnIH may be a mediator of stress-induced reproductive disruption in birds and mammals, pointing to a broad role for this neuropeptide in assessing physiological state and modifying reproductive effort accordingly. This paper summarizes the advances made in our knowledge regarding the regulation of GnIH synthesis and release in photoperiodic birds and mammals. This paper also discusses the neuroendocrine integration of environmental signals, such as photoperiods and stress, and internal signals, such as GnIH, melatonin and glucocorticoids, to control avian and mammalian reproduction.

  18. Comparison study on flexible pavement design using FAA (Federal Aviation Administration) and LCN (Load Classification Number) code in Ahmad Yani international airport’s runway

    Science.gov (United States)

    Santoso, S. E.; Sulistiono, D.; Mawardi, A. F.

    2017-11-01

    The FAA code for airport design has been broadly used by the Indonesian Ministry of Aviation for decades. However, there has been little comprehensive study of its relevance and efficiency under current conditions in Indonesia, so a comparison study on flexible pavement design for airport runways using comparable methods has become essential. The main focus of this study is to determine which method, FAA or LCN, offers the more efficient and effective approach to runway pavement planning. The two methods differ mainly in the variables they use: the FAA code works from the aircraft's maximum take-off weight and annual departures, whilst the LCN code uses the equivalent single wheel load and tire pressure. Based on the variables mentioned above, a classification and rating method is used to determine which code is best implemented. According to the analysis, the FAA method is the more effective way to plan runway design in Indonesia, giving a total pavement thickness of 127 cm against 70 cm for the LCN method. Although the FAA pavement is thicker than the LCN one, its relevance to sustainable and pristine future conditions is an essential aspect to consider in design and planning.

  19. Revision, uptake and coding issues related to the open access Orchard Sports Injury Classification System (OSICS) versions 8, 9 and 10.1

    Science.gov (United States)

    Orchard, John; Rae, Katherine; Brooks, John; Hägglund, Martin; Til, Lluis; Wales, David; Wood, Tim

    2010-01-01

    The Orchard Sports Injury Classification System (OSICS) is one of the world’s most commonly used systems for coding injury diagnoses in sports injury surveillance systems. Its major strengths are that it has wide usage, has codes specific to sports medicine and that it is free to use. Literature searches and stakeholder consultations were made to assess the uptake of OSICS and to develop new versions. OSICS was commonly used in the sports of football (soccer), Australian football, rugby union, cricket and tennis. It is referenced in international papers in three sports and used in four commercially available computerised injury management systems. Suggested injury categories for the major sports are presented. New versions OSICS 9 (three digit codes) and OSICS 10.1 (four digit codes) are presented. OSICS is a potentially helpful component of a comprehensive sports injury surveillance system, but many other components are required. Choices made in developing these components should ideally be agreed upon by groups of researchers in consensus statements. PMID:24198559

  20. Nomenclature for congenital and paediatric cardiac disease: the International Paediatric and Congenital Cardiac Code (IPCCC) and the Eleventh Iteration of the International Classification of Diseases (ICD-11).

    Science.gov (United States)

    Franklin, Rodney C G; Béland, Marie J; Colan, Steven D; Walters, Henry L; Aiello, Vera D; Anderson, Robert H; Bailliard, Frédérique; Boris, Jeffrey R; Cohen, Meryl S; Gaynor, J William; Guleserian, Kristine J; Houyel, Lucile; Jacobs, Marshall L; Juraszek, Amy L; Krogmann, Otto N; Kurosawa, Hiromi; Lopez, Leo; Maruszewski, Bohdan J; St Louis, James D; Seslar, Stephen P; Srivastava, Shubhika; Stellin, Giovanni; Tchervenkov, Christo I; Weinberg, Paul M; Jacobs, Jeffrey P

    2017-12-01

    An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many "short list" versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various "short lists". In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the "short list" for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses

  1. FFT-enhanced IHS transform method for fusing high-resolution satellite images

    Science.gov (United States)

    Ling, Y.; Ehlers, M.; Usery, E.L.; Madden, M.

    2007-01-01

    Existing image fusion techniques such as the intensity-hue-saturation (IHS) transform and principal components analysis (PCA) methods may not be optimal for fusing the new generation commercial high-resolution satellite images such as Ikonos and QuickBird. One problem is color distortion in the fused image, which causes visual changes as well as spectral differences between the original and fused images. In this paper, a fast Fourier transform (FFT)-enhanced IHS method is developed for fusing new generation high-resolution satellite images. This method combines a standard IHS transform with FFT filtering of both the panchromatic image and the intensity component of the original multispectral image. Ikonos and QuickBird data are used to assess the FFT-enhanced IHS transform method. Experimental results indicate that the FFT-enhanced IHS transform method may improve upon the standard IHS transform and the PCA methods in preserving spectral and spatial information. © 2006 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).
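The mechanics described above reduce to: take the intensity component of the multispectral image, low-pass it with an FFT filter, add the high-pass detail of the panchromatic image, and invert the transform. The sketch below is a simplified stand-in for the authors' method, not their exact design: it assumes a mean-of-bands intensity, an ideal circular frequency mask, and co-registered inputs at the same resolution.

```python
import numpy as np

def fft_lowpass(img, cutoff):
    """Keep only low spatial frequencies (ideal circular mask of relative radius `cutoff`)."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    y, x = np.ogrid[:h, :w]
    mask = (y - h / 2) ** 2 + (x - w / 2) ** 2 <= (cutoff * min(h, w) / 2) ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

def fft_ihs_fuse(ms, pan, cutoff=0.15):
    """ms: (H, W, 3) multispectral resampled to pan resolution; pan: (H, W)."""
    intensity = ms.mean(axis=2)                # crude intensity component of IHS
    low = fft_lowpass(intensity, cutoff)       # spectral (low-frequency) content of MS
    high = pan - fft_lowpass(pan, cutoff)      # spatial detail (high frequencies) of pan
    new_i = low + high                         # fused intensity component
    # inverse of the additive IHS model: push the intensity change into each band
    return ms + (new_i - intensity)[..., None]
```

Because the DC term survives the low-pass mask, the per-band means of the fused image stay close to those of the original multispectral image, which is the spectral-preservation property the paper measures.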

  2. Classification of headache on the basis of the IHS diagnostic criteria

    NARCIS (Netherlands)

    de Bruijn-Kofman, AT; van de Wiel, H; Sorbi, MJ

    1999-01-01

    Objective: To test the effects of a mass-media behavioral treatment program on migraine and tension-type headache, patients with pure migraine and with pure tension-type headache were to be selected. Patient Selection: A random sample of 233 headache sufferers from the 15,000 subscribers to the program.

  3. Classification of multispectral or hyperspectral satellite imagery using clustering of sparse approximations on sparse representations in learned dictionaries obtained using efficient convolutional sparse coding

    Science.gov (United States)

    Moody, Daniela; Wohlberg, Brendt

    2018-01-02

    An approach for land cover classification, seasonal and yearly change detection and monitoring, and identification of changes in man-made features may use a clustering of sparse approximations (CoSA) on sparse representations in learned dictionaries. The learned dictionaries may be derived using efficient convolutional sparse coding to build multispectral or hyperspectral, multiresolution dictionaries that are adapted to regional satellite image data. Sparse image representations of images over the learned dictionaries may be used to perform unsupervised k-means clustering into land cover categories. The clustering process behaves as a classifier in detecting real variability. This approach may combine spectral and spatial textural characteristics to detect geologic, vegetative, hydrologic, and man-made features, as well as changes in these features over time.
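The pipeline above, sparse coding against a learned dictionary followed by k-means clustering of the coefficient vectors, can be illustrated on toy data. This sketch substitutes scikit-learn's patch-based DictionaryLearning for the efficient convolutional sparse coding the record describes, and all data and parameters are illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
# toy stand-in for pixel/patch features drawn from two "land cover" classes
X = np.vstack([rng.normal(0.0, 1.0, (100, 12)),
               rng.normal(3.0, 1.0, (100, 12))])

# learn a small dictionary and compute sparse approximation coefficients
dico = DictionaryLearning(n_components=8, alpha=1.0, max_iter=20, random_state=0)
codes = dico.fit_transform(X)

# unsupervised k-means clustering of the sparse codes into categories,
# mirroring the CoSA step of the approach described above
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(codes)
```

For real multispectral or hyperspectral imagery the features would be image patches per band, and a convolutional dictionary would replace the patch dictionary, but the encode-then-cluster structure is the same.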

  4. Decoding the encoding of functional brain networks: An fMRI classification comparison of non-negative matrix factorization (NMF), independent component analysis (ICA), and sparse coding algorithms.

    Science.gov (United States)

    Xie, Jianwen; Douglas, Pamela K; Wu, Ying Nian; Brody, Arthur L; Anderson, Ariana E

    2017-04-15

    Brain networks in fMRI are typically identified using spatial independent component analysis (ICA), yet other mathematical constraints provide alternate biologically-plausible frameworks for generating brain networks. Non-negative matrix factorization (NMF) would suppress negative BOLD signal by enforcing positivity. Spatial sparse coding algorithms (L1 Regularized Learning and K-SVD) would impose local specialization and a discouragement of multitasking, where the total observed activity in a single voxel originates from a restricted number of possible brain networks. The assumptions of independence, positivity, and sparsity to encode task-related brain networks are compared; the resulting brain networks within scan for different constraints are used as basis functions to encode observed functional activity. These encodings are then decoded using machine learning, by using the time series weights to predict within scan whether a subject is viewing a video, listening to an audio cue, or at rest, in 304 fMRI scans from 51 subjects. The sparse coding algorithm of L1 Regularized Learning outperformed four variations of ICA and the other coding algorithms. Holding constant the effect of the extraction algorithm, encodings using sparser spatial networks (containing more zero-valued voxels) had higher classification accuracy. The performance of the sparse coding algorithms suggests that algorithms which enforce sparsity, discourage multitasking, and promote local specialization may capture the underlying source processes better than those which allow inexhaustible local processes such as ICA. Negative BOLD signal may capture task-related activations. Copyright © 2017 Elsevier B.V. All rights reserved.
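The encode-then-decode idea can be sketched on synthetic data: factorize the time-by-voxel matrix, then feed the per-timepoint network weights to a classifier. This toy version uses scikit-learn's NMF and FastICA as stand-ins for the paper's pipeline (no L1 Regularized Learning or K-SVD here, and the "fMRI" data are random):

```python
import numpy as np
from sklearn.decomposition import NMF, FastICA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# toy "scan": 120 time points x 50 voxels; a hidden binary state shifts activity
states = rng.integers(0, 2, 120)
X = rng.random((120, 50)) + np.outer(states, np.linspace(0.0, 1.0, 50))

results = {}
for name, model in [("NMF", NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)),
                    ("ICA", FastICA(n_components=5, random_state=0))]:
    W = model.fit_transform(X)          # per-timepoint weights of each "network"
    # decode the encoding: predict the hidden state from the network weights
    acc = cross_val_score(LogisticRegression(max_iter=1000), W, states, cv=5).mean()
    results[name] = acc
    print(name, round(acc, 2))
```

Note that NMF requires non-negative input, which the synthetic data satisfy by construction; real BOLD time series would need shifting or clipping first, which is exactly the positivity constraint the abstract discusses.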

  5. Feedback on mathematics homework (Povratna informacija pri matematičnih domačih nalogah)

    OpenAIRE

    Žitko, Urša

    2017-01-01

    This thesis examines feedback on mathematics homework. The theoretical part presents homework: its history, its purpose, and several classifications of homework types. Homework is meant to help pupils master and consolidate the material covered. It allows pupils to practise applying procedures they already know and to acquire new skills for further understanding of the material. Through regular homework, pupils learn conscientious and...

  6. Psychometric qualities of the Social Skills Inventory (IHS): a study of its temporal stability and concomitant validity

    Directory of Open Access Journals (Sweden)

    Marina Bandeira

    2000-12-01

    Full Text Available This research investigates the psychometric properties of the Social Skills Inventory (IHS) in terms of its concomitant validity and its reliability or temporal stability. A sample of 104 psychology students participated in this research, to whom two scales were applied: the IHS and the Rathus Assertiveness Scale. The results showed a significant correlation between the IHS and the Rathus Scale scores. The IHS was also reapplied to a randomized sub-sample of 39 students, and this test-retest application likewise showed a significant correlation between the two sets of scores. The results indicate that the IHS has concomitant validity and reliability or temporal stability. These conclusions complement the results of previous studies on the psychometric qualities of the IHS and recommend the use of this scale for evaluating the social skills of college students in the Brazilian context.

  7. Correlation between patients' reasons for encounters/health problems and population density in Japan: a systematic review of observational studies coded by the International Classification of Health Problems in Primary Care (ICHPPC) and the International Classification of Primary care (ICPC).

    Science.gov (United States)

    Kaneko, Makoto; Ohta, Ryuichi; Nago, Naoki; Fukushi, Motoharu; Matsushima, Masato

    2017-09-13

    The Japanese health care system has yet to establish structured training for primary care physicians; therefore, physicians who received an internal medicine based training program continue to play a principal role in the primary care setting. To promote the development of a more efficient primary health care system, assessing its current status with regard to the spectrum of patients' reasons for encounters (RFEs) and health problems is an important step. Recognizing the proportions of patients' RFEs and health problems that are not generally covered by an internist can provide valuable information to promote the development of a primary care physician-centered system. We conducted a systematic review in which we searched six databases (PubMed, the Cochrane Library, Google Scholar, Ichushi-Web, JDreamIII and CiNii) for observational studies in Japan coded by the International Classification of Health Problems in Primary Care (ICHPPC) and the International Classification of Primary Care (ICPC) up to March 2015. We employed population density as an index of accessibility. We calculated Spearman's rank correlation coefficient to examine the correlation between the proportion of "non-internal medicine-related" RFEs and health problems in each study area and the population density. We found 17 studies with diverse designs and settings. Among these studies, "non-internal medicine-related" RFEs, which were not thought to be covered by internists, ranged from about 4% to 40%, and "non-internal medicine-related" health problems ranged from about 10% to 40%. However, no significant correlation was found between population density and the proportion of "non-internal medicine-related" RFEs and health problems. This is the first systematic review of RFEs and health problems coded by ICHPPC and ICPC undertaken to reveal the diversity of health problems in Japanese primary care. These results suggest that primary care physicians in some rural areas of Japan

  8. Critical Evaluation of Headache Classifications.

    Science.gov (United States)

    Özge, Aynur

    2013-08-01

    Transforming a subjective sensation like headache into an objective state, and establishing a common language for a complaint that can be both a symptom and a disease in itself, have kept investigators busy for years. Each proposed recommendation has brought along a set of patients who do not meet its criteria. Even as work has continued toward an almost ideal and fully comprehensive classification, criticisms have been raised that it is drifting away from daily practice. This article summarizes the classification efforts of scientists working in the area of headache. More specifically, the two classifications produced by the International Headache Society (IHS), and the current state of the third classification still being worked on, are discussed together with headache subtypes. It is presented with the wish and belief that it will be of use to readers and young investigators interested in this subject.

  9. Classification of radiological procedures

    International Nuclear Information System (INIS)

    1989-01-01

    A classification for departments in Danish hospitals which use radiological procedures. The classification codes consist of 4 digits, of which the first 2 denote the main groups: the first digit represents the procedure's topographical object and the second the technique. The last 2 digits specify individual procedures. (CLS)

  10. 42 CFR 137.275 - May Self-Governance Tribes include IHS construction programs in a construction project agreement...

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false May Self-Governance Tribes include IHS construction... OF HEALTH AND HUMAN SERVICES TRIBAL SELF-GOVERNANCE Construction Purpose and Scope § 137.275 May Self-Governance Tribes include IHS construction programs in a construction project agreement or in a funding...

  11. Complete genome sequence of Pseudomonas rhizosphaerae IH5T (=DSM 16299T), a phosphate-solubilizing rhizobacterium for bacterial biofertilizer.

    Science.gov (United States)

    Kwak, Yunyoung; Jung, Byung Kwon; Shin, Jae-Ho

    2015-01-10

    Pseudomonas rhizosphaerae IH5(T) (=DSM 16299(T)), isolated from the rhizospheric soil of grass growing in Spain, has been reported as a novel species of the genus Pseudomonas harboring insoluble-phosphorus-solubilizing activity. To better understand this multifunctional biofertilizer, we report the complete genome sequence of P. rhizosphaerae IH5(T). Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Byamukama, Abdul [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Jung, Haiyong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-10-15

    Radioactive materials are utilized in industry, agriculture and research, and in medical facilities and academic institutions, for numerous purposes that are useful in the daily life of mankind. To manage radioactive waste effectively and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions and enforce compliance, and communicate effectively with stakeholders, among other things. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for classifying the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those for Australia and were found to correlate, given the differences in site parameters and in the consumption habits of residents in the two countries.

  13. Developing A Specific Criteria For Categorization Of Radioactive Waste Classification System For Uganda Using The Radar's Computer Code

    International Nuclear Information System (INIS)

    Byamukama, Abdul; Jung, Haiyong

    2014-01-01

    Radioactive materials are utilized in industry, agriculture and research, and in medical facilities and academic institutions, for numerous purposes that are useful in the daily life of mankind. To manage radioactive waste effectively and select appropriate disposal schemes, it is imperative to have specific criteria for allocating radioactive waste to a particular waste class. Uganda has a radioactive waste classification scheme based on activity concentration and half-life, albeit in qualitative terms, as documented in the Uganda Atomic Energy Regulations 2012. There is no clear boundary between the different waste classes, which makes it difficult to suggest disposal options, make decisions and enforce compliance, and communicate effectively with stakeholders, among other things. To overcome these challenges, the RESRAD computer code was used to derive specific criteria for classifying the different waste categories for Uganda based on the activity concentration of radionuclides. The results were compared with those for Australia and were found to correlate, given the differences in site parameters and in the consumption habits of residents in the two countries.

  14. Transforming wealth: using the inverse hyperbolic sine (IHS) and splines to predict youth's math achievement.

    Science.gov (United States)

    Friedline, Terri; Masa, Rainier D; Chowa, Gina A N

    2015-01-01

    The natural log and categorical transformations commonly applied to wealth for meeting the statistical assumptions of research may not always be appropriate for adjusting for skewness, given wealth's unique properties. Finding and applying appropriate transformations is becoming increasingly important as researchers consider wealth as a predictor of well-being. We present an alternative transformation, the inverse hyperbolic sine (IHS), for simultaneously dealing with skewness and accounting for wealth's unique properties. Using the relationship between household wealth and youth's math achievement as an example, we apply the IHS transformation to wealth data from US and Ghanaian households. We also explore non-linearity and accumulation thresholds by combining IHS transformed wealth with splines. IHS transformed wealth relates to youth's math achievement similarly when compared to categorical and natural log transformations, indicating that it is a viable alternative to other transformations commonly used in research. Non-linear relationships and accumulation thresholds emerge that predict youth's math achievement when splines are incorporated. In US households, accumulating debt relates to decreases in math achievement whereas accumulating assets relates to increases in math achievement. In Ghanaian households, accumulating assets between the 25th and 50th percentiles relates to increases in youth's math achievement. Copyright © 2014 Elsevier Inc. All rights reserved.
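The transform itself is a one-liner, IHS(x) = asinh(θx)/θ = ln(θx + sqrt(θ²x² + 1))/θ: it behaves like a log for large balances, is roughly linear near zero, and, unlike log(x), is defined for zero and negative (debt) values. A minimal sketch with an illustrative spline knot (the wealth values and the knot choice are made up, not from the paper):

```python
import numpy as np

def ihs(x, theta=1.0):
    # inverse hyperbolic sine transform: log-like for large |x|, roughly linear
    # near zero, and, unlike log(x), defined for zero and negative (debt) values
    return np.arcsinh(theta * np.asarray(x, dtype=float)) / theta

wealth = np.array([-5000.0, 0.0, 250.0, 120000.0])   # illustrative dollar values
tw = ihs(wealth)

# a linear spline term lets the regression slope change past a knot (here the
# sample median), one way to model accumulation thresholds like those reported
knot = np.median(tw)
spline_term = np.maximum(0.0, tw - knot)   # would enter a regression alongside tw
```

For large positive x, IHS(x) approaches ln(2x), which is why coefficients on IHS-transformed wealth are often read like log-wealth coefficients.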

  15. Multipole moments of water molecules in clusters and ice Ih from first principles calculations

    International Nuclear Information System (INIS)

    Batista, E.R.; Xantheas, S.S.; Jonsson, H.

    1999-01-01

    We have calculated molecular multipole moments for water molecules in clusters and in ice Ih by partitioning the charge density obtained from first principles calculations. Various schemes for dividing the electronic charge density among the water molecules were used. They include Bader's zero flux surfaces and Voronoi partitioning schemes. A comparison was also made with an induction model including dipole, dipole-quadrupole, quadrupole-quadrupole polarizability and first hyperpolarizability as well as fixed octopole and hexadecapole moments. We have found that the different density partitioning schemes lead to widely different values for the molecular multipoles, illustrating how poorly defined molecular multipoles are in clusters and condensed environments. For instance, the magnitude of the molecular dipole moment in ice Ih ranges between 2.3 D and 3.1 D depending on the partitioning scheme used. Within each scheme, though, the value for the molecular dipole moment in ice is larger than in the hexamer. The magnitude of the molecular dipole moment in the clusters shows a monotonic increase from the gas phase value to the one in ice Ih, with the molecular dipole moment in the water ring hexamer being smaller than the one in ice Ih for all the partitioning schemes used. copyright 1999 American Institute of Physics

  16. Remote Sensing Image Fusion Based on the Combination Grey Absolute Correlation Degree and IHS Transform

    Directory of Open Access Journals (Sweden)

    Hui LIN

    2014-12-01

    An improved fusion algorithm for multi-source remote sensing images with high spatial resolution and multi-spectral capacity is proposed, based on traditional IHS fusion and grey correlation analysis. First, the grey absolute correlation degree is used to discriminate non-edge pixels from edge pixels in the high-spatial-resolution image, by which the weight of the intensity component is determined before combining it with the high-spatial-resolution image. Image fusion is then achieved using the inverse IHS transform. The proposed method is applied to ETM+ multi-spectral and panchromatic images, and to Quickbird multi-spectral and panchromatic images. The experiments show that the proposed fusion method efficiently preserves the spectral information of the original multi-spectral images while greatly enhancing spatial resolution. By comparison and analysis, the proposed algorithm outperforms both traditional IHS fusion and a fusion method based on grey correlation analysis and the IHS transform.
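
    For orientation, a minimal sketch of the classical fast IHS substitution baseline that the proposed algorithm builds on; the per-pixel edge weighting via grey absolute correlation degree is not reproduced here, and the arrays are random stand-ins for real imagery:

```python
import numpy as np

def ihs_fuse(ms, pan):
    """Fast IHS fusion: substitute the intensity of an RGB multispectral
    image with a mean/std-matched panchromatic band.
    ms: (H, W, 3) float array, pan: (H, W) float array."""
    intensity = ms.mean(axis=2)                 # I component of the linear IHS model
    # Match pan's statistics to the intensity so colours are roughly preserved
    pan_matched = (pan - pan.mean()) * (intensity.std() / (pan.std() + 1e-12)) + intensity.mean()
    # Adding the same offset to each band is equivalent to replacing I and
    # applying the inverse of the linear (triangle) IHS transform
    return ms + (pan_matched - intensity)[..., None]

rng = np.random.default_rng(0)
ms = rng.random((8, 8, 3))    # stand-in multispectral image
pan = rng.random((8, 8))      # stand-in panchromatic band
fused = ihs_fuse(ms, pan)
```

The improved method in the abstract would, in effect, blend `intensity` and `pan_matched` with per-pixel weights at edges instead of substituting wholesale.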

  17. Photoabsorption of the molecular IH cation at the iodine 3 d absorption edge

    Science.gov (United States)

    Klumpp, Stephan; Guda, Alexander A.; Schubert, Kaja; Mertens, Karolin; Hellhund, Jonas; Müller, Alfred; Schippers, Stefan; Bari, Sadia; Martins, Michael

    2018-03-01

    Yields of atomic iodine Iq+ (q ≥ 2) fragments resulting from photoexcitation and photoionization of the target ions IH+ and I+ have been measured in the photon-energy range 610-680 eV, which comprises the thresholds for iodine 3d ionization. The measured ion-yield spectra show two strong and broad resonance features due to the excitation of the 3d3/2,5/2 electrons into εf states, rather similar for both parent ions. In the 3d pre-edge range, excitations into (npπ)-like orbitals and into an additional σ* orbital are found for IH+, which have been identified by comparison of the atomic I+ and molecular IH+ data and with the help of (time-dependent) density functional theory (DFT) and atomic Hartree-Fock calculations. The (5pπ) orbital is almost atomlike, whereas all other resonances of the IH+ primary ion show a more pronounced molecular character, which is deduced from the chemical shifts of the resonances and the theoretical analysis.

  18. Parents' Assessments of Disability in Their Children Using World Health Organization International Classification of Functioning, Disability and Health, Child and Youth Version Joined Body Functions and Activity Codes Related to Everyday Life

    DEFF Research Database (Denmark)

    Illum, Niels Ove; Gradel, Kim Oren

    2017-01-01

    AIM: To help parents assess disability in their own children using World Health Organization (WHO) International Classification of Functioning, Disability and Health, Child and Youth Version (ICF-CY) code qualifier scoring and to assess the validity and reliability of the data sets obtained. METHOD: Parents of 162 children with spina bifida, spinal muscular atrophy, muscular disorders, cerebral palsy, visual impairment, hearing impairment, mental disability, or disability following brain tumours performed scoring for 26 body functions qualifiers (b codes) and activities and participation qualifiers (d codes). Code qualifier infit mean square (MNSQ) had a mean of 1.01 and 1.00; the mean corresponding outfit MNSQ was 1.05 and 1.01. The ICF-CY code τ thresholds and category measures were continuous when assessed and reassessed by parents. Participating children had a mean of 56 code scores (range: 26-130) before and a mean of 55.9 scores (range: 25-125) after repeat.

  19. International Classification of Primary Care-2 coding of primary care data at the general out-patients' clinic of General Hospital, Lagos, Nigeria.

    Science.gov (United States)

    Olagundoye, Olawunmi Abimbola; van Boven, Kees; van Weel, Chris

    2016-01-01

    Primary care serves as an integral part of the health systems of nations, especially on the African continent: it is the portal of entry for nearly all patients into the health care system. Paucity of accurate data for health statistics remains a challenge in most parts of Africa because of inadequate technical manpower and infrastructure, and inadequate quality of data systems contributes to inaccurate data. A simple-to-use classification system such as the International Classification of Primary Care (ICPC) may be a solution to this problem at the primary care level. The aims were to apply ICPC-2 for secondary coding of reasons for encounter (RfE), problems managed and processes of care in a Nigerian primary care setting, and furthermore to analyze the value of selected presenting symptoms as predictors of the most common diagnoses encountered in the study setting. Content analysis of randomly selected patients' paper records was used for data collection at the end of clinic sessions conducted by family physicians at the general out-patients' clinics. Contents of clinical consultations were secondarily coded with the ICPC-2 and recorded into Excel spreadsheets with fields for sociodemographic data such as age, sex, occupation and religion, and the ICPC elements of an encounter: RfE/complaints, diagnoses/problems, and interventions/processes of care. The 401 encounters considered in this study yielded 915 RfEs, 546 diagnoses, and 1221 processes, implying an average of 2.3 RfEs, 1.4 diagnoses, and 3.0 processes per encounter. The top 10 RfEs, diagnoses/common illnesses, and processes were determined. Through the determination of the probability of the occurrence of certain diseases beginning with an RfE/complaint, the top five diagnoses that resulted from each of the top five RfEs were also obtained. The top five RfEs were: headache, fever, pain general/multiple sites, visual disturbance other, and abdominal pain/cramps general. The top five diagnoses were: Malaria, hypertension
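
    The "top diagnoses per reason for encounter" tabulation described above amounts to estimating conditional frequencies P(diagnosis | RfE) from coded encounters. A toy illustration with invented records (not the study's data):

```python
from collections import Counter, defaultdict

# Each record pairs a coded reason for encounter (RfE) with the diagnosis
# made at that encounter; values here are invented for illustration.
encounters = [
    ("fever", "malaria"), ("fever", "malaria"), ("fever", "URTI"),
    ("headache", "malaria"), ("headache", "hypertension"),
    ("headache", "hypertension"), ("headache", "tension headache"),
]

by_rfe = defaultdict(Counter)
for rfe, diagnosis in encounters:
    by_rfe[rfe][diagnosis] += 1

def top_diagnoses(rfe, n=5):
    """Rank diagnoses following a given RfE with their estimated probability."""
    counts = by_rfe[rfe]
    total = sum(counts.values())
    return [(dx, c / total) for dx, c in counts.most_common(n)]
```

For instance, `top_diagnoses("fever")` ranks malaria first with probability 2/3 in this toy data set.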

  20. A new coding system for metabolic disorders demonstrates gaps in the international disease classifications ICD-10 and SNOMED-CT, which can be barriers to genotype-phenotype data sharing.

    Science.gov (United States)

    Sollie, Annet; Sijmons, Rolf H; Lindhout, Dick; van der Ploeg, Ans T; Rubio Gozalbo, M Estela; Smit, G Peter A; Verheijen, Frans; Waterham, Hans R; van Weely, Sonja; Wijburg, Frits A; Wijburg, Rudolph; Visser, Gepke

    2013-07-01

    Data sharing is essential for a better understanding of genetic disorders. Good phenotype coding plays a key role in this process. Unfortunately, the two most widely used coding systems in medicine, ICD-10 and SNOMED-CT, lack information necessary for the detailed classification and annotation of rare and genetic disorders. This prevents the optimal registration of such patients in databases and thus data-sharing efforts. To improve care and to facilitate research for patients with metabolic disorders, we developed a new coding system for metabolic diseases with a dedicated group of clinical specialists. Next, we compared the resulting codes with those in ICD and SNOMED-CT. No matches were found in 76% of cases in ICD-10 and in 54% in SNOMED-CT. We conclude that there are sizable gaps in the SNOMED-CT and ICD coding systems for metabolic disorders. There may be similar gaps for other classes of rare and genetic disorders. We have demonstrated that expert groups can help in addressing such coding issues. Our coding system has been made available to the ICD and SNOMED-CT organizations as well as to the Orphanet and HPO organizations for further public application and updates will be published online (www.ddrmd.nl and www.cineas.org). © 2013 WILEY PERIODICALS, INC.

  1. Path-integral simulation of ice Ih: The effect of pressure

    Science.gov (United States)

    Herrero, Carlos P.; Ramírez, Rafael

    2011-12-01

    The effect of pressure on structural and thermodynamic properties of ice Ih has been studied by means of path-integral molecular dynamics simulations at temperatures between 50 and 300 K. Interatomic interactions were modeled by using the effective q-TIP4P/F potential for flexible water. Positive (compression) and negative (tension) pressures have been considered, which allowed us to approach the limits for the mechanical stability of this solid water phase. We have studied the pressure dependence of the crystal volume, bulk modulus, interatomic distances, atomic delocalization, and kinetic energy. The spinodal point at both negative and positive pressures is derived from the vanishing of the bulk modulus. For P300 K. At positive pressure the spinodal is associated with ice amorphization, and at low temperatures it is found to be between 1.1 and 1.3 GPa. Quantum nuclear effects cause a reduction of the metastability region of ice Ih.
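
    The spinodal determination described above rests on the bulk modulus B = -V dP/dV vanishing. A small numerical sketch of locating B = 0, using a synthetic toy equation of state rather than the paper's simulation data:

```python
import numpy as np

def bulk_modulus(volumes, pressures):
    """B = -V dP/dV via a finite-difference derivative of P(V)."""
    dPdV = np.gradient(pressures, volumes)
    return -volumes * dPdV

# Toy P(V) (arbitrary units) chosen so that B changes sign twice,
# mimicking spinodal points at both tension and compression
V = np.linspace(0.8, 1.4, 601)
P = (V - 1.0) ** 3 - 0.05 * (V - 1.0)

B = bulk_modulus(V, P)
# Spinodal volumes: grid intervals where B changes sign
spinodal_idx = np.where(np.diff(np.sign(B)) != 0)[0]
```

The mechanically stable branch is the region where B > 0; the two sign changes bracket it, analogous to the negative- and positive-pressure spinodals of ice Ih.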

  2. Long Term Processing Using Integrated Hydropyrolysis plus Hydroconversion (IH2) for the Production of Gasoline and Diesel from Biomass

    Energy Technology Data Exchange (ETDEWEB)

    Marker, Terry [Gas Technology Institute; Roberts, Michael [Gas Technology Institute; Linck, Martin [Gas Technology Institute; Felix, Larry [Gas Technology Institute; Ortiz-Toral, Pedro [Gas Technology Institute; Wangerow, Jim [Gas Technology Institute; McLeod, Celeste [CRI Catalyst; Del Paggio, Alan [CRI Catalyst; Gephart, John [Johnson Timber; Starr, Jack [Cargill; Hahn, John [Cargill

    2013-06-09

    Cellulosic and woody biomass can be directly converted to hydrocarbon gasoline and diesel blending components through the use of a new, economical technology named integrated hydropyrolysis plus hydroconversion (IH2). The IH2 gasoline and diesel blending components are fully compatible with petroleum-based gasoline and diesel, contain less than 1% oxygen, and have a total acid number (TAN) of less than 1. The IH2 gasoline is high quality and very close to a drop-in fuel. Life cycle analysis (LCA) shows that using the IH2 process to convert wood to gasoline and diesel results in a greater than 90% reduction in greenhouse gas emissions compared with fossil-derived fuels. The technoeconomic analysis showed that converting wood using the IH2 process can produce gasoline and diesel at less than $2.00/gallon. In this project, the previously reported semi-continuous small-scale IH2 test results were confirmed in a continuous 50 kg/day pilot plant. The continuous IH2 pilot plant used in this project was operated round the clock for over 750 hours and showed good operability while consistently producing 26-28 wt% yields of high-quality gasoline and diesel product. The IH2 catalyst showed good stability, although more work on catalyst stability is recommended. Additional work is needed to commercialize the IH2 technology, including running large-particle-size biomass, modeling the hydropyrolysis step, studying the effects of process variables, and building and operating a 1-50 ton/day demonstration-scale plant. IH2 is a true game-changing technology, utilizing U.S. domestic renewable biomass resources to create transportation fuels sufficient in quantity and quality to substantially reduce our reliance on foreign crude oil. Thus, the IH2 technology offers a path to genuine energy independence for the U.S., along with the creation of a significant number of new U.S. jobs to plant, grow, harvest, and process biomass crops into fungible

  3. Munitions Classification Library

    Science.gov (United States)

    2016-04-04

    members of the community to make their own additions to any, or all, of the classification libraries. The next phase entailed data collection over less... MUNITIONS CLASSIFICATION LIBRARY: Mr. Craig Murray, Parsons; Dr. Thomas H. Bell, Leidos. Final Report, 04/04/2016, covering August 2014 - August 2015.

  4. Parents' Assessments of Disability in Their Children Using World Health Organization International Classification of Functioning, Disability and Health, Child and Youth Version Joined Body Functions and Activity Codes Related to Everyday Life.

    Science.gov (United States)

    Illum, Niels Ove; Gradel, Kim Oren

    2017-01-01

    To help parents assess disability in their own children using World Health Organization (WHO) International Classification of Functioning, Disability and Health, Child and Youth Version (ICF-CY) code qualifier scoring and to assess the validity and reliability of the data sets obtained. Parents of 162 children with spina bifida, spinal muscular atrophy, muscular disorders, cerebral palsy, visual impairment, hearing impairment, mental disability, or disability following brain tumours performed scoring for 26 body functions qualifiers (b codes) and activities and participation qualifiers (d codes). Scoring was repeated after 6 months. Psychometric and Rasch data analysis was undertaken. The initial and repeated data had Cronbach α of 0.96 and 0.97, respectively. Inter-code correlation was 0.54 (range: 0.23-0.91) and 0.76 (range: 0.20-0.92). The corrected code-total correlations were 0.72 (range: 0.49-0.83) and 0.75 (range: 0.50-0.87). When repeated, the ICF-CY code qualifier scoring showed a correlation R of 0.90. Rasch analysis of the selected ICF-CY code data demonstrated a mean measure of 0.00 and 0.00, respectively. Code qualifier infit mean square (MNSQ) had a mean of 1.01 and 1.00. The mean corresponding outfit MNSQ was 1.05 and 1.01. The ICF-CY code τ thresholds and category measures were continuous when assessed and reassessed by parents. Participating children had a mean of 56 code scores (range: 26-130) before and a mean of 55.9 scores (range: 25-125) after repeat. Corresponding measures were -1.10 (range: -5.31 to 5.25) and -1.11 (range: -5.42 to 5.36), respectively. Based on measures obtained at the 2 occasions, the correlation coefficient R was 0.84. The child code map showed coherence of ICF-CY codes at each level. There was continuity in covering the range across disabilities. And, first and foremost, the distribution of codes reflected a true continuity in disability, with codes for motor functions activated first, then codes for cognitive functions
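
    The reliability figures quoted above (Cronbach α of 0.96 and 0.97) come from the standard internal-consistency formula, α = k/(k-1) · (1 - Σ item variances / variance of total score). A sketch with randomly generated item scores, not the ICF-CY data:

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, k_items) array.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
ability = rng.normal(size=(200, 1))                  # shared latent trait
items = ability + 0.5 * rng.normal(size=(200, 26))   # 26 correlated items
alpha = cronbach_alpha(items)                        # high alpha, as the items share one trait
```

Items driven by one common trait with modest noise, as simulated here, produce an alpha in the high range reported in the abstract.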

  5. Delicious Low GL space foods by using Low GI materials -IH and Vacuum cooking -

    Science.gov (United States)

    Katayama, Naomi; Nagasaka, Sanako; Murasaki, Masahiro; Space Agriculture Task Force, J.

    Adequate life-support systems are necessary for a long-term stay in space, and the management of meals for astronauts is particularly important: if an astronaut becomes seriously ill in outer space, it can be fatal. Delicious, well-balanced space foods are essential for astronauts' work. This study aimed at creating a balanced space-food menu for healthy life in space. Kitchen utensils are limited in the space environment, and the only way to heat food is an electric heater, without open flame. The purpose of this study was therefore to prepare space foods using a vacuum cooking device and an induction heating (IH) cooker. We designed a space-food menu with reference to the Japanese nutrition standard of 2010, using brown rice, wheat, soy bean, sweet potato and green vegetables, as well as loach and insects (silkworm pupa, snail, mud snail, termite, fly, grasshopper and bee). Ten healthy adults served as subjects and performed a sensory test by questionnaire, scoring taste, fragrance, colour and quantity out of ten points. The space foods we devised could be cooked deliciously with vacuum cooking and IH. The sensory evaluation gave a mean of eight points out of ten, showing that our space-food menu is palatable. Vacuum cooking also allows these space foods to be stored in a refrigerator for 20 days, an important result because surplus food can be preserved. We intend to develop more delicious space-food menus with vacuum cooking and the IH heater in the future.

  6. Ice Ih anomalies: Thermal contraction, anomalous volume isotope effect, and pressure-induced amorphization

    Science.gov (United States)

    Salim, Michael A.; Willow, Soohaeng Yoo; Hirata, So

    2016-05-01

    Ice Ih displays several anomalous thermodynamic properties such as thermal contraction at low temperatures, an anomalous volume isotope effect (VIE) rendering the volume of D2O ice greater than that of H2O ice, and a pressure-induced transition to the high-density amorphous (HDA) phase. Furthermore, the anomalous VIE increases with temperature, despite its quantum-mechanical origin. Here, embedded-fragment ab initio second-order many-body perturbation (MP2) theory in the quasiharmonic approximation (QHA) is applied to the Gibbs energy of an infinite, proton-disordered crystal of ice Ih at wide ranges of temperatures and pressures. The quantum effect of nuclei moving in anharmonic potentials is taken into account from first principles without any empirical or nonsystematic approximation to either the electronic or vibrational Hamiltonian. MP2 predicts quantitatively correctly the thermal contraction at low temperatures, which is confirmed to originate from the volume-contracting hydrogen-bond bending modes (acoustic phonons). It qualitatively reproduces (but underestimates) the thermal expansion at higher temperatures, caused by the volume-expanding hydrogen-bond stretching (and to a lesser extent librational) modes. The anomalous VIE is found to be the result of subtle cancellations among closely competing isotope effects on volume from all modes. Consequently, even ab initio MP2 with the aug-cc-pVDZ and aug-cc-pVTZ basis sets has difficulty reproducing this anomaly, yielding qualitatively varied predictions of the sign of the VIE depending on such computational details as the choice of the embedding field. However, the temperature growth of the anomalous VIE is reproduced robustly and is ascribed to the librational modes. These solid-state MP2 calculations, as well as MP2 Born-Oppenheimer molecular dynamics, find a volume collapse and a loss of symmetry and long-range order in ice Ih upon pressure loading of 2.35 GPa or higher. Concomitantly, rapid softening of

  7. Ice Ih anomalies: Thermal contraction, anomalous volume isotope effect, and pressure-induced amorphization

    Energy Technology Data Exchange (ETDEWEB)

    Salim, Michael A.; Willow, Soohaeng Yoo; Hirata, So, E-mail: sohirata@illinois.edu [Department of Chemistry, University of Illinois at Urbana-Champaign, 600 South Mathews Avenue, Urbana, Illinois 61801 (United States)

    2016-05-28

    Ice Ih displays several anomalous thermodynamic properties such as thermal contraction at low temperatures, an anomalous volume isotope effect (VIE) rendering the volume of D2O ice greater than that of H2O ice, and a pressure-induced transition to the high-density amorphous (HDA) phase. Furthermore, the anomalous VIE increases with temperature, despite its quantum-mechanical origin. Here, embedded-fragment ab initio second-order many-body perturbation (MP2) theory in the quasiharmonic approximation (QHA) is applied to the Gibbs energy of an infinite, proton-disordered crystal of ice Ih at wide ranges of temperatures and pressures. The quantum effect of nuclei moving in anharmonic potentials is taken into account from first principles without any empirical or nonsystematic approximation to either the electronic or vibrational Hamiltonian. MP2 predicts quantitatively correctly the thermal contraction at low temperatures, which is confirmed to originate from the volume-contracting hydrogen-bond bending modes (acoustic phonons). It qualitatively reproduces (but underestimates) the thermal expansion at higher temperatures, caused by the volume-expanding hydrogen-bond stretching (and to a lesser extent librational) modes. The anomalous VIE is found to be the result of subtle cancellations among closely competing isotope effects on volume from all modes. Consequently, even ab initio MP2 with the aug-cc-pVDZ and aug-cc-pVTZ basis sets has difficulty reproducing this anomaly, yielding qualitatively varied predictions of the sign of the VIE depending on such computational details as the choice of the embedding field. However, the temperature growth of the anomalous VIE is reproduced robustly and is ascribed to the librational modes. These solid-state MP2 calculations, as well as MP2 Born–Oppenheimer molecular dynamics, find a volume collapse and a loss of symmetry and long-range order in ice Ih upon pressure loading of 2.35 GPa or higher. Concomitantly, rapid

  8. Nuclear code abstracts (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-02-01

    Nuclear Code Abstracts is compiled in the Nuclear Code Committee to exchange information of the nuclear code developments among members of the committee. Enlarging the collection, the present one includes nuclear code abstracts obtained in 1975 through liaison officers of the organizations in Japan participating in the Nuclear Energy Agency's Computer Program Library at Ispra, Italy. The classification of nuclear codes and the format of code abstracts are the same as those in the library. (auth.)

  9. KWIC Index of nuclear codes (1975 edition)

    International Nuclear Information System (INIS)

    Akanuma, Makoto; Hirakawa, Takashi

    1976-01-01

    It is a KWIC Index for 254 nuclear codes in the Nuclear Code Abstracts (1975 edition). The classification of nuclear codes and the form of index are the same as those in the Computer Programme Library at Ispra, Italy. (auth.)

  10. Design and Construction of Power System for Induction Heating (IH) Cooker Using Resonant Converter

    International Nuclear Information System (INIS)

    Soe Thiri Thandar; Clement Saldanah; Win Khaing Moe

    2008-06-01

    Induction heating (IH) systems using electromagnetic induction have been developed for many industrial applications in Myanmar, and many industries have benefited from this technology by implementing induction heating for melting, hardening, and heating. An induction heating cooker is based on high-frequency induction heating and on electrical and electronic technologies. From the electronic point of view, an induction heating cooker is composed of four parts: a rectifier, a filter, a high-frequency inverter, and a resonant load. The purpose of this research is to develop an induction heating cooker. The rectifier module is a full-bridge rectifier. The second portion of the system is a capacitive filter, which minimizes the ripple components. The third is a high-frequency converter that converts the constant DC to high-frequency AC by switching the devices alternately. In this research, an Insulated Gate Bipolar Transistor (IGBT) is used as the switching device, driven by pulse signals from a pulse transformer circuit. The resonant load consumes about 500 W. Construction and testing have been carried out. The merit of this research is an IH cooker with low energy consumption that is safe, efficient, quick to heat, and achieves an efficiency of 90% or more.
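
    The resonant load sets the inverter's operating point via the standard series-resonance relation f0 = 1/(2π√(LC)). The component values below are assumptions for illustration only, not figures from the paper:

```python
import math

def resonant_frequency(L, C):
    """Series LC resonant frequency in Hz: f0 = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

L = 60e-6   # 60 uH work coil (assumed value)
C = 1.2e-6  # 1.2 uF resonant capacitor (assumed value)
f0 = resonant_frequency(L, C)
# f0 lands near 19 kHz, i.e. in the usual 20-100 kHz range that IH cookers
# switch at, just above the audible band
```

The IGBT gate pulses are generated at (or slightly above) this frequency so the tank circuit operates near resonance and switching losses stay low.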

  11. A method based on IHS cylindrical transform model for quality assessment of image fusion

    Science.gov (United States)

    Zhu, Xiaokun; Jia, Yonghong

    2005-10-01

    Image fusion techniques have been widely applied to remote sensing image analysis and processing, and methods for assessing the quality of image fusion in remote sensing have become research issues both at home and abroad. Traditional assessment methods combine the calculation of quantitative indexes with visual interpretation to compare fused images quantitatively and qualitatively. However, the existing assessment methods have two defects: on one hand, most indexes lack theoretical support for comparing different fusion methods; on the other hand, there is no uniform preference among the quantitative assessment indexes when they are applied to estimate fusion effects. That is, spatial resolution and spectral features cannot be analyzed synchronously by these indexes, and there is no general method to unify spatial and spectral feature assessment. So in this paper, on the basis of the approximate general model of four traditional fusion methods, including Intensity Hue Saturation (IHS) triangle transform fusion, High Pass Filter (HPF) fusion, Principal Component Analysis (PCA) fusion, and Wavelet Transform (WT) fusion, a correlation coefficient assessment method based on the IHS cylindrical transform is proposed. Experiments show that this method can not only evaluate spatial and spectral features under a uniform preference, but can also compare fusion image sources with fused images and reveal differences among fusion methods. Compared with the traditional assessment methods, the new method is more intuitive and more consistent with subjective evaluation.
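
    For orientation, a generic sketch of a correlation-coefficient index between an original and a fused band; the paper's method applies this idea on top of the IHS cylindrical transform components, which is not reproduced here, and the arrays are random stand-ins:

```python
import numpy as np

def correlation_coefficient(a, b):
    """Pearson correlation between two image bands (the closer to 1,
    the better the fused band preserves the reference band)."""
    a = a.ravel() - a.mean()
    b = b.ravel() - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

rng = np.random.default_rng(2)
band = rng.random((16, 16))                  # reference band
fused = band + 0.1 * rng.random((16, 16))    # fused band with small distortion
cc = correlation_coefficient(band, fused)    # close to 1 for a faithful fusion
```

In a full assessment this index would be computed separately on the intensity component (spatial fidelity) and on the hue/saturation components (spectral fidelity).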

  12. Revision, uptake and coding issues related to the open access Orchard Sports Injury Classification System (OSICS) versions 8, 9 and 10.1

    OpenAIRE

    Orchard, John; Rae, Katherine; Brooks, John; Hägglund, Martin; Til, Lluis; Wales, David; Wood, Tim

    2010-01-01

    John Orchard (1), Katherine Rae (1), John Brooks (2), Martin Hägglund (3), Lluis Til (4), David Wales (5), Tim Wood (6). (1) Sports Medicine at Sydney University, Sydney, NSW, Australia; (2) Rugby Football Union, Twickenham, England, UK; (3) Department of Medical and Health Sciences, Linköping University, Linköping, Sweden; (4) FC Barcelona, Barcelona, Catalonia, Spain; (5) Arsenal FC, Highbury, England, UK; (6) Tennis Australia, Melbourne, Vic, Australia. Abstract: The Orchard Sports Injury Classification Sys...

  13. Classification and modelling of functional outputs of computation codes. Application to accidental thermal-hydraulic calculations in pressurized water reactor (PWR)

    International Nuclear Information System (INIS)

    Auder, Benjamin

    2011-01-01

    This research thesis was carried out within the framework of a project on nuclear reactor vessel life. It deals with the use of numerical codes in which a probability density is estimated for every input parameter in order to calculate probability margins at the output level; more precisely, it deals with codes whose responses are one-dimensional functions. The author studies the numerical simulation of a pressurized thermal shock on a nuclear reactor vessel, one of the possible accident types. The study of vessel integrity relies on a thermal-hydraulic analysis and on a mechanical analysis, and algorithms are developed and proposed for each of them. Input-output data are classified using a clustering technique and a graph-based representation. A method for reducing the dimension of the outputs is proposed, and a regression is applied between the inputs and the reduced representations. Applications are discussed for modelling and sensitivity analysis of the CATHARE code (a code used at the CEA for thermal-hydraulic analysis).

  14. [Changes introduced into the recent International Classification of Headache Disorders: ICHD-III beta classification].

    Science.gov (United States)

    Belvis, Robert; Mas, Natàlia; Roig, Carles

    2015-01-16

    The International Headache Society (IHS) has published the third edition of the International Classification of Headache Disorders (ICHD-III beta), the most commonly used guide to diagnosing headaches in the world. To review the recent additions to the guide, to explain the new entities that appear in it and to compare the conditions that have had their criteria further clarified against the criteria in the previous edition. We have recorded a large number of clarifications in the criteria in practically all the headaches and neuralgias in the classification, but the conditions that have undergone the most significant clarifications are chronic migraine, primary headache associated with sexual activity, short-lasting unilateral neuralgiform headache attacks, new daily persistent headache, medication-overuse headache, syndrome of transient headache and neurological deficits with cerebrospinal fluid lymphocytosis. The most notable new entities that have been incorporated are external-compression headache, cold-stimulus headache, nummular headache, headache attributed to aeroplane travel and headache attributed to autonomic dysreflexia. Another point to be highlighted is the case of the new headaches (still not considered entities in their own right) included in the appendix, some of the most noteworthy being epicrania fugax, vestibular migraine and infantile colic. The IHS recommends no longer using the previous classification and changing over to the new classification (ICHD-III beta) in healthcare, teaching and research, in addition to making this new guide as widely known as possible.

  15. A Tale of Two Disability Coding Systems: The Veterans Administration Schedule for Rating Disabilities (VASRD) vs. Diagnostic Coding Using the International Classification of Diseases, 9th Edition, Clinical Modification (ICD-9-CM)

    Science.gov (United States)

    2008-01-01

    of ear and other sense organ disability cases with a disability-related CRO hospital record (N=234). Sickle-cell anemia was the most common condition in the Hemic and Lymphatic Systems VASRD Group, 1984-1999 (table fragment: ICD-9-CM code 282.6, Sickle-Cell Anemia, frequency 14, percent of total...). This VASRD group is heterogeneous, and ICD-9-CM conditions linked to this VASRD include conditions that may be experienced by men and women, for example, abdominal pain

  16. 42 CFR 137.95 - May a Self-Governance Tribe purchase goods and services from the IHS on a reimbursable basis?

    Science.gov (United States)

    2010-10-01

    ... services from the IHS on a reimbursable basis? 137.95 Section 137.95 Public Health PUBLIC HEALTH SERVICE... Tribe purchase goods and services from the IHS on a reimbursable basis? Yes, a Self-Governance Tribe may...-Governance Tribe, on a reimbursable basis, including payment in advance with subsequent adjustment. Prompt...

  17. Classifying Classifications

    DEFF Research Database (Denmark)

    Debus, Michael S.

    2017-01-01

    This paper critically analyzes seventeen game classifications. The classifications were chosen on the basis of diversity, ranging from pre-digital classifications (e.g. Murray 1952), through game studies classifications (e.g. Elverdam & Aarseth 2007), to classifications of drinking games (e.g. LaBrie et al. 2013). The analysis aims at three goals: the classifications' internal consistency, the abstraction of classification criteria, and the identification of differences in classification across fields and/or time. Especially the abstraction of classification criteria can be used in future endeavors into the topic of game classifications.

  18. 42 CFR 136.412 - What questions must the IHS ask as part of the background investigation?

    Science.gov (United States)

    2010-10-01

    ..., exploitation, contact, or prostitution; crimes against persons; or offenses committed against children? If yes... Child Protection and Family Violence Prevention § 136.412 What questions must the IHS ask as part of the...: (1) Has the individual been arrested or charged with a crime involving a child? If yes, the...

  19. Dielectric constant and low-frequency infrared spectra for liquid water and ice Ih within the E3B model

    Energy Technology Data Exchange (ETDEWEB)

    Shi, L.; Ni, Y.; Drews, S. E. P.; Skinner, J. L. [Theoretical Chemistry Institute and Department of Chemistry, University of Wisconsin, Madison, Wisconsin 53706 (United States)

    2014-08-28

    Two intrinsic difficulties in modeling condensed-phase water with conventional rigid non-polarizable water models are: reproducing the static dielectric constants for liquid water and ice Ih, and generating the peak at about 200 cm⁻¹ in the low-frequency infrared spectrum for liquid water. The primary physical reason for these failures is believed to be the missing polarization effect in these models, and consequently various sophisticated polarizable water models have been developed. However, in this work we pursue a different strategy and propose a simple empirical scheme to include the polarization effect only on the dipole surface (without modifying a model's intermolecular interaction potential). We implement this strategy for our explicit three-body (E3B) model. Our calculated static dielectric constants and low-frequency infrared spectra are in good agreement with experiment for both liquid water and ice Ih over wide temperature ranges, albeit with one fitting parameter for each phase. The success of our modeling also suggests that thermal fluctuations about local minima and the energy differences between different proton-disordered configurations play minor roles in the static dielectric constant of ice Ih. Our analysis shows that the polarization effect is important in resolving the two difficulties mentioned above and sheds some light on the origin of several features in the low-frequency infrared spectra for liquid water and ice Ih.
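
    The static dielectric constant discussed above is commonly extracted from simulation via the dipole-fluctuation formula ε = 1 + (⟨M²⟩ - ⟨M⟩²) / (3 ε0 V kB T) (conducting boundary conditions). A sketch with a synthetic dipole trajectory standing in for E3B output; the box volume and dipole scale are arbitrary assumptions:

```python
import numpy as np

EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
KB = 1.380649e-23         # Boltzmann constant, J/K

def static_dielectric(M, volume, T):
    """M: (n_frames, 3) total dipole moment per frame in C*m;
    volume in m^3; T in K. Fluctuation formula for epsilon."""
    fluct = (M ** 2).sum(axis=1).mean() - (M.mean(axis=0) ** 2).sum()
    return 1.0 + fluct / (3.0 * EPS0 * volume * KB * T)

rng = np.random.default_rng(3)
V, T = 1.0e-26, 300.0        # ~ (2.15 nm)^3 box, room temperature (assumed)
sigma = 1.0e-28              # per-component dipole std in C*m (assumed)
M = sigma * rng.normal(size=(20000, 3))   # synthetic dipole trajectory
eps = static_dielectric(M, V, T)
```

In a real calculation M would come from summing the (polarization-corrected) molecular dipoles of each simulation frame; the point of the sketch is only the fluctuation formula itself.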

  20. 77 FR 52748 - 60-Day Proposed Information Collection: Indian Health Service (IHS) Sharing What Works-Best...

    Science.gov (United States)

    2012-08-30

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Indian Health Service 60-Day Proposed Information Collection: Indian Health Service (IHS) Sharing What Works--Best Practice, Promising Practice, and Local Effort (BPPPLE) Form; Request For Public Comment AGENCY: Indian Health Service, HHS. ACTION: Notice...

  1. 42 CFR 137.401 - What role does Tribal consultation play in the IHS annual budget request process?

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false What role does Tribal consultation play in the IHS annual budget request process? 137.401 Section 137.401 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF...-GOVERNANCE Secretarial Responsibilities Budget Request § 137.401 What role does Tribal consultation play in...

  2. Elastic wave speeds and moduli in polycrystalline ice Ih, sI methane hydrate, and sII methane-ethane hydrate

    Science.gov (United States)

    Helgerud, M.B.; Waite, W.F.; Kirby, S.H.; Nur, A.

    2009-01-01

    We used ultrasonic pulse transmission to measure compressional, P, and shear, S, wave speeds in laboratory-formed polycrystalline ice Ih, sI methane hydrate, and sII methane-ethane hydrate. From the wave speed's linear dependence on temperature and pressure and from the sample's calculated density, we derived expressions for bulk, shear, and compressional wave moduli and Poisson's ratio from -20 to 15 °C and 22.4 to 32.8 MPa for ice Ih, -20 to 15 °C and 30.5 to 97.7 MPa for sI methane hydrate, and -20 to 10 °C and 30.5 to 91.6 MPa for sII methane-ethane hydrate. All three materials had comparable P and S wave speeds and decreasing shear wave speeds with increasing applied pressure. Each material also showed evidence of rapid intergranular bonding, with a corresponding increase in wave speed, in response to pauses in sample deformation. There were also key differences. Resistance to uniaxial compaction, indicated by the pressure required to compact initially porous samples, was significantly lower for ice Ih than for either hydrate. The ice Ih shear modulus decreased with increasing pressure, in contrast to the increase measured in both hydrates. © 2009.
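
    The derivation of moduli from wave speeds and density uses the standard isotropic-elasticity relations, sketched below (a generic illustration, not the paper's fitted temperature- and pressure-dependent expressions; the numbers in the usage note are illustrative placeholders, not the measured values):

```python
def elastic_moduli(vp, vs, rho):
    """Shear modulus G, bulk modulus K, and Poisson's ratio nu from
    P and S wave speeds (m/s) and density (kg/m^3); moduli in Pa."""
    G = rho * vs ** 2                              # shear modulus
    K = rho * (vp ** 2 - 4.0 * vs ** 2 / 3.0)      # bulk modulus
    nu = (vp ** 2 - 2.0 * vs ** 2) / (2.0 * (vp ** 2 - vs ** 2))
    return G, K, nu
```

    For ice-like placeholder inputs, e.g. elastic_moduli(3900.0, 1900.0, 917.0), this gives G of roughly 3.3 GPa and a Poisson's ratio of about 0.34.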

  3. Hyperpolarization-activated current (I(h)) in vestibular calyx terminals: characterization and role in shaping postsynaptic events.

    Science.gov (United States)

    Meredith, Frances L; Benke, Tim A; Rennie, Katherine J

    2012-12-01

    Calyx afferent terminals engulf the basolateral region of type I vestibular hair cells, and synaptic transmission across the vestibular type I hair cell/calyx is not well understood. Calyces express several ionic conductances, which may shape postsynaptic potentials. These include previously described tetrodotoxin-sensitive inward Na(+) currents, voltage-dependent outward K(+) currents and a K(Ca) current. Here, we characterize an inwardly rectifying conductance in gerbil semicircular canal calyx terminals (postnatal days 3-45), sensitive to voltage and to cyclic nucleotides. Using whole-cell patch clamp, we recorded from isolated calyx terminals still attached to their type I hair cells. A slowly activating, noninactivating current (I(h)) was seen with hyperpolarizing voltage steps negative to the resting potential. External Cs(+) (1-5 mM) and ZD7288 (100 μM) blocked the inward current by 97 and 83 %, respectively, confirming that I(h) was carried by hyperpolarization-activated, cyclic nucleotide gated channels. Mean half-activation voltage of I(h) was -123 mV, which shifted to -114 mV in the presence of cAMP. Activation of I(h) was well described with a third order exponential fit to the current (mean time constant of activation, τ, was 190 ms at -139 mV). Activation speeded up significantly (τ=136 and 127 ms, respectively) when intracellular cAMP and cGMP were present, suggesting that in vivo I(h) could be subject to efferent modulation via cyclic nucleotide-dependent mechanisms. In current clamp, hyperpolarizing current steps produced a time-dependent depolarizing sag followed by either a rebound afterdepolarization or an action potential. Spontaneous excitatory postsynaptic potentials (EPSPs) became larger and wider when I(h) was blocked with ZD7288. In a three-dimensional mathematical model of the calyx terminal based on Hodgkin-Huxley type ionic conductances, removal of I(h) similarly increased the EPSP, whereas cAMP slightly decreased simulated EPSP size
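
    The reported voltage dependence can be illustrated with a Boltzmann activation curve. In the sketch below the half-activation voltages come from the abstract, while the slope factor k is an assumed placeholder:

```python
import math

def ih_activation(v_mV, v_half=-123.0, k=10.0):
    """Steady-state fraction of Ih channels open (Boltzmann curve).

    v_half = -123 mV is the reported control value; cAMP shifts it to
    -114 mV. The slope factor k = 10 mV is an assumption for illustration.
    """
    return 1.0 / (1.0 + math.exp((v_mV - v_half) / k))
```

    At a fixed holding potential, shifting v_half from -123 to -114 mV (the cAMP effect described above) increases the open fraction, consistent with the depolarizing modulation reported.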

  4. Issues in Developing a Surveillance Case Definition for Nonfatal Suicide Attempt and Intentional Self-harm Using International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) Coded Data.

    Science.gov (United States)

    Hedegaard, Holly; Schoenbaum, Michael; Claassen, Cynthia; Crosby, Alex; Holland, Kristin; Proescholdbell, Scott

    2018-02-01

    Suicide and intentional self-harm are among the leading causes of death in the United States. To study this public health issue, epidemiologists and researchers often analyze data coded using the International Classification of Diseases (ICD). Prior to October 1, 2015, health care organizations and providers used the clinical modification of the Ninth Revision of ICD (ICD-9-CM) to report medical information in electronic claims data. The transition in October 2015 to use of the clinical modification of the Tenth Revision of ICD (ICD-10-CM) resulted in the need to update methods and selection criteria previously developed for ICD-9-CM coded data. This report provides guidance on the use of ICD-10-CM codes to identify cases of nonfatal suicide attempts and intentional self-harm in ICD-10-CM coded data sets. ICD-10-CM codes for nonfatal suicide attempts and intentional self-harm include: X71-X83, intentional self-harm due to drowning and submersion, firearms, explosive or thermal material, sharp or blunt objects, jumping from a high place, jumping or lying in front of a moving object, crashing of motor vehicle, and other specified means; T36-T50 with a 6th character of 2 (except for T36.9, T37.9, T39.9, T41.4, T42.7, T43.9, T45.9, T47.9, and T49.9, which are included if the 5th character is 2), intentional self-harm due to drug poisoning (overdose); T51-T65 with a 6th character of 2 (except for T51.9, T52.9, T53.9, T54.9, T56.9, T57.9, T58.0, T58.1, T58.9, T59.9, T60.9, T61.0, T61.1, T61.9, T62.9, T63.9, T64.0, T64.8, and T65.9, which are included if the 5th character is 2), intentional self-harm due to toxic effects of nonmedicinal substances; T71 with a 6th character of 2, intentional self-harm due to asphyxiation, suffocation, strangulation; and T14.91, Suicide attempt. Issues to consider when selecting records for nonfatal suicide attempts and intentional self-harm from ICD-10-CM coded administrative data sets are also discussed. All material appearing in this
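
    As one possible implementation of the selection rules enumerated above (a simplified reading for illustration, not an official or validated surveillance case definition):

```python
# Subcategories that are exceptions: they qualify when the 5th character
# (counted ignoring the decimal point) is 2, rather than the 6th.
FIVE_CHAR_EXCEPTIONS = {
    "T36.9", "T37.9", "T39.9", "T41.4", "T42.7", "T43.9", "T45.9",
    "T47.9", "T49.9",
    "T51.9", "T52.9", "T53.9", "T54.9", "T56.9", "T57.9", "T58.0",
    "T58.1", "T58.9", "T59.9", "T60.9", "T61.0", "T61.1", "T61.9",
    "T62.9", "T63.9", "T64.0", "T64.8", "T65.9",
}

def is_intentional_self_harm(code: str) -> bool:
    """True if an ICD-10-CM code string (e.g. "T42.7X2A") matches the
    case definition sketched above."""
    code = code.strip().upper()
    if code == "T14.91":                      # suicide attempt
        return True
    plain = code.replace(".", "")             # character positions ignore the dot
    category = plain[:3]
    if category.startswith("X") and category[1:].isdigit():
        return 71 <= int(category[1:]) <= 83  # X71-X83: intentional self-harm
    if category.startswith("T") and category[1:].isdigit():
        n = int(category[1:])
        if n == 71 or 36 <= n <= 65:          # poisoning, toxic effects, asphyxiation
            if code[:5] in FIVE_CHAR_EXCEPTIONS:
                return len(plain) >= 5 and plain[4] == "2"
            return len(plain) >= 6 and plain[5] == "2"
    return False
```

    A real surveillance query would apply these rules over administrative claims records and also handle the record-selection caveats the report discusses.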

  5. Role of Ih in differentiating the dynamics of the gastric and pyloric neurons in the stomatogastric ganglion of the lobster, Homarus americanus.

    Science.gov (United States)

    Zhu, Lin; Selverston, Allen I; Ayers, Joseph

    2016-06-01

    The hyperpolarization-activated inward cationic current (Ih) is known to regulate the rhythmicity, excitability, and synaptic transmission in heart cells and many types of neurons across a variety of species, including some pyloric and gastric mill neurons in the stomatogastric ganglion (STG) in Cancer borealis and Panulirus interruptus. However, little is known about the role of Ih in regulating the gastric mill dynamics and its contribution to the dynamical bifurcation of the gastric mill and pyloric networks. We investigated the role of Ih in the rhythmic activity and cellular excitability of both the gastric mill neurons (medial gastric, gastric mill) and pyloric neurons (pyloric dilator, lateral pyloric) in Homarus americanus. Through testing the burst period between 5 and 50 mM CsCl, and elimination of postinhibitory rebound and voltage sag, we found that 30 mM CsCl can sufficiently block Ih in both the pyloric and gastric mill neurons. Our results show that Ih maintains the excitability of both the pyloric and gastric mill neurons. However, Ih regulates slow oscillations of the pyloric and gastric mill neurons differently. Specifically, blocking Ih diminishes the difference between the pyloric and gastric mill burst periods by increasing the pyloric burst period and decreasing the gastric mill burst period. Moreover, the phase-plane analysis shows that blocking Ih causes the trajectory of slow oscillations of the gastric mill neurons to change toward the pyloric sinusoidal-like trajectories. In addition to regulating the pyloric rhythm, we found that Ih is also essential for the gastric mill rhythms and differentially regulates these two dynamics. Copyright © 2016 the American Physiological Society.

  6. Essay of accelerator R and D in a small laboratory of a university. Heavy ion IH linac for fusion material. 1983-1985

    International Nuclear Information System (INIS)

    Hattori, Toshiyuki

    2005-01-01

    The linear accelerator of Inter-Digital H type (IH linac) is known to have a high shunt impedance. Research Laboratory for Nuclear Reactors of Tokyo Institute of Technology introduced an IH linac for fusion materials irradiation test in 1983. The beam injector was a tandem electrostatic accelerator. The IH linac was designed and fabricated based on the developmental work at Institute for Nuclear Study of University of Tokyo. The processes of component alignment, cold test and start-up operation are described. Educational aspect of the project is also reviewed. (K.Y.)

  7. Argentinean adaptation of the Social Skills Inventory IHS-Del-Prette.

    Science.gov (United States)

    Olaz, Fabián Orlando; Medrano, Leonardo; Greco, María Eugenia; Del Prette, Zilda Aparecida Pereira

    2009-11-01

    We present the results of the adaptation of the IHS-Del-Prette (Inventario de Habilidades Sociales, in English, Social Skills Inventory) to a sample of Argentinean college students. Firstly, we addressed the backward translation and carried out an equivalence study of the Portuguese and Spanish versions of the scale. The results showed the two versions were equivalent, as we obtained correlations lower than .50 in only 5 items. Secondly, we performed item analysis by calculating discrimination indexes and item-total correlations. Results indicated that the items are sensitive to differentiate between high and low social-skill groups. Exploratory factor analysis carried out with a sample of 602 college students yielded five factors that explained 26.5% of the total variance, although our data did not completely match the original factor structure. We also obtained moderate alpha values for the subscales, but high reliability for the total scale. Lastly, group differences between males and females are presented to provide evidence of validity. We discuss the implications of the results and present future lines of inquiry.

  8. Shuttle Return-to-Flight IH-108 Aerothermal Test at CUBRC - Flow Field Calibration and CFD

    Science.gov (United States)

    Lau, Kei Y.; Holden, M. S.

    2011-01-01

    This paper discusses one specific aspect of the Shuttle Return-to-Flight IH-108 aerothermal test at the Calspan-University at Buffalo Research Center (CUBRC): the calibration of the test flow field. The test showed the versatility of the CUBRC Large Energy National Shock Tunnel (LENS) II wind tunnel for an aerothermal test with unique and demanding requirements. CFD analyses were used effectively to extend the test range at the low end of the Mach range, demonstrating how a ground test facility and CFD can be used iteratively to enhance confidence in the fidelity of both tools. The program also addressed the lingering concerns of the aerothermal community about the use of impulse facilities and CFD analysis. At the conclusion of the test program, members from NASA Marshall (MSFC), CUBRC, and USA (United Space Alliance) Consultants (The Grey Beards) were asked to independently verify the flight scaling data generated by Boeing for flight certification of the redesigned external tank (ET) components. The blind test comparison showed very good results.

  9. Anomalous diffusion of water molecules at grain boundaries in ice Ih.

    Science.gov (United States)

    Moreira, Pedro Augusto Franco Pinheiro; Veiga, Roberto Gomes de Aguiar; Ribeiro, Ingrid de Almeida; Freitas, Rodrigo; Helfferich, Julian; de Koning, Maurice

    2018-05-23

    Using ab initio and classical molecular dynamics simulations, we study pre-melting phenomena in pristine coincident-site-lattice grain boundaries (GBs) in proton-disordered hexagonal ice Ih at temperatures just below the melting point Tm. Concerning pre-melt-layer thicknesses, the results are consistent with the available experimental estimates for low-disorder impurity-free GBs. With regard to molecular mobility, the simulations provide a key new insight: the translational motion of the water molecules is found to be subdiffusive for time scales from ∼10 ns up to at least 0.1 μs. Moreover, the fact that the anomalous diffusion occurs even at temperatures just below Tm where the bulk supercooled liquid still diffuses normally suggests that it is related to the confinement of the GB pre-melt layers by the surrounding crystalline environment. Furthermore, we show that this behavior can be characterized by continuous-time random walk models in which the waiting-time distributions decay according to power-laws that are very similar to those describing dynamics in glass-forming systems.

  10. Code Cactus; Code Cactus

    Energy Technology Data Exchange (ETDEWEB)

    Fajeau, M; Nguyen, L T; Saunier, J [Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France)

    1966-09-01

    This code handles the following problems: (1) analysis of thermal experiments on a water loop at high or low pressure, in steady-state or transient conditions; (2) analysis of the thermal and hydrodynamic behavior of water-cooled and water-moderated reactors, at either high or low pressure, with boiling permitted; fuel elements are assumed to be flat plates. Flow distribution among parallel channels, coupled or not by conduction across the plates, is computed for imposed pressure-drop or flowrate conditions, constant or varying in time; the power can be coupled to a reactor kinetics calculation or supplied by the code user. The code, which contains a schematic representation of safety-rod behavior, is a one-dimensional, multi-channel code; its complement, FLID, is a one-channel, two-dimensional code. (authors)

  11. The Fourth Conference on the Education and Learning of Older Adults

    Directory of Open Access Journals (Sweden)

    Dušana Findeisen

    1998-12-01

    Full Text Available The University of Ulm is said to be "the highest German university": highest because it stands on a rise outside the city. It is true that the university is renowned for its technical faculties, yet the social sciences have not been forgotten in Ulm. Knowledge in these fields can be acquired at ZAWIW, the university's centre for general continuing education. In 1994 this centre also became the seat of the European network "Learning in Later Life", and within that framework the fourth conference devoted to the education and learning of older adults took place in Ulm last year.

  12. Ih equalizes membrane input resistance in a heterogeneous population of fusiform neurons in the dorsal cochlear nucleus.

    Directory of Open Access Journals (Sweden)

    Cesar Celis Ceballos

    2016-10-01

    Full Text Available In a neuronal population, several combinations of ionic conductances can produce a specific firing phenotype. Some neurons present heterogeneity in their firing, generally produced by expression of a specific conductance, but how additional conductances covary to homeostatically regulate membrane excitability is less well known. Dorsal cochlear nucleus principal neurons, fusiform neurons, display heterogeneous spontaneous action potential activity and thus represent an appropriate model to study the role of different conductances in establishing firing heterogeneity. In particular, fusiform neurons are divided into quiet neurons, with no spontaneous firing, and active neurons, which fire spontaneously and regularly. These modes are determined by the expression levels of an intrinsic membrane conductance, an inwardly rectifying potassium current (IKir). In this work, we tested whether other subthreshold conductances vary homeostatically to maintain membrane excitability constant across the two subtypes. We found that Ih expression covaries specifically with IKir in order to maintain membrane resistance constant. The impact of Ih on membrane resistance depends on the level of IKir expression, being much smaller in quiet neurons with larger IKir, but Ih variations are not relevant for creating the quiet and active phenotypes. Finally, we demonstrate that the individual proportion of each conductance, and not its absolute magnitude, is relevant for determining the neuronal firing mode. We conclude that in fusiform neurons the variations of the different subthreshold conductances are limited to specific conductances in order to create firing heterogeneity and maintain membrane homeostasis.
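
    The covariation argument above amounts to a simple observation about parallel conductances: input resistance depends on their sum, so a larger IKir can be offset by a smaller Ih. A toy calculation (made-up conductance values in arbitrary units, purely illustrative):

```python
def input_resistance(g_kir, g_h, g_leak=2.0):
    """Input resistance of a membrane with parallel subthreshold
    conductances (arbitrary units); R_in = 1 / sum of conductances."""
    return 1.0 / (g_kir + g_h + g_leak)
```

    A "quiet" cell with (g_kir, g_h) = (3, 1) and an "active" cell with (1, 3) then share the same input resistance, even though the proportions, which set the firing mode, differ.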

  13. Popular heavy particle beam cancer therapeutic system (3). Development of a high-efficiency compact injection system (2). Great success of the beam test of the new APF-IH type DTL

    International Nuclear Information System (INIS)

    Yamamoto, Kazuo; Iwata, Yoshiyuki

    2006-01-01

    The high-efficiency compact injection system consists of an electron cyclotron resonance (ECR) ion source, a radio-frequency quadrupole (RFQ) linear accelerator, and an interdigital H-mode (IH) drift tube linear accelerator (DTL). The IH-type DTL and the alternating phase focusing (APF) method are explained, and their special features, fabrication, and beam test are reported. Illustrated are the electric field generation method, an outline of the APF method, the drift tube, the IH-type DTL, the distributions of electric field and voltage, the beam test setup, the ECR ion source and injection line, the internal structure of the RFQ linear accelerator and the APF-IH type DTL, the matching Q-lens section, and measurement results for the beam, emittance, and momentum dispersion. (S.Y.)

  14. Convergent and reciprocal modulation of a leak K+ current and Ih by an inhalational anaesthetic and neurotransmitters in rat brainstem motoneurones

    Science.gov (United States)

    Sirois, Jay E; Lynch, Carl; Bayliss, Douglas A

    2002-01-01

    Neurotransmitters and volatile anaesthetics have opposing effects on motoneuronal excitability which appear to reflect contrasting modulation of two types of subthreshold currents. Neurotransmitters increase motoneuronal excitability by inhibiting TWIK-related acid-sensitive K+ channels (TASK) and shifting activation of a hyperpolarization-activated cationic current (Ih) to more depolarized potentials; on the other hand, anaesthetics decrease excitability by activating a TASK-like current and inducing a hyperpolarizing shift in Ih activation. Here, we used whole-cell recording from motoneurones in brainstem slices to test if neurotransmitters (serotonin (5-HT) and noradrenaline (NA)) and an anaesthetic (halothane) indeed compete for modulation of the same ion channels - and we determined which prevails. When applied together under current clamp conditions, 5-HT reversed anaesthetic-induced membrane hyperpolarization and increased motoneuronal excitability. Under voltage clamp conditions, 5-HT and NA overcame most, but not all, of the halothane-induced current. When Ih was blocked with ZD 7288, the neurotransmitters completely inhibited the K+ current activated by halothane; the halothane-sensitive neurotransmitter current reversed at the equilibrium potential for potassium (EK) and displayed properties expected of acid-sensitive, open-rectifier TASK channels. To characterize modulation of Ih in relative isolation, effects of 5-HT and halothane were examined in acidified bath solutions that blocked TASK channels. Under these conditions, 5-HT and halothane each caused their characteristic shift in voltage-dependent gating of Ih. When tested concurrently, however, halothane decreased the neurotransmitter-induced depolarizing shift in Ih activation. Thus, halothane and neurotransmitters converge on TASK and Ih channels with opposite effects; transmitter action prevailed over anaesthetic effects on TASK channels, but not over effects on Ih. These data suggest that

  15. High-pass filtering of input signals by the Ih current in a non-spiking neuron, the retinal rod bipolar cell.

    Directory of Open Access Journals (Sweden)

    Lorenzo Cangiano

    Full Text Available Hyperpolarization-activated cyclic nucleotide-sensitive (HCN) channels mediate the I(f) current in heart and I(h) throughout the nervous system. In spiking neurons I(h) participates primarily in different forms of rhythmic activity. Little is known, however, about its role in neurons operating with graded potentials as in the retina, where all four channel isoforms are expressed. Intriguing evidence for an involvement of I(h) in early visual processing are the side effects reported, in dim light or darkness, by cardiac patients treated with HCN inhibitors. Moreover, electroretinographic recordings indicate that these drugs affect temporal processing in the outer retina. Here we analyzed the functional role of HCN channels in rod bipolar cells (RBCs) of the mouse. Perforated-patch recordings in the dark-adapted slice found that RBCs exhibit I(h), and that this is sensitive to the specific blocker ZD7288. RBC input impedance, explored by sinusoidal frequency-modulated current stimuli (0.1-30 Hz), displays band-pass behavior in the range of I(h) activation. Theoretical modeling and pharmacological blockade demonstrate that high-pass filtering of input signals by I(h), in combination with low-pass filtering by passive properties, fully accounts for this frequency-tuning. Correcting for the depolarization introduced by shunting through the pipette-membrane seal leads to the prediction that in darkness I(h) is tonically active in RBCs and quickens their responses to dim light stimuli. Immunohistochemistry targeting candidate subunit isoforms HCN1-2, in combination with markers of RBCs (PKC) and rod-RBC synaptic contacts (bassoon, mGluR6, Kv1.3), suggests that RBCs express HCN2 on the tip of their dendrites. The functional properties conferred by I(h) onto RBCs may contribute to shape the retina's light response and explain the visual side effects of HCN inhibitors.
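
    The band-pass account above (low-pass passive membrane plus high-pass I(h)) can be sketched with a phenomenological impedance model in which, as is common for resonant currents, I(h) is represented by an inductive branch. Parameters are normalized placeholders, not values fitted to the RBC data:

```python
def impedance_magnitude(omega, R=1.0, C=1.0, r_L=1.0, L=10.0):
    """|Z| of a membrane modeled as R parallel C (passive, low-pass)
    in parallel with a phenomenological inductive branch r_L + jwL
    standing in for Ih (high-pass); normalized units."""
    Y = 1.0 / R + 1j * omega * C + 1.0 / (r_L + 1j * omega * L)
    return abs(1.0 / Y)
```

    With these placeholder values the magnitude peaks at intermediate frequencies and falls off at both DC and high frequency, i.e. a band-pass input impedance.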

  16. [The Classification of Headache: Important Aspects of Patient's History and Clinical Diagnostic].

    Science.gov (United States)

    Kamm, Katharina; Ruscheweyh, Ruth; Eren, Ozan; Straube, Andreas

    2017-03-01

    Headache disorders are among the most common complaints in the general population. The basis for successful headache therapy is a definite diagnosis, which in turn requires valid criteria for classifying headaches. Following the classification of the International Headache Society (IHS), particularly relevant questions in the patient's history and findings from the clinical examination lead to the diagnosis. © Georg Thieme Verlag KG Stuttgart · New York.

  17. Supernova Photometric Lightcurve Classification

    Science.gov (United States)

    Zaidi, Tayeb; Narayan, Gautham

    2016-01-01

    This is a preliminary report on photometric supernova classification. We first explore the properties of supernova light curves, and attempt to restructure the unevenly sampled and sparse data from assorted datasets to allow for processing and classification. The data was primarily drawn from the Dark Energy Survey (DES) simulated data, created for the Supernova Photometric Classification Challenge. This poster shows a method for producing a non-parametric representation of the light curve data, and applying a Random Forest classifier algorithm to distinguish between supernovae types. We examine the impact of Principal Component Analysis to reduce the dimensionality of the dataset, for future classification work. The classification code will be used in a stage of the ANTARES pipeline, created for use on the Large Synoptic Survey Telescope alert data and other wide-field surveys. The final figure-of-merit for the DES data in the r band was 60% for binary classification (Type I vs II). Zaidi was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program which is funded by the National Science Foundation Research Experiences for Undergraduates Program (AST-1262829).
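
    The restructuring step described above, turning an unevenly sampled light curve into a fixed-length vector that a Random Forest can consume, can be sketched as follows (a stdlib-only illustration; the grid choice and peak normalization are assumptions, not the poster's actual representation):

```python
from bisect import bisect_right

def light_curve_features(times, fluxes, n_points=20):
    """Resample an unevenly sampled light curve onto a uniform time grid
    and peak-normalize it, giving a fixed-length feature vector."""
    pts = sorted(zip(times, fluxes))
    t = [p[0] for p in pts]
    f = [p[1] for p in pts]
    t0, t1 = t[0], t[-1]
    grid = [t0 + (t1 - t0) * i / (n_points - 1) for i in range(n_points)]

    def interp(x):
        # piecewise-linear interpolation with flat extrapolation
        i = bisect_right(t, x)
        if i <= 0:
            return f[0]
        if i >= len(t):
            return f[-1]
        w = (x - t[i - 1]) / (t[i] - t[i - 1])
        return f[i - 1] + w * (f[i] - f[i - 1])

    v = [interp(x) for x in grid]
    peak = max(abs(y) for y in v)
    return [y / peak for y in v] if peak > 0 else v
```

    Vectors produced this way for many supernovae would then be stacked into a feature matrix and passed to a classifier such as a random forest.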

  18. Synthesizing Certified Code

    Science.gov (United States)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  19. Insurance billing and coding.

    Science.gov (United States)

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD-9-CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD-9-CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.
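
    The code systems mentioned above differ enough in format that a billing system can make a first guess at which one a given code belongs to: CDT codes are "D" plus four digits, CPT codes are five digits, and ICD-9-CM diagnosis codes are numeric with an optional decimal part (or V/E prefixed). A rough sketch (simplified patterns, for illustration only; real code sets have exceptions, e.g., CPT Category II/III codes ending in F or T):

```python
import re

def code_system(code: str) -> str:
    """Guess the coding system of a billing code from its format."""
    code = code.strip().upper()
    if re.fullmatch(r"D\d{4}", code):
        return "CDT"
    if re.fullmatch(r"\d{5}", code):
        return "CPT"
    if re.fullmatch(r"[VE]?\d{2,3}(\.\d{1,2})?", code):
        return "ICD-9-CM"
    return "unknown"
```
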

  20. Identification and localization of gonadotropin-inhibitory hormone (GnIH) orthologs in the hypothalamus of the red-eared slider turtle, Trachemys scripta elegans.

    Science.gov (United States)

    Ukena, Kazuyoshi; Iwakoshi-Ukena, Eiko; Osugi, Tomohiro; Tsutsui, Kazuyoshi

    2016-02-01

    Gonadotropin-inhibitory hormone (GnIH) was discovered in 2000 as a novel hypothalamic neuropeptide that inhibited gonadotropin release in the Japanese quail. GnIH and its orthologs have a common C-terminal LPXRFamide (X=L or Q) motif, and have been identified in vertebrates from agnathans to humans, apart from reptiles. In the present study, we characterized a cDNA encoding GnIH orthologs in the brain of the red-eared slider turtle. The deduced precursor protein consisted of 205 amino-acid residues, encoding three putative peptide sequences that included the LPXRFamide motif at their C-termini. In addition, the precursor sequence was most similar to those of avian species. Immunoaffinity purification combined with mass spectrometry confirmed that three mature peptides were produced in the brain. In situ hybridization and immunohistochemistry showed that turtle GnIH-containing cells were restricted to the periventricular hypothalamic nucleus. Immunoreactive fibers were densely distributed in the median eminence. Thus, GnIH and related peptides may act on the pituitary to regulate pituitary hormone release in turtles as well as other vertebrates. Copyright © 2015 Elsevier Inc. All rights reserved.
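
    The C-terminal LPXRFamide (X = L or Q) motif described above is straightforward to scan for. A minimal sketch (toy sequences, not the turtle precursor; amidation is not represented in a plain sequence string):

```python
import re

# Motif anchored to the C-terminus: L, P, then L or Q, then R, F.
MOTIF = re.compile(r"LP[LQ]RF$")

def has_lpxrf_motif(peptide: str) -> bool:
    """True if a peptide sequence ends in the LPXRFamide motif (X = L or Q)."""
    return MOTIF.search(peptide.upper()) is not None
```
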

  1. The Education of Older Adults and the Significance of Their Education for Themselves and for Society

    Directory of Open Access Journals (Sweden)

    Dušana Findeisen

    2009-12-01

    Full Text Available The author presents the significance of the education of older adults for active ageing in the long transitional period known as "the time between work, retirement and old age". This concept of a transitional period, the transition into old age, is today replacing the older concept of retirement. During this time, people in their later years perform paid work, pursue education, (occasionally) return to the labour market, and volunteer. The author writes about demographic change and about the importance of education for all groups of older adults in confronting the ageing of society. Dušana Findeisen then turns to the characteristics of educational programmes for older adults, discussing the values, providers and forms of education for older people. Finally, she describes the Slovenian University of the Third Age as conceptually one of the most comprehensively developed forms and models of education for people in their later years.

  2. Analysis of the role of the low threshold currents IT and Ih in intrinsic delta oscillations of thalamocortical neurons

    Directory of Open Access Journals (Sweden)

    Yimy eAmarillo

    2015-05-01

    Full Text Available Thalamocortical (TC) neurons are involved in the generation and maintenance of brain rhythms associated with global functional states. The repetitive burst firing of TC neurons at delta frequencies (1-4 Hz) has been linked to the oscillations recorded during deep sleep and during episodes of absence seizures. To gain insight into the biophysical properties that underlie intrinsic delta oscillations in these neurons, we performed a bifurcation analysis of a minimal conductance-based thalamocortical neuron model including only the IT channel and the sodium and potassium leak channels. This analysis unveils the dynamics of repetitive burst firing of TC neurons and describes how the interplay between the amplifying variable mT and the recovering variable hT of the calcium channel IT is sufficient to generate low-threshold oscillations in the delta band. We also explored the role of the hyperpolarization-activated cationic current Ih in this reduced model and determined that, albeit not required, Ih amplifies and stabilizes the oscillation.
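
    The interplay between the amplifying gate mT and the recovering gate hT can be illustrated with steady-state Boltzmann gating curves. This is a hedged sketch: the half-activation voltages and slope factors below are assumed, generic IT-like placeholder values, not parameters from the paper:

```python
import math

def m_inf(v):
    """IT activation gate: opens with depolarization (assumed parameters)."""
    return 1.0 / (1.0 + math.exp(-(v + 57.0) / 6.2))

def h_inf(v):
    """IT inactivation gate: recovers with hyperpolarization (assumed)."""
    return 1.0 / (1.0 + math.exp((v + 81.0) / 4.0))

def window_current_shape(v):
    """Relative steady-state 'window' conductance m^2 * h of IT."""
    return m_inf(v) ** 2 * h_inf(v)
```

    The product m^2 * h peaks at subthreshold voltages, where both gates are partially open; it is this overlap that makes the low-threshold dynamics described above possible.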

  3. Computer codes used in particle accelerator design: First edition

    International Nuclear Information System (INIS)

    1987-01-01

    This paper contains a listing of more than 150 programs that have been used in the design and analysis of accelerators. Each citation gives the person to contact, the classification of the computer code, publications describing the code, the computer and language it runs on, and a short description of the code. Codes are indexed by subject, person to contact, and code acronym

  4. Coding Partitions

    Directory of Open Access Journals (Sweden)

    Fabio Burderi

    2007-05-01

    Full Text Available Motivated by the study of decipherability conditions for codes weaker than Unique Decipherability (UD), we introduce the notion of coding partition. Such a notion generalizes that of UD code and, for codes that are not UD, allows one to recover the ``unique decipherability" at the level of the classes of the partition. By taking into account the natural order between the partitions, we define the characteristic partition of a code X as the finest coding partition of X. This leads to introduce the canonical decomposition of a code into at most one unambiguous component and other (if any) totally ambiguous components. In the case the code is finite, we give an algorithm for computing its canonical partition. This, in particular, allows one to decide whether a given partition of a finite code X is a coding partition. This last problem is then approached in the case the code is a rational set. We prove its decidability under the hypothesis that the partition contains a finite number of classes and each class is a rational set. Moreover we conjecture that the canonical partition satisfies such a hypothesis. Finally we consider also some relationships between coding partitions and varieties of codes.

  5. Testing IH Instrumentation: Analysis of 1996-1998 Tank Ventilation Data in Terms of Characterizing a Transient Release

    International Nuclear Information System (INIS)

    Droppo, James G.

    2004-01-01

    An analysis is conducted of the 1996-1998 Hanford tank ventilation studies of average ventilation rates to help define characteristics of shorter term releases. This effort is being conducted as part of the design of tests of Industrial Hygiene's (IH) instrumentation ability to detect transient airborne plumes from tanks using current deployment strategies for tank operations. This analysis has improved our understanding of the variability of hourly average tank ventilation processes. However, the analysis was unable to discern the relative importance of emissions due to continuous releases and short-duration bursts of material. The key findings are as follows: (1) The ventilation of relatively well-sealed, passively ventilated tanks appears to be driven by a combination of pressure, buoyancy, and wind influences. The results of a best-fit analysis conducted with a single data set provide information on the hourly emission variability that IH instrumentation will need to detect. (2) Tank ventilation rates and tank emission rates are not the same. The studies found that the measured infiltration rates for a single tank are often a complex function of air exchanges between tanks and air exchanges with outdoor air. This situation greatly limits the usefulness of the ventilation data in defining vapor emission rates. (3) There is no evidence in the data to discern if the routine tank vapor releases occur over a short time (i.e., a puff) or over an extended time (i.e., continuous releases). Based on this analysis of the tank ventilation studies, it is also noted that (1) the hourly averaged emission peaks from the relatively well-sealed passively-vented tanks (such as U-103) are not a simple function of one meteorological parameter--but the peaks often are the result of the coincidence of temporal maximums in pressure, temperature, and wind influences and (2) a mechanistic combination modeling approach and/or field studies may be necessary to understand the short

  6. Land Cover - Minnesota Land Cover Classification System

    Data.gov (United States)

    Minnesota Department of Natural Resources — Land cover data set based on the Minnesota Land Cover Classification System (MLCCS) coding scheme. This data was produced using a combination of aerial photograph...

  7. Ih tunes theta/gamma oscillations and cross-frequency coupling in an in silico CA3 model.

    Directory of Open Access Journals (Sweden)

    Samuel A Neymotin

    Full Text Available Ih channels are uniquely positioned to act as neuromodulatory control points for tuning hippocampal theta (4-12 Hz) and gamma (25 Hz) oscillations, oscillations which are thought to have importance for the organization of information flow. Ih contributes to neuronal membrane resonance and resting membrane potential, and is modulated by second messengers. We investigated oscillatory control using a multiscale computer model of hippocampal CA3, where each cell class (pyramidal, basket, and oriens-lacunosum moleculare cells) contained type-appropriate isoforms of Ih. Our model demonstrated that modulation of pyramidal and basket Ih allows tuning theta and gamma oscillation frequency and amplitude. Pyramidal Ih also controlled cross-frequency coupling (CFC) and allowed shifting gamma generation towards particular phases of the theta cycle, effected via Ih's ability to set pyramidal excitability. Our model predicts that in vivo neuromodulatory control of Ih allows flexibly controlling CFC and the timing of gamma discharges at particular theta phases.

  8. Analysis and Evaluation of IKONOS Image Fusion Algorithm Based on Land Cover Classification

    Institute of Scientific and Technical Information of China (English)

    Xia; JING; Yan; BAO

    2015-01-01

    Each fusion algorithm has its own advantages and limitations, so it is difficult to simply rank fusion algorithms as good or bad. Whether an algorithm is selected to fuse a given set of images also depends on the sensor types and the specific research purpose. Firstly, five fusion methods, i.e. IHS, Brovey, PCA, SFIM and Gram-Schmidt, are briefly described in the paper. Then visual judgment and quantitative statistical parameters are used to assess the five algorithms. Finally, in order to determine which is the most suitable fusion method for land cover classification of IKONOS imagery, maximum likelihood classification (MLC) was applied to the five fused images. The results showed that the fusion effect of the SFIM and Gram-Schmidt transforms was better than that of the other three image fusion methods in terms of spatial detail improvement and spectral information fidelity, and the Gram-Schmidt technique was superior to the SFIM transform in expressing image details. The classification accuracy of the images fused using the Gram-Schmidt and SFIM algorithms was higher than that of the other three image fusion methods, with an overall accuracy greater than 98%. The IHS-fused image classification accuracy was the lowest; the overall accuracy and kappa coefficient were 83.14% and 0.76, respectively. Thus the IKONOS fusion images obtained by Gram-Schmidt and SFIM were better for improving land cover classification accuracy.

  9. Compatibility of amino acids in ice Ih and high-pressure phases: implications for the origin of life

    Science.gov (United States)

    Hao, J.; Giovenco, E.; Pedreira-Segade, U.; Montagnac, G.; Daniel, I.

    2017-12-01

    Icy environments may have been common on the early Earth due to the faint young sun. Previous studies have proposed that the formation of large icy bodies in the early ocean could concentrate the building blocks of life in eutectic fluids and therefore facilitate the polymerization of monomers. This hypothesis is based on the untested assumption that organic molecules are virtually incompatible in ice Ih. In this study, we conducted freezing experiments to explore the partitioning behavior of selected amino acids (glycine, L-alanine, L-proline, and L-phenylalanine) between ice Ih and aqueous solutions analogous to seawater. We let ice crystals grow slowly from a few seeds in equilibrium with the solution and used Raman spectroscopy to analyze in situ the relative concentrations of amino acids in the ice and aqueous solution. During freezing, there was no precipitation of amino acid crystals, indicating that the concentrations in solution never reached their solubility limit, even when the droplet was mostly frozen. Analyses of the Raman spectra of ice and eutectic solution showed that considerable amounts of amino acids existed in the ice phase with partition coefficients ranging between 0.2 and 0.5. This study also explored the partitioning of amino acids between other phases of ice (ice VI and ice VII) and solutions at high pressures and observed similar results. These observations implied little incompatibility of amino acids in ice during the freezing of the solutions, rendering the hypothesis of a cold origin of life unwarranted. However, incorporation into ice could significantly improve the efficiency of extraterrestrial transport of small organics. Therefore, this study supports the hypothesis of extraterrestrial delivery of organic molecules in the icy comets and asteroids to the primitive Earth as suggested by an increasing number of independent observations.

  10. Self-diffusion of polycrystalline ice Ih under confining pressure: Hydrogen isotope analysis using 2-D Raman imaging

    Science.gov (United States)

    Noguchi, Naoki; Kubo, Tomoaki; Durham, William B.; Kagi, Hiroyuki; Shimizu, Ichiko

    2016-08-01

    We have developed a high-resolution technique based on micro Raman spectroscopy to measure hydrogen isotope diffusion profiles in ice Ih. The calibration curve for quantitative analysis of deuterium in ice Ih was constructed using micro Raman spectroscopy. Diffusion experiments using diffusion couples composed of dense polycrystalline H2O and D2O ice were carried out under a gas confining pressure of 100 MPa (to suppress micro-fracturing and pore formation) at temperatures from 235 K to 245 K and diffusion times from 0.2 to 94 hours. Two-dimensional deuterium profiles across the diffusion couples were determined by Raman imaging. The location of small spots of frost from room air could be detected from the shapes of the Raman bands of OH and OD stretching modes, which change because of the effect of the molar ratio of deuterium on the molecular coupling interaction. We emphasize the validity for screening the impurities utilizing the coupling interaction. Some recrystallization and grain boundary migration occurred in recovered diffusion couples, but analysis of two-dimensional diffusion profiles of regions not affected by grain boundary migration allowed us to measure a volume diffusivity for ice at 100 MPa of (2.8 ± 0.4) × 10^-3 exp[-(57.0 ± 15.4 kJ/mol)/RT] m^2/s (R is the gas constant, T is temperature). Based on ambient pressure diffusivity measurements by others, this value indicates a high (negative) activation volume for volume diffusivity of -29.5 cm^3/mol or more. We can also constrain the value of grain boundary diffusivity in ice at 100 MPa to be volume diffusivity.
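The Arrhenius fit reported in this abstract can be evaluated numerically. The sketch below is a minimal illustration using the central values of the fit (D0 = 2.8 × 10^-3 m^2/s, Ea = 57.0 kJ/mol); the function name and the evaluation temperature are chosen for illustration only.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def ice_volume_diffusivity(T, D0=2.8e-3, Ea=57.0e3):
    """Volume diffusivity of ice Ih at 100 MPa (m^2/s), using the
    Arrhenius fit from the abstract: D = D0 * exp(-Ea / (R * T))."""
    return D0 * math.exp(-Ea / (R * T))

# Diffusivity at the midpoint of the experimental range (240 K),
# on the order of 1e-15 m^2/s with the central fit values:
print(ice_volume_diffusivity(240.0))
```

Varying T across the experimental range (235-245 K) shows the strong temperature sensitivity implied by the 57 kJ/mol activation energy.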

  11. Diagnosis of periodontal diseases using different classification ...

    African Journals Online (AJOL)

    The codes created for risk factors, periodontal data, and radiographic bone loss were formed as a matrix structure and regarded as inputs for the classification unit. A total of six periodontal conditions were the outputs of the classification unit. The accuracy of the suggested methods was compared according to their ...

  12. 42 CFR 136.404 - What does the Indian Child Protection and Family Violence Prevention Act require of the IHS and...

    Science.gov (United States)

    2010-10-01

    ... Protection and Family Violence Prevention § 136.404 What does the Indian Child Protection and Family Violence... 42 Public Health 1 2010-10-01 2010-10-01 false What does the Indian Child Protection and Family Violence Prevention Act require of the IHS and Indian Tribes or Tribal organizations receiving funds under...

  13. Supervised Transfer Sparse Coding

    KAUST Repository

    Al-Shedivat, Maruan

    2014-07-27

    A combination of the sparse coding and transfer learning techniques was shown to be accurate and robust in classification tasks where training and testing objects have a shared feature space but are sampled from different underlying distributions, i.e., belong to different domains. The key assumption in such case is that in spite of the domain disparity, samples from different domains share some common hidden factors. Previous methods often assumed that all the objects in the target domain are unlabeled, and thus the training set solely comprised objects from the source domain. However, in real world applications, the target domain often has some labeled objects, or one can always manually label a small number of them. In this paper, we explore such possibility and show how a small number of labeled data in the target domain can significantly leverage classification accuracy of the state-of-the-art transfer sparse coding methods. We further propose a unified framework named supervised transfer sparse coding (STSC) which simultaneously optimizes sparse representation, domain transfer and classification. Experimental results on three applications demonstrate that a little manual labeling and then learning the model in a supervised fashion can significantly improve classification accuracy.

  14. Tissue Classification

    DEFF Research Database (Denmark)

    Van Leemput, Koen; Puonti, Oula

    2015-01-01

    Computational methods for automatically segmenting magnetic resonance images of the brain have seen tremendous advances in recent years. So-called tissue classification techniques, aimed at extracting the three main brain tissue classes (white matter, gray matter, and cerebrospinal fluid), are now well established. In their simplest form, these methods classify voxels independently based on their intensity alone, although much more sophisticated models are typically used in practice. This article aims to give an overview of often-used computational techniques for brain tissue classification...

  15. Speaking Code

    DEFF Research Database (Denmark)

    Cox, Geoff

    Speaking Code begins by invoking the “Hello World” convention used by programmers when learning a new language, helping to establish the interplay of text and code that runs through the book. Interweaving the voice of critical writing from the humanities with the tradition of computing and software...

  16. Transporter Classification Database (TCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Transporter Classification Database details a comprehensive classification system for membrane transport proteins known as the Transporter Classification (TC)...

  17. The structure of dual Grassmann codes

    DEFF Research Database (Denmark)

    Beelen, Peter; Pinero, Fernando

    2016-01-01

    In this article we study the duals of Grassmann codes, certain codes coming from the Grassmannian variety. Exploiting their structure, we are able to count and classify all their minimum weight codewords. In this classification the lines lying on the Grassmannian variety play a central role. Rela...

  18. Kvalitet budućih ulja za motore niskih emisija / Future lubricants quality for low emission engines

    Directory of Open Access Journals (Sweden)

    Radinko Gligorijević

    2004-01-01

    Full Text Available In the current decade of this millennium, automobile manufacturers must achieve a twofold reduction in fuel consumption, and thus in CO2 emissions, and a tenfold reduction in the emission of pollutants, above all NOX and particulates. To achieve this goal, new engine lubricant specifications must be developed, whose contribution to fuel efficiency and to the reduction of harmful emissions is becoming increasingly important. Development trends in lubricants move towards lower viscosity grades with lower content of sulfur, phosphorus and sulfate ash and lower volatility, resulting in emission reduction as well as reduced fuel consumption.

  19. Phase changes induced by guest orientational ordering of filled ice Ih methane hydrate under high pressure and low temperature

    International Nuclear Information System (INIS)

    Hirai, H; Tanaka, T; Yagi, T; Matsuoka, T; Ohishi, Y; Ohtake, M; Yamamoto, Y

    2014-01-01

    Low-temperature and high-pressure experiments were performed on the filled ice Ih structure of methane hydrate under pressure and temperature conditions of 2.0 to 77.0 GPa and 30 to 300 K, respectively, using diamond anvil cells and a helium-refrigeration cryostat. Distinct changes in the axial ratios of the host framework were revealed by in situ X-ray diffractometry. Splitting of the CH vibration modes of the guest methane molecules, previously explained by orientational ordering of the guest molecules, was observed by Raman spectroscopy. The pressure and temperature conditions at which the vibration modes split agreed well with those of the axial ratio changes. The results indicate that orientational ordering of the guest methane molecules from an orientationally disordered state occurs at high pressures and low temperatures, and that this guest ordering leads to the axial ratio changes in the host framework. The regions in which the guest-disordered and guest-ordered phases exist were roughly estimated from the X-ray data. In addition, above the pressure range of the guest-ordered phase, another high-pressure phase developed in the low-temperature region. Deuterated-water host samples were also examined, and isotopic effects on the guest ordering and phase changes were observed.

  20. Coding Labour

    Directory of Open Access Journals (Sweden)

    Anthony McCosker

    2014-03-01

    Full Text Available As well as introducing the Coding Labour section, the authors explore the diffusion of code across the material contexts of everyday life, through the objects and tools of mediation, the systems and practices of cultural production and organisational management, and in the material conditions of labour. Taking code beyond computation and software, their specific focus is on the increasingly familiar connections between code and labour with a focus on the codification and modulation of affect through technologies and practices of management within the contemporary work organisation. In the grey literature of spreadsheets, minutes, workload models, email and the like they identify a violence of forms through which workplace affect, in its constant flux of crisis and ‘prodromal’ modes, is regulated and governed.

  1. Development and validation of the International Hidradenitis Suppurativa Severity Score System (IHS4), a novel dynamic scoring system to assess HS severity

    DEFF Research Database (Denmark)

    Zouboulis, C C; Tzellos, T; Kyrgidis, A

    2017-01-01

    BACKGROUND: A validated tool for the dynamic severity assessment of hidradenitis suppurativa/acne inversa (HS) is lacking. OBJECTIVES: To develop and validate a novel dynamic scoring system to assess the severity of HS. METHODS: A Delphi voting procedure was conducted among the members......, as well as examination for correlation (Spearman's rho) and agreement (Cohen's kappa) with existing scores, were engaged to recognize the variables for a new International HS4 (IHS4) that was established by a second Delphi round. RESULTS: Consensus HS4 was based on number of skin lesions, number of skin....... Three candidate scores were presented to the second Delphi round. The resulting IHS4 score is arrived at by the number of nodules (multiplied by 1) plus the number of abscesses (multiplied by 2) plus the number of draining tunnels (multiplied by 4). A total score of 3 or less signifies mild, 4...
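The scoring rule described above lends itself to a direct computation. The sketch below is a minimal illustration of the IHS4 arithmetic as stated in the abstract (nodules ×1, abscesses ×2, draining tunnels ×4, with a total of 3 or less signifying mild disease); the moderate and severe cut-offs (4-10 moderate, 11 or more severe) follow the published IHS4 but are stated here as an assumption, since the abstract is truncated at that point.

```python
def ihs4_score(nodules, abscesses, draining_tunnels):
    """IHS4 = (nodules x 1) + (abscesses x 2) + (draining tunnels x 4)."""
    return nodules + 2 * abscesses + 4 * draining_tunnels

def ihs4_severity(score):
    """Map an IHS4 score to a severity class (upper cut-offs assumed)."""
    if score <= 3:
        return "mild"      # stated in the abstract
    if score <= 10:
        return "moderate"  # assumed cut-off
    return "severe"        # assumed cut-off

# A patient with 1 nodule and 1 abscess scores 3, i.e. mild disease:
print(ihs4_score(1, 1, 0), ihs4_severity(ihs4_score(1, 1, 0)))  # 3 mild
```

Note how the weighting makes a single draining tunnel (score 4) already exceed the mild range, reflecting the greater clinical weight given to tunnels.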

  2. Speech coding

    Energy Technology Data Exchange (ETDEWEB)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long haul transmissions which use repeaters to compensate for the loss in signal strength on transmission links also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. Hence from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques and is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the

  3. Optimal codes as Tanner codes with cyclic component codes

    DEFF Research Database (Denmark)

    Høholdt, Tom; Pinero, Fernando; Zeng, Peng

    2014-01-01

    In this article we study a class of graph codes with cyclic code component codes as affine variety codes. Within this class of Tanner codes we find some optimal binary codes. We use a particular subgraph of the point-line incidence plane of A(2,q) as the Tanner graph, and we are able to describe ...

  4. Classification differences and maternal mortality

    DEFF Research Database (Denmark)

    Salanave, B; Bouvier-Colle, M H; Varnoux, N

    1999-01-01

    OBJECTIVES: To compare the ways maternal deaths are classified in national statistical offices in Europe and to evaluate the ways classification affects published rates. METHODS: Data on pregnancy-associated deaths were collected in 13 European countries. Cases were classified by a European panel....... This change was substantial in three countries (P statistical offices appeared to attribute fewer deaths to obstetric causes. In the other countries, no differences were detected. According to official published data, the aggregated maternal mortality rate for participating countries was 7.7 per...... of experts into obstetric or non-obstetric causes. An ICD-9 code (International Classification of Diseases) was attributed to each case. These were compared to the codes given in each country. Correction indices were calculated, giving new estimates of maternal mortality rates. SUBJECTS: There were...

  5. Aztheca Code

    International Nuclear Information System (INIS)

    Quezada G, S.; Espinosa P, G.; Centeno P, J.; Sanchez M, H.

    2017-09-01

    This paper presents the Aztheca code, which is formed by mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code are able to adequately predict the behavior of the reactor. (Author)

  6. Classification of Maize in Complex Smallholder Farming Systems Using UAV Imagery

    Directory of Open Access Journals (Sweden)

    Ola Hall

    2018-06-01

    Full Text Available Yield estimates and yield gap analysis are important for identifying poor agricultural productivity. Remote sensing holds great promise for measuring yield and thus determining yield gaps. Farming systems in sub-Saharan Africa (SSA are commonly characterized by small field size, intercropping, different crop species with similar phenologies, and sometimes high cloud frequency during the growing season, all of which pose real challenges to remote sensing. Here, an unmanned aerial vehicle (UAV system based on a quadcopter equipped with two consumer-grade cameras was used for the delineation and classification of maize plants on smallholder farms in Ghana. Object-oriented image classification methods were applied to the imagery, combined with measures of image texture and intensity, hue, and saturation (IHS, in order to achieve delineation. It was found that the inclusion of a near-infrared (NIR channel and red–green–blue (RGB spectra, in combination with texture or IHS, increased the classification accuracy for both single and mosaic images to above 94%. Thus, the system proved suitable for delineating and classifying maize using RGB and NIR imagery and calculating the vegetation fraction, an important parameter in producing yield estimates for heterogeneous smallholder farming systems.

  7. Mechanisms of IhERG/IKr Modulation by α1-Adrenoceptors in HEK293 Cells and Cardiac Myocytes

    Directory of Open Access Journals (Sweden)

    Janire Urrutia

    2016-12-01

    Full Text Available Background: The rapid delayed rectifier K+ current (IKr), carried by the hERG protein, is one of the main repolarising currents in the human heart, and a reduction of this current increases the risk of ventricular fibrillation. α1-adrenoceptor (α1-AR) activation reduces IKr but, despite the clear relationship between an increase in sympathetic tone and arrhythmias, the mechanisms underlying the α1-AR regulation of the hERG channel are controversial. Thus, we aimed to investigate the mechanisms by which α1-AR stimulation regulates IKr. Methods: α1-adrenoceptors, hERG channels, the auxiliary subunits minK and MIRP1, the non-PIP2-interacting mutant D-hERG (with a deletion of amino acids 883-894 in the C-terminal) and the non-PKC-phosphorylable N-terminal truncated mutant (NTK-hERG) were transfected into HEK293 cells. Cell membranes were extracted by centrifugation and the different proteins were visualized by Western blot. Potassium currents were recorded by the patch-clamp technique. IKr was recorded in isolated feline cardiac myocytes. Results: Activation of the α1-AR reduces the amplitude of IhERG and IKr through a positive shift in the activation half voltage, which reduces channel availability at physiological membrane potentials. The intracellular pathway connecting the α1-AR to the hERG channel in HEK293 cells includes activation of the Gαq protein, PLC activation and PIP2 hydrolysis, activation of PKC and direct phosphorylation of the hERG channel N-terminal. The PKC-mediated IKr channel phosphorylation and subsequent IKr reduction after α1-AR stimulation was corroborated in feline cardiac myocytes. Conclusions: These findings clarify the link between sympathetic nervous system hyperactivity and IKr reduction, one of the best characterized causes of torsades de pointes and ventricular fibrillation.

  8. Vocable Code

    DEFF Research Database (Denmark)

    Soon, Winnie; Cox, Geoff

    2018-01-01

    a computational and poetic composition for two screens: on one of these, texts and voices are repeated and disrupted by mathematical chaos, together exploring the performativity of code and language; on the other, is a mix of a computer programming syntax and human language. In this sense queer code can be understood as both an object and subject of study that intervenes in the world’s ‘becoming' and how material bodies are produced via human and nonhuman practices. Through mixing the natural and computer language, this article presents a script in six parts from a performative lecture for two persons...

  9. NSURE code

    International Nuclear Information System (INIS)

    Rattan, D.S.

    1993-11-01

    NSURE stands for Near-Surface Repository code. NSURE is a performance assessment code developed for the safety assessment of near-surface disposal facilities for low-level radioactive waste (LLRW). Part one of this report documents the NSURE model, the governing equations and formulation of the mathematical models, and their implementation under the SYVAC3 executive. The NSURE model simulates the release of nuclides from an engineered vault and their subsequent transport via the groundwater and surface water pathways to the biosphere, and predicts the resulting dose rate to a critical individual. Part two of this report consists of a User's manual, describing simulation procedures, input data preparation, output and example test cases

  10. 75 FR 78213 - Proposed Information Collection; Comment Request; 2012 Economic Census Classification Report for...

    Science.gov (United States)

    2010-12-15

    ... 8-digit North American Industry Classification System (NAICS) based code for use in the 2012... classification due to changes in NAICS for 2012. Collecting this classification information will ensure the... the reporting burden on sampled sectors. Proper NAICS classification data ensures high quality...

  11. The Aster code; Code Aster

    Energy Technology Data Exchange (ETDEWEB)

    Delbecq, J.M

    1999-07-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures); specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results, etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  12. Coding Class

    DEFF Research Database (Denmark)

    Ejsing-Duun, Stine; Hansbøl, Mikala

This report contains the evaluation and documentation of the Coding Class project. The Coding Class project was initiated in the 2016/2017 school year by IT-Branchen in collaboration with a number of member companies, the City of Copenhagen, Vejle Municipality, the Danish Agency for IT and Learning (STIL) and the volunteer association...... Coding Pirates. The report was written by Mikala Hansbøl, Docent in digital learning resources and research coordinator of the research and development environment Digitalisering i Skolen (DiS), Institut for Skole og Læring, Professionshøjskolen Metropol; and Stine Ejsing-Duun, Associate Professor of learning technology, interaction design......, design thinking and design pedagogy, Forskningslab: It og Læringsdesign (ILD-LAB), Institut for kommunikation og psykologi, Aalborg Universitet, Copenhagen. We followed, evaluated and documented the Coding Class project in the period November 2016 to May 2017...

  13. Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  14. ANIMAL code

    International Nuclear Information System (INIS)

    Lindemuth, I.R.

    1979-01-01

This report describes ANIMAL, a two-dimensional Eulerian magnetohydrodynamic computer code, and presents its physical model. The temporal and spatial finite-difference equations are formulated in a manner that facilitates implementation of the algorithm, and the functions of the algorithm's FORTRAN subroutines and variables are outlined

  15. Network Coding

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 15; Issue 7. Network Coding. K V Rashmi Nihar B Shah P Vijay Kumar. General Article Volume 15 Issue 7 July 2010 pp 604-621. Fulltext. Click here to view fulltext PDF. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/07/0604-0621 ...

  16. MCNP code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids

  17. Expander Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 10; Issue 1. Expander Codes - The Sipser–Spielman Construction. Priti Shankar. General Article Volume 10 ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science Bangalore 560 012, India.

  18. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais (EAV/IAE/CTA). The codes are organized according to the classification used by the Argonne National Laboratory. For each code the author, institution of origin, abstract, programming language and existing bibliography are given. (Author) [pt

  19. Classification in context

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper surveys classification research literature, discusses various classification theories, and shows that the focus has traditionally been on establishing a scientific foundation for classification research. This paper argues that a shift has taken place, and suggests that contemporary...... classification research focus on contextual information as the guide for the design and construction of classification schemes....

  20. Classification of the web

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. This paper suggests that the classification of the Web meets challenges...... that call for inquiries into the theoretical foundation of bibliographic classification theory....

  1. Panda code

    International Nuclear Information System (INIS)

    Altomare, S.; Minton, G.

    1975-02-01

    PANDA is a new two-group one-dimensional (slab/cylinder) neutron diffusion code designed to replace and extend the FAB series. PANDA allows for the nonlinear effects of xenon, enthalpy and Doppler. Fuel depletion is allowed. PANDA has a completely general search facility which will seek criticality, maximize reactivity, or minimize peaking. Any single parameter may be varied in a search. PANDA is written in FORTRAN IV, and as such is nearly machine independent. However, PANDA has been written with the present limitations of the Westinghouse CDC-6600 system in mind. Most computation loops are very short, and the code is less than half the useful 6600 memory size so that two jobs can reside in the core at once. (auth)

  2. CANAL code

    International Nuclear Information System (INIS)

    Gara, P.; Martin, E.

    1983-01-01

    The CANAL code presented here optimizes a realistic iron free extraction channel which has to provide a given transversal magnetic field law in the median plane: the current bars may be curved, have finite lengths and cooling ducts and move in a restricted transversal area; terminal connectors may be added, images of the bars in pole pieces may be included. A special option optimizes a real set of circular coils [fr

  3. The Role of Stress-Effected Subgrain Size Distribution in Anelastic Recovery: An Experimental Study on Polycrystalline Ice-Ih

    Science.gov (United States)

    Caswell, T. E.; Goldsby, D. L.; Cooper, R. F.; Prior, D. J.

    2013-12-01

Anelasticity, or time-dependent and recoverable strain, is the source of attenuation at seismic and sub-seismic frequencies, yet the processes governing anelastic recovery are poorly resolved. Numerous experimental studies [e.g., 1-3] have demonstrated that anelasticity occurs via diffusion-effected relaxation along grain boundaries, which leads to a significant grain size sensitivity. Similar studies, however, conducted on deformed single crystals [e.g. 4], coarse-grained metals deforming in dislocation creep [e.g., 5] and polycrystalline ice deforming via a dislocation-accommodated mechanism [6] demonstrate the same frequency dependence, consistent with the grain boundary mechanism, but with no sensitivity to grain size. We postulate that it is the deformation-effected distribution of subgrains, which possesses unique diffusive properties relative to a defect-free lattice, that dominates attenuation in these situations. To test this idea we are conducting creep and stress-drop experiments on polycrystalline ice-Ih with concurrent high-resolution microstructural analysis conducted via Electron Backscatter Diffraction (EBSD) [7] to characterize the relationship between subgrain size distribution and diffusion-effected anelasticity. Our experiments establish the subgrain size distribution in steady-state creep of fine-grained ice-Ih at compressional stresses between 0.1-4 MPa, which for the grain sizes and temperatures of our experiments places the rheology squarely within the regime of grain boundary sliding that is accommodated by basal dislocation slip [8]. We then explore the dynamics of the established microstructure, which includes subgrain formation [cf. 9], via stress-drop experiments [e.g. 10]. Experiments of this type allow the characterization of microstructural 'hardness,' i.e., the viscosity of the polycrystalline solid as effected by finite strain, from which we can discern the diffusive kinetics of subgrain boundaries [11, 12]. We are currently

  4. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  5. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.; Bensmail, H.; Yao, N.; Gao, Xin

    2013-01-01

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding), learn codebooks and codes in an unsupervised manner and neglect class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds, that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutations identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. 2013 The Authors. All rights reserved.

  6. Discriminative sparse coding on multi-manifolds

    KAUST Repository

    Wang, J.J.-Y.

    2013-09-26

    Sparse coding has been popularly used as an effective data representation method in various applications, such as computer vision, medical imaging and bioinformatics. However, the conventional sparse coding algorithms and their manifold-regularized variants (graph sparse coding and Laplacian sparse coding), learn codebooks and codes in an unsupervised manner and neglect class information that is available in the training set. To address this problem, we propose a novel discriminative sparse coding method based on multi-manifolds, that learns discriminative class-conditioned codebooks and sparse codes from both data feature spaces and class labels. First, the entire training set is partitioned into multiple manifolds according to the class labels. Then, we formulate the sparse coding as a manifold-manifold matching problem and learn class-conditioned codebooks and codes to maximize the manifold margins of different classes. Lastly, we present a data sample-manifold matching-based strategy to classify the unlabeled data samples. Experimental results on somatic mutations identification and breast tumor classification based on ultrasonic images demonstrate the efficacy of the proposed data representation and classification approach. 2013 The Authors. All rights reserved.

  7. From concatenated codes to graph codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Høholdt, Tom

    2004-01-01

    We consider codes based on simple bipartite expander graphs. These codes may be seen as the first step leading from product type concatenated codes to more complex graph codes. We emphasize constructions of specific codes of realistic lengths, and study the details of decoding by message passing...

  8. Comparison and analysis for item classifications between AP1000 and traditional PWR

    International Nuclear Information System (INIS)

    Luo Shuiyun; Liu Xiaoyan

    2012-01-01

The comparison and analysis of the safety classification, seismic category, code classification and QA classification between AP1000 and traditional PWR are presented. Thanks to the various AP1000 classifications, safety can be guaranteed while construction and manufacturing costs are cut down. It is suggested that QA classifications and QA requirements corresponding to national conditions be drafted in the process of AP1000 domestication. (authors)

  9. Survey of Codes Employing Nuclear Damage Assessment

    Science.gov (United States)

    1977-10-01

The surveyed codes were ... (battalion level and above); TALLEY/TOTEM: not nuclear; TARTARUS: too highly aggregated (battalion level and above); UNICORN: highly aggregated force allocation code... Vulnerability data can be input by the user as he receives them, and there is the ability to replay any situation using hindsight. The age of target

  10. Diagnosis of periodontal diseases using different classification ...

    African Journals Online (AJOL)

    2014-11-29

Nov 29, 2014 ... Nigerian Journal of Clinical Practice • May-Jun 2015 • Vol 18 • Issue 3 ... Materials and Methods: A total of 150 patients were divided into two groups such as training ... functions from training data and DT learning is one of ... were represented as numerical codings for classification ..... tool within dentistry.

  11. High Order Tensor Formulation for Convolutional Sparse Coding

    KAUST Repository

    Bibi, Adel Aamer; Ghanem, Bernard

    2017-01-01

Convolutional sparse coding (CSC) has gained attention for its successful role as a reconstruction and a classification tool in the computer vision and machine learning community. Current CSC methods can only reconstruct single-feature 2D images

  12. SAW Classification Algorithm for Chinese Text Classification

    OpenAIRE

    Xiaoli Guo; Huiyu Sun; Tiehua Zhou; Ling Wang; Zhaoyang Qu; Jiannan Zang

    2015-01-01

Considering the explosive growth of data, the increasing amount of text data places ever higher demands on the performance of text categorization, demands that existing classification methods cannot satisfy. Building on a study of existing text classification technology and semantics, this paper puts forward a SAW (Structural Auxiliary Word) algorithm oriented to Chinese text classification. The algorithm uses the special space effect of Chinese text where words...

  13. The classification of easement

    Directory of Open Access Journals (Sweden)

    Popov Danica D.

    2015-01-01

Easement means a right enjoyed by the owner of land over the lands of another, such as rights of way, rights of light, rights of support, rights to a flow of air or water, etc. The dominant tenement is the land owned by the possessor of the easement, and the servient tenement is the land over which the right is enjoyed. An easement must exist for the accommodation and better enjoyment of the land to which it is annexed, otherwise it may amount to a mere licence. An easement benefits and binds the land itself and therefore continues despite any change of ownership of either dominant or servient tenement, although it will be extinguished if the two tenements come into common ownership. An easement can only be enjoyed in respect of land. This means two parcels of land: first there must be a 'dominant tenement', to which the benefit of the easement attaches, and another ('servient tenement') which bears the burden of the easement. A positive easement consists of a right to do something on the land of another; a negative easement restricts the use the owner of the servient tenement may make of his land. An easement may attach to the land or to a house built on the land. A further classification distinguishes easements on the ground from easements under the ground. An easement shall be exercised in accordance with the principle of restriction, that is, so as to burden the servient tenement as little as possible. When there is doubt about the extent of an easement, the interpretation that is less burdensome to the servient tenement shall be adopted. New needs of the dominant estate do not result in the expansion of the servitude. The article compares The Draft Code of property and other real estate with The Draft of Civil Code of Serbia.

  14. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the Foxbase language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself, chosen among the upper- and lower-level codes that were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of it into another data processing program was possible. This program has the merits of simple operation, accurate and detailed coding, and easy adjustment for another program. Therefore, this program can be used to automate routine work in the department of radiology.
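The two-step lookup the abstract describes (select an organ code, then use its first digit to pick the matching pathology table) can be sketched as follows. All table contents here are invented placeholders, not real ACR dictionary entries; only the composed code '131.3661' comes from the abstract.

```python
# Hypothetical miniature of the two-step ACR lookup: the organ code is
# chosen first, and its first digit selects the matching pathology table.
# The table entries below are invented placeholders.
ORGAN_CODES = {"131": "organ (placeholder name)"}
PATHOLOGY_TABLES = {"1": {"3661": "pathology (placeholder name)"}}

def acr_code(organ_code: str, pathology_code: str) -> str:
    if organ_code not in ORGAN_CODES:
        raise KeyError(f"unknown organ code: {organ_code}")
    # the pathology file is chosen by the first digit of the organ code
    table = PATHOLOGY_TABLES[organ_code[0]]
    if pathology_code not in table:
        raise KeyError(f"unknown pathology code: {pathology_code}")
    return f"{organ_code}.{pathology_code}"

print(acr_code("131", "3661"))  # the abstract's example: 131.3661
```

Because the composed string keeps both parts intact, decoding is just splitting on the dot, which matches the abstract's point that the same program can encode and decode.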

  15. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  16. Dynamic Shannon Coding

    OpenAIRE

    Gagie, Travis

    2005-01-01

    We present a new algorithm for dynamic prefix-free coding, based on Shannon coding. We give a simple analysis and prove a better upper bound on the length of the encoding produced than the corresponding bound for dynamic Huffman coding. We show how our algorithm can be modified for efficient length-restricted coding, alphabetic coding and coding with unequal letter costs.
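For context, the classical static Shannon code that such dynamic schemes build on assigns each symbol a codeword of length ⌈-log2 p⌉, read off from the binary expansion of the cumulative probability of the more probable symbols. A minimal sketch of the static construction (the paper's dynamic, length-restricted, alphabetic and unequal-letter-cost variants are not shown):

```python
import math

def shannon_code(probs):
    """Static Shannon code: sort symbols by decreasing probability and
    give each a codeword of length ceil(-log2 p), taken from the binary
    expansion of the cumulative probability of the preceding symbols."""
    items = sorted(probs.items(), key=lambda kv: -kv[1])
    code, cum = {}, 0.0
    for sym, p in items:
        length = math.ceil(-math.log2(p))
        # binary expansion of `cum`, truncated to `length` bits
        frac, bits = cum, []
        for _ in range(length):
            frac *= 2
            bit = int(frac)
            bits.append(str(bit))
            frac -= bit
        code[sym] = "".join(bits)
        cum += p
    return code

code = shannon_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)  # prefix-free: {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```

Sorting by decreasing probability guarantees the resulting code is prefix-free, which is the property the dynamic algorithm must maintain as probabilities are updated on the fly.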

  17. Fundamentals of convolutional coding

    CERN Document Server

    Johannesson, Rolf

    2015-01-01

    Fundamentals of Convolutional Coding, Second Edition, regarded as a bible of convolutional coding brings you a clear and comprehensive discussion of the basic principles of this field * Two new chapters on low-density parity-check (LDPC) convolutional codes and iterative coding * Viterbi, BCJR, BEAST, list, and sequential decoding of convolutional codes * Distance properties of convolutional codes * Includes a downloadable solutions manual

  18. Codes Over Hyperfields

    Directory of Open Access Journals (Sweden)

    Atamewoue Surdive

    2017-12-01

    Full Text Available In this paper, we define linear codes and cyclic codes over a finite Krasner hyperfield and we characterize these codes by their generator matrices and parity check matrices. We also demonstrate that codes over finite Krasner hyperfields are more interesting for code theory than codes over classical finite fields.

  19. Lossless Compression of Classification-Map Data

    Science.gov (United States)

    Hua, Xie; Klimesh, Matthew

    2009-01-01

A lossless image-data-compression algorithm intended specifically for application to classification-map data is based on prediction, context modeling, and entropy coding. The algorithm was formulated, in consideration of the differences between classification maps and ordinary images of natural scenes, so as to be capable of compressing classification-map data more effectively than do general-purpose image-data-compression algorithms. Classification maps are typically generated from remote-sensing images acquired by instruments aboard aircraft (see figure) and spacecraft. A classification map is a synthetic image that summarizes information derived from one or more original remote-sensing image(s) of a scene. The value assigned to each pixel in such a map is the index of a class that represents some type of content deduced from the original image data: for example, a type of vegetation, a mineral, or a body of water at the corresponding location in the scene. When classification maps are generated onboard the aircraft or spacecraft, it is desirable to compress the classification-map data in order to reduce the volume of data that must be transmitted to a ground station.
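Why prediction pays off on classification maps, whose large homogeneous regions make each pixel highly predictable from its neighbors, can be illustrated by comparing zeroth-order entropies of raw labels versus left-neighbor prediction residuals. This is only a sketch; the algorithm's actual predictor, context model, and entropy coder are not reproduced here:

```python
import math
from collections import Counter

def entropy_bits(symbols):
    """Zeroth-order entropy (bits/symbol) of a symbol sequence."""
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in Counter(symbols).values())

def raw_vs_predicted(class_map):
    """Compare the entropy of the raw class labels with the entropy of
    left-neighbour prediction residuals, where residual 0 means 'same
    as left neighbour' and any other label is shifted up by one."""
    raw = [label for row in class_map for label in row]
    residuals = []
    for row in class_map:
        prev = None
        for label in row:
            residuals.append(0 if label == prev else label + 1)
            prev = label
    return entropy_bits(raw), entropy_bits(residuals)

# A map with large uniform regions, as classification maps typically have:
demo = [[1] * 8 + [2] * 8, [3] * 8 + [4] * 8]
h_raw, h_pred = raw_vs_predicted(demo)
print(h_raw, h_pred)  # residual entropy is far lower than raw entropy
```

An entropy coder driven by the residual distribution would therefore spend far fewer bits per pixel than one fed the raw labels, which is the core idea the abstract describes.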

  20. Classifying Coding DNA with Nucleotide Statistics

    Directory of Open Access Journals (Sweden)

    Nicolas Carels

    2009-10-01

In this report, we compared the success rate of classification of coding sequences (CDS) vs. introns by Codon Structure Factor (CSF) and by a method that we called Universal Feature Method (UFM). UFM is based on the scoring of purine bias (Rrr) and stop codon frequency. We show that the success rate of CDS/intron classification by UFM is higher than by CSF. UFM classifies ORFs as coding or non-coding through a score based on (i) the stop codon distribution, (ii) the product of purine probabilities in the three positions of nucleotide triplets, (iii) the product of Cytosine (C), Guanine (G), and Adenine (A) probabilities in the 1st, 2nd, and 3rd positions of triplets, respectively, (iv) the probabilities of G in the 1st and 2nd positions of triplets and (v) the distance of their GC3 vs. GC2 levels to the regression line of the universal correlation. More than 80% of CDSs (true positives) of Homo sapiens (>250 bp), Drosophila melanogaster (>250 bp) and Arabidopsis thaliana (>200 bp) are successfully classified with a false positive rate lower than or equal to 5%. The method releases coding sequences in their coding strand and coding frame, which allows their automatic translation into protein sequences with 95% confidence. The method is a natural consequence of the compositional bias of nucleotides in coding sequences.
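Two of the signals UFM draws on, in-frame stop-codon frequency and per-position purine frequencies, are easy to extract from a sequence; the sketch below is illustrative only and does not reproduce the published scoring formula or thresholds:

```python
def orf_features(seq):
    """Toy extraction of two UFM-style signals: the fraction of in-frame
    stop codons and the purine (A/G) frequency at each of the three
    codon positions. How these feed into a coding/non-coding score is
    not reproduced from the published method."""
    seq = seq.upper()
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    stops = {"TAA", "TAG", "TGA"}
    stop_freq = sum(c in stops for c in codons) / len(codons)
    purines = {"A", "G"}
    pos_purine = [
        sum(c[k] in purines for c in codons) / len(codons) for k in range(3)
    ]
    return stop_freq, pos_purine

# Three codons (ATG, AAA, TAA), of which one is a stop codon:
stop_freq, pos_purine = orf_features("ATGAAATAA")
print(stop_freq, pos_purine)
```

A genuine CDS read in its coding frame should show few in-frame stops and a purine bias across codon positions, which is what makes such features discriminative.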

  1. Research on quality assurance classification methodology for domestic AP1000 nuclear power projects

    International Nuclear Information System (INIS)

    Bai Jinhua; Jiang Huijie; Li Jingyan

    2012-01-01

    To meet the quality assurance classification requirements of domestic nuclear safety codes and standards, this paper analyzes the quality assurance classification methodology of domestic AP1000 nuclear power projects at present, and proposes the quality assurance classification methodology for subsequent AP1000 nuclear power projects. (authors)

  2. Asteroid taxonomic classifications

    International Nuclear Information System (INIS)

    Tholen, D.J.

    1989-01-01

    This paper reports on three taxonomic classification schemes developed and applied to the body of available color and albedo data. Asteroid taxonomic classifications according to two of these schemes are reproduced

  3. Mixed quantum/classical approach to OH-stretch inelastic incoherent neutron scattering spectroscopy for ambient and supercooled liquid water and ice Ih

    Energy Technology Data Exchange (ETDEWEB)

    Shi, L.; Skinner, J. L. [Theoretical Chemistry Institute and Department of Chemistry, University of Wisconsin, Madison, Wisconsin 53706 (United States)

    2015-07-07

    OH-stretch inelastic incoherent neutron scattering (IINS) has been measured to determine the vibrational density of states (VDOS) in the OH-stretch region for liquid water, supercooled water, and ice Ih, providing complementary information to IR and Raman spectroscopies about hydrogen bonding in these phases. In this work, we extend the combined electronic-structure/molecular-dynamics (ES/MD) method, originally developed by Skinner and co-workers to simulate OH-stretch IR and Raman spectra, to the calculation of IINS spectra with small k values. The agreement between theory and experiment in the limit k → 0 is reasonable, further validating the reliability of the ES/MD method in simulating OH-stretch spectroscopy in condensed phases. The connections and differences between IINS and IR spectra are analyzed to illustrate the advantages of IINS over IR in estimating the OH-stretch VDOS.

  4. Mixed quantum/classical approach to OH-stretch inelastic incoherent neutron scattering spectroscopy for ambient and supercooled liquid water and ice Ih

    International Nuclear Information System (INIS)

    Shi, L.; Skinner, J. L.

    2015-01-01

    OH-stretch inelastic incoherent neutron scattering (IINS) has been measured to determine the vibrational density of states (VDOS) in the OH-stretch region for liquid water, supercooled water, and ice Ih, providing complementary information to IR and Raman spectroscopies about hydrogen bonding in these phases. In this work, we extend the combined electronic-structure/molecular-dynamics (ES/MD) method, originally developed by Skinner and co-workers to simulate OH-stretch IR and Raman spectra, to the calculation of IINS spectra with small k values. The agreement between theory and experiment in the limit k → 0 is reasonable, further validating the reliability of the ES/MD method in simulating OH-stretch spectroscopy in condensed phases. The connections and differences between IINS and IR spectra are analyzed to illustrate the advantages of IINS over IR in estimating the OH-stretch VDOS

  5. Hand eczema classification

    DEFF Research Database (Denmark)

    Diepgen, T L; Andersen, Klaus Ejner; Brandao, F M

    2008-01-01

of the disease is rarely evidence based, and a classification system for different subdiagnoses of hand eczema is not agreed upon. Randomized controlled trials investigating the treatment of hand eczema are called for. For this, as well as for clinical purposes, a generally accepted classification system...... A classification system for hand eczema is proposed. Conclusions: It is suggested that this classification be used in clinical work and in clinical trials....

  6. Vector Network Coding Algorithms

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...

  7. Classification with support hyperplanes

    NARCIS (Netherlands)

    G.I. Nalbantov (Georgi); J.C. Bioch (Cor); P.J.F. Groenen (Patrick)

    2006-01-01

A new classification method is proposed, called Support Hyperplanes (SHs). To solve the binary classification task, SHs consider the set of all hyperplanes that do not make classification mistakes, referred to as semi-consistent hyperplanes. A test object is classified using

  8. Standard classification: Physics

    International Nuclear Information System (INIS)

    1977-01-01

This is a draft standard classification of physics. The scheme is based on the physics part of the systematic catalogue of the Bayerische Staatsbibliothek and on the classifications given in standard textbooks. The ICSU-AB classification now used worldwide by physics information services was not taken into account. (BJ) [de

  9. Identification of ICD Codes Suggestive of Child Maltreatment

    Science.gov (United States)

    Schnitzer, Patricia G.; Slusher, Paula L.; Kruse, Robin L.; Tarleton, Molly M.

    2011-01-01

    Objective: In order to be reimbursed for the care they provide, hospitals in the United States are required to use a standard system to code all discharge diagnoses: the International Classification of Disease, 9th Revision, Clinical Modification (ICD-9). Although ICD-9 codes specific for child maltreatment exist, they do not identify all…

  10. Homological stabilizer codes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.

  11. Classification of refrigerants; Classification des fluides frigorigenes

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

This document is based on the US standard ANSI/ASHRAE 34, published in 2001 and entitled 'Designation and safety classification of refrigerants'. This classification organizes, in an internationally consistent way, all the refrigerants used in the world through a codification of the refrigerants that corresponds to their chemical composition. This note explains this codification: prefix, suffixes (hydrocarbons and derived fluids, azeotropic and non-azeotropic mixtures, various organic compounds, non-organic compounds), and safety classification (toxicity, flammability, case of mixtures). (J.S.)
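For methane- and ethane-series refrigerants, the numeric part of the designation encodes the molecule's composition. A sketch assuming the usual digit rule (C - 1, H + 1, F), with isomer suffix letters such as the 'a' in R-134a left out:

```python
def refrigerant_number(carbons: int, hydrogens: int, fluorines: int) -> str:
    """Numeric designation for methane/ethane-series halocarbons:
    the digits are (C - 1), (H + 1), (F), with a leading zero dropped.
    Isomer suffix letters (e.g. the 'a' in R-134a) are not modeled."""
    return f"R-{(carbons - 1) * 100 + (hydrogens + 1) * 10 + fluorines}"

print(refrigerant_number(2, 2, 4))  # C2H2F4 (tetrafluoroethane) -> R-134
print(refrigerant_number(1, 1, 2))  # CHClF2 (chlorodifluoromethane) -> R-22
```

Note that chlorine atoms are not encoded directly; they are implied by the valences left over once carbon, hydrogen and fluorine counts are fixed.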

  12. Diagnostic Coding for Epilepsy.

    Science.gov (United States)

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  13. Coding of Neuroinfectious Diseases.

    Science.gov (United States)

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  14. Measurements of Neuronal Soma Size and Estimated Peptide Concentrations in Addition to Cell Abundance Offer a Higher Resolution of Seasonal and Reproductive Influences of GnRH-I and GnIH in European Starlings.

    Science.gov (United States)

    Amorin, Nelson; Calisi, Rebecca M

    2015-08-01

    Hypothalamic neuropeptides involved in vertebrate reproduction, gonadotropin releasing hormone (GnRH-I) and gonadotropin-inhibitory hormone (GnIH), can vary in the abundance of immunoreactive cells as a function of the reproductive status and nest box occupation of European starlings (Sturnus vulgaris). While using the abundance of cells as an indicator of the activity of neurohormones is informative, incorporating information on cell size (readily observed using immunohistochemistry) can offer a more detailed understanding of environmentally-mediated changes in hormonal dynamics. In this study, we tested the hypothesis that the size of cells' somas and the estimated concentration of peptides in cells immunoreactive (ir) for GnRH-I and GnIH would vary throughout the breeding season and as a function of nest-box status (resident or not). In the absence of a direct assay of protein, we estimated an index of the concentration of hypothalamic peptides via the relative optical density (i.e., the difference between the mean optical density and the optical density of background staining). In support of our hypothesis, we found that GnRH-I- and GnIH-ir soma size and peptide concentration changed in both males and females throughout the breeding season. Somas were largest and estimated peptide concentration was highest mid-season when compared with earlier in the season or to the non-breeding period. For nest-box residents, GnIH-ir soma size and peptide concentration were higher during the middle of the breeding season than earlier in the breeding season, although residence in the nest box was not related to GnRH-I-ir variables. Our results confirm that previously reported changes in cell abundance mimic changes we see in GnRH-I and GnIH-ir soma size and our proxy for peptide concentration. However, investigating changes in the soma of GnRH-I-ir cells revealed a peak in size during the middle of the breeding season, a change not evident when solely examining data on the abundance of cells.

  15. Classification, disease, and diagnosis.

    Science.gov (United States)

    Jutel, Annemarie

    2011-01-01

    Classification shapes medicine and guides its practice. Understanding classification must be part of the quest to better understand the social context and implications of diagnosis. Classifications are part of the human work that provides a foundation for the recognition and study of illness: deciding how the vast expanse of nature can be partitioned into meaningful chunks, stabilizing and structuring what is otherwise disordered. This article explores the aims of classification, their embodiment in medical diagnosis, and the historical traditions of medical classification. It provides a brief overview of the aims and principles of classification and their relevance to contemporary medicine. It also demonstrates how classifications operate as social framing devices that enable and disable communication, assert and refute authority, and are important items for sociological study.

  16. Vector Network Coding

    OpenAIRE

    Ebrahimi, Javad; Fragouli, Christina

    2010-01-01

    We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L X L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...
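
    The role of the L x L coding matrices can be illustrated with a toy sketch over GF(2): an intermediate node transforms each incoming packet by its matrix and XORs the results. Function names here are illustrative, and real constructions choose the matrices so the sinks can still decode:

```python
def matvec_gf2(M, v):
    """Multiply an L x L binary matrix by a length-L bit vector over GF(2)."""
    return [sum(m * x for m, x in zip(row, v)) % 2 for row in M]

def combine(coding_matrices, packets):
    """Intermediate-node output: the GF(2) sum of M_i * p_i over all inputs."""
    out = [0] * len(packets[0])
    for M, p in zip(coding_matrices, packets):
        out = [(a + b) % 2 for a, b in zip(out, matvec_gf2(M, p))]
    return out

p1, p2 = [1, 0, 1], [0, 1, 1]
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # identity coding matrices (L = 3)
print(combine([I, I], [p1, p2]))        # [1, 1, 0] (elementwise XOR)
```

With identity matrices this degenerates to plain XOR coding; non-identity matrices give the extra degrees of freedom that distinguish vector from scalar network coding.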

  17. Entropy Coding in HEVC

    OpenAIRE

    Sze, Vivienne; Marpe, Detlev

    2014-01-01

    Context-Based Adaptive Binary Arithmetic Coding (CABAC) is a method of entropy coding first introduced in H.264/AVC and now used in the latest High Efficiency Video Coding (HEVC) standard. While it provides high coding efficiency, the data dependencies in H.264/AVC CABAC make it challenging to parallelize and thus limit its throughput. Accordingly, during the standardization of entropy coding for HEVC, both aspects of coding efficiency and throughput were considered. This chapter describes th...

  18. Generalized concatenated quantum codes

    International Nuclear Information System (INIS)

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-01-01

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematic way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  19. Rateless feedback codes

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Koike-Akino, Toshiaki; Orlik, Philip

    2012-01-01

    This paper proposes a concept called rateless feedback coding. We redesign the existing LT and Raptor codes, by introducing new degree distributions for the case when a few feedback opportunities are available. We show that incorporating feedback to LT codes can significantly decrease both the coding overhead and the encoding/decoding complexity. Moreover, we show that, at the price of a slight increase in the coding overhead, linear complexity is achieved with Raptor feedback coding.
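
    The peeling decoder that underlies LT and Raptor decoding can be sketched as follows. Each encoded symbol is the XOR of a subset of source packets; the decoder repeatedly finds a symbol with exactly one unresolved neighbor and peels it. Here a hand-picked set of neighbor lists replaces the random degree distribution, so this illustrates the mechanism rather than the paper's redesigned distributions:

```python
def peel_decode(symbols, k):
    """Recover k source packets from encoded symbols, each given as a pair
    (neighbor_set, xor_value), using the standard peeling decoder."""
    symbols = [(set(nbrs), val) for nbrs, val in symbols]
    recovered = {}
    progress = True
    while len(recovered) < k and progress:
        progress = False
        for nbrs, val in symbols:
            live = nbrs - recovered.keys()
            if len(live) == 1:                    # degree-one symbol: peel it
                (i,) = live
                reduced = val
                for j in nbrs & recovered.keys():
                    reduced ^= recovered[j]       # strip known contributions
                recovered[i] = reduced
                progress = True
    return [recovered[i] for i in range(k)]

source = [0b1010, 0b0111, 0b1100, 0b0001]
# hand-picked neighbor sets; a real LT encoder draws them from a degree distribution
encoded = [({0}, source[0]),
           ({0, 1}, source[0] ^ source[1]),
           ({1, 2}, source[1] ^ source[2]),
           ({0, 2, 3}, source[0] ^ source[2] ^ source[3])]
print(peel_decode(encoded, 4) == source)   # True
```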

  20. Security classification of information

    Energy Technology Data Exchange (ETDEWEB)

    Quist, A.S.

    1993-04-01

    This document is the second of a planned four-volume work that comprehensively discusses the security classification of information. The main focus of Volume 2 is on the principles for classification of information. Included herein are descriptions of the two major types of information that governments classify for national security reasons (subjective and objective information), guidance to use when determining whether information under consideration for classification is controlled by the government (a necessary requirement for classification to be effective), information disclosure risks and benefits (the benefits and costs of classification), standards to use when balancing information disclosure risks and benefits, guidance for assigning classification levels (Top Secret, Secret, or Confidential) to classified information, guidance for determining how long information should be classified (classification duration), classification of associations of information, classification of compilations of information, and principles for declassifying and downgrading information. Rules or principles of certain areas of our legal system (e.g., trade secret law) are sometimes mentioned to provide added support to some of those classification principles.

  1. Advanced video coding systems

    CERN Document Server

    Gao, Wen

    2015-01-01

    This comprehensive and accessible text/reference presents an overview of the state of the art in video coding technology. Specifically, the book introduces the tools of the AVS2 standard, describing how AVS2 can help to achieve a significant improvement in coding efficiency for future video networks and applications by incorporating smarter coding tools such as scene video coding. Topics and features: introduces the basic concepts in video coding, and presents a short history of video coding technology and standards; reviews the coding framework, main coding tools, and syntax structure of AV

  2. Coding for dummies

    CERN Document Server

    Abraham, Nikhil

    2015-01-01

    Hands-on exercises help you learn to code like a pro. No coding experience is required for Coding For Dummies, your one-stop guide to building a foundation of knowledge in writing computer code for web, application, and software development. It doesn't matter if you've dabbled in coding or never written a line of code, this book guides you through the basics. Using foundational web development languages like HTML, CSS, and JavaScript, it explains in plain English how coding works and why it's needed. Online exercises developed by Codecademy, a leading online code training site, help hone coding skill

  3. Searching bioremediation patents through Cooperative Patent Classification (CPC).

    Science.gov (United States)

    Prasad, Rajendra

    2016-03-01

    Patent classification systems have traditionally evolved independently at each patent jurisdiction to classify patents handled by their examiners, so that previous patents could be searched while dealing with new patent applications. As the patent databases they maintain went online, both for free public access and for global prior-art searches by examiners, the need arose for a common platform and a uniform database structure. The diversity of the different classification systems, however, posed problems for integrating and searching relevant patents across patent jurisdictions. To address this problem of comparability of data from different sources and searching patents, WIPO developed the International Patent Classification (IPC) system, which most countries readily adopted, coding their patents with IPC codes alongside their own. The Cooperative Patent Classification (CPC) is the latest patent classification system, based on the IPC/European Classification (ECLA) system and developed by the European Patent Office (EPO) and the United States Patent and Trademark Office (USPTO), which is likely to become a global standard. This paper discusses this new classification system with reference to patents on bioremediation.
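
    A CPC symbol is built from a section letter (A-H, plus Y), a two-digit class, a subclass letter, and a main group/subgroup pair. A minimal parsing sketch (the function name and the example symbol are illustrative):

```python
import re

# section letter, two-digit class, subclass letter, "group/subgroup"
CPC = re.compile(r"^([A-HY])(\d{2})([A-Z])\s+(\d+)/(\d+)$")

def parse_cpc(symbol):
    """Split a CPC symbol into its hierarchical components."""
    m = CPC.match(symbol)
    if not m:
        raise ValueError(f"not a CPC symbol: {symbol!r}")
    section, cls, subclass, group, subgroup = m.groups()
    return {"section": section, "class": cls, "subclass": subclass,
            "main_group": group, "subgroup": subgroup}

print(parse_cpc("C02F 3/34"))
# {'section': 'C', 'class': '02', 'subclass': 'F', 'main_group': '3', 'subgroup': '34'}
```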

  4. Classification of Flotation Frothers

    Directory of Open Access Journals (Sweden)

    Jan Drzymala

    2018-02-01

    Full Text Available In this paper, a scheme for the classification of flotation frothers is presented. The scheme first indicates the physical system in which a frother is present; four such systems are distinguished: the pure state, aqueous solution, aqueous solution/gas, and aqueous solution/gas/solid. A meaningful classification of frothers thus relies on first choosing the physical system and then the feature, trend, or parameter(s) according to which the classification is performed; as a result, numerous classifications of flotation frothers are possible, and they can be organized into the scheme described in detail in this paper. The proposed classification can play a useful role in characterizing and evaluating flotation frothers.

  5. Hyperspectral Image Classification Using Discriminative Dictionary Learning

    International Nuclear Information System (INIS)

    Zongze, Y; Hao, S; Kefeng, J; Huanxin, Z

    2014-01-01

    The hyperspectral image (HSI) processing community has witnessed a surge of papers focusing on the utilization of sparse priors for effective HSI classification. In sparse representation based HSI classification, there are two phases: sparse coding with an over-complete dictionary, and classification. In this paper, we first apply a novel Fisher discriminative dictionary learning method, which captures the relative differences between classes. The competitive selection strategy ensures that atoms in the resulting over-complete dictionary are the most discriminative. Secondly, motivated by the assumption that spatially adjacent samples are statistically related and even belong to the same material (same class), we propose a majority voting scheme incorporating contextual information to predict the category label. Experimental results show that the proposed method effectively strengthens the relative discrimination of the constructed dictionary, and that incorporating the majority voting scheme generally achieves improved prediction performance.
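
    The contextual majority-voting step can be sketched independently of the dictionary learning: each pixel's label is replaced by the most common label among itself and its spatial neighbors. This is a hypothetical smoothing helper for illustration, not the paper's exact scheme:

```python
from collections import Counter

def contextual_label(center_label, neighbor_labels):
    """Majority vote over a pixel's own predicted label and those of its
    spatially adjacent pixels."""
    votes = Counter([center_label, *neighbor_labels])
    return votes.most_common(1)[0][0]

# a lone "water" pixel surrounded by "vegetation" gets smoothed over
print(contextual_label("water", ["vegetation"] * 4))   # vegetation
```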

  6. Ontologies vs. Classification Systems

    DEFF Research Database (Denmark)

    Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2009-01-01

    What is an ontology compared to a classification system? Is a taxonomy a kind of classification system or a kind of ontology? These are questions that we meet when working with people from industry and public authorities, who need methods and tools for concept clarification, for developing metadata sets or for obtaining advanced search facilities. In this paper we will present an attempt at answering these questions. We will give a presentation of various types of ontologies and briefly introduce terminological ontologies. Furthermore we will argue that classification systems, e.g. product classification systems and metadata taxonomies, should be based on ontologies.

  7. Iris Image Classification Based on Hierarchical Visual Codebook.

    Science.gov (United States)

    Zhenan Sun; Hui Zhang; Tieniu Tan; Jianyu Wang

    2014-06-01

    Iris recognition as a reliable method for personal identification has been well-studied with the objective to assign the class label of each iris image to a unique subject. In contrast, iris image classification aims to classify an iris image into an application-specific category, e.g., iris liveness detection (classification of genuine and fake iris images), race classification (e.g., classification of iris images of Asian and non-Asian subjects), or coarse-to-fine iris identification (classification of all iris images in the central database into multiple categories). This paper proposes a general framework for iris image classification based on texture analysis. A novel texture pattern representation method called the Hierarchical Visual Codebook (HVC) is proposed to encode the texture primitives of iris images. The proposed HVC method is an integration of two existing Bag-of-Words models, namely the Vocabulary Tree (VT) and Locality-constrained Linear Coding (LLC). The HVC adopts a coarse-to-fine visual coding strategy and takes advantage of both VT and LLC for accurate and sparse representation of iris texture. Extensive experimental results demonstrate that the proposed iris image classification method achieves state-of-the-art performance for iris liveness detection, race classification, and coarse-to-fine iris identification. A comprehensive fake iris image database simulating four types of iris spoof attacks is developed as the benchmark for research of iris liveness detection.

  8. General regression and representation model for classification.

    Directory of Open Access Journals (Sweden)

    Jianjun Qian

    Full Text Available Recently, the regularized coding-based classification methods (e.g. SRC and CRC) have shown great potential for pattern classification. However, most existing coding methods assume that the representation residuals are uncorrelated. In real-world applications, this assumption does not hold. In this paper, we take the correlations of the representation residuals into account and develop a general regression and representation model (GRR) for classification. GRR not only has the advantages of CRC, but also makes full use of the prior information (e.g. the correlations between representation residuals and representation coefficients) and the specific information (the weight matrix of image pixels) to enhance classification performance. GRR uses generalized Tikhonov regularization and K Nearest Neighbors to learn the prior information from the training data. Meanwhile, the specific information is obtained by using an iterative algorithm to update the feature (or image pixel) weights of the test sample. With the proposed model as a platform, we design two classifiers: the basic general regression and representation classifier (B-GRR) and the robust general regression and representation classifier (R-GRR). The experimental results demonstrate the performance advantages of the proposed methods over state-of-the-art algorithms.
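
    A minimal sketch of the shared idea behind CRC-style regularized coding classifiers, assuming the generalized Tikhonov term reduces to plain ridge regularization (GRR's learned priors and pixel weights are omitted, and all names are illustrative): code the query against all atoms jointly, then assign the class whose atoms best reconstruct it.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ridge_code(atoms, y, lam=0.01):
    """alpha = (A^T A + lam*I)^-1 A^T y, with the atoms as columns of A."""
    n = len(atoms)
    G = [[sum(a * b for a, b in zip(atoms[i], atoms[j])) + (lam if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    rhs = [sum(a * v for a, v in zip(atoms[i], y)) for i in range(n)]
    return solve(G, rhs)

def classify(class_atoms, y):
    """Pick the class whose atoms, weighted by the shared code, best
    reconstruct y (smallest squared residual)."""
    atoms = [a for cls in class_atoms for a in cls]
    alpha = ridge_code(atoms, y)
    best, best_res, i = None, float("inf"), 0
    for label, cls in enumerate(class_atoms):
        coef = alpha[i:i + len(cls)]
        i += len(cls)
        recon = [sum(c * a[d] for c, a in zip(coef, cls)) for d in range(len(y))]
        res = sum((u - v) ** 2 for u, v in zip(recon, y))
        if res < best_res:
            best, best_res = label, res
    return best

class0 = [[1.0, 0.0, 0.0], [0.9, 0.1, 0.0]]   # toy training atoms, class 0
class1 = [[0.0, 0.0, 1.0], [0.0, 0.1, 0.9]]   # toy training atoms, class 1
print(classify([class0, class1], [0.95, 0.05, 0.0]))   # 0
```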

  9. Discussion on LDPC Codes and Uplink Coding

    Science.gov (United States)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error-correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  10. Headache attributed to intracranial pressure alterations: applicability of the International Classification of Headache Disorders ICHD-3 beta version versus ICHD-2.

    Science.gov (United States)

    Curone, M; Peccarisi, C; Bussone, G

    2015-05-01

    The association between headache and changes in intracranial pressure is strong in clinical practice. Syndromes associated with abnormalities of cerebrospinal fluid (CSF) pressure include spontaneous intracranial hypotension (SIH) and idiopathic intracranial hypertension (IIH). In 2013, the Headache Classification Committee of the International Headache Society (IHS) published the third International Classification of Headache Disorders (ICHD-3 beta version). The aim of this study was to investigate applicability of the new ICHD-3 versus ICHD-2 criteria in a clinical sample of patients with intracranial pressure (ICP) alterations. Patients admitted at our Headache Center for headache evaluation in whom a diagnosis of ICP alterations was performed were reviewed. 71 consecutive patients were studied. 40 patients (Group A) were diagnosed as IIH, 22 (Group B) as SIH, 7 (Group C) and 2 (Group D), respectively, as symptomatic intracranial hypertension and symptomatic intracranial hypotension. Main headache features were: in Group A, daily or nearly-daily headache (100 %) with diffuse/non-pulsating pain (73 %), aggravated by coughing/straining (54 %) and migrainous-associated symptoms (43 %). In Group B, an orthostatic headache (100 %) with nausea (29 %), vomiting (24 %), hearing disturbance (33 %), neck pain (48 %), hypacusia (24 %), photophobia (22 %) was reported. In Group C, a diffuse non-pulsating headache was present in 95 % with vomiting (25 %), sixth nerve palsy (14 %) and tinnitus (29 %). In Group D, an orthostatic headache with neck stiffness was reported by 100 %. Regarding applicability of ICHD-2 criteria in Group A, 73 % of the patients fitted criterion A; 100 %, criterion B; 100 %, criterion C; and 75 %, criterion D; while applying ICHD-3 beta version criteria, 100 % fitted criterion A; 97.5 %, criterion B; 100 %, criterion C; and 100 %, criterion D. In Group B, application of ICHD-2 showed 91 % patients fitting criterion A; 100 %, criterion B; 100

  11. Locally orderless registration code

    DEFF Research Database (Denmark)

    2012-01-01

    This is code for the TPAMI paper "Locally Orderless Registration". The code requires Intel Threading Building Blocks installed and is provided for 64 bit on Mac, Linux and Windows.

  12. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    Shannon limit of the channel. Among the earliest discovered codes that approach the. Shannon limit were the low density parity check (LDPC) codes. The term low density arises from the property of the parity check matrix defining the code. We will now define this matrix and the role that it plays in decoding. 2. Linear Codes.
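
    The role of the parity-check matrix can be shown with a small dense example, the (7,4) Hamming code, standing in for the large sparse matrices actual LDPC codes use. Its columns are the binary expansions of 1 through 7, so a nonzero syndrome directly reads out the position of a single bit error:

```python
# Parity-check matrix H of the (7,4) Hamming code; column j (1-indexed)
# is the binary expansion of j, least significant bit in the first row.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def syndrome(c):
    """H * c over GF(2); the all-zero syndrome means c is a codeword."""
    return [sum(h * b for h, b in zip(row, c)) % 2 for row in H]

codeword = [1, 1, 1, 0, 0, 0, 0]           # a valid codeword
print(syndrome(codeword))                   # [0, 0, 0]

corrupted = codeword[:]
corrupted[4] ^= 1                           # flip bit 5 (1-indexed)
s = syndrome(corrupted)
print(s, "-> error at position", s[0] + 2 * s[1] + 4 * s[2])   # [1, 0, 1] -> 5
```

LDPC decoding replaces this direct syndrome lookup with iterative message passing on the graph defined by the sparse H, but the "parity checks must all be satisfied" criterion is the same.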

  13. Manually operated coded switch

    International Nuclear Information System (INIS)

    Barnette, J.H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  14. Coding in Muscle Disease.

    Science.gov (United States)

    Jones, Lyell K; Ney, John P

    2016-12-01

    Accurate coding is critically important for clinical practice and research. Ongoing changes to diagnostic and billing codes require the clinician to stay abreast of coding updates. Payment for health care services, data sets for health services research, and reporting for medical quality improvement all require accurate administrative coding. This article provides an overview of administrative coding for patients with muscle disease and includes a case-based review of diagnostic and Evaluation and Management (E/M) coding principles in patients with myopathy. Procedural coding for electrodiagnostic studies and neuromuscular ultrasound is also reviewed.

  15. Synthesis and Isolation of the Titanium-Scandium Endohedral Fullerenes Sc2TiC@Ih-C80, Sc2TiC@D5h-C80 and Sc2TiC2@Ih-C80: Metal Size Tuning of the Ti(IV)/Ti(III) Redox Potentials.

    Science.gov (United States)

    Junghans, Katrin; Ghiassi, Kamran B; Samoylova, Nataliya A; Deng, Qingming; Rosenkranz, Marco; Olmstead, Marilyn M; Balch, Alan L; Popov, Alexey A

    2016-09-05

    The formation of endohedral metallofullerenes (EMFs) in an electric arc is reported for the mixed-metal Sc-Ti system utilizing methane as a reactive gas. Comparison of these results with those from the Sc/CH4 and Ti/CH4 systems as well as syntheses without methane revealed a strong mutual influence of all key components on the product distribution. Whereas a methane atmosphere alone suppresses the formation of empty cage fullerenes, the Ti/CH4 system forms mainly empty cage fullerenes. In contrast, the main fullerene products in the Sc/CH4 system are Sc4C2@C80 (the most abundant EMF from this synthesis), Sc3C2@C80, isomers of Sc2C2@C82, and the family Sc2C2n (2n=74, 76, 82, 86, 90, etc.), as well as Sc3CH@C80. The Sc-Ti/CH4 system produces the mixed-metal Sc2TiC@C2n (2n=68, 78, 80) and Sc2TiC2@C2n (2n=80) clusterfullerene families. The molecular structures of the new, transition-metal-containing endohedral fullerenes, Sc2TiC@Ih-C80, Sc2TiC@D5h-C80, and Sc2TiC2@Ih-C80, were characterized by NMR spectroscopy. The structure of Sc2TiC@Ih-C80 was also determined by single-crystal X-ray diffraction, which demonstrated the presence of a short Ti=C double bond. Both Sc2TiC- and Sc2TiC2-containing clusterfullerenes have Ti-localized LUMOs. Encapsulation of the redox-active Ti ion inside the fullerene cage enables analysis of the cluster-cage strain in the endohedral fullerenes through electrochemical measurements. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  16. QR Codes 101

    Science.gov (United States)

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  17. A Computer Oriented Scheme for Coding Chemicals in the Field of Biomedicine.

    Science.gov (United States)

    Bobka, Marilyn E.; Subramaniam, J.B.

    The chemical coding scheme of the Medical Coding Scheme (MCS), developed for use in the Comparative Systems Laboratory (CSL), is outlined and evaluated in this report. The chemical coding scheme provides a classification scheme and encoding method for drugs and chemical terms. Using the scheme complicated chemical structures may be expressed…

  18. An algorithm for the arithmetic classification of multilattices.

    Science.gov (United States)

    Indelicato, Giuliana

    2013-01-01

    A procedure for the construction and the classification of monoatomic multilattices in arbitrary dimension is developed. The algorithm allows one to determine the location of the points of all monoatomic multilattices with a given symmetry, or to determine whether two assigned multilattices are arithmetically equivalent. This approach is based on ideas from integral matrix theory, in particular the reduction to the Smith normal form, and can be coded to provide a classification software package.
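
    For the 2x2 case, the Smith normal form invariants follow from the gcd and determinant alone, which is already enough to test whether two integer matrices share the same invariant factors. This is a shortcut for the 2x2 case only; the general reduction uses elementary integer row/column operations, and the function name is illustrative:

```python
from math import gcd

def smith_2x2(M):
    """Smith normal form diag(d1, d2) of a nonsingular 2x2 integer matrix:
    d1 = gcd of all entries, and d1 * d2 = |det M|."""
    (a, b), (c, d) = M
    det = a * d - b * c
    assert det != 0, "nonsingular matrices only"
    d1 = gcd(gcd(a, b), gcd(c, d))
    return d1, abs(det) // d1

print(smith_2x2([[2, 4], [6, 8]]))   # (2, 4)
print(smith_2x2([[1, 0], [0, 1]]))   # (1, 1)
```

Two matrices with different invariant tuples cannot be related by unimodular row/column operations, which is the computational core of the arithmetic-equivalence test.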

  19. Colombia: Territorial classification

    International Nuclear Information System (INIS)

    Mendoza Morales, Alberto

    1998-01-01

    The article addresses the approaches to territorial classification: thematic axes, handling principles, territorial occupation, political-administrative units and administration regions, among other topics. Territorial classification is understood as the spatial distribution over the country's territory of geographical configurations, human communities, political-administrative units and land uses, urban and rural, existing and proposed.

  20. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.

  1. Library Classification 2020

    Science.gov (United States)

    Harris, Christopher

    2013-01-01

    In this article the author explores how a new library classification system might be designed using some aspects of the Dewey Decimal Classification (DDC) and ideas from other systems to create something that works for school libraries in the year 2020. By examining what works well with the Dewey Decimal System, what features should be carried…

  2. Spectroscopic classification of transients

    DEFF Research Database (Denmark)

    Stritzinger, M. D.; Fraser, M.; Hummelmose, N. N.

    2017-01-01

    We report the spectroscopic classification of several transients based on observations taken with the Nordic Optical Telescope (NOT) equipped with ALFOSC, over the nights 23-25 August 2017.

  3. Upregulation of Ih expressed in IB4-negative Aδ nociceptive DRG neurons contributes to mechanical hypersensitivity associated with cervical radiculopathic pain

    Science.gov (United States)

    Liu, Da-Lu; Lu, Na; Han, Wen-Juan; Chen, Rong-Gui; Cong, Rui; Xie, Rou-Gang; Zhang, Yu-Fei; Kong, Wei-Wei; Hu, San-Jue; Luo, Ceng

    2015-01-01

    Cervical radiculopathy represents aberrant mechanical hypersensitivity. The ability of primary sensory neurons to sense mechanical force forms the basis of mechanotransduction. However, whether this property undergoes activity-dependent plastic changes and underlies the mechanical hypersensitivity associated with cervical radiculopathic pain (CRP) is not clear. Here we present a new CRP model that produces stable mechanical compression of the dorsal root ganglion (DRG) and induces dramatic behavioral mechanical hypersensitivity. Among nociceptive DRG neurons, a mechanically sensitive subtype, the isolectin B4-negative Aδ-type (IB4− Aδ) DRG neuron, displays spontaneous activity with hyperexcitability after chronic compression of cervical DRGs. Focal mechanical stimulation of the somata of IB4− Aδ neurons induces abnormal hypersensitivity. Upregulated HCN1 and HCN3 channels and an increased Ih current in this subset of primary nociceptors underlie the spontaneous activity together with the neuronal mechanical hypersensitivity, which further contributes to the behavioral mechanical hypersensitivity associated with CRP. This study sheds new light on the functional plasticity of a specific subset of nociceptive DRG neurons to mechanical stimulation and reveals a novel mechanism that could underlie the mechanical hypersensitivity associated with cervical radiculopathy. PMID:26577374

  4. DOE LLW classification rationale

    International Nuclear Information System (INIS)

    Flores, A.Y.

    1991-01-01

    This report describes the rationale behind the US Department of Energy's low-level radioactive waste (LLW) classification, which is based on the Nuclear Regulatory Commission's classification system. DOE site operators met to review the qualifications and characteristics of the classification systems. They evaluated performance objectives, developed waste classification tables, and compiled dose limits for the waste. A goal of the LLW classification system was to allow each disposal site the freedom to develop limits on radionuclide inventories and concentrations according to its own site-specific characteristics. This goal was achieved with the adoption of a performance-objectives system based on a performance assessment, with site-specific environmental conditions and engineered disposal systems.

  5. Constructing criticality by classification

    DEFF Research Database (Denmark)

    Machacek, Erika

    2017-01-01

    " in the bureaucratic practice of classification: Experts construct material criticality in assessments as they allot information on the materials to the parameters of the assessment framework. In so doing, they ascribe a new set of connotations to the materials, namely supply risk, and their importance to clean energy......, legitimizing a criticality discourse.Specifically, the paper introduces a typology delineating the inferences made by the experts from their produced recommendations in the classification of rare earth element criticality. The paper argues that the classification is a specific process of constructing risk....... It proposes that the expert bureaucratic practice of classification legitimizes (i) the valorisation that was made in the drafting of the assessment framework for the classification, and (ii) political operationalization when enacted that might have (non-)distributive implications for the allocation of public...

  6. A comparative evaluation of sequence classification programs

    Directory of Open Access Journals (Sweden)

    Bazinet Adam L

    2012-05-01

Full Text Available Abstract Background A fundamental problem in modern genomics is to taxonomically or functionally classify DNA sequence fragments derived from environmental sampling (i.e., metagenomics). Several different methods have been proposed for doing this effectively and efficiently, and many have been implemented in software. In addition to varying their basic algorithmic approach to classification, some methods screen sequence reads for 'barcoding genes' like 16S rRNA, or various types of protein-coding genes. Due to the sheer number and complexity of methods, it can be difficult for a researcher to choose one that is well-suited for a particular analysis. Results We divided the very large number of programs that have been released in recent years for solving the sequence classification problem into three main categories based on the general algorithm they use to compare a query sequence against a database of sequences. We also evaluated the performance of the leading programs in each category on data sets whose taxonomic and functional composition is known. Conclusions We found significant variability in classification accuracy, precision, and resource consumption of sequence classification programs when used to analyze various metagenomics data sets. However, we observe some general trends and patterns that will be useful to researchers who use sequence classification programs.

  7. Codes and curves

    CERN Document Server

    Walker, Judy L

    2000-01-01

    When information is transmitted, errors are likely to occur. Coding theory examines efficient ways of packaging data so that these errors can be detected, or even corrected. The traditional tools of coding theory have come from combinatorics and group theory. Lately, however, coding theorists have added techniques from algebraic geometry to their toolboxes. In particular, by re-interpreting the Reed-Solomon codes, one can see how to define new codes based on divisors on algebraic curves. For instance, using modular curves over finite fields, Tsfasman, Vladut, and Zink showed that one can define a sequence of codes with asymptotically better parameters than any previously known codes. This monograph is based on a series of lectures the author gave as part of the IAS/PCMI program on arithmetic algebraic geometry. Here, the reader is introduced to the exciting field of algebraic geometric coding theory. Presenting the material in the same conversational tone of the lectures, the author covers linear codes, inclu...
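The re-interpretation of Reed-Solomon codes mentioned above — codewords as evaluations of low-degree polynomials at distinct points — can be sketched in a few lines. The field size and code parameters below are illustrative choices, not taken from the monograph.

```python
# A minimal sketch of a Reed-Solomon code as evaluations of a message
# polynomial over the prime field GF(p). Parameters (P, N, K) are illustrative.

P = 7          # prime field GF(7)
N, K = 5, 3    # codeword length and message length (N <= P)
POINTS = [1, 2, 3, 4, 5]  # distinct evaluation points in GF(7)

def rs_encode(msg):
    """Encode a length-K message as values of its polynomial at POINTS."""
    assert len(msg) == K
    return [sum(c * pow(x, i, P) for i, c in enumerate(msg)) % P
            for x in POINTS]

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

# Two distinct degree-<K polynomials agree on at most K - 1 points, so any
# two distinct codewords differ in at least N - K + 1 positions.
c1 = rs_encode([1, 0, 0])   # constant polynomial 1 -> [1, 1, 1, 1, 1]
c2 = rs_encode([0, 1, 0])   # polynomial x        -> [1, 2, 3, 4, 5]
print(hamming_distance(c1, c2))  # → 4
```

The guaranteed minimum distance N - K + 1 = 3 is exactly the property that lets such codes detect or correct transmission errors.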

  8. Greedy vs. L1 convex optimization in sparse coding

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    2015-01-01

Sparse representation has been applied successfully in many image analysis applications, including abnormal event detection, in which a baseline approach is to learn a dictionary from the training data and detect anomalies from its sparse codes. During this procedure, sparse codes which can be achieved... solutions. Considering the property of abnormal event detection, i.e., only normal videos are used as training data due to practical reasons, effective codes in classification applications may not perform well in abnormality detection. Therefore, we compare the sparse codes and comprehensively evaluate their performance from various aspects to better understand their applicability, including computation time, reconstruction error, sparsity, detection...
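The greedy side of the comparison can be illustrated with a minimal matching-pursuit sparse coder. The dictionary and signal below are synthetic illustrations, not the paper's experimental setup.

```python
import numpy as np

# A minimal sketch of greedy sparse coding via matching pursuit: repeatedly
# pick the dictionary atom most correlated with the residual. Synthetic data.

rng = np.random.default_rng(0)
D = rng.normal(size=(8, 16))
D /= np.linalg.norm(D, axis=0)       # unit-norm atoms

true_code = np.zeros(16)
true_code[[2, 9]] = [1.5, -2.0]      # a 2-sparse ground-truth code
x = D @ true_code                    # observed signal

def matching_pursuit(D, x, n_iter=2):
    """Greedily accumulate coefficients on the best-matching atoms."""
    code = np.zeros(D.shape[1])
    r = x.copy()
    for _ in range(n_iter):
        j = np.argmax(np.abs(D.T @ r))   # atom most correlated with residual
        coef = D[:, j] @ r
        code[j] += coef
        r -= coef * D[:, j]              # remove that atom's contribution
    return code

code = matching_pursuit(D, x)
print(np.nonzero(code)[0])           # indices of the selected atoms
```

Each iteration strictly shrinks the residual, which is the greedy counterpart to solving the L1-relaxed convex problem the abstract compares against.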

  9. Classification of movement disorders.

    Science.gov (United States)

    Fahn, Stanley

    2011-05-01

    The classification of movement disorders has evolved. Even the terminology has shifted, from an anatomical one of extrapyramidal disorders to a phenomenological one of movement disorders. The history of how this shift came about is described. The history of both the definitions and the classifications of the various neurologic conditions is then reviewed. First is a review of movement disorders as a group; then, the evolving classifications for 3 of them--parkinsonism, dystonia, and tremor--are covered in detail. Copyright © 2011 Movement Disorder Society.

  10. The materiality of Code

    DEFF Research Database (Denmark)

    Soon, Winnie

    2014-01-01

This essay studies the source code of an artwork from a software studies perspective. By examining code that comes close to the approach of critical code studies (Marino, 2006), I trace the network artwork, Pupufu (Lin, 2009) to understand various real-time approaches to social media platforms (MSN, Twitter and Facebook). The focus is not to investigate the functionalities and efficiencies of the code, but to study and interpret the program level of code in order to trace the use of various technological methods such as third-party libraries and platforms' interfaces. These are important to understand the socio-technical side of a changing network environment. Through the study of code, including but not limited to source code, technical specifications and other materials in relation to the artwork production, I would like to explore the materiality of code that goes beyond technical...

  11. Coding for optical channels

    CERN Document Server

    Djordjevic, Ivan; Vasic, Bane

    2010-01-01

    This unique book provides a coherent and comprehensive introduction to the fundamentals of optical communications, signal processing and coding for optical channels. It is the first to integrate the fundamentals of coding theory and optical communication.

  12. SEVERO code - user's manual

    International Nuclear Information System (INIS)

    Sacramento, A.M. do.

    1989-01-01

This user's manual contains all the necessary information concerning the use of the SEVERO code. This computer code is related to the statistics of extremes: extreme winds, extreme precipitation and flooding hazard risk analysis. (A.C.A.S.)

  13. Synthesizing Certified Code

    OpenAIRE

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to th...

  14. FERRET data analysis code

    International Nuclear Information System (INIS)

    Schmittroth, F.

    1979-09-01

    A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
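The kind of least-squares combination of correlated measurements that the FERRET abstract describes can be sketched as a generalized least-squares (GLS) estimate, in which the measurement covariance carries both uncertainties and correlations. The design matrix, data, and covariance below are synthetic illustrations, not FERRET's actual inputs or formats.

```python
import numpy as np

# A minimal sketch of generalized least squares: estimate parameters theta
# from measurements y = A @ theta + noise, weighting by the inverse of a
# covariance matrix V that encodes uncertainties AND correlations.

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])                 # design matrix (3 measurements, 2 params)
y = np.array([1.1, 2.0, 3.2])             # measured values
V = np.diag([0.01, 0.04, 0.04])           # measurement variances
V[1, 2] = V[2, 1] = 0.02                  # a correlation between measurements 2 and 3

W = np.linalg.inv(V)                      # weight matrix
cov = np.linalg.inv(A.T @ W @ A)          # parameter covariance estimate
theta = cov @ A.T @ W @ y                 # GLS parameter estimate
print(theta.round(2))
```

The returned `cov` is the quantitative uncertainty estimate on the fitted parameters, which is the emphasis the abstract highlights.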

  15. Stylize Aesthetic QR Code

    OpenAIRE

    Xu, Mingliang; Su, Hao; Li, Yafei; Li, Xi; Liao, Jing; Niu, Jianwei; Lv, Pei; Zhou, Bing

    2018-01-01

    With the continued proliferation of smart mobile devices, Quick Response (QR) code has become one of the most-used types of two-dimensional code in the world. Aiming at beautifying the appearance of QR codes, existing works have developed a series of techniques to make the QR code more visual-pleasant. However, these works still leave much to be desired, such as visual diversity, aesthetic quality, flexibility, universal property, and robustness. To address these issues, in this paper, we pro...

  16. Enhancing QR Code Security

    OpenAIRE

    Zhang, Linfan; Zheng, Shuang

    2015-01-01

    Quick Response code opens possibility to convey data in a unique way yet insufficient prevention and protection might lead into QR code being exploited on behalf of attackers. This thesis starts by presenting a general introduction of background and stating two problems regarding QR code security, which followed by a comprehensive research on both QR code itself and related issues. From the research a solution taking advantages of cloud and cryptography together with an implementation come af...

  17. Remote-Handled Transuranic Content Codes

    International Nuclear Information System (INIS)

    2001-01-01

The Remote-Handled Transuranic (RH-TRU) Content Codes (RH-TRUCON) document represents the development of a uniform content code system for RH-TRU waste to be transported in the 72-B cask. It will be used to convert existing waste form numbers, content codes, and site-specific identification codes into a system that is uniform across the U.S. Department of Energy (DOE) sites. The existing waste codes at the sites can be grouped under uniform content codes without any loss of waste characterization information. The RH-TRUCON document provides an all-encompassing description for each content code and compiles this information for all DOE sites. Compliance with waste generation, processing, and certification procedures at the sites (outlined in this document for each content code) ensures that prohibited waste forms are not present in the waste. The content code gives an overall description of the RH-TRU waste material in terms of processes and packaging, as well as the generation location. This helps to provide cradle-to-grave traceability of the waste material so that the various actions required to assess its qualification as payload for the 72-B cask can be performed. The content codes also impose restrictions and requirements on the manner in which a payload can be assembled. The RH-TRU Waste Authorized Methods for Payload Control (RH-TRAMPAC), Appendix 1.3.7 of the 72-B Cask Safety Analysis Report (SAR), describes the current governing procedures applicable for the qualification of waste as payload for the 72-B cask. The logic for this classification is presented in the 72-B Cask SAR. Together, these documents (RH-TRUCON, RH-TRAMPAC, and relevant sections of the 72-B Cask SAR) present the foundation and justification for classifying RH-TRU waste into content codes. Only content codes described in this document can be considered for transport in the 72-B cask. Revisions to this document will be made as additional waste qualifies for transport. Each content code uniquely...

  18. Opening up codings?

    DEFF Research Database (Denmark)

    Steensig, Jakob; Heinemann, Trine

    2015-01-01

    doing formal coding and when doing more “traditional” conversation analysis research based on collections. We are more wary, however, of the implication that coding-based research is the end result of a process that starts with qualitative investigations and ends with categories that can be coded...

  19. Gauge color codes

    DEFF Research Database (Denmark)

    Bombin Palomo, Hector

    2015-01-01

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates is optimal and only depends on the spatial dimension, not the local geometry. I also introduce a generalized, subsystem version of color codes. In 3D they allow...

  20. Refactoring test code

    NARCIS (Netherlands)

    A. van Deursen (Arie); L.M.F. Moonen (Leon); A. van den Bergh; G. Kok

    2001-01-01

Two key aspects of extreme programming (XP) are unit testing and merciless refactoring. Given the fact that the ideal test code / production code ratio approaches 1:1, it is not surprising that unit tests are being refactored. We found that refactoring test code is different from

  1. Update on diabetes classification.

    Science.gov (United States)

    Thomas, Celeste C; Philipson, Louis H

    2015-01-01

This article highlights the difficulties in creating a definitive classification of diabetes mellitus in the absence of a complete understanding of the pathogenesis of the major forms. This brief review shows the evolving nature of the classification of diabetes mellitus. No classification scheme is ideal, and all have some overlap and inconsistencies. The only type of diabetes that can be accurately diagnosed by DNA sequencing, monogenic diabetes, remains undiagnosed in more than 90% of the individuals who have diabetes caused by one of the known gene mutations. The point of classification, or taxonomy, of disease should be to give insight into both pathogenesis and treatment. It remains a source of frustration that all schemes of diabetes mellitus continue to fall short of this goal. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Learning Apache Mahout classification

    CERN Document Server

    Gupta, Ashish

    2015-01-01

    If you are a data scientist who has some experience with the Hadoop ecosystem and machine learning methods and want to try out classification on large datasets using Mahout, this book is ideal for you. Knowledge of Java is essential.

  3. CLASSIFICATION OF VIRUSES

    Indian Academy of Sciences (India)

CLASSIFICATION OF VIRUSES: on the basis of morphology; on the basis of chemical composition; on the basis of structure of the genome; on the basis of mode of replication.

  4. Pitch Based Sound Classification

    DEFF Research Database (Denmark)

    Nielsen, Andreas Brinch; Hansen, Lars Kai; Kjems, U

    2006-01-01

A sound classification model is presented that can classify signals into music, noise and speech. The model extracts the pitch of the signal using the harmonic product spectrum. Based on the pitch estimate and a pitch error measure, features are created and used in a probabilistic model with soft-max output function. Both linear and quadratic inputs are used. The model is trained on 2 hours of sound and tested on publicly available data. A test classification error below 0.05 with 1 s classification windows is achieved. Furthermore, it is shown that linear input performs as well as a quadratic, and that even though classification gets marginally better, not much is achieved by increasing the window size beyond 1 s.
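The harmonic product spectrum named in the abstract can be sketched as follows: downsampled copies of the magnitude spectrum are multiplied together so that energy at integer multiples of the fundamental reinforces the fundamental's bin. The sample rate, test signal, and number of harmonics below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

# A minimal sketch of pitch estimation with the harmonic product spectrum (HPS).
# The signal is a synthetic 220 Hz tone with two overtones; real audio would
# be windowed and processed frame by frame.

FS = 8000                       # sample rate in Hz (illustrative choice)
t = np.arange(FS) / FS          # one second of samples
f0 = 220.0                      # true fundamental frequency
x = sum(np.sin(2 * np.pi * f0 * k * t) / k for k in range(1, 4))

spec = np.abs(np.fft.rfft(x))   # magnitude spectrum
hps = spec.copy()
for h in (2, 3):                # multiply in downsampled copies of the spectrum
    hps[: len(spec) // h] *= spec[::h][: len(spec) // h]

# The surviving peak sits at the fundamental; convert bin index to Hz.
pitch = float(np.argmax(hps)) * FS / len(x)
print(round(pitch, 1))          # → 220.0
```

Only the bin whose 2nd and 3rd multiples also carry energy survives the products, which is what makes HPS robust against picking an overtone as the pitch.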

  5. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  6. Double-blind, placebo-controlled, randomised phase II trial of IH636 grape seed proanthocyanidin extract (GSPE) in patients with radiation-induced breast induration

    International Nuclear Information System (INIS)

    Brooker, Sonja; Martin, Susan; Pearson, Ann; Bagchi, Debasis; Earl, Judith; Gothard, Lone; Hall, Emma; Porter, Lucy; Yarnold, John

    2006-01-01

Background and purpose: Tissue hardness (induration), pain and tenderness are common late adverse effects of curative radiotherapy for early breast cancer. The purpose of this study was to test the efficacy of IH636 grape seed proanthocyanidin extract (GSPE) in patients with tissue induration after high-dose radiotherapy for early breast cancer in a double-blind placebo-controlled randomised phase II trial. Patients and methods: Sixty-six eligible research volunteers with moderate or marked breast induration at a mean 10.8 years since radiotherapy for early breast cancer were randomised to active drug (n=44) or placebo (n=22). All patients were given grape seed proanthocyanidin extract (GSPE) 100 mg three times a day orally, or corresponding placebo capsules, for 6 months. The primary endpoint was percentage change in surface area (cm²) of palpable breast induration measured at the skin surface 12 months after randomisation. Secondary endpoints included change in photographic breast appearance and patient self-assessment of breast hardness, pain and tenderness. Results: At 12 months post-randomisation, ≥50% reduction in surface area (cm²) of breast induration was recorded in 13/44 (29.5%) GSPE and 6/22 (27%) placebo group patients (NS). At 12 months post-randomisation, there was no significant difference between treatment and control groups in terms of external assessments of tissue hardness, breast appearance or patient self-assessments of breast hardness, pain or tenderness. Conclusions: The study failed to show efficacy of orally-administered GSPE in patients with breast induration following radiotherapy for breast cancer.

  7. Towards secondary fingerprint classification

    CSIR Research Space (South Africa)

    Msiza, IS

    2011-07-01

Full Text Available an accuracy figure of 76.8%. This small difference between the two figures is indicative of the validity of the proposed secondary classification module. Keywords: fingerprint core; fingerprint delta; primary classification; secondary classification..., namely, the fingerprint core and the fingerprint delta. Forensically, a fingerprint core is defined as the innermost turning point where the fingerprint ridges form a loop, while the fingerprint delta is defined as the point where these ridges form a...

  8. Expected Classification Accuracy

    Directory of Open Access Journals (Sweden)

    Lawrence M. Rudner

    2005-08-01

Full Text Available Every time we make a classification based on a test score, we should expect some number of misclassifications. Some examinees whose true ability is within a score range will have observed scores outside of that range. A procedure for providing a classification table of true and expected scores is developed for polytomously scored items under item response theory and applied to state assessment data. A simplified procedure for estimating the table entries is also presented.
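The idea of an expected-classification table can be illustrated with a simple normal measurement-error model standing in for the article's item-response-theory machinery; the true scores, cut score, and standard error of measurement (SEM) below are invented for illustration.

```python
import math

# A minimal sketch of an expected classification table: given a cut score and
# a normal error model, accumulate P(observed pass/fail | true pass/fail).

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def expected_table(true_scores, cut, sem):
    """Expected counts in each (true side, observed side) cell of the table."""
    table = {("pass", "pass"): 0.0, ("pass", "fail"): 0.0,
             ("fail", "pass"): 0.0, ("fail", "fail"): 0.0}
    for s in true_scores:
        p_obs_pass = 1 - norm_cdf((cut - s) / sem)   # P(observed >= cut)
        side = "pass" if s >= cut else "fail"
        table[(side, "pass")] += p_obs_pass
        table[(side, "fail")] += 1 - p_obs_pass
    return table

tbl = expected_table([45, 55, 60, 70], cut=50, sem=5)
print(round(tbl[("pass", "fail")], 3))   # expected misclassified true-passers → 0.181
```

The off-diagonal cells are exactly the "expected misclassifications" the abstract refers to: examinees whose true score is on one side of the cut but whose observed score is expected to fall on the other.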

  9. Latent classification models

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre

    2005-01-01

...parametric family of distributions. In this paper we propose a new set of models for classification in continuous domains, termed latent classification models. The latent classification model can roughly be seen as combining the NB model with a mixture of factor analyzers, thereby relaxing the assumptions... classification model, and we demonstrate empirically that the accuracy of the proposed model is significantly higher than the accuracy of other probabilistic classifiers...

  10. Audit of Clinical Coding of Major Head and Neck Operations

    Science.gov (United States)

    Mitra, Indu; Malik, Tass; Homer, Jarrod J; Loughran, Sean

    2009-01-01

INTRODUCTION Within the NHS, operations are coded using the Office of Population Censuses and Surveys (OPCS) classification system. These codes, together with diagnostic codes, are used to generate Healthcare Resource Group (HRG) codes, which correlate to a payment bracket. The aim of this study was to determine whether allocated procedure codes for major head and neck operations were correct and reflective of the work undertaken. HRG codes generated were assessed to determine accuracy of remuneration. PATIENTS AND METHODS The coding of consecutive major head and neck operations undertaken in a tertiary referral centre over a retrospective 3-month period was assessed. Procedure codes were initially ascribed by professional hospital coders. Operations were then recoded by the surgical trainee in liaison with the head of clinical coding. The initial and revised procedure codes were compared and used to generate HRG codes, to determine whether the payment banding had altered. RESULTS A total of 34 cases were reviewed. The number of procedure codes generated initially by the clinical coders was 99, whereas the revised codes generated 146. Of the original codes, 47 of 99 (47.4%) were incorrect. In 19 of the 34 cases reviewed (55.9%), the HRG code remained unchanged, thus resulting in the correct payment. Six cases were never coded, equating to a £15,300 loss of payment. CONCLUSIONS These results highlight the inadequacy of this system to reward hospitals for the work carried out within the NHS in a fair and consistent manner. The current coding system was found to be complicated, ambiguous and inaccurate, resulting in loss of remuneration. PMID:19220944

  11. 78 FR 68983 - Cotton Futures Classification: Optional Classification Procedure

    Science.gov (United States)

    2013-11-18

    ...-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing... regulations to allow for the addition of an optional cotton futures classification procedure--identified and... response to requests from the U.S. cotton industry and ICE, AMS will offer a futures classification option...

  12. The network code

    International Nuclear Information System (INIS)

    1997-01-01

    The Network Code defines the rights and responsibilities of all users of the natural gas transportation system in the liberalised gas industry in the United Kingdom. This report describes the operation of the Code, what it means, how it works and its implications for the various participants in the industry. The topics covered are: development of the competitive gas market in the UK; key points in the Code; gas transportation charging; impact of the Code on producers upstream; impact on shippers; gas storage; supply point administration; impact of the Code on end users; the future. (20 tables; 33 figures) (UK)

  13. Coding for Electronic Mail

    Science.gov (United States)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  14. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

Mask preparation stages are crucial in mask manufacturing, since this mask is to later act as a template for a considerable number of dies on wafer. Defects on the initial blank substrate, and subsequent cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical defects or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and distinguishing defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment compared to the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask
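The decision-tree mapping from measured defect attributes to a classification code that the abstract describes can be sketched as a rule-based toy. Every attribute name, threshold, and output code below is hypothetical, invented for illustration; none of it reflects Calibre ADC's actual logic.

```python
# A hypothetical sketch of a rule-based defect classifier: measured attributes
# (size, signal polarity in transmitted/reflected images) are walked through a
# small decision tree to yield a classification code. All names and thresholds
# here are invented, not Calibre ADC's real attributes or codes.

def classify_defect(size_um, polarity_transmitted, polarity_reflected):
    """Map defect measurements to a made-up classification code."""
    if size_um < 0.05:
        return "NOISE"        # below a (hypothetical) significance threshold
    if polarity_transmitted == "dark" and polarity_reflected == "dark":
        return "PARTICLE"     # opaque contaminant blocks light in both images
    if polarity_transmitted == "bright":
        return "PINHOLE"      # extra transmitted light suggests a coating void
    return "REVIEW"           # ambiguous signature -> route to manual review

print(classify_defect(0.3, "dark", "dark"))  # → PARTICLE
```

The value of encoding the tree explicitly, as the abstract argues, is repeatability: the same measurements always yield the same code, with no operator subjectivity.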

  15. NAGRADATA. Code key. Geology

    International Nuclear Information System (INIS)

    Mueller, W.H.; Schneider, B.; Staeuble, J.

    1984-01-01

This reference manual provides users of the NAGRADATA system with comprehensive keys to the coding/decoding of geological and technical information to be stored in or retrieved from the databank. Emphasis has been placed on input data coding. When data is retrieved, the translation into plain language of stored coded information is done automatically by computer. Three keys each list the complete set of currently defined codes for the NAGRADATA system, namely codes with appropriate definitions, arranged: 1. according to subject matter (thematically), 2. with the codes listed alphabetically, and 3. with the definitions listed alphabetically. Additional explanation is provided for the proper application of the codes and the logic behind the creation of new codes to be used within the NAGRADATA system. NAGRADATA makes use of codes instead of plain language for data storage; this offers the following advantages: speed of data processing, mainly data retrieval; economy of storage memory requirements; and the standardisation of terminology. The nature of this thesaurus-like 'key to codes' makes it impossible either to establish a final form or to cover the entire spectrum of requirements. Therefore, this first issue of codes for NAGRADATA must be considered to represent the current state of progress of a living system, and future editions will be issued in a loose-leaf ring-book system which can be updated by an organised (updating) service. (author)

  16. XSOR codes users manual

    International Nuclear Information System (INIS)

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named 'XSOR'. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  17. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    1999-01-01

The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated with the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can be easily adapted to any other lattice code. The description of the code assumes a basic knowledge of the reactor lattice, on the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)

  18. DLLExternalCode

    Energy Technology Data Exchange (ETDEWEB)

    2014-05-14

DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.

  19. A New Classification Approach Based on Multiple Classification Rules

    OpenAIRE

    Zhongmei Zhou

    2014-01-01

A good classifier can correctly predict new data for which the class label is unknown, so it is important to construct a high-accuracy classifier. Hence, classification techniques are very useful in ubiquitous computing. Associative classification achieves higher classification accuracy than some traditional rule-based classification approaches. However, the approach also has two major deficiencies. First, it generates a very large number of association classification rules, especially when t...

  20. Diabetes Mellitus Coding Training for Family Practice Residents.

    Science.gov (United States)

    Urse, Geraldine N

    2015-07-01

    Although physicians regularly use numeric coding systems such as the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to describe patient encounters, coding errors are common. One of the most complicated diagnoses to code is diabetes mellitus. The ICD-9-CM currently has 39 separate codes for diabetes mellitus; this number will be expanded to more than 50 with the introduction of ICD-10-CM in October 2015. To assess the effect of a 1-hour focused presentation on ICD-9-CM codes on diabetes mellitus coding. A 1-hour focused lecture on the correct use of diabetes mellitus codes for patient visits was presented to family practice residents at Doctors Hospital Family Practice in Columbus, Ohio. To assess resident knowledge of the topic, a pretest and posttest were given to residents before and after the lecture, respectively. Medical records of all patients with diabetes mellitus who were cared for at the hospital 6 weeks before and 6 weeks after the lecture were reviewed and compared for the use of diabetes mellitus ICD-9 codes. Eighteen residents attended the lecture and completed the pretest and posttest. The mean (SD) percentage of correct answers was 72.8% (17.1%) for the pretest and 84.4% (14.6%) for the posttest, for an improvement of 11.6 percentage points (P≤.035). The percentage of total available codes used did not substantially change from before to after the lecture, but the use of the generic ICD-9-CM code for diabetes mellitus type II controlled (250.00) declined (58 of 176 [33%] to 102 of 393 [26%]) and the use of other codes increased, indicating a greater variety in codes used after the focused lecture. After a focused lecture on diabetes mellitus coding, resident coding knowledge improved. Review of medical record data did not reveal an overall change in the number of diabetic codes used after the lecture but did reveal a greater variety in the codes used.

  1. Toric Varieties and Codes, Error-correcting Codes, Quantum Codes, Secret Sharing and Decoding

    DEFF Research Database (Denmark)

    Hansen, Johan Peder

    We present toric varieties and associated toric codes and their decoding. Toric codes are applied to construct Linear Secret Sharing Schemes (LSSS) with strong multiplication by the Massey construction. Asymmetric Quantum Codes are obtained from toric codes by the A.R. Calderbank, P.W. Shor and A.M. Steane construction of stabilizer codes (CSS) from linear codes containing their dual codes.

  2. Vulnerabilities Classification for Safe Development on Android

    Directory of Open Access Journals (Sweden)

    Ricardo Luis D. M. Ferreira

    2016-06-01

    Full Text Available The global sales market is currently led by devices with the Android operating system. In 2015, more than 1 billion smartphones were sold, of which 81.5% ran the Android platform. In 2017, it is estimated that 267.78 billion applications will be downloaded from Google Play. According to Qian, 90% of applications are vulnerable, despite the recommendations of rules and standards for safe software development. This study presents a classification of vulnerabilities, indicating for each one: the safety aspect defined by the Brazilian Association of Technical Standards (Associação Brasileira de Normas Técnicas - ABNT) norm NBR ISO/IEC 27002 that is violated; the lines of code that generate the vulnerability and what should be done to avoid it; and the threat agent that exploits it. This classification allows the identification of possible points of vulnerability, enabling the developer to correct the identified gaps.

  3. Classification, (big) data analysis and statistical learning

    CERN Document Server

    Conversano, Claudio; Vichi, Maurizio

    2018-01-01

    This edited book focuses on the latest developments in classification, statistical learning, data analysis and related areas of data science, including statistical analysis of large datasets, big data analytics, time series clustering, integration of data from different sources, as well as social networks. It covers both methodological aspects as well as applications to a wide range of areas such as economics, marketing, education, social sciences, medicine, environmental sciences and the pharmaceutical industry. In addition, it describes the basic features of the software behind the data analysis results, and provides links to the corresponding codes and data sets where necessary. This book is intended for researchers and practitioners who are interested in the latest developments and applications in the field. The peer-reviewed contributions were presented at the 10th Scientific Meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in Santa Margherita di Pul...

  4. Improving the accuracy of operation coding in surgical discharge summaries

    Science.gov (United States)

    Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine

    2014-01-01

    Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286

  5. An Optimal Linear Coding for Index Coding Problem

    OpenAIRE

    Pezeshkpour, Pouya

    2015-01-01

    An optimal linear coding solution for the index coding problem is established. Instead of the network coding approach, which focuses on graph-theoretic and algebraic methods, a linear coding program for solving both the unicast and groupcast index coding problems is presented. The coding is proved to be the optimal solution from the linear perspective and can easily be utilized for any number of messages. The importance of this work lies mostly in the usage of the presented coding in the groupcast index coding ...

  6. Land Cover and Land Use Classification with TWOPAC: towards Automated Processing for Pixel- and Object-Based Image Classification

    Directory of Open Access Journals (Sweden)

    Stefan Dech

    2012-09-01

    Full Text Available We present a novel and innovative automated processing environment for the derivation of land cover (LC) and land use (LU) information. This processing framework named TWOPAC (TWinned Object and Pixel based Automated classification Chain) enables the standardized, independent, user-friendly, and comparable derivation of LC and LU information, with minimized manual classification labor. TWOPAC allows classification of multi-spectral and multi-temporal remote sensing imagery from different sensor types. TWOPAC enables not only pixel-based classification, but also allows classification based on object-based characteristics. Classification is based on a Decision Tree approach (DT), for which the well-known C5.0 code has been implemented, which builds decision trees based on the concept of information entropy. TWOPAC enables automatic generation of the decision tree classifier based on a C5.0-retrieved ASCII file, as well as fully automatic validation of the classification output via sample-based accuracy assessment. Envisaging the automated generation of standardized land cover products, as well as area-wide classification of large amounts of data in preferably a short processing time, standardized interfaces for process control, Web Processing Services (WPS), as introduced by the Open Geospatial Consortium (OGC), are utilized. TWOPAC’s functionality to process geospatial raster or vector data via web resources (server, network) enables TWOPAC’s usability independent of any commercial client or desktop software and allows for large-scale data processing on servers. Furthermore, the components of TWOPAC were built using open-source code components and are implemented as a plug-in for Quantum GIS software for easy handling of the classification process from the user’s perspective.
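
    The C5.0 approach mentioned in this record selects decision-tree splits by information entropy. A minimal sketch of the underlying entropy/information-gain computation (an illustration of the general technique, not the actual C5.0 implementation; the land-cover labels and the candidate split are made up):

    ```python
    from math import log2
    from collections import Counter

    def entropy(labels):
        """Shannon entropy of a label sequence, in bits."""
        n = len(labels)
        return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

    def information_gain(labels, groups):
        """Entropy reduction when `labels` are partitioned into `groups`."""
        n = len(labels)
        return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

    # Toy example: splitting six pixels by a threshold on one spectral band.
    labels = ["water", "water", "forest", "forest", "forest", "urban"]
    split  = [["water", "water"], ["forest", "forest", "forest", "urban"]]
    print(round(information_gain(labels, split), 3))  # → 0.918
    ```

    A tree builder would evaluate this gain for every candidate split and recurse on the one with the highest value.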

  7. The Aesthetics of Coding

    DEFF Research Database (Denmark)

    Andersen, Christian Ulrik

    2007-01-01

    Computer art is often associated with computer-generated expressions (digitally manipulated audio/images in music, video, stage design, media facades, etc.). In recent computer art, however, the code-text itself – not the generated output – has become the artwork (Perl Poetry, ASCII Art, obfuscated...... code, etc.). The presentation relates this artistic fascination of code to a media critique expressed by Florian Cramer, claiming that the graphical interface represents a media separation (of text/code and image) causing alienation to the computer’s materiality. Cramer is thus the voice of a new ‘code...... avant-garde’. In line with Cramer, the artists Alex McLean and Adrian Ward (aka Slub) declare: “art-oriented programming needs to acknowledge the conditions of its own making – its poesis.” By analysing the Live Coding performances of Slub (where they program computer music live), the presentation...

  8. Majorana fermion codes

    International Nuclear Information System (INIS)

    Bravyi, Sergey; Terhal, Barbara M; Leemhuis, Bernhard

    2010-01-01

    We initiate the study of Majorana fermion codes (MFCs). These codes can be viewed as extensions of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions in quantum wires to higher spatial dimensions and interacting fermions. The purpose of MFCs is to protect quantum information against low-weight fermionic errors, that is, operators acting on sufficiently small subsets of fermionic modes. We examine to what extent MFCs can surpass qubit stabilizer codes in terms of their stability properties. A general construction of 2D MFCs is proposed that combines topological protection based on a macroscopic code distance with protection based on fermionic parity conservation. Finally, we use MFCs to show how to transform any qubit stabilizer code to a weakly self-dual CSS code.

  9. Theory of epigenetic coding.

    Science.gov (United States)

    Elder, D

    1984-06-07

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatorial to coding identity of units, the non-combinatorial to coding production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.

  10. DISP1 code

    International Nuclear Information System (INIS)

    Vokac, P.

    1999-12-01

    DISP1 code is a simple tool for assessment of the dispersion of the fission product cloud escaping from a nuclear power plant after an accident. The code makes it possible to tentatively check the feasibility of calculations by more complex PSA3 codes and/or codes for real-time dispersion calculations. The number of input parameters is reasonably low and the user interface is simple enough to allow a rapid processing of sensitivity analyses. All input data entered through the user interface are stored in the text format. Implementation of dispersion model corrections taken from the ARCON96 code enables the DISP1 code to be employed for assessment of the radiation hazard within the NPP area, in the control room for instance. (P.A.)

  11. A method for modeling co-occurrence propensity of clinical codes with application to ICD-10-PCS auto-coding.

    Science.gov (United States)

    Subotin, Michael; Davis, Anthony R

    2016-09-01

    Natural language processing methods for medical auto-coding, or automatic generation of medical billing codes from electronic health records, generally assign each code independently of the others. They may thus assign codes for closely related procedures or diagnoses to the same document, even when they do not tend to occur together in practice, simply because the right choice can be difficult to infer from the clinical narrative. We propose a method that injects awareness of the propensities for code co-occurrence into this process. First, a model is trained to estimate the conditional probability that one code is assigned by a human coder, given that another code is known to have been assigned to the same document. Then, at runtime, an iterative algorithm is used to apply this model to the output of an existing statistical auto-coder to modify the confidence scores of the codes. We tested this method in combination with a primary auto-coder for International Statistical Classification of Diseases-10 procedure codes, achieving a 12% relative improvement in F-score over the primary auto-coder baseline. The proposed method can be used, with appropriate features, in combination with any auto-coder that generates codes with different levels of confidence. The promising results obtained for International Statistical Classification of Diseases-10 procedure codes suggest that the proposed method may have wider applications in auto-coding. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
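
    The iterative rescoring idea can be sketched roughly as follows. This is an illustrative update rule, not the paper's actual model: the codes, base confidences, and conditional probabilities below are invented, and the blending scheme (a weighted average of the base score and co-occurrence support) is an assumption:

    ```python
    def rescore(base_scores, cond_prob, iters=3, alpha=0.5):
        """Blend each code's base confidence with the co-occurrence support
        it receives from the other candidate codes (illustrative scheme)."""
        scores = dict(base_scores)
        for _ in range(iters):
            updated = {}
            for code in scores:
                others = [c for c in scores if c != code]
                total = sum(scores[o] for o in others)
                # confidence-weighted average of P(code | other) over the
                # other codes; unseen pairs fall back to a neutral 0.5
                support = sum(scores[o] * cond_prob.get((code, o), 0.5)
                              for o in others) / total
                updated[code] = (1 - alpha) * base_scores[code] + alpha * support
            scores = updated
        return scores

    # Invented codes, confidences, and conditional probabilities.
    base = {"A": 0.90, "B": 0.60, "C": 0.55}
    cond = {("B", "A"): 0.8, ("A", "B"): 0.9, ("C", "A"): 0.1,
            ("C", "B"): 0.2, ("A", "C"): 0.5, ("B", "C"): 0.5}
    final = rescore(base, cond)  # C, which rarely co-occurs with A or B, is demoted
    ```

    After a few iterations the score of a code that fits the narrative poorly with its neighbours drifts down, which is the qualitative behaviour the abstract describes.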

  12. Phonological coding during reading.

    Science.gov (United States)

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  13. The aeroelastic code FLEXLAST

    Energy Technology Data Exchange (ETDEWEB)

    Visser, B. [Stork Product Eng., Amsterdam (Netherlands)

    1996-09-01

    To support the discussion on aeroelastic codes, a description of the code FLEXLAST was given and experiences within benchmarks and measurement programmes were summarized. The code FLEXLAST has been developed since 1982 at Stork Product Engineering (SPE). Since 1992 FLEXLAST has been used by Dutch industries for wind turbine and rotor design. Based on the comparison with measurements, it can be concluded that the main shortcomings of wind turbine modelling lie in the field of aerodynamics, wind field and wake modelling. (au)

  14. Cellular image classification

    CERN Document Server

    Xu, Xiang; Lin, Feng

    2017-01-01

    This book introduces new techniques for cellular image feature extraction, pattern recognition and classification. The authors use the antinuclear antibodies (ANAs) in patient serum as the subjects and the Indirect Immunofluorescence (IIF) technique as the imaging protocol to illustrate the applications of the described methods. Throughout the book, the authors provide evaluations for the proposed methods on two publicly available human epithelial (HEp-2) cell datasets: ICPR2012 dataset from the ICPR'12 HEp-2 cell classification contest and ICIP2013 training dataset from the ICIP'13 Competition on cells classification by fluorescent image analysis. First, the reading of imaging results is significantly influenced by one’s qualification and reading systems, causing high intra- and inter-laboratory variance. The authors present a low-order LP21 fiber mode for optical single cell manipulation and imaging staining patterns of HEp-2 cells. A focused four-lobed mode distribution is stable and effective in optical...

  15. Bosniak classification system

    DEFF Research Database (Denmark)

    Graumann, Ole; Osther, Susanne Sloth; Karstoft, Jens

    2016-01-01

    BACKGROUND: The Bosniak classification was originally based on computed tomographic (CT) findings. Magnetic resonance (MR) and contrast-enhanced ultrasonography (CEUS) imaging may demonstrate findings that are not depicted at CT, and there may not always be a clear correlation between the findings...... at MR and CEUS imaging and those at CT. PURPOSE: To compare diagnostic accuracy of MR, CEUS, and CT when categorizing complex renal cystic masses according to the Bosniak classification. MATERIAL AND METHODS: From February 2011 to June 2012, 46 complex renal cysts were prospectively evaluated by three...... readers. Each mass was categorized according to the Bosniak classification and CT was chosen as gold standard. Kappa was calculated for diagnostic accuracy and data was compared with pathological results. RESULTS: CT images found 27 BII, six BIIF, seven BIII, and six BIV. Forty-three cysts could...

  16. Bosniak Classification system

    DEFF Research Database (Denmark)

    Graumann, Ole; Osther, Susanne Sloth; Karstoft, Jens

    2014-01-01

    Background: The Bosniak classification is a diagnostic tool for the differentiation of cystic changes in the kidney. The process of categorizing renal cysts may be challenging, involving a series of decisions that may affect the final diagnosis and clinical outcome such as surgical management....... Purpose: To investigate the inter- and intra-observer agreement among experienced uroradiologists when categorizing complex renal cysts according to the Bosniak classification. Material and Methods: The original categories of 100 cystic renal masses were chosen as “Gold Standard” (GS), established...... to the calculated weighted κ all readers performed “very good” for both inter-observer and intra-observer variation. Most variation was seen in cysts categorized as Bosniak II, IIF, and III. These results show that radiologists who evaluate complex renal cysts routinely may apply the Bosniak classification...

  17. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  18. QR codes for dummies

    CERN Document Server

    Waters, Joe

    2012-01-01

    Find out how to effectively create, use, and track QR codes QR (Quick Response) codes are popping up everywhere, and businesses are reaping the rewards. Get in on the action with the no-nonsense advice in this streamlined, portable guide. You'll find out how to get started, plan your strategy, and actually create the codes. Then you'll learn to link codes to mobile-friendly content, track your results, and develop ways to give your customers value that will keep them coming back. It's all presented in the straightforward style you've come to know and love, with a dash of humor thrown

  19. Tokamak Systems Code

    International Nuclear Information System (INIS)

    Reid, R.L.; Barrett, R.J.; Brown, T.G.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged

  20. IR-360 nuclear power plant safety functions and component classification

    Energy Technology Data Exchange (ETDEWEB)

    Yousefpour, F., E-mail: fyousefpour@snira.co [Management of Nuclear Power Plant Construction Company (MASNA) (Iran, Islamic Republic of); Shokri, F.; Soltani, H. [Management of Nuclear Power Plant Construction Company (MASNA) (Iran, Islamic Republic of)

    2010-10-15

    The IR-360 nuclear power plant as a 2-loop PWR of 360 MWe power generation capacity is under design in MASNA Company. For design of the IR-360 structures, systems and components (SSCs), the codes and standards and their design requirements must be determined. It is a prerequisite to classify the IR-360 safety functions and the safety grade of structures, systems and components correctly for selecting and adopting the suitable design codes and standards. This paper refers to the IAEA nuclear safety codes and standards as well as the USNRC standard system to determine the IR-360 safety functions and to formulate the principles of the IR-360 component classification in accordance with the safety philosophy and features of the IR-360. By implementation of the defined classification procedures for the IR-360 SSCs, the appropriate design codes and standards are specified. The requirements of specific codes and standards are used in the design process of IR-360 SSCs by design engineers of MASNA Company. In this paper, individual determination of the IR-360 safety functions and definition of the classification procedures and roles are presented. Implementation of this work, which is described with examples, ensures the safety and reliability of the IR-360 nuclear power plant.

  1. IR-360 nuclear power plant safety functions and component classification

    International Nuclear Information System (INIS)

    Yousefpour, F.; Shokri, F.; Soltani, H.

    2010-01-01

    The IR-360 nuclear power plant as a 2-loop PWR of 360 MWe power generation capacity is under design in MASNA Company. For design of the IR-360 structures, systems and components (SSCs), the codes and standards and their design requirements must be determined. It is a prerequisite to classify the IR-360 safety functions and the safety grade of structures, systems and components correctly for selecting and adopting the suitable design codes and standards. This paper refers to the IAEA nuclear safety codes and standards as well as the USNRC standard system to determine the IR-360 safety functions and to formulate the principles of the IR-360 component classification in accordance with the safety philosophy and features of the IR-360. By implementation of the defined classification procedures for the IR-360 SSCs, the appropriate design codes and standards are specified. The requirements of specific codes and standards are used in the design process of IR-360 SSCs by design engineers of MASNA Company. In this paper, individual determination of the IR-360 safety functions and definition of the classification procedures and roles are presented. Implementation of this work, which is described with examples, ensures the safety and reliability of the IR-360 nuclear power plant.

  2. Efficient Coding of Information: Huffman Coding -RE ...

    Indian Academy of Sciences (India)

    to a stream of equally-likely symbols so as to recover the original stream in the event of errors. The for- ... The source-coding problem is one of finding a mapping from U to a ... probability that the random variable X takes the value x written as ...

  3. NR-code: Nonlinear reconstruction code

    Science.gov (United States)

    Yu, Yu; Pen, Ue-Li; Zhu, Hong-Ming

    2018-04-01

    NR-code applies nonlinear reconstruction to the dark matter density field in redshift space and solves for the nonlinear mapping from the initial Lagrangian positions to the final redshift space positions; this reverses the large-scale bulk flows and improves the precision measurement of the baryon acoustic oscillations (BAO) scale.

  4. Acoustic classification of dwellings

    DEFF Research Database (Denmark)

    Berardi, Umberto; Rasmussen, Birgit

    2014-01-01

    insulation performance, national schemes for sound classification of dwellings have been developed in several European countries. These schemes define acoustic classes according to different levels of sound insulation. Due to the lack of coordination among countries, a significant diversity in terms...... exchanging experiences about constructions fulfilling different classes, reducing trade barriers, and finally increasing the sound insulation of dwellings.......Schemes for the classification of dwellings according to different building performances have been proposed in the last years worldwide. The general idea behind these schemes relates to the positive impact a higher label, and thus a better performance, should have. In particular, focusing on sound...

  5. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

    This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners will also find in the book a detailed presentation of practical data classifiers using MEE. These include multi-layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using a MEE-like concept is also presented. Examples, tests, evaluation experiments and comparisons with similar machines using classic approaches complement the descriptions.

  6. Classification of iconic images

    OpenAIRE

    Zrianina, Mariia; Kopf, Stephan

    2016-01-01

    Iconic images represent an abstract topic and use a presentation that is intuitively understood within a certain cultural context. For example, the abstract topic “global warming” may be represented by a polar bear standing alone on an ice floe. Such images are widely used in media and their automatic classification can help to identify high-level semantic concepts. This paper presents a system for the classification of iconic images. It uses a variation of the Bag of Visual Words approach wi...

  7. Casemix classification systems.

    Science.gov (United States)

    Fetter, R B

    1999-01-01

    The idea of using casemix classification to manage hospital services is not new, but has been limited by available technology. It was not until after the introduction of Medicare in the United States in 1965 that serious attempts were made to measure hospital production in order to contain spiralling costs. This resulted in a system of casemix classification known as diagnosis related groups (DRGs). This paper traces the development of DRGs and their evolution from the initial version to the All Patient Refined DRGs developed in 1991.

  8. The Search for Symmetries in the Genetic Code:

    Science.gov (United States)

    Antoneli, Fernando; Forger, Michael; Hornos, José Eduardo M.

    We give a full classification of the possible schemes for obtaining the distribution of multiplets observed in the standard genetic code by symmetry breaking in the context of finite groups, based on an extended notion of partial symmetry breaking that incorporates the intuitive idea of "freezing" first proposed by Francis Crick, which is given a precise mathematical meaning.

  9. Information gathering for CLP classification

    Directory of Open Access Journals (Sweden)

    Ida Marcello

    2011-01-01

    Full Text Available Regulation 1272/2008 includes provisions for two types of classification: harmonised classification and self-classification. The harmonised classification of substances is decided at Community level, and a list of harmonised classifications is included in Annex VI of the classification, labelling and packaging (CLP) Regulation. If a chemical substance is not included in the harmonised classification list, it must be self-classified, based on available information, according to the requirements of Annex I of the CLP Regulation. CLP provides that harmonised classification will be performed for carcinogenic, mutagenic or toxic to reproduction substances (CMR substances) and for respiratory sensitisers category 1, and for other hazard classes on a case-by-case basis. The first step of classification is the gathering of available and relevant information. This paper presents the procedure for gathering information and obtaining data. Data quality is also discussed.

  10. The paradox of atheoretical classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2016-01-01

    A distinction can be made between “artificial classifications” and “natural classifications,” where artificial classifications may adequately serve some limited purposes, but natural classifications are overall most fruitful by allowing inference and thus many different purposes. There is strong...... support for the view that a natural classification should be based on a theory (and, of course, that the most fruitful theory provides the most fruitful classification). Nevertheless, atheoretical (or “descriptive”) classifications are often produced. Paradoxically, atheoretical classifications may...... be very successful. The best example of a successful “atheoretical” classification is probably the prestigious Diagnostic and Statistical Manual of Mental Disorders (DSM) since its third edition from 1980. Based on such successes one may ask: Should the claim that classifications ideally are natural...

  11. A Proposal for Cardiac Arrhythmia Classification using Complexity Measures

    Directory of Open Access Journals (Sweden)

    AROTARITEI, D.

    2017-08-01

    Full Text Available Cardiovascular diseases are one of the major problems of humanity, and therefore one of their components, arrhythmia detection and classification, has drawn increased attention worldwide. The presence of randomness in discrete time series, like those arising in electrophysiology, is firmly connected with computational complexity measures. This connection can be used, for instance, in the analysis of RR-intervals of the electrocardiographic (ECG) signal, coded as a binary string, to detect and classify arrhythmia. Our approach uses three algorithms (Lempel-Ziv, Sample Entropy and T-Code) to compute the information complexity, and a classification tree to detect 13 types of arrhythmia, with encouraging results. To overcome the computational effort required for the complexity calculus, a cloud computing solution with executable code deployment is also proposed.
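
    The Lempel-Ziv complexity of a binary-coded RR-interval stream can be sketched as follows. This implements the common LZ76 phrase-counting variant as an illustration; the paper's exact formulation may differ, and the input string below is a toy example:

    ```python
    def lz_complexity(s):
        """Lempel-Ziv (LZ76) complexity: the number of distinct phrases
        found while scanning the string from left to right."""
        i, c, n = 0, 0, len(s)
        while i < n:
            l = 1
            # grow the current phrase while it already occurs in the prefix
            # (overlap with the phrase itself is allowed, per LZ76)
            while i + l <= n and s[i:i + l] in s[:i + l - 1]:
                l += 1
            c += 1      # one new phrase completed
            i += l
        return c

    s = "0001101001000101"     # toy binary-coded RR-interval sequence
    print(lz_complexity(s))    # → 6
    ```

    A regular rhythm produces a highly compressible string with low complexity, while an arrhythmic one pushes the phrase count up, which is what makes this measure usable as a classification feature.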

  12. A search for symmetries in the genetic code

    International Nuclear Information System (INIS)

    Hornos, J.E.M.; Hornos, Y.M.M.

    1991-01-01

    A search for symmetries based on the classification theorem of Cartan for the compact simple Lie algebras is performed to verify to what extent the genetic code is a manifestation of some underlying symmetry. An exact continuous symmetry group cannot be found to reproduce the present, universal code. However a unique approximate symmetry group is compatible with codon assignment for the fundamental amino acids and the termination codon. In order to obtain the actual genetic code, the symmetry must be slightly broken. (author). 27 refs, 3 figs, 6 tabs

  13. Classification of Polarimetric SAR Data Using Dictionary Learning

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg; Dahl, Anders Lindbjerg

    2012-01-01

    This contribution deals with classification of multilook fully polarimetric synthetic aperture radar (SAR) data by learning a dictionary of crop types present in the Foulum test site. The Foulum test site contains a large number of agricultural fields, as well as lakes, forests, natural vegetation......, grasslands and urban areas, which make it ideally suited for evaluation of classification algorithms. Dictionary learning centers around building a collection of image patches typical for the classification problem at hand. This requires initial manual labeling of the classes present in the data and is thus...... a method for supervised classification. Sparse coding of these image patches aims to maintain a proficient number of typical patches and associated labels. Data is consecutively classified by a nearest neighbor search of the dictionary elements and labeled with probabilities of each class. Each dictionary...

  14. Deep Recurrent Neural Networks for Supernovae Classification

    Science.gov (United States)

    Charnock, Tom; Moss, Adam

    2017-03-01

    We apply deep recurrent neural networks, which are capable of learning complex sequential information, to classify supernovae (code available at https://github.com/adammoss/supernovae). The observational time and filter fluxes are used as inputs to the network, but since the inputs are agnostic, additional data such as host galaxy information can also be included. Using the Supernovae Photometric Classification Challenge (SPCC) data, we find that deep networks are capable of learning about light curves; however, the performance of the network is highly sensitive to the amount of training data. For a training size of 50% of the representational SPCC data set (around 10^4 supernovae) we obtain a type-Ia versus non-type-Ia classification accuracy of 94.7%, an area under the Receiver Operating Characteristic curve AUC of 0.986 and an SPCC figure-of-merit F1 = 0.64. When using only the data for the early-epoch challenge defined by the SPCC, we achieve a classification accuracy of 93.1%, AUC of 0.977, and F1 = 0.58, results almost as good as with the whole light curve. By employing bidirectional neural networks, we can acquire impressive classification results between supernovae types I, II and III at an accuracy of 90.4% and AUC of 0.974. We also apply a pre-trained model to obtain classification probabilities as a function of time and show that it can give early indications of supernovae type. Our method is competitive with existing algorithms and has applications for future large-scale photometric surveys.
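    The sequential processing such a network performs can be illustrated without any deep-learning framework. The toy cell below applies one vanilla-RNN update per observation of a light curve; the fixed weights are purely illustrative assumptions, and the paper's actual models are trained multi-layer recurrent (LSTM/GRU-style) networks.

```python
import math

def rnn_step(h, x, W_h, W_x, b):
    """One vanilla-RNN update: h' = tanh(W_h h + W_x x + b)."""
    return [
        math.tanh(sum(W_h[i][j] * h[j] for j in range(len(h)))
                  + sum(W_x[i][k] * x[k] for k in range(len(x)))
                  + b[i])
        for i in range(len(h))
    ]

def run_sequence(xs, dim=2):
    """Fold a sequence of observations into a final hidden state,
    which a classifier head would then map to class probabilities."""
    # Tiny fixed weights, purely illustrative (not trained values).
    W_h = [[0.5 if i == j else 0.0 for j in range(dim)] for i in range(dim)]
    W_x = [[0.1] * len(xs[0]) for _ in range(dim)]
    b = [0.0] * dim
    h = [0.0] * dim
    for x in xs:
        h = rnn_step(h, x, W_h, W_x, b)
    return h
```

    Because the hidden state is carried forward at every step, the network can emit a classification probability after each new observation, which is what enables the early-epoch indications described above.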

  15. Code of Ethics

    Science.gov (United States)

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  16. Interleaved Product LDPC Codes

    OpenAIRE

    Baldi, Marco; Cancellieri, Giovanni; Chiaraluce, Franco

    2011-01-01

    Product LDPC codes take advantage of LDPC decoding algorithms and the high minimum distance of product codes. We propose to add suitable interleavers to improve the waterfall performance of LDPC decoding. Interleaving also reduces the number of low-weight codewords, which gives a further advantage in the error floor region.
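    The mechanism being exploited can be pictured with a generic example. The paper's specific interleaver design is not given in the abstract; the block below is an ordinary row-column block interleaver, shown only because it demonstrates the basic effect: a burst of consecutive channel errors is spread across different rows (component codewords) of a product code.

```python
def interleave(bits, rows, cols):
    """Write symbols row by row into a rows x cols array, read column by column."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows, cols):
    """Inverse permutation: write column by column, read row by row."""
    assert len(bits) == rows * cols
    out = [None] * (rows * cols)
    k = 0
    for c in range(cols):
        for r in range(rows):
            out[r * cols + c] = bits[k]
            k += 1
    return out
```

    For a 2 x 3 block, `interleave(list(range(6)), 2, 3)` returns `[0, 3, 1, 4, 2, 5]`: symbols that were adjacent in the codeword are separated by a full column stride on the channel, so a short error burst lands in distinct rows.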

  17. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  18. Scrum Code Camps

    DEFF Research Database (Denmark)

    Pries-Heje, Lene; Pries-Heje, Jan; Dalgaard, Bente

    2013-01-01

    is required. In this paper we present the design of such a new approach, the Scrum Code Camp, which can be used to assess agile team capability in a transparent and consistent way. A design science research approach is used to analyze properties of two instances of the Scrum Code Camp where seven agile teams...

  19. RFQ simulation code

    International Nuclear Information System (INIS)

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs

  20. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  1. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Science.gov (United States)

    2013-03-26

    ... Energy Conservation Code. International Existing Building Code. International Fire Code. International... Code. International Property Maintenance Code. International Residential Code. International Swimming Pool and Spa Code. International Wildland-Urban Interface Code. International Zoning Code. ICC Standards...

  2. Ecosystem classification, Chapter 2

    Science.gov (United States)

    M.J. Robin-Abbott; L.H. Pardo

    2011-01-01

    The ecosystem classification in this report is based on the ecoregions developed through the Commission for Environmental Cooperation (CEC) for North America (CEC 1997). Only ecosystems that occur in the United States are included. CEC ecoregions are described, with slight modifications, below (CEC 1997) and shown in Figures 2.1 and 2.2. We chose this ecosystem...

  3. The classification of phocomelia.

    Science.gov (United States)

    Tytherleigh-Strong, G; Hooper, G

    2003-06-01

    We studied 24 patients with 44 phocomelic upper limbs. Only 11 limbs could be grouped in the classification system of Frantz and O'Rahilly. The non-classifiable limbs were further studied and their characteristics identified. It is confirmed that phocomelia is not an intercalary defect.

  4. Principles for ecological classification

    Science.gov (United States)

    Dennis H. Grossman; Patrick Bourgeron; Wolf-Dieter N. Busch; David T. Cleland; William Platts; G. Ray; C. Robins; Gary Roloff

    1999-01-01

    The principal purpose of any classification is to relate common properties among different entities to facilitate understanding of evolutionary and adaptive processes. In the context of this volume, it is to facilitate ecosystem stewardship, i.e., to help support ecosystem conservation and management objectives.

  5. Mimicking human texture classification

    NARCIS (Netherlands)

    Rogowitz, B.E.; van Rikxoort, Eva M.; van den Broek, Egon; Pappas, T.N.; Schouten, Theo E.; Daly, S.J.

    2005-01-01

    In an attempt to mimic human (colorful) texture classification by a clustering algorithm, three lines of research were pursued, in which 180 texture images (both their color and gray-scale equivalents), drawn from the OuTex and VisTex databases, served as the test set. First, a k-means algorithm was

  6. Classification, confusion and misclassification

    African Journals Online (AJOL)

    The classification of objects and phenomena in science and nature has fascinated academics since Carl Linnaeus, the Swedish botanist and zoologist, created his binomial description of living things in the 1700s and probably long before in accounts of others in textbooks long since gone. It must have concerned human ...

  7. Classifications in popular music

    NARCIS (Netherlands)

    van Venrooij, A.; Schmutz, V.; Wright, J.D.

    2015-01-01

    The categorical system of popular music, such as genre categories, is a highly differentiated and dynamic classification system. In this article we present work that studies different aspects of these categorical systems in popular music. Following the work of Paul DiMaggio, we focus on four

  8. Shark Teeth Classification

    Science.gov (United States)

    Brown, Tom; Creel, Sally; Lee, Velda

    2009-01-01

    On a recent autumn afternoon at Harmony Leland Elementary in Mableton, Georgia, students in a fifth-grade science class investigated the essential process of classification--the act of putting things into groups according to some common characteristics or attributes. While they may have honed these skills earlier in the week by grouping their own…

  9. Text document classification

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana

    č. 62 (2005), s. 53-54 ISSN 0926-4981 R&D Projects: GA AV ČR IAA2075302; GA AV ČR KSK1019101; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : document representation * categorization * classification Subject RIV: BD - Theory of Information

  10. Classification in Medical Imaging

    DEFF Research Database (Denmark)

    Chen, Chen

    Classification is extensively used in the context of medical image analysis for the purpose of diagnosis or prognosis. In order to classify image content correctly, one needs to extract efficient features with discriminative properties and build classifiers based on these features. In addition...... on characterizing human faces and emphysema disease in lung CT images....

  11. Improving Student Question Classification

    Science.gov (United States)

    Heiner, Cecily; Zachary, Joseph L.

    2009-01-01

    Students in introductory programming classes often articulate their questions and information needs incompletely. Consequently, the automatic classification of student questions to provide automated tutorial responses is a challenging problem. This paper analyzes 411 questions from an introductory Java programming course by reducing the natural…

  12. NOUN CLASSIFICATION IN ESAHIE

    African Journals Online (AJOL)

    The present work deals with noun classification in Esahie (Kwa, Niger ... phonological information influences the noun (form) class system of Esahie. ... between noun classes and (grammatical) Gender is interrogated (in the light of ..... the (A) argument6 precedes the verb and the (P) argument7 follows the verb in a simple.

  13. Dynamic Latent Classification Model

    DEFF Research Database (Denmark)

    Zhong, Shengtong; Martínez, Ana M.; Nielsen, Thomas Dyhre

    as possible. Motivated by this problem setting, we propose a generative model for dynamic classification in continuous domains. At each time point the model can be seen as combining a naive Bayes model with a mixture of factor analyzers (FA). The latent variables of the FA are used to capture the dynamics...

  14. Classification of myocardial infarction

    DEFF Research Database (Denmark)

    Saaby, Lotte; Poulsen, Tina Svenstrup; Hosbond, Susanne Elisabeth

    2013-01-01

    The classification of myocardial infarction into 5 types was introduced in 2007 as an important component of the universal definition. In contrast to the plaque rupture-related type 1 myocardial infarction, type 2 myocardial infarction is considered to be caused by an imbalance between demand...

  15. Event Classification using Concepts

    NARCIS (Netherlands)

    Boer, M.H.T. de; Schutte, K.; Kraaij, W.

    2013-01-01

    The semantic gap is one of the challenges in the GOOSE project. In this paper a Semantic Event Classification (SEC) system is proposed as an initial step in tackling the semantic gap challenge in the GOOSE project. This system uses semantic text analysis, multiple feature detectors using the BoW

  16. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang

    2011-06-07

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.
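    The usage statistics described above can be reproduced in outline. The snippet below simply tallies codon counts and GC content for a coding sequence; the authors' algebraic representation and the 917-genome dataset are beyond an abstract-level sketch, so the frame-0 reading and the toy sequence in the test are illustrative assumptions.

```python
from collections import Counter

def codon_usage(seq: str) -> Counter:
    """Count codons in reading frame 0 of a DNA coding sequence."""
    seq = seq.upper()
    usable = len(seq) - len(seq) % 3  # drop any trailing partial codon
    return Counter(seq[i:i + 3] for i in range(0, usable, 3))

def gc_content(seq: str) -> float:
    """Fraction of G and C bases in the sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)
```

    Correlating per-genome codon or amino acid usage vectors against `gc_content` across many genomes is the kind of analysis the abstract reports at scale.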

  17. On the Organizational Dynamics of the Genetic Code

    KAUST Repository

    Zhang, Zhang; Yu, Jun

    2011-01-01

    The organization of the canonical genetic code needs to be thoroughly illuminated. Here we reorder the four nucleotides—adenine, thymine, guanine and cytosine—according to their emergence in evolution, and apply the organizational rules to devising an algebraic representation for the canonical genetic code. Under a framework of the devised code, we quantify codon and amino acid usages from a large collection of 917 prokaryotic genome sequences, and associate the usages with its intrinsic structure and classification schemes as well as amino acid physicochemical properties. Our results show that the algebraic representation of the code is structurally equivalent to a content-centric organization of the code and that codon and amino acid usages under different classification schemes were correlated closely with GC content, implying a set of rules governing composition dynamics across a wide variety of prokaryotic genome sequences. These results also indicate that codons and amino acids are not randomly allocated in the code, where the six-fold degenerate codons and their amino acids have important balancing roles for error minimization. Therefore, the content-centric code is of great usefulness in deciphering its hitherto unknown regularities as well as the dynamics of nucleotide, codon, and amino acid compositions.

  18. Validation of thermalhydraulic codes

    International Nuclear Information System (INIS)

    Wilkie, D.

    1992-01-01

    Thermalhydraulic codes need to be validated against experimental data collected over a wide range of situations if they are to be relied upon. A good example is provided by the nuclear industry, where codes are used for safety studies and for determining operating conditions. Errors in the codes could lead to financial penalties, to incorrect estimation of the consequences of accidents, and even to the accidents themselves. Comparison between prediction and experiment is often described qualitatively or in approximate terms, e.g. ''agreement is within 10%''. A quantitative method is preferable, especially when several competing codes are available; the codes can then be ranked in order of merit. Such a method is described. (Author)
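    The argument for a quantitative figure of merit can be made concrete. The abstract does not reproduce the author's actual formula, so the sketch below uses a root-mean-square relative deviation purely as an example of how competing codes could be scored and ranked against the same experimental data.

```python
import math

def rms_relative_deviation(predicted, measured):
    """Root-mean-square of (prediction - measurement) / measurement."""
    terms = [((p - m) / m) ** 2 for p, m in zip(predicted, measured)]
    return math.sqrt(sum(terms) / len(terms))

def rank_codes(results, measured):
    """Rank {code_name: predictions} by ascending RMS relative deviation,
    so the best-agreeing code comes first."""
    scores = {name: rms_relative_deviation(pred, measured)
              for name, pred in results.items()}
    return sorted(scores, key=scores.get)
```

    Unlike "agreement is within 10%", a single score per code makes the ranking of several competing codes unambiguous.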

  19. Fracture flow code

    International Nuclear Information System (INIS)

    Dershowitz, W.; Herbert, A.; Long, J.

    1989-03-01

    The hydrology of the SCV site will be modelled utilizing discrete fracture flow models. These models are complex and cannot be fully certified by comparison to analytical solutions. The best approach for verification of these codes is therefore cross-verification between different codes. This is complicated by the variation in assumptions and solution techniques utilized in different codes. Cross-verification procedures are defined which allow comparison of the codes developed by Harwell Laboratory, Lawrence Berkeley Laboratory, and Golder Associates Inc. Six cross-verification datasets are defined for deterministic and stochastic verification of geometric and flow features of the codes. Additional datasets for verification of transport features will be documented in a future report. (13 figs., 7 tabs., 10 refs.) (authors)

  20. Tumor taxonomy for the developmental lineage classification of neoplasms

    International Nuclear Information System (INIS)

    Berman, Jules J

    2004-01-01

    The new 'Developmental lineage classification of neoplasms' was described in a prior publication. The classification is simple (the entire hierarchy is described with just 39 classifiers), comprehensive (providing a place for every tumor of man), and consistent with recent attempts to characterize tumors by cytogenetic and molecular features. A taxonomy is a list of the instances that populate a classification. The taxonomy of neoplasia attempts to list every known term for every known tumor of man. The taxonomy provides each concept with a unique code and groups synonymous terms under the same concept. A Perl script validated successive drafts of the taxonomy ensuring that: 1) each term occurs only once in the taxonomy; 2) each term occurs in only one tumor class; 3) each concept code occurs in one and only one hierarchical position in the classification; and 4) the file containing the classification and taxonomy is a well-formed XML (eXtensible Markup Language) document. The taxonomy currently contains 122,632 different terms encompassing 5,376 neoplasm concepts. Each concept has, on average, 23 synonyms. The taxonomy populates 'The developmental lineage classification of neoplasms,' and is available as an XML file, currently 9+ Megabytes in length. A representation of the classification/taxonomy listing each term followed by its code, followed by its full ancestry, is available as a flat-file, 19+ Megabytes in length. The taxonomy is the largest nomenclature of neoplasms, with more than twice the number of neoplasm names found in other medical nomenclatures, including the 2004 version of the Unified Medical Language System, the Systematized Nomenclature of Medicine Clinical Terminology, the National Cancer Institute's Thesaurus, and the International Classification of Diseases for Oncology. This manuscript describes a comprehensive taxonomy of neoplasia that collects synonymous terms under a unique code number and assigns each
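    The four validation rules enforced by the authors' Perl script can be restated in any language. The sketch below checks the two term-uniqueness rules on a toy term table; the tuple layout and the example terms in the test are assumptions for illustration, not the authors' actual data format, and the concept-code and XML well-formedness rules are omitted for brevity.

```python
def validate_taxonomy(records):
    """records: list of (term, concept_code, tumor_class) tuples.
    Rule 1: each term occurs only once in the taxonomy.
    Rule 2: each term occurs in only one tumor class."""
    errors = []
    seen_terms = set()
    term_class = {}
    for term, concept, tclass in records:
        if term in seen_terms:
            errors.append(f"duplicate term: {term}")
        seen_terms.add(term)
        if term_class.setdefault(term, tclass) != tclass:
            errors.append(f"term in two classes: {term}")
    return errors
```

    Running such checks on every draft is what keeps a 122,632-term nomenclature internally consistent.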

  1. NEW CLASSIFICATION OF ECOPOLICES

    Directory of Open Access Journals (Sweden)

    VOROBYOV V. V.

    2016-09-01

    Full Text Available Problem statement. Ecopolices are the newest stage of urban planning. They have to be considered as material, energy and informational structures included in the dynamic-evolutionary matrix networks of exchange processes in ecosystems. However, ecopolice classifications developed on the basis of such approaches have not yet been produced, and this determines the topicality of the article. Analysis of publications on theoretical and applied aspects of ecopolice formation showed that work on them proceeds mainly in the context of the latest scientific and technological achievements in various fields of knowledge. These settlements are technocratic. They are tied to the morphology of space and the network structures of regional and local natural ecosystems; lacking independent stability, they cannot exist without continuous human support. In other words, they do not live up to the ecopolice idea. An objective, symbiotic search for an ecopolice concept, together with the development of classifications, is therefore overdue. Purpose statement is to develop an objective rationale for ecopolices and to propose their new classification. Conclusion. At the base of the ecopolice classification should lie the idea of correlating the elements of their general plans and the types of human activity with the natural mechanisms of reception, processing and transmission of matter, energy and information between geo-ecosystems, the planet, man, the material part of the ecopolice and Cosmos. The new ecopolice classification should be based on the principles of multi-dimensional, time-spatial symbiotic coherence with ecosystem exchange networks. With this approach the ecopolice function derives not from a subjective anthropocentric economy but from the holistic paradigm of Genesis; or, otherwise, not from the Consequence but from the Cause.

  2. Efficient Fingercode Classification

    Science.gov (United States)

    Sun, Hong-Wei; Law, Kwok-Yan; Gollmann, Dieter; Chung, Siu-Leung; Li, Jian-Bin; Sun, Jia-Guang

    In this paper, we present an efficient fingerprint classification algorithm which is an essential component in many critical security application systems, e.g. systems in the e-government and e-finance domains. Fingerprint identification is one of the most important security requirements in homeland security systems such as personnel screening and anti-money laundering. The problem of fingerprint identification involves searching (matching) the fingerprint of a person against each of the fingerprints of all registered persons. To enhance performance and reliability, a common approach is to reduce the search space by firstly classifying the fingerprints and then performing the search in the respective class. Jain et al. proposed a fingerprint classification algorithm based on a two-stage classifier, which uses a K-nearest neighbor classifier in its first stage. The fingerprint classification algorithm is based on the fingercode representation which is an encoding of fingerprints that has been demonstrated to be an effective fingerprint biometric scheme because of its ability to capture both local and global details in a fingerprint image. We enhance this approach by improving the efficiency of the K-nearest neighbor classifier for fingercode-based fingerprint classification. Our research firstly investigates the various fast search algorithms in vector quantization (VQ) and the potential application in fingerprint classification, and then proposes two efficient algorithms based on the pyramid-based search algorithms in VQ. Experimental results on DB1 of FVC 2004 demonstrate that our algorithms can outperform the full search algorithm and the original pyramid-based search algorithms in terms of computational efficiency without sacrificing accuracy.
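    The first-stage K-nearest neighbor classifier referred to above is standard and can be sketched directly; the paper's contribution, the pyramid-based fast search, accelerates exactly this computation. The short feature vectors and class labels below are illustrative stand-ins for fingercodes, which in practice are fixed-length vectors of filter responses.

```python
import math
from collections import Counter

def knn_classify(query, samples, k=3):
    """samples: list of (feature_vector, label) pairs. Majority vote over
    the k nearest neighbors by Euclidean distance -- the brute-force
    baseline that fast search algorithms aim to speed up."""
    dists = sorted(
        (math.dist(query, vec), label) for vec, label in samples
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

    The full sort here costs O(n log n) per query over all enrolled prints, which is why restricting the subsequent identification search to the predicted class, and speeding up the neighbor search itself, both pay off.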

  3. Differential Classification of Dementia

    Directory of Open Access Journals (Sweden)

    E. Mohr

    1995-01-01

    Full Text Available In the absence of biological markers, dementia classification remains complex both in terms of characterization as well as early detection of the presence or absence of dementing symptoms, particularly in diseases with possible secondary dementia. An empirical, statistical approach using neuropsychological measures was therefore developed to distinguish demented from non-demented patients and to identify differential patterns of cognitive dysfunction in neurodegenerative disease. Age-scaled neurobehavioral test results (Wechsler Adult Intelligence Scale-Revised and Wechsler Memory Scale) from Alzheimer's (AD) and Huntington's (HD) patients, matched for intellectual disability, as well as normal controls, were used to derive a classification formula. Stepwise discriminant analysis accurately (99% correct) distinguished controls from demented patients, and separated the two patient groups (79% correct). Variables discriminating between HD and AD patient groups consisted of complex psychomotor tasks, visuospatial function, attention and memory. The reliability of the classification formula was demonstrated with a new, independent sample of AD and HD patients which yielded virtually identical results (classification accuracy for dementia: 96%; AD versus HD: 78%). To validate the formula, the discriminant function was applied to Parkinson's (PD) patients, 38% of whom were classified as demented. The validity of the classification was demonstrated by significant PD subgroup differences on measures of dementia not included in the discriminant function. Moreover, a majority of demented PD patients (65%) were classified as having an HD-like pattern of cognitive deficits, in line with previous reports of the subcortical nature of PD dementia. This approach may thus be useful in classifying presence or absence of dementia and in discriminating between dementia subtypes in cases of secondary or coincidental dementia.

  4. 78 FR 54970 - Cotton Futures Classification: Optional Classification Procedure

    Science.gov (United States)

    2013-09-09

    ... Service 7 CFR Part 27 [AMS-CN-13-0043] RIN 0581-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing Service, USDA. ACTION: Proposed rule. SUMMARY: The... optional cotton futures classification procedure--identified and known as ``registration'' by the U.S...

  5. Lattice-Like Total Perfect Codes

    Directory of Open Access Journals (Sweden)

    Araujo Carlos

    2014-02-01

    Full Text Available A contribution is made to the classification of lattice-like total perfect codes in integer lattices Λ^n via pairs (G, Φ) formed by abelian groups G and homomorphisms Φ: Z^n → G. A conjecture is posed that the cited contribution covers all possible cases. A related conjecture, on the unfinished work on open problems on lattice-like perfect dominating sets in Λ^n with induced components that are parallel paths of length > 1, is posed as well.

  6. Radon Protection in the Technical Building Code

    International Nuclear Information System (INIS)

    Frutos, B.; Garcia, J. P.; Martin, J. L.; Olaya, M.; Serrano, J I.; Suarez, E.; Fernandez, J. A.; Rodrigo, F.

    2003-01-01

    Building construction in areas where the ground shows high radon gas concentrations requires the incorporation of certain measures to prevent the accumulation of this gas inside buildings. These measures should be considered primarily in the design and construction phases, and should take into account the area of the country where construction will take place, depending on the potential risk of radon entry. Within the Technical Building Code, radon protection has been addressed through a general classification of the country, and of the specific areas where building is to take place, into different risk categories, and through the introduction of building techniques appropriate to each area. (Author) 17 refs

  7. Huffman coding in advanced audio coding standard

    Science.gov (United States)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures of the Advanced Audio Coding (AAC) Huffman noiseless encoder, its optimisations, and a working implementation. Much attention has been paid to optimising the demand for hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible within this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.
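    Huffman coding itself, independent of the AAC standard's fixed codebooks and the hardware mapping discussed in the article, reduces to merging the two least-frequent subtrees until one tree remains. A minimal software sketch (not the paper's architecture):

```python
import heapq

def huffman_codes(freqs):
    """Build a Huffman code table {symbol: bitstring} from {symbol: count}."""
    # Each heap item: (total_count, tie_breaker, {symbol: code_so_far}).
    heap = [(n, i, {sym: ""}) for i, (sym, n) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        n1, _, t1 = heapq.heappop(heap)  # two least-frequent subtrees
        n2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (n1 + n2, tie, merged))
        tie += 1
    return heap[0][2]
```

    For frequencies {a: 5, b: 2, c: 1} the table assigns a 1-bit code to `a` and 2-bit codes to `b` and `c`, for a minimum weighted length of 11 bits; a hardware encoder replaces this tree construction with lookups into the standard's precomputed tables, which is where the memory-size optimisation matters.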

  8. 32 CFR 2700.22 - Classification guides.

    Science.gov (United States)

    2010-07-01

    ... SECURITY INFORMATION REGULATIONS Derivative Classification § 2700.22 Classification guides. OMSN shall... direct derivative classification, shall identify the information to be protected in specific and uniform...

  9. Report number codes

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  10. Report number codes

    International Nuclear Information System (INIS)

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name

  11. Nonterminals, homomorphisms and codings in different variations of OL-systems. II. Nondeterministic systems

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Rozenberg, Grzegorz; Salomaa, Arto

    1974-01-01

    Continuing the work begun in Part I of this paper, we consider now variations of nondeterministic OL-systems. The present Part II of the paper contains a systematic classification of the effect of nonterminals, codings, weak codings, nonerasing homomorphisms and homomorphisms for all basic variat...

  12. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  13. Coded Splitting Tree Protocols

    DEFF Research Database (Denmark)

    Sørensen, Jesper Hemming; Stefanovic, Cedomir; Popovski, Petar

    2013-01-01

    This paper presents a novel approach to multiple access control called coded splitting tree protocol. The approach builds on the known tree splitting protocols, code structure and successive interference cancellation (SIC). Several instances of the tree splitting protocol are initiated, each...... instance is terminated prematurely and subsequently iterated. The combined set of leaves from all the tree instances can then be viewed as a graph code, which is decodable using belief propagation. The main design problem is determining the order of splitting, which enables successful decoding as early...
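The successive interference cancellation (SIC) step that the abstract mentions can be sketched in isolation: a slot whose contents reduce to a single unknown packet reveals that packet, which is then cancelled from every other slot, possibly creating new singletons. This is an illustrative peeling sketch only, not the paper's actual protocol; the real scheme also tracks the combined payloads and the tree-splitting order, omitted here.

```python
def sic_peel(slots):
    """Peel slots by successive interference cancellation.

    `slots` is a list of sets of packet ids (the packets colliding in each
    slot). A singleton slot resolves its packet; the resolved packet is then
    removed (cancelled) from all other slots. Returns the resolved ids.
    """
    slots = [set(s) for s in slots]
    resolved = set()
    progress = True
    while progress:
        progress = False
        for slot in slots:
            if len(slot) == 1:
                pkt = next(iter(slot))
                resolved.add(pkt)
                for other in slots:
                    other.discard(pkt)  # interference cancellation
                progress = True
    return resolved

# Three colliding users; the singleton slot {3} starts the peeling chain.
print(sic_peel([{1, 2}, {2, 3}, {3}]))  # {1, 2, 3}
```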

  14. Transport theory and codes

    International Nuclear Information System (INIS)

    Clancy, B.E.

    1986-01-01

    This chapter begins with a neutron transport equation which includes the one dimensional plane geometry problems, the one dimensional spherical geometry problems, and numerical solutions. The section on the ANISN code and its look-alikes covers problems which can be solved; eigenvalue problems; outer iteration loop; inner iteration loop; and finite difference solution procedures. The input and output data for ANISN is also discussed. Two dimensional problems such as the DOT code are given. Finally, an overview of the Monte-Carlo methods and codes are elaborated on

  15. Gravity inversion code

    International Nuclear Information System (INIS)

    Burkhard, N.R.

    1979-01-01

    The gravity inversion code applies stabilized linear inverse theory to determine the topography of a subsurface density anomaly from Bouguer gravity data. The gravity inversion program consists of four source codes: SEARCH, TREND, INVERT, and AVERAGE. TREND and INVERT are used iteratively to converge on a solution. SEARCH forms the input gravity data files for Nevada Test Site data. AVERAGE performs a covariance analysis on the solution. This document describes the necessary input files and the proper operation of the code. 2 figures, 2 tables
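The "stabilized linear inverse theory" the abstract refers to is commonly realized as damped (Tikhonov-regularized) least squares. The following is a minimal two-parameter sketch of such a stabilized step, not the actual INVERT algorithm, whose internals the abstract does not give: it solves the normal equations (GᵀG + λI)m = Gᵀd by Cramer's rule.

```python
def damped_least_squares(G, d, lam):
    """One stabilized inversion step for a 2-parameter model m:
    solve (G^T G + lam*I) m = G^T d. Illustrative sketch only."""
    n = len(G)
    # Normal-equation matrix A = G^T G + lam*I and right-hand side b = G^T d
    A = [[sum(G[k][i] * G[k][j] for k in range(n)) + (lam if i == j else 0.0)
          for j in range(2)] for i in range(2)]
    b = [sum(G[k][i] * d[k] for k in range(n)) for i in range(2)]
    # Solve the 2x2 system by Cramer's rule
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

# With lam = 0 and consistent data d = G @ [1, 2], the model is recovered;
# increasing lam stabilizes the solve at the cost of biasing it toward zero.
G = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
d = [1.0, 2.0, 3.0]
print(damped_least_squares(G, d, 0.0))  # [1.0, 2.0]
```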

  16. IAEA Classification of Uranium Deposits

    International Nuclear Information System (INIS)

    Bruneton, Patrice

    2014-01-01

    Classifications of uranium deposits follow two general approaches, focusing either on descriptive features such as the geotectonic position, the host rock type, the orebody morphology, … («geologic classification»), or on genetic aspects («genetic classification»).

  17. Classification of Osteogenesis Imperfecta revisited

    NARCIS (Netherlands)

    van Dijk, F. S.; Pals, G.; van Rijn, R. R.; Nikkels, P. G. J.; Cobben, J. M.

    2010-01-01

    In 1979 Sillence proposed a classification of Osteogenesis Imperfecta (OI) in OI types I, II, III and IV. In 2004 and 2007 this classification was expanded with OI types V-VIII because of distinct clinical features and/or different causative gene mutations. We propose a revised classification of OI

  18. The future of general classification

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2013-01-01

    Discusses problems related to accessing multiple collections using a single retrieval language. Surveys the concepts of interoperability and switching language. Finds that mapping between more indexing languages always will be an approximation. Surveys the issues related to general classification...... and contrasts that to special classifications. Argues for the use of general classifications to provide access to collections nationally and internationally....

  19. Classification of Strawberry Fruit Shape by Machine Learning

    Science.gov (United States)

    Ishikawa, T.; Hayashi, A.; Nagamatsu, S.; Kyutoku, Y.; Dan, I.; Wada, T.; Oku, K.; Saeki, Y.; Uto, T.; Tanabata, T.; Isobe, S.; Kochi, N.

    2018-05-01

    Shape is one of the most important traits of agricultural products due to its relationships with the quality, quantity, and value of the products. For strawberries, the nine types of fruit shape were defined and classified by humans based on the sampler patterns of the nine types. In this study, we tested the classification of strawberry shapes by machine learning in order to increase the accuracy of the classification, and we introduce the concept of computerization into this field. Four types of descriptors were extracted from the digital images of strawberries: (1) the Measured Values (MVs) including the length of the contour line, the area, the fruit length and width, and the fruit width/length ratio; (2) the Ellipse Similarity Index (ESI); (3) Elliptic Fourier Descriptors (EFDs), and (4) Chain Code Subtraction (CCS). We used these descriptors for the classification test along with the random forest approach, and eight of the nine shape types were classified with combinations of MVs + CCS + EFDs. CCS is a descriptor that adds human knowledge to the chain codes, and it showed higher robustness in classification than the other descriptors. Our results suggest machine learning's high ability to classify fruit shapes accurately. We will attempt to increase the classification accuracy and apply the machine learning methods to other plant species.
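The Chain Code Subtraction (CCS) descriptor above builds on chain codes; its exact definition is not given in the abstract, but the underlying 8-direction Freeman chain code of a contour can be sketched as follows (the example contour is hypothetical):

```python
# 8-connected Freeman chain code directions: code -> (dx, dy)
DIRS = [(1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1)]

def chain_code(contour):
    """Freeman chain code of a closed contour given as successive
    8-connected pixel coordinates, wrapping from the last point back
    to the first."""
    codes = []
    for (x0, y0), (x1, y1) in zip(contour, contour[1:] + contour[:1]):
        codes.append(DIRS.index((x1 - x0, y1 - y0)))
    return codes

# A unit square traversed counter-clockwise
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(chain_code(square))  # [0, 2, 4, 6]
```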

  20. Recommendations for the classification of HIV associated neuromanifestations in the German DRG system.

    Science.gov (United States)

    Evers, Stefan; Fiori, W; Brockmeyer, N; Arendt, G; Husstedt, I-W

    2005-09-12

    HIV associated neuromanifestations are of growing importance in the in-patient treatment of HIV infected patients. In Germany, all in-patients have to be coded according to the ICD-10 classification and the German DRG system. We present recommendations on how to code the different primary and secondary neuromanifestations of HIV infection. These recommendations are based on the commentary of the German DRG procedures and aim to establish uniform coding of neuromanifestations.

  1. Intelligent Computer Vision System for Automated Classification

    International Nuclear Information System (INIS)

    Jordanov, Ivan; Georgieva, Antoniya

    2010-01-01

    In this paper we investigate an Intelligent Computer Vision System applied for recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (features number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: combination of feature generation techniques; application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and use of suitable NN design and learning method.

  2. [Headache: classification and diagnosis].

    Science.gov (United States)

    Carbaat, P A T; Couturier, E G M

    2016-11-01

    There are many types of headache and, moreover, many people have different types of headache at the same time. Adequate treatment is possible only on the basis of the correct diagnosis. Technically and in terms of content, the current diagnostic process for headache is based on the 'International Classification of Headache Disorders' (ICHD-3 beta), produced under the auspices of the International Headache Society. This classification is based on a distinction between primary and secondary headaches. The most common primary headache types are tension-type headache, migraine and cluster headache. Application of uniform diagnostic concepts is essential for arriving at the most appropriate treatment of the various types of headache.

  3. Classification of hand eczema

    DEFF Research Database (Denmark)

    Agner, T; Aalto-Korte, K; Andersen, K E

    2015-01-01

    BACKGROUND: Classification of hand eczema (HE) is mandatory in epidemiological and clinical studies, and also important in clinical work. OBJECTIVES: The aim was to test a recently proposed classification system of HE in clinical practice in a prospective multicentre study. METHODS: Patients were...... recruited from nine different tertiary referral centres. All patients underwent examination by specialists in dermatology and were checked using relevant allergy testing. Patients were classified into one of the six diagnostic subgroups of HE: allergic contact dermatitis, irritant contact dermatitis, atopic...... system investigated in the present study was useful, being able to give an appropriate main diagnosis for 89% of HE patients, and for another 7% when using two main diagnoses. The fact that more than half of the patients had one or more additional diagnoses illustrates that HE is a multifactorial disease....

  4. Sound classification of dwellings

    DEFF Research Database (Denmark)

    Rasmussen, Birgit

    2012-01-01

    National schemes for sound classification of dwellings exist in more than ten countries in Europe, typically published as national standards. The schemes define quality classes reflecting different levels of acoustical comfort. Main criteria concern airborne and impact sound insulation between...... dwellings, facade sound insulation and installation noise. The schemes have been developed, implemented and revised gradually since the early 1990s. However, due to lack of coordination between countries, there are significant discrepancies, and new standards and revisions continue to increase the diversity...... is needed, and a European COST Action TU0901 "Integrating and Harmonizing Sound Insulation Aspects in Sustainable Urban Housing Constructions", has been established and runs 2009-2013, one of the main objectives being to prepare a proposal for a European sound classification scheme with a number of quality...

  5. Granular loess classification based

    International Nuclear Information System (INIS)

    Browzin, B.S.

    1985-01-01

    This paper discusses how loess might be identified by two index properties: the granulometric composition and the dry unit weight. These two indices are necessary but not always sufficient for identification of loess. On the basis of analyses of samples from three continents, it was concluded that the 0.01-0.5-mm fraction deserves the name loessial fraction. Based on the loessial fraction concept, a granulometric classification of loess is proposed. A triangular chart is used to classify loess
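The two index properties proposed above (content of the 0.01-0.5 mm "loessial fraction" and dry unit weight) suggest a simple screening function. The threshold values below are hypothetical placeholders for illustration; the paper's actual triangular-chart boundaries are not given in the abstract, and the paper itself notes the two indices are necessary but not always sufficient.

```python
def classify_loess(loessial_fraction_pct, dry_unit_weight_kn_m3,
                   min_fraction=50.0, max_unit_weight=14.5):
    """Screen a soil sample using the two proposed index properties.
    Threshold defaults are hypothetical, not taken from the paper."""
    if (loessial_fraction_pct >= min_fraction
            and dry_unit_weight_kn_m3 <= max_unit_weight):
        return "candidate loess (check further criteria)"
    return "not loess by these indices"

print(classify_loess(65.0, 13.0))  # candidate loess (check further criteria)
print(classify_loess(30.0, 16.0))  # not loess by these indices
```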

  6. Classification and regression trees

    CERN Document Server

    Breiman, Leo; Olshen, Richard A; Stone, Charles J

    1984-01-01

    The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.
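The core of the tree-construction methodology is the split search: for each candidate threshold, measure the impurity of the two resulting children and keep the best. A minimal single-feature sketch using the Gini impurity (one of the criteria developed in the book; this is an illustration, not the book's full algorithm, which also handles pruning and multiple features):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(xs, ys):
    """Choose the threshold on one numeric feature that minimizes the
    weighted Gini impurity of the two children. Returns (threshold, score)."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
ys = ["a", "a", "a", "b", "b", "b"]
print(best_split(xs, ys))  # (3.0, 0.0) -- a perfect split
```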

  7. CLASSIFICATION OF CRIMINAL GROUPS

    OpenAIRE

    Natalia Romanova

    2013-01-01

    New types of criminal groups are emerging in modern society. These types have their special criminal subculture. The research objective is to develop new parameters of classification of modern criminal groups, create a new typology of criminal groups and identify some features of their subculture. Research methodology is based on the system approach that includes using the method of analysis of documentary sources (materials of a criminal case), method of conversations with the members of the...

  8. Decimal Classification Editions

    Directory of Open Access Journals (Sweden)

    Zenovia Niculescu

    2009-01-01

    The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology, reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme from the point of view of improving the informational offer, through basic index terms, revised and developed, as well as valuing the auxiliary notations.

  9. Decimal Classification Editions

    OpenAIRE

    Zenovia Niculescu

    2009-01-01

    The study approaches the evolution of Dewey Decimal Classification editions from the perspective of updating the terminology, reallocating and expanding the main and auxiliary structure of the Dewey indexing language. The comparative analysis of DDC editions emphasizes the efficiency of the Dewey scheme from the point of view of improving the informational offer, through basic index terms, revised and developed, as well as valuing the auxiliary notations.

  10. Fulcrum Network Codes

    DEFF Research Database (Denmark)

    2015-01-01

    Fulcrum network codes, which are a network coding framework, achieve three objectives: (i) to reduce the overhead per coded packet to almost 1 bit per source packet; (ii) to operate the network using only low field size operations at intermediate nodes, dramatically reducing complexity...... in the network; and (iii) to deliver an end-to-end performance that is close to that of a high field size network coding system for high-end receivers while simultaneously catering to low-end ones that can only decode in a lower field size. Sources may encode using a high field size expansion to increase...... the number of dimensions seen by the network using a linear mapping. Receivers can tradeoff computational effort with network delay, decoding in the high field size, the low field size, or a combination thereof....
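The low-field operations mentioned above, GF(2) recoding at intermediate nodes, amount to bytewise XOR of packets. The following is a minimal sketch of that GF(2) recoding idea only; it is not the Fulcrum construction itself, which additionally uses an outer high-field expansion at the source that is omitted here.

```python
def xor_packets(a, b):
    """GF(2) linear combination of two equal-length packets (bytewise XOR)."""
    return bytes(x ^ y for x, y in zip(a, b))

p1, p2 = b"\x01\x02\x03", b"\xf0\x0f\xaa"
coded = xor_packets(p1, p2)         # an intermediate node only XORs
recovered = xor_packets(coded, p1)  # a receiver holding p1 recovers p2
print(recovered == p2)  # True
```

Because XOR is its own inverse, combining and cancelling packets costs only one pass over the bytes, which is the complexity reduction the framework targets at intermediate nodes.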

  11. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed; Ghanem, Bernard; Wonka, Peter

    2018-01-01

    coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements

  12. SASSYS LMFBR systems code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time

  13. OCA Code Enforcement

    Data.gov (United States)

    Montgomery County of Maryland — The Office of the County Attorney (OCA) processes Code Violation Citations issued by County agencies. The citations can be viewed by issued department, issued date...

  14. The fast code

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, L.N.; Wilson, R.E. [Oregon State Univ., Dept. of Mechanical Engineering, Corvallis, OR (United States)

    1996-09-01

    The FAST Code, which is capable of determining structural loads on a flexible, teetering, horizontal-axis wind turbine, is described, and comparisons of calculated loads with test data are given at two wind speeds for the ESI-80. The FAST Code models a two-bladed HAWT with degrees of freedom for blade bending, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffnesses, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included, as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms, and azimuth averaged bin plots. It is concluded that agreement between the FAST Code and test results is good. (au)

  15. Code Disentanglement: Initial Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wohlbier, John Greaton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kelley, Timothy M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockefeller, Gabriel M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Calef, Matthew Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
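The levelization property described above, that the "uses" relation forms a directed acyclic graph, can be checked mechanically. A small sketch (package names are hypothetical): repeatedly peel off the packages whose dependencies are all satisfied; if at some point none are, the relation has a cycle and cannot be levelized.

```python
def levelize(uses):
    """Group packages into levels (level 0 = no dependencies), or return
    None if the 'uses' relation contains a cycle. `uses` maps each
    package name to the set of packages it uses."""
    remaining = {p: set(deps) for p, deps in uses.items()}
    levels = []
    while remaining:
        ready = sorted(p for p, deps in remaining.items() if not deps)
        if not ready:
            return None  # cycle: no package has all its dependencies met
        levels.append(ready)
        for p in ready:
            del remaining[p]
        for deps in remaining.values():
            deps.difference_update(ready)
    return levels

# Hypothetical package set: 'app' uses 'io' and 'math'; 'io' uses 'math'.
print(levelize({"app": {"io", "math"}, "io": {"math"}, "math": set()}))
# [['math'], ['io'], ['app']]
print(levelize({"a": {"b"}, "b": {"a"}}))  # None (cyclic, not levelizable)
```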

  16. Induction technology optimization code

    International Nuclear Information System (INIS)

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-01-01

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. (Author) 11 refs., 3 figs

  17. VT ZIP Code Areas

    Data.gov (United States)

    Vermont Center for Geographic Information — (Link to Metadata) A ZIP Code Tabulation Area (ZCTA) is a statistical geographic entity that approximates the delivery area for a U.S. Postal Service five-digit...

  18. Bandwidth efficient coding

    CERN Document Server

    Anderson, John B

    2017-01-01

    Bandwidth Efficient Coding addresses the major challenge in communication engineering today: how to communicate more bits of information in the same radio spectrum. Energy and bandwidth are needed to transmit bits, and bandwidth affects capacity the most. Methods have been developed that are ten times as energy efficient at a given bandwidth consumption as simple methods. These employ signals with very complex patterns and are called "coding" solutions. The book begins with classical theory before introducing new techniques that combine older methods of error correction coding and radio transmission in order to create narrowband methods that are as efficient in both spectrum and energy as nature allows. Other topics covered include modulation techniques such as CPM, coded QAM and pulse design.

  19. Reactor lattice codes

    International Nuclear Information System (INIS)

    Kulikowska, T.

    2001-01-01

    The description of reactor lattice codes is carried out using the example of the WIMSD-5B code. The WIMS code in its various versions is the most recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by the NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally the specific algorithm applied in fuel depletion calculations is outlined. (author)

  20. Classifications of track structures

    International Nuclear Information System (INIS)

    Paretzke, H.G.

    1984-01-01

    When ionizing particles interact with matter they produce random topological structures of primary activations which represent the initial boundary conditions for all subsequent physical, chemical and/or biological reactions. There are two important aspects of research on such track structures, namely their experimental or theoretical determination on one hand and the quantitative classification of these complex structures which is a basic pre-requisite for the understanding of mechanisms of radiation actions. This paper deals only with the latter topic, i.e. the problems encountered in and possible approaches to quantitative ordering and grouping of these multidimensional objects by their degrees of similarity with respect to their efficiency in producing certain final radiation effects, i.e. to their ''radiation quality.'' Various attempts of taxonometric classification with respect to radiation efficiency have been made in basic and applied radiation research including macro- and microdosimetric concepts as well as track entities and stopping power based theories. In this paper no review of those well-known approaches is given but rather an outline and discussion of alternative methods new to this field of radiation research which have some very promising features and which could possibly solve at least some major classification problems

  1. Neuromuscular disease classification system

    Science.gov (United States)

    Sáez, Aurora; Acha, Begoña; Montero-Sánchez, Adoración; Rivas, Eloy; Escudero, Luis M.; Serrano, Carmen

    2013-06-01

    Diagnosis of neuromuscular diseases is based on subjective visual assessment of biopsies from patients by the pathologist specialist. A system for objective analysis and classification of muscular dystrophies and neurogenic atrophies through muscle biopsy images of fluorescence microscopy is presented. The procedure starts with an accurate segmentation of the muscle fibers using mathematical morphology and a watershed transform. A feature extraction step is carried out in two parts: 24 features that pathologists take into account to diagnose the diseases and 58 structural features that the human eye cannot see, based on the assumption that the biopsy is considered as a graph, where the nodes are represented by each fiber, and two nodes are connected if two fibers are adjacent. A feature selection using sequential forward selection and sequential backward selection methods, a classification using a Fuzzy ARTMAP neural network, and a study of grading the severity are performed on these two sets of features. A database consisting of 91 images was used: 71 images for the training step and 20 as the test. A classification error of 0% was obtained. It is concluded that the addition of features undetectable by the human visual inspection improves the categorization of atrophic patterns.
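The sequential forward selection step used above can be sketched generically: greedily add the feature that most improves a caller-supplied score (in the paper, classifier performance; here a toy utility function, and the feature names are hypothetical).

```python
def sfs(features, score, k):
    """Greedy sequential forward selection: repeatedly add the feature that
    maximizes score(selected + [feature]) until k features are chosen.
    `score` stands in for cross-validated classifier accuracy."""
    selected = []
    candidates = list(features)
    while candidates and len(selected) < k:
        best = max(candidates, key=lambda f: score(selected + [f]))
        selected.append(best)
        candidates.remove(best)
    return selected

# Toy score: each feature contributes a fixed, known utility.
utility = {"area": 3, "perimeter": 2, "adjacency": 5}
print(sfs(utility, lambda subset: sum(utility[f] for f in subset), 2))
# ['adjacency', 'area']
```

Sequential backward selection, also used in the paper, is the mirror image: start from all features and greedily drop the least useful one.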

  2. An automated cirrus classification

    Science.gov (United States)

    Gryspeerdt, Edward; Quaas, Johannes; Goren, Tom; Klocke, Daniel; Brueck, Matthias

    2018-05-01

    Cirrus clouds play an important role in determining the radiation budget of the earth, but many of their properties remain uncertain, particularly their response to aerosol variations and to warming. Part of the reason for this uncertainty is the dependence of cirrus cloud properties on the cloud formation mechanism, which itself is strongly dependent on the local meteorological conditions. In this work, a classification system (Identification and Classification of Cirrus or IC-CIR) is introduced to identify cirrus clouds by the cloud formation mechanism. Using reanalysis and satellite data, cirrus clouds are separated into four main types: orographic, frontal, convective and synoptic. Through a comparison to convection-permitting model simulations and back-trajectory-based analysis, it is shown that these observation-based regimes can provide extra information on the cloud-scale updraughts and the frequency of occurrence of liquid-origin ice, with the convective regime having higher updraughts and a greater occurrence of liquid-origin ice compared to the synoptic regimes. Despite having different cloud formation mechanisms, the radiative properties of the regimes are not distinct, indicating that retrieved cloud properties alone are insufficient to completely describe them. This classification is designed to be easily implemented in GCMs, helping improve future model-observation comparisons and leading to improved parametrisations of cirrus cloud processes.

  3. Critical Care Coding for Neurologists.

    Science.gov (United States)

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  4. Lattice Index Coding

    OpenAIRE

    Natarajan, Lakshmi; Hong, Yi; Viterbo, Emanuele

    2014-01-01

    The index coding problem involves a sender with K messages to be transmitted across a broadcast channel, and a set of receivers each of which demands a subset of the K messages while having prior knowledge of a different subset as side information. We consider the specific case of noisy index coding where the broadcast channel is Gaussian and every receiver demands all the messages from the source. Instances of this communication problem arise in wireless relay networks, sensor networks, and ...

  5. Towards advanced code simulators

    International Nuclear Information System (INIS)

    Scriven, A.H.

    1990-01-01

    The Central Electricity Generating Board (CEGB) uses advanced thermohydraulic codes extensively to support PWR safety analyses. A system has been developed to allow fully interactive execution of any code with graphical simulation of the operator desk and mimic display. The system operates in a virtual machine environment, with the thermohydraulic code executing in one virtual machine, communicating via interrupts with any number of other virtual machines each running other programs and graphics drivers. The driver code itself does not have to be modified from its normal batch form. Shortly following the release of RELAP5 MOD1 in IBM compatible form in 1983, this code was used as the driver for this system. When RELAP5 MOD2 became available, it was adopted with no changes needed in the basic system. Overall the system has been used for some 5 years for the analysis of LOBI tests, full scale plant studies and for simple what-if studies. For gaining rapid understanding of system dependencies it has proved invaluable. The graphical mimic system, being independent of the driver code, has also been used with other codes to study core rewetting, to replay results obtained from batch jobs on a CRAY2 computer system and to display suitably processed experimental results from the LOBI facility to aid interpretation. For the above work real-time execution was not necessary. Current work now centers on implementing the RELAP 5 code on a true parallel architecture machine. Marconi Simulation have been contracted to investigate the feasibility of using upwards of 100 processors, each capable of a peak of 30 MIPS to run a highly detailed RELAP5 model in real time, complete with specially written 3D core neutronics and balance of plant models. This paper describes the experience of using RELAP5 as an analyzer/simulator, and outlines the proposed methods and problems associated with parallel execution of RELAP5

  6. Cracking the Gender Codes

    DEFF Research Database (Denmark)

    Rennison, Betina Wolfgang

    2016-01-01

    extensive work to raise the proportion of women. This has helped slightly, but women remain underrepresented at the corporate top. Why is this so? What can be done to solve it? This article presents five different types of answers relating to five discursive codes: nature, talent, business, exclusion...... in leadership management, we must become more aware and take advantage of this complexity. We must crack the codes in order to crack the curve....

  7. PEAR code review

    International Nuclear Information System (INIS)

    De Wit, R.; Jamieson, T.; Lord, M.; Lafortune, J.F.

    1997-07-01

    As a necessary component in the continuous improvement and refinement of methodologies employed in the nuclear industry, regulatory agencies need to periodically evaluate these processes to improve confidence in results and ensure appropriate levels of safety are being achieved. The independent and objective review of industry-standard computer codes forms an essential part of this program. To this end, this work undertakes an in-depth review of the computer code PEAR (Public Exposures from Accidental Releases), developed by Atomic Energy of Canada Limited (AECL) to assess accidental releases from CANDU reactors. PEAR is based largely on the models contained in the Canadian Standards Association (CSA) N288.2-M91. This report presents the results of a detailed technical review of the PEAR code to identify any variations from the CSA standard and other supporting documentation, verify the source code, assess the quality of numerical models and results, and identify general strengths and weaknesses of the code. The version of the code employed in this review is the one which AECL intends to use for CANDU 9 safety analyses. (author)

  8. KENO-V code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16-group Hansen-Roach cross sections and P1 scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218-group set is distributed with the code) and has a general P_N scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k_eff, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes.

  9. Code, standard and specifications

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    Radiography, like other techniques, requires standards. These standards are widely used and their methods of application are well established, so radiographic testing may only be performed in accordance with the documented regulations. These regulations and guidelines are documented in codes, standards and specifications. In Malaysia, a level-one or basic radiographer may carry out radiographic work based on instructions given by a level-two or level-three radiographer. These instructions are produced based on the guidelines mentioned in the documents, and the level-two radiographer must follow the specifications in the standard when writing them. This makes it clear that radiography is a type of work in which everything must follow the rules. For codes, radiography follows the code of the American Society of Mechanical Engineers (ASME); the only code available in Malaysia at this time is the rule published by the Atomic Energy Licensing Board (AELB), known as the Practical Code for Radiation Protection in Industrial Radiography. With the existence of this code, all radiography must follow the regulated rules and standards.

  10. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    OpenAIRE

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content ...

  11. Comparabilidad entre la novena y la décima revisión de la Clasificación Internacional de Enfermedades aplicada a la codificación de la causa de muerte en España Comparability between the ninth and tenth revisions of the International Classification of Diseases applied to coding causes of death in Spain

    Directory of Open Access Journals (Sweden)

    M. Ruiz

    2002-12-01

    Objective: To analyze comparability between the ninth and tenth revisions of the International Classification of Diseases (ICD) applied to coding causes of death in Spain. Methods: According to the ninth and tenth revisions of the ICD, 80,084 statistical bulletins of mortality registered in 1999 were assigned the Basic Cause of Death. The statistical bulletins corresponded to the Autonomous Communities of Andalusia, Cantabria, Murcia, Navarre and the Basque Country, and the city of Barcelona. The underlying causes of death were classified into 17 groups. Simple correspondence, the kappa index and the comparability ratio for major causes were calculated. Results: A total of 3.6% of deaths changed group due to an increase (36.4%) in infectious and parasitic diseases, mainly because of the inclusion of AIDS, and a corresponding decrease due to the exclusion of endocrine, nutritional and metabolic disorders. Furthermore, myelodysplastic syndrome was moved to the category of neoplasms. The group including nervous system diseases, eye and related diseases, and ear and mastoid apophysis diseases increased (14.7%) at the expense of mental and behavior disorders, due to the inclusion of senile and presenile organic psychosis. Poorly defined entities increased (14.1%) due to the inclusion of cardiac arrest and its synonyms, together with heart failure, to the detriment of diseases of the vascular system. Diseases of the respiratory system increased (4.8%) due to the inclusion of respiratory failure, previously considered a poorly defined cause. The correspondence for all causes was 96.4% and the kappa index was 94.9%. Conclusions: The introduction of ICD-10 affects the comparability of statistical series of mortality according to cause. The results of this study allow us to identify the main modifications and to quantify the changes in the major causes of mortality in Spain.

  12. Comparison of Danish dichotomous and BI-RADS classifications of mammographic density.

    Science.gov (United States)

    Hodge, Rebecca; Hellmann, Sophie Sell; von Euler-Chelpin, My; Vejborg, Ilse; Andersen, Zorana Jovanovic

    2014-06-01

    In the Copenhagen mammography screening program from 1991 to 2001, mammographic density was classified either as fatty or mixed/dense. This dichotomous mammographic density classification system is unique internationally and has not been validated before. To compare the Danish dichotomous mammographic density classification system from 1991 to 2001 with the BI-RADS density classification, in an attempt to validate the Danish classification system. The study sample consisted of 120 mammograms taken in Copenhagen in 1991-2001, which tested false positive and which were re-assessed in 2012 and classified according to the BI-RADS classification system. We calculated inter-rater agreement between the Danish dichotomous mammographic classification as fatty or mixed/dense and the four-level BI-RADS classification with the linear weighted kappa statistic. Of the 120 women, 32 (26.7%) were classified as having fatty and 88 (73.3%) as having mixed/dense mammographic density according to the Danish dichotomous classification. According to the BI-RADS density classification, 12 (10.0%) women were classified as having predominantly fatty (BI-RADS code 1), 46 (38.3%) as having scattered fibroglandular (BI-RADS code 2), 57 (47.5%) as having heterogeneously dense (BI-RADS code 3), and five (4.2%) as having extremely dense (BI-RADS code 4) mammographic density. The inter-rater agreement assessed by the weighted kappa statistic was substantial (0.75). The dichotomous mammographic density classification system utilized in the early years of Copenhagen's mammographic screening program (1991-2001) agreed well with the BI-RADS density classification system.
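
    The linear weighted kappa statistic used in this record can be sketched in a few lines. The following is a minimal Python illustration of linearly weighted Cohen's kappa for two raters on a shared ordinal scale; the ratings in the example are invented for illustration, not the study's data.

```python
def weighted_kappa(rater1, rater2, categories):
    """Linearly weighted Cohen's kappa for two raters on an ordinal scale."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater1)
    # observed joint distribution of ratings
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        obs[idx[a]][idx[b]] += 1.0 / n
    # marginal distributions of each rater
    p1 = [sum(row) for row in obs]
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    # linear weights: full credit on the diagonal, partial credit nearby
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    po = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)

# perfect agreement yields kappa = 1.0
print(weighted_kappa([1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]))
```

    Values near 1 indicate agreement well beyond chance; the 0.75 reported above falls in the conventional "substantial" band.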

  13. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-09-07

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.
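
    The quantity being maximized above is the mutual information between responses and labels. For discrete responses it can be estimated directly from empirical frequencies, as in this minimal sketch (the authors' actual method models mutual information via entropy estimation over continuous responses; this is only an illustration of the quantity itself):

```python
import math

def empirical_mi(responses, labels):
    """Empirical mutual information (in bits) between two discrete sequences."""
    n = len(responses)
    pxy, px, py = {}, {}, {}
    for x, y in zip(responses, labels):
        pxy[(x, y)] = pxy.get((x, y), 0.0) + 1.0 / n
        px[x] = px.get(x, 0.0) + 1.0 / n
        py[y] = py.get(y, 0.0) + 1.0 / n
    return sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())

# responses that copy balanced binary labels carry exactly 1 bit
print(empirical_mi([0, 1, 0, 1], [0, 1, 0, 1]))
```

    A classifier whose responses are independent of the labels yields zero mutual information, which is why the regularizer pushes the learned responses toward being maximally informative.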

  14. Maximum mutual information regularized classification

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yi; Zhao, Shiguang; Gao, Xin

    2014-01-01

    In this paper, a novel pattern classification approach is proposed by regularizing the classifier learning to maximize mutual information between the classification response and the true class label. We argue that, with the learned classifier, the uncertainty of the true class label of a data sample should be reduced by knowing its classification response as much as possible. The reduced uncertainty is measured by the mutual information between the classification response and the true class label. To this end, when learning a linear classifier, we propose to maximize the mutual information between classification responses and true class labels of training samples, besides minimizing the classification error and reducing the classifier complexity. An objective function is constructed by modeling mutual information with entropy estimation, and it is optimized by a gradient descent method in an iterative algorithm. Experiments on two real-world pattern classification problems show the significant improvements achieved by maximum mutual information regularization.

  15. Environmental Monitoring, Water Quality - MO 2009 Water Quality Standards - Table G Lake Classifications and Use Designations (SHP)

    Data.gov (United States)

    NSGIC State | GIS Inventory — This data set contains Missouri Water Quality Standards (WQS) lake classifications and use designations described in the Missouri Code of State Regulations (CSR), 10...

  16. SPECTRAL AMPLITUDE CODING OCDMA SYSTEMS USING ENHANCED DOUBLE WEIGHT CODE

    Directory of Open Access Journals (Sweden)

    F.N. HASOON

    2006-12-01

    Full Text Available A new code structure for spectral amplitude coding optical code division multiple access systems based on double weight (DW) code families is proposed. The DW code has a fixed weight of two. The enhanced double-weight (EDW) code is a variation of the DW code family that can have a variable weight greater than one. The EDW code possesses ideal cross-correlation properties and exists for every natural number n. Much better performance can be achieved by using the EDW code compared to existing codes such as the Hadamard and Modified Frequency-Hopping (MFH) codes. Both theoretical analysis and simulation show that the EDW code performs much better than the Hadamard and MFH codes.

  17. Some new ternary linear codes

    Directory of Open Access Journals (Sweden)

    Rumen Daskalov

    2017-07-01

    Full Text Available Let an $[n,k,d]_q$ code be a linear code of length $n$, dimension $k$ and minimum Hamming distance $d$ over $GF(q)$. One of the most important problems in coding theory is to construct codes with optimal minimum distances. In this paper 22 new ternary linear codes are presented. Two of them are optimal. All new codes improve the respective lower bounds in [11].
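
    For a short linear code, the minimum distance can be verified by brute force over all message vectors. A minimal sketch of such a check (the generator matrix below is the well-known ternary [4,2,3] tetracode, chosen for illustration, not one of the paper's new codes):

```python
from itertools import product

def min_distance(G, q=3):
    """Minimum Hamming weight over all nonzero codewords of the linear
    code generated by the rows of G over GF(q)."""
    k, n = len(G), len(G[0])
    best = n
    for msg in product(range(q), repeat=k):
        if not any(msg):
            continue  # skip the zero codeword
        word = [sum(m * g for m, g in zip(msg, col)) % q for col in zip(*G)]
        best = min(best, sum(1 for s in word if s))
    return best

# the tetracode is a [4,2,3]_3 code: every nonzero codeword has weight 3
print(min_distance([[1, 0, 1, 1], [0, 1, 1, 2]]))  # -> 3
```

    Such exhaustive checks are only feasible for small k, which is why the lower bounds the paper improves are established by explicit constructions rather than search alone.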

  18. ACE - Manufacturer Identification Code (MID)

    Data.gov (United States)

    Department of Homeland Security — The ACE Manufacturer Identification Code (MID) application is used to track and control identifications codes for manufacturers. A manufacturer is identified on an...

  19. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.

  20. Optical coding theory with Prime

    CERN Document Server

    Kwong, Wing C

    2013-01-01

    Although several books cover the coding theory of wireless communications and the hardware technologies and coding techniques of optical CDMA, no book has been specifically dedicated to optical coding theory-until now. Written by renowned authorities in the field, Optical Coding Theory with Prime gathers together in one volume the fundamentals and developments of optical coding theory, with a focus on families of prime codes, supplemented with several families of non-prime codes. The book also explores potential applications to coding-based optical systems and networks. Learn How to Construct

  1. The Aster code

    International Nuclear Information System (INIS)

    Delbecq, J.M.

    1999-01-01

    The Aster code is a 2D or 3D finite-element calculation code for structures developed by the R and D direction of Electricite de France (EdF). This dossier presents a complete overview of the characteristics and uses of the Aster code: introduction of version 4; the context of Aster (organisation of the code development, versions, systems and interfaces, development tools, quality assurance, independent validation); static mechanics (linear thermo-elasticity, Euler buckling, cables, Zarka-Casier method); non-linear mechanics (materials behaviour, large deformations, specific loads, unloading and loss of load proportionality indicators, global algorithm, contact and friction); rupture mechanics (G energy restitution level, restitution level in thermo-elasto-plasticity, 3D local energy restitution level, KI and KII stress intensity factors, calculation of limit loads for structures), specific treatments (fatigue, rupture, wear, error estimation); meshes and models (mesh generation, modeling, loads and boundary conditions, links between different modeling processes, resolution of linear systems, display of results etc.); vibration mechanics (modal and harmonic analysis, dynamics with shocks, direct transient dynamics, seismic analysis and aleatory dynamics, non-linear dynamics, dynamical sub-structuring); fluid-structure interactions (internal acoustics, mass, rigidity and damping); linear and non-linear thermal analysis; steels and metal industry (structure transformations); coupled problems (internal chaining, internal thermo-hydro-mechanical coupling, chaining with other codes); products and services. (J.S.)

  2. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
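
    The syndrome-based encoding described above can be illustrated with a toy Slepian-Wolf sketch: the encoder transmits only the syndrome of the source word, and the decoder selects the word with that syndrome closest to its correlated side information. The brute-force search below stands in for the paper's sum-product decoding, and the small parity-check matrix is an arbitrary toy choice, not one from the paper.

```python
from itertools import product

H = [[1, 1, 0, 1],
     [0, 1, 1, 1]]  # toy 2x4 parity-check matrix over GF(2)

def syndrome(x):
    """Syndrome H*x over GF(2)."""
    return tuple(sum(h * b for h, b in zip(row, x)) % 2 for row in H)

def decode(s, side_info):
    """Return the length-4 word with syndrome s closest to the side info."""
    best = None
    for cand in product((0, 1), repeat=4):
        if syndrome(cand) == s:
            d = sum(a != b for a, b in zip(cand, side_info))
            if best is None or d < best[0]:
                best = (d, list(cand))
    return best[1]

x = [1, 0, 1, 1]  # source word at the encoder
y = [1, 0, 0, 1]  # correlated side information at the decoder
# 2 syndrome bits replace 4 source bits, yet x is recovered exactly
print(decode(syndrome(x), y))
```

    The compression works because the side information is close to the source in Hamming distance; when the correlation model is unknown or varying, the paper's doping bits and hidden-variable estimation take over this role.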

  3. Headaches of otolaryngological interest: current status while awaiting revision of classification. Practical considerations and expectations.

    Science.gov (United States)

    Farri, A; Enrico, A; Farri, F

    2012-04-01

    In 1988, diagnostic criteria for headaches were drawn up by the International Headache Society (IHS), divided into headaches, cranial neuralgias and facial pain. The 2nd edition of the International Classification of Headache Disorders (ICHD) was produced in 2004, and still provides a dynamic and useful instrument for clinical practice. We have examined the current IHC, which comprises 14 groups. The first four cover primary headaches, with "benign paroxysmal vertigo of childhood" being the form of migraine of interest to otolaryngologists; groups 5 to 12 classify "secondary headaches"; group 11 comprises "headache or facial pain attributed to disorder of cranium, neck, eyes, ears, nose, sinuses, teeth, mouth or other facial or cranial structures"; group 13, consisting of "cranial neuralgias and central causes of facial pain", is also of relevance to otolaryngology. Neither the current classification system nor the original one has a satisfactory collocation for migraine-associated vertigo. Another critical point of the classification concerns cranio-facial pain syndromes such as Sluder's neuralgia, previously included in the 1988 classification among cluster headaches, and now included in the section on "cranial neuralgias and central causes of facial pain", even though Sluder's neuralgia has not been adequately validated. As we have highlighted in our studies, there are considerable similarities between Sluder's syndrome and cluster headaches. The main features distinguishing the two are the trend to cluster over time, found only in cluster headaches, and the distribution of pain, with greater nasal manifestations in the case of Sluder's syndrome. We believe that it is better and clearer, particularly on the basis of our clinical experience and published studies, to include this nosological entity, which is clearly distinct from an otolaryngological point of view, as a variant of cluster headache. We agree with experts in the field of headaches, such as

  4. Speech coding code- excited linear prediction

    CERN Document Server

    Bäckström, Tom

    2017-01-01

    This book provides a scientific understanding of the most central techniques used in speech coding, both for advanced students and for professionals with a background in speech, audio and/or digital signal processing. It provides a clear connection between the whys, hows and whats, thus enabling a clear view of the necessity, purpose and solutions provided by various tools, as well as their strengths and weaknesses in each respect. Equivalently, this book sheds light on the following perspectives for each technology presented. Objective: What do we want to achieve, and especially why is this goal important? Resource/Information: What information is available, and how can it be useful? Resource/Platform: What kind of platforms are we working with, and what are their capabilities and restrictions? This includes computational, memory and acoustic properties, and the transmission capacity of the devices used. The book goes on to address Solutions: Which solutions have been proposed, and how can they be used to reach the stated goals, and ...

  5. Classification of IRAS asteroids

    International Nuclear Information System (INIS)

    Tedesco, E.F.; Matson, D.L.; Veeder, G.J.

    1989-01-01

    Albedos and spectral reflectances are essential for classifying asteroids. For example, classes E, M and P are indistinguishable without albedo data. Colorimetric data are available for about 1000 asteroids but, prior to IRAS, albedo data were available for only about 200. IRAS broke this bottleneck by providing albedo data on nearly 2000 asteroids. Hence, excepting absolute magnitudes, albedo and size are now the most commonly known asteroid physical parameters. In this chapter the authors present the results of analyses of IRAS-derived asteroid albedos, discuss their application to asteroid classification, and mention several studies which might be done to exploit this data set further

  6. SPORT FOOD ADDITIVE CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    I. P. Prokopenko

    2015-01-01

    Full Text Available Correctly organized nutritional and pharmacological support is an important component of an athlete's preparation for competitions, maintenance of optimal shape, and fast recovery and rehabilitation after injuries and fatigue. Special products of enhanced biological value (BAS) for athletes' nutrition are used for this purpose. They supply the athlete's organism with readily usable energy sources, building materials and biologically active substances which regulate and activate the metabolic reactions that proceed with difficulty during certain physical training. The article presents a classification of sport supplements which can be used before warm-up and training, after training and in breaks between competitions.

  7. Radioactive facilities classification criteria

    International Nuclear Information System (INIS)

    Briso C, H.A.; Riesle W, J.

    1992-01-01

    Appropriate classification of radioactive facilities into groups of comparable risk constitutes one of the problems faced by most Regulatory Bodies. Regarding the radiological risk, the main facts to be considered are the radioactive inventory and the processes to which these radionuclides are subjected. Normally, operations are ruled by strict safety procedures. Thus, the total activity of the radionuclides existing in a given facility is the varying feature that defines its risk. In order to rely on a quantitative criterion and, considering that the Annual Limits of Intake are widely accepted references, an index based on these limits, to support decisions related to radioactive facilities, is proposed. (author)
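
    An index of the kind proposed can be sketched as the sum of each radionuclide's activity expressed in multiples of its Annual Limit of Intake. The nuclides, activities and ALI values below are hypothetical placeholders for illustration, not regulatory figures.

```python
# hypothetical facility inventory: radionuclide -> (activity in Bq, ALI in Bq)
inventory = {
    "Co-60": (5.0e9, 2.0e6),
    "Cs-137": (1.0e8, 6.0e6),
}

def hazard_index(inv):
    """Total inventory expressed in multiples of the Annual Limits of Intake."""
    return sum(activity / ali for activity, ali in inv.values())

# a facility could then be placed in a risk group by comparing the index
# against regulator-defined thresholds
print(round(hazard_index(inventory), 1))
```

    Because the ALI already folds in each nuclide's radiotoxicity, a single dimensionless number of this form lets facilities with very different inventories be compared on a common risk scale.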

  8. Catalogue and classification of technical safety standards, rules and regulations for nuclear power reactors and nuclear fuel cycle facilities

    International Nuclear Information System (INIS)

    Fichtner, N.; Becker, K.; Bashir, M.

    1977-01-01

    The present report is an updated version of the report 'Catalogue and Classification of Technical Safety Rules for Light-water Reactors and Reprocessing Plants' edited under code No EUR 5362e, August 1975. Like the first version of the report, it constitutes a catalogue and classification of standards, rules and regulations on land-based nuclear power reactors and fuel cycle facilities. The reasons for the classification system used are given and discussed

  9. Spatially coded backscatter radiography

    International Nuclear Information System (INIS)

    Thangavelu, S.; Hussein, E.M.A.

    2007-01-01

    Conventional radiography requires access to two opposite sides of an object, which makes it unsuitable for the inspection of extended and/or thick structures (airframes, bridges, floors etc.). Backscatter imaging can overcome this problem, but the indications obtained are difficult to interpret. This paper applies the coded aperture technique to gamma-ray backscatter-radiography in order to enhance the detectability of flaws. This spatial coding method involves the positioning of a mask with closed and open holes to selectively permit or block the passage of radiation. The obtained coded-aperture indications are then mathematically decoded to detect the presence of anomalies. Indications obtained from Monte Carlo calculations were utilized in this work to simulate radiation scattering measurements. These simulated measurements were used to investigate the applicability of this technique to the detection of flaws by backscatter radiography
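
    The encode/decode cycle described above can be sketched in one dimension with a cyclic mask whose open/closed pattern has flat correlation sidelobes. The 7-element m-sequence mask and the toy "object" below are illustrative choices, not the paper's detector geometry.

```python
N = 7
mask = [1, 1, 1, 0, 1, 0, 0]         # open (1) and closed (0) mask holes
decoder = [2 * m - 1 for m in mask]  # matching +/-1 decoding pattern

def encode(obj):
    """Each detector cell sums the object seen through the open holes."""
    return [sum(obj[j] * mask[(i + j) % N] for j in range(N)) for i in range(N)]

def reconstruct(meas):
    """Correlating with the +/-1 pattern cancels the sidelobes exactly."""
    return [sum(meas[i] * decoder[(i + k) % N] for i in range(N)) / 4
            for k in range(N)]

obj = [0, 3, 0, 0, 5, 0, 1]  # toy distribution of scattering anomalies
print(reconstruct(encode(obj)))  # recovers obj exactly
```

    The m-sequence mask is what makes the mathematical decoding step well conditioned: its periodic autocorrelation is two-valued, so the correlation with the +/-1 pattern is a perfect delta function up to scale.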

  10. Aztheca Code; Codigo Aztheca

    Energy Technology Data Exchange (ETDEWEB)

    Quezada G, S.; Espinosa P, G. [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Ciudad de Mexico (Mexico); Centeno P, J.; Sanchez M, H., E-mail: sequga@gmail.com [UNAM, Facultad de Ingenieria, Ciudad Universitaria, Circuito Exterior s/n, 04510 Ciudad de Mexico (Mexico)

    2017-09-15

    This paper presents the Aztheca code, which is formed by the mathematical models of neutron kinetics, power generation, heat transfer, core thermo-hydraulics, recirculation systems, dynamic pressure and level models, and the control system. The Aztheca code is validated with plant data, as well as with predictions from the manufacturer, when the reactor operates in a stationary state. On the other hand, to demonstrate that the model is applicable during a transient, an event that occurred in a nuclear power plant with a BWR reactor is selected. The plant data are compared with the results obtained with RELAP-5 and the Aztheca model. The results show that both RELAP-5 and the Aztheca code have the ability to adequately predict the behavior of the reactor. (Author)

  11. The Coding Question.

    Science.gov (United States)

    Gallistel, C R

    2017-07-01

    Recent electrophysiological results imply that the duration of the stimulus onset asynchrony in eyeblink conditioning is encoded by a mechanism intrinsic to the cerebellar Purkinje cell. This raises the general question - how is quantitative information (durations, distances, rates, probabilities, amounts, etc.) transmitted by spike trains and encoded into engrams? The usual assumption is that information is transmitted by firing rates. However, rate codes are energetically inefficient and computationally awkward. A combinatorial code is more plausible. If the engram consists of altered synaptic conductances (the usual assumption), then we must ask how numbers may be written to synapses. It is much easier to formulate a coding hypothesis if the engram is realized by a cell-intrinsic molecular mechanism. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Revised SRAC code system

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Ishiguro, Yukio; Kaneko, Kunio; Ido, Masaru.

    1986-09-01

    Since the publication of JAERI-1285 in 1983 for the preliminary version of the SRAC code system, a number of additions and modifications to the functions have been made to establish an overall neutronics code system. Major points are (1) addition of JENDL-2 version of data library, (2) a direct treatment of doubly heterogeneous effect on resonance absorption, (3) a generalized Dancoff factor, (4) a cell calculation based on the fixed boundary source problem, (5) the corresponding edit required for experimental analysis and reactor design, (6) a perturbation theory calculation for reactivity change, (7) an auxiliary code for core burnup and fuel management, etc. This report is a revision of the users manual which consists of the general description, input data requirements and their explanation, detailed information on usage, mathematics, contents of libraries and sample I/O. (author)

  13. Code query by example

    Science.gov (United States)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  14. The correspondence between projective codes and 2-weight codes

    NARCIS (Netherlands)

    Brouwer, A.E.; Eupen, van M.J.M.; Tilborg, van H.C.A.; Willems, F.M.J.

    1994-01-01

    The hyperplanes intersecting a 2-weight code in the same number of points obviously form the point set of a projective code. On the other hand, if we have a projective code C, then we can make a 2-weight code by taking the multiset of points ∈ PC with multiplicity γ(w), where w is the weight of

  15. Visualizing code and coverage changes for code review

    NARCIS (Netherlands)

    Oosterwaal, Sebastiaan; van Deursen, A.; De Souza Coelho, R.; Sawant, A.A.; Bacchelli, A.

    2016-01-01

    One of the tasks of reviewers is to verify that code modifications are well tested. However, current tools offer little support in understanding precisely how changes to the code relate to changes to the tests. In particular, it is hard to see whether (modified) test code covers the changed code.

  16. Turbo-Gallager Codes: The Emergence of an Intelligent Coding ...

    African Journals Online (AJOL)

    Today, both turbo codes and low-density parity-check codes are largely superior to other code families and are being used in an increasing number of modern communication systems including 3G standards, satellite and deep space communications. However, the two codes have certain distinctive characteristics that ...

  17. Code of Medical Ethics

    Directory of Open Access Journals (Sweden)

    . SZD-SZZ

    2017-03-01

    Full Text Available The Code was approved on December 12, 1992, at the 3rd regular meeting of the General Assembly of the Medical Chamber of Slovenia and revised on April 24, 1997, at the 27th regular meeting of the General Assembly of the Medical Chamber of Slovenia. The Code was updated and harmonized with the Medical Association of Slovenia and approved on October 6, 2016, at the regular meeting of the General Assembly of the Medical Chamber of Slovenia.

  18. Supervised Convolutional Sparse Coding

    KAUST Repository

    Affara, Lama Ahmed

    2018-04-08

    Convolutional Sparse Coding (CSC) is a well-established image representation model especially suited for image restoration tasks. In this work, we extend the applicability of this model by proposing a supervised approach to convolutional sparse coding, which aims at learning discriminative dictionaries instead of purely reconstructive ones. We incorporate a supervised regularization term into the traditional unsupervised CSC objective to encourage the final dictionary elements to be discriminative. Experimental results show that using supervised convolutional learning results in two key advantages. First, we learn more semantically relevant filters in the dictionary and second, we achieve improved image reconstruction on unseen data.

  19. CONCEPT computer code

    International Nuclear Information System (INIS)

    Delene, J.

    1984-01-01

    CONCEPT is a computer code that will provide conceptual capital investment cost estimates for nuclear and coal-fired power plants. The code can develop an estimate for construction at any point in time. Any unit size within the range of about 400 to 1300 MW electric may be selected. Any of 23 reference site locations across the United States and Canada may be selected. PWR, BWR, and coal-fired plants burning high-sulfur and low-sulfur coal can be estimated. Multiple-unit plants can be estimated. Costs due to escalation/inflation and interest during construction are calculated

  20. Principles of speech coding

    CERN Document Server

    Ogunfunmi, Tokunbo

    2010-01-01

    It is becoming increasingly apparent that all forms of communication, including voice, will be transmitted through packet-switched networks based on the Internet Protocol (IP). Therefore, the design of modern devices that rely on speech interfaces, such as cell phones and PDAs, requires a complete and up-to-date understanding of the basics of speech coding. It outlines key signal processing algorithms used to mitigate impairments to speech quality in VoIP networks. Offering a detailed yet easily accessible introduction to the field, Principles of Speech Coding provides an in-depth examination of the

  1. Coding the Complexity of Activity in Video Recordings

    DEFF Research Database (Denmark)

    Harter, Christopher Daniel; Otrel-Cass, Kathrin

    2017-01-01

    This paper presents a theoretical approach to coding and analyzing video data on human interaction and activity, using principles found in cultural historical activity theory. The systematic classification or coding of information contained in video data on activity can be arduous and time-consuming. ... Bødker's in 1996, three possible areas of expansion to Susanne Bødker's method for analyzing video data were found. Firstly, a technological expansion, due to contemporary developments in sophisticated analysis software since the mid-1990s. Secondly, a conceptual expansion, in which the applicability of using Activity Theory outside the context of human–computer interaction is assessed. Lastly, a temporal expansion, facilitating an organized method for tracking the development of activities over time within the coding and analysis of video data. To expand on the above areas, a prototype coding...

  2. Evaluation Codes from an Affine Variety Code Perspective

    DEFF Research Database (Denmark)

    Geil, Hans Olav

    2008-01-01

    Evaluation codes (also called order domain codes) are traditionally introduced as generalized one-point geometric Goppa codes. In the present paper we give a new point of view on evaluation codes by introducing them instead as particularly nice examples of affine variety codes. Our study includes a reformulation of the usual methods for estimating the minimum distances of evaluation codes into the setting of affine variety codes. Finally we describe the connection to the theory of one-point geometric Goppa codes.

  3. Changing patient classification system for hospital reimbursement in Romania.

    Science.gov (United States)

    Radu, Ciprian-Paul; Chiriac, Delia Nona; Vladescu, Cristian

    2010-06-01

    To evaluate the effects of the change in the diagnosis-related group (DRG) system on patient morbidity and hospital financial performance in the Romanian public health care system. Three variables were assessed before and after the classification switch in July 2007: clinical outcomes, the case mix index, and hospital budgets, using the database of the National School of Public Health and Health Services Management, which contains data regularly received from hospitals reimbursed through the Romanian DRG scheme (291 in 2009). The lack of a Romanian system for the calculation of cost-weights imposed the necessity to use an imported system, which was criticized by some clinicians for not accurately reflecting resource consumption in Romanian hospitals. The new DRG classification system allowed a more accurate clinical classification. However, it also exposed a lack of physicians' knowledge on diagnosing and coding procedures, which led to incorrect coding. Consequently, the reported hospital morbidity changed after the DRG switch, reflecting an increase in the national case-mix index of 25% in 2009 (compared with 2007). Since hospitals received the same reimbursement over the first two years after the classification switch, the new DRG system led them sometimes to change patients' diagnoses in order to receive more funding. Lack of oversight of hospital coding and reporting to the national reimbursement scheme allowed the increase in the case-mix index. The complexity of the new classification system requires more resources (human and financial), better monitoring and evaluation, and improved legislation in order to achieve better hospital resource allocation and more efficient patient care.
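    The case-mix index mentioned above is conventionally the average DRG cost-weight per discharged case, so a shift toward higher-weight codes raises it even when the patients are the same. A toy computation (the weights are hypothetical, not Romanian DRG data) shows how coding changes alone can produce a 25% rise:

```python
# Illustrative only: the cost weights below are hypothetical.
def case_mix_index(cost_weights):
    """Mean DRG cost-weight over all discharged cases."""
    return sum(cost_weights) / len(cost_weights)

before = [0.8, 1.0, 1.2, 1.0]    # assumed case weights, pre-switch
after = [1.0, 1.25, 1.5, 1.25]   # same cases, coded under the new system

cmi_before = case_mix_index(before)
cmi_after = case_mix_index(after)
increase = (cmi_after - cmi_before) / cmi_before  # a 25% rise
```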

  4. Supply chain planning classification

    Science.gov (United States)

    Hvolby, Hans-Henrik; Trienekens, Jacques; Bonde, Hans

    2001-10-01

    Industry experiences a need to shift focus from internal production planning towards planning in the supply network. In this respect, customer-oriented thinking has become almost a common good amongst companies in the supply network. An increase in the use of information technology is needed to enable companies to better tune their production planning with customers and suppliers. Information technology opportunities and supply chain planning systems enable companies to monitor and control their supplier network. In spite of these developments, most links in today's supply chains make individual plans, because real demand information is not available throughout the chain. The current systems and processes of the supply chains are not designed to meet the requirements now placed upon them. For long-term relationships with suppliers and customers, an integrated decision-making process is needed in order to obtain a satisfactory result for all parties, especially when customized production and short lead times are in focus. An effective value chain makes inventory available and visible among the value chain members, minimizes response time and optimizes the total inventory value held throughout the chain. In this paper a supply chain planning classification grid is presented, based on current manufacturing classifications and supply chain planning initiatives.

  5. Waste classification sampling plan

    International Nuclear Information System (INIS)

    Landsman, S.D.

    1998-01-01

    The purpose of this sampling plan is to explain the method used to collect and analyze data necessary to verify and/or determine the radionuclide content of the B-Cell decontamination and decommissioning waste stream so that the correct waste classification for the waste stream can be made, and to collect samples for studies of decontamination methods that could be used to remove fixed contamination present on the waste. The scope of this plan is to establish the technical basis for collecting samples and compiling quantitative data on the radioactive constituents present in waste generated during deactivation activities in B-Cell. Sampling and radioisotopic analysis will be performed on the fixed layers of contamination present on structural material and internal surfaces of process piping and tanks. In addition, dose rate measurements on existing waste material will be performed to determine the fraction of dose rate attributable to both removable and fixed contamination. Samples will also be collected to support studies of decontamination methods that are effective in removing the fixed contamination present on the waste. Sampling performed under this plan will meet criteria established in BNF-2596, Data Quality Objectives for the B-Cell Waste Stream Classification Sampling, J. M. Barnett, May 1998.

  6. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases the accuracy at the same time. The test example is classified using a simpler and smaller model. The training examples in a particular cluster share a common vocabulary. At the time of clustering, we do not take into account the labels of the training examples. After the clusters have been created, the classifier is trained on each cluster, having reduced dimensionality and fewer examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups...
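    A minimal sketch of the idea, with toy data and hypothetical helper names (a real system would use a proper clustering algorithm and feature weighting): training texts are grouped by shared vocabulary without looking at their labels, and a test text is then classified using only its nearest cluster's smaller set of examples.

```python
from collections import Counter, defaultdict

def jaccard(a, b):
    """Vocabulary overlap between two word collections."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_by_seed(docs, seeds):
    """Assign each (words, label) doc to its most similar seed vocabulary,
    ignoring labels, as in the unsupervised clustering step."""
    clusters = defaultdict(list)
    for words, label in docs:
        best = max(range(len(seeds)), key=lambda i: jaccard(words, seeds[i]))
        clusters[best].append((words, label))
    return clusters

def classify(test_words, clusters, seeds):
    """Pick the nearest cluster, then vote only among that cluster's docs."""
    c = max(range(len(seeds)), key=lambda i: jaccard(test_words, seeds[i]))
    votes = Counter(label for words, label in clusters[c]
                    if jaccard(test_words, words) > 0)
    return votes.most_common(1)[0][0]

docs = [
    (["free", "offer", "click"], "spam"),
    (["free", "prize", "click"], "spam"),
    (["meeting", "agenda", "notes"], "ham"),
    (["agenda", "minutes", "notes"], "ham"),
]
seeds = [["free", "click"], ["agenda", "notes"]]
clusters = cluster_by_seed(docs, seeds)
```

    The per-cluster vote only ever touches the examples in one cluster, which is what makes the effective model "simpler and smaller" at test time.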

  7. Classification of smooth Fano polytopes

    DEFF Research Database (Denmark)

    Øbro, Mikkel

    A simplicial lattice polytope containing the origin in the interior is called a smooth Fano polytope if the vertices of every facet form a basis of the lattice. The study of smooth Fano polytopes is motivated by their connection to toric varieties. The thesis concerns the classification of smooth Fano polytopes up to isomorphism. A smooth Fano -polytope can have at most vertices. In the case of vertices an explicit classification is known. The thesis contains the classification in the case of vertices. Classifications of smooth Fano -polytopes for fixed exist only for . In the thesis an algorithm for the classification of smooth Fano -polytopes for any given is presented. The algorithm has been implemented and used to obtain the complete classification for ....

  8. Small-scale classification schemes

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2004-01-01

    Small-scale classification schemes are used extensively in the coordination of cooperative work. This study investigates the creation and use of a classification scheme for handling the system requirements during the redevelopment of a nation-wide information system. This requirements classification inherited a lot of its structure from the existing system and rendered requirements that transcended the framework laid out by the existing system almost invisible. As a result, the requirements classification became a defining element of the requirements-engineering process, though its main effects remained largely implicit. The requirements classification contributed to constraining the requirements-engineering process by supporting the software engineers in maintaining some level of control over the process. This way, the requirements classification provided the software engineers...

  9. Electronic structure classifications using scanning tunneling microscopy conductance imaging

    International Nuclear Information System (INIS)

    Horn, K.M.; Swartzentruber, B.S.; Osbourn, G.C.; Bouchard, A.; Bartholomew, J.W.

    1998-01-01

    The electronic structure of atomic surfaces is imaged by applying multivariate image classification techniques to multibias conductance data measured using scanning tunneling microscopy. Image pixels are grouped into classes according to shared conductance characteristics. The image pixels, when color coded by class, produce an image that chemically distinguishes surface electronic features over the entire area of a multibias conductance image. Such "classed" images reveal surface features not always evident in a topograph. This article describes the experimental technique used to record multibias conductance images, how image pixels are grouped in a mathematical classification space, how a computed grouping algorithm can be employed to group pixels with similar conductance characteristics in any number of dimensions, and finally how the quality of the resulting classed images can be evaluated using a computed, combinatorial analysis of the full dimensional space in which the classification is performed. copyright 1998 American Institute of Physics

  10. Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study.

    Science.gov (United States)

    Weems, Shelley; Heller, Pamela; Fenton, Susan H

    2015-01-01

    The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: 1. Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. 2. Coder training and type of record (inpatient versus outpatient) affect coding productivity. 3. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity.

  11. Active Learning for Text Classification

    OpenAIRE

    Hu, Rong

    2011-01-01

    Text classification approaches are used extensively to solve real-world challenges. The success or failure of text classification systems hangs on the datasets used to train them, without a good dataset it is impossible to build a quality system. This thesis examines the applicability of active learning in text classification for the rapid and economical creation of labelled training data. Four main contributions are made in this thesis. First, we present two novel selection strategies to cho...

  12. Unsupervised Classification Using Immune Algorithm

    OpenAIRE

    Al-Muallim, M. T.; El-Kouatly, R.

    2012-01-01

    An unsupervised classification algorithm based on the clonal selection principle, named Unsupervised Clonal Selection Classification (UCSC), is proposed in this paper. The new proposed algorithm is data driven and self-adaptive: it adjusts its parameters to the data to make the classification operation as fast as possible. The performance of UCSC is evaluated by comparing it with the well-known K-means algorithm using several artificial and real-life data sets. The experiments show that the proposed U...

  13. Reliability of Oronasal Fistula Classification.

    Science.gov (United States)

    Sitzman, Thomas J; Allori, Alexander C; Matic, Damir B; Beals, Stephen P; Fisher, David M; Samson, Thomas D; Marcus, Jeffrey R; Tse, Raymond W

    2018-01-01

    Objective: Oronasal fistula is an important complication of cleft palate repair that is frequently used to evaluate surgical quality, yet the reliability of fistula classification has never been examined. The objective of this study was to determine the reliability of oronasal fistula classification both within individual surgeons and between multiple surgeons. Design: Using intraoral photographs of children with repaired cleft palate, surgeons rated the location of palatal fistulae using the Pittsburgh Fistula Classification System. Intrarater and interrater reliability scores were calculated for each region of the palate. Participants: Eight cleft surgeons rated photographs obtained from 29 children. Results: Within individual surgeons, reliability for each region of the Pittsburgh classification ranged from moderate to almost perfect (κ = .60-.96). By contrast, reliability between surgeons was lower, ranging from fair to substantial (κ = .23-.70). Between-surgeon reliability was lowest for the junction of the soft and hard palates (κ = .23). Within-surgeon and between-surgeon reliability were almost perfect for the more general classification of fistula in the secondary palate (κ = .95 and κ = .83, respectively). Conclusions: This is the first reliability study of fistula classification. We show that the Pittsburgh Fistula Classification System is reliable when used by an individual surgeon, but less reliable when used among multiple surgeons. Comparisons of fistula occurrence among surgeons may be subject to less bias if they use the more general classification of "presence or absence of fistula of the secondary palate" rather than the Pittsburgh Fistula Classification System.
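    The κ values reported above are Cohen's kappa, agreement between two raters corrected for chance. A small worked example (the ratings are hypothetical, not the study's data):

```python
# Worked example of Cohen's kappa (the κ statistic reported above);
# the ratings below are hypothetical, not the study's data.
def cohens_kappa(r1, r2):
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    cats = set(r1) | set(r2)
    # Chance agreement from each rater's marginal category frequencies.
    expected = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (observed - expected) / (1 - expected)

rater_a = ["fistula", "none", "fistula", "none", "none", "fistula"]
rater_b = ["fistula", "none", "none", "none", "none", "fistula"]
kappa = cohens_kappa(rater_a, rater_b)  # 5/6 observed vs 1/2 chance agreement
```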

  14. Sparse Representation Based Binary Hypothesis Model for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Yidong Tang

    2016-01-01

    Full Text Available The sparse representation based classifier (SRC and its kernel version (KSRC have been employed for hyperspectral image (HSI classification. However, the state-of-the-art SRC often aims at extended surface objects with linear mixture in smooth scene and assumes that the number of classes is given. Considering the small target with complex background, a sparse representation based binary hypothesis (SRBBH model is established in this paper. In this model, a query pixel is represented in two ways, which are, respectively, by background dictionary and by union dictionary. The background dictionary is composed of samples selected from the local dual concentric window centered at the query pixel. Thus, for each pixel the classification issue becomes an adaptive multiclass classification problem, where only the number of desired classes is required. Furthermore, the kernel method is employed to improve the interclass separability. In kernel space, the coding vector is obtained by using kernel-based orthogonal matching pursuit (KOMP algorithm. Then the query pixel can be labeled by the characteristics of the coding vectors. Instead of directly using the reconstruction residuals, the different impacts the background dictionary and union dictionary have on reconstruction are used for validation and classification. It enhances the discrimination and hence improves the performance.
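    The coding step can be illustrated with plain orthogonal matching pursuit (OMP); the KOMP algorithm used in the paper replaces the inner products below with kernel evaluations, so this is a simplified, non-kernel sketch.

```python
import numpy as np

# Simplified, non-kernel sketch of orthogonal matching pursuit (OMP).
# KOMP, as used in the paper, replaces the correlations D.T @ residual
# with kernel evaluations; that machinery is omitted here.
def omp(D, y, k):
    """Greedy k-sparse coding of y over a dictionary D with unit-norm atoms."""
    residual = y.astype(float)
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))   # most correlated atom
        if j not in support:
            support.append(j)
        # Re-fit all selected atoms jointly (the "orthogonal" step).
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x, residual

D = np.eye(4)                      # trivial dictionary for the demo
y = np.array([0.0, 3.0, 0.0, 1.0])
x, residual = omp(D, y, 2)
```

    In the SRBBH setting, the same routine would be run twice per pixel, once against the background dictionary and once against the union dictionary, and the two reconstructions compared.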

  15. Dual Coding in Children.

    Science.gov (United States)

    Burton, John K.; Wildman, Terry M.

    The purpose of this study was to test the applicability of the dual coding hypothesis to children's recall performance. The hypothesis predicts that visual interference will have a small effect on the recall of visually presented words or pictures, but that acoustic interference will cause a decline in recall of visually presented words and…

  16. Physical layer network coding

    DEFF Research Database (Denmark)

    Fukui, Hironori; Popovski, Petar; Yomo, Hiroyuki

    2014-01-01

    Physical layer network coding (PLNC) has been proposed to improve throughput of the two-way relay channel, where two nodes communicate with each other, being assisted by a relay node. Most of the works related to PLNC are focused on a simple three-node model and they do not take into account...

  17. Radioactive action code

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    A new coding system, 'Hazrad', for buildings and transportation containers for alerting emergency services personnel to the presence of radioactive materials has been developed in the United Kingdom. The hazards of materials in the buildings or transport container, together with the recommended emergency action, are represented by a number of codes which are marked on the building or container and interpreted from a chart carried as a pocket-size guide. Buildings would be marked with the familiar yellow 'radioactive' trefoil, the written information 'Radioactive materials' and a list of isotopes. Under this the 'Hazrad' code would be written - three symbols to denote the relative radioactive risk (low, medium or high), the biological risk (also low, medium or high) and the third showing the type of radiation emitted, alpha, beta or gamma. The response cards indicate appropriate measures to take, e.g. for a high biological risk, Bio3, the wearing of a gas-tight protection suit is advised. The code and its uses are explained. (U.K.)

  18. Building Codes and Regulations.

    Science.gov (United States)

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  19. Physics of codes

    International Nuclear Information System (INIS)

    Cooper, R.K.; Jones, M.E.

    1989-01-01

    The title given this paper is a bit presumptuous, since one can hardly expect to cover the physics incorporated into all the codes already written and currently being written. The authors focus on those codes which have been found to be particularly useful in the analysis and design of linacs. At that, the authors will be a bit parochial and discuss primarily those codes used for the design of radio-frequency (rf) linacs, although the discussions of TRANSPORT and MARYLIE have little to do with the time structures of the beams being analyzed. The plan of this paper is first to describe rather simply the concepts of emittance and brightness, then to describe rather briefly each of the codes TRANSPORT, PARMTEQ, TBCI, MARYLIE, and ISIS, indicating what physics is and is not included in each of them. It is expected that the vast majority of what is covered will apply equally well to protons and electrons (and other particles). This material is intended to be tutorial in nature and can in no way be expected to be exhaustive. 31 references, 4 figures

  20. Reliability and code level

    NARCIS (Netherlands)

    Kasperski, M.; Geurts, C.P.W.

    2005-01-01

    The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the

  1. Ready, steady… Code!

    CERN Multimedia

    Anaïs Schaeffer

    2013-01-01

    This summer, CERN took part in the Google Summer of Code programme for the third year in succession. Open to students from all over the world, this programme leads to very successful collaborations for open source software projects.   Image: GSoC 2013. Google Summer of Code (GSoC) is a global programme that offers student developers grants to write code for open-source software projects. Since its creation in 2005, the programme has brought together some 6,000 students from over 100 countries worldwide. The students selected by Google are paired with a mentor from one of the participating projects, which can be led by institutes, organisations, companies, etc. This year, CERN PH Department’s SFT (Software Development for Experiments) Group took part in the GSoC programme for the third time, submitting 15 open-source projects. “Once published on the Google Summer for Code website (in April), the projects are open to applications,” says Jakob Blomer, one of the o...

  2. CERN Code of Conduct

    CERN Document Server

    Department, HR

    2010-01-01

    The Code is intended as a guide in helping us, as CERN contributors, to understand how to conduct ourselves, treat others and expect to be treated. It is based around the five core values of the Organization. We should all become familiar with it and try to incorporate it into our daily life at CERN.

  3. Nuclear safety code study

    Energy Technology Data Exchange (ETDEWEB)

    Hu, H.H.; Ford, D.; Le, H.; Park, S.; Cooke, K.L.; Bleakney, T.; Spanier, J.; Wilburn, N.P.; O' Reilly, B.; Carmichael, B.

    1981-01-01

    The objective is to analyze an overpower accident in an LMFBR. A simplified model of the primary coolant loop was developed in order to understand the instabilities encountered with the MELT III and SAS codes. The computer programs were translated for switching to the IBM 4331. Numerical methods were investigated for solving the neutron kinetics equations; the Adams and Gear methods were compared. (DLC)

  4. Revised C++ coding conventions

    CERN Document Server

    Callot, O

    2001-01-01

    This document replaces the note LHCb 98-049 by Pavel Binko. After a few years of practice, some simplification and clarification of the rules was needed. As many more people have now some experience in writing C++ code, their opinion was also taken into account to get a commonly agreed set of conventions

  5. Corporate governance through codes

    NARCIS (Netherlands)

    Haxhi, I.; Aguilera, R.V.; Vodosek, M.; den Hartog, D.; McNett, J.M.

    2014-01-01

    The UK's 1992 Cadbury Report defines corporate governance (CG) as the system by which businesses are directed and controlled. CG codes are a set of best practices designed to address deficiencies in the formal contracts and institutions by suggesting prescriptions on the preferred role and

  6. Error Correcting Codes -34 ...

    Indian Academy of Sciences (India)

    information and coding theory. A large scale relay computer had failed to deliver the expected results due to a hardware fault. Hamming, one of the active proponents of computer usage, was determined to find an efficient means by which computers could detect and correct their own faults. A mathematician by training...
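    The fault-detection-and-correction scheme Hamming arrived at can be sketched with the classic Hamming(7,4) code, which corrects any single bit flip. The matrices below use the standard systematic construction, G = [I | P] and H = [Pᵀ | I]:

```python
import numpy as np

# Hamming(7,4) in systematic form: G = [I | P], H = [P^T | I].
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg):
    """Map a 4-bit message to its 7-bit codeword."""
    return (np.array(msg) @ G) % 2

def correct(word):
    """Fix at most one flipped bit using the syndrome."""
    syndrome = (H @ word) % 2
    if syndrome.any():
        # A single flip in bit i yields a syndrome equal to column i of H.
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                word = word.copy()
                word[i] ^= 1
                break
    return word

msg = [1, 0, 1, 1]
codeword = encode(msg)
noisy = codeword.copy()
noisy[2] ^= 1                      # inject a single bit error
recovered = correct(noisy)
```

    A zero syndrome means no (detectable) fault occurred; a nonzero syndrome points directly at the faulty bit, which is exactly the self-checking behavior Hamming wanted from the relay computer.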

  7. Broadcast Coded Slotted ALOHA

    DEFF Research Database (Denmark)

    Ivanov, Mikhail; Brännström, Frederik; Graell i Amat, Alexandre

    2016-01-01

    We propose an uncoordinated medium access control (MAC) protocol, called all-to-all broadcast coded slotted ALOHA (B-CSA) for reliable all-to-all broadcast with strict latency constraints. In B-CSA, each user acts as both transmitter and receiver in a half-duplex mode. The half-duplex mode gives ...

  8. Software Defined Coded Networking

    DEFF Research Database (Denmark)

    Di Paola, Carla; Roetter, Daniel Enrique Lucani; Palazzo, Sergio

    2017-01-01

    the quality of each link and even across neighbouring links and using simulations to show that an additional reduction of packet transmission in the order of 40% is possible. Second, to advocate for the use of network coding (NC) jointly with software defined networking (SDN) providing an implementation...

  9. New code of conduct

    CERN Multimedia

    Laëtitia Pedroso

    2010-01-01

    During his talk to the staff at the beginning of the year, the Director-General mentioned that a new code of conduct was being drawn up. What exactly is it and what is its purpose? Anne-Sylvie Catherin, Head of the Human Resources (HR) Department, talked to us about the whys and wherefores of the project.   Drawing by Georges Boixader from the cartoon strip “The World of Particles” by Brian Southworth. A code of conduct is a general framework laying down the behaviour expected of all members of an organisation's personnel. “CERN is one of the very few international organisations that don’t yet have one", explains Anne-Sylvie Catherin. “We have been thinking about introducing a code of conduct for a long time but lacked the necessary resources until now”. The call for a code of conduct has come from different sources within the Laboratory. “The Equal Opportunities Advisory Panel (read also the "Equal opportuni...

  10. (Almost) practical tree codes

    KAUST Repository

    Khina, Anatoly

    2016-08-15

    We consider the problem of stabilizing an unstable plant driven by bounded noise over a digital noisy communication link, a scenario at the heart of networked control. To stabilize such a plant, one needs real-time encoding and decoding with an error probability profile that decays exponentially with the decoding delay. The works of Schulman and Sahai over the past two decades have developed the notions of tree codes and anytime capacity, and provided the theoretical framework for studying such problems. Nonetheless, there has been little practical progress in this area due to the absence of explicit constructions of tree codes with efficient encoding and decoding algorithms. Recently, linear time-invariant tree codes were proposed to achieve the desired result under maximum-likelihood decoding. In this work, we take one more step towards practicality, by showing that these codes can be efficiently decoded using sequential decoding algorithms, up to some loss in performance (and with some practical complexity caveats). We supplement our theoretical results with numerical simulations that demonstrate the effectiveness of the decoder in a control system setting.

  11. Decoding Codes on Graphs

    Indian Academy of Sciences (India)

    having a probability Pi of being equal to a 1. Let us assume ... equal to a 0/1 has no bearing on the probability of the ... It is often ... bits (call this set S) whose individual bits add up to zero ... In the context of binary error-correcting codes, specifi...

  12. The Redox Code.

    Science.gov (United States)

    Jones, Dean P; Sies, Helmut

    2015-09-20

    The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O₂ and H₂O₂ contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine.

  13. Performance of automated and manual coding systems for occupational data: a case study of historical records.

    Science.gov (United States)

    Patel, Mehul D; Rose, Kathryn M; Owens, Cindy R; Bang, Heejung; Kaufman, Jay S

    2012-03-01

    Occupational data are a common source of workplace exposure and socioeconomic information in epidemiologic research. We compared the performance of two occupation coding methods, an automated software and a manual coder, using occupation and industry titles from U.S. historical records. We collected parental occupational data from 1920-40s birth certificates, Census records, and city directories on 3,135 deceased individuals in the Atherosclerosis Risk in Communities (ARIC) study. Unique occupation-industry narratives were assigned codes by a manual coder and the Standardized Occupation and Industry Coding software program. We calculated agreement between coding methods of classification into major Census occupational groups. Automated coding software assigned codes to 71% of occupations and 76% of industries. Of this subset coded by software, 73% of occupation codes and 69% of industry codes matched between automated and manual coding. For major occupational groups, agreement improved to 89% (kappa = 0.86). Automated occupational coding is a cost-efficient alternative to manual coding. However, some manual coding is required to code incomplete information. We found substantial variability between coders in the assignment of occupations although not as large for major groups.

  14. Z₂-double cyclic codes

    OpenAIRE

    Borges, J.

    2014-01-01

    A binary linear code C is a Z2-double cyclic code if the set of coordinates can be partitioned into two subsets such that any simultaneous cyclic shift of the coordinates of both subsets leaves the code invariant. These codes can be identified as submodules of the Z2[x]-module Z2[x]/(x^r − 1) × Z2[x]/(x^s − 1). We determine the structure of Z2-double cyclic codes, giving the generator polynomials of these codes. The related polynomial representation of Z2-double cyclic codes and their duals, and the relation...
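
    The defining invariance can be checked directly on a toy example. The direct product of two binary cyclic codes is a Z2-double cyclic code, so the sketch below (an assumption for illustration, not a construction from the paper) builds one of type (r, s) = (3, 4) and verifies closure under the simultaneous shift.

```python
from itertools import product

def shift(v):
    """One cyclic shift of a tuple: last coordinate moves to the front."""
    return v[-1:] + v[:-1]

# Toy Z2-double cyclic code: direct product of the length-3 repetition code
# and the length-4 even-weight (single parity-check) code, both cyclic.
rep3  = [(0, 0, 0), (1, 1, 1)]
even4 = [v for v in product((0, 1), repeat=4) if sum(v) % 2 == 0]
code  = {a + b for a, b in product(rep3, even4)}

# Simultaneously shifting both coordinate blocks maps the code onto itself.
shifted = {shift(w[:3]) + shift(w[3:]) for w in code}
print(shifted == code)
```

    Any code violating the definition would fail this set-equality check, so the same snippet doubles as a membership test for candidate partitions.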

  15. Coding for urologic office procedures.

    Science.gov (United States)

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Decoding small surface codes with feedforward neural networks

    Science.gov (United States)

    Varsamopoulos, Savvas; Criger, Ben; Bertels, Koen

    2018-01-01

    Surface codes reach high error thresholds when decoded with known algorithms, but the decoding time will likely exceed the available time budget, especially for near-term implementations. To decrease the decoding time, we reduce the decoding problem to a classification problem that a feedforward neural network can solve. We investigate quantum error correction and fault tolerance at small code distances using neural network-based decoders, demonstrating that the neural network can generalize to inputs that were not provided during training and that it can reach similar or better decoding performance compared to previous algorithms. We conclude by discussing the time required by a feedforward neural network decoder in hardware.
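
    The decoding-as-classification idea can be sketched on the simplest possible stand-in for a surface code: the 3-bit repetition code, whose two parity checks give a 2-bit syndrome and whose most likely single-bit error is a 4-way class label. This is an illustrative analogy under stated assumptions, not the paper's architecture or code.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Syndrome -> most-likely-error table for the 3-bit repetition code.
# Checks: s1 = x1 XOR x2, s2 = x2 XOR x3.
# Labels: 0 = no error, 1..3 = bit flip on that position.
syndromes = np.array([[0, 0], [1, 0], [1, 1], [0, 1]])
errors    = np.array([0, 1, 2, 3])

# A small feedforward network learns the lookup as a classification task.
clf = MLPClassifier(hidden_layer_sizes=(16,), solver="lbfgs",
                    max_iter=5000, random_state=1)
clf.fit(syndromes, errors)
print(clf.score(syndromes, errors))
```

    For a distance-d surface code the same shape applies, only with longer syndrome vectors and more error classes; the appeal is that a trained network decodes in a fixed, small number of matrix multiplications.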

  17. Essential idempotents and simplex codes

    Directory of Open Access Journals (Sweden)

    Gladys Chalom

    2017-01-01

    Full Text Available We define essential idempotents in group algebras and use them to prove that every minimal abelian non-cyclic code is a repetition code. Also we use them to prove that every minimal abelian code is equivalent to a minimal cyclic code of the same length. Finally, we show that a binary cyclic code is a simplex code if and only if it has length of the form $n=2^k-1$ and is generated by an essential idempotent.
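
    The simplex codes referred to here are the duals of the binary Hamming codes: length n = 2^k − 1, dimension k, and every nonzero codeword of weight 2^(k−1). That weight property can be verified exhaustively for small k; the construction below is the standard generator matrix, given for illustration.

```python
from itertools import product
import numpy as np

k = 3
n = 2**k - 1  # length 2^k - 1 = 7
# Generator matrix of the binary simplex code: its columns are all
# nonzero binary k-vectors (here the dual of the [7,4] Hamming code).
G = np.array([c for c in product((0, 1), repeat=k) if any(c)]).T  # k x n

# Enumerate all 2^k codewords and collect nonzero codeword weights.
codewords = {tuple(np.dot(m, G) % 2) for m in product((0, 1), repeat=k)}
weights = {sum(c) for c in codewords if any(c)}
print(n, len(codewords), weights)
```

    Every nonzero codeword having the same weight 2^(k−1) = 4 is what makes the code "simplex": its codewords form the vertices of a regular simplex in Hamming space.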

  18. Can Automatic Classification Help to Increase Accuracy in Data Collection?

    Directory of Open Access Journals (Sweden)

    Frederique Lang

    2016-09-01

    Full Text Available Purpose: The authors aim at testing the performance of a set of machine learning algorithms that could improve the process of data cleaning when building datasets. Design/methodology/approach: The paper is centered on cleaning datasets gathered from publishers and online resources by the use of specific keywords. In this case, we analyzed data from the Web of Science. The accuracy of various forms of automatic classification was tested here in comparison with manual coding in order to determine their usefulness for data collection and cleaning. We assessed the performance of seven supervised classification algorithms (Support Vector Machine (SVM), Scaled Linear Discriminant Analysis, Lasso and elastic-net regularized generalized linear models, Maximum Entropy, Regression Tree, Boosting, and Random Forest) and analyzed two properties: accuracy and recall. We assessed not only each algorithm individually, but also their combinations through a voting scheme. We also tested the performance of these algorithms with different sizes of training data. When assessing the performance of different combinations, we used an indicator of coverage to account for the agreement and disagreement on classification between algorithms. Findings: We found that the performance of the algorithms used varies with the size of the sample for training. However, for the classification exercise in this paper the best performing algorithms were SVM and Boosting. The combination of these two algorithms achieved a high agreement on coverage and was highly accurate. This combination performs well with a small training dataset (10%), which may reduce the manual work needed for classification tasks. Research limitations: The dataset gathered has significantly more records related to the topic of interest compared to unrelated topics. This may affect the performance of some algorithms, especially in their identification of unrelated papers. Practical implications: Although the
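
    A voting combination of an SVM and a boosting classifier, like the best-performing pair found above, can be sketched with scikit-learn. The dataset, split ratio, and parameters below are hypothetical; the small training share mimics the finding that roughly 10% labeled data can suffice.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for a labeled relevant/irrelevant record collection.
X, y = make_classification(n_samples=400, n_features=20, random_state=0)
# Train on only 10% of the data, evaluate on the rest.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.1, random_state=0)

# Soft voting averages the two classifiers' predicted probabilities.
vote = VotingClassifier(
    estimators=[("svm", SVC(probability=True, random_state=0)),
                ("boost", GradientBoostingClassifier(random_state=0))],
    voting="soft")
vote.fit(X_tr, y_tr)
print(round(vote.score(X_te, y_te), 3))
```

    In practice one would also track the coverage indicator from the paper, i.e. the fraction of records on which the two base classifiers agree, and route disagreements to manual coding.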

  19. Hyperspectral Image Classification Based on the Combination of Spatial-spectral Feature and Sparse Representation

    Directory of Open Access Journals (Sweden)

    YANG Zhaoxia

    2015-07-01

    Full Text Available In order to avoid the problem of being over-dependent on high-dimensional spectral features in traditional hyperspectral image classification, a novel approach based on the combination of spatial-spectral features and sparse representation is proposed in this paper. Firstly, we extract the spatial-spectral feature by reorganizing the local image patch with the first d principal components (PCs) into a vector representation, followed by a sorting scheme to make the vector invariant to local image rotation. Secondly, we learn the dictionary through a supervised method, and use it to code the features from test samples afterwards. Finally, we embed the resulting sparse feature coding into the support vector machine (SVM) for hyperspectral image classification. Experiments using three hyperspectral data sets show that the proposed method can effectively improve the classification accuracy compared with traditional classification methods.
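
    The first step (project each pixel onto d principal components, flatten the local patch, then sort for rotation invariance) can be sketched as follows. The cube, patch width, and sort key are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
cube = rng.random((40, 40, 64))   # toy hyperspectral cube: 40x40 pixels, 64 bands
d, w = 4, 5                       # number of PCs, spatial patch width

# Project every pixel spectrum onto the first d principal components.
pcs = PCA(n_components=d).fit_transform(cube.reshape(-1, 64)).reshape(40, 40, d)

def spatial_spectral_feature(i, j):
    """Flatten the w x w neighborhood in PC space, then sort the patch
    rows so the feature does not depend on local image rotation."""
    patch = pcs[i - w//2:i + w//2 + 1, j - w//2:j + w//2 + 1].reshape(-1, d)
    order = np.argsort(patch[:, 0])   # sort rows by first-PC response
    return patch[order].ravel()       # length w*w*d feature vector

feat = spatial_spectral_feature(20, 20)
print(feat.shape)
```

    Sorting discards the spatial arrangement within the patch but keeps its distribution of PC responses, which is what makes the vector invariant to rotating the neighborhood.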

  20. Rate-adaptive BCH codes for distributed source coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Larsen, Knud J.; Forchhammer, Søren

    2013-01-01

    This paper considers Bose-Chaudhuri-Hocquenghem (BCH) codes for distributed source coding. A feedback channel is employed to adapt the rate of the code during the decoding process. The focus is on codes with short block lengths for independently coding a binary source X and decoding it given its correlated side information Y. The proposed codes have been analyzed in a high-correlation scenario, where the marginal probability of each symbol, Xi in X, given Y is highly skewed (unbalanced). Rate-adaptive BCH codes are presented and applied to distributed source coding. Adaptive and fixed checking strategies for improving the reliability of the decoded result are analyzed, and methods for estimating the performance are proposed. In the analysis, noiseless feedback and noiseless communication are assumed. Simulation results show that rate-adaptive BCH codes achieve better performance than low...

  1. Classification of radioactive waste

    International Nuclear Information System (INIS)

    1994-01-01

    Radioactive wastes are generated in a number of different kinds of facilities and arise in a wide range of concentrations of radioactive materials and in a variety of physical and chemical forms. To simplify their management, a number of schemes have evolved for classifying radioactive waste according to the physical, chemical and radiological properties of significance to those facilities managing this waste. These schemes have led to a variety of terminologies, differing from country to country and even between facilities in the same country. This situation makes it difficult for those concerned to communicate with one another regarding waste management practices. This document revises and updates earlier IAEA references on radioactive waste classification systems given in IAEA Technical Reports Series and Safety Series. Guidance regarding exemption of materials from regulatory control is consistent with IAEA Safety Series and the RADWASS documents published under IAEA Safety Series. 11 refs, 2 figs, 2 tab

  2. Nonlinear estimation and classification

    CERN Document Server

    Hansen, Mark; Holmes, Christopher; Mallick, Bani; Yu, Bin

    2003-01-01

    Researchers in many disciplines face the formidable task of analyzing massive amounts of high-dimensional and highly-structured data. This is due in part to recent advances in data collection and computing technologies. As a result, fundamental statistical research is being undertaken in a variety of different fields. Driven by the complexity of these new problems, and fueled by the explosion of available computer power, highly adaptive, non-linear procedures are now essential components of modern "data analysis," a term that we liberally interpret to include speech and pattern recognition, classification, data compression and signal processing. The development of new, flexible methods combines advances from many sources, including approximation theory, numerical analysis, machine learning, signal processing and statistics. The proposed workshop intends to bring together eminent experts from these fields in order to exchange ideas and forge directions for the future.

  3. Automatic diabetic retinopathy classification

    Science.gov (United States)

    Bravo, María. A.; Arbeláez, Pablo A.

    2017-11-01

    Diabetic retinopathy (DR) is a disease in which the retina is damaged due to increased blood pressure in its small vessels. DR is the major cause of blindness for diabetics. It has been shown that early diagnosis can play a major role in the prevention of visual loss and blindness. This work proposes a computer-based approach for the detection of DR in back-of-the-eye images based on the use of convolutional neural networks (CNNs). Our CNN uses deep architectures to classify Back-of-the-eye Retinal Photographs (BRP) into 5 stages of DR. Our method combines several preprocessed versions of the BRP images to obtain an ACA score of 50.5%. Furthermore, we explore subproblems by training a larger CNN on our main classification task.

  4. [Coding in general practice - Will the ICD-11 be a step forward?]

    Science.gov (United States)

    Kühlein, Thomas; Virtanen, Martti; Claus, Christoph; Popert, Uwe; van Boven, Kees

    2018-07-01

    Primary care physicians in Germany don't benefit from coding diagnoses; they are coding for the needs of others. For coding, they mostly use either the thesaurus of the German Institute of Medical Documentation and Information (DIMDI) or self-made cheat-sheets. Coding quality is low but seems to be sufficient for the main use case of the resulting data, which is the morbidity-adjusted risk compensation scheme that distributes financial resources between the many German health insurance companies. Neither the International Classification of Diseases and Health Related Problems (ICD-10) nor the German thesaurus as an interface terminology is adequate for coding in primary care. The ICD-11 itself will not recognizably be a step forward from the perspective of primary care. At least the browser database format will be advantageous. An implementation into the 182 different electronic health record (EHR) systems on the German market would probably standardize the coding process and make code finding easier. This method of coding would still be more cumbersome than the current coding with self-made cheat-sheets. The first steps towards a useful official cheat-sheet for primary care have been taken, awaiting implementation and evaluation. The International Classification of Primary Care (ICPC-2) already provides an adequate classification standard for primary care that can also be used in combination with ICD-10. A new version of ICPC (ICPC-3) is under development. As the ICPC-2 has already been integrated into the foundation layer of ICD-11, it might easily become the future standard for coding in primary care. Improving communication between the different EHR systems would make it possible to take over codes from other healthcare providers. Another opportunity to improve the coding quality might be creating use cases for the resulting data for the primary care physicians themselves.

  5. Entanglement-assisted quantum MDS codes constructed from negacyclic codes

    Science.gov (United States)

    Chen, Jianzhang; Huang, Yuanyuan; Feng, Chunhui; Chen, Riqing

    2017-12-01

    Recently, entanglement-assisted quantum codes have been constructed from cyclic codes by some scholars. However, determining the number of shared pairs required to construct entanglement-assisted quantum codes is not an easy task. In this paper, we propose a decomposition of the defining set of negacyclic codes. Based on this method, four families of entanglement-assisted quantum codes constructed in this paper satisfy the entanglement-assisted quantum Singleton bound, where the minimum distance satisfies q+1 ≤ d ≤ (n+2)/2. Furthermore, we construct two families of entanglement-assisted quantum codes with maximal entanglement.

  6. Tax reliefs in the Russian Federation, their definition, types and classification

    OpenAIRE

    Natalia Soloveva

    2012-01-01

    The present article analyzes the definition of tax allowances that is fixed in Tax Code of the Russian Federation and classification of tax allowances into tax exceptions, tax abatements and tax discharges. The article also covers the author's classification of tax allowances into direct and indirect ones, according to economic benefits obtained by taxpayers as a result of using tax allowances. In the conclusion, the author determines an exhaustive list of tax allowances in the Russian tax le...

  7. Tax reliefs in the Russian Federation, their definition, types and classification

    Directory of Open Access Journals (Sweden)

    Natalia Soloveva

    2012-12-01

    Full Text Available The present article analyzes the definition of tax allowances that is fixed in Tax Code of the Russian Federation and classification of tax allowances into tax exceptions, tax abatements and tax discharges. The article also covers the author's classification of tax allowances into direct and indirect ones, according to economic benefits obtained by taxpayers as a result of using tax allowances. In the conclusion, the author determines an exhaustive list of tax allowances in the Russian tax legislation.

  8. Hazard classification or risk assessment

    DEFF Research Database (Denmark)

    Hass, Ulla

    2013-01-01

    The EU classification of substances for e.g. reproductive toxicants is hazard based and does not address the risk such substances may pose through normal, or extreme, use. Such hazard classification complies with the consumer's right to know. It is also an incentive to careful use and storage

  9. Seismic texture classification. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Vinther, R.

    1997-12-31

    The seismic texture classification method is a seismic attribute that can both recognize the general reflectivity styles and locate variations from these. The seismic texture classification performs a statistical analysis of the seismic section (or volume) aiming at describing the reflectivity. Based on a set of reference reflectivities the seismic textures are classified. The result of the seismic texture classification is a display of seismic texture categories showing both the styles of reflectivity from the reference set and interpolations and extrapolations from these. The display is interpreted as statistical variations in the seismic data. The seismic texture classification is applied to seismic sections and volumes from the Danish North Sea representing both horizontal stratifications and salt diapirs. The attribute succeeded in recognizing both the general structure of successions and variations from these. Also, the seismic texture classification is not only able to display variations in prospective areas (1-7 sec. TWT) but can also be applied to deep seismic sections. The seismic texture classification is tested on a deep reflection seismic section (13-18 sec. TWT) from the Baltic Sea. Applied to this section the seismic texture classification succeeded in locating the Moho, which could not be located using conventional interpretation tools. The seismic texture classification is a seismic attribute which can display general reflectivity styles and deviations from these and enhance variations not found by conventional interpretation tools. (LN)

  10. Efficient AUC optimization for classification

    NARCIS (Netherlands)

    Calders, T.; Jaroszewicz, S.; Kok, J.N.; Koronacki, J.; Lopez de Mantaras, R.; Matwin, S.; Mladenic, D.; Skowron, A.

    2007-01-01

    In this paper we show an efficient method for inducing classifiers that directly optimize the area under the ROC curve. Recently, AUC gained importance in the classification community as a means to compare the performance of classifiers. Because most classification methods do not optimize this
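
    The AUC being optimized here has a useful probabilistic reading: it equals the probability that a randomly chosen positive instance is scored above a randomly chosen negative one (the Wilcoxon-Mann-Whitney statistic). The sketch below, with made-up scores, checks that form against scikit-learn's implementation.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

y     = np.array([0, 0, 1, 1, 0, 1])
score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7])

# Wilcoxon-Mann-Whitney form: AUC = P(score_pos > score_neg),
# with ties counted as 1/2.
pos, neg = score[y == 1], score[y == 0]
auc = np.mean([(p > n) + 0.5 * (p == n) for p in pos for n in neg])
print(auc, roc_auc_score(y, score))
```

    This pairwise view is exactly why AUC-optimizing learners work with pairwise ranking losses rather than per-instance classification losses.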

  11. Dewey Decimal Classification: A Quagmire.

    Science.gov (United States)

    Gamaluddin, Ahmad Fouad

    1980-01-01

    A survey of 660 Pennsylvania school librarians indicates that, though there is limited professional interest in the Library of Congress Classification system, Dewey Decimal Classification (DDC) appears to be firmly entrenched. This article also discusses the relative merits of DDC, the need for a uniform system, librarianship preparation, and…

  12. Latent class models for classification

    NARCIS (Netherlands)

    Vermunt, J.K.; Magidson, J.

    2003-01-01

    An overview is provided of recent developments in the use of latent class (LC) and other types of finite mixture models for classification purposes. Several extensions of existing models are presented. Two basic types of LC models for classification are defined: supervised and unsupervised

  13. 45 CFR 601.5 - Derivative classification.

    Science.gov (United States)

    2010-10-01

    ... CLASSIFICATION AND DECLASSIFICATION OF NATIONAL SECURITY INFORMATION § 601.5 Derivative classification. Distinct... 45 Public Welfare 3 2010-10-01 2010-10-01 false Derivative classification. 601.5 Section 601.5... classification guide, need not possess original classification authority. (a) If a person who applies derivative...

  14. 12 CFR 403.4 - Derivative classification.

    Science.gov (United States)

    2010-01-01

    ... SAFEGUARDING OF NATIONAL SECURITY INFORMATION § 403.4 Derivative classification. (a) Use of derivative classification. (1) Unlike original classification which is an initial determination, derivative classification... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Derivative classification. 403.4 Section 403.4...

  15. 32 CFR 2001.15 - Classification guides.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Classification guides. 2001.15 Section 2001.15..., NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Classification § 2001.15 Classification guides. (a) Preparation of classification guides. Originators of classification...

  16. Validity of vascular trauma codes at major trauma centres.

    Science.gov (United States)

    Altoijry, Abdulmajeed; Al-Omran, Mohammed; Lindsay, Thomas F; Johnston, K Wayne; Melo, Magda; Mamdani, Muhammad

    2013-12-01

    The use of administrative databases in vascular injury research has been increasing, but the validity of the diagnosis codes used in this research is uncertain. We assessed the positive predictive value (PPV) of International Classification of Diseases, tenth revision (ICD-10), vascular injury codes in administrative claims data in Ontario. We conducted a retrospective validation study using the Canadian Institute for Health Information Discharge Abstract Database, an administrative database that records all hospital admissions in Canada. We evaluated 380 randomly selected hospital discharge abstracts from the 2 main trauma centres in Toronto, Ont., St. Michael's Hospital and Sunnybrook Health Sciences Centre, between Apr. 1, 2002, and Mar. 31, 2010. We then compared these records with the corresponding patients' hospital charts to assess the level of agreement for procedure coding. We calculated the PPV and sensitivity to estimate the validity of vascular injury diagnosis coding. The overall PPV for vascular injury coding was estimated to be 95% (95% confidence interval [CI] 92.3-96.8). The PPV among code groups for neck, thorax, abdomen, upper extremity and lower extremity injuries ranged from 90.8 (95% CI 82.2-95.5) to 97.4 (95% CI 91.0-99.3), whereas sensitivity ranged from 90% (95% CI 81.5-94.8) to 98.7% (95% CI 92.9-99.8). Administrative claims hospital discharge data based on ICD-10 diagnosis codes have a high level of validity when identifying cases of vascular injury. Observational Study Level III.
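
    The two validity measures used above follow directly from a 2x2 confusion table of code assignments against chart review. The counts below are hypothetical, chosen only to show the arithmetic.

```python
# Hypothetical chart-review counts for one vascular injury code group:
tp, fp, fn = 175, 9, 12   # true positives, false positives, false negatives

ppv         = tp / (tp + fp)   # positive predictive value: coded AND confirmed
sensitivity = tp / (tp + fn)   # confirmed cases that were actually coded
print(round(100 * ppv, 1), round(100 * sensitivity, 1))
```

    PPV answers "if the code is present, was the injury real?", while sensitivity answers "if the injury was real, was it coded?"; a validation study needs both, since either can be high while the other is poor.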

  17. Efficient convolutional sparse coding

    Science.gov (United States)

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M³N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
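
    The speedup rests on the convolution theorem: circular convolution in the signal domain becomes elementwise multiplication in the frequency domain. The sketch below only demonstrates that identity (the heart of the O(MN log N) step), not the full ADMM solver.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256
x, d = rng.standard_normal(N), rng.standard_normal(N)

# Circular convolution the direct way: O(N^2) operations.
direct = np.array([sum(x[k] * d[(n - k) % N] for k in range(N))
                   for n in range(N)])
# Via the FFT: O(N log N), elementwise product in the frequency domain.
via_fft = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(d)))

print(np.allclose(direct, via_fft))
```

    In the ADMM framework this diagonalization is what turns the large coupled linear system over all M dictionary filters into N independent small systems, one per frequency.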

  18. Coded Network Function Virtualization

    DEFF Research Database (Denmark)

    Al-Shuwaili, A.; Simone, O.; Kliewer, J.

    2016-01-01

    Network function virtualization (NFV) prescribes the instantiation of network functions on general-purpose network devices, such as servers and switches. While yielding a more flexible and cost-effective network architecture, NFV is potentially limited by the fact that commercial off-the-shelf hardware is less reliable than the dedicated network elements used in conventional cellular deployments. The typical solution for this problem is to duplicate network functions across geographically distributed hardware in order to ensure diversity. In contrast, this letter proposes to leverage channel coding in order to enhance the robustness of NFV to hardware failure. The proposed approach targets the network function of uplink channel decoding, and builds on the algebraic structure of the encoded data frames in order to perform in-network coding on the signals to be processed at different servers...

  19. The NIMROD Code

    Science.gov (United States)

    Schnack, D. D.; Glasser, A. H.

    1996-11-01

    NIMROD is a new code system that is being developed for the analysis of modern fusion experiments. It is being designed from the beginning to make the maximum use of massively parallel computer architectures and computer graphics. The NIMROD physics kernel solves the three-dimensional, time-dependent two-fluid equations with neo-classical effects in toroidal geometry of arbitrary poloidal cross section. The NIMROD system also includes a pre-processor, a grid generator, and a post processor. User interaction with NIMROD is facilitated by a modern graphical user interface (GUI). The NIMROD project is using Quality Function Deployment (QFD) team management techniques to minimize re-engineering and reduce code development time. This paper gives an overview of the NIMROD project. Operation of the GUI is demonstrated, and the first results from the physics kernel are given.

  20. Diabetes classification using a redundancy reduction preprocessor

    Directory of Open Access Journals (Sweden)

    Áurea Celeste Ribeiro

    Full Text Available Introduction Diabetes patients can benefit significantly from early diagnosis. Thus, accurate automated screening is becoming increasingly important due to the wide spread of that disease. Previous studies in automated screening have found a maximum accuracy of 92.6%. Methods This work proposes a classification methodology based on efficient coding of the input data, which is carried out by decreasing input data redundancy using well-known ICA algorithms, such as FastICA, JADE and INFOMAX. The classifier used in the task to discriminate diabetics from nondiabetics is the one-class support vector machine. Classification tests were performed using noninvasive and invasive indicators. Results The results suggest that redundancy reduction increases one-class support vector machine performance when discriminating between diabetics and nondiabetics up to an accuracy of 98.47% while using all indicators. By using only noninvasive indicators, an accuracy of 98.28% was obtained. Conclusion The ICA feature extraction improves the performance of the classifier on the data set because it reduces the statistical dependence of the collected data, which increases the ability of the classifier to find accurate class boundaries.
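
    The two-stage pipeline (ICA to reduce statistical dependence, then a one-class SVM trained on a single class) can be sketched with scikit-learn. The data, dimensions, and nu parameter below are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Hypothetical indicator matrix: 200 "healthy" subjects x 8 clinical
# indicators, mixed so the columns are statistically dependent.
healthy = rng.standard_normal((200, 8)) @ rng.standard_normal((8, 8))

# Step 1: FastICA unmixes the indicators into less dependent components.
ica = FastICA(n_components=8, random_state=0, max_iter=1000)
S = ica.fit_transform(healthy)

# Step 2: the one-class SVM learns the boundary of the "healthy" class
# alone; points it predicts as -1 would be flagged as potential diabetics.
clf = OneClassSVM(nu=0.05, gamma="scale").fit(S)
outlier_rate = np.mean(clf.predict(S) == -1)
print(round(outlier_rate, 2))
```

    The nu parameter upper-bounds the fraction of training points treated as outliers, which is why the training outlier rate lands near 5% here; a held-out test set would then measure true discrimination.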

  1. Discriminative Bayesian Dictionary Learning for Classification.

    Science.gov (United States)

    Akhtar, Naveed; Shafait, Faisal; Mian, Ajmal

    2016-12-01

    We propose a Bayesian approach to learn discriminative dictionaries for sparse representation of data. The proposed approach infers probability distributions over the atoms of a discriminative dictionary using a finite approximation of the Beta Process. It also computes sets of Bernoulli distributions that associate class labels to the learned dictionary atoms. This association signifies the selection probabilities of the dictionary atoms in the expansion of class-specific data. Furthermore, the non-parametric character of the proposed approach allows it to infer the correct size of the dictionary. We exploit the aforementioned Bernoulli distributions in separately learning a linear classifier. The classifier uses the same hierarchical Bayesian model as the dictionary, which we present along with the analytical inference solution for Gibbs sampling. For classification, a test instance is first sparsely encoded over the learned dictionary and the codes are fed to the classifier. We performed experiments for face and action recognition; and object and scene-category classification using five public datasets and compared the results with state-of-the-art discriminative sparse representation approaches. Experiments show that the proposed Bayesian approach consistently outperforms the existing approaches.

  2. A system for heart sounds classification.

    Directory of Open Access Journals (Sweden)

    Grzegorz Redlarski

    Full Text Available The future of quick and efficient disease diagnosis lies in the development of reliable non-invasive methods. As for cardiac diseases, one of the major causes of death around the globe, a concept of an electronic stethoscope equipped with an automatic heart tone identification system appears to be the best solution. Thanks to the advancement in technology, the quality of phonocardiography signals is no longer an issue. However, appropriate algorithms for auto-diagnosis systems of heart diseases that could be capable of distinguishing most of the known pathological states have not yet been developed. The main issue is the non-stationary character of phonocardiography signals as well as the wide range of distinguishable pathological heart sounds. In this paper a new heart sound classification technique, which might find use in medical diagnostic systems, is presented. It is shown that by combining Linear Predictive Coding coefficients, used for feature extraction, with a classifier built upon combining a Support Vector Machine and the Modified Cuckoo Search algorithm, an improvement in performance of the diagnostic system, in terms of accuracy, complexity and range of distinguishable heart sounds, can be made. The developed system achieved accuracy above 93% for all considered cases including simultaneous identification of twelve different heart sound classes. The respective system is compared with four different major classification methods, proving its reliability.
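
    The LPC feature-extraction step can be sketched with the standard autocorrelation method and the Levinson-Durbin recursion; this is a generic sketch on a synthetic signal, not the paper's implementation, and the signal parameters are made up.

```python
import numpy as np

def lpc(signal, order):
    """LPC coefficients via the autocorrelation method and the
    Levinson-Durbin recursion; returns [1, a_1, ..., a_order]."""
    r = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    a = np.zeros(order + 1)
    a[0], err = 1.0, r[0]
    for i in range(1, order + 1):
        # Reflection coefficient for this order.
        k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / err
        # Update the coefficient vector (a[i] becomes k).
        a[1:i + 1] = a[1:i + 1] + k * a[i - 1::-1][:i]
        err *= (1 - k * k)
    return a

rng = np.random.default_rng(0)
# Toy "heart sound": a decaying low-frequency sinusoid plus noise.
t = np.arange(2000) / 4000.0
x = np.exp(-8 * t) * np.sin(2 * np.pi * 60 * t) + 0.01 * rng.standard_normal(t.size)
coeffs = lpc(x, order=8)   # the 8 LPC coefficients serve as the feature vector
print(coeffs.shape)
```

    In a pipeline like the one described, these per-segment coefficient vectors would then be fed to the SVM, with the Modified Cuckoo Search tuning the SVM's hyperparameters.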

  3. Ebolavirus Classification Based on Natural Vectors

    Science.gov (United States)

    Zheng, Hui; Yin, Changchuan; Hoang, Tung; He, Rong Lucy; Yang, Jie

    2015-01-01

    According to the WHO, ebolaviruses have resulted in 8818 human deaths in West Africa as of January 2015. To better understand the evolutionary relationship of the ebolaviruses and infer virulence from the relationship, we applied the alignment-free natural vector method to classify the newest ebolaviruses. The dataset includes three new Guinea viruses as well as 99 viruses from Sierra Leone. For the viruses of the family Filoviridae, both genus label classification and species label classification achieve an accuracy rate of 100%. We represented the relationships among Filoviridae viruses by Unweighted Pair Group Method with Arithmetic Mean (UPGMA) phylogenetic trees and found that the filoviruses can be separated well into three genera. We performed the phylogenetic analysis on the relationship among different species of Ebolavirus by their coding-complete genomes and seven viral protein genes (glycoprotein [GP], nucleoprotein [NP], VP24, VP30, VP35, VP40, and RNA polymerase [L]). The topology of the phylogenetic tree by the viral protein VP24 shows consistency with the variations of virulence of ebolaviruses. The result suggests that VP24 may be a pharmaceutical target for treating or preventing ebolavirus infections. PMID:25803489
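
    The natural vector of a DNA sequence is a short alignment-free summary: for each nucleotide, its count, the mean of its positions, and a normalized second central moment of those positions. The 12-dimensional variant below is a common formulation and is offered as an illustrative sketch, not necessarily the exact normalization used in the paper.

```python
import numpy as np

def natural_vector(seq):
    """12-D natural vector of a DNA sequence: for each base k in ACGT,
    (count n_k, mean position mu_k, D2_k = sum((pos - mu_k)^2) / (n_k * N))."""
    N, vec = len(seq), []
    for base in "ACGT":
        pos = np.array([i + 1 for i, b in enumerate(seq) if b == base], float)
        n = len(pos)
        if n == 0:
            vec += [0.0, 0.0, 0.0]
            continue
        mu = pos.mean()
        vec += [float(n), mu, float(((pos - mu) ** 2).sum() / (n * N))]
    return np.array(vec)

v = natural_vector("ACGTACGTAA")
print(v)
```

    Because two sequences are compared simply by the distance between their vectors, no alignment is needed, which is what makes the method fast enough to classify whole viral genomes.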

  4. Application Study of Fire Severity Classification

    International Nuclear Information System (INIS)

    Kim, In Hwan; Kim, Hyeong Taek; Jee, Moon Hak; Kim, Yun Jung

    2013-01-01

    This paper introduces the Fire Incidents Severity Classification Method for Korean NPPs that may be derived directly from the data fields, and a feasibility study for domestic use. The FEDB was characterized in more detail and assessed based on the significance of fire incidents in the updated database, and five fire severity categories were defined. The logical approach to determine the fire severity starts from the most severe characteristics, namely challenging fires, and continues to define the less challenging and undetermined categories in progress. If the FEDB is utilized for Korean NPPs, the Fire Severity Classification suggested in Section 2.4 above can be utilized for quantitative fire risk analysis in the future. The Fire Events Database (FEDB) is the primary source of fire data used for fire frequency in Fire PSA (Probabilistic Safety Assessment). The purpose of its development is to calculate the quantitative fire frequency from the comprehensive and consolidated source derived from the fire incident information available for Nuclear Power Plants (NPPs). Recently, the Fire Events Database (FEDB) was updated by the Electric Power Research Institute (EPRI) and the Nuclear Regulatory Commission (NRC) in the U.S. The FEDB is intended to update the fire event history up to 2009. A significant enhancement to it is the reorganization and refinement of the database structure and data fields. It has expanded and improved data fields, coding consistency, incident detail, data review fields, and reference data source traceability. It has been designed to better support several Fire PRA uses as well.

  5. Vietnamese Document Representation and Classification

    Science.gov (United States)

    Nguyen, Giang-Son; Gao, Xiaoying; Andreae, Peter

Vietnamese is very different from English, and little research has been done on Vietnamese document classification, or indeed on any kind of Vietnamese language processing; only a few small corpora are available for research. We created a large Vietnamese text corpus of about 18000 documents and manually classified them based on different criteria, such as topic and style, giving several classification tasks of different difficulty levels. This paper introduces a new syllable-based document representation at the morphological level of the language for efficient classification. We tested the representation on our corpus with different classification tasks using six classification algorithms and two feature selection techniques. Our experiments show that the new representation is effective for Vietnamese categorization, and suggest that the best performance can be achieved using the syllable-pair document representation, an SVM with a polynomial kernel as the learning algorithm, and information gain with an external dictionary for feature selection.
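In written Vietnamese, whitespace delimits syllables rather than words, so a syllable-pair representation amounts roughly to adjacent-syllable bigrams. A minimal sketch of such feature extraction (the function name and the trivial whitespace tokenization are illustrative assumptions, not the paper's implementation):

```python
def syllable_pairs(text):
    """Adjacent-syllable bigram features from whitespace-delimited Vietnamese text."""
    syllables = text.split()
    return [a + " " + b for a, b in zip(syllables, syllables[1:])]
```

Each document would then be represented by the bag of such pairs before feature selection and SVM training.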

  6. Classification of Hyperspectral Images Using Kernel Fully Constrained Least Squares

    Directory of Open Access Journals (Sweden)

    Jianjun Liu

    2017-11-01

Full Text Available As a widely used classifier, sparse representation classification (SRC) has shown good performance for hyperspectral image classification. Recent works have highlighted that it is the collaborative representation mechanism under SRC that makes SRC a highly effective technique for classification purposes. If the dimensionality and the discrimination capacity of a test pixel are high, other norms (e.g., the ℓ2-norm) can be used to regularize the coding coefficients besides the sparsity-inducing ℓ1-norm. In this paper, we show that in the kernel space the nonnegativity constraint can play the same role, and thus suggest the investigation of kernel fully constrained least squares (KFCLS) for hyperspectral image classification. Furthermore, in order to improve the classification performance of KFCLS by incorporating spatial-spectral information, we investigate two kinds of spatial-spectral methods using two regularization strategies: (1) the coefficient-level regularization strategy, and (2) the class-level regularization strategy. Experimental results conducted on four real hyperspectral images demonstrate the effectiveness of the proposed KFCLS, and show which of the two strategies incorporates spatial-spectral information more efficiently in the regularization framework.
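The role of the nonnegativity constraint can be illustrated outside the kernel setting with a plain nonnegativity-constrained least-squares solve, min ‖Ax − b‖² subject to x ≥ 0, here via projected gradient descent. This is only a toy sketch (no kernel, no sum-to-one constraint, illustrative step size), not the KFCLS solver of the paper:

```python
# Toy projected-gradient solver for min ||A x - b||^2 subject to x >= 0.
def nnls_pg(A, b, steps=2000, lr=0.01):
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(steps):
        # residual r = A x - b
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        # gradient direction g = A^T r
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step, then project onto the nonnegative orthant
        x = [max(0.0, x[j] - lr * g[j]) for j in range(n)]
    return x
```

Where the unconstrained least-squares solution would go negative, the projection clamps that coefficient to zero, which is exactly the kind of implicit selection effect the paper exploits for classification.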

  7. Computer code FIT

    International Nuclear Information System (INIS)

    Rohmann, D.; Koehler, T.

    1987-02-01

This is a description of the computer code FIT, written in FORTRAN-77 for a PDP 11/34. FIT is an interactive program to deduce the position, width and intensity of lines in X-ray spectra (maximum length of 4K channels). The lines (maximum of 30 lines per fit) may have Gaussian or Voigt profiles, as well as exponential tails. Spectrum and fit can be displayed on a Tektronix terminal. (orig.) [de
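Deducing the position, width and intensity of a spectral line is, at its core, a profile-fitting problem. A deliberately simple sketch for a single Gaussian line, using a coarse grid search over centre and width with the amplitude obtained in closed form (FIT itself is an interactive FORTRAN-77 fitter; this fragment only illustrates the idea):

```python
import math

# Toy fit of one Gaussian line a*exp(-(x-c)^2 / (2 w^2)): coarse grid search
# over centre c and width w; amplitude a follows in closed form (linear LS).
def fit_gaussian_line(xs, ys):
    lo, hi = min(xs), max(xs)
    best = None
    for k in range(101):
        c = lo + k * (hi - lo) / 100             # candidate line centre
        for j in range(1, 50):
            w = 0.1 * j                          # candidate line width
            g = [math.exp(-((x - c) ** 2) / (2 * w * w)) for x in xs]
            a = sum(y * gi for y, gi in zip(ys, g)) / sum(gi * gi for gi in g)
            err = sum((y - a * gi) ** 2 for y, gi in zip(ys, g))
            if best is None or err < best[0]:
                best = (err, a, c, w)
    return best[1], best[2], best[3]             # amplitude, centre, width
```

A production fitter would refine these grid estimates with nonlinear least squares and handle multiple overlapping lines, Voigt profiles, and exponential tails as FIT does.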

  8. Discrete Sparse Coding.

    Science.gov (United States)

    Exarchakis, Georgios; Lücke, Jörg

    2017-11-01

Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) we use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) we use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) we apply the algorithm to the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
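The generative side of such a model is easy to sketch: each latent independently takes a value from a finite set according to a prior (with most mass on zero, hence sparsity), and the observation is the dictionary applied to the latent vector. The sampler below is an illustrative, noise-free assumption of that setup with invented names, not the authors' expectation-truncation training code:

```python
import random

# Sample from a discrete-latent sparse generative model: each latent s_h takes
# a value from a finite set with given prior probabilities (mass concentrated
# on zero => sparsity), and an observation is x = D s (noise-free here).
def sample_discrete_sparse(dictionary, values, priors, n_samples, seed=0):
    rng = random.Random(seed)
    H = len(dictionary[0])                     # number of latents (columns of D)
    data, latents = [], []
    for _ in range(n_samples):
        s = [rng.choices(values, weights=priors)[0] for _ in range(H)]
        x = [sum(row[h] * s[h] for h in range(H)) for row in dictionary]
        data.append(x)
        latents.append(s)
    return data, latents
```

Learning then amounts to inverting this process: estimating the dictionary and the prior weights from the observed data alone.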

  9. Code of Practice

    International Nuclear Information System (INIS)

    Doyle, Colin; Hone, Christopher; Nowlan, N.V.

    1984-05-01

    This Code of Practice introduces accepted safety procedures associated with the use of alpha, beta, gamma and X-radiation in secondary schools (pupils aged 12 to 18) in Ireland, and summarises good practice and procedures as they apply to radiation protection. Typical dose rates at various distances from sealed sources are quoted, and simplified equations are used to demonstrate dose and shielding calculations. The regulatory aspects of radiation protection are outlined, and references to statutory documents are given
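The dose calculations such a Code typically demonstrates rest on the inverse-square law for a point source: the dose rate falls with the square of the distance. A minimal sketch (the gamma-ray constant and units here are illustrative placeholders, not values from the Code):

```python
# Inverse-square law for a point gamma source:
#   dose_rate = gamma_constant * activity / distance^2
# The gamma constant below is a placeholder, NOT a value from the Code of Practice.
def dose_rate(gamma_constant, activity_mbq, distance_m):
    """Dose rate in microSv/h for a gamma constant in microSv.m^2/(h.MBq)."""
    return gamma_constant * activity_mbq / distance_m ** 2
```

Doubling the distance from the source reduces the dose rate by a factor of four, which is the basic point such classroom shielding exercises make.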

  10. Tokamak simulation code manual

    International Nuclear Information System (INIS)

    Chung, Moon Kyoo; Oh, Byung Hoon; Hong, Bong Keun; Lee, Kwang Won

    1995-01-01

The method of using TSC (Tokamak Simulation Code), developed by the Princeton Plasma Physics Laboratory, is illustrated. For the KT-2 tokamak, time-dependent simulation of the axisymmetric toroidal plasma and of vertical stability has to be taken into account in the design phase using TSC. In this report the physical models of TSC are described, and examples of its application at JAERI and SERI are illustrated, which will be useful when TSC is installed on the KAERI computer system. (Author) 15 refs., 6 figs., 3 tabs

  11. Status of MARS Code

    Energy Technology Data Exchange (ETDEWEB)

    N.V. Mokhov

    2003-04-09

Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components, in the energy range from a fraction of an electronvolt up to 100 TeV, are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder, and the graphical user interface.

  12. Codes of Good Governance

    DEFF Research Database (Denmark)

    Beck Jørgensen, Torben; Sørensen, Ditte-Lene

    2013-01-01

    Good governance is a broad concept used by many international organizations to spell out how states or countries should be governed. Definitions vary, but there is a clear core of common public values, such as transparency, accountability, effectiveness, and the rule of law. It is quite likely......, transparency, neutrality, impartiality, effectiveness, accountability, and legality. The normative context of public administration, as expressed in codes, seems to ignore the New Public Management and Reinventing Government reform movements....

  13. Some debatable problems of stratigraphic classification

    Science.gov (United States)

    Gladenkov, Yury

    2014-05-01

Russian geologists perform large-scale geological mapping in Russia and abroad, and we therefore urge unification of the legends of geological maps compiled in different countries. It seems important to continuously organize discussions on problems of stratigraphic classification. 1. The stratigraphic schools (conventionally called "European" and "American") define "stratigraphy" in different ways. The former prefers a "single" stratigraphy that uses data proved by many methods. The latter divides stratigraphy into several independent stratigraphies (litho-, bio-, magneto- and others). Russian geologists classify stratigraphic units into general (chronostratigraphic) and special (in accordance with the method applied). 2. There exist different interpretations of chronostratigraphy. Some stratigraphers suppose that a chronostratigraphic unit corresponds to rock strata formed during a certain time interval (this is somewhat formalistic because the length of the interval is frequently unspecified). Russian specialists emphasize the historical-geological background of chronostratigraphic units: every stratigraphic unit (global and regional) reflects a stage of the geological evolution of the biosphere and stratisphere. 3. In the view of Russian stratigraphers, the main stratigraphic units may have different extents: a) global (stage), b) regional (regional stage, local zone), and c) local (suite). There is no such hierarchy in the ISG. 4. Russian specialists think that local "lithostratigraphic" units (formations), which may have diachronous boundaries, are not chronostratigraphic units in the strict sense (actually they are lithological bodies). In this case "lithostratigraphy" can be considered as "prostratigraphy" and employed in initial studies of sequences. Therefore, a suite is the main local unit of the Russian Code and differs from a formation, although it is somewhat similar. This does not mean that lithostratigraphy is unnecessary: usage of marker horizons, members and other bodies is of great help.

  14. Orthopedics coding and funding.

    Science.gov (United States)

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of an inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to the management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken into account in the payment made to the institution, as there are no indicators for it; work needs to be done on this topic. Copyright © 2014. Published by Elsevier Masson SAS.

  15. Code Modernization of VPIC

    Science.gov (United States)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

The ability of scientific simulations to deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite breakthroughs in the areas of mini-app development, performance portability, and cache-oblivious algorithms, the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorization with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by the Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  16. MELCOR computer code manuals

    Energy Technology Data Exchange (ETDEWEB)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L. [Sandia National Labs., Albuquerque, NM (United States); Hodge, S.A.; Hyman, C.R.; Sanders, R.L. [Oak Ridge National Lab., TN (United States)

    1995-03-01

MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  17. MELCOR computer code manuals

    International Nuclear Information System (INIS)

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package

  18. Quality Improvement of MARS Code and Establishment of Code Coupling

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Kim, Kyung Doo

    2010-04-01

The improvement of MARS code quality and its coupling with the regulatory auditing code have been accomplished to establish a self-reliant, technology-based regulatory auditing system. A unified auditing system code was also realized by implementing the CANDU-specific models and correlations. As part of the quality assurance activities, various QA reports were published through the code assessments. The code manuals were updated, and a new manual describing the new models and correlations was published. The code coupling methods were verified through an exercise of plant application. An education-training seminar and technology transfer were carried out for the code users. The developed MARS-KS is utilized as a reliable auditing tool for resolving safety issues and for other regulatory calculations. The code can also be utilized as a base technology for GEN IV reactor applications

  19. Design of convolutional tornado code

    Science.gov (United States)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

As a linear block code, the traditional tornado (tTN) code is inefficient in a burst-erasure environment, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and to reduce computational complexity by abrogating the multi-level structure. The simulation results show that the cTN code can provide better packet-loss protection performance with lower computational complexity than the tTN code.

  20. Random linear codes in steganography

    Directory of Open Access Journals (Sweden)

    Kamil Kaczyński

    2016-12-01

Full Text Available Syndrome coding using linear codes is a technique that allows improvement of the parameters of steganographic algorithms. The use of random linear codes gives great flexibility in choosing the parameters of the linear code; in parallel, it offers easy generation of the parity-check matrix. In this paper, a modification of the LSB algorithm is presented. A random linear code [8, 2] was used as a base for the algorithm modification. The implementation of the proposed algorithm, along with a practical evaluation of the algorithm's parameters based on the test images, was made. Keywords: steganography, random linear codes, RLC, LSB
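Syndrome coding (matrix embedding) hides a message as the syndrome of the cover's LSB vector: to embed, flip the fewest LSBs so that H·x equals the message; to extract, simply recompute the syndrome. The sketch below brute-forces the minimum-weight flip pattern for a small code; the 6×8 parity-check matrix used in testing is an illustrative stand-in for the paper's random [8, 2] code, not the actual matrix used:

```python
from itertools import product

def syndrome(H, bits):
    """H * bits over GF(2)."""
    return tuple(sum(h & v for h, v in zip(row, bits)) % 2 for row in H)

# Matrix embedding: flip the fewest cover LSBs so their syndrome equals the
# message; extraction is just recomputing the syndrome of the stego LSBs.
def embed(H, cover_bits, message):
    n = len(cover_bits)
    # we need H*e = message XOR H*cover, so that H*(cover XOR e) = message
    target = tuple(m ^ s for m, s in zip(message, syndrome(H, cover_bits)))
    best = None
    for e in product((0, 1), repeat=n):        # brute force is fine for n = 8
        if syndrome(H, e) == target and (best is None or sum(e) < sum(best)):
            best = e
    return [c ^ f for c, f in zip(cover_bits, best)]
```

For an [n, k] code, n cover LSBs carry n − k message bits while, on average, fewer than n − k of them are flipped, which is exactly the embedding-efficiency gain syndrome coding brings to LSB steganography.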